Read on to see how Metaflow has improved over time.
We take backwards compatibility very seriously. In the vast majority of cases, you can upgrade Metaflow without expecting changes in your existing code. In the rare cases when breaking changes are absolutely necessary, usually due to bug fixes, you can take a look at the minor breaking changes listed below before you upgrade.
The Metaflow 2.0.3 release is a minor patch release. It includes:
- Ability to specify the S3 endpoint
- Executing on AWS Batch
You can now use the current singleton (documented here) to access the names of the parameters passed into your flow. For example:

    for var in current.parameter_names:
        print("Parameter %s has value %s" % (var, getattr(self, var)))

This addresses #137.
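Since the parameter values live as attributes on the flow object, the loop above is plain getattr lookup. The pattern can be sketched outside Metaflow with stand-ins (FakeFlow and the hand-written name list below are illustrative placeholders, not Metaflow APIs):

```python
# Stand-in for a flow object; inside a Metaflow step, `self` plays this role.
class FakeFlow:
    alpha = 0.5
    rounds = 3

# Stand-in for current.parameter_names, which Metaflow populates for you.
parameter_names = ["alpha", "rounds"]

flow = FakeFlow()
lines = []
for var in parameter_names:
    # Same getattr-based lookup as in the snippet above.
    lines.append("Parameter %s has value %s" % (var, getattr(flow, var)))

print("\n".join(lines))
```

In a real flow you would import current from metaflow and iterate inside a step, where self carries the parameter attributes.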
A few issues were addressed to improve the usability of Metaflow. In particular, show now properly respects indentation, making the description of steps and flows more readable. This addresses #92. Superfluous print messages were also suppressed when executing on AWS Batch with the local metadata provider (#152).
A smaller, newer, and standalone Conda installer is now used, resulting in faster and more reliable Conda bootstrapping (#123).
We now check for the command-line option --datastore-root prior to using the environment variable METAFLOW_DATASTORE_SYSROOT_S3 when determining the S3 root (#134). This release also fixes an issue where using the local metadata provider with AWS Batch resulted in an incorrect directory structure in the .metaflow directory (#141).
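The precedence fixed by #134 can be sketched as follows; resolve_s3_root and its arguments are hypothetical names chosen for illustration, not Metaflow internals:

```python
import os


def resolve_s3_root(cli_datastore_root=None, environ=os.environ):
    # Hypothetical helper: the command-line option, when given,
    # takes precedence over the environment variable.
    if cli_datastore_root:
        return cli_datastore_root
    return environ.get("METAFLOW_DATASTORE_SYSROOT_S3")


# With both set, the CLI value wins.
root = resolve_s3_root(
    cli_datastore_root="s3://my-bucket/metaflow",
    environ={"METAFLOW_DATASTORE_SYSROOT_S3": "s3://other-bucket/metaflow"},
)
```

Without the CLI option, the helper falls back to the environment variable, matching the order described above.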
metaflow configure [import|export] for importing/exporting Metaflow configurations.
Fix a Docker registry parsing bug in AWS Batch.
Fix various typos in the Metaflow tutorials.
First Open Source Release.
Read the blog post announcing the release.