Metaflow

Latest version: v2.13.9


2.0.4

- [Improvements](#v2.0.4_improvements)
- Expose `retry_count` in [`Current`](https://docs.metaflow.org/metaflow/tagging#accessing-current-ids-in-a-flow)
- Mute superfluous `ThrottleExceptions` in AWS Batch job logs
- [Bug Fixes](#v2.0.4_bugfixes)
- Set proper thresholds for retrying `DescribeJobs` API for AWS Batch
- Explicitly override `PYTHONNOUSERSITE` for `conda` environments
- Preempt AWS Batch job log collection when the job fails to get into a `RUNNING` state

The Metaflow 2.0.4 release is a minor patch release.

<a name="v2.0.4_improvements"></a> Improvements
Expose `retry_count` in `Current`
You can now use the [`current`](https://docs.metaflow.org/metaflow/tagging#accessing-current-ids-in-a-flow) singleton to access the `retry_count` of your task. The first attempt of a task has a `retry_count` of 0, and each subsequent retry increments it. As an example:
```python
@retry
@step
def my_step(self):
    from metaflow import current
    print("retry_count: %s" % current.retry_count)
    self.next(self.a)
```


Mute superfluous `ThrottleExceptions` in AWS Batch job logs
The AWS Logs API `get_log_events` enforces a global hard limit of 10 requests per second. While we have retry logic in place to respect this limit, some `ThrottleExceptions` still ended up in the job logs, confusing end users. This release addresses the issue (also documented in #184).
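The muting described above amounts to catching the throttling error inside the retry loop instead of letting it surface in the job's log stream. A minimal, self-contained sketch of that idea, not Metaflow's actual implementation (the `ThrottlingException` class and `quiet_retry` helper are illustrative stand-ins):

```python
import time

class ThrottlingException(Exception):
    """Stand-in for the throttling error raised by the AWS Logs API."""

def quiet_retry(fn, retries=5, delay=0.01):
    """Call fn, silently retrying on ThrottlingException instead of
    letting the transient error reach the job logs."""
    for attempt in range(retries):
        try:
            return fn()
        except ThrottlingException:
            if attempt == retries - 1:
                raise  # out of retries: surface the error after all
            time.sleep(delay * (attempt + 1))

# Simulate a get_log_events call that is throttled twice, then succeeds.
calls = {"n": 0}
def flaky_get_log_events():
    calls["n"] += 1
    if calls["n"] < 3:
        raise ThrottlingException()
    return ["log line 1", "log line 2"]

events = quiet_retry(flaky_get_log_events)
```

Only a persistent failure (all retries exhausted) is re-raised; transient throttling is absorbed silently.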

<a name="v2.0.4_bugfixes"></a> Bug Fixes
Set proper thresholds for retrying `DescribeJobs` API for AWS Batch
The AWS Batch API `describe_jobs` throws `ThrottleExceptions` when managing a flow with a very wide `foreach` step. This release adds retry behavior with backoffs to provide proper resiliency (addresses #138).
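Retry-with-backoff for a throttled polling API is typically capped exponential backoff plus jitter. A hedged sketch under those assumptions (`call_with_backoff` and the fake `flaky_describe_jobs` are illustrative, not Metaflow's code):

```python
import random
import time

class ThrottlingException(Exception):
    """Stand-in for the error describe_jobs raises under heavy polling."""

def call_with_backoff(fn, max_attempts=6, base_delay=0.01, max_delay=1.0):
    """Retry fn with capped exponential backoff plus jitter."""
    for attempt in range(max_attempts):
        try:
            return fn()
        except ThrottlingException:
            if attempt == max_attempts - 1:
                raise  # give up after max_attempts
            delay = min(max_delay, base_delay * (2 ** attempt))
            time.sleep(delay + random.uniform(0, delay))  # jitter spreads callers out

# Simulate describe_jobs being throttled three times before succeeding.
attempts = {"n": 0}
def flaky_describe_jobs():
    attempts["n"] += 1
    if attempts["n"] < 4:
        raise ThrottlingException()
    return {"jobs": [{"status": "RUNNING"}]}

result = call_with_backoff(flaky_describe_jobs)
```

The jitter term matters with wide foreaches: without it, every task in the fan-out retries on the same schedule and keeps colliding with the rate limit.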

Explicitly override `PYTHONNOUSERSITE` for `conda` environments
In certain user environments, properly isolating `conda` environments requires explicitly overriding `PYTHONNOUSERSITE` rather than relying solely on `python -s` (addresses #178).
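The override works because setting the `PYTHONNOUSERSITE` environment variable excludes the user site-packages directory from `sys.path` in any child interpreter, whether or not it was launched with `-s`. A small self-contained demonstration (not Metaflow's code):

```python
import os
import subprocess
import sys

# Launch a child interpreter with PYTHONNOUSERSITE set; the child then
# behaves as if it were started with `python -s`, which sys.flags reflects.
env = dict(os.environ, PYTHONNOUSERSITE="1")
out = subprocess.run(
    [sys.executable, "-c", "import sys; print(sys.flags.no_user_site)"],
    env=env,
    capture_output=True,
    text=True,
)
flag = out.stdout.strip()  # "1": user site-packages are disabled
```

Unlike passing `-s` on the command line, the environment variable is inherited by any further subprocesses the conda environment spawns, which is what makes the isolation stick.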

Preempt AWS Batch job log collection when the job fails to get into a `RUNNING` state
Fixes a bug where, if an AWS Batch job crashed before entering the `RUNNING` state (often due to incorrect IAM permissions), log collection failed to print the correct error message, making the issue harder to debug (addresses #185).
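The preemption idea is to watch the job's status while waiting for logs and bail out with the job's own failure reason if it dies before ever reaching `RUNNING`. A hedged sketch with a simulated status sequence (`wait_for_running` and the status dicts are illustrative, not Metaflow's implementation):

```python
def wait_for_running(poll_status, max_polls=10):
    """Poll job status; stop early with the failure reason if the job
    fails before reaching RUNNING, instead of waiting on logs that
    will never appear."""
    for _ in range(max_polls):
        job = poll_status()
        if job["status"] == "RUNNING":
            return "running"
        if job["status"] == "FAILED":
            # Surface the real cause (e.g. missing IAM permissions)
            raise RuntimeError(job.get("statusReason", "job failed before RUNNING"))
    raise TimeoutError("job did not start in time")

# Simulate a job that dies during SUBMITTED -> STARTING, before RUNNING.
states = iter([
    {"status": "SUBMITTED"},
    {"status": "STARTING"},
    {"status": "FAILED", "statusReason": "access denied: missing IAM permission"},
])
try:
    wait_for_running(lambda: next(states))
except RuntimeError as e:
    message = str(e)
```

The user sees the permission error directly rather than a confusing log-collection failure.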

2.0.3

- [Improvements](#v2.0.3_improvements)
- Parameter listing
- Ability to specify S3 endpoint
- Usability improvements
- [Performance](#v2.0.3_performance)
- Conda
- [Bug Fixes](#v2.0.3_bugfixes)
- Executing on AWS Batch

The Metaflow 2.0.3 release is a minor patch release.

<a name="v2.0.3_improvements"></a> Improvements
Parameter listing
You can now use the `current` singleton (documented [here](https://docs.metaflow.org/metaflow/tagging#accessing-current-ids-in-a-flow)) to access the names of the parameters passed into your flow. As an example:
```python
for var in current.parameter_names:
    print("Parameter %s has value %s" % (var, getattr(self, var)))
```
This addresses #137.

Usability improvements
A few issues were addressed to improve the usability of Metaflow. In particular, `show` now properly respects indentation, making the descriptions of steps and flows more readable (addresses #92). Superfluous print messages were also suppressed when executing on AWS Batch with the local metadata provider (#152).
<a name="v2.0.3_performance"></a> Performance
Conda
A smaller, newer, standalone Conda installer is now used, resulting in faster and more reliable Conda bootstrapping (#123).
<a name="v2.0.3_bugfixes"></a> Bug Fixes
Executing on AWS Batch
We now check the command-line option `--datastore-root` before falling back to the environment variable `METAFLOW_DATASTORE_SYSROOT_S3` when determining the S3 datastore root (#134). This release also fixes an issue where using the local metadata provider with AWS Batch resulted in an incorrect directory structure in the `.metaflow` directory (#141).
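The precedence fix reduces to a simple resolution order: an explicit CLI value wins, and the environment variable is only a fallback. A minimal sketch (the `resolve_datastore_root` helper is illustrative, not Metaflow's code):

```python
import os

def resolve_datastore_root(cli_value=None, env=os.environ):
    """--datastore-root from the command line takes precedence;
    METAFLOW_DATASTORE_SYSROOT_S3 is consulted only as a fallback."""
    if cli_value is not None:
        return cli_value
    return env.get("METAFLOW_DATASTORE_SYSROOT_S3")

# CLI value beats the environment variable; env is used only when no flag.
env = {"METAFLOW_DATASTORE_SYSROOT_S3": "s3://bucket/from-env"}
root_cli = resolve_datastore_root("s3://bucket/from-cli", env=env)
root_env = resolve_datastore_root(None, env=env)
```

This matches the usual CLI convention that an explicit flag overrides ambient configuration.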

2.0.2

Bug Fixes
- [Pin](https://github.com/Netflix/metaflow/pull/107) click to v7.0 or greater
- [Add](https://github.com/Netflix/metaflow/pull/118) checks to conda-package metadata to guard against .conda packages

2.0.1

Enhancements
- [Introduce](https://github.com/Netflix/metaflow/pull/53) `metaflow configure [import|export]` for importing/exporting Metaflow configurations.
- [Revamp](https://github.com/Netflix/metaflow/pull/59) `metaflow configure aws` command to address usability [concerns](https://github.com/Netflix/metaflow/issues/44).
- [Handle](https://github.com/Netflix/metaflow/pull/56) keyboard interrupts for Batch jobs [more gracefully for large fan-outs](https://github.com/Netflix/metaflow/issues/54).

Bug Fixes
- [Fix](https://github.com/Netflix/metaflow/pull/62) a docker registry parsing bug in AWS Batch.
- Fix various typos in Metaflow tutorials.

2.0.0

Hello World!

First Open Source Release.
