Metaflow

Latest version: v2.13


2.2.11

The Metaflow 2.2.11 release is a minor patch release.
- [Bug Fixes](#v2.2.11_bugs)
    - [Fix regression that broke compatibility with Python 2.7](498)
    - [Fix a corner case when converting options to CL arguments](495)
    - [Fix a bug in case of a hard crash in a step](494)
    - [The Conda environment now delegates to the default environment properly for get_environment_info](493)

<a name="v2.2.11_bugs"></a> Bug Fixes
[Fix regression that broke compatibility with Python 2.7](498)
`shlex.quote`, introduced in 493, is not compatible with Python 2.7. `pipes.quote` is now used for Python 2.7.
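The version-dependent choice can be sketched with a small compatibility shim (a hypothetical illustration, not Metaflow's actual code):

```python
# Pick a shell-quoting function that works on both Python 2.7 and Python 3.
try:
    from shlex import quote   # available since Python 3.3
except ImportError:
    from pipes import quote   # Python 2.7 fallback

# Strings containing spaces or shell metacharacters get single-quoted;
# plain strings pass through unchanged.
print(quote("plain"))       # plain
print(quote("has space"))   # 'has space'
```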

[Fix a corner case when converting options to CL arguments](495)
Some plugins need to escape shell variables when using them in command lines; this patch enables that.
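As a rough illustration of the kind of escaping involved (the helper name and options below are hypothetical, not Metaflow's plugin API):

```python
import shlex

def options_to_args(options):
    """Convert an options dict into a safely quoted command-line fragment.

    Hypothetical helper: values containing spaces, quotes, or shell
    variables such as $USER are quoted so the shell treats them literally.
    """
    parts = []
    for name, value in options.items():
        parts.append("--%s" % name)
        parts.append(shlex.quote(str(value)))
    return " ".join(parts)

print(options_to_args({"message": "hello $USER", "retries": 3}))
# --message 'hello $USER' --retries 3
```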

[Fix a bug in case of a hard crash in a step](494)
In some cases, a hard crash in a step would cause the status of the step to not be properly reported.

[The Conda environment now delegates to the default environment properly for get_environment_info](493)
The Conda environment now delegates `get_environment_info` to the `DEFAULT_ENVIRONMENT` as opposed to the `MetaflowEnvironment`. This does not change the current default behavior.

2.2.10

The Metaflow 2.2.10 release is a minor patch release.
- [Features](#v2.2.10_features)
    - [AWS Logs Group, Region and Stream are now available in metadata for tasks executed on AWS Batch](478)
    - [Execution logs are now available for all tasks in Metaflow universe](449)
- [Bug Fixes](#v2.2.10_bugs)
    - [Fix regression with `ping/` endpoint for Metadata service](483)
    - [Fix the behaviour of `--namespace=` CLI args when executing a flow](https://gitter.im/metaflow_org/community?at=605decca68921b62f48a4190)

<a name="v2.2.10_features"></a> Features
[AWS Logs Group, Region and Stream are now available in metadata for tasks executed on AWS Batch](478)
For tasks that execute on AWS Batch, Metaflow now records the location where the AWS Batch instance writes the container logs in AWS Logs. This can be handy for locating the logs through the client API:

from metaflow import Step

Step('Flow/42/a').task.metadata_dict['aws-batch-awslogs-group']
Step('Flow/42/a').task.metadata_dict['aws-batch-awslogs-region']
Step('Flow/42/a').task.metadata_dict['aws-batch-awslogs-stream']

PR: 478

[Execution logs are now available for all tasks in Metaflow universe](449)
All Metaflow runtime/task logs are now published via a sidecar process to the datastore. The user-visible logs on the console are streamed directly from the datastore. For Metaflow's integrations with the cloud (AWS at the moment), the compute task logs (AWS Batch) are written by Metaflow directly into the datastore (Amazon S3), independent of where the flow is launched from (a user's laptop or AWS Step Functions). This has multiple benefits:
- Metaflow no longer relies on AWS Cloud Watch for fetching the AWS Batch execution logs to the console - AWS Cloud Watch has rather low global API limits which have caused multiple issues in the past for our users
- Logs for AWS Step Functions executions are now also available in Amazon S3 and can be easily fetched by simply doing `python flow.py logs 42/start` or `Step('Flow/42/start').task.stdout`. PR: 449

<a name="v2.2.10_bugs"></a> Bug Fixes
[Fix regression with `ping/` endpoint for Metadata service](483)
Fix a regression introduced in `v2.2.9` where the endpoint responsible for ascertaining the version of the deployed Metadata service was erroneously moved from `ping` to `ping/`. PR: 484
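To see why the trailing slash matters, note how it changes the resolved request path (the base URL below is made up for illustration):

```python
from urllib.parse import urljoin

# Hypothetical Metadata service URL. A service that only routes "ping"
# returns a 404 when the client requests "ping/" instead.
base = "https://metadata.example.com/api/"
print(urljoin(base, "ping"))   # https://metadata.example.com/api/ping
print(urljoin(base, "ping/"))  # https://metadata.example.com/api/ping/
```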

[Fix the behaviour of `--namespace=` CLI args when executing a flow](https://gitter.im/metaflow_org/community?at=605decca68921b62f48a4190)
`python flow.py run --namespace=` now correctly makes the global namespace visible within the flow execution. PR: 461

2.2.9

The Metaflow 2.2.9 release is a minor patch release.
- [Bug Fixes](#v2.2.9_bugs)
    - [Remove pinned pylint dependency](https://gitter.im/metaflow_org/community?at=60622af8940f1d555e277c12)
    - [Improve handling of `/` in image parameter for batch](https://gitter.im/metaflow_org/community?at=5f80e21d02e81701b0106c6d)
    - List custom FlowSpec parameters in the intended order

<a name="v2.2.9_bugs"></a> Bug Fixes
[Remove pinned pylint dependency](https://gitter.im/metaflow_org/community?at=60622af8940f1d555e277c12)
The pylint dependency was unpinned and made floating. See PR 462.

[Improve handling of `/` in image parameter for batch](https://gitter.im/metaflow_org/community?at=5f80e21d02e81701b0106c6d)
You are now able to specify docker images of the form `foo/bar/baz:tag` in the batch decorator. See PR 466.

List custom FlowSpec parameters in the intended order
The order in which parameters are specified by the user in the FlowSpec is now preserved when displaying them with `--help`. See PR 456.
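One way such ordering can be preserved, sketched with the standard library only (the `Parameter` stand-in below is hypothetical, not Metaflow's class):

```python
class Parameter:
    """Hypothetical stand-in for metaflow.Parameter."""
    def __init__(self, name):
        self.name = name

class MyFlow:
    # Declared in this exact order; --help should list them the same way.
    alpha = Parameter("alpha")
    zulu = Parameter("zulu")
    mike = Parameter("mike")

def declared_parameters(flow_cls):
    # On Python 3.6+, a class body's attribute order is preserved in its
    # namespace, so iterating vars() recovers the declaration order.
    return [v.name for v in vars(flow_cls).values() if isinstance(v, Parameter)]

print(declared_parameters(MyFlow))  # ['alpha', 'zulu', 'mike']
```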

2.2.8

The Metaflow 2.2.8 release is a minor patch release.
- [Bug Fixes](#v2.2.8_bugs)
    - [Fix `environment` behavior for conflicting attribute values](https://gitter.im/metaflow_org/community?at=604a2bfb44f5a454a46cc7f8)
    - [Fix `environment is not callable` error when using `environment`](https://gitter.im/metaflow_org/community?at=6048a07d823b6654d296d62d)

<a name="v2.2.8_bugs"></a> Bug Fixes
[Fix `environment` behavior for conflicting attribute values](https://gitter.im/metaflow_org/community?at=604a2bfb44f5a454a46cc7f8)
Metaflow was incorrectly handling environment variables passed through the `environment` decorator in some specific instances. When the `environment` decorator was specified over multiple steps, the actual environment available to any step was the union of the attributes of all the `environment` decorators, which is incorrect behavior. For example, consider the following workflow:

from metaflow import FlowSpec, step, batch, environment
import os

class LinearFlow(FlowSpec):

    @environment(vars={'var': os.getenv('var_1')})
    @step
    def start(self):
        print(os.getenv('var'))
        self.next(self.a)

    @environment(vars={'var': os.getenv('var_2')})
    @step
    def a(self):
        print(os.getenv('var'))
        self.next(self.end)

    @step
    def end(self):
        pass

if __name__ == '__main__':
    LinearFlow()


var_1=foo var_2=bar python flow.py run

will result in

Metaflow 2.2.7.post10+gitb7d4c48 executing LinearFlow for user:savin
Validating your flow...
The graph looks good!
Running pylint...
Pylint is happy!
2021-03-12 20:46:04.161 Workflow starting (run-id 6810):
2021-03-12 20:46:04.614 [6810/start/86638 (pid 10997)] Task is starting.
2021-03-12 20:46:06.783 [6810/start/86638 (pid 10997)] foo
2021-03-12 20:46:07.815 [6810/start/86638 (pid 10997)] Task finished successfully.
2021-03-12 20:46:08.390 [6810/a/86639 (pid 11003)] Task is starting.
2021-03-12 20:46:10.649 [6810/a/86639 (pid 11003)] foo
2021-03-12 20:46:11.550 [6810/a/86639 (pid 11003)] Task finished successfully.
2021-03-12 20:46:12.145 [6810/end/86640 (pid 11009)] Task is starting.
2021-03-12 20:46:15.382 [6810/end/86640 (pid 11009)] Task finished successfully.
2021-03-12 20:46:15.563 Done!

Note the output for the step `a` which should have been `bar`. PR 452 fixes the issue.
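The corrected semantics, where each step sees only its own decorator's variables, can be sketched in plain Python (a hypothetical illustration, not Metaflow's implementation; `MF_DEMO_VAR` stands in for `var` above):

```python
import os
from contextlib import contextmanager

@contextmanager
def step_environment(vars):
    # Sketch of the fixed behavior: apply only this step's vars, then
    # restore the previous values, so nothing accumulates across steps
    # as a union.
    saved = {k: os.environ.get(k) for k in vars}
    os.environ.update(vars)
    try:
        yield
    finally:
        for k, old in saved.items():
            if old is None:
                os.environ.pop(k, None)
            else:
                os.environ[k] = old

with step_environment({"MF_DEMO_VAR": "foo"}):   # like start's decorator
    print(os.getenv("MF_DEMO_VAR"))              # foo
with step_environment({"MF_DEMO_VAR": "bar"}):   # like a's decorator
    print(os.getenv("MF_DEMO_VAR"))              # bar, not a leaked "foo"
```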

[Fix `environment is not callable` error when using `environment`](https://gitter.im/metaflow_org/community?at=6048a07d823b6654d296d62d)
Using `environment` would often result in an error from `pylint` - `E1102: environment is not callable (not-callable)`. Users were getting around this issue by launching their flows with `--no-pylint`. PR 451 fixes this issue.

2.2.7

The Metaflow 2.2.7 release is a minor patch release.
- [Bug Fixes](#v2.2.7_bugs)
    - [Handle for-eaches properly for AWS Step Functions workflows running on AWS Fargate](https://gitter.im/metaflow_org/community?at=601f56d955359c58bf28ef1a)

<a name="v2.2.7_bugs"></a> Bug Fixes
[Handle for-eaches properly for AWS Step Functions workflows running on AWS Fargate](https://gitter.im/metaflow_org/community?at=601f56d955359c58bf28ef1a)
Workflows orchestrated by AWS Step Functions were failing to properly execute `for-each` steps on AWS Fargate. The culprit was the lack of access to instance metadata on ECS. Metaflow instantiates a connection to Amazon DynamoDB to keep track of `for-each` cardinality. This connection requires knowledge of the region that the job executes in, which is made available via [instance metadata](https://docs.aws.amazon.com/AWSEC2/latest/UserGuide/ec2-instance-metadata.html) on EC2 but, unfortunately, not on ECS (for AWS Fargate). This fix introduces the necessary checks for inferring the region correctly for tasks executing on AWS Fargate. Note that after the recent changes to [Amazon S3's consistency model](https://aws.amazon.com/blogs/aws/amazon-s3-update-strong-read-after-write-consistency/), the Amazon DynamoDB dependency is no longer needed and will be done away with in a subsequent release. PR: #436
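A minimal sketch of such a fallback chain, assuming only the standard AWS environment variables that ECS/Fargate task containers receive (an illustration, not Metaflow's actual code):

```python
import os

def infer_region():
    """Prefer the standard AWS environment variables before falling back
    to EC2 instance metadata, which is unavailable on AWS Fargate."""
    for var in ("AWS_REGION", "AWS_DEFAULT_REGION"):
        region = os.environ.get(var)
        if region:
            return region
    # On EC2 one would query the instance metadata endpoint here
    # (http://169.254.169.254/latest/meta-data/placement/region).
    raise RuntimeError("could not determine AWS region")

os.environ["AWS_REGION"] = "us-west-2"  # simulated Fargate task environment
print(infer_region())  # us-west-2
```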

2.2.6

The Metaflow 2.2.6 release is a minor patch release.
- [Features](#v2.2.6_features)
    - [Support AWS Fargate as compute backend for Metaflow tasks launched on AWS Batch](398)
    - [Support `shared_memory`, `max_swap`, `swappiness` attributes for Metaflow tasks launched on AWS Batch](407)
    - [Support wider very-wide workflows on top of AWS Step Functions](403)
- [Bug Fixes](#v2.2.6_bugs)
    - [Assign tags to `Run` objects generated through AWS Step Functions executions](384)
    - [Pipe all workflow set-up logs to `stderr`](378)
    - [Handle null assignment to `IncludeFile` properly](418)

<a name="v2.2.6_features"></a> Features
[Support AWS Fargate as compute backend for Metaflow tasks launched on AWS Batch](398)
At [AWS re:Invent 2020, AWS announced support for AWS Fargate](https://aws.amazon.com/blogs/aws/new-fully-serverless-batch-computing-with-aws-batch-support-for-aws-fargate/) as a compute backend (in addition to EC2) for AWS Batch. With this feature, Metaflow users can now also submit their Metaflow jobs to AWS Batch Job Queues which are connected to AWS Fargate Compute Environments. By setting the environment variable `METAFLOW_ECS_FARGATE_EXECUTION_ROLE`, users can configure the `ecsTaskExecutionRole` for the AWS Batch container and AWS Fargate agent. PR: #402

[Support `shared_memory`, `max_swap`, `swappiness` attributes for Metaflow tasks launched on AWS Batch](407)
The `batch` decorator now supports `shared_memory`, `max_swap`, `swappiness` attributes for Metaflow tasks launched on AWS Batch to provide a greater degree of control for memory management. PR: 408

[Support wider very-wide workflows on top of AWS Step Functions](403)
The `metaflow_version:` and `runtime:` tags are now available for all packaged and remote executions as well. This ensures that every run logged by Metaflow will have the `metaflow_version` and `runtime` system tags available. PR: 403

<a name="v2.2.6_bugs"></a> Bug Fixes
[Assign tags to `Run` objects generated through AWS Step Functions executions](384)
`Run` objects generated by flows executed on top of AWS Step Functions were missing the tags assigned to the flow, even though the tags were correctly persisted to tasks. This release fixes this and brings the tagging behavior in line with that of local flow executions. PR: 386

[Pipe all workflow set-up logs to `stderr`](378)
Execution set-up logs for `conda` and `IncludeFile` were being piped to `stdout` which made manipulating the output of commands like `python flow.py step-functions create --only-json` a bit difficult. This release moves the workflow set-up logs to `stderr`. PR: 379
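The general pattern, keeping machine-readable output on `stdout` and set-up chatter on `stderr`, can be sketched as follows (function names hypothetical):

```python
import json
import sys

def setup_log(msg):
    """Write set-up chatter to stderr so stdout can be piped cleanly,
    e.g. `python flow.py step-functions create --only-json | jq`."""
    print(msg, file=sys.stderr)

def emit_state_machine(definition):
    setup_log("Deploying flow to AWS Step Functions...")
    print(json.dumps(definition))  # machine-readable output stays on stdout

emit_state_machine({"StartAt": "start"})
```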

[Handle null assignment to `IncludeFile` properly](418)
A workflow executed without a required `IncludeFile` parameter would fail when the parameter was referenced inside the flow. This release fixes the issue by assigning a null value to the parameter in such cases. PR: 421
