Metaflow

Latest version: v2.13


2.8.1

- Features
  - [Add ec2 instance metadata in `task.metadata_dict` when a task executes on AWS Batch](#features.1)
  - [Display Metaflow UI URL on the terminal when a flow is executed either via `run` or `resume`](#features.2)

Features
<a id='features.1'></a>Add ec2 instance metadata in `task.metadata_dict` when a task executes on AWS Batch
With this release, `task.metadata_dict` includes the fields `ec2-instance-id`, `ec2-instance-type`, `ec2-region`, and `ec2-availability-zone` whenever a Metaflow task executes on AWS Batch and the [task container has access to the EC2 instance metadata service](https://docs.aws.amazon.com/AWSEC2/latest/UserGuide/configuring-IMDS-new-instances.html#configure-IMDS-new-instances--turn-off-instance-metadata).
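
As a quick, illustrative sketch (the pathspec below is a placeholder), these fields can be read back through the Metaflow Client API:

```python
from metaflow import Task

# Hypothetical pathspec; substitute a real "FlowName/run_id/step_name/task_id".
task = Task("MyFlow/1234/train/5678")

metadata = task.metadata_dict
# These keys are present only when the task ran on AWS Batch and the
# container could reach the EC2 instance metadata service.
for key in ("ec2-instance-id", "ec2-instance-type", "ec2-region", "ec2-availability-zone"):
    print(key, metadata.get(key))
```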

<a id='features.2'></a>Display Metaflow UI URL on the terminal when a flow is executed either via `run` or `resume`
With this release, if the Metaflow config (in `~/.metaflowconfig/config.json` by default) includes a reference to a deployed Metaflow UI (assigned to `METAFLOW_UI_URL`), the user-facing logs in the terminal will include a direct link to the relevant run view in the Metaflow UI.
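
As a rough sketch of how this setting could be added programmatically (assuming the default config location `~/.metaflowconfig/config.json`; the UI URL below is a placeholder):

```python
import json
import os

# Assumed default config location; adjust if METAFLOW_HOME points elsewhere.
config_path = os.path.expanduser("~/.metaflowconfig/config.json")

config = {}
if os.path.exists(config_path):
    with open(config_path) as f:
        config = json.load(f)

# Placeholder URL for an already-deployed Metaflow UI.
config["METAFLOW_UI_URL"] = "https://metaflow-ui.example.com"

os.makedirs(os.path.dirname(config_path), exist_ok=True)
with open(config_path, "w") as f:
    json.dump(config, f, indent=2)
```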

<img width="992" alt="Screen Shot 2023-03-15 at 12 46 01 PM" src="https://user-images.githubusercontent.com/763451/225425538-05cca220-a6d2-4c49-b01e-cc191d3e58a2.png">


In case you need any assistance or have feedback for us, ping us at [chat.metaflow.org](http://chat.metaflow.org) or open a GitHub issue.

2.8.0

- Features
  - [Introduce capability to schedule Metaflow flows with Apache Airflow](https://docs.metaflow.org/production/scheduling-metaflow-flows/scheduling-with-airflow)

Features
[Introduce capability to schedule Metaflow flows with Apache Airflow](https://docs.metaflow.org/production/scheduling-metaflow-flows/scheduling-with-airflow)
With this release, we are introducing an integration with [Apache Airflow](https://airflow.apache.org/), similar to our integrations with [AWS Step Functions](https://aws.amazon.com/step-functions/) and [Argo Workflows](https://argoproj.github.io/argo-workflows/), that lets Metaflow users [deploy and schedule their DAGs](https://docs.metaflow.org/going-to-production-with-metaflow/scheduling-metaflow-flows) by executing

`python myflow.py airflow create mydag.py`

which creates an Airflow DAG for them. With this feature, Metaflow users can enjoy all the features of Metaflow on top of Apache Airflow, including a more user-friendly and productive development API for data scientists and data engineers, without needing to change anything in their existing pipelines or operational playbooks, as described in [the announcement blog post](https://outerbounds.com/blog/better-airflow-with-metaflow/). To learn how to deploy and operate the integration, see [Using Airflow with Metaflow](https://outerbounds.com/engineering/operations/airflow/).
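
As a minimal, illustrative sketch, the `myflow.py` referenced above can be any ordinary Metaflow flow; the flow and file names here are placeholders:

```python
# myflow.py -- placeholder name matching the command above
from metaflow import FlowSpec, step


class MyFlow(FlowSpec):

    @step
    def start(self):
        # An ordinary Metaflow step; nothing Airflow-specific is needed.
        self.message = "hello from Airflow"
        self.next(self.end)

    @step
    def end(self):
        print(self.message)


if __name__ == "__main__":
    MyFlow()
```

Running `python myflow.py airflow create mydag.py` then writes out `mydag.py`, which can be placed in the Airflow DAGs folder like any other DAG file.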

When running on Airflow, Metaflow code works exactly as it does locally: no changes are required. With this integration, Metaflow users can [inspect](https://docs.metaflow.org/metaflow/client) their flows deployed on Apache Airflow as before and [debug and reproduce](https://docs.metaflow.org/metaflow/debugging#reproducing-production-issues-locally) results from Apache Airflow on their local laptop or within a notebook. All tasks run on Kubernetes, respecting the `@resources` decorator as if the `@kubernetes` decorator were added to every step, as explained in [Executing Tasks Remotely](https://docs.metaflow.org/scaling/remote-tasks/introduction#safeguard-flags).
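
For example, a brief Client API sketch (the flow name is a placeholder) that inspects a run produced by the Airflow deployment exactly as one would inspect a local run:

```python
from metaflow import Flow

# Placeholder flow name; runs triggered by the Airflow deployment are read
# with the same Client API calls as local runs.
run = Flow("MyFlow").latest_run
print(run.id, "successful:", run.successful)

# Artifacts written by the flow's steps are available as usual, e.g.:
# print(run.data.message)  # assuming the flow stored an artifact `message`
for step in run:
    for task in step:
        print(step.id, task.finished_at)
```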

The main benefits of using Metaflow with Airflow are:
- You get to use the human-friendly API of Metaflow to define and test workflows. Almost all features of Metaflow work with Airflow out of the box, except nested _foreaches_, which are not yet supported by Airflow, and AWS Batch, since the current integration only supports Kubernetes for compute.
- You can deploy Metaflow flows to your existing Airflow server without having to change anything operationally. From Airflow's point of view, Metaflow flows look like any other Airflow DAG.
- If you want to consider moving to another orchestrator supported by Metaflow, you can test one easily by changing a single command to deploy to [Argo Workflows](https://docs.metaflow.org/production/scheduling-metaflow-flows/scheduling-with-argo-workflows) or [AWS Step Functions](https://docs.metaflow.org/production/scheduling-metaflow-flows/scheduling-with-aws-step-functions).

In case you need any assistance or have feedback for us, ping us at [chat.metaflow.org](http://chat.metaflow.org) or open a GitHub issue.

2.7.23

What's Changed
* New MF configs for Argo Workflows by jackie-ob in https://github.com/Netflix/metaflow/pull/1267
* Added typing information for all public APIs by romain-intel in https://github.com/Netflix/metaflow/pull/1158
* When packaging `metaflow_extensions`, add an empty `__init__.py` file. by romain-intel in https://github.com/Netflix/metaflow/pull/1276
* Replace instances of "INFO" with a constant by romain-intel in https://github.com/Netflix/metaflow/pull/1275
* Fix an issue with threading and the escape hatch. by romain-intel in https://github.com/Netflix/metaflow/pull/1274


**Full Changelog**: https://github.com/Netflix/metaflow/compare/2.7.22...2.7.23

2.7.22

What's Changed
* Test metaflow.s3 on multiple Python versions across Linux and MacOS by savingoyal in https://github.com/Netflix/metaflow/pull/1246
* fix metaflow.s3 tests by savingoyal in https://github.com/Netflix/metaflow/pull/1248
* support timezone when scheduling with Argo workflows by amerberg in https://github.com/Netflix/metaflow/pull/1250
* Airflow V2 PR (Foreach + Sensors + GCP + MWAA Support) by valayDave in https://github.com/Netflix/metaflow/pull/1256
* Expose Kubernetes Node IP in task metadata by savingoyal in https://github.com/Netflix/metaflow/pull/1254
* Fix timezone code in Argo Workflows by jackie-ob in https://github.com/Netflix/metaflow/pull/1258
* Implement secrets, with AWS support by jackie-ob in https://github.com/Netflix/metaflow/pull/1251
* Expose AWS instance metadata for kubernetes by savingoyal in https://github.com/Netflix/metaflow/pull/1263
* Fix an issue with configurations for env escape extensions by romain-intel in https://github.com/Netflix/metaflow/pull/1264
* Bump version to 2.7.22 by savingoyal in https://github.com/Netflix/metaflow/pull/1247


**Full Changelog**: https://github.com/Netflix/metaflow/compare/2.7.21...2.7.22

2.7.21

What's Changed
* Fix extension support on Python 3.5 by romain-intel in https://github.com/Netflix/metaflow/pull/1245


**Full Changelog**: https://github.com/Netflix/metaflow/compare/2.7.20...2.7.21

2.7.20

What's Changed
* Bug/long card by obgibson in https://github.com/Netflix/metaflow/pull/1233
* Restrict token permissions for test jobs by romain-intel in https://github.com/Netflix/metaflow/pull/1238
* Since we're now providing type annotations, add typed marker as per PEP-561 by oavdeev in https://github.com/Netflix/metaflow/pull/1239
* Allow configuration of plugins/cmds using METAFLOW_ENABLED_* variable by romain-intel in https://github.com/Netflix/metaflow/pull/1212
* Bump setup.py to 2.7.20 by romain-intel in https://github.com/Netflix/metaflow/pull/1244

Incompatible change
If you are using the unsupported Metaflow Extensions mechanism, you may have to change them slightly. Please see https://github.com/Netflix/metaflow-extensions-template/blob/master/CHANGES.md for more details.


**Full Changelog**: https://github.com/Netflix/metaflow/compare/2.7.19...2.7.20
