Metaflow

Latest version: v2.13


2.9.12

Known issues
The annotations feature introduced in this release has an issue where the project, flow_name, and user annotations are not populated on Kubernetes. This has been reverted in the next release.

Features

Custom annotations for K8S and Argo Workflows
This release enables users to add custom annotations to the Kubernetes resources that flows create. Annotations can be configured in much the same way as custom labels:

1. Globally, with an environment variable:

```sh
export METAFLOW_KUBERNETES_ANNOTATIONS="first=A,second=B"
```

2. At the step level, by passing a dictionary to the Kubernetes decorator:

```python
@kubernetes(annotations={"first": "A", "second": "B"})
```
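Both configuration paths produce the same key-value mapping. As an illustration of the comma-separated `key=value` format that the environment variable uses, here is a minimal stand-alone parser sketch (the helper name and exact parsing rules are assumptions for illustration, not Metaflow's actual implementation):

```python
def parse_annotations(raw):
    """Parse a 'first=A,second=B' style string into a dict,
    mirroring the METAFLOW_KUBERNETES_ANNOTATIONS format."""
    annotations = {}
    for pair in raw.split(","):
        if not pair.strip():
            continue  # tolerate trailing commas / empty segments
        key, _, value = pair.partition("=")
        annotations[key.strip()] = value.strip()
    return annotations

print(parse_annotations("first=A,second=B"))  # {'first': 'A', 'second': 'B'}
```

The resulting dictionary has the same shape as the one passed directly to the `@kubernetes` decorator's `annotations` argument.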



What's Changed
* Adds custom annotations via env variables by tylerpotts in https://github.com/Netflix/metaflow/pull/1442
* Pass the user-defined executable to environment's `executable` by romain-intel in https://github.com/Netflix/metaflow/pull/1454
* Remove validate_environment from task lifecycle by savingoyal in https://github.com/Netflix/metaflow/pull/1507
* Fix/863 - Improve error message in metaflow.S3 class when DATATOOLS_S3ROOT is not configured. by tfurmston in https://github.com/Netflix/metaflow/pull/1491
* Fix an issue where 0 was not considered False for extension debug opt… by romain-intel in https://github.com/Netflix/metaflow/pull/1511
* Bump version to 2.9.12 by saikonen in https://github.com/Netflix/metaflow/pull/1514


**Full Changelog**: https://github.com/Netflix/metaflow/compare/2.9.11...2.9.12

2.9.11

Bug Fix

Fix regression for batch decorator introduced by v2.9.10
This release reverts a validation fix introduced in 2.9.10 that prevented Metaflow tasks from executing on AWS Batch.

What's Changed
* Revert "fix: validate required configuration for Batch" by savingoyal in https://github.com/Netflix/metaflow/pull/1486
* Bump version to 2.9.11 by savingoyal in https://github.com/Netflix/metaflow/pull/1487

**Full Changelog**: https://github.com/Netflix/metaflow/compare/2.9.10...2.9.11

2.9.10

Features

Introduce PagerDuty support for workflows running on Argo Workflows
With this release, Metaflow users can get events on PagerDuty when their workflows succeed or fail on Argo Workflows.
Setting up the notifications is similar to the existing Slack notifications support:

1. Follow [these](https://support.pagerduty.com/docs/services-and-integrations#create-a-generic-events-api-integration) instructions on PagerDuty to set up an Events API V2 integration for your PagerDuty service.
2. You should be able to view the required integration key in the Events API V2 dropdown.
3. To enable notifications on PagerDuty when your Metaflow flow running on Argo Workflows succeeds or fails, deploy it using the --notify-on-error or --notify-on-success flags:

```sh
python flow.py argo-workflows create --notify-on-error --notify-on-success --notify-pager-duty-integration-key <pager-duty-integration-key>
```

4. You can also set the following environment variable instead of specifying --notify-pager-duty-integration-key on the CLI every time:

```sh
METAFLOW_ARGO_WORKFLOWS_CREATE_NOTIFY_PAGER_DUTY_INTEGRATION_KEY=<pager-duty-integration-key>
```

5. The next time the flow fails or succeeds, you should receive a new event on PagerDuty under Incidents (flow failed) or Changes (flow succeeded).
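For context on what an Events API V2 integration receives, here is a sketch of the payload shape that API expects. The field names follow PagerDuty's public Events API V2 schema; the helper itself is illustrative and is not Metaflow's implementation:

```python
import json

def pagerduty_event(integration_key, flow_name, succeeded):
    """Build a PagerDuty Events API V2 trigger payload (illustrative)."""
    return {
        # The routing key is the integration key from the Events API V2 dropdown.
        "routing_key": integration_key,
        "event_action": "trigger",
        "payload": {
            "summary": f"Flow {'succeeded' if succeeded else 'failed'}: {flow_name}",
            "source": "metaflow",
            "severity": "info" if succeeded else "error",
        },
    }

print(json.dumps(pagerduty_event("<pager-duty-integration-key>", "MyFlow", succeeded=False), indent=2))
```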

What's Changed
* fix: validate required configuration for Batch by saikonen in https://github.com/Netflix/metaflow/pull/1483
* feature: add PagerDuty support for Argo Workflows by saikonen in https://github.com/Netflix/metaflow/pull/1478
* Bump version to 2.9.10 by saikonen in https://github.com/Netflix/metaflow/pull/1484


**Full Changelog**: https://github.com/Netflix/metaflow/compare/2.9.9...2.9.10

2.9.9

Improvements

Fixes a bug in S3 operations affecting `conda` with some S3 providers
This release fixes a bug in the `conda` bootstrapping process. An issue with the `ServerSideEncryption` support affected some S3 operations when using S3 providers that do not implement the encryption headers (for example, MinIO).
The affected operations were all those that handle multiple files at once:

`get_many` / `get_all` / `get_recursive` / `put_many` / `info_many`

which are used as part of bootstrapping a `conda` environment when executing remotely.
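The general pattern behind this kind of fix is to attach the encryption header only when encryption is actually configured, so providers that do not implement it never see it. A minimal sketch, assuming a hypothetical environment variable name (not necessarily Metaflow's actual configuration or fix):

```python
import os

def s3_extra_args():
    """Return extra keyword arguments for S3 calls, attaching the
    ServerSideEncryption header only when encryption is configured.
    Illustrative sketch; the variable name is an assumption."""
    encryption = os.environ.get("METAFLOW_S3_SERVER_SIDE_ENCRYPTION")
    if encryption:
        return {"ServerSideEncryption": encryption}
    # Providers such as MinIO may not implement the encryption headers,
    # so omit them entirely when not configured.
    return {}
```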


What's Changed
* fix: s3 op bug with ServerSideEncryption by saikonen in https://github.com/Netflix/metaflow/pull/1479
* Bump version to 2.9.9 by saikonen in https://github.com/Netflix/metaflow/pull/1480


**Full Changelog**: https://github.com/Netflix/metaflow/compare/2.9.8...2.9.9

2.9.8

Improvements

Fixes a bug with Argo events parameters
This release fixes an issue with mapping values that contain spaces from the Argo events payload to flow parameters.
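The failure mode here is a classic one: a value with spaces splits into multiple tokens when interpolated unquoted into a command line or template. A stand-alone sketch of the problem and the quoting that avoids it (illustrative only, not the actual Argo events fix):

```python
import shlex

def parameter_flag(name, value):
    """Render a `--name value` CLI fragment, quoting the value so
    that spaces survive shell interpolation. Illustrative sketch."""
    return f"--{name} {shlex.quote(str(value))}"

print(parameter_flag("message", "hello world"))  # --message 'hello world'
```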

What's Changed
* sanitize / in secret names by oavdeev in https://github.com/Netflix/metaflow/pull/1470
* chore: upgrade packages in cards plugin by saikonen in https://github.com/Netflix/metaflow/pull/1473
* fix: Argo events parameters with spaces by saikonen in https://github.com/Netflix/metaflow/pull/1475
* allow to customize env var name in `secrets` by oavdeev in https://github.com/Netflix/metaflow/pull/1474
* Bump version to 2.9.8 by saikonen in https://github.com/Netflix/metaflow/pull/1476


**Full Changelog**: https://github.com/Netflix/metaflow/compare/2.9.7...2.9.8

2.9.7

Features

New commands for managing Argo Workflows through the CLI
This release includes new commands for managing workflows on Argo Workflows.
When needed, commands can be authorized by supplying a production token with `--authorize`.

`argo-workflows delete`
A deployed workflow can be deleted through the CLI with

```sh
python flow.py argo-workflows delete
```


`argo-workflows terminate`
A run can be terminated mid-execution through the CLI with

```sh
python flow.py argo-workflows terminate RUN_ID
```


`argo-workflows suspend/unsuspend`
A run can be suspended temporarily with

```sh
python flow.py argo-workflows suspend RUN_ID
```

Note that a suspended flow will show up as failed in the Metaflow UI after a while, because suspending also pauses the heartbeat process. Unsuspending resumes the flow, and its status will show as running again. This can be done with

```sh
python flow.py argo-workflows unsuspend RUN_ID
```




Improvements

Faster Job completion checks for Kubernetes
Previously, the status of tasks running on Kubernetes was determined from the pod status, which can take a while to update after the last container finishes. This release changes the status checks to use container statuses directly instead.
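To illustrate the difference, here is a sketch of deciding completion from the container statuses inside a pod's `status` object rather than from its `phase`. The dictionary shape mirrors the Kubernetes Pod API; the function itself is illustrative, not Metaflow's implementation:

```python
def job_finished(pod_status):
    """Decide completion from container statuses rather than the pod
    phase, which can lag behind. `pod_status` mirrors the shape of the
    Kubernetes Pod API's `status` field. Illustrative sketch."""
    statuses = pod_status.get("containerStatuses", [])
    if not statuses:
        return False
    # The job is done once every container has terminated, even if the
    # pod phase has not yet been updated.
    return all("terminated" in (s.get("state") or {}) for s in statuses)

running = {"phase": "Running",
           "containerStatuses": [{"state": {"running": {}}}]}
done = {"phase": "Running",  # pod phase not yet updated
        "containerStatuses": [{"state": {"terminated": {"exitCode": 0}}}]}
print(job_finished(running), job_finished(done))  # False True
```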

What's Changed
* Job completion check based on container status. by shrinandj in https://github.com/Netflix/metaflow/pull/1369
* feature: add argo workflows suspend command by saikonen in https://github.com/Netflix/metaflow/pull/1420
* feature: add delete and terminate for argo workflows by saikonen in https://github.com/Netflix/metaflow/pull/1307
* Bump version to 2.9.7 by saikonen in https://github.com/Netflix/metaflow/pull/1467


**Full Changelog**: https://github.com/Netflix/metaflow/compare/2.9.6...2.9.7
