Metaflow


2.9.6

Features

AWS Step Function state machines can now be deleted through the CLI
This release introduces the command `step-functions delete` for deleting state machines through the CLI.

For a regular flow:
```bash
python flow.py step-functions delete
```

For another user's project branch:
Comment out the `project` decorator in the flow file, as using `--name` with projects is not allowed.
```bash
python project_flow.py step-functions --name project_a.user.saikonen.ProjectFlow delete
```

For a production or custom branch flow:
```bash
python project_flow.py --production step-functions delete
```
or
```bash
python project_flow.py --branch custom step-functions delete
```

Add `--authorize PRODUCTION_TOKEN` to the command if you do not have the correct production token locally.

Improvements

Fixes a bug in the S3 server-side encryption feature with some S3-compatible providers
This release fixes an issue with the S3 server-side encryption support, where some S3-compatible providers do not respond with the expected encryption method in the payload. This bug specifically affected regular operation when using MinIO.

Fixes support for `--with environment` in Airflow
Fixes a bug with the Airflow support for environment variables, where the env values set in the environment decorator could get overwritten.
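The intended precedence can be illustrated with a plain dictionary merge (a sketch of the fixed behavior using made-up variable names, not Metaflow's actual implementation): values set via the `environment` decorator should win over the base task environment rather than be overwritten by it.

```python
# Base environment an Airflow task might start with (illustrative values).
base_env = {"PATH": "/usr/bin", "MY_VAR": "from-base"}

# Values a user sets via @environment(vars={...}) on the step (hypothetical).
decorator_env = {"MY_VAR": "from-decorator", "EXTRA": "1"}

# After the fix, decorator-provided values take precedence.
merged = {**base_env, **decorator_env}
print(merged["MY_VAR"])  # from-decorator
```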

What's Changed
* [bugfix] support `--with environment` in Airflow by valayDave in https://github.com/Netflix/metaflow/pull/1459
* feat: sfn delete workflow (with prod token validation and messaging) by stevenhoelscher, saikonen in https://github.com/Netflix/metaflow/pull/1379
* [bugfix]: Optional check for encryption in s3op response by valayDave in https://github.com/Netflix/metaflow/pull/1460
* Bump version to 2.9.6 by saikonen in https://github.com/Netflix/metaflow/pull/1461


**Full Changelog**: https://github.com/Netflix/metaflow/compare/2.9.5...2.9.6

2.9.5

Features

Ability to choose server side encryption method for S3 uploads
There is now the possibility to choose which server-side encryption method to use for S3 uploads by setting the environment variable `METAFLOW_S3_SERVER_SIDE_ENCRYPTION` to an appropriate value, for example `aws:kms` or `AES256`.
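For example, to make all of a flow's S3 uploads use SSE-KMS, the variable can be exported before running the flow (the choice of `aws:kms` here is just one of the supported values):

```shell
# Choose SSE-KMS for Metaflow's S3 uploads; use AES256 for SSE-S3 instead.
export METAFLOW_S3_SERVER_SIDE_ENCRYPTION=aws:kms
python flow.py run
```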

Improvements

Fixes double quotes with Parameters on Argo Workflows
This release fixes an issue where using parameters on Argo Workflows caused the values to be unnecessarily quoted.

In case you need any assistance or have feedback for us, ping us at [chat.metaflow.org](http://chat.metaflow.org/) or open a GitHub issue.

What's Changed
* feat: ability to use ServerSideEncryption for S3 uploads by zendesk-klross in https://github.com/Netflix/metaflow/pull/1436
* fix quoting issue with argo by savingoyal in https://github.com/Netflix/metaflow/pull/1456
* Bump version to 2.9.5 by saikonen in https://github.com/Netflix/metaflow/pull/1457

New Contributors
* zendesk-klross made their first contribution in https://github.com/Netflix/metaflow/pull/1436

**Full Changelog**: https://github.com/Netflix/metaflow/compare/2.9.4...2.9.5

2.9.4

Improvements

Fix using email addresses as usernames for Argo Workflows
Using an email address as the username when deploying with a `project` decorator to Argo Workflows is now possible. This release fixes an issue with some generated resources containing characters that are not permitted in names of Argo Workflow resources.

The `secrets` decorator now supports assuming roles
This release adds the capability to assume specific roles when accessing secrets with the `secrets` decorator. The role for accessing a secret can be defined in the following ways:

As a global default
Set the `METAFLOW_DEFAULT_SECRET_ROLE` environment variable; this role will then be assumed when accessing any secret specified in the decorator.
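For example (the role ARN below is a placeholder):

```shell
# Every secret accessed via the secrets decorator will be read
# after assuming this role.
export METAFLOW_DEFAULT_SECRET_ROLE=arn:aws:iam::123456789012:role/metaflow-secrets-reader
```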

As a global option in the decorator
This will assume the role `secret-iam-role` for accessing all of the secrets in the sources list.
```python
@secrets(
    sources=["first-secret-source", "second-secret-source"],
    role="secret-iam-role"
)
```


Or on a per-secret basis
Assuming a different role based on the secret in question is also possible:
```python
@secrets(
    sources=[
        {"type": "aws-secrets-manager", "id": "first-secret-source", "role": "first-secret-role"},
        {"type": "aws-secrets-manager", "id": "second-secret-source", "role": "second-secret-role"}
    ]
)
```


In case you need any assistance or have feedback for us, ping us at [chat.metaflow.org](http://chat.metaflow.org/) or open a GitHub issue.

What's Changed
* [OBP] support assuming roles to read secrets by jackie-ob in https://github.com/Netflix/metaflow/pull/1418
* fix two docstrings that make API docs unhappy by tuulos in https://github.com/Netflix/metaflow/pull/1441
* Properly validate a config value against the type of its default by romain-intel in https://github.com/Netflix/metaflow/pull/1426
* Add additional options to trigger and trigger_on_finish by romain-intel in https://github.com/Netflix/metaflow/pull/1398
* Wrap errors importing over the escape hatch as ImportError by romain-intel in https://github.com/Netflix/metaflow/pull/1446
* Setting default time for files in code package to Dec 3, 2019 by pjoshi30 in https://github.com/Netflix/metaflow/pull/1445
* Fix issue with handling of exceptions in the escape hatch by romain-intel in https://github.com/Netflix/metaflow/pull/1444
* fix: support email in argo workflow names by saikonen in https://github.com/Netflix/metaflow/pull/1448
* fix: email naming support for argo events by saikonen in https://github.com/Netflix/metaflow/pull/1450
* bump version to 2.9.4 by saikonen in https://github.com/Netflix/metaflow/pull/1451


**Full Changelog**: https://github.com/Netflix/metaflow/compare/2.9.3...2.9.4

2.9.3

Improvements

Ignore duplicate Metaflow Extensions packages
Duplicate Metaflow Extensions packages were not properly ignored in all cases. This release fixes this and will allow the loading of extensions even if they are present in duplicate form in your sys.path.

Fix package leaks for the environment escape
In some cases, packages from the outside environment (non Conda) could leak into the Conda environment when using the environment escape functionality. This release addresses this issue and ensures that no spurious packages are imported in the Conda environment.

In case you need any assistance or have feedback for us, ping us at [chat.metaflow.org](http://chat.metaflow.org) or open a GitHub issue.

What's Changed
* Update README.md by savingoyal in https://github.com/Netflix/metaflow/pull/1431
* Add labels and fix argo by dhpollack in https://github.com/Netflix/metaflow/pull/1360
* Update KubernetesDecorator class docstring to include persistent_volume_claims by tfurmston in https://github.com/Netflix/metaflow/pull/1435
* Properly ignore a duplicate metaflow extension package in sys.path by romain-intel in https://github.com/Netflix/metaflow/pull/1437
* Fix an issue with the escape hatch that could cause outside packages to "leak" by romain-intel in https://github.com/Netflix/metaflow/pull/1439
* Bump version to 2.9.3 by romain-intel in https://github.com/Netflix/metaflow/pull/1440


**Full Changelog**: https://github.com/Netflix/metaflow/compare/2.9.2...2.9.3

2.9.2

- Features
- [Introduce support for _image pull policy_ for kubernetes](#features.1)

Features
<a id='features.1'></a>Introduce support for _image pull policy_ for kubernetes
With this release, Metaflow users can specify [_image pull policy_](https://kubernetes.io/docs/concepts/containers/images/#image-pull-policy) for their workloads through the kubernetes decorator for Metaflow tasks.

```python
@kubernetes(image='foo:tag', image_pull_policy='Always')  # allowed values: Always, IfNotPresent, Never
@step
def train(self):
    ...
```


If an _image pull policy_ is not specified, and the tag for the container image is _:latest_ or the tag for the container image is not specified, _image pull policy_ is automatically set to _Always_.

If an _image pull policy_ is not specified, and the tag for the container image is specified as a value that is not _:latest_, _image pull policy_ is automatically set to _IfNotPresent_.
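The defaulting rules above can be sketched as a small helper (an illustration of the described behavior, not Metaflow's actual code; a registry address containing a port would need extra handling):

```python
def default_image_pull_policy(image, image_pull_policy=None):
    """Return the pull policy that would be used, per the rules above."""
    if image_pull_policy is not None:
        # An explicitly specified policy is used as-is.
        return image_pull_policy
    _, sep, tag = image.rpartition(":")
    if not sep or tag == "latest":
        # No tag, or the :latest tag: always pull.
        return "Always"
    # Any other explicit tag: pull only if the image is not already present.
    return "IfNotPresent"

print(default_image_pull_policy("foo"))        # Always
print(default_image_pull_policy("foo:latest")) # Always
print(default_image_pull_policy("foo:1.2.3"))  # IfNotPresent
```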


In case you need any assistance or have feedback for us, ping us at [chat.metaflow.org](http://chat.metaflow.org) or open a GitHub issue.

---

What's Changed
* introduce support for intra-cluster webhook url by savingoyal in https://github.com/Netflix/metaflow/pull/1417
* add and improve docstrings for event-triggering by tuulos in https://github.com/Netflix/metaflow/pull/1419
* Update readme by emattia in https://github.com/Netflix/metaflow/pull/1397
* Update README.md by savingoyal in https://github.com/Netflix/metaflow/pull/1422
* fix includefile for argo-workflows by savingoyal in https://github.com/Netflix/metaflow/pull/1428
* feature: support configuring image pull policy for Kubernetes and Argo Workflows by saikonen in https://github.com/Netflix/metaflow/pull/1427
* fix error message by savingoyal in https://github.com/Netflix/metaflow/pull/1429
* Update to 2.9.2 by savingoyal in https://github.com/Netflix/metaflow/pull/1430


**Full Changelog**: https://github.com/Netflix/metaflow/compare/2.9.1...2.9.2

2.9.1

- Features
- [Introduce Slack notifications support for workflows running on Argo Workflows](#features.1)

Features
<a id='features.1'></a>Introduce Slack notifications support for workflows running on Argo Workflows
With this release, Metaflow users can get notified on Slack when their workflows succeed or fail on Argo Workflows. Using this feature is quite straightforward:
1. Follow [these instructions](https://api.slack.com/messaging/webhooks) on Slack to set up incoming webhooks for your Slack workspace.
2. You should now have a webhook URL that Slack provides. Here is an example webhook:

https://hooks.slack.com/services/T0XXXXXXXXX/B0XXXXXXXXX/qZXXXXXX

3. To enable notifications on Slack when your Metaflow flow running on Argo Workflows succeeds or fails, deploy it using the _--notify-on-error_ or _--notify-on-success_ flags:

```bash
python flow.py argo-workflows create --notify-on-error --notify-on-success --notify-slack-webhook-url <slack-webhook-url>
```

4. You can also set `METAFLOW_ARGO_WORKFLOWS_CREATE_NOTIFY_SLACK_WEBHOOK_URL=<slack-webhook-url>` in your environment instead of specifying _--notify-slack-webhook-url_ on the CLI every time.
5. Next time your workflow succeeds or fails on Argo Workflows, you will get a helpful notification on Slack.
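Outside of Metaflow, the webhook itself can be smoke-tested with a tiny script (a sketch: the URL is the placeholder from step 2, and `build_payload`/`notify` are helper names made up for this example):

```python
import json
from urllib import request

# Placeholder webhook URL from step 2 -- replace with your own.
WEBHOOK_URL = "https://hooks.slack.com/services/T0XXXXXXXXX/B0XXXXXXXXX/qZXXXXXX"

def build_payload(text):
    # Slack incoming webhooks expect a JSON body with a "text" field.
    return json.dumps({"text": text}).encode("utf-8")

def notify(text, url=WEBHOOK_URL):
    # Slack responds with the plain body "ok" when the message is accepted.
    req = request.Request(url, data=build_payload(text),
                          headers={"Content-Type": "application/json"})
    with request.urlopen(req) as resp:
        return resp.read().decode("utf-8")
```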

FAQ

_I deployed my workflow following the instructions above, but I haven’t received any notifications yet?_

This issue may very well happen if you are running Kubernetes v1.24 or newer.

Since v1.24, Kubernetes stopped automatically creating a secret for every serviceAccount. Argo Workflows relies on the existence of these secrets to run lifecycle hooks responsible for the emission of these notifications.

Follow these steps to explicitly create a secret for the service account that is responsible for executing Argo Workflows steps:

1. Run the following command, replacing _service-account.name_ with the _serviceAccount_ in your deployment. Also change the name of the secret to correctly reflect the name of the _serviceAccount_ for which this secret is created.

```bash
cat <<EOF | kubectl apply -f -
apiVersion: v1
kind: Secret
metadata:
  name: default-sa-token # change according to the name of the sa
  annotations:
    kubernetes.io/service-account.name: default # replace with your sa
type: kubernetes.io/service-account-token
EOF
```

2. Edit the _serviceAccount_ object to add the name of the above secret to it. You can use _kubectl edit_ for this. The _serviceAccount_ yaml should look like the following:

```yaml
$ kubectl edit sa default -n mynamespace
...
apiVersion: v1
kind: ServiceAccount
metadata:
  creationTimestamp: "2023-05-05T20:58:58Z"
  name: default
  namespace: jobs-default
  resourceVersion: "6739507"
  uid: 4a708eff-d6ba-4dd8-80ee-8fb3c4c1e1c7
secrets:
- name: default-sa-token # should match the secret above
```

3. That’s it! Try executing your workflow again on Argo Workflows. If you are still running into issues, reach out to us!
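If notifications still do not arrive, you can double-check from the CLI that the token secret exists and is linked to the service account (the names below match the example above):

```shell
# The secret should exist and be of type kubernetes.io/service-account-token.
kubectl get secret default-sa-token -n mynamespace

# The service account should list the secret by name.
kubectl get sa default -n mynamespace -o jsonpath='{.secrets[*].name}'
```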


In case you need any assistance or have feedback for us, ping us at [chat.metaflow.org](http://chat.metaflow.org) or open a GitHub issue.

---

What's Changed
* feature: add argo events environment variables to `metaflow configure kubernetes` by saikonen in https://github.com/Netflix/metaflow/pull/1405
* handle whitespaces in argo events parameters by savingoyal in https://github.com/Netflix/metaflow/pull/1408
* Add back comment for argo workflows by savingoyal in https://github.com/Netflix/metaflow/pull/1409
* Support ArgoEvent object with kubernetes by savingoyal in https://github.com/Netflix/metaflow/pull/1410
* Print workflow template location as part of argo-workflows create by savingoyal in https://github.com/Netflix/metaflow/pull/1411


**Full Changelog**: https://github.com/Netflix/metaflow/compare/2.9.0...2.9.1
