## Features

### [Introduce capability to schedule Metaflow flows with Apache Airflow](https://docs.metaflow.org/production/scheduling-metaflow-flows/scheduling-with-airflow)
With this release, we are introducing an integration with [Apache Airflow](https://airflow.apache.org/), similar to our integrations with [AWS Step Functions](https://aws.amazon.com/step-functions/) and [Argo Workflows](https://argoproj.github.io/argo-workflows/). Metaflow users can [easily deploy & schedule their DAGs](https://docs.metaflow.org/going-to-production-with-metaflow/scheduling-metaflow-flows) simply by executing

```
python myflow.py airflow create mydag.py
```

which creates an Airflow DAG for them. With this feature, Metaflow users can now enjoy all the features of Metaflow on top of Apache Airflow, including a more user-friendly and productive development API for data scientists and data engineers, without needing to change anything in their existing pipelines or operational playbooks, as described in [the announcement blog post](https://outerbounds.com/blog/better-airflow-with-metaflow/). To learn how to deploy and operate the integration, see [Using Airflow with Metaflow](https://outerbounds.com/engineering/operations/airflow/).
When running on Airflow, Metaflow code works exactly as it does locally: no changes are required in the code. With this integration, Metaflow users can [inspect](https://docs.metaflow.org/metaflow/client) their flows deployed on Apache Airflow as before, and [debug and reproduce](https://docs.metaflow.org/metaflow/debugging#reproducing-production-issues-locally) results from Apache Airflow on their local laptop or within a notebook. All tasks run on Kubernetes, respecting the `@resources` decorator as if the `@kubernetes` decorator were added to all steps, as explained in [Executing Tasks Remotely](https://docs.metaflow.org/scaling/remote-tasks/introduction#safeguard-flags).
The main benefits of using Metaflow with Airflow are:
- You get to use the human-friendly API of Metaflow to define and test workflows. Almost all features of Metaflow work with Airflow out of the box. The exceptions are nested _foreaches_, which are not yet supported by Airflow, and `@batch`, as the current integration only supports `@kubernetes` for remote execution.
- You can deploy Metaflow flows to your existing Airflow server without having to change anything operationally. From Airflow's point of view, Metaflow flows look like any other Airflow DAG.
- If you want to consider moving to another orchestrator supported by Metaflow, you can test the alternatives easily by changing a single command to deploy the same flow to [Argo Workflows](https://docs.metaflow.org/production/scheduling-metaflow-flows/scheduling-with-argo-workflows) or [AWS Step Functions](https://docs.metaflow.org/production/scheduling-metaflow-flows/scheduling-with-aws-step-functions).
If you need any assistance or have feedback for us, ping us at [chat.metaflow.org](http://chat.metaflow.org) or open a GitHub issue.