Mage-ai

Latest version: v0.9.76


0.7.98

Data integration

New sources

- [Couchbase](https://github.com/mage-ai/mage-ai/blob/master/mage_integrations/mage_integrations/sources/couchbase/README.md)

Full lists of available sources and destinations can be found here:

- Sources: [https://docs.mage.ai/data-integrations/overview#available-sources](https://docs.mage.ai/data-integrations/overview#available-sources)
- Destinations: [https://docs.mage.ai/data-integrations/overview#available-destinations](https://docs.mage.ai/data-integrations/overview#available-destinations)

Improvements to existing sources and destinations

- Support the Delta Lake connector in the Trino destination.
- Fix the Outreach source bookmark comparison error.
- Fix the Facebook Ads source “User request limit reached” error.
- Print more HubSpot source sync log statements to give the user more information on progress and activity.

Databricks integration for Spark

Mage now supports building and running Spark pipelines with a remote Databricks Spark cluster.

Check out the [guide](https://docs.mage.ai/integrations/databricks) to learn how to use a Databricks Spark cluster with Mage.

![Untitled](https://media.graphassets.com/WkMFeGO0QKSsdbAOtbad)

RabbitMQ streaming source

Shout out to [Luis Salomão](https://github.com/Luishfs) for his contribution of adding the RabbitMQ streaming source to Mage! Check out the [doc](https://docs.mage.ai/guides/streaming/streaming-pipeline-rabbitmq) to set up a streaming pipeline with RabbitMQ source.

![Untitled](https://media.graphassets.com/VdX6fbhCQraolFzzn456)

DBT support for Trino

Support running Trino DBT models in Mage.

![Untitled](https://media.graphassets.com/CNCBwXQRCmw8ZxLL0dHQ)

More K8s support

- Allow customizing namespace by setting the `KUBE_NAMESPACE` environment variable.
- Support [K8s executor](https://docs.mage.ai/production/configuring-production-settings/compute-resource#kubernetes-executor) on AWS EKS cluster.
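As a minimal sketch of the namespace override, `KUBE_NAMESPACE` just needs to be set in the environment Mage runs in; the namespace name below is a placeholder:

```python
import os

# Placeholder namespace; per the bullet above, Mage's K8s executor
# reads the target namespace from the KUBE_NAMESPACE environment variable.
os.environ['KUBE_NAMESPACE'] = 'mage-pipelines'
print(os.environ['KUBE_NAMESPACE'])  # → mage-pipelines
```

In a real deployment this would typically be set in the container spec or shell profile rather than in Python code.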

Generic block

Add a generic block that can run in a pipeline, optionally accept inputs, and optionally return outputs, without being a data loader, data exporter, or transformer block.
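As an illustrative sketch (not the exact template Mage generates), a generic block is essentially a Python function that may receive the outputs of upstream blocks as inputs and may return a value for downstream blocks; in a real project, the generated block template wraps this in the appropriate decorator:

```python
# Hypothetical generic block body: upstream block outputs arrive as
# positional arguments; the return value is passed downstream.
def run_generic_block(*upstream_outputs, **kwargs):
    # Example logic: count the rows produced by all upstream blocks.
    total = sum(len(rows) for rows in upstream_outputs)
    return {'upstream_row_count': total}
```

For example, `run_generic_block([1, 2], [3])` returns `{'upstream_row_count': 3}`.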

![Untitled](https://media.graphassets.com/4x7cLH0gRtCg9N0CXDqc)

Other bug fixes & polish

- Support overriding runtime variables when clicking the Run now button on the triggers list page.

![Untitled](https://media.graphassets.com/bt1lsZ5yTLCfb9Z96MzH)

- Support MySQL SQL block
- Fix the serialization of columns containing a dictionary or a list of dictionaries when saving the output dataframe of a block.
- Allow selecting multiple partition keys for Delta Lake destination.
- Support copy and paste into/from Mage terminal.

View full [Changelog](https://www.notion.so/mageai/What-s-new-7cc355e38e9c42839d23fdbef2dabd2c)

0.7.90

Not secure
Data integration

New sources

- [Monday](https://github.com/mage-ai/mage-ai/tree/master/mage_integrations/mage_integrations/sources/monday)
- [Commercetools](https://github.com/mage-ai/mage-ai/tree/master/mage_integrations/mage_integrations/sources/commercetools)
- [Front app](https://github.com/mage-ai/mage-ai/tree/master/mage_integrations/mage_integrations/sources/front)

New destinations

- [Microsoft SQL Server (destination)](https://github.com/mage-ai/mage-ai/tree/master/mage_integrations/mage_integrations/destinations/mssql)

Full lists of available sources and destinations can be found here:

- Sources: [https://docs.mage.ai/data-integrations/overview#available-sources](https://docs.mage.ai/data-integrations/overview#available-sources)
- Destinations: [https://docs.mage.ai/data-integrations/overview#available-destinations](https://docs.mage.ai/data-integrations/overview#available-destinations)

Improvements to existing sources and destinations

- Trino destination
    - Support the `MERGE` command in the Trino connector to handle conflicts.
    - Allow customizing `query_max_length` to adjust batch size.
- MSSQL source
    - Fix datetime column conversion and comparison.
- BigQuery destination
    - Fix the BigQuery error “Deadline of 600.0s exceeded while calling target function”.
- Delta Lake destination
    - Upgrade the delta library from version 0.6.4 to 0.7.0 to fix some errors.
- Allow datetime columns to be used as bookmark properties.
- When clicking the apply button in the data integration schema table, don’t apply a change to a stream if the bookmark column is not a valid replication key for the table, or if the unique column is not a valid key property for the table.

New command line tool

Mage has a revamped command line tool, with better formatting, clearer help commands, and more informative error messages. Kudos to community member [jlondonobo](https://github.com/jlondonobo) for this awesome contribution!


DBT block improvements

- Support running Redshift DBT models in Mage.
- Raise an error if there is a DBT compilation error when running DBT blocks in a pipeline.
- Fix the duplicate DBT source names error when the same source name appears across multiple `mage_sources.yml` files in different model subfolders: use a single sources file for all models instead of nesting one in each subfolder.

Notebook improvements

- Support editing global variables in the UI: [https://docs.mage.ai/production/configuring-production-settings/runtime-variable#in-mage-editor](https://docs.mage.ai/production/configuring-production-settings/runtime-variable#in-mage-editor)
- Support creating or editing global variables in code by editing the pipeline `metadata.yaml` file: [https://docs.mage.ai/production/configuring-production-settings/runtime-variable#in-code](https://docs.mage.ai/production/configuring-production-settings/runtime-variable#in-code)
- Add a save file button when editing a file outside the pipeline notebook.
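As a sketch of the in-code approach, the pipeline `metadata.yaml` carries a `variables` mapping (the variable names and values below are placeholders; see the runtime-variables doc for the exact schema in your Mage version):

```yaml
# Hypothetical pipeline metadata.yaml fragment defining global variables.
variables:
  env: staging
  rows_limit: 1000
```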


- Support Windows keyboard shortcuts: CTRL+S to save files.
- Support uploading files through UI.

![Untitled](https://media.graphassets.com/output=format:jpg/resize=height:800,fit:max/BnSxgNEkRCSFOQaFt3HH)

Store logs in GCP Cloud Storage bucket

Besides storing logs on local disk or in AWS S3, Mage now adds the option to store logs in GCP Cloud Storage by adding a logging config to the project’s `metadata.yaml` like below:

```yaml
logging_config:
  type: gcs
  level: INFO
  destination_config:
    path_to_credentials: <path to GCP credentials JSON file>
    bucket: <bucket name>
    prefix: <prefix path>
```


Check out the doc for details: [https://docs.mage.ai/production/observability/logging#google-cloud-storage](https://docs.mage.ai/production/observability/logging#google-cloud-storage)

Other bug fixes & improvements

- SQL block improvements
    - Support writing raw SQL to customize the create table and insert commands.
    - Allow editing SQL block output table names.
- Support loading files from a directory when using `mage_ai.io.file.FileIO`. Example:

```python
from mage_ai.io.file import FileIO

file_directories = ['default_repo/csvs']
FileIO().load(file_directories=file_directories)
```


View full [Changelog](https://www.notion.so/mageai/What-s-new-7cc355e38e9c42839d23fdbef2dabd2c)

0.7.84


0.7.74

Data integration

New sources

- [Facebook Ads](https://github.com/mage-ai/mage-ai/blob/master/mage_integrations/mage_integrations/sources/facebook_ads/README.md)
- [HubSpot](https://github.com/mage-ai/mage-ai/blob/master/mage_integrations/mage_integrations/sources/hubspot/README.md)
- [Postmark](https://github.com/mage-ai/mage-ai/blob/master/mage_integrations/mage_integrations/sources/postmark/README.md)

Improvements to existing sources and destinations

- S3 source
    - Automatically add a `_s3_last_modified` column from the `LastModified` key, and enable `_s3_last_modified` as a bookmark property.
    - Allow filtering objects with regex syntax by configuring the `search_pattern` key.
    - Support multiple streams by configuring a list of table configs under the `table_configs` key: [https://github.com/mage-ai/mage-ai/blob/master/mage_integrations/mage_integrations/sources/amazon_s3/README.md](https://github.com/mage-ai/mage-ai/blob/master/mage_integrations/mage_integrations/sources/amazon_s3/README.md)
- Postgres source log-based replication
    - Automatically add a `_mage_deleted_at` column to record the source row deletion time.
    - When the operation is an update and the unique conflict method is ignore, create a new record in the destination.
- In a source or destination YAML config, interpolate secret values from AWS Secrets Manager using the syntax `{{ aws_secret_var('some_name_for_secret') }}`. Here is the full guide: [https://docs.mage.ai/production/configuring-production-settings/secrets#yaml](https://docs.mage.ai/production/configuring-production-settings/secrets#yaml)
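As a sketch, a destination YAML config might interpolate a password from AWS Secrets Manager like this; the surrounding key names are illustrative placeholders, and only the `aws_secret_var` syntax comes from the guide above:

```yaml
# Illustrative config fragment; only the aws_secret_var
# interpolation syntax is taken from the Mage secrets guide.
config:
  host: example-db.internal
  username: app_user
  password: "{{ aws_secret_var('some_name_for_secret') }}"
```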

Full lists of available sources and destinations can be found here:

- Sources: [https://docs.mage.ai/data-integrations/overview#available-sources](https://docs.mage.ai/data-integrations/overview#available-sources)
- Destinations: [https://docs.mage.ai/data-integrations/overview#available-destinations](https://docs.mage.ai/data-integrations/overview#available-destinations)

Customize pipeline alerts

Customize alerts to send only when a pipeline fails or succeeds (or both) via the `alert_on` config:

```yaml
notification_config:
  alert_on:
    - trigger_failure
    - trigger_passed_sla
    - trigger_success
```


Here are the guides for configuring alerts:

- Email alerts: [https://docs.mage.ai/production/observability/alerting-email#create-notification-config](https://docs.mage.ai/production/observability/alerting-email#create-notification-config)
- Slack alerts: [https://docs.mage.ai/production/observability/alerting-slack#update-mage-project-settings](https://docs.mage.ai/production/observability/alerting-slack#update-mage-project-settings)
- Teams alerts: [https://docs.mage.ai/production/observability/alerting-teams#update-mage-project-settings](https://docs.mage.ai/production/observability/alerting-teams#update-mage-project-settings)

Deploy Mage on AWS using AWS Cloud Development Kit (CDK)

Besides using Terraform scripts to deploy Mage to the cloud, Mage now also supports managing AWS cloud resources with the [AWS Cloud Development Kit](https://aws.amazon.com/cdk/) in TypeScript.

Follow this [guide](https://github.com/mage-ai/mage-ai-cdk/tree/master/typescript) to deploy the Mage app to AWS using AWS CDK scripts.

Stitch integration

Mage can orchestrate sync jobs in Stitch via its API integration. Check out the [guide](https://docs.mage.ai/integrations/stitch) to learn how to trigger jobs in Stitch and poll their statuses.

Bug fixes & polish

- Allow pressing `escape` key to close error message popup instead of having to click on the `x` button in the top right corner.
- If a data integration source has multiple streams, select all streams with one click instead of individually selecting every single stream.

![Untitled](https://media.graphassets.com/output=format:jpg/resize=height:800,fit:max/x1bIQdOZRc6jgI0Kt96n)

- Make pipeline runs pages (both the overall `/pipeline-runs` and the trigger `/pipelines/[trigger]/runs` pages) more efficient by avoiding individual requests for pipeline schedules (i.e. triggers).
- To avoid confusion when using drag and drop to add dependencies between blocks in the dependency tree, the ports (white circles) on other blocks now disappear while a drag is active. Dependency lines must be dragged from one block’s port onto another block itself, not onto that block’s port, which is what some users were doing previously.
- Fix positioning of newly added blocks. Previously, a new block added with a custom name was appended to the bottom of the pipeline; new blocks now appear immediately after the block where they were added.
- Popup error messages now include both the stack trace and the traceback to help with debugging (previously the traceback was missing).
- Update links to docs in code block comments (links were broken due to the recent docs migration to a different platform).

View full [Changelog](https://www.notion.so/mageai/What-s-new-7cc355e38e9c42839d23fdbef2dabd2c)

0.1.1

```yaml
default:
  MONGODB_DATABASE: database
  MONGODB_HOST: host
  MONGODB_PASSWORD: password
  MONGODB_PORT: 27017
  MONGODB_COLLECTION: collection
  MONGODB_USER: user
```


**Data loader template**

![Untitled](https://media.graphassets.com/NXOk8CMgQcSksVVkcfvV)

**Data exporter template**

![Untitled](https://media.graphassets.com/CqqozYjqQgqpNbEUsX8a)

Support using `renv` for R blocks

`renv` is installed in the Mage Docker image by default. Users can use the `renv` package to manage R dependencies for their projects.

Doc for `renv` package: [https://cran.r-project.org/web/packages/renv/vignettes/renv.html](https://cran.r-project.org/web/packages/renv/vignettes/renv.html)

**Run streaming pipeline in separate k8s pod**

Support running streaming pipelines with the k8s executor to scale up streaming pipeline execution.

Configure this in the pipeline `metadata.yaml` with the `executor_type` field. Here is an example:

```yaml
blocks:
- ...
- ...
executor_count: 1
executor_type: k8s
name: test_streaming_kafka_kafka
uuid: test_streaming_kafka_kafka
```


When a pipeline run is cancelled in the Mage UI, Mage kills the k8s job.

DBT support for Spark

Support running Spark DBT models in Mage. Currently, only the connection method `session` is supported.

Follow this [doc](https://docs.mage.ai/integrations/spark-pyspark#standalone-spark-cluster) to set up a Spark environment in Mage, and follow the instructions in [https://docs.mage.ai/tutorials/setup-dbt](https://docs.mage.ai/tutorials/setup-dbt) to set up DBT. Here is an example DBT Spark `profiles.yml`:

```yaml
spark_demo:
  target: dev
  outputs:
    dev:
      type: spark
      method: session
      schema: default
      host: local
```


**Doc for staging/production deployment**

- Add a doc for setting up the CI/CD pipeline to deploy Mage to staging and production environments: [https://docs.mage.ai/production/ci-cd/staging-production/github-actions](https://docs.mage.ai/production/ci-cd/staging-production/github-actions)
- Provide an example GitHub Actions template for deployment on AWS ECS: [https://github.com/mage-ai/mage-ai/blob/master/templates/github_actions/build_and_deploy_to_aws_ecs_staging_production.yml](https://github.com/mage-ai/mage-ai/blob/master/templates/github_actions/build_and_deploy_to_aws_ecs_staging_production.yml)

**Enable user authentication for multi-development environment**

Update the multi-development environment to go through the user authentication flow. The multi-development environment is used to manage development instances in the cloud.

Doc for multi-development environment: [https://docs.mage.ai/developing-in-the-cloud/cloud-dev-environments/overview](https://docs.mage.ai/developing-in-the-cloud/cloud-dev-environments/overview)

Refined dependency graph

- Add buttons for zooming in/out of and resetting dependency graph.
- Add shortcut to reset dependency graph view (double-clicking anywhere on the canvas).

![zooming dep graph.gif](https://media.graphassets.com/OSZbndpKR8m450c5oYrg)

**Add new block with downstream blocks connected**

- If a new block is added between two blocks, include the downstream connection.
- If a new block is added after a block that has multiple downstream blocks, the downstream connections will not be added since it is unclear which downstream connections should be made.
- Hide Add transformer block button in integration pipelines if a transformer block already exists (Mage currently only supports 1 transformer block for integration pipelines).

Improve UI performance

- Reduce number of API requests made when refocusing browser.
- Decrease notebook CPU and memory consumption in the browser by removing unnecessary code block re-renders in Pipeline Editor.

**Add pre-commit and improve contributor friendliness**

Shout out to [Joseph Corrado](https://github.com/jcorrado76) for his contribution of adding pre-commit hooks to Mage to run code checks before committing and pushing the code.

Doc: [https://github.com/mage-ai/mage-ai/blob/master/README_dev.md](https://github.com/mage-ai/mage-ai/blob/master/README_dev.md)

**Create method for deleting secret keys**

Shout out to [hjhdaniel](https://github.com/hjhdaniel) for his contribution of adding the method for deleting secret keys to Mage.

Example code:

```python
from mage_ai.data_preparation.shared.secrets import delete_secret

delete_secret('secret_name')
```


**Retry block**

**Retry from selected block in integration pipeline**

If a block is selected in an integration pipeline to retry block runs, only the block runs for the selected block’s stream will be run.

Automatic retry for blocks

Mage now automatically retries blocks twice on failure (3 total attempts).
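The retry policy above (two retries on failure, three total attempts) can be sketched as a plain wrapper; this illustrates the behavior, not Mage's internal implementation:

```python
def run_with_retries(block_fn, max_attempts=3):
    """Run block_fn, retrying on failure up to max_attempts total attempts."""
    for attempt in range(1, max_attempts + 1):
        try:
            return block_fn()
        except Exception:
            if attempt == max_attempts:
                raise  # out of attempts: surface the failure
```

For example, a block that fails twice and then succeeds completes on the third attempt instead of failing the pipeline run.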

Other bug fixes & polish

- Display error popup with link to docs for “too many open files” error.

![image](https://media.graphassets.com/ViVTgPmzQ964QcuZyyu0)

- Fix the DBT block limit input field: the limit entered through the UI wasn’t taking effect when previewing model results. Also set a default limit of 1000.
- Fix BigQuery table id issue for batch load.
- Fix unique conflict handling for BigQuery batch load.
- Remove startup_probe in GCP cloud run executor.
- Fix run command for AWS and GCP job runs so that job run logs can be shown in Mage UI correctly.
- Pass block configuration to `kwargs` in the method.
- Fix SQL block execution when using different schemas between upstream block and current block.

View full [Changelog](https://www.notion.so/What-s-new-7cc355e38e9c42839d23fdbef2dabd2c)
