Mage-ai

0.7.74

Data integration

New sources

- [Facebook Ads](https://github.com/mage-ai/mage-ai/blob/master/mage_integrations/mage_integrations/sources/facebook_ads/README.md)
- [HubSpot](https://github.com/mage-ai/mage-ai/blob/master/mage_integrations/mage_integrations/sources/hubspot/README.md)
- [Postmark](https://github.com/mage-ai/mage-ai/blob/master/mage_integrations/mage_integrations/sources/postmark/README.md)

Improvements to existing sources and destinations

- S3 source
  - Automatically add an `_s3_last_modified` column from the `LastModified` key, and enable `_s3_last_modified` as a bookmark property.
  - Allow filtering objects with regex syntax via the `search_pattern` key.
  - Support multiple streams by configuring a list of table configs in the `table_configs` key. [https://github.com/mage-ai/mage-ai/blob/master/mage_integrations/mage_integrations/sources/amazon_s3/README.md](https://github.com/mage-ai/mage-ai/blob/master/mage_integrations/mage_integrations/sources/amazon_s3/README.md)
- Postgres source log-based replication
  - Automatically add a `_mage_deleted_at` column to record when the source row was deleted.
  - When the operation is an update and the unique-conflict method is `ignore`, create a new record in the destination.
- In a source or destination YAML config, interpolate secret values from AWS Secrets Manager using the syntax `{{ aws_secret_var('some_name_for_secret') }}`. Here is the full guide: [https://docs.mage.ai/production/configuring-production-settings/secrets#yaml](https://docs.mage.ai/production/configuring-production-settings/secrets#yaml)
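Putting the S3 options and secret interpolation together, a source config could look like the sketch below. Only `search_pattern`, `table_configs`, and `aws_secret_var` come from the notes above; the other key names, values, and secret names are illustrative assumptions, so check the S3 source README for the exact schema:

```yaml
bucket: my-data-bucket
aws_access_key_id: "{{ aws_secret_var('s3_access_key_id') }}"
aws_secret_access_key: "{{ aws_secret_var('s3_secret_access_key') }}"
table_configs:
  - prefix: exports/users/
    search_pattern: ".*\\.csv$"
  - prefix: exports/orders/
    search_pattern: ".*\\.csv$"
```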

Full lists of available sources and destinations can be found here:

- Sources: [https://docs.mage.ai/data-integrations/overview#available-sources](https://docs.mage.ai/data-integrations/overview#available-sources)
- Destinations: [https://docs.mage.ai/data-integrations/overview#available-destinations](https://docs.mage.ai/data-integrations/overview#available-destinations)

Customize pipeline alerts

Customize alerts to send only on selected events (pipeline failure, success, or passed SLA) via the `alert_on` config:

```yaml
notification_config:
  alert_on:
    - trigger_failure
    - trigger_passed_sla
    - trigger_success
```


Here are the guides for configuring alerts:

- Email alerts: [https://docs.mage.ai/production/observability/alerting-email#create-notification-config](https://docs.mage.ai/production/observability/alerting-email#create-notification-config)
- Slack alerts: [https://docs.mage.ai/production/observability/alerting-slack#update-mage-project-settings](https://docs.mage.ai/production/observability/alerting-slack#update-mage-project-settings)
- Teams alerts: [https://docs.mage.ai/production/observability/alerting-teams#update-mage-project-settings](https://docs.mage.ai/production/observability/alerting-teams#update-mage-project-settings)

Deploy Mage on AWS using AWS Cloud Development Kit (CDK)

Besides Terraform scripts, Mage now also supports managing AWS cloud resources with the [AWS Cloud Development Kit](https://aws.amazon.com/cdk/) in TypeScript.

Follow this [guide](https://github.com/mage-ai/mage-ai-cdk/tree/master/typescript) to deploy the Mage app to AWS using AWS CDK scripts.

Stitch integration

Mage can orchestrate sync jobs in Stitch via API integration. Check out the [guide](https://docs.mage.ai/integrations/stitch) to learn how to trigger Stitch jobs and poll their statuses.
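Under the hood, triggering a Stitch job boils down to one authenticated POST against the Stitch Connect API. The sketch below builds that request with the standard library; the endpoint path and Bearer-token header follow the public Stitch API docs, but Mage's integration handles this for you, so treat it as illustration only.

```python
import urllib.request

STITCH_API = 'https://api.stitchdata.com'


def build_sync_request(source_id: int, api_token: str) -> urllib.request.Request:
    """Build the POST request that starts a replication job for a Stitch source.

    The endpoint path (`/v4/sources/{source_id}/sync`) and Bearer-token header
    follow the public Stitch Connect API docs; verify them for your account.
    """
    return urllib.request.Request(
        url=f'{STITCH_API}/v4/sources/{source_id}/sync',
        method='POST',
        headers={
            'Authorization': f'Bearer {api_token}',
            'Content-Type': 'application/json',
        },
    )


# To actually fire the request (requires a valid Stitch API token):
# with urllib.request.urlopen(build_sync_request(12345, 'my-token')) as resp:
#     print(resp.status)
```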

Bug fixes & polish

- Allow pressing `escape` key to close error message popup instead of having to click on the `x` button in the top right corner.
- If a data integration source has multiple streams, select all streams with one click instead of individually selecting every single stream.

![Untitled](https://media.graphassets.com/output=format:jpg/resize=height:800,fit:max/x1bIQdOZRc6jgI0Kt96n)

- Make pipeline runs pages (both the overall `/pipeline-runs` and the trigger `/pipelines/[trigger]/runs` pages) more efficient by avoiding individual requests for pipeline schedules (i.e. triggers).
- To avoid confusion when using drag-and-drop to add dependencies between blocks in the dependency tree, the ports (white circles) on other blocks now disappear while a drag is active. Dependency lines must be dragged from one block’s port onto another block itself, not onto that block’s port, which is what some users were doing previously.
- Fix positioning of newly added blocks. Previously, a new block with a custom name was added to the bottom of the pipeline; new blocks now appear immediately after the block they were added from.
- Popup error messages now include both the stack trace and the traceback to help with debugging (previously the traceback was missing).
- Update links to docs in code block comments (links were broken due to recent docs migration to a different platform).

View full [Changelog](https://www.notion.so/mageai/What-s-new-7cc355e38e9c42839d23fdbef2dabd2c)

0.1.1

Example MongoDB connection profile in `io_config.yaml`:

```yaml
default:
  MONGODB_DATABASE: database
  MONGODB_HOST: host
  MONGODB_PASSWORD: password
  MONGODB_PORT: 27017
  MONGODB_COLLECTION: collection
  MONGODB_USER: user
```


**Data loader template**

![Untitled](https://media.graphassets.com/NXOk8CMgQcSksVVkcfvV)

**Data exporter template**

![Untitled](https://media.graphassets.com/CqqozYjqQgqpNbEUsX8a)

Support using `renv` for R blocks

`renv` is installed in the Mage Docker image by default. You can use the `renv` package to manage R dependencies for your project.

Doc for `renv` package: [https://cran.r-project.org/web/packages/renv/vignettes/renv.html](https://cran.r-project.org/web/packages/renv/vignettes/renv.html)

Run **streaming pipelines in a separate k8s pod**

Support running streaming pipelines with the k8s executor to scale up streaming pipeline execution.

Configure it in the pipeline's metadata.yaml with the `executor_type` field. Here is an example:

```yaml
blocks:
  - ...
  - ...
executor_count: 1
executor_type: k8s
name: test_streaming_kafka_kafka
uuid: test_streaming_kafka_kafka
```


When a pipeline run is cancelled in the Mage UI, Mage will kill the k8s job.

DBT support for Spark

Support running Spark DBT models in Mage. Currently, only the `session` connection method is supported.

Follow this [doc](https://docs.mage.ai/integrations/spark-pyspark#standalone-spark-cluster) to set up a Spark environment in Mage, and the instructions in [https://docs.mage.ai/tutorials/setup-dbt](https://docs.mage.ai/tutorials/setup-dbt) to set up DBT. Here is an example DBT Spark `profiles.yml`:

```yaml
spark_demo:
  target: dev
  outputs:
    dev:
      type: spark
      method: session
      schema: default
      host: local
```


Doc for **staging/production deployment**

- Add a doc for setting up a CI/CD pipeline to deploy Mage to staging and production environments: [https://docs.mage.ai/production/ci-cd/staging-production/github-actions](https://docs.mage.ai/production/ci-cd/staging-production/github-actions)
- Provide an example GitHub Actions template for deployment on AWS ECS: [https://github.com/mage-ai/mage-ai/blob/master/templates/github_actions/build_and_deploy_to_aws_ecs_staging_production.yml](https://github.com/mage-ai/mage-ai/blob/master/templates/github_actions/build_and_deploy_to_aws_ecs_staging_production.yml)

**Enable user authentication for multi-development environment**

The multi-development environment, which is used to manage development instances in the cloud, now goes through the user authentication flow.

Doc for multi-development environment: [https://docs.mage.ai/developing-in-the-cloud/cloud-dev-environments/overview](https://docs.mage.ai/developing-in-the-cloud/cloud-dev-environments/overview)

Refined dependency graph

- Add buttons for zooming in/out of and resetting dependency graph.
- Add shortcut to reset dependency graph view (double-clicking anywhere on the canvas).

![zooming dep graph.gif](https://media.graphassets.com/OSZbndpKR8m450c5oYrg)

**Add new block with downstream blocks connected**

- If a new block is added between two blocks, include the downstream connection.
- If a new block is added after a block that has multiple downstream blocks, the downstream connections will not be added since it is unclear which downstream connections should be made.
- Hide Add transformer block button in integration pipelines if a transformer block already exists (Mage currently only supports 1 transformer block for integration pipelines).

Improve UI performance

- Reduce number of API requests made when refocusing browser.
- Decrease notebook CPU and memory consumption in the browser by removing unnecessary code block re-renders in Pipeline Editor.

**Add pre-commit and improve contributor friendliness**

Shout out to [Joseph Corrado](https://github.com/jcorrado76) for his contribution of adding pre-commit hooks to Mage to run code checks before committing and pushing the code.

Doc: [https://github.com/mage-ai/mage-ai/blob/master/README_dev.md](https://github.com/mage-ai/mage-ai/blob/master/README_dev.md)

**Create method for deleting secret keys**

Shout out to [hjhdaniel](https://github.com/hjhdaniel) for his contribution of adding the method for deleting secret keys to Mage.

Example code:

```python
from mage_ai.data_preparation.shared.secrets import delete_secret

delete_secret('secret_name')
```


**Retry block**

**Retry from selected block in integration pipeline**

If a block is selected in an integration pipeline to retry block runs, only the block runs for the selected block's stream will be run.

Automatic retry for blocks

Mage now automatically retries blocks twice on failures (3 total attempts).
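If the default behavior isn't what you want, Mage also documents a `retry_config` block for metadata.yaml to tune retries. The sketch below is a hedged example; the key names follow Mage's retry docs, but verify them against your Mage version before relying on them:

```yaml
retry_config:
  retries: 2                  # number of retry attempts after the first failure
  delay: 5                    # initial delay in seconds before retrying
  max_delay: 60               # cap on the delay between attempts
  exponential_backoff: true   # increase the delay after each failed attempt
```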

Other bug fixes & polish

- Display error popup with link to docs for “too many open files” error.

![image](https://media.graphassets.com/ViVTgPmzQ964QcuZyyu0)

- Fix the DBT block limit input field: the limit entered through the UI wasn’t taking effect when previewing model results. Fix this and set a default limit of 1000.
- Fix BigQuery table id issue for batch load.
- Fix unique conflict handling for BigQuery batch load.
- Remove startup_probe in GCP cloud run executor.
- Fix run command for AWS and GCP job runs so that job run logs can be shown in Mage UI correctly.
- Pass block configuration to `kwargs` in the method.
- Fix SQL block execution when using different schemas between upstream block and current block.

View full [Changelog](https://www.notion.so/What-s-new-7cc355e38e9c42839d23fdbef2dabd2c)
