Mage-ai

Latest version: v0.9.74

Page 9 of 10

0.8.15

Not secure
Allow pipeline to keep running even if other unrelated blocks fail

Mage pipelines used to stop running if any block run failed. A setting was added to continue running the pipeline even if a block fails during execution.

Check out the [doc](https://docs.mage.ai/guides/triggering-pipelines#additional-trigger-settings) to learn about the additional settings of a trigger.

![Untitled](https://media.graphassets.com/PbeX0dHzThvq0vQWiTUd)
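The semantics of the setting can be sketched in a few lines of plain Python (a conceptual illustration only, not Mage's actual scheduler code):

```python
# Conceptual sketch: run a sequence of blocks, optionally continuing past failures.
def run_pipeline(blocks, continue_on_failure=False):
    """blocks: list of (name, zero-argument callable). Returns (results, errors)."""
    results, errors = {}, {}
    for name, block in blocks:
        try:
            results[name] = block()
        except Exception as exc:
            errors[name] = exc
            if not continue_on_failure:
                raise
    return results, errors

blocks = [
    ("load", lambda: [1, 2, 3]),
    ("broken", lambda: 1 / 0),
    ("export", lambda: "done"),
]
results, errors = run_pipeline(blocks, continue_on_failure=True)
# "export" still runs even though "broken" failed
```

With `continue_on_failure=False` (the old behavior), the `ZeroDivisionError` would propagate and "export" would never run.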

Sync project with GitHub

If you have your pipeline code stored in a remote repository on GitHub, you can sync your local project with the remote repository through Mage.

Follow the [doc](https://docs.mage.ai/production/data-sync/github#sync-data-with-github) to set up the sync with GitHub.

![Untitled](https://media.graphassets.com/Ilxav8tuTuSV3jh4PGA9)

Data integration pipeline

Edit bookmark property values for data integration pipeline from the UI

You can edit bookmark values from the UI; the edited values will be used as the bookmark for the next sync. After the next sync completes, the bookmark values automatically update to the last record synced. Check out the [doc](https://docs.mage.ai/guides/data-integration-pipeline#editing-bookmark-property-values) to learn how to edit bookmark property values.

![Untitled](https://media.graphassets.com/8gIKGluREuYA7v8GkNJr)
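Conceptually, a bookmark is just the highest replication-key value seen in the last sync. A sketch of how a sync uses and updates it (illustrative only, not Mage's internals):

```python
def incremental_sync(records, key, bookmark=None):
    """Return the records newer than the bookmark, plus the updated bookmark."""
    new = [r for r in records if bookmark is None or r[key] > bookmark]
    next_bookmark = max((r[key] for r in new), default=bookmark)
    return new, next_bookmark

rows = [{"updated_at": 1}, {"updated_at": 5}, {"updated_at": 3}]
synced, bm = incremental_sync(rows, "updated_at", bookmark=2)
# only the rows with updated_at 5 and 3 are synced; the next bookmark becomes 5
```

Editing the bookmark value in the UI corresponds to overriding the `bookmark` argument before the next sync runs.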

Improvements on existing sources and destinations

- Use TEXT instead of VARCHAR with character limit as the column type in Postgres destination
- Show a loader on a data integration pipeline while the list of sources and destinations is still loading

![Untitled](https://media.graphassets.com/JJEYTY4MQJrqstbQqC4Z)


Streaming pipeline

Deserialize Protobuf messages in Kafka’s streaming source

Specify the Protobuf schema class path in the Kafka source config so that Mage can deserialize the Protobuf messages from Kafka.

Doc: [https://docs.mage.ai/guides/streaming/sources/kafka#deserialize-message-with-protobuf-schema](https://docs.mage.ai/guides/streaming/sources/kafka#deserialize-message-with-protobuf-schema)
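As a sketch, the Kafka source config with a Protobuf schema class path might look like the following (field names per the linked doc; verify them against your Mage version, and the `schema_classpath` value below is a placeholder):

```yaml
connector_type: kafka
bootstrap_server: "localhost:9092"
topic: topic_name
consumer_group: unique_consumer_group
serde_config:
  serialization_method: PROTOBUF
  schema_classpath: "path.to.schema.SchemaClass"
```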

Add Kafka as streaming destination

Doc: [https://docs.mage.ai/guides/streaming/destinations/kafka](https://docs.mage.ai/guides/streaming/destinations/kafka)

Ingest data to Redshift via Kinesis

Mage doesn’t directly stream data into Redshift. Instead, Mage can stream data to Kinesis. You can configure streaming ingestion for your Amazon Redshift cluster and create a materialized view using SQL statements.

Doc: [https://docs.mage.ai/guides/streaming/destinations/redshift](https://docs.mage.ai/guides/streaming/destinations/redshift)
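On the Redshift side, streaming ingestion from Kinesis is set up with an external schema and a materialized view over the stream; a rough sketch (all names and the IAM role ARN are placeholders, see the AWS and Mage docs for the full setup):

```sql
-- Map the Kinesis streams into Redshift
CREATE EXTERNAL SCHEMA kinesis_schema
FROM KINESIS
IAM_ROLE 'arn:aws:iam::123456789012:role/redshift-streaming-role';

-- Materialize the stream; kinesis_data arrives as VARBYTE
CREATE MATERIALIZED VIEW events_view AS
    SELECT approximate_arrival_timestamp,
           JSON_PARSE(from_varbyte(kinesis_data, 'utf-8')) AS payload
    FROM kinesis_schema."my_stream";

REFRESH MATERIALIZED VIEW events_view;
```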

**Cancel all running pipeline runs for a pipeline**

Add a button to cancel all running pipeline runs for a pipeline.

![Untitled](https://media.graphassets.com/3kwCrK7XSSOWnDm19QMy)

![Untitled](https://media.graphassets.com/4kA8TqgTRn2Zgcw3LHjd)

Other bug fixes & polish

- For the viewer role, don’t show the edit options for the pipeline
- Show “Positional arguments for decorated function” preview for custom blocks

![Untitled](https://media.graphassets.com/kMZOOVrgTemmjI0OG43G)

- Disable notebook keyboard shortcuts when typing in input fields in the sidekick


View full [Changelog](https://www.notion.so/mageai/What-s-new-7cc355e38e9c42839d23fdbef2dabd2c)

0.8.11

Not secure
Configure callbacks on block success or failure

- Add callbacks to run after your block succeeds or fails. You can add a callback by clicking “Add callback” in the “More actions” menu of the block (the three dot icon in the top right).
- For more information about callbacks, check out [the Mage documentation](https://docs.mage.ai/guides/blocks/callbacks)

![Configure-callbacks](https://media.graphassets.com/vNVNabsITcgvsry8bNGA)
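The control flow behind success/failure callbacks can be sketched in plain Python (a conceptual illustration; see the Mage callback docs above for the real API):

```python
# Run a block, invoking the matching callback on success or failure.
def run_block(block, on_success=None, on_failure=None):
    try:
        result = block()
    except Exception as exc:
        if on_failure:
            on_failure(exc)
        raise
    if on_success:
        on_success(result)
    return result

events = []
run_block(lambda: 42,
          on_success=lambda r: events.append(("success", r)))
try:
    run_block(lambda: 1 / 0,
              on_failure=lambda e: events.append(("failure", type(e).__name__)))
except ZeroDivisionError:
    pass
# events == [("success", 42), ("failure", "ZeroDivisionError")]
```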


Backfill improvements

- Show preview of total pipeline runs created and timestamps of pipeline runs that will be created before starting backfill.
- Misc UX improvements with the backfills pages (e.g. disabling or hiding irrelevant items depending on backfill status, updating backfill table columns that previously weren't updating as needed)

![backfill-improvements](https://media.graphassets.com/1AXnpFp4R639qs9ZMgFL)
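The preview amounts to enumerating the run timestamps a backfill would create between its start and end dates; a minimal sketch of that calculation (not Mage's code):

```python
from datetime import datetime, timedelta

def preview_backfill(start, end, interval):
    """Timestamps of the pipeline runs a backfill would create, inclusive of start."""
    runs, ts = [], start
    while ts <= end:
        runs.append(ts)
        ts += interval
    return runs

runs = preview_backfill(datetime(2023, 3, 1), datetime(2023, 3, 1, 6),
                        timedelta(hours=2))
# 4 runs: 00:00, 02:00, 04:00, 06:00
```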

Dynamic block improvements

- Support dynamic blocks feeding into other dynamic blocks (dynamic block to dynamic block)
- Fix block outputs for dynamic blocks not showing when clicking on the block run
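A dynamic block fans out: each item of its output spawns one run of the downstream block, so chaining two dynamic blocks multiplies the number of runs. A conceptual sketch (not Mage's scheduler code):

```python
def fan_out(items, child_block):
    """Run child_block once per upstream output item."""
    return [child_block(item) for item in items]

regions = ["us", "eu"]                                     # output of first dynamic block
shards = fan_out(regions, lambda r: [f"{r}-{i}" for i in (1, 2)])
# dynamic -> dynamic: flatten, then fan out again (2 regions x 2 shards = 4 runs)
leaf_runs = fan_out([s for group in shards for s in group], str.upper)
# leaf_runs == ["US-1", "US-2", "EU-1", "EU-2"]
```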

DBT improvements

- View sample model outputs for DBT block runs: when clicking a DBT block in the block runs view, show a sample query result of the model
- Compile + preview, show compiled SQL, run/test/build model options, view lineage for a single model, and more
- Only create an upstream source if it's used; don't create the upstream block's SQL table unless the DBT block references it

Handle multi-line pasting in terminal

[https://user-images.githubusercontent.com/78053898/221713029-f1557230-cf79-477b-a0d4-6164eac0624d.mp4](https://user-images.githubusercontent.com/78053898/221713029-f1557230-cf79-477b-a0d4-6164eac0624d.mp4)

File browser improvements

- Upload files and create new files in the root project directory
- Rename and delete any file from file browser

Other bug fixes & polish

- Show pipeline editor main content header on Firefox. The header for the Pipeline Editor main content was hidden for Firefox browsers specifically (which prevented users from being able to change their pipeline names on Firefox).

![Untitled](https://media.graphassets.com/zFOZhyyAStiIuqRxccia)


- Make retry run popup fully visible. Fix issue with Retry pipeline run button popup being cutoff.

![Untitled](https://media.graphassets.com/gVe6Zs1zTcqiNujZDRps)


- Add alert with details on how to allow clipboard paste in insecure contexts

![Untitled](https://media.graphassets.com/lsTTo2dNRJWdJYeP7O8N)
- Show canceling status only for pipeline run being canceled. When multiple runs were being canceled, the status for other runs was being updated to "canceling" even though those runs weren't being canceled.
![Untitled](https://media.graphassets.com/output=format:jpg/Ag7a5YRhRJGcyKOo7owR)
- Remove table prop from destination config. The `table` property is not needed in the data integration destination config templates when building integration pipelines through the UI, so it has been removed.
- Update data loader, transformer, and data exporter templates to not require DataFrame.
- Fix PyArrow issue
- Fix data integration destination row syncing count
- Fix emoji encode for BigQuery destination
- Fix dask memory calculation issue
- Fix NaN being displayed for runtime values on the Syncs page
- Fix odd formatting in Trigger edit page dropdowns (e.g. Frequency) on Windows
- Don't fall back to an empty pipeline when failing to read the pipeline YAML

View full [Changelog](https://www.notion.so/mageai/What-s-new-7cc355e38e9c42839d23fdbef2dabd2c)

0.8.3

Not secure
User login, management, authentication, roles, and permissions

User login and user-level permission control are supported in mage-ai version 0.8.0 and above.

![Untitled](https://media.graphassets.com/X4YcoIbMRZygg4AbEf3S)

![Untitled](https://media.graphassets.com/2DC7F2SsuoCzQj2yGotg)

Set the environment variable `REQUIRE_USER_AUTHENTICATION` to 1 to turn on user authentication.

Check out the doc to learn more about user authentication and permission control: [https://docs.mage.ai/production/authentication/overview](https://docs.mage.ai/production/authentication/overview)

Data integration

New sources

- [Datadog](https://github.com/mage-ai/mage-ai/blob/master/mage_integrations/mage_integrations/sources/datadog/README.md)

New destinations

- [AWS Redshift](https://github.com/mage-ai/mage-ai/blob/master/mage_integrations/mage_integrations/sources/redshift/README.md)

Full lists of available sources and destinations can be found here:

- Sources: [https://docs.mage.ai/data-integrations/overview#available-sources](https://docs.mage.ai/data-integrations/overview#available-sources)
- Destinations: [https://docs.mage.ai/data-integrations/overview#available-destinations](https://docs.mage.ai/data-integrations/overview#available-destinations)

Improvements on existing sources and destinations

- Update Couchbase source to support more unstructured data.
- Make all columns optional in the data integration source schema table settings UI; don’t force the checkbox to be checked and disabled.
- Batch fetch records in Facebook Ads streams to reduce number of requests.

Add connection credential secrets through the UI and store encrypted in Mage’s database

In various surfaces in Mage, you may be asked to input config for certain integrations, such as cloud databases or services. In these cases, you may need to input a password or an API key, but you don't want it shown in plain text. To address this, we created a way to store your secrets encrypted in the Mage database.

![Untitled](https://media.graphassets.com/q3AyuRluQ22458CkOEK0)

Check out the doc to learn more about secrets management in Mage: [https://docs.mage.ai/development/secrets/secrets](https://docs.mage.ai/development/secrets/secrets)

Configure max number of concurrent block runs

Mage now supports limiting the number of concurrent block runs by customizing the queue config, which helps prevent the Mage server from being overloaded by too many block runs. Users can configure the maximum number of concurrent block runs in the project's metadata.yaml via `queue_config`.

```yaml
queue_config:
  concurrency: 100
```
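The effect of the limit is that of a counting semaphore around block execution: at most `concurrency` block runs execute at once and the rest wait. A conceptual sketch in plain Python (not Mage's queue implementation):

```python
import threading

class LimitedQueue:
    """Run blocks with at most `concurrency` executing concurrently."""

    def __init__(self, concurrency):
        self._sem = threading.Semaphore(concurrency)
        self._lock = threading.Lock()
        self._active = 0
        self.peak = 0  # highest number of simultaneously running blocks observed

    def run(self, block):
        with self._sem:                      # blocks beyond the limit wait here
            with self._lock:
                self._active += 1
                self.peak = max(self.peak, self._active)
            try:
                return block()
            finally:
                with self._lock:
                    self._active -= 1

queue = LimitedQueue(concurrency=2)
threads = [threading.Thread(target=queue.run, args=(lambda: sum(range(10000)),))
           for _ in range(8)]
for t in threads:
    t.start()
for t in threads:
    t.join()
# queue.peak never exceeds 2, regardless of how many blocks were submitted
```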


Add triggers list page and terminal tab

- Add a dedicated page to show all triggers.

![Untitled](https://media.graphassets.com/ub9fuag7TK6q4lXyyUll)

- Add a link to the terminal in the main dashboard left vertical navigation and show the terminal in the main view of the dashboard.

![Untitled](https://media.graphassets.com/5ythVf1eRKmFLb8DcYe9)

Support running PySpark pipeline locally

Support running PySpark pipelines locally without custom code and settings.

If you have a Spark cluster running locally, you can build your standard batch pipeline with PySpark code the same as any other Python pipeline. Mage handles data passing between blocks automatically for Spark DataFrames. You can use `kwargs['spark']` in Mage blocks to access the Spark session.
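A block reads the session from `kwargs['spark']` and the body is ordinary PySpark. The sketch below substitutes a stand-in object for the session so it runs without a Spark cluster; in a real pipeline, Mage injects the actual `SparkSession`:

```python
# A Mage-style block: the Spark session arrives via kwargs['spark'].
def load_data(*args, **kwargs):
    spark = kwargs["spark"]
    return spark.createDataFrame([(1, "a"), (2, "b")], ["id", "letter"])

class FakeSpark:
    """Stand-in for a SparkSession so the sketch runs locally without Spark."""
    def createDataFrame(self, rows, columns):
        return [dict(zip(columns, row)) for row in rows]

df = load_data(spark=FakeSpark())
# df == [{"id": 1, "letter": "a"}, {"id": 2, "letter": "b"}]
```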

Other bug fixes & polish

- Add MySQL data exporter template
- Add MySQL data loader template
- Upgrade Pandas version to 1.5.3
- Improve the K8s executor
    - Pass environment variables to K8s job pods
    - Use the same image as the main Mage server in K8s job pods
- Store and return sample block output for large JSON objects
- Support [SASL authentication](https://github.com/mage-ai/mage-ai/blob/master/mage_ai/data_preparation/templates/data_loaders/streaming/kafka.yaml#L15-L20) with Confluent Cloud Kafka in streaming pipeline


View full [Changelog](https://www.notion.so/mageai/What-s-new-7cc355e38e9c42839d23fdbef2dabd2c)

0.7.98

Not secure
Data integration

New sources

- [Couchbase](https://github.com/mage-ai/mage-ai/blob/master/mage_integrations/mage_integrations/sources/couchbase/README.md)

Full lists of available sources and destinations can be found here:

- Sources: [https://docs.mage.ai/data-integrations/overview#available-sources](https://docs.mage.ai/data-integrations/overview#available-sources)
- Destinations: [https://docs.mage.ai/data-integrations/overview#available-destinations](https://docs.mage.ai/data-integrations/overview#available-destinations)

Improvements on existing sources and destinations

- Support deltalake connector in Trino destination
- Fix Outreach source bookmark comparison error
- Fix Facebook Ads source “User request limit reached” error
- Show more HubSpot source sync print log statements to give the user more information on the progress and activity

Databricks integration for Spark

Mage now supports building and running Spark pipelines with a remote Databricks Spark cluster.

Check out the [guide](https://docs.mage.ai/integrations/databricks) to learn about how to use Databricks Spark cluster with Mage.

![Untitled](https://media.graphassets.com/WkMFeGO0QKSsdbAOtbad)

RabbitMQ streaming source

Shout out to [Luis Salomão](https://github.com/Luishfs) for his contribution of adding the RabbitMQ streaming source to Mage! Check out the [doc](https://docs.mage.ai/guides/streaming/streaming-pipeline-rabbitmq) to set up a streaming pipeline with RabbitMQ source.

![Untitled](https://media.graphassets.com/VdX6fbhCQraolFzzn456)

DBT support for Trino

Support running Trino DBT models in Mage.

![Untitled](https://media.graphassets.com/CNCBwXQRCmw8ZxLL0dHQ)

More K8s support

- Allow customizing namespace by setting the `KUBE_NAMESPACE` environment variable.
- Support [K8s executor](https://docs.mage.ai/production/configuring-production-settings/compute-resource#kubernetes-executor) on AWS EKS cluster.

Generic block

Add a generic block: a block that can run in a pipeline, optionally accept inputs, and optionally return outputs, without being a data loader, data exporter, or transformer block.

![Untitled](https://media.graphassets.com/4x7cLH0gRtCg9N0CXDqc)

Other bug fixes & polish

- Support overriding runtime variables when clicking the Run now button on the triggers list page.

![Untitled](https://media.graphassets.com/bt1lsZ5yTLCfb9Z96MzH)

- Support MySQL SQL block
- Fix the serialization of columns that contain a dictionary or list of dictionaries when saving the output dataframe of a block.
- Allow selecting multiple partition keys for Delta Lake destination.
- Support copy and paste into/from Mage terminal.
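The idea behind the serialization fix can be sketched as JSON-encoding dict and list-of-dict cells so the column round-trips cleanly (illustrative only, not Mage's code):

```python
import json

def serialize_cell(value):
    """JSON-encode structured cells; leave scalar values untouched."""
    if isinstance(value, (dict, list)):
        return json.dumps(value, default=str)
    return value

column = [{"a": 1}, [{"b": 2}], "plain"]
serialized = [serialize_cell(v) for v in column]
# ['{"a": 1}', '[{"b": 2}]', 'plain']
```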

View full [Changelog](https://www.notion.so/mageai/What-s-new-7cc355e38e9c42839d23fdbef2dabd2c)

0.7.90

Not secure
Data integration

New sources

- [Monday](https://github.com/mage-ai/mage-ai/tree/master/mage_integrations/mage_integrations/sources/monday)
- [Commercetools](https://github.com/mage-ai/mage-ai/tree/master/mage_integrations/mage_integrations/sources/commercetools)
- [Front app](https://github.com/mage-ai/mage-ai/tree/master/mage_integrations/mage_integrations/sources/front)

New destinations

- [Microsoft SQL Server (destination)](https://github.com/mage-ai/mage-ai/tree/master/mage_integrations/mage_integrations/destinations/mssql)

Full lists of available sources and destinations can be found here:

- Sources: [https://docs.mage.ai/data-integrations/overview#available-sources](https://docs.mage.ai/data-integrations/overview#available-sources)
- Destinations: [https://docs.mage.ai/data-integrations/overview#available-destinations](https://docs.mage.ai/data-integrations/overview#available-destinations)

Improvements on existing sources and destinations

- Trino destination
    - Support the `MERGE` command in the Trino connector to handle conflicts.
    - Allow customizing `query_max_length` to adjust batch size.
- MSSQL source
    - Fix datetime column conversion and comparison.
- BigQuery destination
    - Fix the BigQuery error "Deadline of 600.0s exceeded while calling target function".
- Delta Lake destination
    - Upgrade the delta library from version 0.6.4 to 0.7.0 to fix some errors.
- Allow datetime columns to be used as bookmark properties.
- When clicking the apply button in the data integration schema table, don't apply a change to a stream if its bookmark column is not a valid replication key or its unique column is not a valid key property.

New command line tool

Mage has a newly revamped command line tool, with better formatting, clearer help commands, and more informative error messages. Kudos to community member [jlondonobo](https://github.com/jlondonobo), for your awesome contribution!

![Untitled](https://s3.us-west-2.amazonaws.com/secure.notion-static.com/399c9c59-3f00-432f-983d-6dcf9c82109d/Untitled.png?X-Amz-Algorithm=AWS4-HMAC-SHA256&X-Amz-Content-Sha256=UNSIGNED-PAYLOAD&X-Amz-Credential=AKIAT73L2G45EIPT3X45%2F20230207%2Fus-west-2%2Fs3%2Faws4_request&X-Amz-Date=20230207T004258Z&X-Amz-Expires=86400&X-Amz-Signature=a5a353ed09527fdd80e5944155c9c6cebb6bfa00716ce0dbbddc3bb8c24c3d9c&X-Amz-SignedHeaders=host&response-content-disposition=filename%3D%22Untitled.png%22&x-id=GetObject)

![Untitled](https://s3.us-west-2.amazonaws.com/secure.notion-static.com/5cface86-6146-4281-aed8-6659873ebb3b/Untitled.png?X-Amz-Algorithm=AWS4-HMAC-SHA256&X-Amz-Content-Sha256=UNSIGNED-PAYLOAD&X-Amz-Credential=AKIAT73L2G45EIPT3X45%2F20230207%2Fus-west-2%2Fs3%2Faws4_request&X-Amz-Date=20230207T004335Z&X-Amz-Expires=86400&X-Amz-Signature=771844e386c0ecb07ea207796db9c7237f9d3842627c769787f5a04913a26fc5&X-Amz-SignedHeaders=host&response-content-disposition=filename%3D%22Untitled.png%22&x-id=GetObject)

![Untitled](https://s3.us-west-2.amazonaws.com/secure.notion-static.com/bcf41062-491a-43b4-8406-43b2b5d1e5e0/Untitled.png?X-Amz-Algorithm=AWS4-HMAC-SHA256&X-Amz-Content-Sha256=UNSIGNED-PAYLOAD&X-Amz-Credential=AKIAT73L2G45EIPT3X45%2F20230207%2Fus-west-2%2Fs3%2Faws4_request&X-Amz-Date=20230207T004347Z&X-Amz-Expires=86400&X-Amz-Signature=e12f443d3f0dd1e31f6f4f543ebe9ef8dbc3924f20e4197adef49141c2673016&X-Amz-SignedHeaders=host&response-content-disposition=filename%3D%22Untitled.png%22&x-id=GetObject)

DBT block improvements

- Support running Redshift DBT models in Mage.
- Raise an error if there is a DBT compilation error when running DBT blocks in a pipeline.
- Fix duplicate DBT source name errors when the same source name appears across multiple `mage_sources.yml` files in different model subfolders: use a single sources file for all models instead of nesting them in subfolders.

Notebook improvements

- Support editing global variables in UI: [https://docs.mage.ai/production/configuring-production-settings/runtime-variable#in-mage-editor](https://docs.mage.ai/production/configuring-production-settings/runtime-variable#in-mage-editor)
- Support creating or editing global variables in code by editing the pipeline `metadata.yaml` file: [https://docs.mage.ai/production/configuring-production-settings/runtime-variable#in-code](https://docs.mage.ai/production/configuring-production-settings/runtime-variable#in-code)
- Add a save file button when editing a file not in the pipeline notebook.

![Untitled](https://s3.us-west-2.amazonaws.com/secure.notion-static.com/5e41d400-c0aa-4e34-b89e-db057419aa90/Untitled.png?X-Amz-Algorithm=AWS4-HMAC-SHA256&X-Amz-Content-Sha256=UNSIGNED-PAYLOAD&X-Amz-Credential=AKIAT73L2G45EIPT3X45%2F20230207%2Fus-west-2%2Fs3%2Faws4_request&X-Amz-Date=20230207T004403Z&X-Amz-Expires=86400&X-Amz-Signature=ecb3f9d02c6249227e31e615c85d36f02e0e44e3b799655985acba1b66013f13&X-Amz-SignedHeaders=host&response-content-disposition=filename%3D%22Untitled.png%22&x-id=GetObject)

- Support Windows keyboard shortcuts: Ctrl+S to save files.
- Support uploading files through UI.

![Untitled](https://media.graphassets.com/output=format:jpg/resize=height:800,fit:max/BnSxgNEkRCSFOQaFt3HH)

Store logs in GCP Cloud Storage bucket

Besides storing logs on the local disk or AWS S3, we now add the option to store logs in GCP Cloud Storage by adding a logging config to the project's metadata.yaml like below:

```yaml
logging_config:
  type: gcs
  level: INFO
  destination_config:
    path_to_credentials: <path to gcp credentials json file>
    bucket: <bucket name>
    prefix: <prefix path>
```


Check out the doc for details: [https://docs.mage.ai/production/observability/logging#google-cloud-storage](https://docs.mage.ai/production/observability/logging#google-cloud-storage)

Other bug fixes & improvements

- SQL block improvements
    - Support writing raw SQL to customize the create table and insert commands.
    - Allow editing SQL block output table names.
- Support loading files from a directory when using `mage_ai.io.file.FileIO`. Example:

```python
from mage_ai.io.file import FileIO

file_directories = ['default_repo/csvs']
FileIO().load(file_directories=file_directories)
```


View full [Changelog](https://www.notion.so/mageai/What-s-new-7cc355e38e9c42839d23fdbef2dabd2c)

0.7.84

Not secure

© 2024 Safety CLI Cybersecurity Inc. All Rights Reserved.