Mage-ai

Latest version: v0.9.74


0.8.86

![Image](https://media.giphy.com/media/roh7bs2cEW2ReFgYRN/giphy-downsized.gif)

Replicate blocks

Support reusing the same block multiple times within a single pipeline.

Doc: https://docs.mage.ai/design/blocks/replicate-blocks

![Untitled](https://media.graphassets.com/ufT5VQlTTFK70zZZAHkF)

![Untitled](https://media.graphassets.com/kwFLC5WDSiGq0wTwXKB5)

Spark on Yarn

Support running Spark code on Yarn cluster with Mage.

Doc: https://docs.mage.ai/integrations/spark-pyspark#hadoop-and-yarn-cluster-for-spark

Customize retry config

Mage supports configuring automatic retries for block runs in the following ways:

1. Add `retry_config` to the project’s `metadata.yaml`. This `retry_config` will be applied to all block runs.
2. Add `retry_config` to the block config in the pipeline’s `metadata.yaml`. The block-level `retry_config` overrides the global `retry_config`.

Example config:

```yaml
retry_config:
  # Number of retry times
  retries: 0
  # Initial delay before retry. If exponential_backoff is true,
  # the delay time is multiplied by 2 for the next retry
  delay: 5
  # Maximum time between the first attempt and the last retry
  max_delay: 60
  # Whether to use exponential backoff retry
  exponential_backoff: true
```


Doc: https://docs.mage.ai/orchestration/pipeline-runs/retrying-block-runs#automatic-retry
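As a rough illustration of how these settings interact (a sketch only, not Mage's scheduler code), the wait times between attempts could be computed like this:

```python
def retry_delays(retries: int, delay: int, max_delay: int,
                 exponential_backoff: bool) -> list:
    """Illustrative sketch of retry_config semantics: compute the wait
    before each retry, doubling when exponential_backoff is set and
    stopping once the overall max_delay window would be exceeded."""
    delays, elapsed = [], 0
    for attempt in range(retries):
        wait = delay * (2 ** attempt) if exponential_backoff else delay
        if elapsed + wait > max_delay:  # stay within the overall window
            break
        delays.append(wait)
        elapsed += wait
    return delays
```

With `retries: 3`, `delay: 5`, and `exponential_backoff: true`, this yields waits of 5, 10, and 20 seconds between attempts.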

DBT improvements

- When running a DBT block with language YAML, interpolate and merge the user-defined `--vars` in the block’s code into the variables that Mage automatically constructs
- Example block code of different formats

```bash
--select demo/models --vars '{"demo_key": "demo_value", "date": 20230101}'
--select demo/models --vars {"demo_key":"demo_value","date":20230101}
--select demo/models --vars '{"global_var": {{ test_global_var }}, "env_var": {{ test_env_var }}}'
--select demo/models --vars {"refresh":{{page_refresh}},"env_var":{{env}}}
```


- Doc: https://docs.mage.ai/dbt/run-single-model#adding-variables-when-running-a-yaml-dbt-block
- Support custom project names and custom profile names in `dbt_project.yml` that differ from the DBT folder name
- Allow users to configure a block to run DBT snapshots
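The merge behavior can be sketched as follows, after any Jinja variables are interpolated; `merge_dbt_vars` and its one-level regex are hypothetical, not Mage's parser:

```python
import json
import re


def merge_dbt_vars(block_code: str, mage_vars: dict) -> dict:
    """Hypothetical sketch: pull the --vars JSON out of a YAML DBT
    block's command line and layer it over the variables Mage
    constructs. Handles only a trailing --vars argument."""
    match = re.search(r"--vars\s+'?(\{.*\})'?", block_code)
    user_vars = json.loads(match.group(1)) if match else {}
    return {**mage_vars, **user_vars}  # user-defined keys take precedence
```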

Dynamic SQL block

Support using dynamic child blocks for SQL blocks

Doc: https://docs.mage.ai/design/blocks/dynamic-blocks#dynamic-sql-blocks

Run blocks concurrently in separate containers on Azure

If your Mage app is deployed on Microsoft Azure with Mage’s **[terraform scripts](https://github.com/mage-ai/mage-ai-terraform-templates/tree/master/azure)**, you can choose to launch separate Azure container instances to execute blocks.

Doc: https://docs.mage.ai/production/configuring-production-settings/compute-resource#azure-container-instance-executor

Run the scheduler and the web server in separate containers or pods

- Run scheduler only: `mage start project_name --instance-type scheduler`
- Run web server only: `mage start project_name --instance-type web_server`
- The web server can run in multiple containers or pods
- Run both server and scheduler: `mage start project_name --instance-type server_and_scheduler`

Support all operations on folder

Support “Add”, “Rename”, “Move”, and “Delete” operations on folders.

![Untitled](https://media.graphassets.com/YUnvzFbR2SBZ61y1Eton)

Configure environments for triggers in code

Allow specifying `envs` value to apply triggers only in certain environments.

Example:

```yaml
triggers:
- name: test_example_trigger_in_prod
  schedule_type: time
  schedule_interval: "daily"
  start_time: 2023-01-01
  status: active
  envs:
  - prod
- name: test_example_trigger_in_dev
  schedule_type: time
  schedule_interval: "hourly"
  start_time: 2023-03-01
  status: inactive
  settings:
    skip_if_previous_running: true
    allow_blocks_to_fail: true
  envs:
  - dev
```


Doc: https://docs.mage.ai/guides/triggers/configure-triggers-in-code#create-and-configure-triggers

Replace current logs table with virtualized table for better UI performance

- Use a virtual table to render logs so that loading thousands of rows won't slow down browser performance.
- Fix formatting of logs table rows when a log is selected (previously, the log detail side panel would overly condense the main section, losing the place of the log you clicked).
- Pin the logs page header and footer.
- Tested performance with the Lighthouse Chrome extension; the performance score increased by 12 points.

Other bug fixes & polish

- Add indices to schedule models to speed up DB queries.
- “Too many open files” issue:
- Check for the "Too many open files" error on all pages that call the `displayErrorFromReadResponse` util method (e.g. the pipeline edit page), not just the Pipelines Dashboard.

![Untitled](https://media.graphassets.com/vnPZfdyQiid2ocrw4cQJ)

- Update terraform scripts to set the `ULIMIT_NO_FILE` environment variable to increase maximum number of open files in Mage deployed on AWS, GCP and Azure.
- Fix the `git_branch` resource blocking page loads. The `git clone` command could hang the entire app if the host wasn't added to known hosts; it now runs as a separate process with a timeout, so a stuck clone won't block the app.
- Fix bug: when adding a block in between blocks in a pipeline with two separate root nodes, the downstream connections were removed.
- Fix DBT error: `KeyError: 'file_path'`. Check for `file_path` before calling `parse_attributes` method to avoid KeyError.
- Improve the coding experience when working with Snowflake data provider credentials. Allow more flexibility in Snowflake SQL block queries. Doc: https://docs.mage.ai/integrations/databases/Snowflake#methods-for-configuring-database-and-schema
- Pass parent block’s output and variables to its callback blocks.
- Fix missing input field and select field descriptions in charts.
- Fix bug: Missing values template chart doesn’t render.
- Convert `numpy.ndarray` to `list` if column type is list when fetching input variables for blocks.
- Fix runtime and global variables not available in the keyword arguments when executing block with upstream blocks from the edit pipeline page.
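The `numpy.ndarray` fix above boils down to a coercion like this (a sketch; the function name is illustrative):

```python
import numpy as np


def coerce_list_column(value, column_is_list: bool):
    """Sketch of the fix: when a column is typed as list but the stored
    cell is a numpy array, convert it to a plain Python list before
    passing it to a block as an input variable."""
    if column_is_list and isinstance(value, np.ndarray):
        return value.tolist()
    return value
```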

View full [Changelog](https://www.notion.so/What-s-new-7cc355e38e9c42839d23fdbef2dabd2c)

0.8.83

![image](https://media.giphy.com/media/U2Olc7gWU5pFzpDufV/giphy-downsized.gif)

Support more complex streaming pipeline

More complex streaming pipelines are now supported in Mage. You can use more than one transformer and more than one sink in a streaming pipeline.

Here is an example streaming pipeline with multiple transformers and sinks.

![Untitled](https://media.graphassets.com/4ZWHyLpgTESu0rB1sXnP)

Doc for streaming pipeline: [https://docs.mage.ai/guides/streaming/overview](https://docs.mage.ai/guides/streaming/overview)

Custom Spark configuration

Allow using custom Spark configuration to create Spark session used in the pipeline.

```yaml
spark_config:
  # Application name
  app_name: 'my spark app'
  # Master URL to connect to
  # e.g., spark_master: 'spark://host:port', or spark_master: 'yarn'
  spark_master: 'local'
  # Executor environment variables
  # e.g., executor_env: {'PYTHONPATH': '/home/path'}
  executor_env: {}
  # Jar files to be uploaded to the cluster and added to the classpath
  # e.g., spark_jars: ['/home/path/example1.jar']
  spark_jars: []
  # Path where Spark is installed on worker nodes
  # e.g., spark_home: '/usr/lib/spark'
  spark_home: null
  # List of key-value pairs to be set in SparkConf
  # e.g., others: {'spark.executor.memory': '4g', 'spark.executor.cores': '2'}
  others: {}
```


Doc for running PySpark pipeline: [https://docs.mage.ai/integrations/spark-pyspark#standalone-spark-cluster](https://docs.mage.ai/integrations/spark-pyspark#standalone-spark-cluster)

Data integration pipeline

DynamoDB source

New data integration source DynamoDB is added.

Doc: [https://github.com/mage-ai/mage-ai/blob/master/mage_integrations/mage_integrations/sources/dynamodb/README.md](https://github.com/mage-ai/mage-ai/blob/master/mage_integrations/mage_integrations/sources/dynamodb/README.md)

Bug fixes

- Use `timestamptz` as data type for datetime column in Postgres destination.
- Fix BigQuery batch load error.

**Show file browser outside edit pipeline**

Improved Mage's file editor so that users can edit files without going into a pipeline.

![Untitled](https://media.graphassets.com/gsAACptRJa3IZ94tRjAR)

**Add all file operations**

![Untitled](https://media.graphassets.com/ItoUlNwmSq6dXnRHYcrT)

**Speed up writing block output to disk**

Mage uses Polars to speed up writing block output (DataFrame) to disk, reducing the time of fetching and writing a DataFrame with 2 million rows from 90s to 15s.

**Add default `.gitignore`**

Mage automatically adds a default `.gitignore` file when initializing a project:


```
.DS_Store
.file_versions
.gitkeep
.log
.logs/
.preferences.yaml
.variables/
__pycache__/
docker-compose.override.yml
logs/
mage-ai.db
mage_data/
secrets/
```


Other bug fixes & polish

- Include trigger URL in slack alert.

![Untitled](https://media.graphassets.com/0Uzv5GaWSd2B5192P4N9)

- Fix race conditions for multiple runs within one second
- If a DBT block's language is YAML, hide the option to add upstream dbt refs
- Include event_variables in individual pipeline run retry
- Callback block
- Include parent block uuid in callback block kwargs
- Pass parent block’s output and variables to its callback blocks
- Delete GCP cloud run job after it's completed.
- Limit the code block output from print statements to avoid sending excessively large payload request bodies when saving the pipeline.
- Pin the `typing_extensions` version to fix the error `TypeError: Instance and class checks can only be used with runtime protocols`.
- Fix git sync and update how git settings are saved for users in the backend.
- Fix MySQL ssh tunnel: close ssh tunnel connection after testing connection.

View full [Changelog](https://www.notion.so/What-s-new-7cc355e38e9c42839d23fdbef2dabd2c)

0.8.78

![image](https://media.giphy.com/media/z9g6xLr5C0H1m/giphy.gif)

MongoDB code templates

Add code templates to fetch data from and export data to MongoDB.

Example MongoDB config in `io_config.yaml`:


0.8.75

![Image](https://media.giphy.com/media/nqU7bHru9egnz9joLZ/giphy.gif)

Polars integration

Support using [Polars](https://www.pola.rs/) DataFrame in Mage blocks.

![Untitled](https://media.graphassets.com/FBfwDhuzSnyc2O3pLyMS)

**Opsgenie integration**

Shout out to [Sergio Santiago](https://github.com/sergioasantiago) for his contribution of integrating [Opsgenie](https://www.atlassian.com/software/opsgenie) as an alerting option in Mage.

Doc: [https://docs.mage.ai/production/observability/alerting-opsgenie](https://docs.mage.ai/production/observability/alerting-opsgenie)

![Untitled](https://media.graphassets.com/cBTZzY7QHmNpt94GOiS5)

Data integration

Speed up exporting data to BigQuery destination

Add support for using batch load jobs instead of the query API in BigQuery destination. You can enable it by setting `use_batch_load` to `true` in BigQuery destination config.

When loading ~150MB of data to BigQuery, using batch loading reduces the time from 1 hour to around 2 minutes (a **30x** speedup).

Doc: [https://github.com/mage-ai/mage-ai/blob/master/mage_integrations/mage_integrations/destinations/bigquery/README.md](https://github.com/mage-ai/mage-ai/blob/master/mage_integrations/mage_integrations/destinations/bigquery/README.md)

Microsoft SQL Server destination improvements

- Support ALTER command to add new columns
- Support MERGE command with multiple unique columns (use AND to connect the columns)
- Add MSSQL config fields to `io_config.yaml`
- Support multiple keys in MSSQL destination

Other improvements

- Fix parsing int timestamp in intercom source.
- Remove the “Execute” button from transformer block in data integration pipelines.
- Support using ssh tunnel for MySQL source with private key content

Git integration improvements

- Pass in Git settings through environment variables
- Doc: [https://docs.mage.ai/production/data-sync/git#git-settings-as-environment-variables](https://docs.mage.ai/production/data-sync/git#git-settings-as-environment-variables)
- Use `git switch` to switch branches
- Fix git ssh key generation
- Save cwd as repo path if user leaves the field blank

Update disable notebook edit mode to allow certain operations

Add another value to `DISABLE_NOTEBOOK_EDIT_ACCESS` environment variable to allow users to create secrets, variables, and run blocks.

The available values are:

- 0: this is the same as omitting the variable
- 1: no edit/execute access to the notebook within Mage. Users will not be able to use the notebook to edit pipeline content, execute blocks, create secrets, or create variables.
- 2: no edit access for pipelines. Users will not be able to edit pipeline/block metadata or content.

Doc: [https://docs.mage.ai/production/configuring-production-settings/overview#read-only-access](https://docs.mage.ai/production/configuring-production-settings/overview#read-only-access)
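The value-to-permission mapping above can be summarized in a small sketch (illustrative names, not Mage's internals):

```python
def notebook_access(value: int) -> dict:
    """Sketch of what each DISABLE_NOTEBOOK_EDIT_ACCESS value allows:
    0 grants everything, 1 blocks editing and execution entirely, and
    2 blocks pipeline edits but still allows running blocks and
    creating secrets/variables."""
    return {
        'edit_pipeline_content': value == 0,
        'execute_blocks': value in (0, 2),
        'create_secrets_and_variables': value in (0, 2),
    }
```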

**Update Admin abilities**

- Allow admins to view and update existing users' usernames/emails/roles/passwords (except for owners and other admins).
- Admins can only view Viewers/Editors and adjust their roles between those two.
- Admins cannot create or delete users (only owners can).
- Admins cannot make other users owners or admins (only owners can).

**Retry block runs from specific block**

For standard Python pipelines, block runs can be retried from a selected block. The selected block and all downstream blocks will be re-run after clicking the `Retry from selected block` button.
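Selecting a block amounts to collecting it plus everything reachable downstream; a sketch (not Mage's scheduler code):

```python
def blocks_to_retry(selected: str, downstream: dict) -> set:
    """Sketch: gather the selected block and all blocks reachable
    through downstream edges -- the set that a retry re-runs."""
    to_run, stack = set(), [selected]
    while stack:
        block = stack.pop()
        if block not in to_run:
            to_run.add(block)
            stack.extend(downstream.get(block, []))
    return to_run
```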

![Untitled](https://media.graphassets.com/027UMEirSXS8AOovYzi5)

Other bug fixes & polish

- Fix terminal user authentication. Update terminal authentication to happen on message.
- Fix a potential authentication issue for the Google Cloud PubSub publisher client
- Dependency graph improvements
- Update dependency graph connection depending on port side
- Show all ports for data loader and exporter blocks in dependency graph

![Untitled](https://media.graphassets.com/T03gZS58TrUuI339O8IC)

- DBT
- Support DBT alias and schema model config
- Fix `limit` property in DBT block PUT request payload.
- Retry pipeline run
- Fix bug: Individual pipeline run retries do not work on SQLite.
- Allow bulk retrying runs when `DISABLE_NOTEBOOK_EDIT_ACCESS` is enabled
- Fix bug: Retried pipeline runs and errors don’t appear in Backfill detail page.
- Fix bug: When Mage fails to fetch a pipeline due to a backend exception, it doesn't show the actual error; the pipeline URL contains "undefined" instead, which makes the issue hard to debug.
- Improve job scheduling: If jobs with QUEUED status are not in queue, re-enqueue them.
- Pass `imagePullSecrets` to k8s job when using `k8s` as the executor.
- Fix streaming pipeline cancellation.
- Fix the version of google-cloud-run package.
- Fix query permissions for block resource
- Catch `sqlalchemy.exc.InternalError` in server and roll back transaction.

View full [Changelog](https://www.notion.so/What-s-new-7cc355e38e9c42839d23fdbef2dabd2c)

0.8.69

![hairy_otter](https://static.wikia.nocookie.net/cat-army-and-dog-army/images/3/3d/Ijed.jpg/revision/latest?cb=20220301164944)

Markdown blocks aka Note blocks or Text blocks

Added Markdown block to Pipeline Editor.

Doc: [https://docs.mage.ai/guides/blocks/markdown-blocks](https://docs.mage.ai/guides/blocks/markdown-blocks)

![https://user-images.githubusercontent.com/78053898/235216815-af25cbc8-eeee-4849-9f66-cb2da5ff4f4a.gif](https://user-images.githubusercontent.com/78053898/235216815-af25cbc8-eeee-4849-9f66-cb2da5ff4f4a.gif)

Git integration improvements

Add git clone action

![Untitled](https://media.graphassets.com/1st8tITd6t3aNsCW1RyA)

Allow users to select which files to commit

![Untitled](https://media.graphassets.com/3EjBuMzWSCmicZ2yxcod)

Add HTTPS authentication

Doc: [https://docs.mage.ai/production/data-sync/git#https-token-authentication](https://docs.mage.ai/production/data-sync/git#https-token-authentication)

Add a terminal toggle, so that users have easier access to the terminal

![Untitled](https://media.graphassets.com/tcdQ89idSrah0drIm3jK)

Callback block improvements

Doc: [https://docs.mage.ai/development/blocks/callbacks/overview](https://docs.mage.ai/development/blocks/callbacks/overview)

Make callback block more generic and support it in data integration pipeline.

![Untitled](https://media.graphassets.com/ZIIOIviwR4Ssx9KdGAKi)

Keyword arguments available in data integration pipeline callback blocks: [https://docs.mage.ai/development/blocks/callbacks/overview#data-integration-pipelines-only](https://docs.mage.ai/development/blocks/callbacks/overview#data-integration-pipelines-only)

Transfer owner status or edit the owner account email

- Owners can make other users owners.
- Owners can edit other users' emails.
- Users can edit their emails.

![transfer-owner](https://media.graphassets.com/nejxgiE3Q4lDoyKCQVu4)

Bulk retry pipeline runs

Support bulk retrying pipeline runs for a pipeline.

![image](https://media.graphassets.com/GEfFmpNwS8CI5yKSCOxx)

Right click context menu

Add a right-click context menu to rows on the pipeline list page for pipeline actions (e.g. rename).

![Untitled](https://media.graphassets.com/0TU4slsOQo2zbUPgbLeA)

Navigation improvements

When hovering over left and right vertical navigation, expand it to show navigation title like BigQuery’s UI.

![Untitled](https://media.graphassets.com/O8j9dCDEQ1i0dSRpD4n1)

Use Great Expectations suite from JSON object or JSON file

Doc: [https://docs.mage.ai/development/testing/great-expectations#json-object](https://docs.mage.ai/development/testing/great-expectations#json-object)

Support DBT incremental models

Doc: [https://docs.mage.ai/dbt/incremental-models](https://docs.mage.ai/dbt/incremental-models)

Data integration pipeline

New destination: Google Cloud Storage

Shout out to [André Ventura](https://github.com/andreventura02) for his contribution of adding the Google Cloud Storage destination to data integration pipeline.

Doc: [https://github.com/mage-ai/mage-ai/blob/master/mage_integrations/mage_integrations/destinations/google_cloud_storage/README.md](https://github.com/mage-ai/mage-ai/blob/master/mage_integrations/mage_integrations/destinations/google_cloud_storage/README.md)

Other improvements

- Use bookmarks properly in Intercom incremental streams.
- Support IAM role based authentication in the Amazon S3 source and Amazon S3 destination for data integration pipelines

SQL module improvements

**Add Apache Druid data source**

Shout out to [Dhia Eddine Gharsallaoui](https://github.com/dhia-gharsallaoui) again for his contribution of adding Druid data source to Mage.

![Untitled](https://media.graphassets.com/fyFCvUxzR5Chj3cxBfB4)

Doc: [https://docs.mage.ai/integrations/databases/Druid](https://docs.mage.ai/integrations/databases/Druid)

Add location as a config in BigQuery IO

![Untitled](https://media.graphassets.com/DYn3nbJsSrWmvUzV32aU)

Speed up Postgres IO export method

Use the `COPY` command in the `mage_ai.io.postgres.Postgres` export method to speed up writing data to Postgres.
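The speedup comes from streaming a single CSV buffer through `COPY` instead of issuing row-by-row `INSERT`s. A simplified sketch of the idea (helper name and CSV framing are illustrative):

```python
import csv
import io


def rows_to_copy_buffer(rows):
    """Serialize rows into an in-memory CSV buffer suitable for
    streaming via Postgres COPY, e.g. with psycopg2's
    cursor.copy_expert("COPY my_table FROM STDIN WITH CSV", buffer)."""
    buffer = io.StringIO()
    csv.writer(buffer).writerows(rows)
    buffer.seek(0)
    return buffer
```

With psycopg2, passing the returned buffer to `cursor.copy_expert('COPY schema.table FROM STDIN WITH CSV', buffer)` loads all rows in one round trip.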

Streaming pipeline

New source: Google Cloud PubSub

Doc: [https://docs.mage.ai/guides/streaming/sources/google-cloud-pubsub](https://docs.mage.ai/guides/streaming/sources/google-cloud-pubsub)

**Deserialize message with Avro schema in Confluent schema registry**

Doc: [https://docs.mage.ai/guides/streaming/sources/kafka#deserialize-message-with-avro-schema-in-confluent-schema-registry](https://docs.mage.ai/guides/streaming/sources/kafka#deserialize-message-with-avro-schema-in-confluent-schema-registry)

Kubernetes executor

Add config to make all pipelines use the K8s executor

Set the environment variable `DEFAULT_EXECUTOR_TYPE` to `k8s` to use the K8s executor by default for all pipelines. Doc: [https://docs.mage.ai/production/configuring-production-settings/compute-resource#2-set-executor-type-and-customize-the-compute-resource-of-the-mage-executor](https://docs.mage.ai/production/configuring-production-settings/compute-resource#2-set-executor-type-and-customize-the-compute-resource-of-the-mage-executor)

Add `k8s_executor_config` to the project’s metadata.yaml to apply the config to all blocks that use the k8s executor in the project. Doc: [https://docs.mage.ai/production/configuring-production-settings/compute-resource#kubernetes-executor](https://docs.mage.ai/production/configuring-production-settings/compute-resource#kubernetes-executor)

Support configuring GPU for k8s executor

Allow specifying [GPU resource](https://kubernetes.io/docs/tasks/manage-gpus/scheduling-gpus/) in `k8s_executor_config`.

![Untitled](https://media.graphassets.com/JlEyQlBQn32gjO9YptAU)

Don't use `default` as the service account namespace in the Helm chart

Fix service account permission for creating Kubernetes jobs by not using `default` namespace.

Doc for deploying with Helm: [https://docs.mage.ai/production/deploying-to-cloud/using-helm](https://docs.mage.ai/production/deploying-to-cloud/using-helm)

Other bug fixes & polish

- Fix error: When selecting or filtering data from a parent block, an error occurs: `AttributeError: 'list' object has no attribute 'tolist'`.
- Fix bug: Web UI crashes when entering edit page ([github issue](https://github.com/mage-ai/mage-ai/issues/2134)).
- Fix bug: The hidden folder (`.mage_temp_profiles`) was disabled in the File Browser and couldn't be collapsed
- Support configuring Mage server public host used in the email alerts by setting environment variable `MAGE_PUBLIC_HOST`.
- Speed up PipelineSchedule DB query by adding index to column.
- Fix EventRulesResource AWS permissions error
- Fix bug: Bar chart shows too many X-axis ticks

![image](https://static.wikia.nocookie.net/chainedechoes_gamepedia_en/images/f/fc/Bestiary_228.png/revision/latest/scale-to-width-down/350?cb=20230108215151)

View full [Changelog](https://www.notion.so/What-s-new-7cc355e38e9c42839d23fdbef2dabd2c)

0.8.58

![Once_and_always](https://media.giphy.com/media/5GtBeb2KIlYrSjlFq4/giphy.gif)

Trigger pipeline from a block

Provide a code template to trigger another pipeline from a block within a different pipeline.

![trigger_pipeline](https://media.graphassets.com/SwmGjAtkQ3iwbKTy7QRt)

Doc: [https://docs.mage.ai/orchestration/triggers/trigger-pipeline](https://docs.mage.ai/orchestration/triggers/trigger-pipeline)

Data integration pipeline

New source: Twitter Ads

Doc: [https://github.com/mage-ai/mage-ai/blob/master/mage_integrations/mage_integrations/sources/twitter_ads/README.md](https://github.com/mage-ai/mage-ai/blob/master/mage_integrations/mage_integrations/sources/twitter_ads/README.md)

Streaming pipeline

New sink: MongoDB

Doc: [https://docs.mage.ai/guides/streaming/destinations/mongodb](https://docs.mage.ai/guides/streaming/destinations/mongodb)

Allow deleting SQS messages manually

Mage supports two ways to delete messages:

1. Delete the message in the data loader automatically after deserializing the message body.
2. Manually delete the message in the transformer after processing it.

Doc: [https://docs.mage.ai/guides/streaming/sources/amazon-sqs#message-deletion-method](https://docs.mage.ai/guides/streaming/sources/amazon-sqs#message-deletion-method)
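Option 2 can be sketched like this, with a boto3-style client passed in; `process_and_delete` and the handler are hypothetical glue, only `delete_message` is the real SQS call:

```python
def process_and_delete(sqs_client, queue_url: str, message: dict, handler):
    """Sketch of manual deletion: run the handler first, and delete the
    SQS message only after processing succeeds, so a failure lets the
    message become visible again for another attempt."""
    result = handler(message['Body'])
    sqs_client.delete_message(
        QueueUrl=queue_url,
        ReceiptHandle=message['ReceiptHandle'],
    )
    return result
```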

Allow running multiple executors for streaming pipeline

Set the `executor_count` variable in the pipeline’s metadata.yaml file to run multiple executors at the same time and scale streaming pipeline execution.

Doc: [https://docs.mage.ai/guides/streaming/overview#run-pipeline-in-production](https://docs.mage.ai/guides/streaming/overview#run-pipeline-in-production)

Improve instructions in the sidebar for getting block output variables

- Update generic block templates for custom, transformer, and data exporter blocks so it's easier for users to pass output from upstream blocks.
- Clarify language for block output variables in Sidekick.
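In the generic templates, upstream outputs arrive as positional arguments; a trimmed sketch (decorator omitted, list-shaped outputs assumed):

```python
def transform(data, *args, **kwargs):
    """Sketch of a generic transformer block: `data` is the output of
    the first upstream block, and `args` holds the outputs of any
    additional upstream blocks, in order."""
    extra_rows = [row for upstream in args for row in upstream]
    return data + extra_rows
```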

![block_output_variables](https://media.graphassets.com/xqBkj4tqQgGUlhmTkER2)

Paginate block runs in Pipeline Detail page and schedules on Trigger page

Added pagination to Triggers and Block Run pages

![pagination_triggers_block_run_pages](https://media.graphassets.com/ymOxoktbSn2ATaWCfsIQ)

Automatically install requirements.txt file after git pulls

After pulling code from a git repository, automatically install the libraries in `requirements.txt` so that pipelines can run successfully without manually installing packages.

Add warning for kernel restarts and show kernel metrics

- Add a warning if the kernel unexpectedly restarts.
- Add memory and CPU metrics for the kernel.

![add_warning_kernel](https://media.graphassets.com/zxyYyaFvS9ea8wGjDN7w)

![add_memory_kernel](https://media.graphassets.com/BNgSOL6NRjS5WqYZtJey)

SQL block: **Customize upstream table names**

Allow setting the table names for upstream blocks when using SQL blocks.

![sql_block_upstream_table](https://media.graphassets.com/hx1K5JGUSjGOZNoO15CF)

Other bug fixes & polish

- Fix “Too many open files” error by providing the option to increase the “maximum number of open files” value: [https://docs.mage.ai/production/configuring-production-settings/overview#ulimit](https://docs.mage.ai/production/configuring-production-settings/overview#ulimit)
- Add `connect_timeout` to PostgreSQL IO
- Add `location` to BigQuery IO
- Mitigate race condition in Trino IO
- When clicking the sidekick navigation, don’t clear the URL params
- UI: support dynamic child if all dynamic ancestors eventually reduce before dynamic child block
- Fix PySpark pipeline deletion issue. Allow pipeline to be deleted without switching kernel.
- DBT block improvement and bug fixes
- Fix the bug when running all DBT models
- Fix DBT test not reading the profile
- Disable notebook shortcuts when adding a new DBT model.
- Remove the `.sql` extension from the DBT model name if the user includes it (the `.sql` extension should not be included).
- Dynamically size the input as the user types a DBT model name, showing a trailing `.sql` suffix to emphasize that the extension should not be typed.
- Raise exception and display in UI when user tries to add a new DBT model to the same file location/path.
- Fix `onSuccess` [callback](https://docs.mage.ai/guides/blocks/callbacks) logging issue
- Fix the `mage run` command: set `repo_path` before initializing the DB so that the correct `db_connection_url` is used.
- Fix bug: Code block running spinner keeps spinning when restarting kernel.
- Fix bug: Terminal doesn’t work in mage demo
- Automatically redirect users to the sign in page if they are signed in but can’t load anything.
- Add folder lines in file browser.
- Fix `ModuleNotFoundError: No module named 'aws_secretsmanager_caching'` when running pipeline from command line

View full [Changelog](https://www.notion.so/What-s-new-7cc355e38e9c42839d23fdbef2dabd2c)
