Dagster


1.0.4

New

- Assets can now be materialized to storage conditionally by setting `output_required=False` (illustrated in a sketch after this list). If this is set and no result is yielded from the asset, Dagster will not create an asset materialization event, the I/O manager will not be invoked, downstream assets will not be materialized, and asset sensors monitoring the asset will not trigger.
- `JobDefinition.run_request_for_partition` can now be used inside sensors that target multiple jobs (Thanks Metin Senturk!)
- The environment variable `DAGSTER_GRPC_TIMEOUT_SECONDS` now allows overriding the default timeout for communications between host processes (such as Dagit and the daemon) and user code servers.
- Import time for the `dagster` module has been reduced by approximately 50% in initial measurements.
- `AssetIn` now accepts a `dagster_type` argument for specifying runtime checks on asset input values (also sketched after this list).
- [dagit] The column names on the Activity tab of the asset details page no longer reference the legacy term “Pipeline”.
- [dagster-snowflake] The `execute_query` method of the snowflake resource now accepts a `use_pandas_result` argument, which fetches the result of the query as a Pandas dataframe. (Thanks swotai!)
- [dagster-shell] Made the `execute` and `execute_script_file` utilities in `dagster_shell` part of the public API. (Thanks Fahad Khan!)
- [dagster-dbt] `load_assets_from_dbt_project` and `load_assets_from_dbt_manifest` now support the `exclude` parameter. (Thanks flvndh!)
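
As a quick illustration of the conditional materialization item above, here is a minimal sketch. The asset name and the freshness check are hypothetical; the pattern is to declare `output_required=False` and only yield an `Output` when there is something worth materializing.

```python
import random

from dagster import Output, asset


@asset(output_required=False)
def maybe_new_rows():
    # Hypothetical freshness check; replace with a real "is there new data?" test.
    has_new_data = random.random() > 0.5
    if has_new_data:
        # Only when an Output is yielded does Dagster record a materialization,
        # invoke the I/O manager, and run downstream assets and asset sensors.
        yield Output([1, 2, 3])
```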
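
And a small sketch of the new `AssetIn(dagster_type=...)` argument, using a hypothetical `DagsterType` and asset names chosen for illustration:

```python
from dagster import AssetIn, DagsterType, asset

# Hypothetical runtime check: the upstream value must be a non-empty list.
NonEmptyList = DagsterType(
    name="NonEmptyList",
    type_check_fn=lambda _context, value: isinstance(value, list) and len(value) > 0,
)


@asset
def upstream_rows():
    return [1, 2, 3]


@asset(ins={"upstream_rows": AssetIn(dagster_type=NonEmptyList)})
def row_count(upstream_rows):
    return len(upstream_rows)
```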

Bugfixes

- [dagit] Removed the x-frame-options response header from Dagit, allowing the Dagit UI to be rendered in an iframe.
- [fully-featured project example] Fixed the DuckDB IO manager so the `comment_stories` step can load data successfully.
- [dagster-dbt] Previously, if a `select` parameter was configured on the `dbt_cli_resource`, it would not be passed into invocations of `context.resources.dbt.run()` (and other similar commands). This has been fixed.
- [dagster-ge] An incompatibility between `dagster_ge_validation_factory` and dagster 1.0 has been fixed.
- [dagstermill] Previously, updated arguments and properties to `DagstermillExecutionContext` were not exposed. This has since been fixed.

Documentation

- The integrations page on the docs site now has a section for links to community-hosted integrations. The first linked integration is silentsokolov’s Vault integration.

1.0.3

New

- `Failure` now has an `allow_retries` argument, providing a way to manually bypass retry policies (see the sketch after this list).
- `dagstermill.get_context` and `dagstermill.DagstermillExecutionContext` have been updated to reflect stable dagster-1.0 APIs. Arguments and properties that reference `pipeline`/`solid` will be removed in the next major version bump of `dagstermill`.
- `TimeWindowPartitionsDefinition` now exposes a `get_cron_schedule` method.
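
A minimal sketch of the new `allow_retries` argument on `Failure`; the op, its retry policy, and the `fetch_record` helper are hypothetical:

```python
from dagster import Failure, RetryPolicy, op


def fetch_record():
    # Stand-in for a real lookup that may legitimately come back empty.
    return None


@op(retry_policy=RetryPolicy(max_retries=3))
def load_record():
    record = fetch_record()
    if record is None:
        # allow_retries=False bypasses the op's retry policy for errors
        # that retrying will not fix.
        raise Failure(description="record is permanently missing", allow_retries=False)
    return record
```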

Bugfixes

- In some situations where a materialized asset depended on a partitioned asset, and that upstream partitioned asset wasn’t part of the run, the partition-related methods of `InputContext` returned incorrect values or failed erroneously. This has been fixed.
- Schedules and sensors with the same names but in different repositories no longer affect each other’s idempotence checks.
- In some circumstances, reloading a repository in Dagit could lead to an error that would crash the page. This has been fixed.

Community Contributions

- will-holley added an optional `key` argument to `GCSFileManager` methods to set the GCS blob key, thank you!
- Fix for sensors in [fully featured example](https://docs.dagster.io/guides/dagster/example_project#fully-featured-project), thanks pwachira!

Documentation

- New documentation for getting started with Dagster Cloud, including:
  - [Serverless deployment documentation](https://docs.dagster.io/dagster-cloud/getting-started/getting-started-with-serverless-deployment)
  - [Hybrid deployment documentation](https://docs.dagster.io/dagster-cloud/getting-started/getting-started-with-hybrid-deployment)

1.0.2

New

- When the workspace is updated, a notification will appear in Dagit, and the Workspace tab will automatically refresh.

Bugfixes

- Restored the correct version mismatch warnings between Dagster core and Dagster integration libraries.
- `Field.__init__` has been typed, which resolves an error that Pylance would raise about `default_value`.
- Previously, `dagster_type_materializer` and `dagster_type_loader` expected functions to take a context argument from an internal dagster import. We’ve added `DagsterTypeMaterializerContext` and `DagsterTypeLoaderContext` so that functions annotated with these decorators can annotate their arguments properly (see the sketch after this list).
- Previously, a single-output op with a return description would not pick up the description of the return. This has been rectified.
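
To illustrate the new context types mentioned above, here is a sketch of a loader function annotated with `DagsterTypeLoaderContext`; the CSV-loading logic and the type and function names are hypothetical:

```python
import csv

from dagster import DagsterType, DagsterTypeLoaderContext, dagster_type_loader


@dagster_type_loader(config_schema=str)
def rows_loader(_context: DagsterTypeLoaderContext, file_path: str):
    # Load the configured CSV file into a list of rows.
    with open(file_path, newline="") as f:
        return list(csv.reader(f))


Rows = DagsterType(
    name="Rows",
    type_check_fn=lambda _context, value: isinstance(value, list),
    loader=rows_loader,
)
```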

Community Contributions

- Fixed the `dagster_slack` documentation examples. Thanks ssingh13-rms!

Documentation

- New documentation for [Dagster Cloud environment variables](https://docs.dagster.io/dagster-cloud/developing-testing/environment-variables).
- The full list of APIs removed in 1.0 has been added to the [migration guide](https://github.com/dagster-io/dagster/blob/master/MIGRATION.md).

1.0.1

Bugfixes

- Fixed an issue where Dagster libraries would sometimes log warnings about mismatched versions despite having the correct version loaded.

Documentation

- The [Dagster Cloud docs](https://docs.dagster.io/dagster-cloud) now live alongside all the other Dagster docs! Check them out by navigating to Deployment > Cloud.

1.0.0

Major Changes

- A docs site overhaul! Along with tons of additional content, the existing pages have been significantly edited and reorganized to improve readability.
- All Dagster [examples](https://github.com/dagster-io/dagster/tree/master/examples) are revamped with a consistent project layout, descriptive names, and more helpful README files.
- A new `dagster project` CLI contains commands for bootstrapping new Dagster projects and repositories:
  - `dagster project scaffold` creates a folder structure with a single Dagster repository and other files such as `workspace.yaml`, so you can quickly start building a new Dagster project with everything set up.
  - `dagster project from-example` downloads one of the Dagster examples, so you can quickly bootstrap your project from an officially maintained example. You can find the available examples via `dagster project list-examples`.
  - Check out [Create a New Project](https://docs.dagster.io/getting-started/create-new-project) for more details.
- A `default_executor_def` argument has been added to the `repository` decorator. If specified, this will be used for any jobs (asset or op) which do not explicitly set an `executor_def`.
- A `default_logger_defs` argument has been added to the `repository` decorator, which works in the same way as `default_executor_def`.
- A new `execute_job` function presents a Python API for kicking off runs of your jobs (a sketch follows this list).
- Run status sensors may now yield `RunRequests`, allowing you to kick off a job in response to the status of another job (also sketched after this list).
- When loading an upstream asset or op output as an input, you can now set custom loading behavior using the `input_manager_key` argument to `AssetIn` and `In`.
- In the UI, the global lineage graph has been brought back and reworked! The graph keeps assets in the same group visually clustered together, and the query bar allows you to visualize a custom slice of your asset graph.
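
As a sketch of the new `execute_job` API: the job itself is a throwaway example, and `DagsterInstance.get()` assumes `DAGSTER_HOME` is set.

```python
from dagster import DagsterInstance, execute_job, job, op, reconstructable


@op
def say_hello():
    return "hello"


@job
def hello_job():
    say_hello()


if __name__ == "__main__":
    # execute_job takes a reconstructable reference to the job plus an instance.
    instance = DagsterInstance.get()
    result = execute_job(reconstructable(hello_job), instance=instance)
    print(result.success)
```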
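
And a sketch of a run status sensor that kicks off another job by yielding a `RunRequest`; the `monitored_jobs`/`request_job` wiring and both job names here are assumptions for illustration:

```python
from dagster import DagsterRunStatus, RunRequest, job, op, run_status_sensor


@op
def do_work():
    return 1


@job
def upstream_job():
    do_work()


@job
def report_job():
    do_work()


# Hypothetical wiring: when upstream_job succeeds, request a run of report_job.
@run_status_sensor(
    run_status=DagsterRunStatus.SUCCESS,
    monitored_jobs=[upstream_job],
    request_job=report_job,
)
def kick_off_report(_context):
    yield RunRequest(run_key=None)
```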

Breaking Changes and Deprecations

Legacy API Removals

In 1.0.0, a large number of previously-deprecated APIs have been fully removed. A full list of breaking changes and deprecations, alongside instructions on how to migrate older code, can be found in [MIGRATION.md](https://github.com/dagster-io/dagster/blob/master/MIGRATION.md). At a high level:

- The `solid` and `pipeline` APIs have been removed, along with references to them in extension libraries, arguments, and the CLI _(deprecated in `0.13.0`)_.
- The `AssetGroup` and `build_asset_job` APIs, and a host of deprecated arguments to asset-related functions, have been removed _(deprecated in `0.15.0`)_.
- The `EventMetadata` and `EventMetadataEntryData` APIs have been removed _(deprecated in `0.15.0`)_.

Deprecations

- `dagster_type_materializer` and `DagsterTypeMaterializer` have been marked experimental and will likely be removed within a 1.x release. Instead, use an `IOManager`.
- `FileManager` and `FileHandle` have been marked experimental and will likely be removed within a 1.x release.

Other Changes

- As of 1.0.0, Dagster no longer guarantees support for Python 3.6. This is in line with [PEP 494](https://peps.python.org/pep-0494/), which outlines that 3.6 has reached end of life.
- **[planned]** In an upcoming 1.x release, we plan to make a change that renders values supplied to `configured` in Dagit. Up through this point, values provided to `configured` have not been sent anywhere outside the process where they were used. This change will mean that, like other places you can supply configuration, `configured` is not a good place to put secrets: **You should not include any values in configuration that you don't want to be stored in the Dagster database and displayed inside Dagit.**
- `fs_io_manager`, `s3_pickle_io_manager`, `gcs_pickle_io_manager`, and `adls_pickle_io_manager` no longer write out a file or object when handling an output with the `None` or `Nothing` type.
- The `custom_path_fs_io_manager` has been removed, as its functionality is entirely subsumed by the `fs_io_manager`, where a custom path can be specified via config.
- The default `typing_type` of a `DagsterType` is now `typing.Any` instead of `None`.
- Dagster’s integration libraries haven’t yet achieved the same API maturity as Dagster core. For this reason, all integration libraries will remain on a pre-1.0 (0.16.x) versioning track for the time being. However, 0.16.x library releases remain fully compatible with Dagster 1.x. In the coming months, we will graduate integration libraries one-by-one to the 1.x versioning track as they achieve API maturity. If you have installs of the form:

    pip install dagster=={DAGSTER_VERSION} dagster-somelibrary=={DAGSTER_VERSION}

this should be converted to:

    pip install dagster=={DAGSTER_VERSION} dagster-somelibrary

to make sure the correct library version is installed.

0.15.8

New

- Software-defined asset config schemas are no longer restricted to `dict`s (see the sketch after this list).
- The `OpDefinition` constructor now accepts `ins` and `outs` arguments, to make direct construction easier.
- `define_dagstermill_op` accepts `ins` and `outs` in order to make direct construction easier.
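
A short sketch of the relaxed asset config schemas noted above; the asset name is hypothetical, and the point is simply that `config_schema` can now be a bare type such as `int` rather than a `dict`:

```python
from dagster import asset


@asset(config_schema=int)
def sampled_rows(context):
    # With a scalar config schema, op_config is the configured int itself.
    limit = context.op_config
    return list(range(limit))
```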

Bugfixes

- Fixed a bug where default configuration was not applied when assets were selected for materialization in Dagit.
- Fixed a bug where `RunRequests` returned from `run_status_sensors` caused the sensor to error.
- When supplying config to `define_asset_job`, an error would occur when selecting most asset subsets. This has been fixed.
- Fixed an error introduced in 0.15.7 that would prevent viewing the execution plan for a job re-execution from 0.15.0 → 0.15.6.
- [dagit] The Dagit server now returns `500` http status codes for GraphQL requests that encountered an unexpected server error.
- [dagit] Fixed a bug that made it impossible to kick off materializations of a partitioned asset if the `day_offset`, `hour_offset`, or `minute_offset` parameters were set on the asset’s partitions definition.
- [dagster-k8s] Fixed a bug where overriding the Kubernetes command used to run a Dagster job via the `dagster-k8s/config` tag didn’t actually override the command.
- [dagster-datahub] Pinned version of `acryl-datahub` to avoid build error.

Breaking Changes

- The constructor of `JobDefinition` objects now accepts a `config` argument, and the `preset_defs` argument has been removed.

Deprecations

- `DagsterPipelineRunMetadataValue` has been renamed to `DagsterRunMetadataValue`. `DagsterPipelineRunMetadataValue` will be removed in 1.0.

Community Contributions

- Thanks to hassen-io for fixing a broken link in the docs!

Documentation

- `MetadataEntry` static methods are now marked as deprecated in the docs.
- `PartitionMapping`s are now included in the API reference.
- A dbt example and memoization example using legacy APIs have been removed from the docs site.
