# datacontract-cli

Latest version: v0.10.23



## 0.10.17

### Added
- Added export format **markdown**: `datacontract export --format markdown` (#545)
- When importing from dbt format, add the dbt unique information as a data contract `unique` field (#558)
- When importing from dbt format, add the dbt primary key information as a data contract `primaryKey` field (#562)
- When exporting to dbt format, add the data contract `references` field as a dbt relationships test (#569)
- When importing from dbt format, add the dbt relationships test as a `references` field in the data contract (#570)
- Documented the `serve` command in the README (#592)
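The dbt-side metadata these importers read looks roughly like the following sketch (model and column names are hypothetical): the `unique` test maps to a data contract `unique` field, the primary key to `primaryKey`, and a `relationships` test to a `references` entry.

```yaml
# Hypothetical dbt schema.yml fragment
models:
  - name: orders
    columns:
      - name: order_id
        tests:
          - unique
          - not_null
      - name: customer_id
        tests:
          - relationships:
              to: ref('customers')  # becomes a `references` field on import
              field: id
```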

### Changed
- The `primary` and `example` fields have been deprecated in Data Contract Specification v1.1.0 (#561)
- Defined `primaryKey` and `examples` on the model to follow Data Contract Specification v1.1.0 (#559)
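For reference, the v1.1.0 names are plain YAML keys on a field; a minimal sketch (the model and field names are hypothetical, and PyYAML is assumed for parsing):

```python
import yaml

# Illustrative data contract snippet using the v1.1.0 names
# `primaryKey` and `examples` (replacing the deprecated
# `primary` and `example`).
CONTRACT = """
dataContractSpecification: 1.1.0
models:
  orders:
    fields:
      order_id:
        type: string
        primaryKey: true
        examples:
          - "1001"
          - "1002"
"""

spec = yaml.safe_load(CONTRACT)
field = spec["models"]["orders"]["fields"]["order_id"]
print(field["primaryKey"])   # True
print(field["examples"][0])  # 1001
```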

### Fixed
- SQL Server: reserved words in model names could not be escaped (#557)
- Export to dbt-staging-sql failed on contracts with multiple models (#587)

### Removed
- OpenTelemetry publisher, as it was hardly used

## 0.10.16

### Added
- Support for exporting a data contract to an Iceberg schema definition
- When importing from dbt format, add the dbt `not_null` information as a data contract `required` field (#547)

### Changed
- Improved type conversion when importing contracts from dbt and exporting contracts to dbt (#534)
- Ensure `name` is the first column attribute when exporting to dbt format (#541)
- Renamed dbt's `tests` to `data_tests` (#548)
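The rename tracks dbt's own deprecation of the `tests:` key in favor of `data_tests:`. In generated dbt YAML the new key would look like this (illustrative names):

```yaml
models:
  - name: orders
    columns:
      - name: order_id
        data_tests:   # formerly `tests:`
          - not_null
          - unique
```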

### Fixed
- Modified the arguments so that `--dbt-model` narrows down the import target (#532)
- SodaCL: prevent `KeyError: 'fail'` when running tests
- Populate database and schema values for BigQuery in exported dbt sources (#543)
- Fixed the options for importing and exporting to standard output (#544)
- Fixed the data quality name for model-level and field-level quality tests

## 0.10.15

### Added
- Support for model import from Parquet file metadata
- Great Expectations export: added optional arguments (#496)
  - `suite_name`: the name of the expectation suite to export
  - `engine`: used to run checks
  - `sql_server_type`: the type of SQL Server to use when the engine is `sql`
- Changelog support for `Info` and `Terms` blocks
- `datacontract import` now has an `--output` option for saving the data contract to a file
- Enhanced JSON file validation (local and S3) to return the first error for each JSON object; the maximum number of total errors can be configured via the environment variable `DATACONTRACT_MAX_ERRORS`. Additionally, the primary key is included in the error message.
- Fixed an issue where records with no fields created an invalid BigQuery schema

### Changed
- Changelog support for custom extension keys in `Models` and `Fields` blocks
- `datacontract catalog --files '*.yaml'` now also checks subfolders for matching files
- Optimized the test output table on the console when tests fail

### Fixed
- Raise a proper exception in `DataContractSpecification.from_file` if the file does not exist
- Fixed importing JSON Schemas containing deeply nested objects without a `required` array
- SodaCL: only add data quality tests for executable queries

## 0.10.14

Data Contract CLI now supports the Open Data Contract Standard (ODCS) v3.0.0.

### Added
- `datacontract test` now also supports the ODCS v3 data contract format
- `datacontract export --format odcs_v3`: export to Open Data Contract Standard v3.0.0 (#460)
- `datacontract test` now also supports ODCS v3 and Data Contract SQL quality checks at the field and model level
- Support for import from Iceberg table definitions
- Support for the decimal logical type on Avro export
- Support for custom Trino types

### Changed
- `datacontract import --format odcs`: now supports ODCS v3.0.0 files (#474)
- `datacontract export --format odcs`: now creates v3.0.0 Open Data Contract Standard files (alias of `odcs_v3`). The old version is still available as format `odcs_v2`. (#460)
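An ODCS v3 document is structured differently from v2; a minimal hedged sketch of what the export produces (identifiers and values are hypothetical, see the ODCS v3.0.0 specification for the authoritative shape):

```yaml
apiVersion: v3.0.0
kind: DataContract
id: orders-contract      # hypothetical identifier
version: 1.0.0
status: active
schema:
  - name: orders
    logicalType: object
    properties:
      - name: order_id
        logicalType: string
```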

### Fixed
- Fixed timestamp serialization from Parquet to DuckDB (#472)

## 0.10.13

### Added
- `datacontract export --format data-caterer`: export to [Data Caterer YAML](https://data.catering/setup/guide/scenario/data-generation/)

### Changed
- `datacontract export --format jsonschema`: handle optional and nullable fields (#409)
- `datacontract import --format unity`: handle nested and complex fields (#420)
- `datacontract import --format spark`: handle field descriptions (#420)
- `datacontract export --format bigquery`: handle `bigqueryType` (#422)
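For nullable fields, JSON Schema conventionally expresses nullability as a type union, and optionality by omitting the field from `required`; a hedged sketch of what the export might emit for such a field (property names are illustrative):

```json
{
  "type": "object",
  "properties": {
    "order_id": { "type": "string" },
    "discount": { "type": ["number", "null"] }
  },
  "required": ["order_id"]
}
```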

### Fixed
- Use the correct float type with BigQuery (#417)
- Support the `DATACONTRACT_MANAGER_API_KEY` environment variable
- Some minor bug fixes

## 0.10.12

### Added
- Support for import of DBML models (#379)
- `datacontract export --format sqlalchemy`: export to [SQLAlchemy ORM models](https://docs.sqlalchemy.org/en/20/orm/quickstart.html) (#399)
- Support for varchar max length in Glue import (#351)
- `datacontract publish` now also accepts the `DATACONTRACT_MANAGER_API_KEY` as an environment variable
- Support for required fields in Avro schema export (#390)
- Support for the map data type in Spark import and export (#408)
- Support for enums on export to Avro
- Support for enum titles on Avro import
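In Avro, an enum is a named type with a fixed symbol list, so an enum field in the data contract would map to a schema fragment like this (type and symbol names are illustrative):

```json
{
  "type": "enum",
  "name": "OrderStatus",
  "symbols": ["PLACED", "SHIPPED", "DELIVERED"]
}
```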

### Changed
- Delta Lake now uses DuckDB's native Delta Lake support (#258). The `deltalake` extra has been removed.
- When dumping to YAML (import), the alias name is used instead of the Pythonic name (#373)

### Fixed
- Fixed an issue where the datacontract CLI fails if installed without any extras (#400)
- Fixed an issue where a Glue database without a location created an invalid data contract (#351)
- Fixed the bigint -> long data type mapping (#351)
- Fixed an issue where the column description for a Glue partition key column was ignored (#351)
- Corrected the name of the table parameter for BigQuery import (#377)
- Fixed a failure to connect to the S3 server (#384)
- Fixed a model bug mismatching the specification (`definitions.fields`) (#375)
- Fixed array type handling in Spark import (#408)
