dbt-databricks

Latest version: v1.8.1

1.7.0

Features

- Added support for getting info only on specified relations to improve performance of gathering metadata ([486](https://github.com/databricks/dbt-databricks/pull/486)), also (with generous help from mikealfare) ([499](https://github.com/databricks/dbt-databricks/pull/499))
- Added support for getting freshness from metadata ([481](https://github.com/databricks/dbt-databricks/pull/481))

Fixes

- Node info now gets added to SQLQuery event (thanks davidharting!) ([494](https://github.com/databricks/dbt-databricks/pull/494))
- Compatibility with dbt-spark and dbt-core 1.7.1 ([499](https://github.com/databricks/dbt-databricks/pull/499))

Under the Hood

- Added required adapter tests to ensure compatibility with 1.7.0 ([487](https://github.com/databricks/dbt-databricks/pull/487))
- Improved large seed performance by not casting every value (thanks nrichards17!) ([493](https://github.com/databricks/dbt-databricks/pull/493)). Note: for `file_format="parquet"` we still need to cast.

1.7.0rc1

Fixes

- Fixed a bug where setting a primary key constraint before a null constraint would fail by ensuring null constraints happen first ([479](https://github.com/databricks/dbt-databricks/pull/479))
- Foreign key constraints now work with dbt's constraint structure ([479](https://github.com/databricks/dbt-databricks/pull/479))

Under the Hood

- Compatibility with dbt-spark 1.7.0rc1 ([479](https://github.com/databricks/dbt-databricks/pull/479))

1.6.6

Fixes

- Optimize now runs after creating or updating liquid clustered tables ([463](https://github.com/databricks/dbt-databricks/pull/463))
- Fixed an issue where the new Python library install-from-index behavior broke users who were already customizing their installs ([472](https://github.com/databricks/dbt-databricks/pull/472))

Under the Hood

- Fixed Pylance import errors (thanks dataders) ([471](https://github.com/databricks/dbt-databricks/pull/471))

1.6.5

Features

- When installing Python libraries onto clusters, you can now specify an `index_url` (thanks casperdamen123) ([367](https://github.com/databricks/dbt-databricks/pull/367)); see the sketch after this list
- Log job run information, such as the `run_id`, when submitting Python jobs to Databricks (thanks jeffrey-harrison) ([454](https://github.com/databricks/dbt-databricks/pull/454))
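
As an illustration of the `index_url` option above, here is a minimal sketch of a dbt Python model that asks the adapter to install packages from a custom index on the cluster. The package name and index URL are placeholders, and the exact config keys should be checked against the dbt-databricks Python model docs for your version.

```python
# models/my_python_model.py -- hypothetical dbt Python model
def model(dbt, session):
    dbt.config(
        materialized="table",
        # Libraries the adapter installs on the cluster before running the model.
        packages=["my-internal-lib==1.2.3"],          # placeholder package
        # New in 1.6.5 (PR 367): install those packages from a custom index,
        # e.g. a private PyPI mirror, instead of the default one.
        index_url="https://pypi.example.com/simple",  # placeholder URL
    )
    # Trivial body: return a DataFrame for dbt to materialize as a table.
    return session.sql("select 1 as id")
```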

Fixes

- Node info now gets added to SQLQueryStatus (Thanks colin-rogers-dbt) ([453](https://github.com/databricks/dbt-databricks/pull/453))
- Fixed Python model compatibility with newer DBRs ([459](https://github.com/databricks/dbt-databricks/pull/459))
- Updated the Databricks SDK dependency to avoid relying on an insecure version of `requests` ([460](https://github.com/databricks/dbt-databricks/pull/460))
- Updated Python job submission so that if the cluster is already starting, the adapter waits for it to come up rather than failing ([461](https://github.com/databricks/dbt-databricks/pull/461)); a sketch of the waiting behavior follows this list
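
The last fix above is about not failing when a cluster is still spinning up. The adapter's own implementation isn't reproduced here; as a rough illustration of the behavior, the databricks-sdk ships a helper that starts a cluster if needed and otherwise just waits for it. A minimal sketch, assuming a placeholder cluster id and ambient authentication:

```python
# Minimal sketch (not the adapter's code): wait for a cluster that may
# already be starting instead of erroring out, using databricks-sdk.
from databricks.sdk import WorkspaceClient

w = WorkspaceClient()  # picks up auth from the environment / .databrickscfg

# ensure_cluster_is_running() starts the cluster if it is stopped and
# blocks until it reaches RUNNING, so a cluster that is already starting
# is simply waited on rather than treated as a failure.
w.clusters.ensure_cluster_is_running("0123-456789-abcdefgh")  # placeholder cluster id
```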

1.6.4

Fixes

- Fixed an issue with AWS OAuth M2M flow ([445](https://github.com/databricks/dbt-databricks/pull/445))
- Fixed an issue where every table in hive_metastore would get described ([446](https://github.com/databricks/dbt-databricks/pull/446))

1.6.3

Fixes

- Improved legibility of Python stack traces ([434](https://github.com/databricks/dbt-databricks/pull/434))
- Added `fetchmany` to the cursor, resolving issue 408 (thanks NodeJSmith) ([409](https://github.com/databricks/dbt-databricks/pull/409)); see the usage sketch after this list
- Updated our Databricks Workflow README to make clear that job clusters are not supported targets ([436](https://github.com/databricks/dbt-databricks/pull/436))
- Relaxed the constraint on databricks-sql-connector to allow newer versions ([436](https://github.com/databricks/dbt-databricks/pull/436))
- Streamlined SQL connector output in dbt.log ([437](https://github.com/databricks/dbt-databricks/pull/437))
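
For the `fetchmany` addition, the call follows the standard DB-API cursor interface. A minimal sketch against databricks-sql-connector directly (hostname, HTTP path, token, and query are placeholders, not values from this changelog):

```python
# Minimal sketch of DB-API fetchmany() with databricks-sql-connector.
from databricks import sql

with sql.connect(
    server_hostname="adb-1234567890123456.7.azuredatabricks.net",  # placeholder
    http_path="/sql/1.0/warehouses/abcdef1234567890",              # placeholder
    access_token="dapi-***",                                       # placeholder
) as connection:
    with connection.cursor() as cursor:
        cursor.execute("SELECT * FROM samples.nyctaxi.trips")      # placeholder query
        while True:
            rows = cursor.fetchmany(1000)  # pull results in bounded batches
            if not rows:
                break
            for row in rows:
                ...  # handle each row
```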

Under the Hood

- Switch to running integration tests with OAuth ([436](https://github.com/databricks/dbt-databricks/pull/436))
