Python-dlt

Latest version: v0.2.1


Page 3 of 7

0.2.0a18

What's Changed
* small improvements to duckdb docs by TyDunn in https://github.com/dlt-hub/dlt/pull/128
* Fix current columns and new columns referencing the same object by steinitzu in https://github.com/dlt-hub/dlt/pull/127
* tests file rotation on schema changes + bumps duckdb to 0.7 by rudolfix in https://github.com/dlt-hub/dlt/pull/129

New Contributors
* steinitzu made their first contribution in https://github.com/dlt-hub/dlt/pull/127

0.2.0a17

What's Changed
* adds duckdb destination and dbt support by rudolfix in https://github.com/dlt-hub/dlt/pull/124
The 🦆 `db` destination is added and may be used like any other destination. The multithreaded loading is quite fast thanks to `duckdb` dropping the GIL when called. See more in our [docs](https://dlthub.com/docs/destinations#duckdb)
We also support the `dbt-duckdb` adapter, the [jaffle shop example](https://github.com/dlt-hub/dlt/blob/devel/docs/examples/dbt_run_jaffle.py) was converted to `duckdb` to showcase this ability

* transaction support was added to `sql_client`, and BigQuery got multi-statement transactions via sessions


**Full Changelog**: https://github.com/dlt-hub/dlt/compare/0.2.0a16...0.2.0a17

0.2.0a16

What's Changed
* uses structured data types to store json by rudolfix in https://github.com/dlt-hub/dlt/pull/121
* brings back the functionality to run dbt packages by rudolfix in https://github.com/dlt-hub/dlt/pull/122

This update changes how the `complex` data type is stored. The `complex` type is most often generated to hold `json` data when `nesting_level` is limited in the source. Complex values are stored as `JSONB` (Postgres), `SUPER` (Redshift), and `JSON` (BigQuery).

This update also makes it easy to run `dbt` packages together with `dlt` pipelines. Most of the work went into making the feature user friendly and avoiding dependency conflicts, and it is also quite well tested. Take a look at these examples:
https://github.com/dlt-hub/dlt/blob/devel/docs/examples/dbt_run_jaffle.py
https://github.com/dlt-hub/dlt/blob/devel/docs/examples/chess/chess_dbt.py
https://github.com/dlt-hub/dlt/tree/devel/docs/examples/chess/dbt_transform

Running the `dbt` package takes just two lines, and you have more control over it than with the CLI.

0.2.0a10

What's Changed
* only specific default config values are generated in `dlt init`, e.g. the BigQuery location
* the correct Postgres port is generated
* sends traces to sentry if `RUNTIME__SENTRY_DSN` is present
* sends slack notification if `RUNTIME__SLACK_INCOMING_HOOK` is present
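Both integrations are driven purely by configuration; a sketch of the equivalent `config.toml` entries (the DSN and hook URL are placeholders):

```toml
[runtime]
# presence of a DSN enables sending traces to Sentry
sentry_dsn = "https://<public_key>@<org>.ingest.sentry.io/<project_id>"
# presence of a hook URL enables Slack notifications
slack_incoming_hook = "https://hooks.slack.com/services/T000/B000/XXXXXXXX"
```

The same values can be supplied as the environment variables `RUNTIME__SENTRY_DSN` and `RUNTIME__SLACK_INCOMING_HOOK`.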

0.2.0a9

What's Changed
* `dlt init` renames sources and resources by rudolfix in https://github.com/dlt-hub/dlt/pull/106

0.2.0a8

What's Changed
* deletes all pipeline state (schemas, state, intermediate files) if destination dataset is dropped
* synchronizes state with the destination in the `run` method
* you can opt out of state sync with `restore_from_destination=false` in e.g. `config.toml`
* loads all schemas/sources into a single dataset by default. This simplifies the experience for less advanced users. You can switch back to the old behavior (a separate dataset per source/schema) with the `use_single_dataset=false` config option
* enables CTRL-C when running user code
* commits all files extracted from several sources only after all user code has run
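A sketch of the two opt-out settings mentioned above as `config.toml` entries (top-level placement assumed):

```toml
# do not restore pipeline state from the destination on run()
restore_from_destination = false
# revert to the old behavior: a separate dataset per source/schema
use_single_dataset = false
```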
