Lakehouse-engine

Latest version: v1.23.0


1.22.1

- [DOCS] Introduction of GAB (Gold Asset Builder) Documentation
- [DOCS] Introduction of PRISMA Data Quality Documentation
- Improve deployment strategy to be more accurate & faster

1.22.0

- A new Data Quality type, **prisma**, was launched as part of the DQ offering
- The main goal of this new DQ type is to offer better observability, with an enhanced output data model and different ways to interact with it (by directly providing DQ functions or by deriving them automatically from an auxiliary table)
- **Note**: proper documentation with examples will come in newer versions; for now, users can check our local tests for any doubts on using the feature
- A REST API Writer was introduced and you can check the documentation [here](https://adidas.github.io/lakehouse-engine-docs/lakehouse_engine_usage/data_loader/write_to_rest_api.html).
- A Custom SQL Transformer was introduced and the respective documentation is available [here](https://adidas.github.io/lakehouse-engine-docs/lakehouse_engine_usage/data_loader/custom_transformer_sql.html).
- 2 Great Expectations custom expectations were added:
  - expect_column_pair_a_to_be_not_equal_to_b
  - expect_column_pair_date_a_to_be_greater_than_or_equal_to_date_b
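As a plain-Python illustration of the row-level semantics these two expectations check (this is not the Great Expectations implementation, just the comparison each one performs per column pair):

```python
from datetime import date

def pair_a_not_equal_to_b(a, b) -> bool:
    # Semantics of expect_column_pair_a_to_be_not_equal_to_b for one row.
    return a != b

def pair_date_a_gte_date_b(a: date, b: date) -> bool:
    # Semantics of expect_column_pair_date_a_to_be_greater_than_or_equal_to_date_b.
    return a >= b

# Each expectation passes when the check holds for every row of the column pair.
pairs = [(date(2024, 3, 2), date(2024, 3, 1)), (date(2024, 3, 1), date(2024, 3, 1))]
all_rows_pass = all(pair_date_a_gte_date_b(a, b) for a, b in pairs)
```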
- Upgrade several libraries on the lock files due to the upgrade of:
  - pip-tools to 7.4.1
  - pip-audit to 2.7.3
  - pdoc to 14.5.1
  - twine to 5.1.1
  - and the addition of the types-requests library

1.21.0

**Possible Breaking Change on the Lakehouse Engine Installation**
- A new Deployment / Installation strategy was applied to make Lakehouse Engine lighter to install and manage by breaking it into plugins (DQ, Azure, OS, SFTP).
- Now, if people install the Lakehouse Engine without specifying optional dependencies, they will simply install the core package, which is much faster and brings far fewer dependencies, but can break their code if they were using features coming from optional dependencies.
- E.g., if you were doing `pip install lakehouse_engine` and you were using Data Quality features, you should now change the installation command to `pip install lakehouse_engine[dq]`
- More details on the installation can be found in the README file
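For illustration, installations under the plugin split might look like the following (the `dq`, `azure` and `sftp` extras names are inferred from the plugin list in these notes; check the README for the exact names):

```shell
# Core package only - fastest, fewest dependencies:
pip install lakehouse_engine

# Core plus the Data Quality plugin:
pip install "lakehouse_engine[dq]"

# Several plugins at once (assumed extras names - see the README):
pip install "lakehouse_engine[dq,azure,sftp]"
```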

1.20.1

- Implement an alternative to the toJson usage in the GAB algorithm, as it makes use of RDDs, which are not whitelisted on Databricks Unity Shared Clusters
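The general pattern behind such a change can be sketched with the stdlib: serializing collected records driver-side instead of calling the RDD-backed `DataFrame.toJSON`. The plain dicts below stand in for collected Spark rows (e.g. `row.asDict()`); this is an assumed illustration, not the engine's actual change:

```python
import json

# Plain dicts stand in for collected Spark rows (row.asDict()).
rows = [
    {"id": 1, "metric": "sales", "value": 10.5},
    {"id": 2, "metric": "sales", "value": 7.0},
]

# Serialize each record with the stdlib instead of DataFrame.toJSON,
# whose RDD-backed execution is blocked on Unity Shared Clusters.
json_lines = [json.dumps(r, sort_keys=True) for r in rows]
```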

1.20.0

- Introduction of the Gold Assets Builder (GAB) algorithm - an accelerator for the creation of Gold Tables/Materialisations on top of fact tables with different:
  1. aggregations,
  2. dimensions,
  3. metrics,
  4. cadences, and
  5. reconciliation windows
- Fix DQ custom expectation validation
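To make the "cadences, dimensions and metrics" terminology above concrete, here is a toy aggregation over fact rows in plain Python - purely illustrative of the concepts, not the GAB implementation:

```python
from collections import defaultdict
from datetime import date

# Toy fact rows: (event_date, dimension, metric_value).
facts = [
    (date(2024, 1, 1), "DE", 10.0),
    (date(2024, 1, 1), "US", 5.0),
    (date(2024, 1, 8), "DE", 3.0),
]

def aggregate(facts, cadence):
    """Sum the metric per (cadence bucket, dimension).

    cadence: 'DAY' buckets by date, 'WEEK' by ISO (year, week).
    """
    buckets = defaultdict(float)
    for day, dim, value in facts:
        if cadence == "WEEK":
            iso = day.isocalendar()
            key = (iso[0], iso[1])
        else:
            key = day
        buckets[(key, dim)] += value
    return dict(buckets)

daily = aggregate(facts, "DAY")
weekly = aggregate(facts, "WEEK")
```

The same fact rows materialise differently per cadence: the two January dates fall in distinct ISO weeks, so the weekly view keeps them apart per dimension.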

1.19.0

- Ensure the Lakehouse Engine is ready to work in an environment with Databricks Unity Catalog enabled and DBR 13.3: https://docs.databricks.com/en/compute/access-mode-limitations.html#shared-access-mode-limitations-on-unity-catalog
- Upgrade Great Expectations library from 0.17.11 to 0.18.8
- Added File Manager support for DBFS (Databricks file system utils), adding support to directly interact with Databricks Volumes, for example
- Apply code changes related to breaking changes imposed by Databricks Unity and/or DBR 13.3 (e.g. remove RDD and Spark context usages)
- Improve the documentation of the Lakehouse Engine by adding several usage examples and context around them
- More than 30 documentation pages were added and you can find them here:
  - https://adidas.github.io/lakehouse-engine-docs/
  - https://adidas.github.io/lakehouse-engine-docs/lakehouse_engine_usage.html
- Upgrade jinja2 library from 3.0.3 to 3.1.3
- Add support for an advanced parser and more flexibility in using different delimiters for splitting and processing SQL commands
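A minimal sketch of what delimiter-aware SQL splitting involves - honoring a configurable delimiter while ignoring occurrences inside single-quoted string literals. This is an illustration of the problem, not the engine's actual parser:

```python
def split_sql(script: str, delimiter: str = ";"):
    """Split a SQL script on a configurable delimiter, ignoring
    delimiters that appear inside single-quoted string literals."""
    commands, current, in_string = [], [], False
    for ch in script:
        if ch == "'":
            in_string = not in_string
            current.append(ch)
        elif ch == delimiter and not in_string:
            commands.append("".join(current).strip())
            current = []
        else:
            current.append(ch)
    tail = "".join(current).strip()
    if tail:
        commands.append(tail)
    return [c for c in commands if c]
```

For example, `split_sql("SELECT 'a;b' AS v; DROP TABLE t")` keeps the quoted `;` inside the first statement and splits only on the unquoted one.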
