Plateau
=======

Latest version: v4.4.0


3.13.1
======

* Fix evaluation of "OR"-connected predicates (295)
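
  Kartothek predicates are given in disjunctive normal form: an outer list of OR-connected conjunctions, each an AND-list of ``(column, operator, value)`` tuples. The following is a minimal plain-Python sketch of those semantics (not the library's actual implementation), showing the OR behaviour this release fixes:

  ```python
  # Minimal sketch of kartothek-style predicate evaluation (illustration
  # only, not the library's code): predicates are in disjunctive normal
  # form, i.e. an outer OR over AND-lists of (column, op, value) tuples.
  import operator

  OPS = {"==": operator.eq, "!=": operator.ne, "<": operator.lt,
         "<=": operator.le, ">": operator.gt, ">=": operator.ge}

  def matches(row, predicates):
      """True if the row satisfies any OR-connected conjunction."""
      return any(
          all(OPS[op](row[col], value) for col, op, value in conjunction)
          for conjunction in predicates
      )

  row = {"country": "DE", "year": 2020}
  # (country == "DE" AND year < 2019) OR (year >= 2020)
  predicates = [
      [("country", "==", "DE"), ("year", "<", 2019)],
      [("year", ">=", 2020)],
  ]
  print(matches(row, predicates))  # True: the second conjunction holds
  ```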

3.13.0
======

Improvements
^^^^^^^^^^^^

* Update timestamp-related code in the Ktk Discover Cube functionality.
* Support backward compatibility with old cubes and fix the CLI entry point.

3.12.0
======

New functionality
^^^^^^^^^^^^^^^^^

* Introduction of the ``cube`` functionality, which is built from multiple Kartothek datasets.
* Basic features: extend, query, remove (partitions), delete (can delete entire
  datasets/cubes), plus API, CLI, core, and IO features.
* Advanced features: multi-dataset with single table, explicit physical partitions, seed-based join system.
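
  To make the "seed-based join system" concrete, here is a hypothetical plain-Python sketch of the idea (names invented, not the kartothek cube API): the seed dataset defines which dimension rows exist, and every other dataset is left-joined onto it.

  ```python
  # Hypothetical sketch of a seed-based join over multiple datasets
  # (invented names, not the kartothek API): the seed defines the cells,
  # other datasets only enrich rows that the seed already contains.
  def seed_join(seed_rows, other, key):
      """Left-join `other` rows onto the seed on dimension column `key`."""
      lookup = {row[key]: row for row in other}
      joined = []
      for row in seed_rows:
          extra = lookup.get(row[key], {})
          merged = {**row, **{k: v for k, v in extra.items() if k != key}}
          joined.append(merged)
      return joined

  seed = [{"city": "berlin", "temp": 20}, {"city": "hamburg", "temp": 18}]
  enrich = [{"city": "berlin", "rain": 5}]  # no row for hamburg
  result = seed_join(seed, enrich, key="city")
  # hamburg keeps only its seed columns; berlin gains the "rain" column
  ```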

3.11.0
======

New functionality
^^^^^^^^^^^^^^^^^

* Add ``kartothek.io_components.metapartition.MetaPartition.get_parquet_metadata`` and ``kartothek.io.dask.dataframe.collect_dataset_metadata``, enabling users to collect information about the Parquet metadata of a dataset (306)
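
  As a hedged sketch of what collecting per-file Parquet metadata yields (the real entry points are the functions named above; the field names here are invented for illustration), such a collection can be folded into one dataset-level summary:

  ```python
  # Illustration only: fold invented per-file Parquet stats into a single
  # summary, the kind of information metadata collection provides.
  from dataclasses import dataclass

  @dataclass
  class FileMetadata:
      path: str
      num_rows: int
      num_row_groups: int
      serialized_size: int  # bytes

  def summarize(files):
      return {
          "files": len(files),
          "rows": sum(f.num_rows for f in files),
          "row_groups": sum(f.num_row_groups for f in files),
          "bytes": sum(f.serialized_size for f in files),
      }

  stats = summarize([
      FileMetadata("part-0.parquet", 1_000, 2, 40_000),
      FileMetadata("part-1.parquet", 500, 1, 21_000),
  ])
  ```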

Bug fixes
^^^^^^^^^

* Performance of dataset update with ``delete_scope`` significantly improved for datasets with many partitions (308)
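
  For readers unfamiliar with ``delete_scope``: each scope entry maps partition columns to values, and a partition is removed when it matches any entry on all of its keys. A minimal plain-Python sketch of those semantics (illustration only, not the library's code):

  ```python
  # Sketch of delete_scope matching (not kartothek's implementation):
  # a partition is in scope if ANY scope dict matches it on ALL its keys.
  def in_delete_scope(partition_keys, delete_scope):
      return any(
          all(partition_keys.get(col) == value for col, value in scope.items())
          for scope in delete_scope
      )

  partitions = [
      {"country": "DE", "year": 2019},
      {"country": "DE", "year": 2020},
      {"country": "US", "year": 2020},
  ]
  delete_scope = [{"country": "DE", "year": 2019}]
  kept = [p for p in partitions if not in_delete_scope(p, delete_scope)]
  ```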

3.10.0
======

Improvements
^^^^^^^^^^^^

* Dispatch performance improved for large datasets including metadata.
* Introduction of the ``dispatch_metadata`` kwarg to the metapartition read pipelines
  to allow for a transition before a future breaking release.

Bug fixes
^^^^^^^^^

* Ensure that the empty (sentinel) DataFrame used in ``kartothek.io.eager.read_table``
  also has the correct behaviour when the ``categoricals`` argument is used.
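
  The property this guarantees can be sketched in pandas directly (the column name here is invented): even an empty frame should carry the requested categorical dtype, so it concatenates cleanly with non-empty partitions.

  ```python
  import pandas as pd

  # Sketch (invented column name) of the guaranteed property: an empty
  # sentinel DataFrame still exposes the requested categorical dtype.
  empty = pd.DataFrame(
      {"color": pd.Categorical([], categories=["red", "blue"])}
  )

  assert empty.empty
  assert str(empty["color"].dtype) == "category"
  assert list(empty["color"].cat.categories) == ["red", "blue"]
  ```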


Breaking changes in ``io_components.read``
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

* The ``dispatch_metapartitions`` and ``dispatch_metapartitions_from_factory``
  functions will no longer attach index and metadata information to the created
  MetaPartition instances, unless explicitly requested.
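
  The shape of this opt-in behaviour can be sketched as follows (invented names, not the real dispatch functions): metadata is attached to a dispatched partition only when explicitly requested, instead of unconditionally.

  ```python
  # Illustration of opt-in metadata attachment (invented names): callers
  # who need metadata must now ask for it explicitly.
  def dispatch(partitions, metadata, attach_metadata=False):
      for label in partitions:
          mp = {"label": label}
          if attach_metadata:  # opt-in, mirroring the breaking change
              mp["metadata"] = metadata.get(label, {})
          yield mp

  meta = {"p1": {"rows": 10}}
  plain = list(dispatch(["p1"], meta))
  rich = list(dispatch(["p1"], meta, attach_metadata=True))
  ```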

3.9.0
=====

Improvements
^^^^^^^^^^^^

* Arrow 0.17.X support.
* Significant performance improvements for shuffle operations in
  ``kartothek.io.dask.dataframe.update_dataset_from_ddf``
  for large dask.DataFrames with many payload columns, by using in-memory
  compression during the shuffle operation.
* Allow calling ``kartothek.io.dask.dataframe.update_dataset_from_ddf``
  without ``partition_on`` when ``shuffle=True``.
* ``kartothek.io.dask.dataframe.read_dataset_as_ddf`` supports the kwarg ``dispatch_by``
  to control the internal partitioning structure when creating a dataframe.
* ``kartothek.io.dask.dataframe.read_dataset_as_ddf`` and ``kartothek.io.dask.dataframe.update_dataset_from_ddf``
  now allow the ``table`` keyword to be optional, defaulting to the SINGLE_TABLE identifier
  (recommended, since multi-table dataset support is being sunset).
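
  The idea behind in-memory compression during a shuffle can be sketched with the standard library (not kartothek's code): serialized partition payloads are compressed while they sit in memory between the shuffle's map and reduce phases, trading CPU for a much smaller transient footprint on wide payload tables.

  ```python
  import pickle
  import zlib

  # Hedged sketch of in-memory shuffle compression (illustration only):
  # compress the pickled payload while it is held between shuffle phases.
  def pack(rows):
      return zlib.compress(pickle.dumps(rows))

  def unpack(blob):
      return pickle.loads(zlib.decompress(blob))

  payload = [{"key": i % 2, "value": "x" * 100} for i in range(1000)]
  blob = pack(payload)
  assert len(blob) < len(pickle.dumps(payload))  # repetitive data shrinks
  assert unpack(blob) == payload                 # round-trip is lossless
  ```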
