ZenML

0.1.5

New Features
* Added the [Kubernetes Orchestrator](https://github.com/maiot-io/zenml/tree/main/zenml/core/backends/orchestrator/kubernetes) to run pipelines on a Kubernetes cluster; see the sketch after this list.
* Added time-series support with the [StandardSequencerStep](https://github.com/maiot-io/zenml/blob/main/zenml/core/steps/sequencer/standard_sequencer/standard_sequencer.py).
* Added more CLI groups such as `step`, `datasource`, and `pipelines`. E.g., `zenml pipeline list` lists the pipelines in the current repo.
* Completed a significant portion of the [Docs](https://docs.zenml.io).
* Refactored Step Interfaces for easier integration with other libraries.
* Added a [GAN Example](https://github.com/maiot-io/zenml/tree/main/examples/gan) to showcase the ImageDatasource.
* Set up the base for more Trainer Interfaces such as PyTorch, scikit-learn, etc.
* Added ability to see historical steps.
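
A minimal sketch of pointing a pipeline at the new Kubernetes orchestrator backend. The import path follows the module layout linked above, but the class name, constructor arguments, and `run(backends=...)` call are illustrative assumptions rather than the verified 0.1.5 API.

```python
# Sketch only: class name, constructor arguments and the run(backends=...) call
# are assumptions modeled on the 0.1.x layout, not the verified 0.1.5 API.
from zenml.core.pipelines.training_pipeline import TrainingPipeline
from zenml.core.backends.orchestrator.kubernetes.orchestrator_kubernetes_backend import (
    OrchestratorKubernetesBackend,
)

pipeline = TrainingPipeline(name="k8s_example")
# ... add datasource, split, preprocessing and trainer steps here ...

# Hand the run off to the Kubernetes cluster instead of the local orchestrator.
pipeline.run(backends=[OrchestratorKubernetesBackend()])
```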

Bug Fixes
* Fixed an issue where all files, not just YAML files, were picked up while parsing `pipelines_dir`, in reference to concerns raised in 13; see the illustration below.
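
For context, the intended behavior is easy to illustrate with plain Python: only `*.yaml` files under the pipelines directory are considered, and everything else is ignored. The directory name below is just an example, not the repo's actual configuration.

```python
from pathlib import Path

pipelines_dir = Path("pipelines")  # example path; configured per repo in practice

# Only YAML pipeline configs are parsed; other files in the directory are skipped.
for config in sorted(pipelines_dir.glob("*.yaml")):
    print(f"Would parse pipeline config: {config}")
```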

Upcoming changes
* The next release will be a major one and will involve refactoring design decisions, which might cause backward-incompatible changes to existing ZenML repos.

0.1.4

New Features
* Added the ability to use a custom image with the Dataflow ProcessingBackend; see the sketch below.
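
A rough sketch of what supplying that custom image might look like. The module path, class name, and constructor arguments below are assumptions modeled on the backend naming used elsewhere in these notes, not the verified 0.1.4 signature.

```python
# Sketch only: module path, class name and arguments are illustrative
# assumptions, not the verified 0.1.4 Dataflow processing backend API.
from zenml.core.backends.processing.processing_dataflow_backend import (
    ProcessingDataFlowBackend,
)

processing_backend = ProcessingDataFlowBackend(
    project="my-gcp-project",                         # hypothetical GCP project id
    image="gcr.io/my-project/my-beam-image:latest",   # the new custom image option
)

# The backend would then be attached to a pipeline run, e.g.:
# pipeline.run(backends=[processing_backend])
```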

Bug Fixes
* Fixed `requirements.txt` and `setup.py` to enable local builds.
* The pip package now installs without requirement conflicts.
* Added custom docs built with Jupyter Book in the `docs/book` folder.

0.1.3

New Features
* Launch preemptible GCP VM instances to orchestrate pipelines with the OrchestratorGCPBackend. See the full example [here](https://github.com/maiot-io/zenml/tree/main/examples/gcp_orchestrated/run.py).
* Train using Google Cloud AI Platform with the SingleGPUTrainingGCAIPBackend. See the full example [here](https://github.com/maiot-io/zenml/tree/main/examples/gcp_trained/run.py).
* Use Dataflow for distributed preprocessing. See the full example [here](https://github.com/maiot-io/zenml/tree/main/examples/gcp_dataflow/run.py).
* Run pipelines locally with a SQLite Metadata Store, a local Artifact Store, and a local Pipelines Directory; see the sketch after this list.
* Native Git integration: every step is pinned to the Git SHA of the code at the time the pipeline it is used in is run. See details [here](https://docs.zenml.io/repository/integration-with-git).
* All pipeline runs are reproducible with a unique combination of the Metadata Store, Artifact Store, and Pipelines Directory.
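
As a rough illustration of that local setup, the three components can be thought of as a small configuration like the one below. The keys and paths are illustrative, not the literal `.zenml_config` schema from 0.1.3.

```python
# Illustrative only: the shape of a local ZenML configuration, not the literal
# .zenml_config schema shipped in 0.1.3.
local_config = {
    "metadata_store": {"type": "sqlite", "uri": ".zenml/local_store/metadata.db"},
    "artifact_store": ".zenml/local_store/artifacts",
    "pipelines_dir": "pipelines",
}

# The same trio (Metadata Store + Artifact Store + Pipelines Directory) is what
# makes a pipeline run reproducible, as noted in the last bullet above.
print(local_config)
```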

Bug Fixes
* The Metadata Store and Artifact Store specified in pipelines are now disassociated from the default `.zenml_config` file.
* Fixed a typo in the default Docker image constants.
