Ploomber

Latest version: v0.23.3

0.12

- Changes the logic that determines project root: only considers `pipeline.yaml` and `setup.py` (instead of `environment.yml` or `requirements.txt`)
- Adds configuration and scaffold user guides
- Updates Jupyter user guide
- Deletes conda user guide
- Renames internal modules for consistency (this should not impact end-users)
- Fixes error that caused `File` products generated from `TaskGroup`s in the Spec API not to resolve to absolute paths
- Fixes error that caused metadata not to be deleted when saving files in Jupyter if a source was used in more than one task
- `DAGSpec` loads an `env.{name}.yaml` file when loading a `pipeline.{name}.yaml` if one exists
- `ploomber plot` saves to `pipeline.{name}.png`
- Adds support for overriding which `env.yaml` file is loaded via the `PLOOMBER_ENV_FILENAME` environment variable (sketch after this list)
- `EnvDict` initialization no longer searches recursively; that logic moved to `EnvDict.find`. The `with_env` decorator now uses the latter, so the existing API keeps working
- `PostgresCopyFrom` compatible with `psycopg>=2.9`
- `jupyter_hot_reload=True` by default
- `PythonCallableSource` finds the location of a dotted path without importing any of the submodules
- Jupyter integration lazily loads DAGs (no need to import callable tasks)
- CLI no longer shows `env.yaml` parameters when initializing from a directory or pattern
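
A minimal sketch of the two env-loading behaviors above (the `env.{name}.yaml` convention and the `PLOOMBER_ENV_FILENAME` override); the file names `pipeline.train.yaml`, `env.train.yaml`, and `env.local.yaml` are hypothetical placeholders:

```python
import os

from ploomber.spec import DAGSpec

# Loading pipeline.train.yaml automatically picks up env.train.yaml if it
# exists (hypothetical file names following the pipeline.{name}.yaml /
# env.{name}.yaml convention)
dag = DAGSpec('pipeline.train.yaml').to_dag()

# Override which env file is loaded, regardless of the pipeline name,
# by setting PLOOMBER_ENV_FILENAME before loading the spec
os.environ['PLOOMBER_ENV_FILENAME'] = 'env.local.yaml'
dag = DAGSpec('pipeline.yaml').to_dag()
```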

0.11.1

- Task's `metadata.params` stores `null` if any parameter isn't serializable
- Task status ignores `metadata.params` if they are `null`
- Fixes deserialization when an upstream task produces a `MetaProduct`

0.11

- Adds `remote` parameter to `DAG.render` to check task status against remote storage (sketch after this list)
- `NotebookSource` no longer includes the injected cell in its `str` representation
- `Metadata` uses task params to determine task status
- Support for wildcards when building a DAG partially
- Support for skipping upstream dependencies when building partially
- Faster `File` remote metadata downloads using multi-threading during `DAG.render`
- Faster parallel download of upstream dependencies using multi-threading during `Task.build`
- Suppresses papermill `FutureWarning` due to importing a deprecated `pyarrow` module
- Fixes error that caused a warning due to unused env params when using `import_tasks_from`
- Other bug fixes
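
A short sketch of the new `remote` argument; loading the DAG from a `pipeline.yaml` via `DAGSpec` is standard Spec API usage, and only `remote=True` comes from this release:

```python
from ploomber.spec import DAGSpec

# Load the pipeline from its spec
dag = DAGSpec('pipeline.yaml').to_dag()

# Check task status against remote storage (e.g., File products backed
# by a cloud storage client) instead of local metadata only
dag.render(remote=True)

# Show the resulting status table
print(dag.status())
```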

0.10.4

- `DAGSpec.find` exposes `starting_dir` parameter
- `ploomber install` supports `pip`'s `requirements.txt` files
- `ploomber install` supports non-packages (i.e., no `setup.py`)
- Adds `ploomber scaffold` flags to use conda (`--conda`) and to create a package (`--package`)

0.10.3

- `ParamGrid` supports initialization from a list
- Adds `tasks[*].grid` to generate multiple tasks at once
- Support for using wildcards to declare dependencies (e.g., `task-*`; sketch after this list)
- Fixes to `ploomber scaffold` and `ploomber install`
- `PythonCallable` creates parent directories before execution
- Support for the parallel executor in Spec API
- `DAGSpec.find` exposes a `lazy_import` argument
- `TaskGroup` internal API changes
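
A sketch of a wildcard dependency declared from a script-based task; the script name and the `clean-*` tasks are hypothetical, only the `task-*`-style wildcard matching comes from this release:

```python
# fit.py -- a script-based task; Ploomber reads the "parameters" cell
# below to extract the upstream declaration

# %% tags=["parameters"]
# The wildcard matches every task whose name starts with "clean-"
# (e.g., clean-a, clean-b) instead of listing each one explicitly
upstream = ['clean-*']
product = None
```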

0.10.2

- `GCloudStorageClient` loads credentials relative to the project root
- Adds `ploomber install`
- Adds `S3Client`
