ClearML

Latest version: v1.16.5

Page 12 of 21

1.0

1. Take down the serving containers (docker-compose or k8s)
2. Update the clearml-serving CLI `pip3 install -U clearml-serving`
3. Re-add a single existing endpoint with `clearml-serving model add ...` (answer "yes" when prompted; this upgrades the clearml-serving session definitions)
4. Pull latest serving containers (`docker-compose pull ...` or k8s)
5. Re-spin serving containers (docker-compose or k8s)
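The steps above can be sketched as a shell sequence for the docker-compose variant (the compose file path and the `model add` arguments are placeholders; adapt them to your own deployment):

```shell
# 1. take down the serving containers
docker-compose -f docker-compose.yml down

# 2. update the clearml-serving CLI
pip3 install -U clearml-serving

# 3. re-add one existing endpoint to upgrade the session definitions
#    (answer "yes" when prompted; endpoint/engine/name values are placeholders)
clearml-serving model add --engine sklearn --endpoint my_endpoint \
    --preprocess preprocess.py --name my_model

# 4. pull the latest serving containers
docker-compose -f docker-compose.yml pull

# 5. re-spin the serving containers
docker-compose -f docker-compose.yml up -d
```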

1.0.0

Stable Release

**Notice: This release is not backwards compatible**


* Breaking Changes
  * Pre/post processing class functions now receive 3 arguments, see [example](https://github.com/allegroai/clearml-serving/blob/a12311c7d6f273cb02d1e09cf1135feb2afc3338/clearml_serving/preprocess/preprocess_template.py#L27)
  * Add support for per-request state storage, passing information between the pre/post processing functions
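As an illustration, a minimal pre/post processing class under the new 3-argument interface might look like the following. This is a sketch based on the linked template (method and argument names follow that template; the body logic is hypothetical):

```python
from typing import Any, Callable, Optional


class Preprocess:
    """Sketch of the 3-argument pre/post processing interface.

    `state` is a per-request dict: values stored in preprocess() are
    available again in postprocess() for the same request.
    """

    def preprocess(
        self,
        body: dict,
        state: dict,
        collect_custom_statistics_fn: Optional[Callable[[dict], None]] = None,
    ) -> Any:
        # remember something about the request for the postprocess step
        state["n_inputs"] = len(body.get("values", []))
        if collect_custom_statistics_fn:
            # optionally report custom per-request statistics
            collect_custom_statistics_fn({"n_inputs": state["n_inputs"]})
        return body.get("values", [])

    def postprocess(
        self,
        data: Any,
        state: dict,
        collect_custom_statistics_fn: Optional[Callable[[dict], None]] = None,
    ) -> dict:
        # read back the per-request state stored in preprocess()
        return {"result": data, "n_inputs": state["n_inputs"]}
```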

* Features & Bug fixes
* Optimize serving latency while collecting statistics
* Fix metric statistics collecting auto-refresh issue
* Fix live update of model preprocessing code
* Add `pandas` to the default serving container
* Add per endpoint/variable statistics collection control
* Add `CLEARML_EXTRA_PYTHON_PACKAGES` for easier additional python package support (serving inference container)
* Upgrade Nvidia Triton base container image to 22.04 (requires Nvidia drivers 510+)
* Add Kubernetes Helm chart
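For `CLEARML_EXTRA_PYTHON_PACKAGES`, a possible env-file fragment for the serving inference container could look like this (the package names are placeholder examples):

```shell
# docker-compose env fragment (hypothetical values): space-separated pip
# specs installed into the serving inference container at startup
CLEARML_EXTRA_PYTHON_PACKAGES="xgboost==1.7.6 boto3"
```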

0.17.5

Features

- Add `force_download` argument to `Artifact.get()` and `Artifact.get_local_copy()` (#319)
- Support all reporting using subprocesses instead of threads (default `sdk.development.report_use_subprocess=True`)
- Improve Datasets support
* Add `clearml-data publish` to allow publishing a dataset task
  * Add `clearml-data sync`, which can now create, sync, and close a dataset with a single command
* Allow Dataset to be created on a running Task
* Add `dataset_tags` argument to `Dataset.get()`
* Add `Dataset.get_logger()`
- Add `Task.add_requirements()` support for version specifiers (`<`, `>=`, `~=`, etc.)
- Add `StorageManager.upload_folder()` and `StorageManager.download_folder()`
- Add progress report logging for `StorageHelper.upload_from_stream()` and `StorageHelper.upload()`
- Add Jupyter auto-magic to store the Jupyter notebook as an artifact on the Task (default `sdk.development.store_jupyter_notebook_artifact=True`)
- Add upload of an HTML preview of the Jupyter notebook as an artifact
- Add `PipelineController` option to disable cloning the base task
- Add links to Tasks in optimization summary table (not just Task IDs)
- Add support for datetime in scatter plots + matplotlib support
- Improve plotly value type conforming
- Improve PyTorch `DataLoader` speed (#207)
- Update Auto Scaler default values and configuration
- Examples
* Add Hydra example
* Add artifacts retrieval example
* Update various examples
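To illustrate the kind of progress reporting added to the `StorageHelper` upload methods, here is a generic, self-contained sketch (not ClearML's implementation; the function name and callback are hypothetical): a stream is copied in chunks and a callback receives the cumulative byte count after each chunk.

```python
import io
from typing import Callable, Optional


def copy_stream_with_progress(
    src: io.BufferedIOBase,
    dst: io.BufferedIOBase,
    chunk_size: int = 1024,
    report: Optional[Callable[[int], None]] = None,
) -> int:
    """Copy `src` to `dst` in chunks, reporting cumulative bytes copied."""
    total = 0
    while True:
        chunk = src.read(chunk_size)
        if not chunk:
            break
        dst.write(chunk)
        total += len(chunk)
        if report:
            report(total)  # e.g. log "uploaded N bytes so far"
    return total
```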

Bug Fixes

- Fix warning/error message when requirements parsing fails (#291)
- Fix pytorch-lightning multi-node store (#292)
- Fix strip remote diff (#295)
- Fix python package detection `sklearn` -> `scikit-learn` (#296)
- Fix argparse issues
  * Fix argparse with `[None]` as the default parameter (#297)
  * Fix parsing of arguments given in scientific notation (#313)
  * Fix argparse logging always capturing defaults (Windows only, cmd ignored)
  * Fix argparse `nargs` passed on the command line: `--nargs 1 2` should be stored as `[1, 2]`, not `['1', '2']`
  * Fix support for nonstandard argparse with a default value that is not of the defined type
  * Fix server being updated with argparse values in remote execution before `Task.init()` is called (respect skipped args)
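The `nargs` fix aligns ClearML's captured values with standard argparse semantics, which can be demonstrated with stdlib argparse alone (this snippet shows the expected typed result, not ClearML's internals):

```python
import argparse

# With a declared type, argparse stores typed values: `--nargs 1 2`
# should end up as [1, 2], not ['1', '2'] -- the behavior the fix
# restores in ClearML's captured parameters.
parser = argparse.ArgumentParser()
parser.add_argument("--nargs", nargs="+", type=int)
args = parser.parse_args(["--nargs", "1", "2"])
```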
- Fix Dataset support
  * Fix `Dataset.remove_files()` unable to find files located in the dataset root (#303)
  * Fix closing a dataset that only had files removed
  * Fix Dataset summary table generation of removed/modified/added files
- Fix Hydra multi-run support (#306)
- Fix TF/TensorBoard support
  * Fix TensorBoard with multiple `Task.init()`/`Task.close()` calls within the same process (#312)
  * Fix TensorBoard 2+ `pr_curve`
  * Fix TF `pr_curve` should not be inverted
  * Fix TF 2.3+ mixed eager-mode execution summary metrics not being reported
  * Fix TF keyboard-interrupt binding
  * Fix TF 2.4 Keras model load/save
- Fix `clearml-task`
* Fix error when script cannot be found
* Fix `--docker` flag not passed
* Fix patching local git diff
- Fix `clearml-data`
* Fix `clearml-data sync` requires `--name`
* Fix missing required argument `--files` in `clearml-data remove`
- Fix `Task.execute_remotely()` from Jupyter Notebook
- Fix populate Task called from Jupyter Notebook (use `Task.create(packages=True)` to auto populate based on locally installed packages)
- Fix plotly plot with numpy containing `NaN`/`datetime`
- Fix matplotlib with Agg backend (or in remote execution)
- Fix trying to upload model file as a folder (automatically package the folder)
- Fix a broken package failing `importlib` detection and aborting the entire requirements detection
- Fix `Task.connect(object)` should always return the same object instance
- Fix `Task.create()` with repo and script that exists locally
- Fix crash in case `Logger.get_logger()` cannot get the file name
- Fix exception at exit in Python 3.8+ on macOS/Windows
- Make pipeline summary table link to Task step logs
- Fix Hydra 1.1 support (argparse description)
- Fix close task after logger is closed
- Fix `Task.set_base_docker()` in remote execution
- Fix artifact preview limit to 64Kb
- Fix JupyterLab Notebook detection
- Fix Python 2.7 support

0.17.4

Features and Bug Fixes
- Add `HyperParameterOptimizer` parallel coordinates (#279)
- Add `Task.init()` argument `tags`
- Change HPO parallel coordinates color scale, yellow is low
- Change `HyperParameterOptimizer` `spawn_task_project` to `spawn_project`
- Use only a lower-limit version constraint for `numpy`
- Fix broken argparse `nargs` support
- Fix argparse with `action="append"`
- Fix PyJWT v2.0 token parsing
- Fix python package detection should not list `file://` links

0.17.3

Features
- Add `Task.delete()` support
- Add `Task.debug_simulate_remote_task()` to simulate task execution by ClearML Agent
- Add warning on archived Task in pipeline (#274)
- Add `Task.init(..., output_uri=True)` will use the default files_server as output uri
- Make `clearml-data` CLI stateful, remember last dataset ID as default dataset
- Add `HyperParameterOptimizer.get_optimizer_top_experiments()` for querying the optimization pipeline after execution
- Add `Task.set_archived()` and `Task.get_archived()`
- Add `Task.set_credentials()` option to store into credentials file
- `clearml-data close` now auto-uploads the dataset
- Add `HyperParameterOptimizer` arguments `spawn_task_project` and `save_top_k_tasks_only`
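The stateful `clearml-data` behavior could look like the following session (project/name/path values are placeholders; after `create`, subsequent commands default to the newly created dataset ID):

```shell
# create a dataset -- its ID becomes the default for later commands
clearml-data create --project examples --name demo-dataset

# no dataset ID needed: the CLI remembers the last dataset
clearml-data add --files ./data

# close now auto-uploads the dataset contents before finalizing
clearml-data close
```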


Bug Fixes
- Fix `PipelineController` running remotely without a configuration does not execute the default code pipeline (#273)
- Fix reusing a task after its project was deleted (#274)
- Fix `Task.archived_tag` read-only property not working (#274)
- Fix argparse support to store consistent str representation of custom objects. Avoid changing default value if remote value matches
- Fix argparse type given as a function
- Fix Dataset add single and multiple file(s)
- Fix get project name from parent dataset if not specified
- Fix matplotlib exporter; add legend support
- Fix model upload
- Fix optimizer callback best experiment
- Fix Optuna optimizer failing on tasks with `None` value in scalar query
- Fix auto python package detection for packages installed directly from URLs
- Fix dataset upload aborted on server watchdog
- Fix dataset genealogy, graph and restoring data
- Fix numpy dependency for python versions

0.17.2

Bug Fixes

- Fix broken `clearml-task` CLI

