ClearML

Latest version: v1.16.5


1.6.4

Not secure
Bug Fixes

- Fix `APIClient` fails when calling `get_all` endpoints with API 2.20 (affects CLI tools such as `clearml-session`)

1.6.3

Not secure
New Features and Improvements

- Add option to specify an endpoint URL when creating S3 resource service (679, thanks AndolsiZied!)
- Add support for providing `ExtraArgs` to boto3 when uploading files using the `sdk.aws.s3.extra_args` configuration option
- Add support for Server API 2.20
- Add `Task.get_num_enqueued_tasks()` to get the number of tasks enqueued in a specific queue
- Add support for updating model metadata using `Model.set_metadata()`, `Model.get_metadata()`, `Model.get_all_metadata()`, `Model.get_all_metadata_casted()` and `Model.set_all_metadata()` (see the sketch after this list)
- Add `Task.get_reported_single_value()`
- Add a retry mechanism for models and artifacts upload
- Pipelines with an empty configuration take it from code
- Add support for running pipeline steps on preemptible instances
- Datasets
  - Add description to Datasets
  - Add wild-card support in `clearml-data`
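The model metadata and task query helpers listed above can be combined roughly as follows. This is a minimal sketch rather than official documentation: the model ID placeholder and the exact argument names are assumptions, and the real signatures may differ.

```python
from clearml import Task, Model

task = Task.init(project_name="examples", task_name="metadata demo")

# Attach metadata to an existing model and read it back
# (Model.set_metadata / get_metadata / get_all_metadata, added in 1.6.3).
model = Model(model_id="<your-model-id>")          # hypothetical model ID
model.set_metadata("dataset_version", "2022-06")   # assumed (key, value) form
print(model.get_metadata("dataset_version"))
print(model.get_all_metadata())

# Read back a single reported value and check how many tasks are enqueued
# (Task.get_reported_single_value / Task.get_num_enqueued_tasks, added in 1.6.3).
accuracy = task.get_reported_single_value("accuracy")
pending = Task.get_num_enqueued_tasks("default")    # queue name, assumed positional
print(accuracy, pending)
```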

Bug Fixes

- Fix dataset download (713, thanks dankirsdot!)
- Fix lock is not released after dataset cache is downloaded (708, thanks mralgos!)
- Fix deadlock might occur when using a process pool with a large number of processes (674)
- Fix 'series' not appearing on UI when using `logger.report_table()` (684)
- Fix `Task.init()` docstring to include behavior when executing remotely (737, thanks mmiller-max!)
- Fix `KeyError` when running remotely and no params were passed to click (https://github.com/allegroai/clearml-agent/issues/111)
- Fix full path is stored when uploading a single artifact file
- Fix passing non-alphanumeric filename in `sdk.development.detect_with_pip_freeze`
- Fix Python 3.6 and 3.10 support
- Fix mimetype cannot be `None` when uploading to S3
- Pipelines
  - Fix pipeline DAG
  - Add support for pipelines with spot instances
  - Fix pipeline proxy object is always resolved in main pipeline logic
  - Fix pipeline steps with empty configuration should try and take it from code
  - Fix wait for jobs based on local/remote pool frequency
  - Fix `UniformIntegerParameterRange.to_list()` ignores min value
  - Fix pipeline component returning a list of length 1
- Datasets
  - Fix `Dataset.get()` does not respect `auto_create`
  - Fix getting datasets fails with new ClearML Server v1.6
  - Fix datasets can't be queried by project/name alone
  - Fix adding child dataset to older parent dataset without stats
- Fix error when connecting an input model
- Fix deadlocks, including:
  - Change thread Event/Lock to process-fork-safe threading objects
  - Use a file lock instead of a process lock to avoid future deadlocks, since the Python process lock is not process-safe (killing a process holding a lock will not release the lock)
- Fix `StorageManager.list()` on a local Windows path
- Fix model not created in the current project
- Fix `keras_tuner_cifar` example raises `DeprecationWarning` and `ValueError`

1.6.2

Not secure
Bug Fixes

- Fix format string construction sometimes causing delayed evaluation errors (706)

1.6.1

Not secure
Bug Fixes

- Fix `Task.get_tasks()` fails when sending `search_hidden=False`
- Fix LightGBM example shows `UserWarning`

1.6

New Features and Improvements

- New HyperParameter Optimization CLI `clearml-param-search`
- Improvements to ClearML Data
  - Add support for a new ClearML Data UI in the ClearML WebApp
  - Add `clearml-data` new options `set-description` and `rename`
- Add random seed control using `Task.set_random_seed()`, allowing you to set a new random seed for task initialization or to disable it (see the sketch after this list)
- Improve error messages when failing to download an artifact
- Improve error messages when testing for permissions
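A minimal sketch of the new random seed control; the exact argument semantics (a fixed seed vs. `None` to disable seeding) are an assumption based on the entry above.

```python
from clearml import Task

# Set a fixed seed before Task.init() so task initialization is reproducible,
# or pass None to disable automatic seeding (assumed behavior).
Task.set_random_seed(1337)
# Task.set_random_seed(None)

task = Task.init(project_name="examples", task_name="seed control demo")
```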

Bug Fixes

- Fix axis range settings when logging plots
- Fix `Task.get_project()` to return more than 500 entries (612)
- Fix pipeline progress calculation
- Fix `StorageManager.upload_folder()` returns `None` for both successful and unsuccessful uploads
- Fix script path capturing stores a relative path and not an absolute path
- Fix HTML debug samples are saved incorrectly on S3
- Fix Hydra deprecation warning in examples
- Fix missing requirement for tensorboardx example

Known Issues

- When removing an image from a Dataset, its preview image won't be removed
- Moving Datasets between projects still shows the Dataset in the old project

1.6.0

Not secure
New Features

- Upgrade requests library (https://github.com/allegroai/clearml-agent/pull/162, thanks jday1!)
- Add support for controlling PyTorch resolving mode using the `CLEARML_AGENT_PACKAGE_PYTORCH_RESOLVE` environment variable and `agent.package_manager.pytorch_resolve` configuration setting with `none` (no resolving), `pip` (sets extra index based on cuda and lets pip resolve) or `direct` (the previous parsing algorithm that does the matching and downloading), default is `pip` (152)
- Add backwards compatibility in standalone mode using the `CLEARML_AGENT_STANDALONE_CONFIG_BC` environment variable
- Add `CLEARML_AGENT_DOCKER_AGENT_REPO` alias for the `FORCE_CLEARML_AGENT_REPO` environment variable
- Show a better message for agent init when an existing `clearml.conf` is found
- Add support for task field injection into the docker container name using the `agent.docker_container_name_format_fields` configuration setting
- Add support for adding additional labels to docker containers using the `CLEARML_AGENT_EXTRA_DOCKER_LABELS` environment variable
- Add support for setting file mode in files applied by the agent (using the `files` configuration option) using the `mode` property
- Add support for skipping agent pip upgrade in the default k8s pod container bash script using the `CLEARML_AGENT_NO_UPDATE` environment variable
- Add support for additional pip install flags when installing dependencies using the `CLEARML_EXTRA_PIP_INSTALL_FLAGS` environment variable and `agent.package_manager.extra_pip_install_flags` configuration option (see the sketch after this list)
- Add support for extra docker arguments referencing the machine's environment variables using the `agent.docker_allow_host_environ` configuration option, allowing users to use `$ENV` in the task docker arguments (e.g. `-e HOST_NAME=$HOST_NAME`)
- Add support for k8s jobs execution (as opposed to only pods)
- Update default docker image versions
- Add Python 3.11 support
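The agent-side settings above are environment variables rather than SDK calls. Below is a hedged sketch of setting a few of them from Python before launching an agent daemon; the values are purely illustrative, and exporting the variables in the shell before running `clearml-agent` works just as well.

```python
import os
import subprocess

env = os.environ.copy()
# Extra pip install flags used when installing task dependencies (example value).
env["CLEARML_EXTRA_PIP_INSTALL_FLAGS"] = "--no-cache-dir"
# Additional docker container labels (comma-separated key=value pairs is an assumed format).
env["CLEARML_AGENT_EXTRA_DOCKER_LABELS"] = "team=ml,stage=dev"
# PyTorch resolving mode: none | pip | direct (per the entry above).
env["CLEARML_AGENT_PACKAGE_PYTORCH_RESOLVE"] = "pip"

# Launch the agent daemon with the environment applied.
subprocess.run(["clearml-agent", "daemon", "--queue", "default", "--docker"], env=env)
```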


Bug Fixes

- Fix `git+ssh://` links inside installed packages not being properly converted to authenticated `https://` and vice versa
- Fix pip version specified in "Installed Packages" not being preserved and reinstalled
- Fix various agent paths not loaded correctly if an empty string or null is used (should be disabled, not converted to `.`)
- Fix docker container backwards compatibility for API<2.13
- Fix default docker match rules resolver (used incorrect field "container" instead of "image")
- Fix task docker argument might be passed twice (might cause an error with flags such as `--network` and `--ipc`)
