clearml-serving

Latest version: v1.3.4


1.3.1

New Features and Bug Fixes

- Add missing await (55, thanks amirhmk!)
- Add traceback for failing to load preprocess class (57)
- Fix: Triton `config.pbtxt` was not checked for missing values or colliding specifications (62)
- Add safer code for pulling from Kafka
- Add `str` type to Triton type conversion
- Fix: auto-detected `platform` is now ignored when a `config.pbtxt` with a `platform` entry is passed
- Fix: Triton engine models with multiple versions were not properly supported
- Fix: serving session keep-alive was also sent when idle
- Fix example README files
- Log preprocess exceptions with full stack trace to serving session console output
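A type conversion like the `str`-to-Triton one referenced above boils down to a lookup table from numpy dtype names to Triton datatype strings. The sketch below is an illustration of that idea, not the exact table clearml-serving uses; the Triton `TYPE_*` names are real, and `TYPE_STRING` is Triton's type for variable-length byte/string tensors (a `str` array in numpy has dtype `object`).

```python
# Illustrative mapping from numpy dtype names (arr.dtype.name) to Triton
# datatype strings. This is a sketch, not clearml-serving's actual table.
NP_TO_TRITON = {
    "bool": "TYPE_BOOL",
    "int32": "TYPE_INT32",
    "int64": "TYPE_INT64",
    "float32": "TYPE_FP32",
    "float64": "TYPE_FP64",
    "object": "TYPE_STRING",  # numpy str arrays are dtype=object
}

def to_triton_dtype(dtype_name: str) -> str:
    """Return the Triton datatype string for a numpy dtype name."""
    try:
        return NP_TO_TRITON[dtype_name]
    except KeyError:
        raise ValueError(f"unsupported dtype: {dtype_name}")
```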

1.3.0

Stable Release

* Features
  * 20% overall performance increase :rocket: thank you Python 3.11 :fire:
  * gRPC channel configuration (49, thanks amirhmk)
  * Huggingface Transformer example

* Bug fixes
  * Fix numpy compatibility (47, 46, thanks galleon, anon-it)
  * Fix Triton examples (50, thanks amirhmk)
  * Add storage environment variables (45, thanks besrym)
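gRPC channel configuration amounts to passing `(key, value)` option tuples when the channel is created, i.e. the `options=` argument of `grpc.insecure_channel(address, options=...)`. The option keys below are standard gRPC channel arguments; the default values and the merge helper are illustrative, not clearml-serving's actual defaults.

```python
# Default gRPC channel options, keyed by standard gRPC channel-argument
# names. The values here are illustrative defaults only.
DEFAULT_CHANNEL_OPTIONS = {
    "grpc.max_send_message_length": 512 * 1024 * 1024,
    "grpc.max_receive_message_length": 512 * 1024 * 1024,
    "grpc.keepalive_time_ms": 30_000,
}

def build_channel_options(overrides=None):
    """Merge user overrides into the defaults and return the sorted
    (key, value) tuples that grpc.insecure_channel(..., options=...) accepts."""
    merged = dict(DEFAULT_CHANNEL_OPTIONS)
    merged.update(overrides or {})
    return sorted(merged.items())
```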

1.2.0

Stable Release

* Features
  * GPU performance improvements: 50%-300% over vanilla Triton
  * CPU performance improvements: optimized uvloop + multi-processing
  * Huggingface Transformer example
  * Binary input support (37, thanks Aleksandar1932)

* Bug fixes
  * Fix: stdout/stderr in the inference service was not logged to the dedicated Task
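One common way to carry binary input (e.g. an image) to a JSON serving endpoint is to base64-encode the bytes into a request field. The field name `"image"` and the request shape below are hypothetical, shown only to illustrate the round trip; the endpoint's preprocessing code would decode the same field.

```python
import base64
import json

# Hypothetical request shape: {"image": "<base64 bytes>"}. The serving-side
# preprocess code would b64-decode the same field.
def encode_binary_request(raw: bytes, field: str = "image") -> str:
    return json.dumps({field: base64.b64encode(raw).decode("ascii")})

def decode_binary_request(body: str, field: str = "image") -> bytes:
    return base64.b64decode(json.loads(body)[field])
```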

1.1.0

Stable Release

**Notice: This release is not backwards compatible - see notes below on upgrading**

* Breaking Changes
  * Triton engine supports variable request size (-1)

* Features & Bug fixes
  * Add version number to the serving session task
  * Triton engine support for variable request (matrix) sizes
  * Triton support: fix `--aux-config` to support more configuration elements
  * Huggingface Transformer support
  * `Preprocess` class as module (see note below)

**Note**: To add a `Preprocess` class from a module (the entire module folder will be packaged):

```
preprocess_folder
├── __init__.py   # contains: from .sub.some_file import Preprocess
└── sub
    └── some_file.py
```

Pass the top folder as the path for `--preprocess`, for example:

```shell
clearml-serving --id <serving_session_id> model add --preprocess /path/to/preprocess_folder ...
```
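What `sub/some_file.py` might contain is a class with the standard pre/post-processing methods; the sketch below is minimal and illustrative (the `"data"` field and the bodies are assumptions), with the full interface defined in clearml-serving's preprocess template.

```python
# sub/some_file.py -- a minimal Preprocess sketch. The method names match
# clearml-serving's preprocess interface; the "data" request field and
# the method bodies are illustrative only.
class Preprocess:
    def preprocess(self, body, state, collect_custom_statistics_fn=None):
        # turn the raw request body into model input
        return body["data"]

    def postprocess(self, data, state, collect_custom_statistics_fn=None):
        # turn model output into a JSON-serializable response
        return {"result": data}
```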

Upgrading from v1.0

1. Take down the serving containers (docker-compose or k8s)
2. Update the clearml-serving CLI `pip3 install -U clearml-serving`
3. Re-add a single existing endpoint with `clearml-serving model add ...` (press yes when asked; this upgrades the clearml-serving session definitions)
4. Pull latest serving containers (`docker-compose pull ...` or k8s)
5. Re-spin serving containers (docker-compose or k8s)

1.0.0

Stable Release

**Notice: This release is not backwards compatible**


* Breaking Changes
  * Pre/post-processing class functions now take 3 arguments, see [example](https://github.com/allegroai/clearml-serving/blob/a12311c7d6f273cb02d1e09cf1135feb2afc3338/clearml_serving/preprocess/preprocess_template.py#L27)
  * Add support for per-request state storage, passing information between the pre/post-processing functions
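The per-request state storage above is a dict shared between the pre- and post-processing calls of a single request. A sketch of how it can be used, following the 3-argument form (the latency metric and field names are illustrative, not part of the actual template):

```python
import time

# Illustrative use of the per-request `state` dict shared between the
# preprocess and postprocess calls of one request.
class Preprocess:
    def preprocess(self, body, state, collect_custom_statistics_fn=None):
        state["started"] = time.time()  # stash per-request info
        return body["data"]

    def postprocess(self, data, state, collect_custom_statistics_fn=None):
        latency = time.time() - state["started"]
        if collect_custom_statistics_fn:
            collect_custom_statistics_fn({"latency_sec": latency})
        return {"result": data}
```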

* Features & Bug fixes
  * Optimize serving latency while collecting statistics
  * Fix metric statistics collection auto-refresh issue
  * Fix live update of model preprocessing code
  * Add `pandas` to the default serving container
  * Add per-endpoint/variable statistics collection control
  * Add `CLEARML_EXTRA_PYTHON_PACKAGES` for easier additional Python package support (serving inference container)
  * Upgrade Nvidia Triton base container image to 22.04 (requires Nvidia drivers 510+)
  * Add Kubernetes Helm chart
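`CLEARML_EXTRA_PYTHON_PACKAGES` is read from the inference container's environment; it takes a space-separated list of pip requirements. The package list below is an example value, not a requirement of clearml-serving:

```shell
# Illustrative: extra pip packages installed into the inference container
# at startup. Set this in docker-compose (environment:) or the pod spec.
export CLEARML_EXTRA_PYTHON_PACKAGES="transformers==4.30.0 scikit-learn"
```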
