Optimum


Page 11 of 23

1.10.1

* Fix OwlViT exporter by regisss in https://github.com/huggingface/optimum/pull/1188

* Fix SD loading when safetensors weights only by echarlaix in https://github.com/huggingface/optimum/pull/1232

* Fix `optimum-intel` version requirements by echarlaix in https://github.com/huggingface/optimum/pull/1234


**Full Changelog**: https://github.com/huggingface/optimum/compare/v1.10.0...v1.10.1

1.10 (Optimum Habana)

This release is fully compatible with [SynapseAI v1.10.0](https://docs.habana.ai/en/v1.10.0/).

- Upgrade to SynapseAI v1.10.0 #255 by regisss


HPU graphs for training

You can now use HPU graphs for training your models.

- Improve performance and scalability of BERT FT training #200 by mlapinski-habana

Check out the [documentation](https://huggingface.co/docs/optimum/habana/usage_guides/accelerate_training#hpu-graphs) for more information.
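As a hedged sketch, enabling HPU graphs during fine-tuning comes down to one extra flag on an example script. The script name, hyperparameters, and the exact `--use_hpu_graphs_for_training` flag name below are assumptions; see the linked guide for the authoritative option:

```bash
# Sketch: fine-tune BERT on Gaudi with the training loop captured as an HPU graph
# to reduce host-side op-dispatch overhead. All values are illustrative.
python run_glue.py \
  --model_name_or_path bert-base-uncased \
  --task_name mrpc \
  --do_train \
  --use_habana \
  --use_lazy_mode \
  --use_hpu_graphs_for_training \
  --output_dir /tmp/mrpc
```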


Various model optimizations

- Update BLOOM modeling for SynapseAI 1.10 #277
- Optimize conv1d forward #231 by ZhaiFeiyue
- Add static key-value cache for OPT, GPT-J and GPT-NeoX #246 #248 #249 by ZhaiFeiyue
- Optimizations for running FLAN-T5 with DeepSpeed ZeRO-3 #257 by libinta


Asynchronous data copy

You can now enable asynchronous data copy between the host and devices during training using `--non_blocking_data_copy`.

- Enable asynchronous data copy for better performance #211 by jychen-habana

Check out the [documentation](https://huggingface.co/docs/optimum/habana/usage_guides/accelerate_training#nonblocking-data-copy) for more information.
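For illustration, the flag is passed like any other training argument. The script name and the remaining flags below are placeholders:

```bash
# Sketch: overlap host-to-device data copies with computation during training.
python run_glue.py \
  --model_name_or_path bert-base-uncased \
  --do_train \
  --use_habana \
  --use_lazy_mode \
  --non_blocking_data_copy \
  --output_dir /tmp/out
```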


Profiling

It is now possible to profile your training runs with `GaudiTrainer`. To do so, pass [`--profiling_steps N`](https://huggingface.co/docs/optimum/habana/package_reference/trainer#optimum.habana.GaudiTrainingArguments.profiling_steps) and [`--profiling_warmup_steps K`](https://huggingface.co/docs/optimum/habana/package_reference/trainer#optimum.habana.GaudiTrainingArguments.profiling_warmup_steps).

- Enable profiling #250 by ZhaiFeiyue
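For example, to skip the first two steps and profile the following five (all values and the script name are illustrative):

```bash
# Sketch: collect a profiling trace after a short warmup.
python run_glue.py \
  --model_name_or_path bert-base-uncased \
  --do_train \
  --use_habana \
  --use_lazy_mode \
  --profiling_warmup_steps 2 \
  --profiling_steps 5 \
  --output_dir /tmp/out
```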


Adjusted throughput calculation

You can now let the `GaudiTrainer` compute the real throughput of your run (i.e. excluding the time spent logging, evaluating and saving the model) with `--adjust_throughput`.

- Add an option to remove checkpoint-saving time from the throughput calculation #237 by libinta
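A hedged sketch of a run with adjusted throughput reporting, where the script name and the other flags are placeholders:

```bash
# Sketch: report throughput excluding logging, evaluation and checkpointing time.
python run_glue.py \
  --model_name_or_path bert-base-uncased \
  --do_train \
  --do_eval \
  --use_habana \
  --use_lazy_mode \
  --adjust_throughput \
  --output_dir /tmp/out
```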


Check SynapseAI version at import

A check is performed when importing `optimum.habana` to warn you if the installed SynapseAI version is not the one Optimum Habana has been tested with.

- Check Synapse version when `optimum.habana` is used #225 by regisss


Enhanced examples

Several examples have been added or improved. You can find them [here](https://github.com/huggingface/optimum-habana/tree/main/examples).

- The text-generation example now supports sampling, beam search decoding and full bf16 generation #218 #229 #238 #251 #258 #271
- The contrastive image-text example now supports HPU-accelerated data loading #256
- New Seq2Seq question answering example #221
- New protein folding example with ESMFold #235 #276

1.10.0

Stable Diffusion XL

Enable SD XL ONNX export and ONNX Runtime inference by echarlaix in https://github.com/huggingface/optimum/pull/1168

* Enable SD XL ONNX export using the CLI:

```bash
optimum-cli export onnx --model stabilityai/stable-diffusion-xl-base-0.9 --task stable-diffusion-xl ./sd_xl_onnx
```


* Add SD XL pipelines for ONNX Runtime inference (supported tasks: **text-to-image** and **image-to-image**):

```python
from optimum.onnxruntime import ORTStableDiffusionXLPipeline

model_id = "stabilityai/stable-diffusion-xl-base-0.9"
pipeline = ORTStableDiffusionXLPipeline.from_pretrained(model_id, export=True)

prompt = "sailing ship in storm by Leonardo da Vinci"
image = pipeline(prompt).images[0]
pipeline.save_pretrained("onnx-sd-xl-base-0.9")
```


Stable Diffusion pipelines

Enable **image-to-image** and **inpainting** pipelines for ONNX Runtime inference by echarlaix in https://github.com/huggingface/optimum/pull/1121

More examples are available in the [documentation](https://huggingface.co/docs/optimum/main/en/onnxruntime/usage_guides/models#imagetoimage)


Major bugfixes

* Fix bloom KV cache usage in ORTForCausalLM by fxmarty in https://github.com/huggingface/optimum/pull/1152


What's Changed

* Add stable diffusion example by prathikr in https://github.com/huggingface/optimum/pull/1136
* Fixed incomplete ONNX export model memory release issue by sharpbai in https://github.com/huggingface/optimum/pull/1154
* Add trust remote code option for config by changwangss in https://github.com/huggingface/optimum/pull/1151
* Fix typos of ONNXRuntimme -> ONNXRuntime by mgoin in https://github.com/huggingface/optimum/pull/1155
* Fix ONNX export for MobileViT for segmentation by regisss in https://github.com/huggingface/optimum/pull/1128
* Revert "update the default block size" by rui-ren in https://github.com/huggingface/optimum/pull/1162
* ONNX export for custom architectures & models with custom modeling code by fxmarty in https://github.com/huggingface/optimum/pull/1166
* Update Optimum Neuron doc by regisss in https://github.com/huggingface/optimum/pull/1164
* Fix stable diffusion ONNX export by echarlaix in https://github.com/huggingface/optimum/pull/1173
* Add gpt_bigcode model_type to NormalizedTextConfig by changwangss in https://github.com/huggingface/optimum/pull/1170
* Allow `attention_mask=None` for BetterTransformer in the inference batched case for gpt2 & gpt-neo by fxmarty in https://github.com/huggingface/optimum/pull/1180
* Fix encoder attention mask input order for ORT by fxmarty in https://github.com/huggingface/optimum/pull/1181
* Fix ORTModel initialization on specific device id by fxmarty in https://github.com/huggingface/optimum/pull/1182
* Add stable diffusion img2img and inpaint documentation by echarlaix in https://github.com/huggingface/optimum/pull/1149
* Fix SD XL ONNX export for img2img task by echarlaix in https://github.com/huggingface/optimum/pull/1194
* Remove graphcore from documentation quickstart by echarlaix in https://github.com/huggingface/optimum/pull/1201
* Unpin tensorflow by fxmarty in https://github.com/huggingface/optimum/pull/1211
* Fix ORT test for unknown architecture for task by fxmarty in https://github.com/huggingface/optimum/pull/1212
* add ort + stable diffusion documentation by prathikr in https://github.com/huggingface/optimum/pull/1205
* Fix vision encoder decoder that may not cache cross-attention by fxmarty in https://github.com/huggingface/optimum/pull/1210
* Add documentation for Optimum Furiosa by regisss in https://github.com/huggingface/optimum/pull/1165
* Add BLIP-2 to BetterTransformer documentation by fxmarty in https://github.com/huggingface/optimum/pull/1218
* Set default value to unet config sample size by echarlaix in https://github.com/huggingface/optimum/pull/1223
* Fix broken link in doc by regisss in https://github.com/huggingface/optimum/pull/1222
* Fix BT test by fxmarty in https://github.com/huggingface/optimum/pull/1224
* Add SD XL documentation by echarlaix in https://github.com/huggingface/optimum/pull/1198
* Update setup.py to add optimum-furiosa extras by mht-sharma in https://github.com/huggingface/optimum/pull/1226

New Contributors

* sharpbai made their first contribution in https://github.com/huggingface/optimum/pull/1154
* mgoin made their first contribution in https://github.com/huggingface/optimum/pull/1155

**Full Changelog**: https://github.com/huggingface/optimum/compare/v1.9.0...v1.10.0

1.9.4 (Optimum Intel)

* Fix `OVDataLoader` for NNCF quantization-aware training for `transformers` > v4.31.0 by echarlaix in #376

**Full Changelog**: https://github.com/huggingface/optimum-intel/compare/v1.9.3...v1.9.4

1.9.3 (Optimum Intel)

* Improved performance of decoders by AlexKoff88 in #354
* Fix OpenVINO model integration compatibility for optimum > v1.9.0 by echarlaix in #365

**Full Changelog**: https://github.com/huggingface/optimum-intel/compare/v1.9.2...v1.9.3

1.9.2 (Optimum Intel)

* Fix INC distillation to be compatible with `neural-compressor` v2.2.0 breaking changes by echarlaix in #338
