Optimum-graphcore

Latest version: v0.7.1

0.4.3

Minor improvements and bug fixes:
* Disabled the automatic loss scaling option for inference (https://github.com/huggingface/optimum-graphcore/pull/213)
* Improved error messages when the IPU config is not compatible with the model (https://github.com/huggingface/optimum-graphcore/pull/210)
* Set `enable-half-partials` to `True` by default (https://github.com/huggingface/optimum-graphcore/pull/209) (see the sketch below)
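
The new default can be checked on an IPU configuration object. A minimal sketch, assuming the setting is exposed as an `enable_half_partials` attribute (an assumption inferred from the option name above, not confirmed by these notes):

```python
# Minimal sketch: inspect the half-partials setting on a default IPU configuration.
# The `enable_half_partials` attribute name is an assumption inferred from the
# option name above, not confirmed API.
from optimum.graphcore import IPUConfig

ipu_config = IPUConfig()
print(ipu_config.enable_half_partials)  # expected to print True as of 0.4.3
```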

0.4.2

Small bug fixes and improvements.

0.4.1

Fixes a bug in `IPUTrainer` that broke the `save_model` method (#191).
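
For context, a minimal sketch of the affected call, following the usual transformers-style Trainer API; the model checkpoint, IPU config checkpoint, and output directory are illustrative:

```python
# Minimal sketch of IPUTrainer.save_model; checkpoint names and paths are illustrative.
from transformers import AutoModelForSequenceClassification
from optimum.graphcore import IPUConfig, IPUTrainer, IPUTrainingArguments

model = AutoModelForSequenceClassification.from_pretrained("bert-base-uncased")
ipu_config = IPUConfig.from_pretrained("Graphcore/bert-base-ipu")
args = IPUTrainingArguments(output_dir="outputs")

trainer = IPUTrainer(model=model, ipu_config=ipu_config, args=args)
trainer.save_model("outputs")  # raised an error before this fix
```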

0.4.0

0.3.2

Documentation

Thanks to lewtun, the building blocks for writing and integrating `optimum-graphcore` documentation into the main `optimum` documentation are now available.

Notebooks

New notebooks are available:

- Wav2Vec2 notebooks (#142)
- Summarization notebook (#153)
- Image classification and language modeling notebooks (#152)

Python 3.7+ support

From this release, only Python 3.7 and above are supported.
To use `optimum-graphcore` on Python 3.6 and below, please use `optimum-graphcore==0.3.1`.

Misc

- LayerDrop is now supported for HuBERT (#149)
- It is now possible to provide an `eval_data_collator` (#120)
- The `pad_on_batch_axis` collator now makes it possible to train / evaluate models on datasets whose size is not divisible by the combined batch size, by repeating samples until each batch reaches the expected size (#154) (see the sketch below)
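
To make the `pad_on_batch_axis` behaviour concrete, here is a stand-alone illustration of the idea, not the library's implementation: an incomplete batch is filled by repeating existing samples until it reaches the combined batch size.

```python
# Illustrative sketch of the behaviour described above (not optimum-graphcore's code):
# repeat samples so an incomplete batch reaches the combined batch size.
from itertools import cycle, islice

def pad_batch_by_repetition(samples, combined_batch_size):
    """Return exactly `combined_batch_size` items, repeating samples when needed."""
    if len(samples) >= combined_batch_size:
        return samples[:combined_batch_size]
    return list(islice(cycle(samples), combined_batch_size))

print(pad_batch_by_repetition([1, 2, 3], 8))  # [1, 2, 3, 1, 2, 3, 1, 2]
```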

0.3.1

New model additions

- The Wav2Vec2 architecture is now supported for pretraining and the CTC task (#81 and #123) (see the sketch after this list)
- The ConvNeXT architecture is now supported for pretraining and the image classification task (#113)
- The sequence classification task is supported for BART (#134, #137, #138)
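
A minimal sketch of pairing one of the newly supported architectures with the IPU trainer; the checkpoint name and the IPUConfig fields shown are assumptions for illustration only:

```python
# Minimal sketch: a Wav2Vec2 CTC model with an IPU configuration and trainer.
# The checkpoint name and IPUConfig fields are illustrative assumptions.
from transformers import Wav2Vec2ForCTC
from optimum.graphcore import IPUConfig, IPUTrainer, IPUTrainingArguments

model = Wav2Vec2ForCTC.from_pretrained("facebook/wav2vec2-base-960h")
ipu_config = IPUConfig(layers_per_ipu=[5, 7], ipus_per_replica=2)  # assumed field names
args = IPUTrainingArguments(output_dir="wav2vec2-ctc-ipu")

trainer = IPUTrainer(model=model, ipu_config=ipu_config, args=args)
# trainer.train() would fine-tune once a CTC `train_dataset` is supplied.
```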

Fixes

- Speed metrics are now properly computed when resuming from a checkpoint (#105)
- The number of samples used for the speed metrics is now correct (#109)
