Curated-transformers

Latest version: v2.0.1


2.0.1

🔴 Bug fixes

* Fix Python 3.12.3 activation lookup error (375).

2.0.0

✨ New features and improvements

* Register models using `catalogue` to support external models in `Auto{Decoder,Encoder,CausalLM}` (351, 352); see the sketch after this list.
* Add support for loading parameters in-place (370).
* Support for ELECTRA models (358).
* Add support for write/upload operations with `HfHubRepository` (354).
* Add support for converting Curated Transformer configs to HF-compatible configs (333).
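
Model registration builds on the [`catalogue`](https://github.com/explosion/catalogue) registry library. The sketch below shows only the general `catalogue` mechanism; the registry namespace and the way the `Auto*` classes perform the lookup are assumptions here, so consult the Curated Transformers docs for the actual registration entry point:

```python
import catalogue

# Assumed namespace for illustration; the registry that
# curated-transformers actually exposes may be named differently.
models = catalogue.create("curated_transformers", "models")

@models.register("my-decoder")
class MyDecoder:
    """Stand-in for a custom decoder class."""

# An Auto* class can then resolve an externally registered model by name.
decoder_cls = models.get("my-decoder")
```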

🔴 Bug fixes

* Support PyTorch 2.2 (360).

⚠️ Backwards incompatibilities

* Support for TorchScript tracing is removed (361).
* The `qkv_split` argument is now mandatory for `AttentionHeads`, `AttentionHeads.uniform`, `AttentionHeads.multi_query`, and `AttentionHeads.key_value_broadcast` (374).
* All `FromHFHub` mixins are renamed to `FromHF` (374).
* `FromHF.convert_hf_state_dict` is removed in favor of `FromHF.state_dict_from_hf` (374).
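
A brief migration sketch for the changes above (import paths are assumed to mirror the 1.x layout, and the `QkvSplit` strategy is a placeholder rather than a verified identifier):

```python
# 1.x: the mixins were named FromHFHub.
# 2.0: they are named FromHF.
from curated_transformers.models import FromHF

# convert_hf_state_dict is removed; convert checkpoints with:
#   MyModel.state_dict_from_hf(hf_state_dict)

# qkv_split no longer has a default and must be passed explicitly:
#   AttentionHeads.uniform(n_attention_heads=12, qkv_split=<a QkvSplit strategy>)
```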

👥 Contributors

danieldk, honnibal, ines, KennethEnevoldsen, shadeMe

1.3.2

🔴 Bug fixes

* Fix Python 3.12.3 activation lookup error (377).

1.3.1

🔴 Bug fixes

* Ensure that parameters are leaf nodes when loading a model (364).
* Set the Torch upper bound to <2.1.0 (363).


**Note:** We have set the Torch upper bound to <2.1.0 because later versions introduced incompatible changes. Newer versions of Torch will be supported in Curated Transformers 2.0.0.

1.3.0

✨ New features and improvements

* Add support for model repositories other than Hugging Face Hub (331).
* Add support for [`fsspec`](https://filesystem-spec.readthedocs.io/en/latest/) filesystems as a repository type (327, 331).
* Add support for NVTX Ranges (320).
* Add a `config` property to models to query their configuration (328).
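
A minimal sketch of querying the new `config` property (the model name is illustrative):

```python
from curated_transformers.models import AutoEncoder

encoder = AutoEncoder.from_hf_hub(name="bert-base-uncased")
# The config property exposes the model's configuration, e.g. hidden
# sizes and layer counts, as a typed config object.
print(encoder.config)
```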

🔴 Bug fixes

* Fix a potential loading issue that may arise when a model's `dtype` is not set in the Hugging Face configuration (330).


🏛️ Feature: Model repositories

The new (experimental) [repository API](https://curated-transformers.readthedocs.io/en/v1.3.x/repositories.html) adds support for loading models from repositories other than Hugging Face Hub. You can also easily add your own repository types by implementing the [`Repository`](https://curated-transformers.readthedocs.io/en/v1.3.x/repositories.html#curated_transformers.repository.Repository) interface. Using a repository is as easy as calling the new `from_repo` method that is provided by all models and tokenizers:

```python
from curated_transformers.models import AutoDecoder

decoder = AutoDecoder.from_repo(MyRepository("mpt-7b-my-qa"))
```

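A custom repository might look like the following sketch. The two methods shown are an assumption based on the v1.3 `Repository` reference linked above; check that page for the authoritative interface:

```python
from typing import Optional

from curated_transformers.repository import Repository, RepositoryFile

class MyRepository(Repository):
    def __init__(self, name: str):
        self.name = name

    def file(self, path: str) -> RepositoryFile:
        # Resolve `path` within this repository and return a file handle.
        raise NotImplementedError

    def pretty_path(self, path: Optional[str] = None) -> str:
        # Human-readable location for error messages and logging.
        return f"my-repo://{self.name}/{path or ''}"
```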

Curated Transformers comes with two repository classes out of the box:

* [`HfHubRepository`](https://curated-transformers.readthedocs.io/en/v1.3.x/repositories.html#curated_transformers.repository.HfHubRepository) downloads models from Hugging Face Hub and is now used by the `from_hf_hub` methods.
* [`FsspecRepository`](https://curated-transformers.readthedocs.io/en/v1.3.x/repositories.html#curated_transformers.repository.FsspecRepository) supports the wide range of filesystems [provided by the fsspec package](https://filesystem-spec.readthedocs.io/en/latest/api.html#built-in-implementations) and [third-party](https://filesystem-spec.readthedocs.io/en/latest/api.html#other-known-implementations) implementations.
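
For example, loading a model from a local directory through `fsspec` (a sketch; the constructor arguments follow the linked `FsspecRepository` docs, and the path is illustrative):

```python
import fsspec

from curated_transformers.models import AutoDecoder
from curated_transformers.repository import FsspecRepository

# The "file" filesystem is fsspec's built-in local filesystem.
repo = FsspecRepository(fsspec.filesystem("file"), path="/models/mpt-7b")
decoder = AutoDecoder.from_repo(repo)
```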

👥 Contributors

danieldk, honnibal, ines, shadeMe

1.2.0

✨ New features and improvements

* Add support for Safetensor checkpoints (310).
* Add [`from_hf_hub_to_cache`](https://curated-transformers.readthedocs.io/en/v1.2.x/utils.html#curated_transformers.models.FromHFHub.from_hf_hub_to_cache) method to `FromHFHub` mixins. This method downloads a model from Hugging Face Hub to the local cache without loading it (303).
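
A minimal sketch of pre-downloading a checkpoint (the model name is illustrative, and it is assumed here that the `Auto*` classes expose the mixin's method):

```python
from curated_transformers.models import AutoDecoder

# Download the checkpoint to the local cache without instantiating the
# model; a later from_hf_hub call reuses the cached files.
AutoDecoder.from_hf_hub_to_cache(name="mosaicml/mpt-7b", revision="main")
```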

🔴 Bug fixes

* MPT: Honor the `no_bias` config option in layer norms (321).
* Fix a typing issue in `MPTGenerator` (317).

👥 Contributors

danieldk, honnibal, ines, mayankjobanputra, shadeMe
