Adapters

Latest version: v0.2.1

4.5.1

All major new features & changes are described at https://docs.adapterhub.ml/v2_transition.
- All changes merged via #105

Additional changes & Fixes
- Support loading adapters with load_best_model_at_end in Trainer (calpt via #122)
- Add setter for the active_adapters property (calpt via #132; see the usage sketch after this list)
- New notebooks for NER, text generation & AdapterDrop (hSterz via #135)
- Enable the Trainer to load adapters from checkpoints (hSterz via #138)
- Update & clean up example scripts (hSterz via #154 & calpt via #141, #155)
- Add unfreeze_adapters param to train_fusion() (calpt via #156)
- Ensure eval/train mode is correct for AdapterFusion (calpt via #157)
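The snippet below is a minimal sketch of the new active_adapters setter and the unfreeze_adapters argument to train_fusion(). The model checkpoint and adapter names are placeholders, and the class/method names assume the adapter-transformers flexible-head API of this release, which may differ in later versions of the library.

```python
# Sketch only: placeholder model and adapter names; API assumed from the
# adapter-transformers release described in this changelog.
from transformers import AutoModelWithHeads

model = AutoModelWithHeads.from_pretrained("bert-base-uncased")
model.add_adapter("task_a")
model.add_adapter("task_b")

# New in this release: the active adapter can be assigned directly
# instead of calling set_active_adapters().
model.active_adapters = "task_a"

# AdapterFusion over both adapters; unfreeze_adapters=True trains the
# underlying adapter weights together with the fusion layer instead of
# keeping them frozen.
model.add_fusion(["task_a", "task_b"])
model.train_fusion(["task_a", "task_b"], unfreeze_adapters=True)
```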

3.5.1

New
- New model with adapter support: DistilBERT (calpt via #67; see the sketch after this list)
- Save the label->id mapping of the task together with the adapter prediction head (hSterz via #75)
- Automatically set the matching label->id mapping together with the active prediction head (hSterz via #81)
- Upgraded underlying transformers version (calpt via #55, #72 and #85)
- Colab notebook tutorials showcasing all AdapterHub concepts (calpt via #89)
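As an illustration of the DistilBERT support and of the label mapping stored with the prediction head, here is a minimal sketch; the adapter name and label set are made up, and the method names assume the adapter-transformers flexible-head API.

```python
# Sketch only: hypothetical adapter name and labels; assumes the
# adapter-transformers flexible-head API.
from transformers import AutoModelWithHeads

model = AutoModelWithHeads.from_pretrained("distilbert-base-uncased")
model.add_adapter("sentiment")

# The id2label mapping is stored together with the prediction head,
# so it is restored when the adapter is loaded again.
model.add_classification_head(
    "sentiment",
    num_labels=2,
    id2label={0: "negative", 1: "positive"},
)
model.train_adapter("sentiment")

# Saving the adapter with its head also saves the label mapping.
model.save_adapter("./sentiment_adapter", "sentiment", with_head=True)
```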

Fixed
- Support for models with flexible heads in pipelines (calpt via #80; see the sketch after this list)
- Adapt the input of models with flexible heads to the input format expected by static prediction heads (calpt via #90)
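For the pipeline fix, a rough usage sketch follows; the checkpoint, adapter and head names are placeholders, and the imports assume the adapter-transformers fork of transformers, where AutoModelWithHeads is available.

```python
# Sketch only: placeholder adapter/head names; assumes the
# adapter-transformers fork of transformers.
from transformers import AutoModelWithHeads, AutoTokenizer, TextClassificationPipeline

tokenizer = AutoTokenizer.from_pretrained("distilbert-base-uncased")
model = AutoModelWithHeads.from_pretrained("distilbert-base-uncased")
model.add_adapter("sentiment")
model.add_classification_head("sentiment", num_labels=2)
model.set_active_adapters("sentiment")

# Flexible-head models can now be dropped into standard pipelines.
pipe = TextClassificationPipeline(model=model, tokenizer=tokenizer)
print(pipe("AdapterHub makes working with adapters straightforward."))
```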

3.2.1

This is the last release of `adapter-transformers`. The legacy codebase is available at https://github.com/adapter-hub/adapter-transformers-legacy.

3.2.0

3.1.0

3.0.1
