Transformers-lightning

Latest version: v0.7.12

0.7.5

- Added `trainer` as the second positional argument of every DataModule (see the sketch after this list).

- Renamed `MapDataset` to `TransformersMapDataset`.

- Fixed a typo affecting default shuffling in `SuperDataModule` and `CompressedDataModule`.
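
A minimal sketch of the new construction order, assuming a hypothetical `ExampleDataModule` built on the library's `SuperDataModule`; the import path and the rest of the `__init__` signature are assumptions based on this changelog entry.

```python
from argparse import Namespace

import pytorch_lightning as pl
from transformers_lightning.datamodules import SuperDataModule  # import path assumed

class ExampleDataModule(SuperDataModule):  # hypothetical subclass
    def __init__(self, hyperparameters, trainer):
        # `trainer` is now the second positional argument
        super().__init__(hyperparameters, trainer)

hyperparameters = Namespace(batch_size=32)
trainer = pl.Trainer(max_epochs=1)
datamodule = ExampleDataModule(hyperparameters, trainer)
```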

0.7.4

- Added `SortingLanguageModeling` technique and tests.

- Added `SwappingLanguageModeling` technique and tests.

- Added `add_argparse_args` method to `SuperAdapter` to allow adding parameters to the CLI (see the sketch after this list).

- Fixed a bug where `AdapterDataModule` was not receiving the `collate_fn` argument.

- Fixed typos in `imports`.

- Refactored `datamodules` section.
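
A hedged sketch of the new hook: the name `add_argparse_args` comes from this changelog, while the adapter subclass, the import path, and the staticmethod signature are assumptions.

```python
from argparse import ArgumentParser

from transformers_lightning.adapters import SuperAdapter  # import path assumed

class TSVAdapter(SuperAdapter):  # hypothetical adapter
    @staticmethod
    def add_argparse_args(parser: ArgumentParser):
        # expose adapter-specific parameters on the CLI
        parser.add_argument("--train_file", type=str, required=True)

parser = ArgumentParser()
TSVAdapter.add_argparse_args(parser)
hyperparameters = parser.parse_args(["--train_file", "data/train.tsv"])
```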

0.7.3

- Added `get_dataset` method to `AdaptersDataModule` to facilitate creating datasets from adapters.

- Dropped support for `drop_last` in every dataloader: Lightning uses `False` everywhere by default.

- Fixed `TransformersModel.num_training_steps`, which in some cases returned slightly wrong values due to rounding (see the sketch after this list).

- Fixed `whole_words_tail_mask` in `language_modeling` which was not working correctly.

- Improved testing of `models` and `language_models`.
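
To see where rounding matters, here is an illustrative computation of the total number of training steps; it mirrors the intent of `num_training_steps` but is not the library's exact implementation.

```python
import math

dataset_size = 1000
batch_size = 32
accumulate_grad_batches = 4
max_epochs = 3

# A partial final batch still counts as a step, hence ceil instead of floor.
batches_per_epoch = math.ceil(dataset_size / batch_size)                  # 32, not 31.25
steps_per_epoch = math.ceil(batches_per_epoch / accumulate_grad_batches)  # 8
total_training_steps = steps_per_epoch * max_epochs                       # 24
print(total_training_steps)
```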

0.7.2

- Added tests for `optimizers` package.

- Fixed some imports.

- Fixed some calls to the `super()` method in optimizers and schedulers (see the sketch below).
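
For context, the usual pattern behind such a fix is cooperative `super()` chaining; the scheduler below is purely illustrative and not one of the library's wrappers.

```python
import torch
from torch.optim import SGD
from torch.optim.lr_scheduler import LambdaLR

class ConstantScheduler(LambdaLR):  # hypothetical scheduler
    def __init__(self, optimizer, last_epoch=-1):
        # forwarding to super().__init__ keeps the base class state consistent
        super().__init__(optimizer, lr_lambda=lambda step: 1.0, last_epoch=last_epoch)

optimizer = SGD([torch.nn.Parameter(torch.zeros(1))], lr=0.1)
scheduler = ConstantScheduler(optimizer)
```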

0.7.1

- Fixed `metrics` package imports and added tests.

0.7.0

- Added `LineAdapter` to read files line by line.

- Every `add_*_specific_args` method should now return nothing.

- Added `predict` capability to `AdaptersDataModule`.

- Added `predict` capability to `CompressedDataModule`.

- Added `do_predict()` and `predict_dataloader()` to `SuperDataModule`.

- Added a `do_preprocessing` init argument to `MapDataset` and `TransformersIterableDataset` to optionally skip the preprocessing function defined in the `Adapter`.

- Added a check on the tokenizer type in `whole_word_tails_mask()`.

- Added functions `get_optimizer`, `get_scheduler`, `num_training_steps` and corresponding CLI parameters to `TransformersModel` to allow more flexible definition of optimizers and schedulers (see the sketch after this list).

- Added optimizer wrappers to be instantiated through CLI parameters. You can still use your own optimizer in `configure_optimizers` without problems.

- Added scheduler wrappers to be instantiated through CLI parameters. You can still use your own scheduler in `configure_optimizers` without problems.

- (Re)Added metrics package with `HitRate`. However, this will likely be moved to `torchmetrics` in future releases.

- Changed `hparams` attribute of every class (`models`, `adapters`, `datamodules`, `optimizers`, `schedulers`, `callbacks` and `datasets`) to `hyperparameters` to avoid conflict with new lightning `hparams` getters and setters.

- Changed the logic of `TransformersModelCheckpointCallback` since the training loop changed in `pytorch-lightning` **v1.4**.

- Removed `TransformersAdapter` because it was too specific and useless.

- General refactoring of classes; cleaned up unused imports and refactored some tests.
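
A hedged sketch of how the new hooks might fit together in `configure_optimizers`; the names `get_optimizer`, `get_scheduler` and `num_training_steps` come from this changelog, while the import path, the model subclass and the returned dict layout follow standard `pytorch-lightning` conventions and are assumptions.

```python
from transformers_lightning.models import TransformersModel  # import path assumed

class ExampleModel(TransformersModel):  # hypothetical model
    def configure_optimizers(self):
        # hyperparameters now live under self.hyperparameters (renamed from hparams)
        optimizer = self.get_optimizer()           # built from CLI parameters
        scheduler = self.get_scheduler(optimizer)  # may use self.num_training_steps
        return {
            "optimizer": optimizer,
            "lr_scheduler": {"scheduler": scheduler, "interval": "step"},
        }
```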
