Transformers-lightning

Latest version: v0.7.12


0.7.12

- Fixed `num_training_steps` for lightning 1.7.

- Changed all static methods `add_*_args` to standard form `add_argparse_args`.

- Deprecated strategies based on `DataParallel`, following `pytorch-lightning`, and added the MPS accelerator.

- Fixed deprecated classes in lightning 1.7.

0.7.10

- Moved `pre_trained_dir` hyperparameter from `Defaults` to `TransformersModelCheckpointCallback`.

- Fixed `JsonboardLogger` with `pytorch-lightning>=1.6`.

0.7.9

- Fixed steps computation when `max_steps` is not provided by the user.

- Added `JsonboardLogger`.

- Added some tests for automatic steps computation with `deepspeed`.

0.7.8

- Fixed `TransformersMapDataset` parameters and adapter loading.

- Removed `CompressedDataModule`.

- Added `RichProgressBar` with `global_step` logging.

- Replaced the deprecated `transformers` `AdamW` in the optimizers with the `torch` implementation.

- Fixed typos.
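The optimizer swap in 0.7.8 can be sketched as follows; the model and hyperparameters here are illustrative assumptions, not the library's own code:

```python
import torch

# Tiny stand-in model (illustrative only).
model = torch.nn.Linear(4, 2)

# Old (deprecated): the transformers implementation.
# from transformers import AdamW
# optimizer = AdamW(model.parameters(), lr=1e-3)

# New: use the torch implementation directly.
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-3, weight_decay=0.01)

# One illustrative optimization step.
loss = model(torch.randn(8, 4)).sum()
loss.backward()
optimizer.step()
print(type(optimizer).__name__)
```

The two classes take near-identical constructor arguments, so the swap is usually a one-line import change.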

0.7.7

- Removed the `TransformersModelCheckpointCallback` update.

- `TransformersModel.num_training_steps` is no longer a function or a property, plus a related fix.

- Updated tests to use the new `accelerator` and `strategy` signature for selecting the training hardware.

- Fixed check on shuffle in `SuperDataModule`.

- Completely removed the metrics package; all metrics are now available in the `torchmetrics` library.

0.7.6

- Fixed package publication.
