Simpletransformers

Latest version: v0.70.1


0.24.2

0.24.1

0.24.0

Added

- Added ELECTRA pretraining support.
- Added better support for configuring model architectures when training language models from scratch.
- Any options that should override the default model config can now be specified under the `config` key in the `args` dict, as in the sketch below.
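
For instance, a minimal sketch of the new `config` override (the model type, the `train.txt` file name, and the specific override values are illustrative assumptions, not defaults):

```python
from simpletransformers.language_modeling import LanguageModelingModel

# Any option placed under the "config" key overrides the corresponding
# attribute of the default Hugging Face model config.
model_args = {
    "vocab_size": 30000,         # required whenever a new tokenizer is trained
    "config": {
        "num_hidden_layers": 6,  # assumed override: a shallower model
        "hidden_size": 256,      # assumed override: a narrower model
    },
}

# model_name=None trains a model from scratch; train_files supplies the
# corpus used to train the new tokenizer.
model = LanguageModelingModel("bert", None, args=model_args, train_files="train.txt")
model.train_model("train.txt")
```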

Changed

- Default entry for `vocab_size` removed from `args` for `LanguageModelingModel` as it differs for different model types.
- `vocab_size` must now be specified whenever a new tokenizer is trained; see the sketch below.
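
A minimal sketch of ELECTRA pretraining under the new requirement (the `corpus.txt` file name and the vocabulary size are assumptions):

```python
from simpletransformers.language_modeling import LanguageModelingModel

# vocab_size no longer has a default, so it must be set explicitly
# whenever a new tokenizer is trained from scratch.
model_args = {"vocab_size": 52000}  # assumed size for the new tokenizer

model = LanguageModelingModel(
    "electra",
    None,                      # no pretrained weights: a tokenizer is trained too
    args=model_args,
    train_files="corpus.txt",  # corpus used to train the tokenizer
)
model.train_model("corpus.txt")
```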

Fixed

- Fixed bugs when training BERT language models (with WordPiece tokenization) from scratch.
- Fixed incorrect special tokens being used with BERT models when training a new tokenizer.
- Fixed potential bugs with BERT tokenizer training.

0.23.3

Fixed

- Fixed a bug in `QuestionAnsweringModel` where the `save_model()` method wasn't being called properly.
- Fixed a bug in calculating the global step when resuming training.

0.23.2

Fixed

- Prevented padding tokens from being added when using `openai-gpt` and `gpt2` models for language modeling.
