pytorch-warmup

Latest version: v0.1.1

0.1.1

* This release was made so that the PyPI package includes a license file (#9).
* In addition, the GitHub Actions workflows have been updated.

There are no further changes in this release.

0.1.0

* A `with` statement now encapsulates the undampened learning rate (see the conceptual sketch below).
* Learning rate scheduler "chaining" works together with this version of pytorch_warmup.
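
To illustrate the idea, here is a conceptual sketch of a dampening context manager: on entry it restores the undampened learning rates so that a chained LR scheduler steps on the true schedule, and on exit it reapplies the warmup factor. This is not the pytorch_warmup implementation; the class name, the `warmup_period` parameter, and the linear warmup rule are illustrative assumptions.

```python
import contextlib


class DampeningSketch:
    """Conceptual sketch only; not the actual pytorch_warmup code."""

    def __init__(self, optimizer, warmup_period=100):
        self.optimizer = optimizer
        self.warmup_period = warmup_period  # hypothetical parameter
        self.last_step = 0
        # remember the undampened learning rates
        self.lrs = [group['lr'] for group in optimizer.param_groups]

    def _warmup_factor(self, step):
        # linear warmup: ramps from near 0 up to 1 over warmup_period steps
        return min(1.0, (step + 1) / self.warmup_period)

    @contextlib.contextmanager
    def dampening(self):
        # on entry: restore the undampened rates, so a chained
        # lr_scheduler.step() inside the block sees the true schedule
        for group, lr in zip(self.optimizer.param_groups, self.lrs):
            group['lr'] = lr
        yield
        # on exit: record the (possibly scheduler-updated) undampened
        # rates, then write back the dampened values
        self.last_step += 1
        self.lrs = [group['lr'] for group in self.optimizer.param_groups]
        factor = self._warmup_factor(self.last_step)
        for group, lr in zip(self.optimizer.param_groups, self.lrs):
            group['lr'] = lr * factor
```

With this structure, the LR scheduler always sees the undampened rate inside the `with` block, while the optimizer always steps with the dampened one.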

Why this change is needed
With the previous version, we had to work around the "chaining" problem with ugly code:
```python
optimizer.step()
lr_scheduler.step(lr_scheduler.last_epoch + 1)
warmup_scheduler.dampen()
```

This workaround causes a more serious problem: because an epoch argument is passed to `scheduler.step()`, PyTorch emits a user warning:

```
UserWarning: The epoch parameter in `scheduler.step()` was not necessary and is being deprecated where possible. Please use `scheduler.step()` to step the scheduler. During the deprecation, if epoch is different from None, the closed form is used instead of the new chainable form, where available. Please open an issue if you are unable to replicate your use case: https://github.com/pytorch/pytorch/issues/new/choose.
warnings.warn(EPOCH_DEPRECATION_WARNING, UserWarning)
```

With this version, we simply write:
```python
optimizer.step()
with warmup_scheduler.dampening():
    lr_scheduler.step()
```

If you use no LR scheduler, put `pass` in the `with` block:
```python
optimizer.step()
with warmup_scheduler.dampening():
    pass
```
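
For context, a minimal end-to-end sketch of the new usage might look like the following. The toy model, data, and the `ExponentialLR`/`UntunedLinearWarmup` choices are illustrative assumptions; `dampening()` is the context manager introduced in this release.

```python
import torch
import pytorch_warmup as warmup

# toy model and optimizer (illustrative)
model = torch.nn.Linear(16, 2)
optimizer = torch.optim.AdamW(model.parameters(), lr=0.01)

# a chainable PyTorch scheduler plus a warmup scheduler
lr_scheduler = torch.optim.lr_scheduler.ExponentialLR(optimizer, gamma=0.999)
warmup_scheduler = warmup.UntunedLinearWarmup(optimizer)

for step in range(1000):
    batch = torch.randn(8, 16)  # dummy input
    loss = model(batch).pow(2).mean()
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    # the warmup factor dampens whatever rate the chained scheduler produces
    with warmup_scheduler.dampening():
        lr_scheduler.step()
```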
