Finetuning-scheduler

Latest version: v2.4.0

2.0.9

- Support for Lightning 2.0.8 and 2.0.9

2.0.7

- Support for Lightning 2.0.7

2.0.6

- Support for Lightning 2.0.5 and 2.0.6

2.0.4

- Support for PyTorch Lightning 2.0.3 and 2.0.4
- Adjusted the default example log name
- Temporarily disabled FSDP 1.x mixed precision tests until https://github.com/Lightning-AI/lightning/pull/17807 is merged

2.0.2

Added

- Beta support for [optimizer reinitialization](https://finetuning-scheduler.readthedocs.io/en/stable/advanced/optimizer_reinitialization.html). Resolves [#6](https://github.com/speediedan/finetuning-scheduler/issues/6)
- Use structural typing for Fine-Tuning Scheduler supported optimizers with ``ParamGroupAddable``
- Support for ``jsonargparse`` version ``4.20.1``
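
The structural-typing approach mentioned above can be illustrated with a minimal sketch using ``typing.Protocol``. The single required ``add_param_group`` method here is an assumption for illustration, not the protocol's actual definition; the point is that any optimizer exposing the right methods satisfies the protocol without explicit subclassing:

```python
from typing import Any, Dict, List, Protocol, runtime_checkable

@runtime_checkable
class ParamGroupAddable(Protocol):
    """Structural type: any object with a compatible ``add_param_group``
    method qualifies, with no inheritance required."""
    def add_param_group(self, param_group: Dict[str, Any]) -> None: ...

class ToyOptimizer:
    """Not a subclass of ParamGroupAddable, but structurally compatible."""
    def __init__(self) -> None:
        self.param_groups: List[Dict[str, Any]] = []

    def add_param_group(self, param_group: Dict[str, Any]) -> None:
        self.param_groups.append(param_group)

opt = ToyOptimizer()
print(isinstance(opt, ParamGroupAddable))  # True (structural check)
```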

Changed

- During schedule phase transitions, the latest LR state is now restored before proceeding with the next phase's configuration and execution. This is mostly relevant to LR scheduler and optimizer reinitialization, but it also improves configuration when restoring best checkpoints across multiple depths.
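
The restore-before-reconfigure behavior described above can be sketched in plain Python (no PyTorch or Lightning); ``ToyScheduler`` is an illustrative stand-in, not part of the library API:

```python
# Toy sketch: capture the latest LR state and restore it before the
# next phase's scheduler takes over, instead of resetting to the initial LR.
class ToyScheduler:
    def __init__(self, lr: float) -> None:
        self.lr = lr

    def state_dict(self) -> dict:
        return {"lr": self.lr}

    def load_state_dict(self, state: dict) -> None:
        self.lr = state["lr"]

sched = ToyScheduler(lr=1e-3)
sched.lr = 5e-4             # LR evolved during the current phase
saved = sched.state_dict()  # capture the latest LR state

new_sched = ToyScheduler(lr=1e-3)  # naive reinitialization would reset the LR
new_sched.load_state_dict(saved)   # restore the latest LR state first
print(new_sched.lr)  # 0.0005
```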

Fixed

- Allow sharded optimizers (e.g. ``ZeroRedundancyOptimizer``) to be properly reconfigured when necessary with ``enforce_phase0_params`` set to ``True``.

2.0.1

Added

- Support for PyTorch Lightning 2.0.1
- Support for Lightning's ``use_orig_params`` option (via [#16733](https://github.com/Lightning-AI/lightning/pull/16733))
