Finetuning-scheduler

Latest version: v2.5.1


0.3.4

Added

- support for `pytorch-lightning` 1.8.6
- Notify the user when `max_depth` is reached and provide the current training session stopping conditions. Resolves [7](https://github.com/speediedan/finetuning-scheduler/issues/7).
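The `max_depth` notification above concerns the point at which a scheduled fine-tuning run stops thawing deeper layers. A minimal pure-Python sketch of that stopping condition (the function names and the `-1` "no cap" convention here are illustrative assumptions, not the actual finetuning-scheduler internals):

```python
def should_stop(current_depth: int, max_depth: int) -> bool:
    """Return True once the fine-tuning depth cap is reached.

    Assumes the convention that max_depth == -1 means "no cap".
    """
    return max_depth != -1 and current_depth >= max_depth


def depth_progression(n_phases: int, max_depth: int) -> list:
    """Return the schedule phases actually executed under a depth cap."""
    executed = []
    for depth in range(n_phases):
        executed.append(depth)
        if should_stop(depth, max_depth):
            # This is where a user notification of the stopping
            # condition (as described in this release) would fire.
            break
    return executed
```

With a cap of 2, phases 0 through 2 run and then training stops; with `-1`, the full schedule executes.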


Changed

- set package version ceilings for the example requirements and added a note explaining that the ceilings were introduced for stability
- promoted PL CLI references to top-level package

Fixed

- replaced deprecated `Batch` object reference with `LazyDict`

0.3.3

Added

- support for `pytorch-lightning` 1.8.4

Changed

- pinned `jsonargparse` dependency to <4.18.0 until [205](https://github.com/omni-us/jsonargparse/issues/205) is fixed
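In a requirements file, the temporary pin described above amounts to a single version ceiling:

```
jsonargparse<4.18.0
```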

0.3.2

Added

- support for `pytorch-lightning` 1.8.2

0.3.1

Added

- support for `pytorch-lightning` 1.8.1
- augmented `standalone_tests.sh` to be more robust to false negatives

Changed

- added a temporary expected `distutils` warning until it is fixed upstream in PL
- updated `depth` type hint to accommodate updated mypy default config
- bumped full test timeout to be more conservative given a dependent package that is currently slow to install in some contexts (i.e. `grpcio` on MacOS 11 with python `3.10`)

0.3.0

Added

- support for pytorch-lightning 1.8.0
- support for python 3.10
- support for PyTorch 1.13
- support for `ZeroRedundancyOptimizer`

Fixed

- a call to PL `BaseFinetuning.freeze` did not properly hand control of `BatchNorm` module thawing to the FTS schedule. Resolves [5](https://github.com/speediedan/finetuning-scheduler/issues/5).
- fixed codecov config for azure pipeline gpu-based coverage

Changed

- Refactored unexpected and expected multi-warning checks to use a single test helper function
- Adjusted multiple FTS imports to adapt to reorganized PL/Lite imports
- Refactored fts-torch collect_env interface to allow for (slow) collect_env evolution on a per-torch version basis
- Bumped required jsonargparse version
- adapted to PL protection of `_distributed_available`
- made callback setup stage arg mandatory
- updated mypy config to align with PL `Trainer` handling
- updated dockerfile defs for PyTorch 1.13 and python 3.10
- updated github actions versions to current versions
- excluded python 3.10 from torch 1.9 testing due to incompatibility

Deprecated

- removed use of deprecated `LightningCLI` `save_config_overwrite` in PL 1.8

0.2.3

Added

- support for pytorch-lightning 1.7.7
- added a new temporary expected HF warning to the examples
- added HF `evaluate` dependency for examples

Changed

- Use HF `evaluate.load()` instead of `datasets.load_metric()`
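The migration from the deprecated `datasets` metric loader to HF `evaluate` is a one-line change; the `"glue"`/task arguments shown are illustrative, not the exact example code:

```diff
- metric = datasets.load_metric("glue", task_name)
+ metric = evaluate.load("glue", task_name)
```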
