finetuning-scheduler

2.6.0

Added

- Support for Lightning and PyTorch ``2.6.0``

Deprecated

- Removed support for PyTorch ``2.2``
- Removed use of conda builds (aligning with upstream PyTorch)

2.5.0

Added

- Support for Lightning and PyTorch ``2.5.0``
- FTS support for PyTorch's composable distributed (e.g. ``fully_shard``, ``checkpoint``) and Tensor Parallelism (TP) APIs
- Support for Lightning's ``ModelParallelStrategy``
- Experimental 'Auto' FSDP2 Plan Configuration feature, allowing application of the ``fully_shard`` API via module name/pattern-based configuration instead of manually inspecting modules and applying the API in ``LightningModule.configure_model`` (the manual pattern is sketched after this list)
- FSDP2 'Auto' Plan Convenience Aliases, simplifying use of both composable and non-composable activation checkpointing APIs
- Flexible orchestration of advanced profiling combining multiple complementary PyTorch profilers with FTS ``MemProfiler``
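
For context, here is a minimal sketch of the manual sharding plan that the experimental 'Auto' configuration automates, applying PyTorch's composable ``fully_shard`` API inside ``LightningModule.configure_model``. The toy ``nn.Sequential`` model and module/class names are illustrative assumptions, not FTS APIs:

```python
import lightning.pytorch as pl
import torch.nn as nn
from lightning.pytorch.strategies import ModelParallelStrategy
from torch.distributed._composable.fsdp import fully_shard


class ManualPlanModule(pl.LightningModule):
    """Sketch of the manual FSDP2 plan the 'Auto' feature automates."""

    def __init__(self) -> None:
        super().__init__()
        # Toy stand-in for a block-structured (e.g. transformer) model.
        self.model = nn.Sequential(*[nn.Linear(32, 32) for _ in range(4)])

    def configure_model(self) -> None:
        # Manually inspect submodules and apply the composable FSDP2 API:
        # shard each block first, then the root module.
        for block in self.model:
            fully_shard(block)
        fully_shard(self.model)


# Composable distributed / TP training runs under Lightning's
# ModelParallelStrategy, also newly supported by FTS in this release.
trainer = pl.Trainer(strategy=ModelParallelStrategy(), devices=2)
```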

Deprecated

- Removed support for PyTorch ``2.1``

2.4.1

Added

- Support for Lightning and PyTorch ``2.4.1``

Fixed

- Added logic to more robustly condition depth-aligned checkpoint metadata updates, addressing edge cases where ``current_score`` exactly equaled ``best_model_score`` at multiple different depths (illustrated in the sketch below). Resolved [#15](https://github.com/speediedan/finetuning-scheduler/issues/15).
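
To make the edge case concrete, here is a hypothetical illustration (not FTS internals) of why the metadata update must be conditioned on depth as well as score:

```python
# Hypothetical illustration only; FTS internals differ. When scores tie
# exactly at multiple depths, a score-only comparison cannot determine
# which depth's checkpoint metadata to update, so the update is also
# conditioned on the depth that produced the best checkpoint.
def should_update_metadata(current_score: float, best_model_score: float,
                           current_depth: int, best_ckpt_depth: int) -> bool:
    if current_score != best_model_score:
        return current_score > best_model_score  # assumes higher-is-better
    # Exact tie: only refresh depth-aligned metadata when the current depth
    # matches the depth at which the best checkpoint was saved.
    return current_depth == best_ckpt_depth
```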

2.4.0

Added

- Support for Lightning and PyTorch ``2.4.0``
- Support for Python ``3.12``

Changed

- Changed the default value of the ``frozen_bn_track_running_stats`` option of the FTS callback constructor to ``True`` (see the sketch below).
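
A minimal sketch of overriding this new default; the option name comes from the entry above, and passing ``False`` restores the pre-2.4.0 behavior:

```python
import lightning.pytorch as pl
from finetuning_scheduler import FinetuningScheduler

trainer = pl.Trainer(
    callbacks=[
        # As of 2.4.0, frozen BatchNorm layers keep updating their running
        # statistics by default; pass False to restore the prior behavior.
        FinetuningScheduler(frozen_bn_track_running_stats=False)
    ]
)
```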

Deprecated

- Removed support for PyTorch ``2.0``
- Removed support for Python ``3.8``

2.3.3

- Support for Lightning <= ``2.3.3`` (includes critical security fixes) and PyTorch <= ``2.3.1``

2.3.2

- Support for Lightning <= ``2.3.2`` and PyTorch <= ``2.3.1``
