Finetuning-scheduler

Latest version: v2.2.1

2.2.0

Added

- Support for Lightning and PyTorch ``2.2.0``
- FTS now inspects any base `EarlyStopping` or `ModelCheckpoint` configuration passed in by the user and applies that configuration when instantiating the required FTS callback dependencies (i.e., `FTSEarlyStopping` or `FTSCheckpoint`). Part of the resolution to [12](https://github.com/speediedan/finetuning-scheduler/issues/12).
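
As an illustration of this behavior, here is a minimal usage sketch (the `val_loss` monitor and patience value are placeholders, not defaults from the FTS docs): the settings of the user-supplied `EarlyStopping` callback are carried over to the `FTSEarlyStopping` instance that FTS instantiates as a dependency.

```python
from lightning.pytorch import Trainer
from lightning.pytorch.callbacks import EarlyStopping
from finetuning_scheduler import FinetuningScheduler

# The base EarlyStopping configuration below is inspected by FTS and applied to the
# FTSEarlyStopping callback it instantiates as a required dependency.
trainer = Trainer(
    callbacks=[
        FinetuningScheduler(),
        EarlyStopping(monitor="val_loss", patience=3),  # placeholder monitor/patience
    ]
)
```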

Changed

- Updated reference to the renamed `FSDPPrecision`
- Increased `jsonargparse` minimum supported version to `4.26.1`
- Bumped `sphinx` requirement to `>5.0,<6.0`

Fixed

- Explicitly `rank_zero_only`-guarded `ScheduleImplMixin.save_schedule` and `ScheduleImplMixin.gen_ft_schedule`. Some codepaths were incorrectly invoking them from non-`rank_zero_only` guarded contexts. Resolved [11](https://github.com/speediedan/finetuning-scheduler/issues/11). (A guarding sketch follows this list.)
- Added a [note in the documentation](https://finetuning-scheduler.readthedocs.io/en/latest/#basic-usage) indicating more clearly the behavior of FTS when no monitor metric configuration is provided. Part of the resolution to [12](https://github.com/speediedan/finetuning-scheduler/issues/12).
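
For context, a minimal sketch of `rank_zero_only` guarding in Lightning (the decorated function below is illustrative and not part of the FTS internals):

```python
from lightning.pytorch.utilities import rank_zero_only


@rank_zero_only
def save_artifact(path: str) -> None:
    # Only the global rank-zero process executes the body; on other ranks the
    # decorated call becomes a no-op, so every rank may invoke it safely.
    with open(path, "w") as fp:
        fp.write("schedule contents")


save_artifact("ft_schedule.yaml")  # placeholder filename
```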

Deprecated

- Removed support for PyTorch `1.12`
- Removed legacy FTS examples

2.1.4

Added

- Support for Lightning ``2.1.4``

Changed

- Bumped `sphinx` requirement to `>5.0,<6.0`

Deprecated

- Removed deprecated lr `verbose` init param usage
- Removed deprecated `tensorboard.dev` references

2.1.3

Added

- Support for Lightning ``2.1.3``

2.1.2

Added

- Support for Lightning ``2.1.2``

Fixed

- Explicitly `rank_zero_only`-guarded `ScheduleImplMixin.save_schedule` and `ScheduleImplMixin.gen_ft_schedule`. Some codepaths were incorrectly invoking them from non-`rank_zero_only` guarded contexts. Resolves [11](https://github.com/speediedan/finetuning-scheduler/issues/11).

2.1.1

Added

- Support for Lightning ``2.1.1``

2.1.0

Added

- Support for Lightning and PyTorch ``2.1.0``
- Support for Python ``3.11``
- Support for simplified scheduled FSDP training with PyTorch >= ``2.1.0`` and ``use_orig_params`` set to ``True`` (see the configuration sketch after this list)
- Unified the different FSDP `use_orig_params` mode code paths to support saving/restoring a full, consolidated optimizer state dict (OSD) (PyTorch versions >= ``2.0.0``)
- Added support for FSDP `activation_checkpointing_policy` and updated the FSDP profiling examples accordingly
- Added support for `CustomPolicy` and the new implementation of `ModuleWrapPolicy` with FSDP ``2.1.0``
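
A minimal configuration sketch combining these options (module and file names such as `TransformerBlock` and `my_schedule.yaml` are placeholders; the exact strategy arguments should be checked against the Lightning and FTS docs for your versions):

```python
from lightning.pytorch import Trainer
from lightning.pytorch.strategies import FSDPStrategy
from torch.distributed.fsdp.wrap import ModuleWrapPolicy
from finetuning_scheduler import FinetuningScheduler

from my_project.modules import TransformerBlock  # placeholder module type

strategy = FSDPStrategy(
    auto_wrap_policy=ModuleWrapPolicy({TransformerBlock}),
    activation_checkpointing_policy={TransformerBlock},
    use_orig_params=True,  # forwarded to torch FSDP; enables the simplified scheduled path
)

trainer = Trainer(
    strategy=strategy,
    accelerator="gpu",
    devices=2,
    callbacks=[FinetuningScheduler(ft_schedule="my_schedule.yaml")],  # placeholder schedule file
)
```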

Changed

- FSDP profiling examples now use a patched version of ``FSDPStrategy`` to avoid https://github.com/omni-us/jsonargparse/issues/337 with ``jsonargparse`` < ``4.23.1``

Fixed

- Updated ``validate_min_wrap_condition`` to avoid overly restrictive validation in some ``use_orig_params`` contexts
- For PyTorch versions < 2.0, disabled optimizer state saving/restoration when using the FSDP strategy, per https://github.com/Lightning-AI/lightning/pull/18296
- Improved FSDP strategy adapter `no_decay` attribute handling

Deprecated

- ``FSDPStrategyAdapter`` now uses the ``configure_model`` hook rather than the deprecated ``configure_sharded_model`` hook to apply the relevant model wrapping (see the hook sketch after this list). See https://github.com/Lightning-AI/lightning/pull/18004 for more context regarding the ``configure_sharded_model`` deprecation.
- Dropped support for PyTorch ``1.11.x``.
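
For background on the hook migration, a minimal sketch of the ``configure_model`` hook in a Lightning 2.x ``LightningModule`` (the layer contents are illustrative only; FTS's ``FSDPStrategyAdapter`` applies its own wrapping logic within this hook):

```python
import torch.nn as nn
from lightning.pytorch import LightningModule


class MyFSDPModule(LightningModule):
    def __init__(self) -> None:
        super().__init__()
        self.model = None

    def configure_model(self) -> None:
        # Replaces the deprecated configure_sharded_model hook: large layers are
        # created here so the strategy can shard them during setup. The hook may be
        # called more than once, so creation is guarded to keep it idempotent.
        if self.model is None:
            self.model = nn.Sequential(
                nn.Linear(512, 512), nn.ReLU(), nn.Linear(512, 10)
            )
```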
