Lightning

Latest version: v2.5.0.post0

1.0.8

Detail changes

Added

- Added casting of NumPy scalars to Python types when logging `hparams` (4647)
- Added warning when progress bar refresh rate is less than 20 on Google Colab to prevent crashing (4654)
- Added `F1` class metric (4656)
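
The new `F1` class metric is a stateful metric that accumulates counts across batches. A minimal plain-Python sketch of the underlying formula it computes (not the library API) for the binary case:

```python
def f1_score(preds, targets):
    """Binary F1 = 2 * precision * recall / (precision + recall).

    Illustrative sketch only; the library version is a Metric subclass
    that accumulates tp/fp/fn across batches and supports multiclass.
    """
    tp = sum(p == 1 and t == 1 for p, t in zip(preds, targets))
    fp = sum(p == 1 and t == 0 for p, t in zip(preds, targets))
    fn = sum(p == 0 and t == 1 for p, t in zip(preds, targets))
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    return 2 * precision * recall / (precision + recall) if precision + recall else 0.0
```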

Changed

- Consistently use `step=trainer.global_step` in `LearningRateMonitor` independently of `logging_interval` (4376)
- Metric states are no longer added to `state_dict` by default (4685)
- Renamed class metric `Fbeta` to `FBeta` (4656)
- Model summary: added 1 decimal place (4745)
- Do not override `PYTHONWARNINGS` (4700)

Fixed

- Fixed checkpoint `hparams` dict casting when `omegaconf` is available (4770)
- Fixed incomplete progress bars when total batches not divisible by refresh rate (4577)
- Updated SSIM metric (4566, 4656)
- Fixed batch size scaling by passing `batch_arg_name` to all calls to `_adjust_batch_size` (4812)
- Fixed `torchtext` data to GPU (4785)
- Fixed a crash bug in MLFlow logger (4716)

Contributors

awaelchli, jonashaag, jungwhank, M-Salti, moi90, pgagarinov, s-rog, Samyak2, SkafteNicki, teddykoker, ydcjeff

_If we forgot someone due to not matching commit email with GitHub account, let us know :]_

1.0.7

Detail changes

Added

- Added lambda closure to `manual_optimizer_step` (4618)

Changed

- Changed Metrics `persistent` default mode to `False` (4685)
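
With persistence off by default, metric state is kept out of `state_dict` unless opted in. A toy plain-Python sketch of the idea (the real `Metric` builds on `torch.nn.Module` buffers; names here are illustrative):

```python
class MetricState:
    """Toy illustration of persistent vs. non-persistent metric states."""

    def __init__(self):
        self._states = {}
        self._persistent = {}

    def add_state(self, name, default, persistent=False):
        # As of 1.0.7, states default to persistent=False.
        self._states[name] = default
        self._persistent[name] = persistent

    def persistent(self, mode):
        # The persistent(mode) method flips persistence for all states at once.
        for name in self._persistent:
            self._persistent[name] = mode

    def state_dict(self):
        # Only persistent states are serialized.
        return {k: v for k, v in self._states.items() if self._persistent[k]}
```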


Fixed

- Prevent crash if `sync_dist=True` on CPU (4626)
- Fixed average pbar Metrics (4534)
- Fixed `setup` callback hook to correctly pass the LightningModule through (4608)
- Allowed decorating model init with `hparams` saving inside (4662)
- Fixed `split_idx` set by `LoggerConnector` in `on_trainer_init` to `Trainer` (4697)

Contributors

ananthsub, Borda, SeanNaren, SkafteNicki, tchaton

_If we forgot someone due to not matching commit email with GitHub account, let us know :]_

1.0.6

Detail changes

Added

- Added metrics aggregation in Horovod and fixed early stopping (3775)
- Added `manual_optimizer_step`, which works with `AMP Native` and `accumulated_grad_batches` (4485)
- Added `persistent(mode)` method to metrics, to enable and disable metric states being added to `state_dict` (4482)
- Added congratulations at the end of our notebooks (4555)

Changed

- Changed `fsspec` to tuner (4458)
- Unified SLURM/TorchElastic under backend plugin (4578, 4580, 4581, 4582, 4583)

Fixed

- Fixed missing functionality in `hpc_load` (4526)
- Fixed metrics states being overridden in DDP mode (4482)
- Fixed `lightning_getattr`, `lightning_hasattr` not finding the correct attributes in datamodule (4347)
- Fixed automatic optimization AMP by `manual_optimization_step` (4485)
- Replace `MisconfigurationException` with warning in `ModelCheckpoint` Callback (4560)
- Fixed logged keys in mlflow logger (4412)
- Fixed `is_picklable` by catching `AttributeError` (4508)

Contributors

dscarmo, jtamir, kazhang, maxjeblick, rohitgr7, SkafteNicki, tarepan, tchaton, tgaddair, williamFalcon

_If we forgot someone due to not matching commit email with GitHub account, let us know :]_

1.0.5

Detail changes

Added

- Added PyTorch 1.7 Stable support (3821)
- Added timeout for `tpu_device_exists` to ensure process does not hang indefinitely (4340)
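
A device probe like `tpu_device_exists` can hang if the runtime never responds, so it is wrapped in a timeout. A hedged sketch of one way to do this (`probe` is a hypothetical callable, not Lightning's actual implementation):

```python
from concurrent.futures import ThreadPoolExecutor, TimeoutError as FutureTimeout

def probe_with_timeout(probe, timeout=5.0):
    """Run a device probe that may hang, giving up after `timeout` seconds.

    Illustrative only: a real worker thread cannot be killed, but the caller
    stops waiting and treats a hang as "device not available".
    """
    executor = ThreadPoolExecutor(max_workers=1)
    future = executor.submit(probe)
    try:
        return future.result(timeout=timeout)
    except FutureTimeout:
        return False
    finally:
        # Don't block on a still-running probe thread.
        executor.shutdown(wait=False)
```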

Changed

- W&B log in sync with `Trainer` step (4405)
- Hook `on_after_backward` is called only when `optimizer_step` is being called (4439)
- Moved `track_and_norm_grad` into `training loop` and called only when `optimizer_step` is being called (4439)
- Changed type checker with explicit cast of ref_model object (4457)

Deprecated

- Deprecated passing `ModelCheckpoint` instance to `checkpoint_callback` Trainer argument (4336)

Fixed

- Disable saving checkpoints if not trained (4372)
- Fixed error using `auto_select_gpus=True` with `gpus=-1` (4209)
- Disabled training when `limit_train_batches=0` (4371)
- Fixed that metrics do not store computational graph for all seen data (4313)
- Fixed AMP unscale for `on_after_backward` (4439)
- Fixed TorchScript export when module includes Metrics (4428)
- Fixed CSV logger warning (4419)
- Fixed skip DDP parameter sync (4301)

Contributors

ananthsub, awaelchli, borisdayma, carmocca, justusschock, lezwon, rohitgr7, SeanNaren, SkafteNicki, ssaru, tchaton, ydcjeff

_If we forgot someone due to not matching commit email with GitHub account, let us know :]_

1.0.4

Detail changes

Added

- Added `dirpath` and `filename` parameter in `ModelCheckpoint` (4213)
- Added plugins docs and DDPPlugin to customize ddp across all accelerators (4258)
- Added `strict` option to the scheduler dictionary (3586)
- Added `fsspec` support for profilers (4162)
- Added autogenerated helptext to `Trainer.add_argparse_args` (4344)
- Added support for string values in `Trainer`'s `profiler` parameter (3656)
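
With the new `dirpath`/`filename` split in `ModelCheckpoint`, checkpoint names come from a `filename` template filled with metric values. A plain-Python sketch of how such a template could be expanded (`format_checkpoint_name` is a hypothetical helper, not Lightning's API):

```python
def format_checkpoint_name(template, metrics):
    """Fill a ModelCheckpoint-style filename template from a metrics dict.

    Illustrative only; the real implementation handles missing keys,
    auto-insertion of metric names, and more.
    """
    return template.format(**metrics)

name = format_checkpoint_name(
    "epoch={epoch}-val_loss={val_loss:.2f}",
    {"epoch": 3, "val_loss": 0.1234},
)
# name == "epoch=3-val_loss=0.12"
```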

Changed

- Improved error messages for invalid `configure_optimizers` returns (3587)
- Allow changing the logged step value in `validation_step` (4130)
- Allow setting `replace_sampler_ddp=True` with a distributed sampler already added (4273)
- Fixed sanitized parameters for `WandbLogger.log_hyperparams` (4320)

Deprecated

- Deprecated `filepath` in `ModelCheckpoint` (4213)
- Deprecated `reorder` parameter of the `auc` metric (4237)
- Deprecated bool values in `Trainer`'s `profiler` parameter (3656)

Fixed

- Fixed setting device ids in DDP (4297)
- Fixed synchronization of best model path in `ddp_accelerator` (4323)
- Fixed `WandbLogger` not uploading checkpoint artifacts at the end of training (4341)

Contributors

ananthsub, awaelchli, carmocca, ddrevicky, louis-she, mauvilsa, rohitgr7, SeanNaren, tchaton

_If we forgot someone due to not matching commit email with GitHub account, let us know :]_

1.0.3

Detail changes

Added

- Added persistent flag to `Metric.add_state` (4195)

Changed

- Used `checkpoint_connector.hpc_save` in SLURM (4217)
- Moved base req. to root (4219)

Fixed

- Fixed `hparams` assign in init (4189)
- Fixed overwrite check for model hooks (4010)

Contributors

Borda, EspenHa, teddykoker

_If we forgot someone due to not matching commit email with GitHub account, let us know :]_
