### Added
- Added a check for optimizer attached to `lr_scheduler` (5338)
- Added support for passing non-existent filepaths to `resume_from_checkpoint` (4402)
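As a hypothetical stand-alone sketch (the names `validate_schedulers`, `Optimizer`, and `Scheduler` are illustrative, not Lightning's internals), the kind of check the first entry adds verifies that every configured LR scheduler is attached to one of the known optimizers:

```python
# Illustrative sketch, not Lightning's actual code: a scheduler built without
# an optimizer, or with an optimizer Lightning does not know about, should
# fail fast instead of silently never stepping.
class Optimizer:
    pass

class Scheduler:
    def __init__(self, optimizer=None):
        self.optimizer = optimizer

def validate_schedulers(schedulers, optimizers):
    """Raise if any scheduler lacks an optimizer or references an unknown one."""
    for sched in schedulers:
        if sched.optimizer is None or sched.optimizer not in optimizers:
            raise ValueError(
                "A scheduler is not attached to an optimizer returned by "
                "configure_optimizers"
            )

opt = Optimizer()
validate_schedulers([Scheduler(opt)], [opt])  # passes silently
```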
### Changed
- Skipped restoring from `resume_from_checkpoint` while `testing` (5161)
- Allowed `log_momentum` for adaptive optimizers in `LearningRateMonitor` (5333)
- Disabled checkpointing, early stopping, and logging with `fast_dev_run` (5277)
- Distributed group defaults to `WORLD` if `None` (5125)
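The `fast_dev_run` change above can be pictured with a hypothetical stand-alone illustration (the classes and `active_callbacks` helper below are made up for the sketch, not Lightning's internals): when the flag is set, checkpoint and early-stopping callbacks are filtered out so a debug run leaves no artifacts behind.

```python
# Illustrative sketch only: dummy callback classes standing in for the real ones.
class ModelCheckpoint:
    pass

class EarlyStopping:
    pass

class ProgressBar:
    pass

def active_callbacks(callbacks, fast_dev_run):
    """Drop checkpointing/early-stopping callbacks when fast_dev_run is on."""
    if not fast_dev_run:
        return callbacks
    return [
        cb for cb in callbacks
        if not isinstance(cb, (ModelCheckpoint, EarlyStopping))
    ]

cbs = [ModelCheckpoint(), EarlyStopping(), ProgressBar()]
print([type(cb).__name__ for cb in active_callbacks(cbs, fast_dev_run=True)])
# -> ['ProgressBar']
```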
### Fixed
- Fixed `trainer.test` returning non-test metrics (5214)
- Fixed metric state reset (5273)
- Fixed `--num-nodes` on `DDPSequentialPlugin` (5327)
- Fixed invalid value for `weights_summary` (5296)
- Fixed `Trainer.test` not using the latest `best_model_path` (5161)
- Fixed existence check for `hparams` not using underlying filesystem (5250)
- Fixed `LightningOptimizer` AMP bug (5191)
- Fixed casting of keys to `str` in `_flatten_dict` (5354)
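To illustrate the last fix, here is a stand-alone sketch (the `flatten_dict` function below is written for this example and is not Lightning's `_flatten_dict`) of flattening a nested hyperparameter dict while casting non-string keys, such as ints or enum members, to `str` so loggers receive uniform string keys:

```python
# Illustrative sketch only: flatten {"a": {5: 1}} into {"a/5": 1},
# casting every key to str along the way.
def flatten_dict(params, delimiter="/"):
    """Flatten a nested dict into a single level, joining keys with delimiter."""
    flat = {}
    for key, value in params.items():
        key = str(key)  # cast non-string keys (ints, enums, ...) to str
        if isinstance(value, dict):
            for sub_key, sub_value in flatten_dict(value, delimiter).items():
                flat[f"{key}{delimiter}{sub_key}"] = sub_value
        else:
            flat[key] = value
    return flat

print(flatten_dict({"optimizer": {0: "adam", "lr": 1e-3}}))
# -> {'optimizer/0': 'adam', 'optimizer/lr': 0.001}
```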
### Contributors
8greg8, haven-jeon, kandluis, marload, rohitgr7, tadejsv, tarepan, tchaton
_If we forgot someone due to not matching commit email with GitHub account, let us know :]_