pytorch-forecasting

Latest version: v1.1.1




1.1.1

Hotfix for accidental package name change in `pyproject.toml`.

The package name is now corrected to `pytorch-forecasting`.

1.1.0

Maintenance update widening compatibility ranges and consolidating dependencies:

* support for Python 3.11 and 3.12, with added CI testing
* support for macOS, with added CI testing
* core dependencies have been minimized to `numpy`, `torch`, `lightning`, `scipy`, `pandas`, and `scikit-learn`.
* soft dependencies are available in soft dependency sets: `all_extras` for all soft dependencies, and `tuning` for `optuna` based optimization.
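Assuming the dependency sets named above are published as standard PyPI extras, the install commands would look like this (a sketch of the usual `pip` extras syntax, not copied from the project docs):

```shell
# core install only: numpy, torch, lightning, scipy, pandas, scikit-learn
pip install pytorch-forecasting

# with optuna-based hyperparameter tuning support
pip install "pytorch-forecasting[tuning]"

# with every soft dependency
pip install "pytorch-forecasting[all_extras]"
```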

Dependency changes

* the following are no longer core dependencies and have been changed to optional dependencies: `optuna`, `statsmodels`, `pytorch-optimizer`, `matplotlib`. Environments relying on functionality requiring these dependencies need to be updated to install them explicitly.
* `optuna` bounds have been updated to `optuna >=3.1.0,<4.0.0`
* `optuna-integration` is now an additional soft dependency, needed when `optuna >=3.3.0` is used

Deprecations and removals

* from 1.2.0, the default optimizer will change from `"ranger"` to `"adam"` to avoid non-`torch` dependencies in the defaults. `pytorch-optimizer` optimizers can still be used; users should set the optimizer explicitly to continue using `"ranger"`.
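The deprecation above can be sketched as a small resolver: an explicit optimizer name always wins, while the implicit default flips at 1.2.0 (the function and resolution logic here are illustrative only, not pytorch-forecasting internals):

```python
def resolve_optimizer(explicit=None, version=(1, 2, 0)):
    """Return the optimizer name that would be used.

    From 1.2.0 the implicit default becomes "adam"; before that it
    was "ranger".  Passing `explicit` pins the choice across versions.
    """
    if explicit is not None:
        return explicit
    return "adam" if version >= (1, 2, 0) else "ranger"

# users who want the old behavior after 1.2.0 set it explicitly
assert resolve_optimizer("ranger") == "ranger"
assert resolve_optimizer(version=(1, 1, 0)) == "ranger"
assert resolve_optimizer(version=(1, 2, 0)) == "adam"
```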
* from 1.1.0, the loggers do not log figures if the soft dependency `matplotlib` is not present, but raise no exception in this case. To log figures, ensure that `matplotlib` is installed.
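The guarded figure-logging behavior can be sketched as follows (a minimal illustration, not the library's actual logger code; the function name `log_figure` is hypothetical):

```python
def log_figure(logger, tag, make_figure):
    """Log a figure if matplotlib is importable; otherwise skip
    silently, mirroring the 1.1.0 soft-dependency behavior."""
    try:
        import matplotlib  # noqa: F401  # soft dependency
    except ImportError:
        return False  # no figure logged, but no exception raised
    fig = make_figure()
    logger.experiment.add_figure(tag, fig)  # e.g. a TensorBoard logger
    return True
```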

1.0.0

Breaking Changes

- Upgraded to pytorch 2.0 and lightning 2.0. This brings several changes, such as the configuration of trainers; see the [lightning upgrade guide](https://lightning.ai/docs/pytorch/latest/upgrade/migration_guide.html). For PyTorch Forecasting, this particularly means that if you are developing your own models: the class method `epoch_end` has been renamed to `on_epoch_end`; `model.summarize()` must be replaced with `ModelSummary(model, max_depth=-1)`; and `Tuner(trainer)` is now its own class, so `trainer.tuner` needs replacing. (#1280)
- Changed the `predict()` interface to return a named tuple - see the tutorials.
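As a rough sketch of the new return shape (the field names here are illustrative, not the library's exact ones; consult the tutorials for the real interface):

```python
from typing import Any, NamedTuple

class Prediction(NamedTuple):
    # illustrative fields only; the actual named tuple is defined
    # by pytorch-forecasting and documented in the tutorials
    output: Any  # the predictions themselves
    index: Any   # identifies which series/time step each row belongs to

pred = Prediction(output=[0.1, 0.2], index=["series_a", "series_b"])
assert pred.output == pred[0]  # fields accessible by name or position
assert pred.index == ["series_a", "series_b"]
```

A named tuple keeps backward-compatible positional access while making each component self-describing.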

Changes

- The predict method now uses the lightning predict functionality and allows writing results to disk (#1280).

Fixed

- Fixed the robust scaler when quantiles are 0.0 and 1.0, i.e. minimum and maximum (#1142)

0.10.3

Fixed

- Removed pandoc from dependencies due to an issue with poetry install (#1126)
- Added metric attributes for torchmetrics, resulting in better multi-GPU performance (#1126)

Added

- The "robust" encoder method can be customized by setting the "center", "lower" and "upper" quantiles (#1126)
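A minimal pure-Python sketch of what a quantile-customizable robust scaler computes (not the library's implementation; the default quantiles 0.5/0.25/0.75 are an assumption for illustration):

```python
def robust_scale(values, center=0.5, lower=0.25, upper=0.75):
    """Scale as (x - q_center) / (q_upper - q_lower), with quantiles
    taken over the sorted data using linear interpolation."""
    def quantile(sorted_vals, q):
        pos = q * (len(sorted_vals) - 1)
        lo = int(pos)
        hi = min(lo + 1, len(sorted_vals) - 1)
        frac = pos - lo
        return sorted_vals[lo] * (1 - frac) + sorted_vals[hi] * frac

    s = sorted(values)
    c = quantile(s, center)
    scale = quantile(s, upper) - quantile(s, lower)
    return [(v - c) / scale for v in values]

# lower=0.0 and upper=1.0 reduce to centering on the median and
# dividing by the full min-max range (the edge case fixed in 1.0.0)
scaled = robust_scale([1.0, 2.0, 3.0, 4.0, 5.0], lower=0.0, upper=1.0)
# → [-0.5, -0.25, 0.0, 0.25, 0.5]
```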

0.10.2

Added

- DeepVar network (#923)
- Enable quantile loss for N-HiTS (#926)
- MQF2 loss (multivariate quantile loss) (#949)
- Non-causal attention for TFT (#949)
- Tweedie loss (#949)
- ImplicitQuantileNetworkDistributionLoss (#995)

Fixed

- Fix learning scale schedule (#912)
- Fix TFT list/tuple issue at interpretation (#924)
- Allowed encoder length down to zero for EncoderNormalizer if transformation is not needed (#949)
- Fix Aggregation and CompositeMetric resets (#949)

Changed

- Dropped Python 3.6 support, added Python 3.10 support (#479)
- Refactored dataloader sampling - moved samplers to the pytorch_forecasting.data.samplers module (#479)
- Changed the transformation format for encoders from tuple to dict (#949)
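The tuple-to-dict change for encoder transformations can be sketched like this (the key names `"forward"` and `"reverse"` are assumptions for illustration, not confirmed library identifiers):

```python
import math

# old style (pre-0.10.2): a plain (forward, reverse) tuple,
# where the meaning of each position is implicit
log_transform_tuple = (math.log1p, math.expm1)

# new style: an explicit dict, self-describing and easier to extend
log_transform_dict = {
    "forward": math.log1p,  # applied to the raw values
    "reverse": math.expm1,  # maps model outputs back to the original scale
}

x = 4.0
y = log_transform_dict["forward"](x)
assert abs(log_transform_dict["reverse"](y) - x) < 1e-9
```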

Contributors

- jdb78

0.10.1

Fixed

- Fixed creation of tensors on the correct devices (#908)
- Fixed MultiLoss gradient calculation (#908)

Contributors

- jdb78

