GPyTorch

Latest version: v1.14


0.1.0.rc4

New features
- Implement diagonal correction for basic variational inference, improving predictive variance estimates. This is on by default.
- `LazyTensor._quad_form_derivative` now has a default implementation! While custom implementations are likely to still be faster in many cases, this means that it is no longer required to implement a custom `_quad_form_derivative` when implementing a new `LazyTensor` subclass.
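As a rough illustration of the idea (not GPyTorch's actual implementation), a quad-form derivative computes d/dθ of `left^T K(θ) right` for a kernel hyperparameter θ. The generic fallback boils down to the outer product of `left` and `right` as the gradient with respect to the matrix entries, which can then be chain-ruled into θ; the names below are hypothetical:

```python
def quad_form_matrix_grad(left, right):
    # d(left^T K right)/dK_ij = left_i * right_j (the outer product)
    return [[li * rj for rj in right] for li in left]

def chain_to_theta(grad_K, dK_dtheta):
    # d(quad form)/dtheta = sum_ij grad_K_ij * (dK/dtheta)_ij
    return sum(g * d for gr, dr in zip(grad_K, dK_dtheta)
               for g, d in zip(gr, dr))

left = [1.0, 2.0, 3.0]
right = [0.5, -1.0, 2.0]
grad_K = quad_form_matrix_grad(left, right)

# For K(theta) = theta * I we have dK/dtheta = I, so the quad-form
# derivative reduces to the dot product left . right = 4.5
identity = [[1.0 if i == j else 0.0 for j in range(3)] for i in range(3)]
print(chain_to_theta(grad_K, identity))  # -> 4.5
```

A structured `LazyTensor` subclass can usually exploit its structure to compute this more cheaply, which is why custom implementations remain faster in many cases.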

Bug fixes
- Fix a number of critical bugs for the new variational inference.
- Do some hyperparameter tuning for the SV-DKL example notebook, and include fancier NN features like batch normalization.
- Make it more likely that internal operations preserve the ability to use preconditioning for linear solves and log determinants, which may improve model performance in some cases.

0.1.0.rc3

Variational inference has been refactored
- Easier to experiment with different variational approximations
- Massive performance improvement for [SV-DKL](https://github.com/cornellius-gp/gpytorch/blob/master/examples/08_Deep_Kernel_Learning/Deep_Kernel_Learning_DenseNet_CIFAR_Tutorial.ipynb)

Experimental Pyro integration for variational inference
- See the [example Pyro notebooks](https://github.com/cornellius-gp/gpytorch/tree/master/examples/09_Pyro_Integration)

Lots of tiny bug fixes
(Too many to name, but everything should be better 😬)

0.1.0.rc2

0.1.0.rc1

Beta release
GPyTorch is now available on pip! `pip install gpytorch`.

**Important!** This release requires the preview build of PyTorch (>= 1.0). You should either build from source or install **pytorch-nightly**. See [the PyTorch docs](https://pytorch.org/get-started/locally/) for specific installation instructions.

If you were previously using GPyTorch, see [the migration guide](https://github.com/cornellius-gp/gpytorch/wiki/Migration-guide-from-alpha-to-beta) to help you move over.

What's new
- Batch mode: it is possible to train multiple GPs simultaneously
- Improved multitask models
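To sketch what batch mode means in terms of shapes (this is a plain-Python illustration, not GPyTorch code): instead of a single `n x n` covariance matrix, kernels and GP operations act on a stack of `b` independent problems at once, producing a `(b, n, n)` covariance:

```python
import math

def rbf(x1, x2, lengthscale=1.0):
    # Standard RBF kernel between two scalar inputs
    return math.exp(-((x1 - x2) ** 2) / (2 * lengthscale ** 2))

def batch_covariance(batched_inputs, lengthscale=1.0):
    """One n x n RBF covariance matrix per batch element."""
    return [
        [[rbf(a, c, lengthscale) for c in xs] for a in xs]
        for xs in batched_inputs
    ]

# Two independent GPs, each over two training inputs -> shape (2, 2, 2)
covs = batch_covariance([[0.0, 1.0], [2.0, 2.5]])
print(len(covs), len(covs[0]), len(covs[0][0]))  # -> 2 2 2
```

Each matrix in the stack is built from its own inputs, so the GPs share no data; batch mode simply evaluates them in one vectorized pass.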

Breaking changes
- `gpytorch.random_variables` have been replaced by `gpytorch.distributions`. These build upon PyTorch distributions.
- `gpytorch.random_variables.GaussianRandomVariable` -> `gpytorch.distributions.MultivariateNormal`.
- `gpytorch.random_variables.MultitaskGaussianRandomVariable` -> `gpytorch.distributions.MultitaskMultivariateNormal`.

Utilities
- `gpytorch.utils.scale_to_bounds` is now `gpytorch.utils.grid.scale_to_bounds`

Kernels
- `GridInterpolationKernel`, `GridKernel`, `InducingPointKernel` - the attribute `base_kernel_module` has become `base_kernel` (for consistency)
- `AdditiveGridInterpolationKernel` no longer exists. Now use `AdditiveStructureKernel(GridInterpolationKernel(...))`.
- `MultiplicativeGridInterpolationKernel` no longer exists. Now use `ProductStructureKernel(GridInterpolationKernel(...))`.

Attributes (`n_*` -> `num_*`)
- `IndexKernel`: `n_tasks` -> `num_tasks`
- `LCMKernel`: `n_tasks` -> `num_tasks`
- `MultitaskKernel`: `n_tasks` -> `num_tasks`
- `MultitaskGaussianLikelihood`: `n_tasks` -> `num_tasks`
- `SoftmaxLikelihood`: `n_features` -> `num_features`
- `MultitaskMean`: `n_tasks` -> `num_tasks`
- `VariationalMarginalLogLikelihood`: `n_data` -> `num_data`
- `SpectralMixtureKernel`: `n_dimensions` -> `ard_num_dims`, `n_mixtures` -> `num_mixtures`
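Since these renames are mechanical, migrating keyword arguments can be done with a small lookup table. The helper below is hypothetical (not part of GPyTorch), just a sketch of how you might port old call sites:

```python
# Mapping of deprecated n_* keyword names to their replacements
RENAMES = {
    "n_tasks": "num_tasks",
    "n_features": "num_features",
    "n_data": "num_data",
    "n_dimensions": "ard_num_dims",
    "n_mixtures": "num_mixtures",
}

def migrate_kwargs(kwargs):
    """Return kwargs with deprecated names mapped to the new ones."""
    return {RENAMES.get(name, name): value for name, value in kwargs.items()}

print(migrate_kwargs({"n_tasks": 4, "rank": 1}))
# -> {'num_tasks': 4, 'rank': 1}
```

Unrecognized names pass through unchanged, so the helper is safe to apply to any constructor's keyword arguments.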

alpha
Alpha release

We strongly encourage you to check out our beta release for lots of improvements!
However, if you still need an old version, or need to use PyTorch 0.4, you can install this release.
