GPyTorch

Latest version: v1.14



1.5.0

This release adds two new model classes, as well as a number of bug fixes:
- GPLVM models for unsupervised learning
- Polya-Gamma GPs for GP classification
In addition, this release contains numerous improvements to SGPR models (that have also been included in prior bug-fix releases).

New features
- Add example notebook that demos binary classification with Polya-Gamma augmentation (1523)
- New model class: Bayesian GPLVM with Stochastic Variational Inference (1605)
- Periodic kernel handles multi-dimensional inputs (1593)
- Add missing-data Gaussian likelihoods (1668)

Performance
- Speed up SGPR models (1517, 1528, 1670)

Fixes
- Fix erroneous loss for ExactGP multitask models (1647)
- Fix pyro sampling (1594)
- Fix initialize bug for additive kernels (1635)
- Fix matrix multiplication of rectangular ZeroLazyTensor (1295)
- Dirichlet GPs use true train targets, not labels (1641)

1.4.2

Various bug fixes, including:

- Use current PyTorch functionality (1611, 1586)
- Bug fixes to Lanczos factorization (1607)
- Fixes to SGPR model (1607)
- Various fixes to LazyTensor math (1576, 1584)
- SmoothedBoxPrior has a sample method (1546)
- Fixes to additive-structure models (1582)
- Doc fixes (1603)
- Fix to index kernel and LCM kernels (1608, 1592)
- Fixes to KeOps bypass (1609)

1.4.1

Fixes
- Simplify interface for 3+ layer DSPP models (1565)
- Fix marginal log likelihood calculation for exact Bayesian inference w/ Pyro (1571)
- Remove CG warning for small matrices (1562)
- Fix Pyro cluster-multitask example notebook (1550)
- Fix gradients for KeOps tensors (1543)
- Ensure that gradients are passed through lazily-evaluated kernels (1518)
- Fix bugs for models with batched fantasy observations (1529, 1499)
- Correct default `latent_dim` value for LMC variational models (1512)

New features
- Create `gpytorch.utils.grid.ScaleToBounds` utility to replace `gpytorch.utils.grid.scale_to_bounds` method (1566)
- Fix skip connections in Deep GP example (1531)
- Add fantasy point support for structured kernel interpolation models (1545)

Documentation
- Add default values to all gpytorch.settings (1564)
- Improve Hadamard multitask notebook (1537)

Performance
- Speed up SGPR models (1517, 1528)

1.4.0

This release includes many major speed improvements, especially to Kronecker-factorized multi-output models.

Performance improvements
- Major speed improvements for Kronecker product multitask models (1355, 1430, 1440, 1469, 1477)
- Unwhitened VI speed improvements (1487)
- SGPR speed improvements (1493)
- Large scale exact GP speed improvements (1495)
- Random Fourier feature speed improvements (1446, 1493)

New Features
- Dirichlet Classification likelihood (1484) - based on Milios et al. (NeurIPS 2018)
- MultivariateNormal objects have a `base_sample_shape` attribute for low-rank/degenerate distributions (1502)

New documentation
- Tutorial for designing your own kernels (1421)

Debugging utilities
- Better naming conventions for AdditiveKernel and ProductKernel (1488)
- `gpytorch.settings.verbose_linalg` context manager for seeing which linalg routines are run (1489)
- Unit test improvements (1430, 1437)

Bug Fixes
- `inverse_transform` is applied to the initial values of constraints (1482)
- `psd_safe_cholesky` obeys `cholesky_jitter` settings (1476)
- Fix scaling issue with priors on variational models (1485)

Breaking changes
- `MultitaskGaussianLikelihoodKronecker` (deprecated) is fully incorporated in `MultitaskGaussianLikelihood` (1471)

1.3.1

Fixes
- Spectral mixture kernels work with SKI (1392)
- Natural gradient descent is compatible with batch-mode GPs (1416)
- Fix prior mean in whitened SVGP (1427)
- RBFKernelGrad has no more in-place operations (1389)
- Fixes to ConstantDiagLazyTensor (1381, 1385)

Documentation
- Include example notebook for multitask Deep GPs (1410)
- Documentation updates (1408, 1434, 1385, 1393)

Performance
- KroneckerProductLazyTensors use root decompositions of children (1394)
- SGPR now uses Woodbury formula and matrix determinant lemma (1356)

Other
- Delta distributions have an `arg_constraints` attribute (1422)
- Cholesky factorization now takes optional diagonal noise argument (1377)

1.3.0

This release primarily focuses on performance improvements, and adds variational models based on contour integral quadrature.

Major Features

Variational models with contour integral quadrature
- Add an MVM-based approach to whitened variational inference (1372)
- This is based on the work in [Fast Matrix Square Roots with Applications to Gaussian Processes and Bayesian Optimization](https://arxiv.org/abs/2006.11267)

Minor Features

Performance improvements
- Kronecker product models compute a deterministic logdet (faster than the Lanczos-based logdet) (1332)
- Improve efficiency of `KroneckerProductLazyTensor` symeig method (1338)
- Improve SGPR efficiency (1356)

Other improvements
- `SpectralMixtureKernel` accepts arbitrary batch shapes (1350)
- Variational models pass around arbitrary `**kwargs` to the `forward` method (1339)
- `gpytorch.settings` context managers keep track of their default value (1347)
- Kernel objects can be pickled (1336)

Bug Fixes
- Fix `requires_grad` checks in `gpytorch.inv_matmul` (1322)
- Fix reshaping bug for batch independent multi-output GPs (1368)
- `ZeroMean` accepts a `batch_shape` argument (1371)
- Various doc fixes/improvements (1327, 1343, 1315, 1373)

