**Important**: This release requires Python 3.7 (up from 3.6) and PyTorch 1.10 (up from 1.9).
## New Features
- The new `gpytorch.metrics` module offers easy-to-use metrics for GP performance. (#1870) This includes:
  - `gpytorch.metrics.mean_absolute_error`
  - `gpytorch.metrics.mean_squared_error`
  - `gpytorch.metrics.mean_standardized_log_loss`
  - `gpytorch.metrics.negative_log_predictive_density`
  - `gpytorch.metrics.quantile_coverage_error`
- Large scale inference (using matrix-multiplication techniques) now implements the variance reduction scheme described in [Wenger et al., ICML 2022](https://arxiv.org/abs/2107.00243). (#1836)
- This makes it possible to use L-BFGS, or other line-search-based optimization techniques, for large-scale (exact) GP hyperparameter optimization.
- Variational GP models support online updates (i.e. "fantasizing" new models). (#1874)
  - This utilizes the method described in [Maddox et al., NeurIPS 2021](https://papers.nips.cc/paper/2021/hash/325eaeac5bef34937cfdc1bd73034d17-Abstract.html).
- Improvements to `gpytorch.priors`:
  - New `HalfCauchyPrior` (#1961)
  - `LKJPrior` now supports sampling (#1737)
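The new metrics operate on a model's posterior predictive distribution and the test targets. As a library-free illustration (not the GPyTorch implementation itself), the negative log predictive density for independent Gaussian predictive marginals averages the per-point Gaussian negative log-likelihoods:

```python
import math

def negative_log_predictive_density(means, variances, targets):
    """Average Gaussian negative log-density of the targets under the
    predictive means/variances. A hand-rolled sketch of the idea behind
    gpytorch.metrics.negative_log_predictive_density, assuming
    independent Gaussian marginals."""
    total = 0.0
    for mu, var, y in zip(means, variances, targets):
        total += 0.5 * math.log(2 * math.pi * var) + (y - mu) ** 2 / (2 * var)
    return total / len(targets)

# With perfect mean predictions and unit variance, only the
# 0.5 * log(2 * pi) entropy-like term remains per point.
nlpd = negative_log_predictive_density([1.0, 2.0], [1.0, 1.0], [1.0, 2.0])
```

Lower values indicate predictions that place more probability mass on the observed targets, which is why NLPD is a common held-out evaluation for GPs.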
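Line-search optimizers such as L-BFGS repeatedly re-evaluate the loss within a single step, which the variance-reduced estimates above make viable for GP hyperparameters. The general `torch.optim.LBFGS` closure pattern looks like the following sketch, shown here on a stand-in quadratic objective rather than an actual GP marginal log-likelihood:

```python
import torch

# Stand-in for a model hyperparameter; in practice this would be
# model.parameters() and the loss would be -mll(output, train_y).
w = torch.tensor([0.0], requires_grad=True)
optimizer = torch.optim.LBFGS([w], lr=0.5, max_iter=50)

def closure():
    # LBFGS calls this closure multiple times per step during
    # its line search, so it must re-zero grads and re-backprop.
    optimizer.zero_grad()
    loss = ((w - 3.0) ** 2).sum()
    loss.backward()
    return loss

optimizer.step(closure)
```

The closure requirement is the key difference from first-order optimizers like Adam, where the loss is evaluated once per step.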
## Minor Features
- Add `LeaveOneOutPseudoLikelihood` for hyperparameter optimization (#1989)
- `PeriodicKernel` now supports ARD lengthscales/periods (#1919)
- LazyTensors (`A`) can now be matrix multiplied with tensors (`B`) from the left-hand side (i.e. `B x A`) (#1932)
- The maximum number of Cholesky retries can be controlled through a setting (#1861)
- Kernels, means, and likelihoods can be pickled (#1876)
- The minimum variance for `FixedNoiseGaussianLikelihood` can be set with a context manager (#2009)
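With ARD, the periodic kernel gives each input dimension its own lengthscale and period. As a library-free sketch (the standard periodic form, written by hand for illustration rather than taken from the GPyTorch source), the covariance is `k(x, x') = exp(-2 * sum_d sin^2(pi * (x_d - x'_d) / p_d) / l_d^2)`:

```python
import math

def periodic_kernel_ard(x1, x2, lengthscales, periods):
    """Periodic covariance with a separate (ARD) lengthscale l_d and
    period p_d per input dimension d."""
    total = 0.0
    for a, b, l, p in zip(x1, x2, lengthscales, periods):
        total += math.sin(math.pi * (a - b) / p) ** 2 / l ** 2
    return math.exp(-2.0 * total)

# Shifting each dimension by its own period leaves the kernel
# (numerically) unchanged, which is the point of per-dimension periods.
k_same = periodic_kernel_ard([0.0, 0.0], [0.0, 0.0], [1.0, 2.0], [1.0, 3.0])
k_shifted = periodic_kernel_ard([0.0, 0.0], [1.0, 3.0], [1.0, 2.0], [1.0, 3.0])
```

A single shared lengthscale/period is the special case where all `l_d` and `p_d` are equal.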
## Bug Fixes
- Fix backpropagation issues with KeOps kernels (#1904)
- Fix broadcasting issues with lazily evaluated kernels (#1971)
- Fix batching issues with `PolynomialKernel` (#1977)
- Fix issues with `PeriodicKernel.diag()` (#1919)
- Add a more informative error message for when the train targets and the train prior distribution do not match (#1905)
- Fix issues with priors on `ConstantMean` (#2042)