Beta release
GPyTorch is now available on pip! `pip install gpytorch`.
**Important!** This release requires the preview build of PyTorch (>= 1.0). You should either build from source or install **pytorch-nightly**. See [the PyTorch docs](https://pytorch.org/get-started/locally/) for specific installation instructions.
If you were previously using GPyTorch, see [the migration guide](https://github.com/cornellius-gp/gpytorch/wiki/Migration-guide-from-alpha-to-beta) to help you move over.
What's new
- Batch mode: multiple GPs can now be trained simultaneously (see the sketch after this list)
- Improved multitask models
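The batch-mode sketch below is only an illustration of the batch convention, not a full training script: a leading batch dimension on the mean and covariance yields a batch of independent GP output distributions (see the batch-mode examples in the docs for the complete training API).

```python
import torch
import gpytorch

# A batch of 3 independent Gaussians over 10 points each: the leading
# dimension of the mean (3, 10) and covariance (3, 10, 10) is the batch.
mean = torch.zeros(3, 10)
covar = torch.eye(10).repeat(3, 1, 1)
dist = gpytorch.distributions.MultivariateNormal(mean, covar)

print(dist.sample().shape)  # torch.Size([3, 10]) -- one sample per GP in the batch
```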
Breaking changes
- `gpytorch.random_variables` has been replaced by `gpytorch.distributions`. The new classes build on PyTorch's `torch.distributions`.
- `gpytorch.random_variables.GaussianRandomVariable` -> `gpytorch.distributions.MultivariateNormal`.
- `gpytorch.random_variables.MultitaskGaussianRandomVariable` -> `gpytorch.distributions.MultitaskMultivariateNormal`.
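For example, an exact GP model's `forward` now returns a `MultivariateNormal` where it previously returned a `GaussianRandomVariable`. A minimal sketch (the constant mean and RBF kernel here are arbitrary choices):

```python
import math
import torch
import gpytorch

class ExactGPModel(gpytorch.models.ExactGP):
    def __init__(self, train_x, train_y, likelihood):
        super(ExactGPModel, self).__init__(train_x, train_y, likelihood)
        self.mean_module = gpytorch.means.ConstantMean()
        self.covar_module = gpytorch.kernels.ScaleKernel(gpytorch.kernels.RBFKernel())

    def forward(self, x):
        mean_x = self.mean_module(x)
        covar_x = self.covar_module(x)
        # Previously: gpytorch.random_variables.GaussianRandomVariable(mean_x, covar_x)
        return gpytorch.distributions.MultivariateNormal(mean_x, covar_x)

train_x = torch.linspace(0, 1, 100)
train_y = torch.sin(train_x * (2 * math.pi))
likelihood = gpytorch.likelihoods.GaussianLikelihood()
model = ExactGPModel(train_x, train_y, likelihood)
```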
Utilities
- `gpytorch.utils.scale_to_bounds` is now `gpytorch.utils.grid.scale_to_bounds`
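A short sketch of the new import path (the `(x, lower_bound, upper_bound)` argument order is assumed here; check the docs for your installed version):

```python
import torch
import gpytorch

x = 5 * torch.randn(100, 2)
# Scale the features into [-1, 1], e.g. before handing them to a grid-based kernel.
x_scaled = gpytorch.utils.grid.scale_to_bounds(x, -1.0, 1.0)
```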
Kernels
- `GridInterpolationKernel`, `GridKernel`, `InducingPointKernel` - the attribute `base_kernel_module` has become `base_kernel` (for consistency)
- `AdditiveGridInterpolationKernel` no longer exists. Now use `AdditiveStructureKernel(GridInterpolationKernel(...))`.
- `MultiplicativeGridInterpolationKernel` no longer exists. Now use `ProductStructureKernel(GridInterpolationKernel(...))`.
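A sketch of the replacement pattern for `d`-dimensional inputs (the grid size and base kernel are arbitrary choices here): a one-dimensional `GridInterpolationKernel` is applied per dimension by the structure kernel.

```python
import gpytorch

d = 2  # number of input dimensions
grid_kernel = gpytorch.kernels.GridInterpolationKernel(
    gpytorch.kernels.RBFKernel(), grid_size=100, num_dims=1
)

# Old AdditiveGridInterpolationKernel -> sum the 1D kernel over the d dimensions
additive_kernel = gpytorch.kernels.AdditiveStructureKernel(grid_kernel, num_dims=d)

# Old MultiplicativeGridInterpolationKernel -> multiply the 1D kernel over the d dimensions
product_kernel = gpytorch.kernels.ProductStructureKernel(grid_kernel, num_dims=d)
```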
Attributes (`n_*` -> `num_*`)
- `IndexKernel`: `n_tasks` -> `num_tasks`
- `LCMKernel`: `n_tasks` -> `num_tasks`
- `MultitaskKernel`: `n_tasks` -> `num_tasks`
- `MultitaskGaussianLikelihood`: `n_tasks` -> `num_tasks`
- `SoftmaxLikelihood`: `n_features` -> `num_features`
- `MultitaskMean`: `n_tasks` -> `num_tasks`
- `VariationalMarginalLogLikelihood`: `n_data` -> `num_data`
- `SpectralMixtureKernel`: `n_dimensions` -> `ard_num_dims`, `n_mixtures` -> `num_mixtures`
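After the rename, these counts are passed as `num_*` keyword arguments, for example (the task and mixture counts below are arbitrary):

```python
import gpytorch

index_kernel = gpytorch.kernels.IndexKernel(num_tasks=4, rank=1)
multitask_kernel = gpytorch.kernels.MultitaskKernel(
    gpytorch.kernels.RBFKernel(), num_tasks=4, rank=1
)
multitask_mean = gpytorch.means.MultitaskMean(gpytorch.means.ConstantMean(), num_tasks=4)
multitask_likelihood = gpytorch.likelihoods.MultitaskGaussianLikelihood(num_tasks=4)
sm_kernel = gpytorch.kernels.SpectralMixtureKernel(num_mixtures=4, ard_num_dims=2)
```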
Alpha release
We strongly encourage you to check out our beta release for lots of improvements!
However, if you need an older version or need to use PyTorch 0.4, you can install this release.