Pyro 0.2 supports PyTorch 0.4. See PyTorch [release notes](https://github.com/pytorch/pytorch/releases/tag/v0.4.0) for comprehensive changes. The most important change is that `Variable` and `Tensor` have been merged, so you can now simplify
```diff
- pyro.param("my_param", Variable(torch.ones(1), requires_grad=True))
+ pyro.param("my_param", torch.ones(1))
```
## PyTorch distributions
PyTorch's [torch.distributions](http://pytorch.org/docs/0.4.0/distributions.html) library is now Pyro’s main source for distribution implementations. The Pyro team helped create this library by collaborating with Adam Paszke, Alican Bozkurt, Vishwak Srinivasan, Rachit Singh, Brooks Paige, Jan-Willem Van De Meent, and many other contributors and reviewers. See the [Pyro wrapper docs](http://pyro-ppl.readthedocs.io/en/0.2.0-release/distributions.html#pytorch-distributions) for wrapped PyTorch distributions and the [Pyro distribution docs](http://pyro-ppl.readthedocs.io/en/0.2.0-release/distributions.html#pyro-distributions) for Pyro-specific distributions.
## Constrained parameters
Parameters can now be constrained easily using notation like:
```python
from torch.distributions import constraints

pyro.param("sigma", torch.ones(10), constraint=constraints.positive)
```
See the [torch.distributions.constraints](http://pytorch.org/docs/0.4.0/distributions.html#module-torch.distributions.constraints) library and all of our Pyro [tutorials](http://pyro.ai/examples) for example usage.
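Under the hood, a constrained parameter is stored as an unconstrained value and mapped through a bijective transform. Here is a plain-Python sketch of that idea for the `positive` constraint (illustrative only, not Pyro's actual implementation, which uses `torch.distributions` transforms):

```python
import math

# Sketch: store a positive parameter as an unconstrained value and map it
# through exp, so any gradient step in unconstrained space still yields a
# valid positive value.
def to_positive(unconstrained):
    return math.exp(unconstrained)

def to_unconstrained(positive):
    return math.log(positive)

u = to_unconstrained(1.0)   # store sigma = 1.0 as u = 0.0
u -= 5.0                    # even a large update in unconstrained space...
sigma = to_positive(u)      # ...maps back to a strictly positive sigma
assert sigma > 0.0
```

This is why you can hand a constrained parameter straight to a gradient-based optimizer without worrying about invalid values.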
## Arbitrary tensor shapes
Arbitrary tensor shapes and batching are now supported in Pyro. This includes support for nested batching via `iarange` and support for batched multivariate distributions. The `iarange` context and `irange` generator are now much more flexible and can be combined freely. With power comes complexity, so check out our [tensor shapes tutorial](http://pyro.ai/examples/tensor_shapes.html) (hint: you’ll need to use [`.expand_by()`](http://pyro-ppl.readthedocs.io/en/0.2.0-release/distributions.html#pyro.distributions.TorchDistributionMixin.expand_by) and [`.independent()`](http://pyro-ppl.readthedocs.io/en/0.2.0-release/distributions.html#pyro.distributions.TorchDistributionMixin.independent)).
## Parallel enumeration
Discrete enumeration can now be parallelized. This makes it especially easy and cheap to enumerate out discrete latent variables. Check out the [Gaussian Mixture Model tutorial](http://pyro.ai/examples/gmm.html) for example usage. To use parallel enumeration, you'll need to first configure sites, then use the `TraceEnum_ELBO` loss:
```python
def model(...):
    ...

@config_enumerate(default="parallel")  # configures sites
def guide(...):
    with pyro.iarange("foo", 10):
        x = pyro.sample("x", dist.Bernoulli(0.5).expand_by([10]))
        ...

svi = SVI(model, guide, Adam({}),
          loss=TraceEnum_ELBO(max_iarange_nesting=1))  # specify loss
svi.step()
```
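To see what enumeration buys you, here is a plain-Python sketch of what enumerating a discrete site computes: rather than drawing a single sample of `x ~ Bernoulli(p)`, it sums the downstream quantity over every value of `x`, weighted by its probability, giving an exact, zero-variance term in the ELBO (the `downstream` function here is hypothetical):

```python
p = 0.5

def downstream(x):              # hypothetical function of the discrete latent
    return 3.0 if x == 1 else 1.0

# Enumerate x in {0, 1} instead of sampling it.
expectation = sum(prob * downstream(x) for x, prob in [(0, 1.0 - p), (1, p)])
assert expectation == 2.0       # exact; a sampled estimate would be noisy
```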
## Markov chain Monte Carlo via HMC and NUTS
This release adds experimental support for gradient-based Markov Chain Monte Carlo inference via Hamiltonian Monte Carlo [`pyro.infer.HMC`](http://pyro-ppl.readthedocs.io/en/0.2.0-release/mcmc.html#module-pyro.infer.mcmc.hmc) and the No U-Turn Sampler [`pyro.infer.NUTS`](http://pyro-ppl.readthedocs.io/en/0.2.0-release/mcmc.html#module-pyro.infer.mcmc.nuts). See the [docs](http://pyro-ppl.readthedocs.io/en/0.2.0-release/mcmc.html) and [example](https://github.com/uber/pyro/blob/dev/examples/baseball.py) for details.
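At the heart of both samplers is the leapfrog integrator, which simulates Hamiltonian dynamics on the log-density. A minimal 1-D sketch on a standard-normal target (illustrative only; `pyro.infer.HMC` handles this internally, with autograd gradients, momentum resampling, and a Metropolis correction):

```python
def grad_log_prob(q):
    return -q                    # d/dq log N(q; 0, 1)

def leapfrog(q, p, step_size, num_steps):
    for _ in range(num_steps):
        p += 0.5 * step_size * grad_log_prob(q)   # half kick
        q += step_size * p                        # drift
        p += 0.5 * step_size * grad_log_prob(q)   # half kick
    return q, p

def hamiltonian(q, p):
    return 0.5 * p * p + 0.5 * q * q              # kinetic + potential energy

q0, p0 = 1.0, 0.5
q1, p1 = leapfrog(q0, p0, step_size=0.1, num_steps=20)
# Leapfrog approximately conserves the Hamiltonian, which is what makes
# distant proposals with high acceptance rates possible.
assert abs(hamiltonian(q1, p1) - hamiltonian(q0, p0)) < 1e-2
```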
## Gaussian Processes
A new Gaussian Process module [pyro.contrib.gp](http://pyro-ppl.readthedocs.io/en/0.2.0-release/contrib.gp.html) provides a framework for learning with Gaussian Processes. To get started, take a look at our [Gaussian Process Tutorial](http://pyro.ai/examples/gp.html). Thanks to Du Phan for this extensive contribution!
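The covariance function is the central modeling choice in a GP. As a taste of the underlying math, here is a hand-rolled RBF (squared exponential) kernel, the default in most GP libraries; `pyro.contrib.gp` ships its own kernel classes, so this is only a sketch:

```python
import math

def rbf(x, z, variance=1.0, lengthscale=1.0):
    # Covariance decays with squared distance between inputs.
    return variance * math.exp(-0.5 * (x - z) ** 2 / lengthscale ** 2)

X = [0.0, 1.0, 2.0]
K = [[rbf(xi, xj) for xj in X] for xi in X]  # Gram matrix of the GP prior
assert K[0][0] == 1.0            # unit variance on the diagonal
assert K[0][1] == K[1][0]        # symmetric
assert K[0][1] > K[0][2]         # nearby inputs are more correlated
```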
## Automatic guide generation
Guides can now be created automatically with the [pyro.contrib.autoguide](http://pyro-ppl.readthedocs.io/en/0.2.0-release/contrib.autoguide.html) library. These work only for models with simple structure (no `irange` or `iarange`), and are easy to use:
```python
from pyro.contrib.autoguide import AutoDiagonalNormal

def model(...):
    ...

guide = AutoDiagonalNormal(model)
svi = SVI(model, guide, ...)
```
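The family that `AutoDiagonalNormal` fits is a product of independent normals over the model's latent variables, with learned locations and scales. A plain-Python sketch of that density (illustrative; the real guide works on tensors and registers its parameters with Pyro):

```python
import math

def diag_normal_log_prob(xs, locs, scales):
    # Log-density factorizes across dimensions: one 1-D normal per latent.
    return sum(
        -0.5 * math.log(2.0 * math.pi) - math.log(s) - 0.5 * ((x - m) / s) ** 2
        for x, m, s in zip(xs, locs, scales)
    )

lp = diag_normal_log_prob([0.0, 0.0], [0.0, 0.0], [1.0, 1.0])
assert abs(lp - 2 * (-0.5 * math.log(2.0 * math.pi))) < 1e-12
```

The diagonal structure is what makes these guides cheap, and also why they cannot capture posterior correlations between latents.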
## Validation
Model validation is now available via three toggles:
```python
pyro.enable_validation()
pyro.infer.enable_validation()
# Turns on validation for PyTorch distributions.
pyro.distributions.enable_validation()
```
These can also be used temporarily as context managers:
```python
# Run with validation in first step.
with pyro.validation_enabled(True):
    svi.step()

# Avoid validation on subsequent steps (may miss NaN errors).
with pyro.validation_enabled(False):
    for i in range(1000):
        svi.step()
```
## Rejection sampling variational inference (RSVI)
We've added support for vectorized rejection sampling in a new `Rejector` distribution. See [docs](http://pyro-ppl.readthedocs.io/en/0.2.0-release/distributions.html#pyro.distributions.Rejector) or [`RejectionStandardGamma` class](https://github.com/uber/pyro/blob/0.4.0/pyro/distributions/testing/rejection_gamma.py#L12) for example usage.
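The accept/reject logic that `Rejector` vectorizes can be sketched for scalars in plain Python: draw `x` from a proposal `q` and accept with probability `p(x) / (M * q(x))`, where `M` bounds `p(x) <= M * q(x)` everywhere (a textbook rejection sampler, not Pyro's implementation):

```python
import random

def rejection_sample(target_pdf, propose, proposal_pdf, M):
    while True:
        x = propose()
        if random.random() < target_pdf(x) / (M * proposal_pdf(x)):
            return x

# Example: sample p(x) = 2x on [0, 1] from a uniform proposal with M = 2.
random.seed(0)
samples = [
    rejection_sample(lambda x: 2.0 * x, random.random, lambda x: 1.0, 2.0)
    for _ in range(2000)
]
mean = sum(samples) / len(samples)
assert 0.6 < mean < 0.73         # true mean is 2/3
```

RSVI makes the accept/reject step reparameterizable, so gradients can flow through samplers like this.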