-Plates
-- These can now be used to build time series models: https://github.com/improbable-research/keanu/pull/233
-Modelling
-- Many `BayesianNetworks` can now be composed together through Model Composition: https://github.com/improbable-research/keanu/pull/214
-Tensors
-- `split` divides a tensor into chunks along a given dimension.
-- `concat` is now a static method that takes the tensors to concatenate as arguments, rather than being run as a method on the tensor being concatenated into.
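The semantics of `split` and the static `concat` mirror NumPy's `split` and `concatenate`. As an analogy only (plain Python/NumPy, not Keanu's Java API):

```python
import numpy as np

# A 2x4 "tensor".
t = np.arange(8).reshape(2, 4)

# split: divide into two 2x2 chunks along dimension 1.
left, right = np.split(t, 2, axis=1)

# concat as a static-style call: the chunks are passed in as arguments,
# rather than calling a method on the tensor being concatenated into.
rejoined = np.concatenate([left, right], axis=1)

assert (rejoined == t).all()
```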
-Vertex
-- Added `Pareto` vertex.
-- Added `HalfGaussian` vertex.
-- Added `.reshape` and static `concat`.
-- `toString` now shows the label, class and value.
-Auto Differentiation
-- Now utilises reverse mode.
-- `sum`, `take`, `concat`, `reshape` and `slice` support reverse-mode auto-diff.
--- There is some unexpected behaviour with `sum`, `take` and `reshape` in reverse mode when calculating gradients for high-rank tensors (2x2x2 and beyond); this is currently being investigated, so prefer rank 1 or 2 tensors where possible.
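For intuition, reverse-mode differentiation evaluates the graph forward once, then propagates gradients from the output back through every operation to the inputs. A toy scalar sketch of the idea (illustrative Python, not Keanu code):

```python
class Var:
    """Minimal reverse-mode autodiff node: records the local gradients
    needed to push an upstream gradient back to its parents."""
    def __init__(self, value, parents=()):
        self.value = value
        self.parents = parents  # (parent, local_gradient) pairs
        self.grad = 0.0

    def __add__(self, other):
        return Var(self.value + other.value, [(self, 1.0), (other, 1.0)])

    def __mul__(self, other):
        return Var(self.value * other.value,
                   [(self, other.value), (other, self.value)])

    def backward(self, upstream=1.0):
        # Accumulate the gradient flowing in, then push it to parents.
        self.grad += upstream
        for parent, local in self.parents:
            parent.backward(upstream * local)

x, y = Var(3.0), Var(4.0)
z = x * y + x          # z = x*y + x
z.backward()
assert x.grad == 5.0   # dz/dx = y + 1
assert y.grad == 3.0   # dz/dy = x
```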
-Other
-- Progress bar for sampling and inference: https://github.com/improbable-research/keanu/pull/216
-- Added Kullback–Leibler divergence.
-- Removed `Prior` and `RejectionSampler`.
-- Added `logProb` to `NetworkSamples`.
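As a refresher on the Kullback–Leibler divergence mentioned above: for discrete distributions P and Q, KL(P‖Q) = Σᵢ pᵢ log(pᵢ/qᵢ). A minimal sketch of the definition (plain Python, not Keanu's API):

```python
import math

def kl_divergence(p, q):
    """KL(P || Q) for discrete distributions given as probability lists."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# KL is zero only when the distributions match...
assert kl_divergence([0.5, 0.5], [0.5, 0.5]) == 0.0

# ...and is asymmetric in general: KL(P||Q) != KL(Q||P).
p, q = [0.9, 0.1], [0.5, 0.5]
assert abs(kl_divergence(p, q) - kl_divergence(q, p)) > 1e-6
```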
-Bugs
-- Fixed issue with tensor equality in `greaterThanOrEqual`.
-- Fixed broadcast operations between high-rank and low-rank tensors.
-- Fixed an issue where Metropolis Hastings always returned the same sample when streaming without down sampling.