**Notable Changes:**
- new Auto-Encoders (see the sketch after this list):
+ `Ae`
+ `TripletAe` (`Ae` version of `TripletVae`)
+ `AdaAe` (`Ae` version of `AdaVae`)
+ `AdaNegTripletAe` (`Ae` version of `AdaNegTripletVae`)
- custom dataset MNIST example in the docs
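A minimal sketch of constructing one of the new deterministic frameworks. The model classes and constructor keywords below (`AutoEncoder`, `EncoderConv64`, `DecoderConv64`, `make_optimizer_fn`, `make_model_fn`) are assumed to follow the existing VAE framework examples and are not verified against this release:

```python
import torch
from disent.frameworks.ae import Ae                                     # new deterministic framework
from disent.model.ae import AutoEncoder, EncoderConv64, DecoderConv64   # assumed model classes/paths

# sketch only: keyword arguments follow the existing VAE framework examples
module = Ae(
    make_optimizer_fn=lambda params: torch.optim.Adam(params, lr=1e-3),
    make_model_fn=lambda: AutoEncoder(
        encoder=EncoderConv64(x_shape=(3, 64, 64), z_size=9),
        decoder=DecoderConv64(x_shape=(3, 64, 64), z_size=9),
    ),
    cfg=Ae.cfg(),
)
```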
**Breaking Changes:**
- flattened the `disent.frameworks.vae` and `disent.frameworks.ae` modules; the `unsupervised`, `weaklysupervised`, and `supervised` submodules no longer exist (see the import sketch after this list)
- removed the latent parameter classes from VAEs; VAEs now encode distributions directly via `encode_dists()`, which simplified a lot of other code
- datasets now only return `'x'` in the observation dictionary if an `augment` is specified, giving a ~5% performance boost
- some dependencies are now optional; more work is still required to minimise dependencies
- removed `sample_random_traversal_factors` and `sample_random_cycle_factors` from `StateSpace`, replacing them with the generic `sample_random_factor_traversal`
- renamed all auto-encoder classes from `AE` to `Ae`
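For reference, a hedged sketch of how the flattened imports and the new `encode_dists()` call might look. The old nested paths in the comments, and the assumption that `encode_dists()` returns distribution objects, are inferred from the items above rather than verified:

```python
# before (assumed old layout, nested by supervision level):
#   from disent.frameworks.vae.unsupervised import BetaVae
#   from disent.frameworks.vae.supervised import TripletVae
# after (flattened):
from disent.frameworks.vae import BetaVae, TripletVae
from disent.frameworks.ae import Ae, TripletAe, AdaAe, AdaNegTripletAe

# VAEs now encode distributions directly instead of returning latent parameter objects,
# e.g. (assumed return values):
#   posterior, prior = vae.encode_dists(x)
#   z = posterior.rsample()

# StateSpace traversal sampling is now a single generic function (argument name assumed):
#   factors = state_space.sample_random_factor_traversal(f_idx=0)
```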
**Other Changes:**
- hdf5 dataset performance fix: access is now up to 5x faster when the dataset is not loaded into memory
- all auto-encoders have new config options to disable the augment loss or the reconstruction loss, or to detach the decoder so that no loss flows back through the encoder; VAEs can additionally disable the regularisation loss (see the config sketch at the end of this list)
- new `laplace` latent distribution that can be selected in VAE configs
- triplet loss helper functions
- flatness-components metric helper functions exposed for use elsewhere: `compute_linear_score`, `compute_axis_score`
- new `FftKernel` augment module (a `torch.nn.Module`) that applies a channel-wise convolution to the input (see the utilities sketch at the end of this list)
- fixed `to_standardised_tensor` for non-`PIL.Image.Image` inputs
- more math helper functions:
+ `torch_normalize`: normalises values along an axis to the range [0, 1]
+ `torch_mean_generalized` now supports the `keepdim` argument
- `disent.visualise.visualise_module`: removed old redundant code adapted from disentanglement_lib
- `disent.visualise.visualise_util` additions:
+ `make_image_grid` and `make_animated_image_grid` auto-detect border colour from input dtype
+ replaced `cycle_factor` with `get_factor_traversal`, which accepts different modes: `interval` and `cycle`
- cleaned up experiments
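A hedged sketch of what the new loss-toggle and latent-distribution options might look like in a framework config. The commented option names are illustrative assumptions based on the descriptions above, not verified config keys:

```python
from disent.frameworks.vae import BetaVae

cfg = BetaVae.cfg(
    beta=4,
    # the option names below are assumptions -- check the cfg dataclass for the real keys:
    # disable_aug_loss=True,          # skip the augment loss term
    # disable_rec_loss=True,          # skip the reconstruction loss term
    # detach_decoder=True,            # no loss flows back through the encoder
    # disable_reg_loss=True,          # VAE-only: skip the regularisation term
    # latent_distribution='laplace',  # use the new laplace latent distribution
)
```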
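And a short sketch of the new helper utilities mentioned above; the import paths and argument names are assumptions based on the item descriptions, not verified signatures:

```python
import torch

# import paths below are assumptions, not verified against this release
from disent.transform import FftKernel
from disent.util.math import torch_normalize
from disent.visualise.visualise_util import get_factor_traversal

x = torch.rand(16, 3, 64, 64)

# FftKernel is a torch.nn.Module that applies its kernel channel-wise to the input batch
# (constructor argument name assumed)
blur = FftKernel(kernel=torch.ones(1, 1, 7, 7) / 49.)
x_blur = blur(x)

# torch_normalize rescales values along an axis to the range [0, 1] (axis argument name assumed)
x01 = torch_normalize(x, dim=-1)

# get_factor_traversal replaces cycle_factor and supports 'interval' and 'cycle' modes
# (argument names assumed)
idxs = get_factor_traversal(factor_size=8, num_frames=9, mode='cycle')
```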
**++ many more additions and minor fixes ++**