mogptk

Latest version: v0.5.2

0.3.1

- Fix conversions to/from GPU
- Fix error on `plot_losses()`
- Rename `gpr.PhiKernel` to `gpr.FunctionKernel`
- Add kernel shortcuts such as `mogptk.Kernels.SpectralMixture` (used in the sketch after this list)
- Include the end point when calling `Data.remove_range()`
- Fix input dimensions for `AddKernel` and `MulKernel`
- Add `sigma` and `figsize` arguments to `Model.plot_prediction()`
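
A minimal sketch of the 0.3.1 additions above: the `mogptk.Kernels` shortcut and the new `sigma` and `figsize` arguments of `Model.plot_prediction()`. The generic `mogptk.Model` wrapper, the `Q` argument of the shortcut, and the training arguments are assumptions; check them against your installed version.

```python
import numpy as np
import mogptk

# Toy single-channel data with a held-out range
x = np.linspace(0.0, 10.0, 200)
y = np.sin(2.0 * x) + 0.1 * np.random.randn(len(x))
data = mogptk.Data(x, y, name='channel-0')
data.remove_range(7.0, 10.0)          # the end point is now included (see above)
dataset = mogptk.DataSet(data)

# Kernel shortcut added in 0.3.1 (Q is an assumed constructor argument)
kernel = mogptk.Kernels.SpectralMixture(Q=3)

# Assumed generic model wrapper taking a data set and a kernel
model = mogptk.Model(dataset, kernel)
model.train(iters=200)

# sigma and figsize arguments added in 0.3.1
model.plot_prediction(sigma=2, figsize=(12, 4))
```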

0.3.0

Features
- Support for **variational** and **sparse** models
- Support for multi-output (heterogeneous) likelihoods, i.e. a different likelihood for each channel
- New models: `Snelson`, `OpperArchambeau`, `Titsias`, `Hensman`
- New kernels: `Constant`, `White`, `Exponential`, `LocallyPeriodic`, `Cosine`, `Sinc`
- New likelihoods: `StudentT`, `Exponential`, `Laplace`, `Bernoulli`, `Beta`, `Gamma`, `Poisson`, `Weibull`, `LogLogistic`, `LogGaussian`, `ChiSquared`
- New mean functions: `Constant` and `Linear`
- Allow kernels to be added and multiplied (e.g. `K1 + K2` or `K1 * K2`); see the sketch after this list
- `Data` and `DataSet` now accept more data types as input, such as pandas `Series`
- `Data`, `DataSet`, and `Model` plotting functions return the figure and axes to allow customization
- Support sampling (prior or posterior) from the model
- Add the MOHSM kernel: the multi-output harmonizable spectral mixture kernel (Altamirano 2021)
- Parameters can be pegged to other parameters, essentially removing them from training
- The exact model supports training with known data point variances and draws their error bars in plots
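
As an illustration of the kernel composition mentioned in this list, a small sketch using two of the new kernels. The exact class names in `mogptk.gpr` (here assumed to carry a `Kernel` suffix) may differ in your version.

```python
from mogptk import gpr

# Two of the kernels introduced in 0.3.0 (class names assumed to end in "Kernel")
K1 = gpr.LocallyPeriodicKernel()
K2 = gpr.WhiteKernel()

# Kernels can now be combined with + and *, yielding sum and product kernels
K_sum = K1 + K2    # additive kernel (cf. AddKernel in the 0.3.1 notes)
K_prod = K1 * K2   # product kernel (cf. MulKernel in the 0.3.1 notes)
```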

Improvements
- Jitter added to the diagonal before the Cholesky decomposition is now relative to the average of the diagonal values; this improves numerical stability for all kernels irrespective of the magnitude of the data
- Kernels now implement `K_diag` that returns the kernel diagonal for better performance
- BNSE initialization method has been reimplemented with improved performance and stability
- Parameter initialization from the different initialization methods has been much improved for all models
- Inducing point initialization now supports `random`, `grid`, or `density`
- Add `SpectralMixture` (in addition to `Spectral`) and `MultiOutputSpectralMixture` (in addition to `MultiOutputSpectral`) kernels with higher performance
- Allow mixing of single-output and multi-output kernels using `active`
- All plotting functions have been restyled
- Model training allows a custom error function to be evaluated at each iteration
- Support single and cross lengthscales for the `SquaredExponential`, `RationalQuadratic`, `Periodic`, and `LocallyPeriodic` kernels
- Add AIC and BIC methods to the model (see the sketch after this list)
- Add `model.plot_correlation()`
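
A short sketch of the model-selection additions in this list. The `SM` model setup and the training arguments are illustrative, and whether `AIC()`/`BIC()` take any arguments is an assumption.

```python
import numpy as np
import mogptk

# Illustrative single-channel data and spectral mixture model
x = np.linspace(0.0, 10.0, 200)
y = np.sin(x) + 0.1 * np.random.randn(len(x))
model = mogptk.SM(mogptk.DataSet(mogptk.Data(x, y)), Q=2)
model.train(iters=300)

# AIC/BIC methods and the correlation plot added in 0.3.0
print('AIC:', model.AIC())
print('BIC:', model.BIC())
model.plot_correlation()
```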

Changes
- Remove `rescale_x`
- Rename `Parameter.trainable` to `Parameter.train`
- Kernels are now initialized deterministically by default rather than randomly; the models (MOSM, MOHSM, CONV, CSM, SM-LMC, and SM) are still initialized randomly by default
- Plotting predictions now happens from the model, not the data: use `model.plot_prediction()` instead of `model.predict(); data.plot()` (see the migration sketch below)
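
A brief migration sketch for these changes, reusing the illustrative `SM` setup from the sketches above; the training arguments are assumptions.

```python
import numpy as np
import mogptk

x = np.linspace(0.0, 10.0, 100)
model = mogptk.SM(mogptk.DataSet(mogptk.Data(x, np.sin(x))), Q=2)
model.train(iters=100)

# Before 0.3.0: model.predict() followed by a plot from the data object
# From 0.3.0 on: plot directly from the model; the figure and axes are returned
fig, axes = model.plot_prediction()

# Note: the training flag on parameters is now `train` instead of `trainable`,
# e.g. some_parameter.train = False   (was: some_parameter.trainable = False)
```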

0.2.5

Bug fixes

0.2.4

- Set the maximum frequency to the Nyquist frequency in MOSM, CSM, SM-LMC, and SM; fixes #21
- Improve CholeskyException messaging
- Update the GONU example
- Fix `Sigmoid.backward`; fixes #25
- Add support for multiple input dimensions in `remove_range`; fixes #24
- Fix SM model initialization for IPS
- `Data` now permits different dtypes per input dimension for X; `LoadFunction` now works for multiple input dimensions; fix upgrading the time delta for `datetime64`
- Change X from `(n,input_dims)` to `[(n,)] * input_dims`
- Add a `dim` argument to functions to specify the input dimension
- Fix example 06
- Fix an old import path; fixes #27
- Reuse `torch.eye` in `log_marginal_likelihood`
- Make `rescale_x` optional for models (see #28); return losses and errors from `train()` (see the sketch after this list)
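
A sketch against the 0.2.4 interface described above: removing a range on a chosen input dimension and collecting the losses and errors returned by `train()`. The `dim` keyword on `remove_range` and the exact training arguments are assumptions.

```python
import numpy as np
import mogptk

x = np.linspace(0.0, 10.0, 200)
y = np.sin(x) + 0.1 * np.random.randn(len(x))
data = mogptk.Data(x, y)
data.remove_range(6.0, 8.0, dim=0)        # dim selects the input dimension (assumed keyword)

model = mogptk.SM(mogptk.DataSet(data), Q=2)
losses, errors = model.train(iters=200)   # losses and errors are now returned
```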

0.2.3

- Add the `MSE` and `sMAPE` error measures
- Fix returning tensors from the GPU back to the CPU
- Fix repeated use of a dataset by properly deepcopying it
- Add console output for training
- Fix the LBFGS optimizer
- Add the `plot_losses` function to the `Model` class to plot losses/errors separately after `train()` (see the sketch after this list)
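
A sketch of calling the new `plot_losses` after training. Whether console output is toggled with a `verbose` keyword and whether the optimizer is selected via `method='LBFGS'` are assumptions.

```python
import numpy as np
import mogptk

x = np.linspace(0.0, 10.0, 200)
model = mogptk.SM(mogptk.DataSet(mogptk.Data(x, np.sin(x))), Q=2)

# Console output during training was added in 0.2.3 (verbose keyword assumed)
model.train(method='LBFGS', iters=100, verbose=True)

# Plot the losses/errors collected during training
model.plot_losses()
```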

0.2.2

- Allow mean functions for MOGPs
- Add the Matérn kernel
- Fix a bug in the `get_gram_matrix` function
