Mlpack

Latest version: v4.5.1

3.1.1

_2019-05-26_

* Fix random forest bug for numerical-only data (1887).

* Significant speedups for random forest (1887).

* Random forest now has `minimum_gain_split` and `subspace_dim` parameters
(1887).
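
A minimal sketch of the new options through the Python binding, assuming the binding exposes them under the same names as the command-line options (values here are illustrative only):

```python
import numpy as np
from mlpack import random_forest

# Toy numerical-only dataset: 100 points, 4 features, 2 classes.
X = np.random.rand(100, 4)
y = (X[:, 0] > 0.5).astype(np.uint64)

# Train a random forest, setting the new options from this release.
result = random_forest(training=X,
                       labels=y,
                       num_trees=10,
                       minimum_gain_split=1e-7,  # minimum gain needed to split
                       subspace_dim=2)           # dimensions sampled per split

# Reuse the trained model for prediction on held-out data.
X_test = np.random.rand(10, 4)
predictions = random_forest(input_model=result['output_model'],
                            test=X_test)['predictions']
```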

* Decision tree parameter `print_training_error` deprecated in favor of
`print_training_accuracy`.

* `output` option changed to `predictions` for adaboost and perceptron
binding. Old options are now deprecated and will be preserved until mlpack
4.0.0 (1882).
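
A minimal sketch of the renamed output with the adaboost Python binding, assuming the Python interface mirrors the command-line option names:

```python
import numpy as np
from mlpack import adaboost

# Toy two-class data.
X = np.random.rand(200, 3)
y = (X[:, 1] > 0.5).astype(np.uint64)
X_test = np.random.rand(20, 3)

result = adaboost(training=X, labels=y, test=X_test)

# Preferred output name introduced in this release; the old 'output'
# key is deprecated but remains available until mlpack 4.0.0.
predictions = result['predictions']
```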

* Concatenated ReLU layer (1843).

* Accelerate NormalizeLabels function using hashing instead of linear search
(see `src/mlpack/core/data/normalize_labels_impl.hpp`) (1780).

* Add `ConfusionMatrix()` function for checking performance of classifiers
(1798).

* Install ensmallen headers when it is downloaded during build (1900).

3.1.0

_2019-04-25_

* Add DiagonalGaussianDistribution and DiagonalGMM classes to speed up the
diagonal covariance computation and deprecate DiagonalConstraint (1666).

* Add kernel density estimation (KDE) implementation with bindings to other
languages (1301).
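
A minimal sketch of the new KDE binding from Python; the option and output names below follow the command-line binding and are assumptions, not verified defaults:

```python
import numpy as np
from mlpack import kde

# Reference points define the density; query points are where it is evaluated.
reference = np.random.randn(500, 2)
query = np.random.randn(50, 2)

result = kde(reference=reference,
             query=query,
             bandwidth=0.5,        # kernel bandwidth
             kernel='gaussian')    # kernel type

densities = result['predictions']  # one density estimate per query point
```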

* Where relevant, all models with a `Train()` method now return a `double`
value representing the goodness of fit (i.e. final objective value, error,
etc.) (1678).

* Add implementation for linear support vector machine (see
`src/mlpack/methods/linear_svm`).

* Change DBSCAN to use PointSelectionPolicy and add OrderedPointSelection (1625).

* Residual block support (1594).

* Bidirectional RNN (1626).

* Dice loss layer (1674, 1714) and hard sigmoid layer (1776).

* `output` option changed to `predictions` and `output_probabilities` to
`probabilities` for Naive Bayes binding (`mlpack_nbc`/`nbc()`). Old options
are now deprecated and will be preserved until mlpack 4.0.0 (1616).
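
A minimal sketch of the renamed Naive Bayes outputs via the Python binding `nbc()`, assuming the binding mirrors the option names above:

```python
import numpy as np
from mlpack import nbc

X = np.random.rand(150, 4)
y = np.random.randint(0, 3, size=150).astype(np.uint64)
X_test = np.random.rand(15, 4)

result = nbc(training=X, labels=y, test=X_test)

# New option names introduced in this release; the old names remain
# available (deprecated) until mlpack 4.0.0.
predictions = result['predictions']       # previously 'output'
probabilities = result['probabilities']   # previously 'output_probabilities'
```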

* Add support for Diagonal GMMs to HMM code (1658, 1666). This can provide
large speedup when a diagonal GMM is acceptable as an emission probability
distribution.

* Python binding improvements: check parameter type (1717), avoid copying
Pandas dataframes (1711), handle Pandas Series objects (1700).
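
For instance, a Pandas DataFrame and Series can now be passed straight to a binding; a minimal sketch using the perceptron binding purely as an illustration:

```python
import numpy as np
import pandas as pd
from mlpack import perceptron

# Feature matrix as a DataFrame and labels as a Series; the bindings now
# accept both without an explicit conversion to NumPy arrays.
df = pd.DataFrame({'a': np.random.rand(100),
                   'b': np.random.rand(100)})
labels = (df['a'] > 0.5).astype(np.uint64)  # a pandas Series of class labels

result = perceptron(training=df, labels=labels)
model = result['output_model']
```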

3.0.4

_2018-11-13_

* Bump minimum CMake version to 3.3.2.

* CMake fixes for Ninja generator by Marc Espie.

3.0.3

_2018-07-27_

* Fix Visual Studio compilation issue (1443).

* Allow running local_coordinate_coding binding with no initial_dictionary
parameter when input_model is not specified (1457).

* Make use of OpenMP optional via the CMake 'USE_OPENMP' configuration
variable (1474).

* Accelerate FNN training by 20-30% by avoiding redundant calculations
(1467).

* Fix math::RandomSeed() usage in tests (1462, 1440).

* Generate better Python setup.py with documentation (1460).

3.0.2

_2018-06-08_

* Documentation generation fixes for Python bindings (1421).

* Fix build error for man pages if command-line bindings are not being built
(1424).

* Add 'shuffle' parameter and Shuffle() method to KFoldCV (1412). This will
shuffle the data when the object is constructed, or when Shuffle() is
called.

* Add neural network layers: AtrousConvolution (1390), Embedding (1401),
and LayerNorm (layer normalization) (1389).

* Add Pendulum environment for reinforcement learning (1388) and update
Mountain Car environment (1394).

3.0.1

_2018-05-10_

* Fix intermittently failing tests (1387).

* Add big-batch SGD (BBSGD) optimizer in
src/mlpack/core/optimizers/bigbatch_sgd/ (1131).

* Fix simple compiler warnings (1380, 1373).

* Simplify NeighborSearch constructor and Train() overloads (1378).

* Add warning for OpenMP setting differences (1358, 1382). When mlpack is
compiled with OpenMP but another application is not (or vice versa), a
compilation warning will now be issued.

* Restructure loss functions in src/mlpack/methods/ann/ (1365).

* Add environments for reinforcement learning tests (1368, 1370, 1329).

* Allow single outputs for multiple timestep inputs for recurrent neural
networks (1348).

* Neural networks: add He and LeCun normal initializations (1342), add FReLU
and SELU activation functions (1346, 1341), add alpha-dropout (1349).
