Mlpack

Latest version: v4.5.1


2.0.0

_2015-12-24_

* Removed overclustering support from k-means because it is not well-tested,
may be buggy, and is (I think) unused. If you were using this support,
open a bug or get in touch with us; it would not be hard for us to
reimplement it.

* Refactored KMeans to allow different types of Lloyd iterations.

* Added implementations of k-means: Elkan's algorithm, Hamerly's algorithm,
Pelleg-Moore's algorithm, and the DTNN (dual-tree nearest neighbor)
algorithm.

* Significant acceleration of LRSDP via the use of `accu(a % b)` instead of
`trace(a * b)`.

* Added MatrixCompletion class (matrix_completion), which performs nuclear
norm minimization to fill unknown values of an input matrix.

* No more dependence on Boost.Random; now we use C++11 STL random support.

* Add softmax regression, contributed by Siddharth Agrawal and QiaoAn Chen.

* Changed NeighborSearch, RangeSearch, FastMKS, LSH, and RASearch API; these
classes now take the query sets in the Search() method, instead of in the
constructor.

* Use OpenMP, if available. For now OpenMP support is only available in the
DET training code.

* Add support for predicting new test point values to LARS and the
command-line `lars` program.

* Add serialization support for `Perceptron` and `LogisticRegression`.

* Refactor SoftmaxRegression to predict into an `arma::Row<size_t>` object,
and add a `softmax_regression` program.

* Refactor LSH to allow loading and saving of models.

* `ToString()` is removed entirely (#487).

* Add `--input_model_file` and `--output_model_file` options to appropriate
machine learning algorithms.

* Rename all executables to start with an "mlpack" prefix (#229).

* Add HoeffdingTree and `mlpack_hoeffding_tree`, an implementation of the
streaming decision tree methodology from Domingos and Hulten in 2000.

1.0.12

_2015-01-07_

* Switch to 3-clause BSD license (from LGPL).

1.0.11

_2014-12-11_

* Proper handling of dimension calculation in PCA.

* Load parameter vectors properly for LinearRegression models.

* Linker fixes for AugLagrangian specializations under Visual Studio.

* Add support for observation weights to LinearRegression.

* `MahalanobisDistance<>` now takes the root of the distance by default and
therefore satisfies the triangle inequality (TakeRoot now defaults to true).

* Better handling of optional Armadillo HDF5 dependency.

* Fixes for numerous intermittent test failures.

* math::RandomSeed() now sets the random seed for recent (>=3.930) Armadillo
versions.

* Handle Newton method convergence better for
SparseCoding::OptimizeDictionary() and make maximum iterations a parameter.

* Known bug: CosineTree construction may fail in some cases on i386 systems
(#358).

1.0.10

_2014-08-29_

* Bugfix for NeighborSearch regression which caused very slow allknn/allkfn.
Speeds are now restored to approximately 1.0.8 speeds, with significant
improvement for the cover tree (#347).

* Detect dependencies correctly when ARMA_USE_WRAPPER is not being defined
(i.e., libarmadillo.so does not exist).

* Bugfix for compilation under Visual Studio (#348).

1.0.9

_2014-07-28_

* GMM initialization is now safer and provides a working GMM when constructed
with only the dimensionality and number of Gaussians (#301).

* Check for division by 0 in Forward-Backward Algorithm in HMMs (#301).

* Fix MaxVarianceNewCluster (used when re-initializing clusters for k-means)
(#301).

* Fixed implementation of Viterbi algorithm in HMM::Predict() (#303).

* Significant speedups for dual-tree algorithms using the cover tree (#235,
#314) including a faster implementation of FastMKS.

* Fix for LRSDP optimizer so that it compiles and can be used (#312).

* CF (collaborative filtering) now expects users and items to be zero-indexed,
not one-indexed (#311).

* CF::GetRecommendations() API change: now requires the number of
recommendations as the first parameter. The number of users in the local
neighborhood should be specified with CF::NumUsersForSimilarity().

* Removed incorrect PeriodicHRectBound (#58).

* Refactor LRSDP into LRSDP class and standalone function to be optimized
(#305).

* Fix for centering in kernel PCA (#337).

* Added simulated annealing (SA) optimizer, contributed by Zhihao Lou.

* HMMs now support initial state probabilities; these can be set in the
constructor, trained, or set manually with HMM::Initial() (#302).

* Added Nyström method for kernel matrix approximation by Marcus Edel.

* Kernel PCA now supports using Nyström method for approximation.

* Ball trees now work with dual-tree algorithms, via the BallBound<> bound
structure (#307); fixed by Yash Vadalia.

* The NMF class is now AMF<>, and supports far more types of factorizations,
by Sumedh Ghaisas.

* A QUIC-SVD implementation has returned, written by Siddharth Agrawal and
based on older code from Mudit Gupta.

* Added perceptron and decision stump by Udit Saxena (these are weak learners
for an eventual AdaBoost class).

* Sparse autoencoder added by Siddharth Agrawal.

1.0.8

_2014-01-06_

* Memory leak in NeighborSearch index-mapping code fixed (#298).

* GMMs can be trained using the existing model as a starting point by
specifying an additional boolean parameter to GMM::Estimate() (#296).

* Logistic regression implementation added in methods/logistic_regression (see
also #293).

* L-BFGS optimizer now returns its function via Function().

* Version information is now obtainable via mlpack::util::GetVersion() or the
__MLPACK_VERSION_MAJOR, __MLPACK_VERSION_MINOR, and __MLPACK_VERSION_PATCH
macros (#297).

* Fix typos in allkfn and allkrann output.
