SMT

1.1.0

* Mixed integer surrogate enhancements (thanks Paul-Saves)
- Add number of components estimation in KPLS surrogate models (325)
- Add ordered variables management in mixed integer surrogates (326, 327). Deprecation warning: the INT type is deprecated and superseded by the ORD type.
- Update the version of the GOWER distance model (330)
- Implement generalization of the homoscedastic hypersphere kernel from Pelamatti et al. (330)
- Refactor MixedInteger (328, 330)
* Add `propagate_uncertainty` option to the MFK method (320, thanks anfelopera), illustrated in the sketch after this list:
- when True, the variances of the lower fidelity levels are taken into account.
* Add LHS expansion method (303, 323 thanks rconde1997)
* MOE: Fix computation of errors when choosing expert surrogates (334)
* **Breaking Changes**:
- In SMT EGO, the `UCB` criterion, misnamed with respect to the literature, is renamed `LCB`! (321)
- In the MixedInteger surrogate, the `use_gower_distance=True` option is replaced by `categorical_kernel=GOWER`
* Documentation:
- Add Colab links in [Tutorial README](https://github.com/SMTorg/smt/blob/master/tutorial/README.md) (#322)
- Add notebook about MFK with noise handling (320)
- Fix typos (320, 321)
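
A minimal sketch of the new `propagate_uncertainty` option on a two-level MFK model; the 1D training functions and sample sizes below are illustrative assumptions:

```python
import numpy as np
from smt.applications.mfk import MFK

# Hypothetical nested training data: low-fidelity (lf) and high-fidelity (hf) samples
xt_lf = np.linspace(0.0, 1.0, 11).reshape(-1, 1)
yt_lf = 0.5 * np.sin(8.0 * xt_lf) + 0.05 * xt_lf
xt_hf = xt_lf[::2]                      # nested subset of the low-fidelity DOE
yt_hf = np.sin(8.0 * xt_hf)

sm = MFK(theta0=[1.0], propagate_uncertainty=True)  # propagate lower-level variances
sm.set_training_values(xt_lf, yt_lf, name=0)        # fidelity level 0 (lowest)
sm.set_training_values(xt_hf, yt_hf)                # highest fidelity level
sm.train()

x = np.linspace(0.0, 1.0, 101).reshape(-1, 1)
mean = sm.predict_values(x)
var = sm.predict_variances(x)  # now accounts for the lower-level variances
```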

1.0.0

It is a good time to release SMT 1.0 (just after 0.9!).

The SMT architecture has proven useful and resilient since version 0.2, presented in [the article](https://hal.archives-ouvertes.fr/hal-02294310/document) (more additions than actual breaking changes since then). Special thanks to bouhlelma and hwangjt, and thanks to [all contributors](https://github.com/SMTorg/smt/blob/master/AUTHORS.md).

This is a smooth transition from SMT 0.9, with small additions and bug fixes:

* Add `random_state` option to `NestedLHS` for result reproducibility (296, thanks anfelopera); see the sketch after this list
* Add `use_gower_distance` option to EGO to use the Gower distance kernel
instead of continuous relaxation in the presence of mixed integer variables (299, thanks Paul-Saves)
* Fix a Kriging-based surrogate bug to allow `n_start=1` (301)
* Work around PLS changes in `scikit-learn 0.24` which impact the KPLS surrogate model family (306)
* Add documentation about [saving and loading surrogate models](https://smt.readthedocs.io/en/latest/_src_docs/surrogate_models.html#how-to-save-and-load-trained-surrogate-models) (308)
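
A minimal sketch of `NestedLHS` with the new `random_state` option, assuming the call argument is the number of highest-fidelity samples (lower levels receive more points and contain the higher-level ones):

```python
import numpy as np
from smt.applications.mfk import NestedLHS

xlimits = np.array([[0.0, 1.0], [0.0, 1.0]])

# Two nested designs; a fixed random_state makes the result reproducible
xdoes = NestedLHS(nlevel=2, xlimits=xlimits, random_state=42)
xt_lf, xt_hf = xdoes(10)  # low-fidelity design contains the 10 high-fidelity points

# Same seed, same designs
xt_lf2, xt_hf2 = NestedLHS(nlevel=2, xlimits=xlimits, random_state=42)(10)
assert np.allclose(xt_hf, xt_hf2)
```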

0.9.0

* Mixture of Experts improvements: (282 thanks jbussemaker, 283)
- add a variance prediction API (i.e. `predict_variances()`), enabled when the `variances_support` option is set (see the sketch after this list)
- add `MOESurrogateModel` class which adapts `MOE` to the `SurrogateModel` interface
- allow selection of experts to be part of the mixture (see `allow`/`deny` options)
- `MOE.AVAILABLE_EXPERTS` lists all possible experts
- the `enabled_experts` property of an MOE instance lists the possible experts with respect to the `derivatives/variances_support`
and `allow/deny` options.
* Sampling Method interface refactoring: (284 thanks LDAP)
- create an intermediate `ScaledSamplingMethod` class to be the base class for sampling methods
which generate samples in the [0, 1] hypercube
- allow future implementation of sampling methods generating samples directly in the input space (i.e. within xlimits)
* Use of Gower distance in Kriging-based mixed integer surrogates: (289, thanks raul-rufato)
- add `use_gower_distance` option to `MixedIntegerSurrogate`
- add `gower` correlation model to kriging based surrogate
- see [MixedInteger notebook](https://github.com/SMTorg/smt/blob/master/tutorial/SMT_MixedInteger_application.ipynb) for usage
* Improve Kriging-based surrogates with a multistart method (293, thanks Paul-Saves)
- run several hyperparameter optimizations and keep the best result
- the number of optimizations is controlled by the new `n_start` option (default 10)
* Update documentation for MOE and SamplingMethod (285)
* Fixes (279, 281)
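
A minimal sketch of the MOE variance API and expert selection described above; the expert names passed to `allow` and the 1D training data are illustrative assumptions:

```python
import numpy as np
from smt.applications import MOE

# Hypothetical 1D data with two regimes, suited to a two-cluster mixture
xt = np.linspace(0.0, 4.0, 60).reshape(-1, 1)
yt = np.where(xt < 2.0, np.sin(4.0 * xt), 0.5 * xt)

moe = MOE(
    n_clusters=2,
    variances_support=True,    # keep only experts able to predict variances
    allow=["KRG", "KPLS"],     # restrict the candidate experts (assumed names)
)
moe.set_training_values(xt, yt)
moe.train()

xe = np.linspace(0.0, 4.0, 200).reshape(-1, 1)
y = moe.predict_values(xe)
var = moe.predict_variances(xe)  # available because variances_support is set

print(MOE.AVAILABLE_EXPERTS)     # all possible experts
print(moe.enabled_experts)       # experts retained after allow/deny and support filtering
```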

0.8.0

* Noise API changes for Kriging-based surrogates (276, 257 thanks anfelopera); see the sketch after this list:
- add a new tutorial notebook on [how to deal with noise in SMT](https://github.com/SMTorg/smt/blob/master/tutorial/SMT_Noise.ipynb)
- rename the `noise` option to `noise0`, which is now a list of values
- add the `use_het_noise` option to manage heteroscedastic noise
- improve noise management for MFK (different noise per level)
- add the `nugget` option to handle numerical instabilities
- see the [Matern kernel documentation](https://smt.readthedocs.io/en/latest/_src_docs/surrogate_models/krg.html#kriging)
* Add `predict_variance_derivatives` API (256, 259 thanks Paul-Saves)
- add spatial derivatives for Kriging-based surrogates
- fix the handling of parameter bounds in Kriging-based surrogates
* Notebook updates (262, 275 thanks NatOnera, 277 thanks Paul-Saves)
- [SMT tutorial](https://github.com/SMTorg/smt/blob/master/tutorial/SMT_Tutorial.ipynb)
- [SMT EGO tutorial](https://github.com/SMTorg/smt/blob/master/tutorial/SMT_EGO_application.ipynb)
- [SMT Mixed Integer tutorial](https://github.com/SMTorg/smt/blob/master/tutorial/SMT_MixedInteger_application.ipynb)
* Kriging-based surrogate refactoring (261, thanks anfelopera)
- inheritance changes: MFKPLS -> MFK; KPLSK and GEKPLS -> KPLS
- improve noise options consistency
- improve options validity checking
* Code quality (264, 267, 268 thanks LDAP):
- use of abc metaclass to enforce developer API
- type hinting
- add a 'build system' specification and a requirements.txt for tests; clean up the setup
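
A minimal sketch of the renamed noise options on a Kriging surrogate (see the noise API item above); the 1D noisy data and option values are illustrative assumptions:

```python
import numpy as np
from smt.surrogate_models import KRG

# Hypothetical noisy 1D training data
rng = np.random.RandomState(0)
xt = np.linspace(0.0, 1.0, 30).reshape(-1, 1)
yt = np.sin(8.0 * xt) + 0.05 * rng.randn(30, 1)

sm = KRG(
    noise0=[1e-2],        # initial noise variance(s), now a list (formerly the `noise` option)
    use_het_noise=False,  # set True for heteroscedastic (per-point) noise
    nugget=1e-8,          # small jitter against numerical instabilities
)
sm.set_training_values(xt, yt)
sm.train()

x = np.linspace(0.0, 1.0, 200).reshape(-1, 1)
y = sm.predict_values(x)
var = sm.predict_variances(x)
```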

0.7.1

* allow noise evaluation for Kriging-based surrogates (251)
* fix optimizer bounds in Kriging-based surrogates (252)
* fix MFK parameterization by level (252)
* add `random_state` option to the LHS sampling method for test repeatability (253); see the sketch below
* add `random_state` option to the EGO application for test repeatability (255)
* cleanup tests (255)
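
A minimal sketch of the LHS `random_state` option; the bounds and sample count are illustrative:

```python
import numpy as np
from smt.sampling_methods import LHS

xlimits = np.array([[0.0, 4.0], [-1.0, 1.0]])

# Same random_state -> same design, which keeps tests repeatable
x1 = LHS(xlimits=xlimits, random_state=42)(20)
x2 = LHS(xlimits=xlimits, random_state=42)(20)
assert np.allclose(x1, x2)
```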

0.7.0

* add Marginal Gaussian Process surrogate models (236, thanks repriem)
* add Matern kernels for Kriging-based surrogates (236, thanks repriem)
* add gradient-based optimization of hyperparameters in Kriging-based surrogates: a new `hyper_opt` option selects the TNC SciPy gradient-based optimizer, while the gradient-free Cobyla optimizer remains the default (236, thanks repriem); see the sketch below
* add `MixedIntegerContext` documentation (234)
* fix bug in `mixed_integer::unfold_with_enum_mask` (233)
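
A minimal sketch combining a Matern kernel with the new `hyper_opt` option on a Kriging surrogate; the `corr` value and 1D training data are illustrative assumptions:

```python
import numpy as np
from smt.surrogate_models import KRG

# Hypothetical 1D training data
xt = np.linspace(0.0, 1.0, 15).reshape(-1, 1)
yt = np.sin(6.0 * xt)

sm = KRG(
    corr="matern52",   # one of the new Matern kernels
    hyper_opt="TNC",   # gradient-based hyperparameter optimizer; gradient-free "Cobyla" remains the default
)
sm.set_training_values(xt, yt)
sm.train()

print(sm.predict_values(np.array([[0.3]])))
```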
