BoTorch Changelog

0.6.4

New Features
* Implement `ExpectationPosteriorTransform` (903).
* Add `PairwiseMCPosteriorVariance`, a cheap active learning acquisition function (1125).
* Support computing quantiles in the fully Bayesian posterior, add `FullyBayesianPosteriorList` (1161).
* Add expectation risk measures (1173).
* Implement Multi-Fidelity GIBBON (Lower Bound MES) acquisition function (1185).
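
The GIBBON entry above rounds out BoTorch's max-value entropy search family. Below is a minimal sketch of constructing a GIBBON-style acquisition over a random candidate set; the single-fidelity `qLowerBoundMaxValueEntropy` used here predates this release (the new multi-fidelity class follows the same pattern with extra fidelity/cost arguments), the toy data is a placeholder, and `fit_gpytorch_model` is the era-appropriate fitting helper (newer releases use `fit_gpytorch_mll`).

```python
import torch
from botorch.models import SingleTaskGP
from botorch.fit import fit_gpytorch_model
from botorch.acquisition.max_value_entropy_search import qLowerBoundMaxValueEntropy
from gpytorch.mlls import ExactMarginalLogLikelihood

# Toy data on the unit square (placeholder, not from the release notes).
train_X = torch.rand(20, 2, dtype=torch.double)
train_Y = train_X.sum(dim=-1, keepdim=True).sin()
model = SingleTaskGP(train_X, train_Y)
fit_gpytorch_model(ExactMarginalLogLikelihood(model.likelihood, model))

# GIBBON approximates the max-value distribution over a discrete candidate set.
candidate_set = torch.rand(1000, 2, dtype=torch.double)
gibbon = qLowerBoundMaxValueEntropy(model=model, candidate_set=candidate_set)

X_test = torch.rand(5, 1, 2, dtype=torch.double)  # 5 test points, q=1, d=2
print(gibbon(X_test).shape)  # torch.Size([5])
```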

Other Changes
* Add an error message for one-shot acquisition functions in `optimize_acqf_discrete` (939).
* Validate the shape of the `bounds` argument in `optimize_acqf` (1142).
* Minor tweaks to `SAASBO` (1143, 1183).
* Minor updates to tutorials (24f7fda7b40d4aabf502c1a67816ac1951af8c23, 1144, 1148, 1159, 1172, 1180).
* Make it easier to specify a custom `PyroModel` (1149).
* Allow passing in a `mean_module` to `SingleTaskGP`/`FixedNoiseGP` (1160); see the sketch after this list.
* Add a note about acquisitions using gradients to base class (1168).
* Remove deprecated `box_decomposition` module (1175).
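
As referenced in the `mean_module` item above, a minimal sketch of constructing `SingleTaskGP` with a custom prior mean; the `LinearMean` choice and the toy data are illustrative only.

```python
import torch
from botorch.models import SingleTaskGP
from botorch.fit import fit_gpytorch_model
from gpytorch.means import LinearMean
from gpytorch.mlls import ExactMarginalLogLikelihood

train_X = torch.rand(30, 3, dtype=torch.double)
train_Y = train_X.sum(dim=-1, keepdim=True) + 0.05 * torch.randn(30, 1, dtype=torch.double)

# Pass a custom GPyTorch mean module instead of the default constant mean;
# .to(train_X) casts the mean module's parameters to the data dtype.
model = SingleTaskGP(
    train_X,
    train_Y,
    mean_module=LinearMean(input_size=train_X.shape[-1]),
).to(train_X)
fit_gpytorch_model(ExactMarginalLogLikelihood(model.likelihood, model))

posterior = model.posterior(torch.rand(4, 3, dtype=torch.double))
print(posterior.mean.shape)  # torch.Size([4, 1])
```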

Bug Fixes
* Bug-fixes for `ProximalAcquisitionFunction` (1122).
* Fix missing warnings on failed optimization in `fit_gpytorch_scipy` (1170).
* Ignore data related buffers in `PairwiseGP.load_state_dict` (1171).
* Make `fit_gpytorch_model` properly honor the `debug` flag (1178).
* Fix missing `posterior_transform` in `gen_one_shot_kg_initial_conditions` (1187).

0.6.3

New Features
* Implement SAASBO - `SaasFullyBayesianSingleTaskGP` model for sample-efficient high-dimensional Bayesian optimization (1123); see the sketch after this list.
* Add SAASBO tutorial (1127).
* Add `LearnedObjective` (1131), `AnalyticExpectedUtilityOfBestOption` acquisition function (1135), and a few auxiliary classes to support Bayesian optimization with preference exploration (BOPE).
* Add BOPE tutorial (1138).
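
As referenced in the SAASBO item above, a minimal sketch of fitting the fully Bayesian SAAS model with NUTS (this requires the pyro dependency); the toy data and the deliberately small MCMC settings are placeholders, not recommended values.

```python
import torch
from botorch.models.fully_bayesian import SaasFullyBayesianSingleTaskGP
from botorch.fit import fit_fully_bayesian_model_nuts

# Toy high-dimensional data where only the first two inputs matter: the
# regime SAASBO's sparsity-inducing priors are designed for.
train_X = torch.rand(20, 30, dtype=torch.double)
train_Y = (train_X[:, :1] - 0.5).pow(2) + train_X[:, 1:2]

model = SaasFullyBayesianSingleTaskGP(train_X, train_Y)
# Draw NUTS samples of the hyperparameters (small settings to keep it quick).
fit_fully_bayesian_model_nuts(
    model, warmup_steps=128, num_samples=64, thinning=16, disable_progbar=True
)

# The posterior is batched over the retained MCMC hyperparameter samples.
posterior = model.posterior(torch.rand(5, 30, dtype=torch.double))
print(posterior.mean.shape)
```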

Other Changes
* Use `qKG.evaluate` in `optimize_acqf_mixed` (1133); a usage sketch of this optimizer follows the list.
* Add `construct_inputs` to SAASBO (1136).
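
The `qKG.evaluate` change above is internal to `optimize_acqf_mixed`; for context, here is a minimal sketch of how that optimizer is typically called, with a placeholder model, acquisition, and discrete feature values.

```python
import torch
from botorch.models import SingleTaskGP
from botorch.fit import fit_gpytorch_model
from botorch.acquisition import ExpectedImprovement
from botorch.optim import optimize_acqf_mixed
from gpytorch.mlls import ExactMarginalLogLikelihood

train_X = torch.rand(15, 3, dtype=torch.double)
train_Y = train_X.norm(dim=-1, keepdim=True)
model = SingleTaskGP(train_X, train_Y)
fit_gpytorch_model(ExactMarginalLogLikelihood(model.likelihood, model))

acqf = ExpectedImprovement(model, best_f=train_Y.max())
bounds = torch.stack([torch.zeros(3), torch.ones(3)]).to(train_X)

# Feature 2 is treated as discrete: the acquisition is optimized separately
# for each fixed value and the best resulting candidate is returned.
candidate, value = optimize_acqf_mixed(
    acq_function=acqf,
    bounds=bounds,
    q=1,
    num_restarts=5,
    raw_samples=64,
    fixed_features_list=[{2: 0.0}, {2: 0.5}, {2: 1.0}],
)
print(candidate, value)
```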

Bug Fixes
* Fix "Constraint Active Search" tutorial (1124).
* Update "Discrete Multi-Fidelity BO" tutorial (1134).

0.6.2

New Features
* Use `BOTORCH_MODULAR` in tutorials with Ax (1105).
* Add `optimize_acqf_discrete_local_search` for discrete search spaces (1111).
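
A minimal sketch of the new `optimize_acqf_discrete_local_search` entry point, assuming the usual `optimize_acqf_*` keyword conventions (`acq_function`, `discrete_choices`, `q`); the grid of per-dimension choices and the toy model are placeholders.

```python
import torch
from botorch.models import SingleTaskGP
from botorch.fit import fit_gpytorch_model
from botorch.acquisition import ExpectedImprovement
from botorch.optim.optimize import optimize_acqf_discrete_local_search
from gpytorch.mlls import ExactMarginalLogLikelihood

# Toy model trained on points that lie on a 0.1-spaced grid.
train_X = (torch.rand(10, 3, dtype=torch.double) * 10).round() / 10
train_Y = -(train_X - 0.5).pow(2).sum(dim=-1, keepdim=True)
model = SingleTaskGP(train_X, train_Y)
fit_gpytorch_model(ExactMarginalLogLikelihood(model.likelihood, model))

acqf = ExpectedImprovement(model, best_f=train_Y.max())

# One tensor of allowed values per input dimension.
discrete_choices = [torch.linspace(0, 1, 11, dtype=torch.double) for _ in range(3)]
candidate, value = optimize_acqf_discrete_local_search(
    acq_function=acqf,
    discrete_choices=discrete_choices,
    q=1,
)
print(candidate, value)
```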

Bug Fixes
* Fix missing `posterior_transform` in qNEI and `get_acquisition_function` (1113).

0.6.1

New Features
* Add `Standardize` input transform (1053).
* Low-rank Cholesky updates for NEI (1056).
* Add support for non-linear input constraints (1067).
* New MOO problems: MW7 (1077), disc brake (1078), penicillin (1079), RobustToy (1082), GMM (1083).
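
The new test problems plug into BoTorch's standard multi-objective test-function interface; below is a minimal sketch using the penicillin problem (assuming the class is exposed as `Penicillin` in `botorch.test_functions.multi_objective`).

```python
import torch
from botorch.test_functions.multi_objective import Penicillin

# negate=True flips the raw objectives to match BoTorch's maximization
# convention; .to(torch.double) casts the registered bounds.
problem = Penicillin(negate=True).to(torch.double)

# Sample random points inside the problem bounds and evaluate all objectives.
lb, ub = problem.bounds
X = lb + (ub - lb) * torch.rand(8, problem.dim, dtype=torch.double)
Y = problem(X)
print(problem.dim, problem.num_objectives, Y.shape)
```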

Other Changes
* Support multi-output models in MES using `PosteriorTransform` (904).
* Add `Dispatcher` (1009).
* Modify qNEHVI to support deterministic models (1026).
* Store tensor attributes of input transforms as buffers (1035).
* Modify NEHVI to support MTGPs (1037).
* Make the `Normalize` input transform column-specific (1047); see the sketch after this list.
* Improve `find_interior_point` (1049).
* Remove deprecated `botorch.distributions` module (1061).
* Avoid costly application of posterior transform in Kronecker & HOGP models (1076).
* Support heteroscedastic perturbations in `InputPerturbation` (1088).
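
As referenced in the `Normalize` item above, a minimal sketch of a model with input and outcome transforms; restricting normalization to a subset of columns via an `indices` argument is my reading of that change, so treat that keyword as an assumption.

```python
import torch
from botorch.models import SingleTaskGP
from botorch.fit import fit_gpytorch_model
from botorch.models.transforms.input import Normalize
from botorch.models.transforms.outcome import Standardize
from gpytorch.mlls import ExactMarginalLogLikelihood

# Inputs on very different scales; column 1 is already in [0, 1].
train_X = torch.rand(25, 3, dtype=torch.double) * torch.tensor(
    [10.0, 1.0, 5.0], dtype=torch.double
)
train_Y = train_X.sum(dim=-1, keepdim=True)

# Normalize only columns 0 and 2 (assumed `indices` argument); standardize
# the single outcome.
model = SingleTaskGP(
    train_X,
    train_Y,
    input_transform=Normalize(d=3, indices=[0, 2]),
    outcome_transform=Standardize(m=1),
)
fit_gpytorch_model(ExactMarginalLogLikelihood(model.likelihood, model))
print(model.posterior(train_X[:4]).mean.shape)  # torch.Size([4, 1])
```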

Performance Improvements
* Make risk measures more memory efficient (1034).

Bug Fixes
* Properly handle empty `fixed_features` in optimization (1029).
* Fix missing weights in `VaR` risk measure (1038).
* Fix `find_interior_point` for negative variables & allow unbounded problems (1045).
* Filter out indefinite bounds in constraint utilities (1048).
* Make non-interleaved base samples use intuitive shape (1057).
* Pad small diagonalization with zeros for `KroneckerMultitaskGP` (1071).
* Disable learning of bounds in `preprocess_transform` (1089).
* Fix `gen_candidates_torch` (4079164489613d436d19c7b2df97677d97dfa8dc).
* Catch runtime errors with ill-conditioned covar (1095).
* Fix `compare_mc_analytic_acquisition` tutorial (1099).

0.6.0

Compatibility
* Require PyTorch >=1.9 (1011).
* Require GPyTorch >=1.6 (1011).

New Features
* New `ApproximateGPyTorchModel` wrapper for various (variational) approximate GP models (1012).
* New `SingleTaskVariationalGP` stochastic variational Gaussian Process model (1012); see the sketch after this list.
* Support for Multi-Output Risk Measures (906, 965).
* Introduce `ModelList` and `PosteriorList` (829).
* New Constraint Active Search tutorial (1010).
* Add additional multi-objective optimization test problems (958).
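
As referenced in the `SingleTaskVariationalGP` item above, a minimal sketch of constructing the variational model; the inducing-point choice is arbitrary, and fitting (typically a GPyTorch variational ELBO plus a torch optimizer) is omitted here.

```python
import torch
from botorch.models import SingleTaskVariationalGP

train_X = torch.rand(200, 4, dtype=torch.double)
train_Y = train_X.sin().sum(dim=-1, keepdim=True)

# Variational GP with 25 inducing points initialized at training inputs
# (arbitrary choice); the locations are learned jointly with the model.
model = SingleTaskVariationalGP(
    train_X,
    train_Y,
    inducing_points=train_X[:25].clone(),
    learn_inducing_points=True,
).to(train_X)

# The usual BoTorch posterior interface is available; fitting would typically
# minimize a GPyTorch variational ELBO over minibatches of the data.
posterior = model.posterior(torch.rand(6, 4, dtype=torch.double))
print(posterior.mean.shape)
```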

Other Changes
* Add `covar_module` as an optional input of `MultiTaskGP` models (941); see the sketch after this list.
* Add `min_range` argument to `Normalize` transform to prevent division by zero (931).
* Add initialization heuristic for acquisition function optimization that samples around best points (987).
* Update initialization heuristic to perturb a subset of the dimensions of the best points if the dimension is > 20 (988).
* Modify `apply_constraints` utility to work with multi-output objectives (994).
* Short-cut `t_batch_mode_transform` decorator on non-tensor inputs (991).
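
As referenced in the `MultiTaskGP` item above, a minimal sketch of passing a custom data kernel via `covar_module`; the Matern kernel, task layout, and toy data are illustrative only.

```python
import torch
from botorch.models import MultiTaskGP
from botorch.fit import fit_gpytorch_model
from gpytorch.kernels import MaternKernel, ScaleKernel
from gpytorch.mlls import ExactMarginalLogLikelihood

# Two tasks; the last column of train_X holds the task index (0 or 1).
X_data = torch.rand(40, 2, dtype=torch.double)
task_idx = torch.randint(0, 2, (40, 1), dtype=torch.double)
train_X = torch.cat([X_data, task_idx], dim=-1)
train_Y = (X_data.sum(dim=-1, keepdim=True) + task_idx).sin()

# Custom data kernel over the two non-task input dimensions.
covar_module = ScaleKernel(MaternKernel(nu=2.5, ard_num_dims=2))
model = MultiTaskGP(train_X, train_Y, task_feature=-1, covar_module=covar_module)
fit_gpytorch_model(ExactMarginalLogLikelihood(model.likelihood, model))
print(covar_module.base_kernel.lengthscale)
```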

Performance Improvements
* Use lazy covariance matrix in `BatchedMultiOutputGPyTorchModel.posterior` (976).
* Fast low-rank Cholesky updates for `qNoisyExpectedHypervolumeImprovement` (747, 995, 996).

Bug Fixes
* Update error handling to new PyTorch linear algebra messages (940).
* Avoid test failures on Ampere devices (944).
* Fixes to the `Griewank` test function (972).
* Handle empty base_sample_shape in `Posterior.rsample` (986).
* Handle `NotPSDError` and hitting `maxiter` in `fit_gpytorch_model` (1007).
* Use `TransformedPosterior` for subclasses of `GPyTorchPosterior` (983).
* Propagate `best_f` argument to `qProbabilityOfImprovement` in input constructors (f5a5f8b6dc20413e67c6234e31783ac340797a8d).

0.5.1

Compatibility
* Require GPyTorch >=1.5.1 (928).

New Features
* Add `HigherOrderGP` composite Bayesian Optimization tutorial notebook (864).
* Add Multi-Task Bayesian Optimization tutorial (867).
* New multi-objective test problems (876).
* Add `PenalizedMCObjective` and `L1PenaltyObjective` (913).
* Add a `ProximalAcquisitionFunction` for regularizing new candidates towards previously generated ones (919, 924); see the sketch after this list.
* Add a `Power` outcome transform (925).
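
As referenced in the `ProximalAcquisitionFunction` item above, a minimal sketch of wrapping an acquisition with the proximal bias; I am assuming `proximal_weights` is a d-dimensional tensor controlling how quickly the bias decays in each input dimension, and the toy model is a placeholder.

```python
import torch
from botorch.models import SingleTaskGP
from botorch.fit import fit_gpytorch_model
from botorch.acquisition import ExpectedImprovement
from botorch.acquisition.proximal import ProximalAcquisitionFunction
from gpytorch.mlls import ExactMarginalLogLikelihood

train_X = torch.rand(12, 2, dtype=torch.double)
train_Y = -(train_X - 0.3).pow(2).sum(dim=-1, keepdim=True)
model = SingleTaskGP(train_X, train_Y)
fit_gpytorch_model(ExactMarginalLogLikelihood(model.likelihood, model))

ei = ExpectedImprovement(model, best_f=train_Y.max())
# Bias the acquisition towards the most recently observed training point;
# the per-dimension weights (assumed semantics) set the decay of the bias.
prox_ei = ProximalAcquisitionFunction(
    ei, proximal_weights=torch.ones(2, dtype=torch.double)
)

X_test = torch.rand(5, 1, 2, dtype=torch.double)
print(prox_ei(X_test).shape)  # torch.Size([5])
```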

Bug Fixes
* Batch mode fix for `HigherOrderGP` initialization (856).
* Improve `CategoricalKernel` precision (857).
* Fix an issue with `qMultiFidelityKnowledgeGradient.evaluate` (858).
* Fix an issue with transforms in `HigherOrderGP` (889).
* Fix initial candidate generation when parameter constraints are on different device (897).
* Fix bad in-place op in `_generate_unfixed_lin_constraints` (901).
* Fix an input transform bug in `fantasize` call (902).
* Fix outcome transform bug in `batched_to_model_list` (917).

Other Changes
* Make variance optional for `TransformedPosterior.mean` (855).
* Support transforms in `DeterministicModel` (869).
* Support `batch_shape` in `RandomFourierFeatures` (877).
* Add a `maximize` flag to `PosteriorMean` (881).
* Ignore categorical dimensions when validating training inputs in `MixedSingleTaskGP` (882).
* Refactor `HigherOrderGPPosterior` for memory efficiency (883).
* Support negative weights for minimization objectives in `get_chebyshev_scalarization` (884); see the sketch after this list.
* Move `train_inputs` transforms to `model.train/eval` calls (894).
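
As referenced in the `get_chebyshev_scalarization` item above, a minimal sketch of a scalarized MC objective where a negative weight marks an outcome for minimization; the weights and toy outcomes are placeholders.

```python
import torch
from botorch.acquisition.objective import GenericMCObjective
from botorch.utils.multi_objective.scalarization import get_chebyshev_scalarization

# Toy two-outcome training data: maximize outcome 0, minimize outcome 1.
train_Y = torch.rand(20, 2, dtype=torch.double)

# A negative weight flags the second outcome for minimization.
weights = torch.tensor([0.7, -0.3], dtype=torch.double)
scalarization = get_chebyshev_scalarization(weights=weights, Y=train_Y)
objective = GenericMCObjective(scalarization)

# The objective maps outcome samples of shape (..., m) to scalars of shape (...).
samples = torch.rand(4, 20, 2, dtype=torch.double)
print(objective(samples).shape)  # torch.Size([4, 20])
```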
