DeepHyper

Latest version: v0.9.3


0.9.2

0.9.1

What's Changed

* Remove tox file, unittest, and revise workflow by wigging in https://github.com/deephyper/deephyper/pull/247
* Add tests workflow for macOS by wigging in https://github.com/deephyper/deephyper/pull/254
* Add tests workflow for MPI by Deathn0t in https://github.com/deephyper/deephyper/pull/249
* Add tests workflow for Redis testing by Deathn0t in https://github.com/deephyper/deephyper/pull/253
* Add tests workflow for core extras by Deathn0t in https://github.com/deephyper/deephyper/pull/256
* Add support for `ForbiddenClause` when using mixedga/ga optimizers in CBO by Deathn0t in https://github.com/deephyper/deephyper/pull/257
* Add example for scaling centralized Bayesian optimization
* Add example for scaling Bayesian optimization with decentralized search by Deathn0t in https://github.com/deephyper/deephyper/pull/258
* Resolved bug with termination condition `max_evals` in the case of decentralized search by Deathn0t in https://github.com/deephyper/deephyper/pull/259
* Updated spack package recipes by bretteiffert in https://github.com/deephyper/deephyper-spack-packages/pull/4

**Full Changelog**: https://github.com/deephyper/deephyper/compare/0.9.0...0.9.1

0.9.0

We are happy to release the new version of DeepHyper with software quality updates.

- DeepHyper is now compatible with Python versions 3.10 through 3.13.
- The [pip installation](https://deephyper.readthedocs.io/en/stable/install/pip.html) was updated.
- The package build tools have been updated to modern tools (mainly, `pyproject.toml`, hatchling, and ruff). wigging
- The code base style/content was improved accordingly. wigging
- Our [contributing guidelines](https://deephyper.readthedocs.io/en/stable/developer_guides/contributing.html) have been updated. wigging
- The CI tests pipeline has been updated. wigging

deephyper.evaluator

The `Evaluator` API has been updated to avoid possibly leaking threads when using `search.search(timeout=...)`. The `"serial"` method now only accepts coroutine functions (i.e., functions defined with the `async def` keyword). The running job received by the run-function `def run(job)` now has a `job.status` (`READY`, `RUNNING`, `DONE`, `CANCELLING`, `CANCELLED`) to handle cooperative cancellation. The user can regularly check this status to manage the job's termination (useful with parallel backends such as `"serial"`, `"thread"`, and `"mpicomm"`). The dumped `results.csv` now provides the final status of each job. Deathn0t
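As a rough sketch of this behavior (the status names come from the list above; the `JobStatus` import location is an assumption), a coroutine run-function could poll `job.status` between units of work:

```python
import asyncio

# Assumed import location for the status enum described above.
from deephyper.evaluator import JobStatus


async def run(job):
    """Sketch of a coroutine run-function with cooperative cancellation."""
    objective = 0.0
    for _ in range(100):
        # Stop early if the evaluator asked this job to cancel (e.g., on timeout).
        if job.status in (JobStatus.CANCELLING, JobStatus.CANCELLED):
            break
        objective += 0.01  # stand-in for one unit of real work
        await asyncio.sleep(0)  # yield control; "serial" requires `async def`
    return objective
```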

New examples will be provided in the documentation.

deephyper.evaluator.storage

Two new `Storage` implementations are now available to benefit from shared memory in distributed parallel execution (a usage sketch follows the list).
- [MPIWinStorage](https://deephyper.readthedocs.io/en/stable/_autosummary/deephyper.evaluator.storage.MPIWinStorage.html#deephyper.evaluator.storage.MPIWinStorage): specific to `"mpicomm"` execution and based on one-sided communication (a.k.a. remote memory access, RMA). Deathn0t
- [SharedMemoryStorage](https://deephyper.readthedocs.io/en/stable/_autosummary/deephyper.evaluator.storage.SharedMemoryStorage.html#deephyper.evaluator.storage.SharedMemoryStorage): specific to `"process"` execution. Deathn0t
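A hypothetical wiring of one of these storages into an evaluator might look like the following; `Evaluator.create` is the documented entry point, but the default `SharedMemoryStorage()` constructor call and the `"storage"` key in `method_kwargs` are assumptions for illustration:

```python
from deephyper.evaluator import Evaluator
from deephyper.evaluator.storage import SharedMemoryStorage

# Assumed: default constructor; check the linked documentation for real arguments.
storage = SharedMemoryStorage()

# Given an existing run-function `run` (see the sketch above); passing the
# storage through method_kwargs is an assumption for illustration.
evaluator = Evaluator.create(
    run,
    method="process",
    method_kwargs={"num_workers": 4, "storage": storage},
)
```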

Removing deprecated modules

The previously deprecated `deephyper.search` and `deephyper.problem` subpackages have been removed. `deephyper.hpo` should be used instead.
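In most cases, migration only requires updating imports; for example:

```python
# Before (removed):
# from deephyper.problem import HpProblem
# from deephyper.search.hps import CBO

# After:
from deephyper.hpo import CBO, HpProblem
```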

Spack installation

Our team is currently updating our [Spack installation](https://deephyper.readthedocs.io/en/stable/install/spack.html). If you are a Spack user and would like to experiment with it, please get in touch with us. bretteiffert

0.8.1

We are happy to release this new version of DeepHyper with light updates and bug fixes.

- fixing issues with constant hyperparameters using conditions c761d3f63c91a069370fb163f6a65e23d6228c2a
- `Ensemble` can use `PredictorLoader` for lazy loading 9755cd09bb87904674d7b25ef9eeb42793ee322f
- fixing issue in `SklearnPredictor` 788e23827d24f7a6d61606f0a8189ec5ad0b05d8
- making `MeanAggregator` compatible with `MaskedArray` for predictions with variable targets (e.g., cross-validation, bootstrap) e3b4408b6857ff8b2a714ea2671baaa4efc00aca
- adding `OnlineSelector` to combine online ensemble selection with hyperparameter optimization 4ba8d11f2604a905001ab7eb09c95d0991f8ae67
- `Evaluator` can now be used to schedule any parallel function call and be used outside HPO; for example, it is now used within `Ensemble` to schedule parallel predictions of members (see the sketch after this list).
- adding more regularization hyperparameters to greedy ensemble selection 77b6e5049fdb50fd0b5cdfa70d33dd28b690e760
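A minimal sketch of using `Evaluator` outside HPO, as mentioned above (`Evaluator.create`, `submit`, and `gather` follow the existing API; the toy function is illustrative):

```python
from deephyper.evaluator import Evaluator


def run(job):
    # Any function of the job parameters; nothing here is specific to HPO.
    return job.parameters["x"] ** 2


evaluator = Evaluator.create(run, method="process", method_kwargs={"num_workers": 4})
evaluator.submit([{"x": i} for i in range(8)])
results = evaluator.gather("ALL")  # block until all submitted jobs are done
```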

0.8.0

We are happy to release the new version of DeepHyper with significant updates that improve the consistency of our package and make it interoperable with more machine-learning frameworks.

The most significant updates are for `deephyper.hpo` and `deephyper.ensemble`.

Details about the updates are included below.

deephyper.analysis

- `deephyper.analysis.hpo` ([link to documentation](https://deephyper.readthedocs.io/en/latest/_autosummary/deephyper.analysis.hpo.html#module-deephyper.analysis.hpo)) includes new utility functions (a usage sketch follows the list):
  - `filter_failed_objectives`
  - `parameters_at_max`
  - `parameters_at_topk`
  - `parameters_from_row`
  - `read_results_from_csv`
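For example (a short sketch; the function names come from the list above, and the assumed return shape of `parameters_at_max` is a parameters dict plus an objective value):

```python
from deephyper.analysis.hpo import parameters_at_max, read_results_from_csv

# Load the results.csv dumped by a search.
df = read_results_from_csv("results.csv")

# Assumed return shape: the best parameters and the corresponding objective.
best_parameters, best_objective = parameters_at_max(df)
print(best_parameters, best_objective)
```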

deephyper.ensemble

The ensemble subpackage ([link to documentation](https://deephyper.readthedocs.io/en/develop/_autosummary/deephyper.ensemble.html)) was refactored to expand its applicability. It is now compatible with any ML framework such as Tensorflow/Keras2, PyTorch, and Scikit-Learn models. The only requirement is to follow the new `deephyper.predictor` interface that represents a frozen machine learning model on which inferences can be run.
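Conceptually, implementing that interface can be as small as wrapping a frozen model behind a prediction call. A toy sketch (the base class location and the `predict` method name follow the description above, so treat the exact signature as an assumption):

```python
import numpy as np

from deephyper.predictor import Predictor  # interface described above


class ConstantPredictor(Predictor):
    """Toy 'frozen model' that predicts a stored constant for every input."""

    def __init__(self, value: float):
        self.value = value

    def predict(self, X):
        # Run inference: one prediction per input row.
        return np.full(len(X), self.value)
```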

To help you get started with the new ensemble feature, we prepared a tutorial that optimizes the hyperparameters of a Decision Tree and builds an ensemble to improve accuracy, probabilistic calibration, and uncertainty estimates: [Hyperparameter Optimization for Tree Ensemble with Uncertainty Quantification (Scikit-Learn)](https://deephyper.readthedocs.io/en/develop/tutorials/tutorials/notebooks/07_HPO_and_UQ_Ensemble_ScikitLearn/tutorial.html)

The ensemble API is mainly built around the following classes:
- `Predictor`: represents a predictive function ([see doc on predictor](https://deephyper.readthedocs.io/en/develop/_autosummary/deephyper.predictor.html)).
- `Aggregator`: aggregates the predictions from a set of predictors ([see doc on aggregator](https://deephyper.readthedocs.io/en/develop/_autosummary/deephyper.ensemble.aggregator.html#module-deephyper.ensemble.aggregator)).
- `Loss`: loss and scoring functions adapted to classification, regression, or uncertainty quantification ([see doc on loss](https://deephyper.readthedocs.io/en/develop/_autosummary/deephyper.ensemble.loss.html#module-deephyper.ensemble.loss)).
- `Selector`: a selection algorithm that selects a subset of predictors and weights them ([see doc on selector](https://deephyper.readthedocs.io/en/develop/_autosummary/deephyper.ensemble.selector.html#module-deephyper.ensemble.selector)).

The API already provides the tools to build:
1. Simple ensembles to improve the predictions of models with variability (when retrained). Our tutorial on Decision trees is a good example of that.
2. Ensembles with epistemic uncertainty.
3. Ensembles with decomposed aleatoric and epistemic uncertainty.

For classification, you can use the `MixedCategoricalAggregator` ([see doc](https://deephyper.readthedocs.io/en/develop/_autosummary/deephyper.ensemble.aggregator.MixedCategoricalAggregator.html#deephyper.ensemble.aggregator.MixedCategoricalAggregator)) that can use `confidence` or `entropy` for uncertainty.

For regression, you can use the `MixedNormalAggregator` ([see doc](https://deephyper.readthedocs.io/en/develop/_autosummary/deephyper.ensemble.aggregator.MixedNormalAggregator.html)) that uses `variance` for uncertainty.

deephyper.hpo

The `deephyper.hpo` subpackage replaces both `deephyper.search.hps` and `deephyper.search.nas`. For consistency, we decided to refactor neural architecture search and hyperparameter optimization together. The main algorithms to explore hyperparameters are (a minimal usage sketch follows the list):

- Bayesian optimization ([see doc on CBO](https://deephyper.readthedocs.io/en/develop/_autosummary/deephyper.hpo.CBO.html)).
- Experimental design (factorial/grid, randomized, quasi-monte carlo) ([see doc on ExperimentalDesignSearch](https://deephyper.readthedocs.io/en/develop/_autosummary/deephyper.hpo.ExperimentalDesignSearch.html#deephyper.hpo.ExperimentalDesignSearch)).
- Aging/regularized evolution ([see doc on RegularizedEvolution](https://deephyper.readthedocs.io/en/develop/_autosummary/deephyper.hpo.RegularizedEvolution.html)).
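A minimal hyperparameter search with `CBO` looks like the following sketch (the toy objective is ours; DeepHyper maximizes the returned value):

```python
from deephyper.evaluator import Evaluator
from deephyper.hpo import CBO, HpProblem

# Search space with a single continuous hyperparameter.
problem = HpProblem()
problem.add_hyperparameter((-10.0, 10.0), "x")


def run(job):
    # Toy objective: maximized at x = 0.
    return -job.parameters["x"] ** 2


evaluator = Evaluator.create(run, method="thread", method_kwargs={"num_workers": 2})
search = CBO(problem, evaluator, random_state=42)
results = search.search(max_evals=25)  # DataFrame of evaluated configurations
```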

All search algorithms now follow the `_ask/_tell` interface. Good and simple examples to follow if you want to implement your own algorithm are [RandomSearch](https://deephyper.readthedocs.io/en/develop/_modules/deephyper/hpo/_random.html#RandomSearch) and [RegularizedEvolution](https://deephyper.readthedocs.io/en/develop/_modules/deephyper/hpo/_regevo.html#RegularizedEvolution).

All search algorithms are therefore compatible with decentralization. An example of decentralized search with subprocess is available [here](https://github.com/deephyper/deephyper/blob/master/tests/hpo/test__random.py#L129).
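The pattern itself is easy to picture outside DeepHyper; a generic sketch of the ask/tell idea (this is not DeepHyper's actual base-class signature):

```python
import random


class RandomAskTell:
    """Generic ask/tell optimizer: propose configurations, ingest results."""

    def __init__(self, space):
        self.space = space  # name -> (low, high)
        self.history = []

    def ask(self, n=1):
        # Propose n random configurations from the space.
        return [
            {k: random.uniform(lo, hi) for k, (lo, hi) in self.space.items()}
            for _ in range(n)
        ]

    def tell(self, configs, objectives):
        # Record results; a model-based optimizer would update its model here.
        self.history.extend(zip(configs, objectives))


opt = RandomAskTell({"x": (-5.0, 5.0)})
for _ in range(10):
    batch = opt.ask(4)  # each worker can ask/tell independently (decentralized)
    opt.tell(batch, [-(c["x"] - 1.0) ** 2 for c in batch])
```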

A new tutorial on how to do neural architecture search was published: [Neural Architecture Search with Tensorflow/Keras2 (Basic)](https://deephyper.readthedocs.io/en/develop/tutorials/tutorials/colab/NAS_basic_tf_keras2.html).

deephyper.predictor

Utility classes are provided to reload checkpointed models from [Scikit-Learn](https://deephyper.readthedocs.io/en/develop/_autosummary/deephyper.predictor.sklearn.SklearnPredictorFileLoader.html#deephyper.predictor.sklearn.SklearnPredictorFileLoader), [Tensorflow/Keras2](https://deephyper.readthedocs.io/en/develop/_autosummary/deephyper.predictor.tf_keras2.html) and [Pytorch](https://deephyper.readthedocs.io/en/develop/_autosummary/deephyper.predictor.torch.TorchPredictorFileLoader.html#deephyper.predictor.torch.TorchPredictorFileLoader).

deephyper.stopper

This subpackage provides algorithms that can help speed up hyperparameter optimization by observing the training of machine learning models.

The main update is to expose the [BayesianLearningCurveRegressor](https://deephyper.readthedocs.io/en/develop/_autosummary/deephyper.stopper.lce.BayesianLearningCurveRegressor.html#deephyper.stopper.lce.BayesianLearningCurveRegressor), which extrapolates, with uncertainty, the future performance of a training curve using parametric models. A usage sketch follows the list of stoppers below.

The currently available algorithms are:
- Learning Curve Extrapolation ([see doc on LCModelStopper](https://deephyper.readthedocs.io/en/develop/_autosummary/deephyper.stopper.LCModelStopper.html#deephyper.stopper.LCModelStopper)).
- Median Stopper ([see doc on MedianStopper](https://deephyper.readthedocs.io/en/develop/_autosummary/deephyper.stopper.MedianStopper.html#deephyper.stopper.MedianStopper)).
- Asynchronous Successive Halving ([see doc on SuccessiveHalvingStopper](https://deephyper.readthedocs.io/en/develop/_autosummary/deephyper.stopper.SuccessiveHalvingStopper.html#deephyper.stopper.SuccessiveHalvingStopper)).
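A stopper plugs into the search and observes intermediate results reported from the run-function; a sketch (the `job.record`/`job.stopped` calls and the `stopper=` keyword follow the DeepHyper documentation pattern; the toy learning curve and the `max_steps` value are ours):

```python
from deephyper.evaluator import Evaluator
from deephyper.hpo import CBO, HpProblem
from deephyper.stopper import SuccessiveHalvingStopper

problem = HpProblem()
problem.add_hyperparameter((0.001, 0.1, "log-uniform"), "lr")


def run(job):
    objective = 0.0
    for budget in range(1, 51):
        # Toy learning curve improving with budget and learning rate.
        objective = 1.0 - 1.0 / (budget * (1.0 + job.parameters["lr"]))
        job.record(budget, objective)  # report the observation at this budget
        if job.stopped():  # the stopper decided to halt this evaluation early
            break
    return objective


evaluator = Evaluator.create(run, method="thread", method_kwargs={"num_workers": 2})
search = CBO(problem, evaluator, stopper=SuccessiveHalvingStopper(max_steps=50))
results = search.search(max_evals=20)
```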

0.7.0

deephyper.search

- If a `results.csv` file already exists in the `log_dir` folder, it is renamed instead of overwritten.
- Parallel and asynchronous standard experimental designs can now be used through DeepHyper to perform Random, Grid, or Quasi-Monte-Carlo evaluations (see the sketch after this list): [Example on Standard Experimental Design (Grid Search)](https://deephyper.readthedocs.io/en/latest/examples/plot_experimental_design.html).
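A sketch following the linked example (at the time of 0.7.0 the class lived under `deephyper.search.hps` before moving to `deephyper.hpo`; the `n_points` and `design` arguments are taken from that example and should be treated as assumptions):

```python
from deephyper.evaluator import Evaluator
from deephyper.hpo import ExperimentalDesignSearch, HpProblem  # `deephyper.search.hps` in 0.7.x

problem = HpProblem()
problem.add_hyperparameter((0.0, 1.0), "x")
problem.add_hyperparameter((0.0, 1.0), "y")


def run(job):
    return job.parameters["x"] * job.parameters["y"]


evaluator = Evaluator.create(run, method="process", method_kwargs={"num_workers": 4})
# Assumed arguments, following the linked example: a full factorial (grid) design.
search = ExperimentalDesignSearch(problem, evaluator, n_points=100, design="grid")
results = search.search(max_evals=100)
```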

Bayesian optimization (CBO and MPIDistributedBO)

- New optimizers of the acquisition function: the acquisition function for non-differentiable surrogate models (e.g., `"RF"`, `"ET"`) can now be optimized with `acq_optimizer="ga"` or `acq_optimizer="mixedga"`. This makes BO iterations more efficient but adds overhead (negligible if the evaluated function is slow). The `acq_optimizer_freq=2` parameter can be used to amortize this overhead. These options are combined in the sketch after this list.
- New exploration/exploitation scheduler: the periodic exponential decay scheduler can now be specified with its initial and final values, `CBO(..., kappa=10, scheduler={"type": "periodic-exp-decay", "period": 25, "kappa_final": 1.96})`. This mechanism helps escape local optima.
- New family of acquisition functions for Random-Forest surrogate models: `acq_func="UCBd"`, `"EId"`, or `"PId"`. The `"d"` postfix stands for "deterministic": these variants only use the epistemic uncertainty of the surrogate's prediction and ignore aleatoric uncertainty (i.e., noise estimation).
- The default surrogate model was renamed `"ET"`, which stands for "Extremely Randomized Trees", to better match the machine learning literature. This surrogate model provides better epistemic uncertainty estimates than the standard `"RF"` ("Random Forest"). It is also a randomized ensemble of trees, but it uses a randomized split decision rule instead of an optimized one.
- `HpProblem` based on `ConfigSpace` objects using constraints now uses the lower bound of each hyperparameter as a slack value.
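Putting these options together (every keyword below is named in the notes above; the combination is an illustrative sketch reusing a `problem` and `evaluator` defined as in the previous example):

```python
from deephyper.hpo import CBO  # `deephyper.search.hps.CBO` in 0.7.x

search = CBO(
    problem,
    evaluator,
    surrogate_model="ET",     # Extremely Randomized Trees (the default)
    acq_func="UCBd",          # "d" variant: epistemic uncertainty only
    acq_optimizer="mixedga",  # GA-based optimization of the acquisition function
    acq_optimizer_freq=2,     # amortize the GA overhead across iterations
    kappa=10,
    scheduler={"type": "periodic-exp-decay", "period": 25, "kappa_final": 1.96},
)
results = search.search(max_evals=100)
```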

deephyper.stopper

- The stopper based on learning curve extrapolation has an improved fit and speed.
