- The documentation site theme was updated.
- PyPI Release: https://pypi.org/project/deephyper/0.6.0/
- New BibTeX citation for DeepHyper to include our growing community:
@misc{deephyper_software,
    title = {DeepHyper: A Python Package for Scalable Neural Architecture and Hyperparameter Search},
    author = {Balaprakash, Prasanna and Egele, Romain and Salim, Misha and Maulik, Romit and Vishwanath, Venkat and Wild, Stefan and others},
    organization = {DeepHyper Team},
    year = {2018},
    url = {https://github.com/deephyper/deephyper}
}
deephyper.evaluator
- The `profile(memory=True)` decorator can now profile memory using `tracemalloc` (which adds some overhead); a short usage sketch follows this list.
- `RayStorage` is now available for the `ray` parallel backend. It is based on remote actors and wraps the base `MemoryStorage`, making it possible to use `deephyper.stopper` in parallel with only the `ray` backend as a dependency.
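
As an example, here is a minimal sketch of memory profiling with the decorator; the run function body is illustrative, and only `profile(memory=True)` comes from this release:

```python
# Minimal sketch: profile runtime and memory of a run function.
# Only `profile(memory=True)` is from this release; the body is illustrative.
from deephyper.evaluator import profile


@profile(memory=True)  # memory profiling relies on tracemalloc, adding some overhead
def run(job):
    data = [float(i) for i in range(100_000)]  # allocation to make memory usage visible
    return sum(data) / len(data)
```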
deephyper.search
- Multi-objective optimization (MOO) has been upgraded for better optimization performance. A new tutorial to discover this feature is available at [Multi-Objective Optimization - 101](https://deephyper.readthedocs.io/en/develop/tutorials/tutorials/colab/Multi_objective_optimization_101.html); a short usage sketch also follows this list.
- A minimum performance lower bound can be specified to avoid exploring uninteresting trade-offs (`moo_lower_bounds=...`).
- A new objective scaler is available to normalize objectives of different scales (e.g., accuracy and latency) more effectively (`objective_scaler="quantile-uniform"`).
- The `results.csv` file or DataFrame now contains a new column, `pareto_efficient`, which indicates whether a solution is Pareto-optimal in a multi-objective problem (i.e., belongs to the Pareto set/front).
- Random-Forest (RF) surrogate model predictions are about 1.5x faster, speeding up the Bayesian optimization process.
- Added a dynamic prior update for Bayesian optimization (`update_prior=...`, `update_prior_quantile=...`). This increases the sampling density in areas of interest and makes "random"-sampling-based optimization of the surrogate model more competitive with more expensive optimizers (e.g., gradient-based or genetic algorithms).
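
The sketch below combines these options in a small bi-objective search on a toy analytic problem; the problem, run function, and specific values are illustrative, and only `moo_lower_bounds`, `objective_scaler`, `update_prior`, `update_prior_quantile`, and the `pareto_efficient` column come from this release:

```python
# Minimal sketch of a bi-objective CBO search; the toy objectives and chosen
# values are illustrative (assumed), only the keyword arguments named in this
# release note are the new options.
from deephyper.problem import HpProblem
from deephyper.evaluator import Evaluator
from deephyper.search.hps import CBO

problem = HpProblem()
problem.add_hyperparameter((0.0, 1.0), "x")
problem.add_hyperparameter((0.0, 1.0), "y")


def run(job):
    x, y = job.parameters["x"], job.parameters["y"]
    # Two competing objectives, both maximized by the search.
    f0 = -((x - 0.2) ** 2) - (y - 0.8) ** 2
    f1 = -((x - 0.8) ** 2) - (y - 0.2) ** 2
    return f0, f1


evaluator = Evaluator.create(run, method="serial")
search = CBO(
    problem,
    evaluator,
    moo_lower_bounds=[-0.5, -0.5],        # skip trade-offs below these values (assumed semantics)
    objective_scaler="quantile-uniform",  # new objective scaler
    update_prior=True,                    # dynamic prior update
    update_prior_quantile=0.25,           # fraction of best points used for the prior (assumed value)
)
results = search.search(max_evals=50)
# The results DataFrame now includes a boolean `pareto_efficient` column.
print(results[results["pareto_efficient"]])
```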
deephyper.stopper
- `SuccessiveHalvingStopper` is now compatible with failures. If a "failure" is observed during training (i.e., an observation starting with `"F"`), previous observations are replaced in shared memory to notify other competitors of the failure.
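
For context, a minimal sketch of a multi-fidelity run function reporting intermediate observations (and a possible failure) when used with a stopper; the training loop is illustrative, and the `job.record`/`job.stopped` pattern follows the existing multi-fidelity usage of `deephyper.stopper` (exact wiring may differ in your setup):

```python
# Minimal sketch of a run function used with a stopper; the training loop is a
# placeholder, and returning a string starting with "F" follows DeepHyper's
# failure convention mentioned above.
from deephyper.stopper import SuccessiveHalvingStopper


def run(job):
    objective = "F_no_result"
    try:
        for budget in range(1, 51):
            objective = 1.0 - 1.0 / budget  # placeholder "validation score"
            job.record(budget, objective)   # report the observation to the stopper
            if job.stopped():               # the stopper decided to halt this competitor
                break
        return objective
    except Exception:
        # A failed training is reported with an "F"-prefixed objective; the
        # stopper now propagates it to the other competitors via shared memory.
        return "F_training_failed"


# The stopper is then passed to the search, e.g. (assumed wiring):
# search = CBO(problem, evaluator, stopper=SuccessiveHalvingStopper(max_steps=50))
```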
deephyper.analysis
- Creation of a new module to provide utilities for the analysis of experiments.