New documentation for the neural architecture search problem setup can be found [here](https://deephyper.readthedocs.io/en/latest/user_guides/nas/problem.html).
It is now possible to define [auto-tuned hyperparameters](https://deephyper.readthedocs.io/en/latest/user_guides/nas/problem.html#searched-hyperparameters) in addition to the architecture in a NAS problem.
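As a minimal sketch of what this can look like, assuming the `NaProblem` API from this release and hypothetical `load_data` / `create_search_space` helpers (the `mypackage` module paths are placeholders):

```python
from deephyper.problem import NaProblem

from mypackage.load_data import load_data                # hypothetical data loader
from mypackage.search_space import create_search_space   # hypothetical search space factory

Problem = NaProblem()
Problem.load_data(load_data)
Problem.search_space(create_search_space)

# Mix fixed hyperparameters with auto-tuned ones: add_hyperparameter
# registers a searched range that is optimised alongside the architecture.
Problem.hyperparameters(
    batch_size=Problem.add_hyperparameter((32, 256, "log-uniform"), "batch_size"),
    learning_rate=Problem.add_hyperparameter((1e-4, 1e-2, "log-uniform"), "learning_rate"),
    num_epochs=20,  # fixed value, not searched
)
```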
## New Algorithms for Joint Hyperparameter and Neural Architecture Search
Three new algorithms are available to run a joint hyperparameter and neural architecture search; below, hyperparameter optimisation is abbreviated as HPO and neural architecture search as NAS. An example invocation is sketched after the list.
* `agebo` (Aging Evolution for NAS with Bayesian Optimisation for HPO)
* `ambsmixed` (an extension of Asynchronous Model-Based Search for HPO + NAS)
* `regevomixed` (an extension of regularised evolution for HPO + NAS)
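Each algorithm is exposed as a search name on the `deephyper nas` command line. A hypothetical invocation (the `--problem` module path is a placeholder) might look like:

```console
deephyper nas ambsmixed --problem mypackage.myproblem.Problem --evaluator ray --address auto
```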
## A run function to use data-parallelism with TensorFlow
A new run function to use data-parallelism during neural architecture search is available ([link to code](https://github.com/deephyper/deephyper/blob/c7608e0c61bd805c109145744b567cbb6cf01673/deephyper/nas/run/tf_distributed.py#L51)).
To use this function, pass it to the `--run` argument of the command line, for example:
```console
deephyper nas agebo ... --run deephyper.nas.run.tf_distributed.run ... --num-cpus-per-task 2 --num-gpus-per-task 2 --evaluator ray --address auto ...
```
This function also enables new hyperparameters in `Problem.hyperparameters(...)`, as sketched below.
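As a hedged illustration only (the option names below follow the linear-scaling-rule and warmup logic in `tf_distributed.py` but are assumptions; check the linked code for the exact names), they are set like any other entry of `Problem.hyperparameters(...)`:

```python
Problem.hyperparameters(
    batch_size=64,
    learning_rate=1e-3,
    num_epochs=10,
    lsr_batch_size=True,     # assumption: linearly scale the batch size with the number of replicas
    lsr_learning_rate=True,  # assumption: scale the learning rate accordingly
    warmup_lr=True,          # assumption: warm up the learning rate at the start of training
    warmup_epochs=5,         # assumption: number of warmup epochs
)
```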
## Easier model generation from Neural Architecture Search results
A new method, `Problem.get_keras_model(arch_seq)`, is now available on the `Problem` object to easily build a Keras model instance from an `arch_seq` (a list encoding a neural network).
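A minimal usage sketch, assuming a NAS problem defined in a hypothetical `mypackage.myproblem` module and an `arch_seq` taken from the search results (the values below are placeholders, not a real encoding):

```python
from mypackage.myproblem import Problem  # hypothetical module holding the NAS Problem

# An arch_seq is typically read from the search results (e.g. results.csv).
arch_seq = [0.0, 0.5, 1.0, 0.25]  # placeholder encoding

model = Problem.get_keras_model(arch_seq)  # returns a tf.keras.Model instance
model.summary()
```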
*(Figures: example runs with loss `log_cosh` (0.0001614947) and loss `mae` (0.0001265946).)*
# 0.9.3
## What's Changed
* Simplify decentralized search by Deathn0t in https://github.com/deephyper/deephyper/pull/260
* Add uv installation page in documentation by wigging in https://github.com/deephyper/deephyper/pull/263
* Changes for Colab tutorial by bretteiffert in https://github.com/deephyper/deephyper/pull/271
* Add section in documentation for facility guides by wigging in https://github.com/deephyper/deephyper/pull/265, https://github.com/deephyper/deephyper/pull/276
* Add `parameters_at_max` function for Quick Start example by wigging in https://github.com/deephyper/deephyper/pull/275
* Fix ensemble inference by Deathn0t in https://github.com/deephyper/deephyper/pull/278
* Move BBO tutorial to Sphinx Gallery by Deathn0t in https://github.com/deephyper/deephyper/pull/280
* Adding `LokyEvaluator` by Deathn0t in https://github.com/deephyper/deephyper/pull/281
* Add Contrib documentation about gallery examples by Deathn0t in https://github.com/deephyper/deephyper/pull/282
* Move ensemble tutorials to Sphinx Gallery by Deathn0t in https://github.com/deephyper/deephyper/pull/283
* Removing tutorials and dependencies `tf-keras2` by Deathn0t in https://github.com/deephyper/deephyper/pull/284
* Replacing `sdv` by Gaussian Mixture Model and removing dependencies by Deathn0t in https://github.com/deephyper/deephyper/pull/286
* Move multi-objective BBO tutorial to Sphinx Gallery by wigging in https://github.com/deephyper/deephyper/pull/285
* Update example on failures in BBO by Deathn0t in https://github.com/deephyper/deephyper/pull/287
* Move overfitting tutorial to Sphinx Gallery by Deathn0t in https://github.com/deephyper/deephyper/pull/289