New documentation for the neural architecture search problem setup can be found [here](https://deephyper.readthedocs.io/en/latest/user_guides/nas/problem.html).
It is now possible to define [auto-tuned hyperparameters](https://deephyper.readthedocs.io/en/latest/user_guides/nas/problem.html#searched-hyperparameters) in addition to the architecture in a NAS Problem.
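As a rough sketch of what this looks like (assuming the `NaProblem` API of this release; the names, bounds, and settings below are illustrative only and may differ from the linked guide):

```python
from deephyper.problem import NaProblem

Problem = NaProblem()

# ... Problem.load_data(...) and Problem.search_space(...) as usual ...

# Auto-tuned hyperparameters are declared next to fixed training settings;
# the ranges below are illustrative only.
Problem.hyperparameters(
    batch_size=Problem.add_hyperparameter((16, 2048, "log-uniform"), "batch_size"),
    learning_rate=Problem.add_hyperparameter((1e-4, 0.01, "log-uniform"), "learning_rate"),
    num_epochs=20,
)
```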
New Algorithms for Joint Hyperparameter and Neural Architecture Search
Three new algorithms are available to run joint hyperparameter optimisation (HPO) and neural architecture search (NAS):
* `agebo` (Aging Evolution for NAS with Bayesian Optimisation for HPO)
* `ambsmixed` (an extension of Asynchronous Model-Based Search for HPO + NAS)
* `regevomixed` (an extension of regularised evolution for HPO + NAS)
A run function to use data-parallelism with TensorFlow
A new run function to use data-parallelism during neural architecture search is available ([link to code](https://github.com/deephyper/deephyper/blob/c7608e0c61bd805c109145744b567cbb6cf01673/deephyper/nas/run/tf_distributed.py#L51)).
To use this function, pass it to the `--run` argument of the command line, for example:
```console
deephyper nas agebo ... --run deephyper.nas.run.tf_distributed.run ... --num-cpus-per-task 2 --num-gpus-per-task 2 --evaluator ray --address auto ...
```
This function also allows new hyperparameters to be set through `Problem.hyperparameters(...)`:
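The specific hyperparameter names are documented with the linked run function; the sketch below only illustrates the pattern, using hypothetical names (`lsr_batch_size`, `lsr_learning_rate`, `warmup_epochs`) for linear-scaling and warmup settings.

```python
# Hypothetical names for illustration; check the linked tf_distributed run
# function for the hyperparameters it actually supports.
Problem.hyperparameters(
    batch_size=64,
    learning_rate=0.001,
    num_epochs=20,
    lsr_batch_size=True,      # scale the batch size with the number of workers
    lsr_learning_rate=True,   # linear scaling rule for the learning rate
    warmup_epochs=5,          # warm the learning rate up over the first epochs
)
```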
Easier model generation from Neural Architecture Search results
A new method, `Problem.get_keras_model(arch_seq)`, is now available on the Problem object to easily build a Keras model instance from an `arch_seq` (a list encoding a neural network).
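For example (with a placeholder `arch_seq`, not an actual search result):

```python
# The arch_seq below is a placeholder; in practice it comes from the search
# results, e.g., the best row of the results.csv produced by the search.
arch_seq = [0, 3, 1, 2, 0]
model = Problem.get_keras_model(arch_seq)
model.summary()
```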
0.9.0
We are happy to release this new version of DeepHyper, which focuses on software quality updates.
- DeepHyper is now compatible with `Python>=3.10;<=3.13`.
- The [pip installation](https://deephyper.readthedocs.io/en/stable/install/pip.html) was updated.
- The package build tools have been updated to modern tools (mainly `pyproject.toml`, hatchling, and ruff). wigging
- The code base style and content were improved accordingly. wigging
- Our [contributing guidelines](https://deephyper.readthedocs.io/en/stable/developer_guides/contributing.html) have been updated. wigging
- The CI test pipeline has been updated. wigging
deephyper.evaluator
The `Evaluator` API has been updated to avoid possibly leaking threads when using `search.search(timeout=...)`. The `"serial"` method now only accepts coroutine functions (i.e., functions defined with the `async def` keywords). The running job received by the run-function `def run(job)` now has `job.status` (`READY`, `RUNNING`, `DONE`, `CANCELLING`, `CANCELLED`) to handle cooperative cancellation: the user can regularly check this status to manage job termination (useful with parallel backends such as `"serial"`, `"thread"`, and `"mpicomm"`), as sketched below. The dumped `results.csv` now provides the final status of each job. Deathn0t
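A minimal sketch of a run-function that cooperates with cancellation by polling `job.status`; the training loop is a placeholder and the status check assumes an enum-like status object.

```python
from deephyper.evaluator import RunningJob


def run(job: RunningJob) -> dict:
    objective = 0.0
    for step in range(100):  # placeholder training loop
        objective += 0.01    # stand-in for a real training/validation step

        # Cooperative cancellation: regularly check the job status and stop
        # early when the search requests termination.
        if job.status.name in ("CANCELLING", "CANCELLED"):
            break

    return {"objective": objective}
```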
New examples will be provided in the documentation.
deephyper.evaluator.storage
Two new `Storage` backends are now available to benefit from shared memory in distributed parallel execution:

- [MPIWinStorage](https://deephyper.readthedocs.io/en/stable/_autosummary/deephyper.evaluator.storage.MPIWinStorage.html#deephyper.evaluator.storage.MPIWinStorage): specific to `"mpicomm"` execution and based on one-sided communication (a.k.a. remote memory access, RMA). Deathn0t
- [SharedMemoryStorage](https://deephyper.readthedocs.io/en/stable/_autosummary/deephyper.evaluator.storage.SharedMemoryStorage.html#deephyper.evaluator.storage.SharedMemoryStorage): specific to `"process"` execution. Deathn0t
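As an illustrative sketch only (this assumes the storage instance can be passed to the evaluator through `method_kwargs`; see the linked API pages for the actual constructor arguments and wiring):

```python
from deephyper.evaluator import Evaluator
from deephyper.evaluator.storage import SharedMemoryStorage

# Illustrative only: share job metadata between worker processes through
# a shared-memory storage backend.
storage = SharedMemoryStorage()

evaluator = Evaluator.create(
    run,  # a run-function such as the one sketched above
    method="process",
    method_kwargs={"num_workers": 4, "storage": storage},
)
```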
Removing deprecated modules
The previously deprecated `deephyper.search` and `deephyper.problem` subpackages have been removed. The `deephyper.hpo` subpackage should be used instead.
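For instance, imports that previously pointed at the removed subpackages should now come from `deephyper.hpo` (shown here with `HpProblem` and `CBO`; adapt to the classes you actually use):

```python
# Before (removed):
#   from deephyper.problem import HpProblem
#   from deephyper.search.hps import CBO

# Now:
from deephyper.hpo import CBO, HpProblem

problem = HpProblem()
problem.add_hyperparameter((1e-4, 1e-1, "log-uniform"), "learning_rate")
```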
Spack installation
Our team is currently updating our [Spack installation](https://deephyper.readthedocs.io/en/stable/install/spack.html). If you are a Spack user and would like to experiment with it, please get in touch with us. bretteiffert