Deephyper

Latest version: v0.9.3




Neural architecture search

New documentation for the problem definition

New documentation for the neural architecture search problem setup can be found [here](https://deephyper.readthedocs.io/en/latest/user_guides/nas/problem.html).

It is now possible to define [auto-tuned hyperparameters](https://deephyper.readthedocs.io/en/latest/user_guides/nas/problem.html#searched-hyperparameters) in addition to the architecture in a NAS problem.
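
For illustration, a minimal sketch of such a problem definition is shown below. It is an assumption based on the linked documentation page, not verbatim DeepHyper code: the import path, the `add_hyperparameter` signature, and all names and ranges are placeholders to check against the documentation.

```python
# Hedged sketch: the import path and the add_hyperparameter signature are
# assumptions based on the linked "searched hyperparameters" documentation page.
from deephyper.problem import NaProblem

Problem = NaProblem()

Problem.hyperparameters(
    # Searched (auto-tuned) hyperparameters: a range or a list of choices.
    batch_size=Problem.add_hyperparameter((16, 256), "batch_size"),
    learning_rate=Problem.add_hyperparameter((1e-4, 1e-1, "log-uniform"), "learning_rate"),
    optimizer=Problem.add_hyperparameter(["adam", "nadam", "sgd"], "optimizer"),
    # Constant (non-searched) hyperparameters can still be passed as plain values.
    num_epochs=100,
)
```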


New Algorithms for Joint Hyperparameter and Neural Architecture Search

Three new algorithms are available to run a joint hyperparameter and neural architecture search. Hyperparameter optimisation is abbreviated as HPO and neural architecture search as NAS (a generic sketch of the aging-evolution loop behind `agebo` and `regevomixed` follows the list).

* `agebo` (Aging Evolution for NAS with Bayesian Optimisation for HPO)
* `ambsmixed` (an extension of Asynchronous Model-Based Search for HPO + NAS)
* `regevomixed` (an extension of regularised evolution for HPO + NAS)
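
For intuition, here is a generic, library-agnostic sketch of the aging-evolution loop that `agebo` and `regevomixed` build on. `random_config`, `mutate`, and `evaluate` are hypothetical callables standing in for search-space sampling, mutation, and model training; this is not DeepHyper's implementation.

```python
import random
from collections import deque

def aging_evolution(random_config, mutate, evaluate,
                    population_size=50, sample_size=10, budget=500):
    """Aging (regularised) evolution: the oldest individual is evicted,
    and a mutated copy of the best of a random sample is added."""
    population = deque(maxlen=population_size)  # oldest members fall off automatically
    history = []
    for _ in range(budget):
        if len(population) < population_size:
            config = random_config()                         # warm-up: random individuals
        else:
            sample = random.sample(list(population), sample_size)
            parent, _ = max(sample, key=lambda ind: ind[1])  # best of the sample
            config = mutate(parent)                          # child = mutated parent
        objective = evaluate(config)                         # e.g. validation accuracy
        population.append((config, objective))
        history.append((config, objective))
    return max(history, key=lambda ind: ind[1])              # best (config, objective) seen
```

In `agebo`, the architecture part of each new configuration comes from this evolutionary loop while the hyperparameter part is proposed by Bayesian optimisation, as the name of the algorithm indicates.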


A run function to use data-parallelism with TensorFlow

A new run function to use data-parallelism during neural architecture search is available ([link to code](https://github.com/deephyper/deephyper/blob/c7608e0c61bd805c109145744b567cbb6cf01673/deephyper/nas/run/tf_distributed.py#L51)).

To use this function, pass it to the `--run` argument of the command line, for example:

```console
deephyper nas agebo ... --run deephyper.nas.run.tf_distributed.run ... --num-cpus-per-task 2 --num-gpus-per-task 2 --evaluator ray --address auto ...
```
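
Under the hood, a data-parallel run function typically builds and compiles the model inside a `tf.distribute` strategy scope so that each GPU holds a replica and gradients are averaged across replicas. The snippet below is only a hedged sketch of that standard pattern, not the actual `tf_distributed.run` linked above; `build_model`, `train_dataset`, and `valid_dataset` are hypothetical.

```python
import tensorflow as tf

def run_sketch(config: dict) -> float:
    # One model replica per visible GPU; gradients are averaged across replicas.
    strategy = tf.distribute.MirroredStrategy()

    with strategy.scope():
        model = build_model(config)  # hypothetical helper creating the Keras model
        model.compile(optimizer="adam",
                      loss="categorical_crossentropy",
                      metrics=["accuracy"])

    # tf.distribute splits each (global) batch across the replicas.
    history = model.fit(train_dataset,                 # hypothetical tf.data datasets
                        validation_data=valid_dataset,
                        epochs=config.get("num_epochs", 10))
    return max(history.history["val_accuracy"])        # objective returned to the search
```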


This function allows for new hyperparameters in `Problem.hyperparameters(...)`:

```python
...
Problem.hyperparameters(
    ...
    lsr_batch_size=True,
    lsr_learning_rate=True,
    warmup_lr=True,
    warmup_epochs=5,
    ...
)
...
```
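
Here, `lsr_*` presumably refers to the linear scaling rule (scale the batch size and learning rate with the number of data-parallel workers) and `warmup_*` to a linear learning-rate warmup over the first epochs. The sketch below shows the usual arithmetic behind these heuristics; it illustrates the general technique, not DeepHyper's exact implementation.

```python
def scaled_learning_rate(base_lr: float, n_replicas: int,
                         warmup_epochs: int, epoch: int) -> float:
    """Linear scaling rule plus warmup: the target learning rate grows linearly
    with the number of replicas (matching the larger effective batch size) and
    is ramped up from ~0 during the first `warmup_epochs` epochs."""
    target_lr = base_lr * n_replicas
    if epoch < warmup_epochs:
        return target_lr * (epoch + 1) / warmup_epochs
    return target_lr

# Example: base learning rate 1e-3, 4 GPUs, 5 warmup epochs.
for epoch in range(7):
    print(epoch, round(scaled_learning_rate(1e-3, 4, 5, epoch), 5))
```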


Optimization of the input pipeline for training

The data-ingestion pipeline was optimised to reduce overheads on GPU instances:

```python
self.dataset_train = (
    self.dataset_train.cache()                                  # keep elements in memory after the first pass
    .shuffle(self.train_size, reshuffle_each_iteration=True)    # reshuffle the full training set every epoch
    .batch(self.batch_size)
    .prefetch(tf.data.AUTOTUNE)                                 # overlap data preparation with training steps
    .repeat(self.num_epochs)
)
```


Easier model generation from Neural Architecture Search results

A new method, `Problem.get_keras_model(arch_seq)`, is now available on the Problem object to easily build a Keras model instance from an `arch_seq` (a list encoding a neural network).
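
For example, assuming `Problem` is the NAS problem object and an `arch_seq` has been taken from the search results (the values below are placeholders, not a meaningful architecture):

```python
# Placeholder arch_seq; in practice it comes from the search results (e.g. the results CSV).
best_arch_seq = [9, 0, 22, 0, 1, 5, 1, 1]

model = Problem.get_keras_model(best_arch_seq)  # build the Keras model for this encoding
model.summary()
```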

0.9.3

What's Changed

* Simplify decentralized search by Deathn0t in https://github.com/deephyper/deephyper/pull/260
* Add uv installation page in documentation by wigging in https://github.com/deephyper/deephyper/pull/263
* Changes for Colab tutorial by bretteiffert in https://github.com/deephyper/deephyper/pull/271
* Add section in documentation for facility guides by wigging in https://github.com/deephyper/deephyper/pull/265, https://github.com/deephyper/deephyper/pull/276
* Add `parameters_at_max` function for Quick Start example by wigging in https://github.com/deephyper/deephyper/pull/275
* Fix ensemble inference by Deathn0t in https://github.com/deephyper/deephyper/pull/278
* Move BBO tutorial to Sphinx Gallery by Deathn0t in https://github.com/deephyper/deephyper/pull/280
* Adding ``LokyEvaluator`` by Deathn0t in https://github.com/deephyper/deephyper/pull/281
* Add Contrib documentation about gallery examples by Deathn0t in https://github.com/deephyper/deephyper/pull/282
* Move ensemble tutorials to Sphinx Gallery by Deathn0t in https://github.com/deephyper/deephyper/pull/283
* Removing tutorials and dependencies `tf-keras2` by Deathn0t in https://github.com/deephyper/deephyper/pull/284
* Replacing `sdv` by Gaussian Mixture Model and removing dependencies by Deathn0t in https://github.com/deephyper/deephyper/pull/286
* Move multi-objective BBO tutorial to Sphinx Gallery by wigging in https://github.com/deephyper/deephyper/pull/285
* Update example on failures in BBO by Deathn0t in https://github.com/deephyper/deephyper/pull/287
* Move overfitting tutorial to Sphinx Gallery by Deathn0t in https://github.com/deephyper/deephyper/pull/289


**Full Changelog**: https://github.com/deephyper/deephyper/compare/0.9.2...0.9.3
