```console
[00048] -- best objective: 9.99995 -- received objective: 9.99725
[00049] -- best objective: 9.99995 -- received objective: 9.99746
[00050] -- best objective: 9.99995 -- received objective: 9.99990
[00051] -- best objective: 9.99995 -- received objective: 9.99915
[00052] -- best objective: 9.99995 -- received objective: 9.99962
[00053] -- best objective: 9.99995 -- received objective: 9.99930
[00054] -- best objective: 9.99995 -- received objective: 9.99982
[00055] -- best objective: 9.99995 -- received objective: 9.99985
[00056] -- best objective: 9.99995 -- received objective: 9.99851
[00057] -- best objective: 9.99995 -- received objective: 9.99794
Stopping the search because it did not improve for the last 10 evaluations!
```
## Tutorials
* [NEW] **Hyperparameter search for text classification (Pytorch)**
* [NEW] **Neural Architecture Search with Multiple Input Tensors**
* [NEW] **From Neural Architecture Search to Automated Deep Ensemble with Uncertainty Quantification**
* [UPDATED] **Execution on the Theta supercomputer/N-evaluation per 1-node**
## Hyperparameter search
* [NEW] **Filtering duplicated samples**: new parameters `filter_duplicated` and `n_points` were added to `deephyper.search.hps.AMBS`. By default, `filter_duplicated = True`, which means the search filters out duplicated samples until it can no longer draw new unique values, at which point it falls back to re-sampling existing hyperparameter configurations. Both the filtering behaviour and the sampling speed are sensitive to `n_points`, the number of samples drawn from the search space before they are filtered by the surrogate model (`n_points = 10000` by default). If `filter_duplicated = False`, duplicated points are not filtered, but `n_points` still impacts the sampling speed. A sketch of these options follows this list.
* Arguments of `AMBS` were adapted to match the maximisation setting of DeepHyper: `"LCB" -> "UCB"`, `"cl_min" -> "cl_max"`, `"cl_max" -> "cl_min"`.
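A minimal sketch of how these options fit together, assuming the `Evaluator.create` interface and a toy `run` function (the problem, evaluator method, and values below are illustrative, not prescriptive; check the API reference for exact signatures):

```python
from deephyper.problem import HpProblem
from deephyper.evaluator import Evaluator
from deephyper.search.hps import AMBS

# illustrative toy problem: maximize -(x - 5)^2, optimum at x = 5
problem = HpProblem()
problem.add_hyperparameter((0.0, 10.0), "x")

def run(config):
    # DeepHyper maximises the returned objective
    return -(config["x"] - 5) ** 2

# a local thread-based evaluator with a single worker (assumed for simplicity)
evaluator = Evaluator.create(run, method="thread", method_kwargs={"num_workers": 1})

search = AMBS(
    problem,
    evaluator,
    acq_func="UCB",          # formerly "LCB"
    liar_strategy="cl_max",  # formerly "cl_min"
    filter_duplicated=True,  # skip already-sampled configurations
    n_points=10000,          # candidates drawn before surrogate filtering
)
results = search.search(max_evals=100)
```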
## Neural architecture search
The `deephyper.nas` package was restructured. All neural architecture search spaces should now be subclasses of `deephyper.nas.KSearchSpace`:
```python
import tensorflow as tf

from deephyper.nas import KSearchSpace
from deephyper.nas.node import ConstantNode, VariableNode
from deephyper.nas.operation import operation, Identity

Dense = operation(tf.keras.layers.Dense)
Dropout = operation(tf.keras.layers.Dropout)


class ExampleSpace(KSearchSpace):

    def build(self):

        # input nodes are automatically built based on `input_shape`
        input_node = self.input_nodes[0]

        # we want 4 layers maximum (Identity corresponds to not adding a layer)
        for i in range(4):
            node = VariableNode()
            self.connect(input_node, node)

            # we add 3 possible operations for each node
            node.add_op(Identity())
            node.add_op(Dense(100, "relu"))
            node.add_op(Dropout(0.2))

            input_node = node

        output = ConstantNode(op=Dense(self.output_shape[0]))
        self.connect(input_node, output)

        return self


space = ExampleSpace(input_shape=(1,), output_shape=(1,)).build()
space.sample().summary()
```
This will output:
```console
Model: "model_1"
_________________________________________________________________
Layer (type)                 Output Shape              Param #
=================================================================
input_0 (InputLayer)         [(None, 1)]               0
_________________________________________________________________
dense_3 (Dense)              (None, 100)               200
_________________________________________________________________
dense_4 (Dense)              (None, 100)               10100
_________________________________________________________________
dropout_2 (Dropout)          (None, 100)               0
_________________________________________________________________
dense_5 (Dense)              (None, 1)                 101
=================================================================
Total params: 10,401
Trainable params: 10,401
Non-trainable params: 0
_________________________________________________________________
```
For a complete example, follow the **Neural Architecture Search (Basic)** tutorial.
The main changes were the following:
* `AutoKSearchSpace`, `SpaceFactory`, `Dense`, `Dropout` and others were removed. Operations such as `Dense` can now be created directly with `operation(tf.keras.layers.Dense)`, which allows for lazy tensor allocation.
* The search space class should now be passed directly to `NaProblem.search_space(KSearchSpaceSubClass)` (see the sketch after this list).
* `deephyper.nas.space` is now `deephyper.nas`
* All operations are now under `deephyper.nas.operation`
* Nodes are now under `deephyper.nas.node`
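For instance, plugging the `ExampleSpace` defined above into a problem definition could look like the following sketch (`load_data` is a placeholder for a user-supplied data loader, and the hyperparameter values are arbitrary):

```python
from deephyper.problem import NaProblem

def load_data():
    # placeholder: should return ((X_train, y_train), (X_valid, y_valid))
    ...

problem = NaProblem()
problem.load_data(load_data)
# pass the KSearchSpace subclass itself, not an instance
problem.search_space(ExampleSpace)
problem.hyperparameters(batch_size=32, learning_rate=0.001, num_epochs=10)
problem.loss("mse")
problem.metrics(["r2"])
problem.objective("val_r2")
```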
## Documentation
* **API Reference**: a new section of the documentation website detailing all public functions and classes of DeepHyper.
## Removed
* Notebooks generated with `deephyper-analytics` were removed.
* `deephyper ray-submit`
* `deephyper ray-config`
* Some unused dependencies were removed: `balsam-flow`, `deap`.