Pyshac

Latest version: v0.3.5.1


0.3.5.1

Changelog
Bugfix update:

Fix same-value prediction when the batch size is smaller than or equal to the number of execution threads. Due to this bug, only one classifier was ever trained.

Now, `SHAC.as_deterministic()` or `SHAC.set_seed()` must be called before every operation and after restoration in order to ensure determinism.

Cause of the issue: deterministic behaviour on multiprocessing systems requires distributed management of the global and local seeds.
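As a rough illustration of that pattern (a stdlib sketch using plain `random`, not pyshac's actual implementation), each worker can derive its own reproducible seed from a single global seed, so results do not depend on how work is scheduled across workers:

```python
import random

GLOBAL_SEED = 0

def worker_rng(worker_id):
    # Derive an independent, reproducible seed per worker from the
    # global seed; the mixing constant here is arbitrary.
    return random.Random(GLOBAL_SEED * 100003 + worker_id)

# Two runs with the same global seed yield identical per-worker draws.
draws_a = [worker_rng(i).random() for i in range(4)]
draws_b = [worker_rng(i).random() for i in range(4)]
```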

0.3.5.0

Changelog

Improvements
- All engines can now be locally seeded for deterministic behavior.

This can be done either with an engine-level seed:

```python
import pyshac

shac = pyshac.SHAC(...)
shac.set_seed(seed)  # deterministic from now onwards
```

Or with a context-level seed:

```python
import pyshac

shac = pyshac.SHAC(...)

with shac.as_deterministic(seed):
    # deterministic within this scope; reverts to random outside it
    ...
```

Breaking Changes
- `np.random.seed(seed)` **no longer seeds the engine**.
- The only way to make the engine deterministic is to use one of the above methods.

0.3.4.1

Changelog

Improvements

- `None` can now be used as a value for Discrete hyperparameters. This is useful for scenarios where one of the choices is to not use the parameter at all. **Caveat**: only a single `None` value will be registered as a parameter; multiple `None` values in the same Discrete hyperparameter will not work.

- Raise a `RuntimeError` if `predict` is used with no parameters initialized (i.e., when predicting without restoring the engine).
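For illustration, a minimal stdlib sketch of sampling from a discrete space that includes `None` (plain `random.choice` stands in for the engine's sampler; the `dropout` naming is just an example):

```python
import random

# `None` as one choice of a discrete hyperparameter: here it stands
# for "dropout disabled".
dropout_choices = [None, 0.1, 0.2, 0.5]

rng = random.Random(0)
samples = [rng.choice(dropout_choices) for _ in range(10)]

# Downstream code branches on None to skip the feature entirely.
configs = [{'use_dropout': s is not None, 'rate': s} for s in samples]
```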

Bugfixes

- Previously, using `None` as a value caused a crash during casting, which was not caught in tests. This case is now properly checked.

0.3.4

Changelog

Improvements

- Dataset plotting now provides a trend line showing the overall trend of the evaluation metric during training of the engine.
- New argument `trend_deg` for `plot_dataset`, which decides the degree of the polynomial fitted to the dataset. If the dataset is very noisy, it may be advisable to change this parameter to better reflect the dataset trend.
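For intuition, such a trend line is a least-squares polynomial fit of the chosen degree; a minimal degree-1 (straight line) version in plain Python, independent of pyshac and purely illustrative, looks like:

```python
def linear_trend(values):
    # Least-squares fit of a degree-1 line y = slope * x + intercept
    # to metric values observed at steps 0, 1, 2, ...
    n = len(values)
    mean_x = (n - 1) / 2.0
    mean_y = sum(values) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in enumerate(values))
    var = sum((x - mean_x) ** 2 for x in range(n))
    slope = cov / var
    return slope, mean_y - slope * mean_x

# A noisy but rising metric yields a positive slope.
slope, intercept = linear_trend([0.2, 0.35, 0.3, 0.5, 0.55])
```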

Example

```python
import pyshac
from pyshac.utils.vis_utils import plot_dataset

shac = pyshac.SHAC(...)
...

plot_dataset(shac.dataset, trend_deg=5)
```


Bugfixes

- Removes a flaky test from the Multi* parameter tests

0.3.3

Changelog

Improvements

- Addition of `Multi` parameters: `MultiDiscreteHyperParameter`, `MultiUniformContinuousHyperParameter` and `MultiNormalContinuousHyperParameter`.

These multi parameters accept an additional argument `sample_count`, which can be used to sample multiple times per step.

**Note**: The values will be concatenated linearly, so each multi parameter will have a list of values
returned in the resultant OrderedDict. If you wish to flatten the entire search space, you can
use `pyshac.flatten_parameters` on this OrderedDict.
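As a rough sketch of what flattening means here (a hypothetical reimplementation for illustration only; the `name_i` key scheme is an assumption, and the library's actual `pyshac.flatten_parameters` may behave differently):

```python
from collections import OrderedDict

def flatten_parameters(params):
    # Expand list-valued entries (produced by multi parameters) into
    # one scalar entry per sample; scalar entries pass through untouched.
    flat = OrderedDict()
    for name, value in params.items():
        if isinstance(value, (list, tuple)):
            for i, v in enumerate(value):
                flat['%s_%d' % (name, i)] = v
        else:
            flat[name] = value
    return flat

params = OrderedDict([('dice', [3, 1, 5]), ('noise', 0.12)])
flat = flatten_parameters(params)
# flat is OrderedDict([('dice_0', 3), ('dice_1', 1), ('dice_2', 5), ('noise', 0.12)])
```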

Example: creation of a search space of 1000 dice rolls and 500 samples of normally distributed noise.

```python
import pyshac

mp1 = pyshac.MultiDiscreteHP('dice', values=[0, 1, 2, 3, 4, 5, 6], sample_count=1000)
mp2 = pyshac.MultiNormalHP('noise', mean=0.0, std=0.5, sample_count=500)

params = [mp1, mp2]
shac = pyshac.SHAC(params, ...)
```
------

- Documentation for usage of Multi parameters and an example for searching on large search spaces in the [Examples/multi_parameter folder](https://github.com/titu1994/pyshac/blob/master/examples/multi_parameter/multi_parameter_sampling.py).

0.3.2.1

Changelog

Improvements

- All engines will now accept the keyword `save_dir`, which will point to the base directory where the shac engine data will be stored.
- Evaluation speed should be somewhat improved for large batch sizes.

Bugfixes

- CSVLogger no longer logs the details of the XGBoost model that was trained.
- Tests now properly cover the case where the engine cannot be restored.
- Fixed a bug where the engine would fail to sample and crash all threads when using the threading engine.
