Optax

Latest version: v0.2.4


0.1.2

What's Changed
* Improve the documentation and discoverability of `set_to_zero`. by copybara-service in https://github.com/deepmind/optax/pull/299
* Add pages to documentation for contributors. by copybara-service in https://github.com/deepmind/optax/pull/301
* Replace _build_simple_adam with _build_stateful_sgd. by n2cholas in https://github.com/deepmind/optax/pull/302
* Make `masked` preserve the param structure (a parameter-freezing sketch follows this list). by n2cholas in https://github.com/deepmind/optax/pull/300
* Allow setting a custom schedule for the second momentum in Adafactor. One useful use case is setting an upper bound on the second momentum, which otherwise converges to 1.0 and effectively freezes the second moment estimates. This was used in https://arxiv.org/abs/2106.04560. by copybara-service in https://github.com/deepmind/optax/pull/303
* Export the CTC loss for public use. by copybara-service in https://github.com/deepmind/optax/pull/305
* Specify that mask must be static in masked wrapper docstring. by copybara-service in https://github.com/deepmind/optax/pull/306
* Clarify the docstring of cosine_decay_schedule. by copybara-service in https://github.com/deepmind/optax/pull/310
* Fix a typo in the LARS docstring. by Olamon in https://github.com/deepmind/optax/pull/311
* Fix adam's mu_dtype casting. by copybara-service in https://github.com/deepmind/optax/pull/313
* Fix docs rendering of loss functions. by grahamgower in https://github.com/deepmind/optax/pull/318
* Clarify the `optax.adamw(weight_decay)` parameter. by copybara-service in https://github.com/deepmind/optax/pull/322
* Enhance `ctc_loss` to return forward probs. by yotarok in https://github.com/deepmind/optax/pull/321
* Replace `jax.tree_multimap`, which is now deprecated, with `jax.tree_map`. by copybara-service in https://github.com/deepmind/optax/pull/330
* Rename the arguments of the CTC loss functions (see the CTC sketch after this list). by yotarok in https://github.com/deepmind/optax/pull/331
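
Taken together, `set_to_zero` and the `masked` changes above give a simple way to freeze a subset of parameters. A minimal sketch, assuming a two-branch param tree and a static (Python-level) boolean mask, as the `masked` docstring now requires:

```python
import jax.numpy as jnp
import optax

params = {'frozen': jnp.ones(3), 'trainable': jnp.ones(3)}
grads = {'frozen': jnp.full(3, 0.5), 'trainable': jnp.full(3, 0.5)}

# Zero out updates for the 'frozen' branch. The mask mirrors the param
# structure and must be static, not a traced value.
freeze = optax.masked(optax.set_to_zero(), {'frozen': True, 'trainable': False})
opt = optax.chain(optax.sgd(learning_rate=0.1), freeze)

state = opt.init(params)
updates, state = opt.update(grads, state, params)
params = optax.apply_updates(params, updates)
# params['frozen'] is unchanged; params['trainable'] moved by -0.05.
```

The newly exported CTC loss expects padded batches. A minimal sketch with the renamed arguments (`logits`, `logit_paddings`, `labels`, `label_paddings`); the shapes below are purely illustrative:

```python
import jax.numpy as jnp
import optax

# Batch of 2 sequences, 5 frames, 3 classes (class 0 is the blank by default).
logits = jnp.zeros((2, 5, 3))
logit_paddings = jnp.zeros((2, 5))    # 1.0 marks padded frames
labels = jnp.array([[1, 2], [2, 1]])  # (batch, max_label_length) label ids
label_paddings = jnp.zeros((2, 2))    # 1.0 marks padded label positions

per_example_loss = optax.ctc_loss(logits, logit_paddings, labels, label_paddings)
print(per_example_loss.shape)  # (2,)
```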

New Contributors
* Olamon made their first contribution in https://github.com/deepmind/optax/pull/311
* grahamgower made their first contribution in https://github.com/deepmind/optax/pull/318
* yotarok made their first contribution in https://github.com/deepmind/optax/pull/321

**Full Changelog**: https://github.com/deepmind/optax/compare/v0.1.1...v0.1.2

0.1.1

What's Changed
* Tweak the meta-learning example from the docs by copybara-service in https://github.com/deepmind/optax/pull/233
* Fix small bugs in the meta-learning example. by copybara-service in https://github.com/deepmind/optax/pull/236
* Do not reuse mini-batches between epochs in DPSGD example. by copybara-service in https://github.com/deepmind/optax/pull/230
* Make the version of typing_extensions less constrained. by copybara-service in https://github.com/deepmind/optax/pull/238
* [JAX] move example libraries from `jax.experimental` to `jax.example_libraries` by copybara-service in https://github.com/deepmind/optax/pull/200
* Export ScaleByBeliefState by NeilGirdhar in https://github.com/deepmind/optax/pull/239
* MultiStep optimizer: align parameter naming and type annotations of the update function with the signature of GradientTransformation.update. by copybara-service in https://github.com/deepmind/optax/pull/243
* Fix imports of datasets in examples folder. by copybara-service in https://github.com/deepmind/optax/pull/242
* Enable example tests on GitHub and fix the bugs that were uncovered. by copybara-service in https://github.com/deepmind/optax/pull/244
* Formatting. by copybara-service in https://github.com/deepmind/optax/pull/249
* Add test for multi steps wrapper, verifying that the aggregated gradient is the mean of the input gradients. by copybara-service in https://github.com/deepmind/optax/pull/255
* MultiStep optimizer wrapper: replace the naive streaming-average gradient implementation with a more numerically stable one (see the accumulation sketch after this list). by copybara-service in https://github.com/deepmind/optax/pull/254
* Added ord, axis, and keepdims args to safe_norm by copybara-service in https://github.com/deepmind/optax/pull/252
* Add badges and include RTD build into CI tests. by copybara-service in https://github.com/deepmind/optax/pull/256
* Write a clearer docstring for GradientTransformation (a usage sketch follows this list). by copybara-service in https://github.com/deepmind/optax/pull/257
* Refactor clipping.py by copybara-service in https://github.com/deepmind/optax/pull/260
* Implement split real norm by wdphy16 in https://github.com/deepmind/optax/pull/241
* Monkey-patch sphinx to output correct type annotations for the most common cases (e.g. params, opt state) in the documentation. by copybara-service in https://github.com/deepmind/optax/pull/266
* Improve docs by copybara-service in https://github.com/deepmind/optax/pull/268
* Implement stateless wrapper. by n2cholas in https://github.com/deepmind/optax/pull/246
* Replace `_` with `params` to ensure `init` can always be called with named args. by copybara-service in https://github.com/deepmind/optax/pull/270
* Improve docs. by copybara-service in https://github.com/deepmind/optax/pull/269
* Add a missing backtick in two places. by copybara-service in https://github.com/deepmind/optax/pull/273
* Add option to cache examples datasets after pre-processing. by copybara-service in https://github.com/deepmind/optax/pull/272
* Fix an error in README.md rendering. by copybara-service in https://github.com/deepmind/optax/pull/275
* Remove the old venv directory before testing the package. by copybara-service in https://github.com/deepmind/optax/pull/289
* Fix Yogi optimizer by wdphy16 in https://github.com/deepmind/optax/pull/288
* Bump ipython from 7.16.1 to 7.16.3 in /requirements by dependabot in https://github.com/deepmind/optax/pull/286
* Clarify the `optax.adamw(mask)` parameter. by copybara-service in https://github.com/deepmind/optax/pull/284
* Fix the link to the complex-valued optim proposal in RTD. by copybara-service in https://github.com/deepmind/optax/pull/295
* Implement complex norm in optimizers by wdphy16 in https://github.com/deepmind/optax/pull/279
* Change add_noise to match the target variance by scaling by its sqrt. by Rupt in https://github.com/deepmind/optax/pull/294
* Minor tweaks to the optax documentation. by copybara-service in https://github.com/deepmind/optax/pull/297
* Bump version to 0.1.1 from 0.1.0 by copybara-service in https://github.com/deepmind/optax/pull/298
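
Every optax optimiser exposes the `GradientTransformation` interface that the docstring work above clarifies: a pure `init`/`update` pair combined with `optax.apply_updates`. A minimal sketch:

```python
import jax
import jax.numpy as jnp
import optax

def loss_fn(params, x):
    return jnp.sum((params['w'] * x) ** 2)

params = {'w': jnp.ones(3)}
opt = optax.sgd(learning_rate=0.1)
state = opt.init(params=params)  # init now accepts `params` as a named argument

grads = jax.grad(loss_fn)(params, jnp.arange(3.0))
updates, state = opt.update(grads, state, params)
params = optax.apply_updates(params, updates)
```

The `MultiSteps` wrapper mentioned above accumulates gradients across calls and only applies the inner optimiser once enough have been gathered, using a numerically stable running mean. A minimal sketch, assuming an accumulation window of 4 steps:

```python
import jax.numpy as jnp
import optax

k = 4
opt = optax.MultiSteps(optax.adam(learning_rate=1e-3), every_k_schedule=k)

params = {'w': jnp.ones(3)}
state = opt.init(params)

for step in range(2 * k):
    grads = {'w': jnp.full(3, float(step))}
    updates, state = opt.update(grads, state, params)
    # `updates` stays zero while gradients accumulate; on every k-th call it
    # holds the inner optimiser's update computed from the mean of the last
    # k gradients.
    params = optax.apply_updates(params, updates)
```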

New Contributors
* wdphy16 made their first contribution in https://github.com/deepmind/optax/pull/241
* dependabot made their first contribution in https://github.com/deepmind/optax/pull/286
* Rupt made their first contribution in https://github.com/deepmind/optax/pull/294

**Full Changelog**: https://github.com/deepmind/optax/compare/v0.1.0...v0.1.1

0.1.0

Support for Python 3.6 has been dropped following the [JAX deprecation policy](https://jax.readthedocs.io/en/latest/deprecation.html). Please upgrade to a supported Python version.

**Full Changelog**: https://github.com/deepmind/optax/compare/v0.0.91...v0.1.0

0.0.91

This is the last release compatible with Python 3.6. See #222 for more details.

0.0.9

Added:
* multi_transform (see the sketch after this list)
* LARS optimiser
* AdaFactor optimiser
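
`multi_transform` partitions the parameter tree between optimisers using a tree of labels (or a callable that produces one). A minimal sketch, assuming two groups labelled 'adam' and 'sgd':

```python
import jax.numpy as jnp
import optax

params = {'encoder': jnp.ones(3), 'head': jnp.ones(2)}
param_labels = {'encoder': 'adam', 'head': 'sgd'}  # same structure as params

opt = optax.multi_transform(
    {'adam': optax.adam(1e-3), 'sgd': optax.sgd(1e-2)},
    param_labels,
)
state = opt.init(params)

grads = {'encoder': jnp.full(3, 0.5), 'head': jnp.full(2, 0.5)}
updates, state = opt.update(grads, state, params)
params = optax.apply_updates(params, updates)
```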

Fixed:
* masked wrapper for empty params nest
* label type in loss.py
* checks in loss.py
* MultiSteps arguments handling
* Chex asserts in constrain_test.py

0.0.8

* Added `clip_by_block_rms`
* Added `sgdr_schedule`
* Fixed inconsistency in ema's outputs
* Added `linear_schedule` (see the schedule sketch below)
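
Schedules such as `linear_schedule` are plain step-to-value functions and can be passed wherever a scalar learning rate is accepted. A minimal sketch:

```python
import optax

# Ramp linearly from 0.0 to 0.1 over the first 1_000 steps, then hold.
schedule = optax.linear_schedule(init_value=0.0, end_value=0.1, transition_steps=1_000)

print(schedule(0))      # 0.0
print(schedule(500))    # ~0.05
print(schedule(1_000))  # 0.1

# Pass the schedule directly as the learning rate of an optimiser.
opt = optax.sgd(learning_rate=schedule)
```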
