Optax

Latest version: v0.2.3

0.1.7

What's Changed
* Remove deprecated field from pyproject.toml, which should hopefully resolve an issue with deployment

**Full Changelog**: https://github.com/deepmind/optax/compare/v0.1.6...v0.1.7

0.1.6

What's Changed
* feat: Add support for `is_leaf` to `tree_map_params`, so that `tree_map_params` works with `optax.masked` (see the sketch after this list).
* feat: migrate to `pyproject.toml` for building package by SauravMaheshkar in https://github.com/deepmind/optax/pull/513
* fix(README): small typo in README by jeertmans in https://github.com/deepmind/optax/pull/495
* Add support for PolyLoss (ICLR 2022) by acforvs in https://github.com/deepmind/optax/pull/467
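
As a quick illustration of the `tree_map_params` item above, here is a minimal sketch (not taken from the release itself) of mapping a function over the parameter-shaped leaves of an `optax.masked` optimizer state; the dtype cast is just an arbitrary example of a mapped function.

```python
import jax.numpy as jnp
import optax

# Toy parameters: only 'w' is optimized, 'b' is masked out.
params = {'w': jnp.ones((3,)), 'b': jnp.zeros((3,))}

tx = optax.masked(optax.adam(1e-3), {'w': True, 'b': False})
state = tx.init(params)

# Map a function over the parameter-shaped leaves stored inside the optimizer
# state (e.g. Adam's moment estimates). With this release, the placeholder
# leaves that optax.masked inserts are handled via is_leaf, so the call works
# on masked states as well.
state = optax.tree_map_params(tx, lambda p: p.astype(jnp.bfloat16), state)
```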

New Contributors
* jeertmans made their first contribution in https://github.com/deepmind/optax/pull/495

**Full Changelog**: https://github.com/deepmind/optax/compare/v0.1.5...v0.1.6

0.1.5

What's Changed
* Fix arXiv link to Optax Optimistic Gradient Descent optimizer by 8bitmp3 in https://github.com/deepmind/optax/pull/458
* Fix the Yogi optimizer paper year, change link to NeurIPS site by 8bitmp3 in https://github.com/deepmind/optax/pull/461
* Add exponent to cosine decay schedule and warmup + cosine decay by copybara-service in https://github.com/deepmind/optax/pull/476
* Fix typos in docstring by pomonam in https://github.com/deepmind/optax/pull/480
* Fix global_norm() signature by brentyi in https://github.com/deepmind/optax/pull/481
* Fix `inject_hyperparams()` for python < 3.10. by copybara-service in https://github.com/deepmind/optax/pull/486
* fixed NaN issues in `kl_divergence` loss function by LukasMut in https://github.com/deepmind/optax/pull/473
* feat(ci/tests): bump `setup-python` version and enable cache by SauravMaheshkar in https://github.com/deepmind/optax/pull/485
* Better tests for utils by acforvs in https://github.com/deepmind/optax/pull/465
* Run Github CI every day at 03:00. by copybara-service in https://github.com/deepmind/optax/pull/490
* Fix JIT for `piecewise_interpolate_schedule`, `cosine_onecycle_schedule`, `linear_onecycle_schedule` by brentyi in https://github.com/deepmind/optax/pull/504
* Explicitly export `softmax_cross_entropy_with_integer_labels` by nasyxx in https://github.com/deepmind/optax/pull/499
* Add the Lion optimizer, discovered by symbolic program search (see the sketch after this list). by copybara-service in https://github.com/deepmind/optax/pull/500
* Replaces references to jax.numpy.DeviceArray with jax.Array. by copybara-service in https://github.com/deepmind/optax/pull/511
* Update pytypes. by copybara-service in https://github.com/deepmind/optax/pull/514
* Fix pytype failures related to teaching pytype about NumPy scalar types. by copybara-service in https://github.com/deepmind/optax/pull/517
* Release v0.1.5. by copybara-service in https://github.com/deepmind/optax/pull/523
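
Two of the items above lend themselves to a short sketch: the newly exported integer-label cross entropy and the Lion optimizer. This is an illustrative snippet rather than release documentation; it assumes the Lion alias is exposed as `optax.lion` and uses placeholder data and hyperparameters.

```python
import jax
import jax.numpy as jnp
import optax

# Integer-label cross entropy, now explicitly exported at the top level.
logits = jnp.array([[2.0, 0.5, -1.0]])
labels = jnp.array([0])
loss = optax.softmax_cross_entropy_with_integer_labels(logits, labels).mean()

# The Lion optimizer added in this release; the learning rate is illustrative.
params = {'w': jnp.zeros((3,))}
tx = optax.lion(learning_rate=1e-4)
state = tx.init(params)
grads = jax.tree_util.tree_map(jnp.ones_like, params)  # placeholder gradients
updates, state = tx.update(grads, state, params)
params = optax.apply_updates(params, updates)
```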

New Contributors
* pomonam made their first contribution in https://github.com/deepmind/optax/pull/480
* brentyi made their first contribution in https://github.com/deepmind/optax/pull/481
* LukasMut made their first contribution in https://github.com/deepmind/optax/pull/473
* SauravMaheshkar made their first contribution in https://github.com/deepmind/optax/pull/485
* acforvs made their first contribution in https://github.com/deepmind/optax/pull/465

**Full Changelog**: https://github.com/deepmind/optax/compare/v0.1.4...v0.1.5

0.1.4

What's Changed

New features
* Expose adamax and adamaxw by nasyxx in https://github.com/deepmind/optax/pull/433 (see the sketch after this list)
* Support feeding in extra dictionary data in optax/experimental in https://github.com/deepmind/optax/pull/373
* Add NovoGrad optimizer by DT6A in https://github.com/deepmind/optax/pull/385
* added optimistic gradient descent in https://github.com/deepmind/optax/pull/387
* Add types to utils.py by atgctg in https://github.com/deepmind/optax/pull/367
* Add hyperparam_dtype override to hyperparam injection in https://github.com/deepmind/optax/pull/414
* added typing to linear_algebra.py by shivance in https://github.com/deepmind/optax/pull/413
* Add amsgrad optimizer by merajhashemi in https://github.com/deepmind/optax/pull/382
* Add Hinge Loss by heytanay in https://github.com/deepmind/optax/pull/409
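
A minimal sketch of two of the additions above, the `adamax` alias and the hinge loss; values and hyperparameters are placeholders, and the hinge-loss arguments follow the documented convention of targets in {-1, +1}.

```python
import jax.numpy as jnp
import optax

# Adamax, now available as a top-level alias (adamaxw adds weight decay).
params = {'w': jnp.zeros((2,))}
tx = optax.adamax(learning_rate=1e-3)
state = tx.init(params)

# Hinge loss on raw scores with targets in {-1, +1}.
scores = jnp.array([0.8, -0.3])
targets = jnp.array([1.0, -1.0])
loss = optax.hinge_loss(scores, targets).mean()
```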

Bug fixes
* [optax] Increase chex version requirement to fix issue 456. in https://github.com/deepmind/optax/pull/457
* Start inject_hyperparams step count at 0. in https://github.com/deepmind/optax/pull/416
* Add noise before multiplying by the learning rate (`noisy_sgd`) by atgctg in https://github.com/deepmind/optax/pull/369 (see the sketch after this list)
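
For the `noisy_sgd` fix above, a minimal usage sketch; the hyperparameters mirror the documented defaults and the gradients are placeholders. The fix means the injected noise is scaled by `eta`/`gamma` alone rather than additionally shrunk by a small learning rate.

```python
import jax.numpy as jnp
import optax

tx = optax.noisy_sgd(learning_rate=1e-2, eta=0.01, gamma=0.55, seed=0)
params = {'w': jnp.zeros((3,))}
state = tx.init(params)

grads = {'w': jnp.ones((3,))}  # placeholder gradients
updates, state = tx.update(grads, state)
params = optax.apply_updates(params, updates)
```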

Miscellaneous
* Remove flags from lookahead_mnist example. in https://github.com/deepmind/optax/pull/395
* Bring alias docstrings more in line with style guide. in https://github.com/deepmind/optax/pull/406
* Bugfix in alias_test: Adamax named but not tested. in https://github.com/deepmind/optax/pull/419
* Test that all optimizers can be wrapped in inject_hyperparams. in https://github.com/deepmind/optax/pull/420
* Add an example for gradient accumulation (see the sketch after this list). in https://github.com/deepmind/optax/pull/425
* [docs] Start adding numerical definitions to key parts of optax. in https://github.com/deepmind/optax/pull/430
* [optax] Add basic mnist example based on the lookahead_mnist example. in https://github.com/deepmind/optax/pull/436
* Install dependencies for dp-accounting manually. in https://github.com/deepmind/optax/pull/375
* Update documentation for AdamW in https://github.com/deepmind/optax/pull/376
* [Docs] softmax_cross_entropy_with_integer_label by lkhphuc in https://github.com/deepmind/optax/pull/351
* Update Returns section in gradient transformations' docstrings. in https://github.com/deepmind/optax/pull/388
* Update logo and theme for the documentation in https://github.com/deepmind/optax/pull/391
* Set the test version of flax used in the equivalence test. in https://github.com/deepmind/optax/pull/398
* Add holounic to contributors list. in https://github.com/deepmind/optax/pull/400
* Bring transform docstrings more in line with style guide. in https://github.com/deepmind/optax/pull/405
* Update citation. in https://github.com/deepmind/optax/pull/407
* Refine the doc of sigmoid_binary_cross_entropy to not assume the meaning of last dimension. in https://github.com/deepmind/optax/pull/418
* fix `integer_pow` recompilation in `_bias_correction` by epignatelli in https://github.com/deepmind/optax/pull/329
* Use auto instead of `/proc/cpuinfo`. in https://github.com/deepmind/optax/pull/454
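
The gradient-accumulation example added above lives in the repository; independently of it, here is a minimal sketch of the same idea using `optax.MultiSteps`, which accumulates gradients and applies the inner update only every k calls.

```python
import jax.numpy as jnp
import optax

# Apply the inner adam update once every 4 micro-batches; in between,
# MultiSteps accumulates gradients and emits all-zero updates.
tx = optax.MultiSteps(optax.adam(1e-3), every_k_schedule=4)
params = {'w': jnp.zeros((3,))}
state = tx.init(params)

for _ in range(4):
    grads = {'w': jnp.ones((3,))}  # placeholder micro-batch gradients
    updates, state = tx.update(grads, state, params)
    params = optax.apply_updates(params, updates)
```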

New Contributors
* atgctg made their first contribution in https://github.com/deepmind/optax/pull/367
* nasyxx made their first contribution in https://github.com/deepmind/optax/pull/433
* shivance made their first contribution in https://github.com/deepmind/optax/pull/413
* epignatelli made their first contribution in https://github.com/deepmind/optax/pull/329
* merajhashemi made their first contribution in https://github.com/deepmind/optax/pull/382
* heytanay made their first contribution in https://github.com/deepmind/optax/pull/409

**Full Changelog**: https://github.com/deepmind/optax/compare/v0.1.3...v0.1.4

0.1.3

What's Changed
* Do not use None in optax.masked. in https://github.com/deepmind/optax/pull/338
* Implement Adamax and AdamaxW optimizers. in https://github.com/deepmind/optax/pull/342
* Add Kullback-Leibler Divergence Loss by holounic in https://github.com/deepmind/optax/pull/309 (see the sketch after this list)
* Add optax.softmax_cross_entropy_with_integer_labels. in https://github.com/deepmind/optax/pull/343
* Publicize private methods in transform in https://github.com/deepmind/optax/pull/348
* Update .pylintrc in https://github.com/deepmind/optax/pull/354
* Support mixture of dtypes for parameters when clipping. in https://github.com/deepmind/optax/pull/355
* Update "jax.tree_util" functions in https://github.com/deepmind/optax/pull/370

New Contributors
* holounic made their first contribution in https://github.com/deepmind/optax/pull/309

**Full Changelog**: https://github.com/deepmind/optax/compare/v0.1.2...v0.1.3

0.1.2

What's Changed
* Improve the documentation and discoverability of `set_to_zero`. by copybara-service in https://github.com/deepmind/optax/pull/299
* Add pages to documentation for contributors. by copybara-service in https://github.com/deepmind/optax/pull/301
* Replace _build_simple_adam with _build_stateful_sgd. by n2cholas in https://github.com/deepmind/optax/pull/302
* Make masked preserve param structure by n2cholas in https://github.com/deepmind/optax/pull/300
* Allow setting a custom schedule for the second momentum in Adafactor. One useful case is setting an upper bound on the second momentum, which otherwise converges to 1.0 and effectively freezes it; this was used in https://arxiv.org/abs/2106.04560. by copybara-service in https://github.com/deepmind/optax/pull/303
* Export the CTC loss for public use by copybara-service in https://github.com/deepmind/optax/pull/305 (see the sketch after this list)
* Specify that mask must be static in masked wrapper docstring. by copybara-service in https://github.com/deepmind/optax/pull/306
* Clarify the docstring of cosine_decay_schedule. by copybara-service in https://github.com/deepmind/optax/pull/310
* Typo in LARS docstring. by Olamon in https://github.com/deepmind/optax/pull/311
* Fix adam's mu_dtype casting. by copybara-service in https://github.com/deepmind/optax/pull/313
* Fix docs rendering of loss functions. by grahamgower in https://github.com/deepmind/optax/pull/318
* Clarifies `optax.adamw(weight_decay)` parameter. by copybara-service in https://github.com/deepmind/optax/pull/322
* Enhance `ctc_loss` to return forward probs. by yotarok in https://github.com/deepmind/optax/pull/321
* Replace `jax.tree_multimap`, which is now deprecated, with `jax.tree_map`. by copybara-service in https://github.com/deepmind/optax/pull/330
* Rename argument names for CTC loss functions. by yotarok in https://github.com/deepmind/optax/pull/331
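
Since this release makes the CTC loss public, here is a minimal sketch of a call with placeholder data; the argument names follow the renamed form from the last item above, and the padding convention (0.0 = valid, 1.0 = padding) follows the current docs.

```python
import jax.numpy as jnp
import optax

# Batch of 1, 5 time steps, 4 output classes (blank id defaults to 0).
logits = jnp.zeros((1, 5, 4))
logit_paddings = jnp.zeros((1, 5))   # 0.0 = valid frame, 1.0 = padding
labels = jnp.array([[1, 2, 3]])
label_paddings = jnp.zeros((1, 3))   # 0.0 = valid label, 1.0 = padding

loss = optax.ctc_loss(logits, logit_paddings, labels, label_paddings)
```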

New Contributors
* Olamon made their first contribution in https://github.com/deepmind/optax/pull/311
* grahamgower made their first contribution in https://github.com/deepmind/optax/pull/318
* yotarok made their first contribution in https://github.com/deepmind/optax/pull/321

**Full Changelog**: https://github.com/deepmind/optax/compare/v0.1.1...v0.1.2
