Optax

Latest version: v0.2.4


0.1.8

What's Changed
* Remove legacy symbols. by copybara-service in https://github.com/google-deepmind/optax/pull/397
* Move second order utilities to dedicated sub-package. by copybara-service in https://github.com/google-deepmind/optax/pull/588
* Move stochastic gradient estimation functions to separate monte_carlo subpackage. by copybara-service in https://github.com/google-deepmind/optax/pull/589
* Move losses to dedicated sub-package. by copybara-service in https://github.com/google-deepmind/optax/pull/590
* Note TF's adam implementation difference by vwxyzjn in https://github.com/google-deepmind/optax/pull/572
* Note RMSProp PyTorch vs Optax implementation difference by vwxyzjn in https://github.com/google-deepmind/optax/pull/595
* Move complex_valued from experimental/ to contrib/. by copybara-service in https://github.com/google-deepmind/optax/pull/602
* Make radam docstring point to radam paper. by carlosgmartin in https://github.com/google-deepmind/optax/pull/510
* COntinuous COin Betting algorithm by albcab in https://github.com/google-deepmind/optax/pull/536
* Update second_order package internal structure. by copybara-service in https://github.com/google-deepmind/optax/pull/625
* Update github URL + nicer console highlighting by copybara-service in https://github.com/google-deepmind/optax/pull/628
* Add import stub. by copybara-service in https://github.com/google-deepmind/optax/pull/633
* Start optax.projections subpackage and add projection_non_negative. by copybara-service in https://github.com/google-deepmind/optax/pull/632
* Also apply masking to compatible extra_args. by copybara-service in https://github.com/google-deepmind/optax/pull/620
* Add tree_vdot to tree_utils. by copybara-service in https://github.com/google-deepmind/optax/pull/634
* Add: merge multistep and apply_every logic by celiolarcher in https://github.com/google-deepmind/optax/pull/596
* Remove redundant requirements folder by copybara-service in https://github.com/google-deepmind/optax/pull/643
* Run pytype on Python 3.11 by fabianp in https://github.com/google-deepmind/optax/pull/657
* Move MLP MNIST example to Colab notebook by copybara-service in https://github.com/google-deepmind/optax/pull/658
* Move DP-SGD example to Colab by copybara-service in https://github.com/google-deepmind/optax/pull/660
* Add tensorflow as dependency for the examples by copybara-service in https://github.com/google-deepmind/optax/pull/661
* Add tensorflow-datasets to doc requirements by copybara-service in https://github.com/google-deepmind/optax/pull/662
* D-Adaptation and Prodigy contrib implementations by adefazio in https://github.com/google-deepmind/optax/pull/651
* Fix typing issue in prodigy.py for python 3.9 by copybara-service in https://github.com/google-deepmind/optax/pull/671
* Fix CIFAR10 Flax&Optax example name in the gallery by copybara-service in https://github.com/google-deepmind/optax/pull/672
* Move flax example to Colab by copybara-service in https://github.com/google-deepmind/optax/pull/670
* update citation url since http://github.com/deepmind is now empty by copybara-service in https://github.com/google-deepmind/optax/pull/667
* Move Haiku example to Colab notebook by copybara-service in https://github.com/google-deepmind/optax/pull/673
* run pytype also on the contrib directory by copybara-service in https://github.com/google-deepmind/optax/pull/675
* Add reduce_on_plateau LR scheduler to contrib directory. by vz415 in https://github.com/google-deepmind/optax/pull/629
* Add to .gitignore temporary directories of building the docs. by fabianp in https://github.com/google-deepmind/optax/pull/680
* Add the SAM example to the example gallery by fabianp in https://github.com/google-deepmind/optax/pull/681
* Add reduce_on_plateau example (https://github.com/google-deepmind/optax/issues/679) by copybara-service in https://github.com/google-deepmind/optax/pull/683
* Fix wrong image path in reduce_on_plateau example by copybara-service in https://github.com/google-deepmind/optax/pull/686
* Remove execution timeout (https://github.com/google-deepmind/optax/issues/687) by copybara-service in https://github.com/google-deepmind/optax/pull/689
* Documentation Typo Fixes for apply_if_finite by mharradon in https://github.com/google-deepmind/optax/pull/696
* Fix typos in public README by copybara-service in https://github.com/google-deepmind/optax/pull/691
* Add adversarial training example by mmhamdy in https://github.com/google-deepmind/optax/pull/682
* Divide API documentation by MaanasArora in https://github.com/google-deepmind/optax/pull/688
* Avoid scalar conversion of non-scalar arrays by copybara-service in https://github.com/google-deepmind/optax/pull/693
* Improving docstrings for schedules. by copybara-service in https://github.com/google-deepmind/optax/pull/692
* Fix `TypeError` in `contrib.reduce_on_plateau()` when x64 is enabled by stefanocortinovis in https://github.com/google-deepmind/optax/pull/697
* Fix some doc links in README.md and other files (issue 665) by Bigpet in https://github.com/google-deepmind/optax/pull/706
* replace branch master by main by copybara-service in https://github.com/google-deepmind/optax/pull/709
* New optax release by copybara-service in https://github.com/google-deepmind/optax/pull/701

New Contributors
* vwxyzjn made their first contribution in https://github.com/google-deepmind/optax/pull/572
* carlosgmartin made their first contribution in https://github.com/google-deepmind/optax/pull/510
* albcab made their first contribution in https://github.com/google-deepmind/optax/pull/536
* celiolarcher made their first contribution in https://github.com/google-deepmind/optax/pull/596
* adefazio made their first contribution in https://github.com/google-deepmind/optax/pull/651
* vz415 made their first contribution in https://github.com/google-deepmind/optax/pull/629
* mharradon made their first contribution in https://github.com/google-deepmind/optax/pull/696
* mmhamdy made their first contribution in https://github.com/google-deepmind/optax/pull/682
* MaanasArora made their first contribution in https://github.com/google-deepmind/optax/pull/688
* stefanocortinovis made their first contribution in https://github.com/google-deepmind/optax/pull/697
* Bigpet made their first contribution in https://github.com/google-deepmind/optax/pull/706

**Full Changelog**: https://github.com/google-deepmind/optax/compare/v0.1.7...v0.1.8

0.1.7

What's Changed
* Remove deprecated field from pyproject.toml, which should resolve an issue with deployment

**Full Changelog**: https://github.com/deepmind/optax/compare/v0.1.6...v0.1.7

0.1.6

What's Changed
* feat: Add support for is_leaf to tree_map_params, so that `tree_map_params` works with `optax.masked`.
* feat: migrate to `pyproject.toml` for building package by SauravMaheshkar in https://github.com/deepmind/optax/pull/513
* fix(README): small typo in README by jeertmans in https://github.com/deepmind/optax/pull/495
* Add support for PolyLoss (ICLR 2022) by acforvs in https://github.com/deepmind/optax/pull/467

New Contributors
* jeertmans made their first contribution in https://github.com/deepmind/optax/pull/495

**Full Changelog**: https://github.com/deepmind/optax/compare/v0.1.5...v0.1.6

0.1.5

What's Changed
* Fix arXiv link to Optax Optimistic Gradient Descent optimizer by 8bitmp3 in https://github.com/deepmind/optax/pull/458
* Fix the Yogi optimizer paper year, change link to NeurIPS site by 8bitmp3 in https://github.com/deepmind/optax/pull/461
* Add exponent to cosine decay schedule and warmup + cosine decay by copybara-service in https://github.com/deepmind/optax/pull/476
* Fix typos in docstring by pomonam in https://github.com/deepmind/optax/pull/480
* Fix global_norm() signature by brentyi in https://github.com/deepmind/optax/pull/481
* Fix `inject_hyperparams()` for python < 3.10. by copybara-service in https://github.com/deepmind/optax/pull/486
* fixed NaN issues in `kl_divergence` loss function by LukasMut in https://github.com/deepmind/optax/pull/473
* feat(ci/tests): bump `setup-python` version and enable cache by SauravMaheshkar in https://github.com/deepmind/optax/pull/485
* Better tests for utils by acforvs in https://github.com/deepmind/optax/pull/465
* Run Github CI every day at 03:00. by copybara-service in https://github.com/deepmind/optax/pull/490
* Fix JIT for `piecewise_interpolate_schedule`, `cosine_onecycle_schedule`, `linear_onecycle_schedule` by brentyi in https://github.com/deepmind/optax/pull/504
* Explicitly export "softmax_cross_entropy_with_integer_labels" by nasyxx in https://github.com/deepmind/optax/pull/499
* Add the Lion optimizer, discovered by symbolic program search. by copybara-service in https://github.com/deepmind/optax/pull/500
* Replaces references to jax.numpy.DeviceArray with jax.Array. by copybara-service in https://github.com/deepmind/optax/pull/511
* Update pytypes. by copybara-service in https://github.com/deepmind/optax/pull/514
* Fix pytype failures related to teaching pytype about NumPy scalar types. by copybara-service in https://github.com/deepmind/optax/pull/517
* Release v0.1.5. by copybara-service in https://github.com/deepmind/optax/pull/523

New Contributors
* pomonam made their first contribution in https://github.com/deepmind/optax/pull/480
* brentyi made their first contribution in https://github.com/deepmind/optax/pull/481
* LukasMut made their first contribution in https://github.com/deepmind/optax/pull/473
* SauravMaheshkar made their first contribution in https://github.com/deepmind/optax/pull/485
* acforvs made their first contribution in https://github.com/deepmind/optax/pull/465

**Full Changelog**: https://github.com/deepmind/optax/compare/v0.1.4...v0.1.5

0.1.4

What's Changed

New features
* Expose adamax and adamaxw by nasyxx in https://github.com/deepmind/optax/pull/433
* Support feeding in extra dictionary data in optax/experimental in https://github.com/deepmind/optax/pull/373
* Add NovoGrad optimizer by DT6A in https://github.com/deepmind/optax/pull/385
* added optimistic gradient descent in https://github.com/deepmind/optax/pull/387
* Add types to utils.py by atgctg in https://github.com/deepmind/optax/pull/367
* Add hyperparam_dtype override to hyperparam injection in https://github.com/deepmind/optax/pull/414
* added typing to linear_algebra.py by shivance in https://github.com/deepmind/optax/pull/413
* Add amsgrad optimizer by merajhashemi in https://github.com/deepmind/optax/pull/382
* Add Hinge Loss by heytanay in https://github.com/deepmind/optax/pull/409

Bug fixes
* [optax] Increase chex version requirement to fix issue 456. in https://github.com/deepmind/optax/pull/457
* Start inject_hyperparams step count at 0. in https://github.com/deepmind/optax/pull/416
* Add noise before multiplying by the learning rate (`noisy_sgd`) by atgctg in https://github.com/deepmind/optax/pull/369

Miscellaneous
* Remove flags from lookahead_mnist example. in https://github.com/deepmind/optax/pull/395
* Bring alias docstrings more in line with style guide. in https://github.com/deepmind/optax/pull/406
* Bugfix in alias_test: Adamax named but not tested. in https://github.com/deepmind/optax/pull/419
* Test that all optimizers can be wrapped in inject_hyperparams. in https://github.com/deepmind/optax/pull/420
* Add an example for gradient accumulation. in https://github.com/deepmind/optax/pull/425
* [docs] Start adding numerical definitions to key parts of optax. in https://github.com/deepmind/optax/pull/430
* [optax] Add basic mnist example based on the lookahead_mnist example. in https://github.com/deepmind/optax/pull/436
* Install dependencies for dp-accounting manually. in https://github.com/deepmind/optax/pull/375
* Update documentation for AdamW in https://github.com/deepmind/optax/pull/376
* [Docs] softmax_cross_entropy_with_integer_label by lkhphuc in https://github.com/deepmind/optax/pull/351
* Update Returns section in gradient transformations' docstrings. in https://github.com/deepmind/optax/pull/388
* Update logo and theme for the documentation in https://github.com/deepmind/optax/pull/391
* Set the test version of flax used in the equivalence test. in https://github.com/deepmind/optax/pull/398
* Add holounic to contributors list. in https://github.com/deepmind/optax/pull/400
* Bring transform docstrings more in line with style guide. in https://github.com/deepmind/optax/pull/405
* Update citation. in https://github.com/deepmind/optax/pull/407
* Refine the doc of sigmoid_binary_cross_entropy to not assume the meaning of last dimension. in https://github.com/deepmind/optax/pull/418
* fix `integer_pow` recompilation in `_bias_correction` by epignatelli in https://github.com/deepmind/optax/pull/329
* Use auto instead of `/proc/cpuinfo`. in https://github.com/deepmind/optax/pull/454

New Contributors
* atgctg made their first contribution in https://github.com/deepmind/optax/pull/367
* nasyxx made their first contribution in https://github.com/deepmind/optax/pull/433
* shivance made their first contribution in https://github.com/deepmind/optax/pull/413
* epignatelli made their first contribution in https://github.com/deepmind/optax/pull/329
* merajhashemi made their first contribution in https://github.com/deepmind/optax/pull/382
* heytanay made their first contribution in https://github.com/deepmind/optax/pull/409

**Full Changelog**: https://github.com/deepmind/optax/compare/v0.1.3...v0.1.4

0.1.3

What's Changed
* Do not use None in optax.masked. in https://github.com/deepmind/optax/pull/338
* Implement Adamax and AdamaxW optimizers. in https://github.com/deepmind/optax/pull/342
* Add Kullback-Leibler Divergence Loss by holounic in https://github.com/deepmind/optax/pull/309
* Add optax.softmax_cross_entropy_with_integer_labels. in https://github.com/deepmind/optax/pull/343
* Publicize private methods in transform in https://github.com/deepmind/optax/pull/348
* Update .pylintrc in https://github.com/deepmind/optax/pull/354
* Support mixture of dtypes for parameters when clipping. in https://github.com/deepmind/optax/pull/355
* Update "jax.tree_util" functions in https://github.com/deepmind/optax/pull/370

New Contributors
* holounic made their first contribution in https://github.com/deepmind/optax/pull/309

**Full Changelog**: https://github.com/deepmind/optax/compare/v0.1.2...v0.1.3
