Optax

Latest version: v0.2.4

0.2.4

What's Changed
* Beginning of 0.2.4 development by copybara-service in https://github.com/google-deepmind/optax/pull/1003
* Fix gallery display, add images by vroulet in https://github.com/google-deepmind/optax/pull/1005
* Fix docs for dog by jungtaekkim in https://github.com/google-deepmind/optax/pull/1008
* docs: remove `multi_normal` from utilities.rst by Abhinavcode13 in https://github.com/google-deepmind/optax/pull/1009
* Fix docs for dowg by jungtaekkim in https://github.com/google-deepmind/optax/pull/1013
* feat: add a mathematical description of AdaGrad optimizer by Abhinavcode13 in https://github.com/google-deepmind/optax/pull/1011
* fix: refactor AdaGrad optimizer recent changes by Abhinavcode13 in https://github.com/google-deepmind/optax/pull/1016
* Make masking compatible with callable pytrees a la Equinox by copybara-service in https://github.com/google-deepmind/optax/pull/1015
* Enable zero lrate for schedule-free optimization by copybara-service in https://github.com/google-deepmind/optax/pull/1018
* keep a local .pylintrc file by fabianp in https://github.com/google-deepmind/optax/pull/1024
* Add bias_correction and eps_in_sqrt options to rmsprop and associated transforms by copybara-service in https://github.com/google-deepmind/optax/pull/1019
* Replace adam(b1=0) by rmsprop for schedule_free by copybara-service in https://github.com/google-deepmind/optax/pull/1025
* Update init value for zakharov problem from 1e4 to 1e3 by copybara-service in https://github.com/google-deepmind/optax/pull/1027
* Fix typo by gil2rok in https://github.com/google-deepmind/optax/pull/1030
* Updated docs cosine_decay_schedule by bhargavyagnik in https://github.com/google-deepmind/optax/pull/1032
* feat: add mathematical notation docs of SM3 optimizer by Abhinavcode13 in https://github.com/google-deepmind/optax/pull/1012
* DOC: misc improvements in docstring of softmax_cross_entropy* by copybara-service in https://github.com/google-deepmind/optax/pull/1033
* add doctest to constant_schedule by fabianp in https://github.com/google-deepmind/optax/pull/1034
* Add axis and where arguments to loss functions. by carlosgmartin in https://github.com/google-deepmind/optax/pull/912
* Fix doctest error in make_fenchel_young_loss by copybara-service in https://github.com/google-deepmind/optax/pull/1035
* add doctest for polynomial_schedule by fabianp in https://github.com/google-deepmind/optax/pull/1037
* add missing schedule_free_* methods by fabianp in https://github.com/google-deepmind/optax/pull/1043 (see the usage sketch after this list)
* fix error in softmax_cross_entropy formula by fabianp in https://github.com/google-deepmind/optax/pull/1041
* Fix typo in formula of cosine_decay_schedule by fabianp in https://github.com/google-deepmind/optax/pull/1044
* schedule_free: fix broadcasting of scalar arrays to 1d arrays by n-gao in https://github.com/google-deepmind/optax/pull/1042
* Update polynomial_schedule doctest per vroulet's feedback by fabianp in https://github.com/google-deepmind/optax/pull/1045
* Fix linting schedule_free_test by copybara-service in https://github.com/google-deepmind/optax/pull/1048
* more robust tests by copybara-service in https://github.com/google-deepmind/optax/pull/1050
* Generalizes safe_int32_increment to safe_increment by copybara-service in https://github.com/google-deepmind/optax/pull/1054
* Add dtype option to tree_random_like by copybara-service in https://github.com/google-deepmind/optax/pull/1056
* Add double precision tests for safe_increment and fix warnings on float64_test.py by copybara-service in https://github.com/google-deepmind/optax/pull/1055
* Add optax.tree_utils.tree_random_split. by carlosgmartin in https://github.com/google-deepmind/optax/pull/1063
* Fix test.sh, which uses set -u, so that it works when JAX_VERSION is unset. by carlosgmartin in https://github.com/google-deepmind/optax/pull/1070
* Migrate from jax.tree_util legacy APIs to new jax.tree API. by carlosgmartin in https://github.com/google-deepmind/optax/pull/1066
* Ensure optimizers return updates of same dtype as params. by copybara-service in https://github.com/google-deepmind/optax/pull/1060
* Fix test.sh to not modify .pylintrc. by carlosgmartin in https://github.com/google-deepmind/optax/pull/1071
* Replace deprecated typing.Hashable with collections.abc.Hashable. by carlosgmartin in https://github.com/google-deepmind/optax/pull/1068
* Relax absolute tolerance for failing tests involving chex.assert_trees_all_close. by carlosgmartin in https://github.com/google-deepmind/optax/pull/1069
* Fix doctests by copybara-service in https://github.com/google-deepmind/optax/pull/1073
* Tidy up test.sh and make it clean up properly. by carlosgmartin in https://github.com/google-deepmind/optax/pull/1074
* Add missing initializer argument of 0 to tree_reduce in tree_vdot and tree_sum. by carlosgmartin in https://github.com/google-deepmind/optax/pull/1065
* bump chex version for 1076 by fabianp in https://github.com/google-deepmind/optax/pull/1078
* correct RST references by fabianp in https://github.com/google-deepmind/optax/pull/1079
* deprecate methods in `optax/monte_carlo`. by copybara-service in https://github.com/google-deepmind/optax/pull/1076
* schedule-free optimizer: ensure it's possible to donate both the state and the params by enolan in https://github.com/google-deepmind/optax/pull/1059
* add link to examples from docstring by fabianp in https://github.com/google-deepmind/optax/pull/1085
* Adopt US spelling for documentation and fix typos by miguelcsx in https://github.com/google-deepmind/optax/pull/1087
* Update docs: Note RMSprop usage instead of Adam for memory savings in… by nasyxx in https://github.com/google-deepmind/optax/pull/1086
* adding a perturbations module. Can take pytrees as inputs by copybara-service in https://github.com/google-deepmind/optax/pull/827
* Fix initial step of scale_by_optimistic_gradient. by carlosgmartin in https://github.com/google-deepmind/optax/pull/1088
* Add Hungarian algorithm for the linear assignment problem. by carlosgmartin in https://github.com/google-deepmind/optax/pull/1083
* Allow safe_increment to handle unsigned integers. by carlosgmartin in https://github.com/google-deepmind/optax/pull/1092
* Fix formatting issues with gallery entry for linear assignment problem. by carlosgmartin in https://github.com/google-deepmind/optax/pull/1095
* use sphinx references instead of hardcoded links. by fabianp in https://github.com/google-deepmind/optax/pull/1096
* Remove dtype safeguards by copybara-service in https://github.com/google-deepmind/optax/pull/1099
* cosmetic improvements perturbations module by fabianp in https://github.com/google-deepmind/optax/pull/1097
* update jax.tree.map to comply with jax 0.4.34 by a1302z in https://github.com/google-deepmind/optax/pull/1094
* Add Adan optimizer. by carlosgmartin in https://github.com/google-deepmind/optax/pull/1090
* Fix typo in projection_simplex docstring. by copybara-service in https://github.com/google-deepmind/optax/pull/1105
* add config for link checking, and mark 429 (too many requests) as fine by fabianp in https://github.com/google-deepmind/optax/pull/1103
* Fix docstring for hungarian_algorithm. by carlosgmartin in https://github.com/google-deepmind/optax/pull/1102
* Add optax.optimistic_adam. by carlosgmartin in https://github.com/google-deepmind/optax/pull/1089
* Add projection_l1_sphere and projection_l1_ball. by copybara-service in https://github.com/google-deepmind/optax/pull/1106 (see the projection sketch after this list)
* Add projection_l2_sphere and projection_l2_ball. by copybara-service in https://github.com/google-deepmind/optax/pull/1114
* Add tree_max. by copybara-service in https://github.com/google-deepmind/optax/pull/1115
* Add projection_linf_ball. by copybara-service in https://github.com/google-deepmind/optax/pull/1117
* remove test that leaked jax tracers by copybara-service in https://github.com/google-deepmind/optax/pull/1123
* Add a mathematical description for Lion by aman2304 in https://github.com/google-deepmind/optax/pull/1121
* Fix the sign for the update in the math equation for nadam in the docs. by carlosgmartin in https://github.com/google-deepmind/optax/pull/1128
* ntxent fix by GrantMcConachie in https://github.com/google-deepmind/optax/pull/946
* Add Nesterov momentum to AdaBelief optimizer. by carlosgmartin in https://github.com/google-deepmind/optax/pull/1127
* fix: Coherent dtypes of updates with and without MultiSteps by hlzl in https://github.com/google-deepmind/optax/pull/1122
* Fix AdaBelief implementation. by carlosgmartin in https://github.com/google-deepmind/optax/pull/1130
* Revisiting linesearches and LBFGS. by copybara-service in https://github.com/google-deepmind/optax/pull/1133
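
Several entries above touch the schedule-free optimizers in `optax.contrib` (#1018, #1025, #1042, #1043, #1059). The sketch below shows a plain training loop with the `schedule_free_adamw` wrapper; the constructor arguments and the `schedule_free_eval_params` helper are assumptions based on the PR titles, so check the 0.2.4 API reference before relying on them.

```python
# Minimal sketch of a schedule-free training loop, assuming the
# optax.contrib.schedule_free_adamw / schedule_free_eval_params API
# referenced in the PRs above.
import jax
import jax.numpy as jnp
import optax

def loss_fn(params, x, y):
    return jnp.mean((x @ params["w"] - y) ** 2)

params = {"w": jnp.zeros(3)}
opt = optax.contrib.schedule_free_adamw(learning_rate=1e-3)
state = opt.init(params)

x, y = jnp.ones((8, 3)), jnp.ones(8)
for _ in range(100):
    grads = jax.grad(loss_fn)(params, x, y)
    updates, state = opt.update(grads, state, params)
    params = optax.apply_updates(params, updates)

# Schedule-free keeps an averaged iterate inside the state; evaluation
# should use it rather than the raw training params.
eval_params = optax.contrib.schedule_free_eval_params(state, params)
```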
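The new projection helpers (#1106, #1114, #1115, #1117) add jaxopt-style projections onto norm balls and spheres. A minimal sketch, assuming they are exposed under `optax.projections`, take the pytree followed by the ball radius, and that `tree_max` lives in `optax.tree_utils`:

```python
# Minimal sketch of the projection utilities named in the PRs above;
# the module path and the positional radius argument are assumptions.
import jax
import jax.numpy as jnp
import optax

params = {"w": jnp.array([3.0, -4.0]), "b": jnp.array([2.0])}

# Project the whole pytree onto L1 and L2 balls of radius 1.
l1_proj = optax.projections.projection_l1_ball(params, 1.0)
l2_proj = optax.projections.projection_l2_ball(params, 1.0)

# tree_max (#1115) can then confirm no coordinate exceeds the radius.
print(optax.tree_utils.tree_max(jax.tree.map(jnp.abs, l2_proj)))
```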

New Contributors
* vroulet made their first contribution in https://github.com/google-deepmind/optax/pull/1005
* jungtaekkim made their first contribution in https://github.com/google-deepmind/optax/pull/1008
* Abhinavcode13 made their first contribution in https://github.com/google-deepmind/optax/pull/1009
* gil2rok made their first contribution in https://github.com/google-deepmind/optax/pull/1030
* bhargavyagnik made their first contribution in https://github.com/google-deepmind/optax/pull/1032
* n-gao made their first contribution in https://github.com/google-deepmind/optax/pull/1042
* enolan made their first contribution in https://github.com/google-deepmind/optax/pull/1059
* miguelcsx made their first contribution in https://github.com/google-deepmind/optax/pull/1087
* a1302z made their first contribution in https://github.com/google-deepmind/optax/pull/1094
* aman2304 made their first contribution in https://github.com/google-deepmind/optax/pull/1121
* hlzl made their first contribution in https://github.com/google-deepmind/optax/pull/1122

**Full Changelog**: https://github.com/google-deepmind/optax/compare/v0.2.3...v0.2.4

0.2.3

New Contributors
* gbruno16 made their first contribution in https://github.com/google-deepmind/optax/pull/894
* satyenkale made their first contribution in https://github.com/google-deepmind/optax/pull/901
* GrantMcConachie made their first contribution in https://github.com/google-deepmind/optax/pull/897
* fabian-sp made their first contribution in https://github.com/google-deepmind/optax/pull/721
* lukekulik made their first contribution in https://github.com/google-deepmind/optax/pull/974
* jakevdp made their first contribution in https://github.com/google-deepmind/optax/pull/987

**Full Changelog**: https://github.com/google-deepmind/optax/compare/v0.2.2...v0.2.3

0.2.2

What's Changed
* Added mathematical description to Noisy SGD by hmludwig in https://github.com/google-deepmind/optax/pull/857
* Use sphinx-contributors for an automated contributors list. by fabianp in https://github.com/google-deepmind/optax/pull/841
* Implementation of the Polyak SGD solver by copybara-service in https://github.com/google-deepmind/optax/pull/718
* Document the extra args of the update function in docstring by copybara-service in https://github.com/google-deepmind/optax/pull/864
* Utility to set value in a pytree (and so in state) by copybara-service in https://github.com/google-deepmind/optax/pull/865
* Added mathematical description to AdaBelief docstring by hmludwig in https://github.com/google-deepmind/optax/pull/869
* FIX RST formatting in inject hyperparams by fabianp in https://github.com/google-deepmind/optax/pull/867
* Warn that in future arguments after the initial (prediction, ground_truth) positional arguments will become keyword-only in optax losses. by copybara-service in https://github.com/google-deepmind/optax/pull/863
* Upstream missing jaxopt losses to optax - Part 2/N by copybara-service in https://github.com/google-deepmind/optax/pull/872
* Fix error `reduce_on_plateau.ipynb:20002: WARNING: No source code lexer found for notebook cell` by copybara-service in https://github.com/google-deepmind/optax/pull/875
* docstring cosmetic improvements by fabianp in https://github.com/google-deepmind/optax/pull/879
* Extend capabilities of tree_get, tree_set. by copybara-service in https://github.com/google-deepmind/optax/pull/878 (see the sketch after this list)
* [DOC] Add to the gallery an example on a small language model by copybara-service in https://github.com/google-deepmind/optax/pull/866
* Update reduce_on_plateau to handle training average loss. by copybara-service in https://github.com/google-deepmind/optax/pull/883
* Fix notebook reduce_on_plateau by copybara-service in https://github.com/google-deepmind/optax/pull/887
* ENH: extend power_iteration to accept a matrix in implicit form by copybara-service in https://github.com/google-deepmind/optax/pull/858
* Document changes in power_iteration by copybara-service in https://github.com/google-deepmind/optax/pull/889
* Release of version 0.2.2 by copybara-service in https://github.com/google-deepmind/optax/pull/892
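
The `tree_get`/`tree_set` utilities introduced and extended in #865 and #878 make it possible to read and rewrite named leaves, such as an injected learning rate, inside a nested optimizer state. A minimal sketch, assuming the keyword-based `tree_set` interface:

```python
# Minimal sketch: reading and overriding a hyperparameter stored in the
# optimizer state via optax.tree_utils.tree_get / tree_set.
import jax.numpy as jnp
import optax

params = {"w": jnp.zeros(3)}
opt = optax.inject_hyperparams(optax.sgd)(learning_rate=0.1)
state = opt.init(params)

# Read the injected learning rate out of the nested state.
lr = optax.tree_utils.tree_get(state, "learning_rate")

# Return a copy of the state with the learning rate replaced.
state = optax.tree_utils.tree_set(state, learning_rate=lr / 10)
```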


**Full Changelog**: https://github.com/google-deepmind/optax/compare/v0.2.1...v0.2.2

0.2.1

**Full Changelog**: https://github.com/google-deepmind/optax/compare/v0.2.0...v0.2.1

0.2.0

New Contributors
* JayJayleee made their first contribution in https://github.com/google-deepmind/optax/pull/762
* yixiaoer made their first contribution in https://github.com/google-deepmind/optax/pull/785
* mblondel made their first contribution in https://github.com/google-deepmind/optax/pull/790
* WangHanSolo made their first contribution in https://github.com/google-deepmind/optax/pull/782
* hmludwig made their first contribution in https://github.com/google-deepmind/optax/pull/804

**Full Changelog**: https://github.com/google-deepmind/optax/compare/v0.1.9...v0.2.0

0.1.9

What's Changed
* update URL github.com/deepmind -> github.com/google-deepmind and branch to main by copybara-service in https://github.com/google-deepmind/optax/pull/710
* Show CI results only for the main branch by fabianp in https://github.com/google-deepmind/optax/pull/716
* typo nesterov -> Nesterov by fabianp in https://github.com/google-deepmind/optax/pull/722
* Add `atol` option to `contrib.reduce_on_plateau()` by stefanocortinovis in https://github.com/google-deepmind/optax/pull/698
* add docs for tree_utils module by amosyou in https://github.com/google-deepmind/optax/pull/724
* Simplifications on the doc by copybara-service in https://github.com/google-deepmind/optax/pull/727
* add nadam and nadamw optimizers by copybara-service in https://github.com/google-deepmind/optax/pull/723 (see the sketch after this list)
* Add `versionadded` and `seealso` metadata to (n)adam(w) solvers by copybara-service in https://github.com/google-deepmind/optax/pull/729
* Enable doctests in sphinx and fix failing doctests by fabianp in https://github.com/google-deepmind/optax/pull/733
* Add missing members to the doc by copybara-service in https://github.com/google-deepmind/optax/pull/734
* FIX sphinx warnings "this document is not included in any toctree" by fabianp in https://github.com/google-deepmind/optax/pull/736
* feat(ci): drop `setup.py` from publishing CI by SauravMaheshkar in https://github.com/google-deepmind/optax/pull/737
* Minor tweak in pypi-publish.yml by fabianp in https://github.com/google-deepmind/optax/pull/739
* [TEST] Install virtual environment in current directory instead of /tmp by fabianp in https://github.com/google-deepmind/optax/pull/746
* migrate to pyproject by copybara-service in https://github.com/google-deepmind/optax/pull/747
* Deprecate optax.inject_stateful_hyperparameters and replace it with optax.inject_hyperparameters by copybara-service in https://github.com/google-deepmind/optax/pull/730
* Clarify inclusion criteria into optax and optax.contrib by copybara-service in https://github.com/google-deepmind/optax/pull/742
* fix the default learning rate in prodigy by konstmish in https://github.com/google-deepmind/optax/pull/740
* update and merge quickstart notebooks by amosyou in https://github.com/google-deepmind/optax/pull/726
* Remove redundant examples from README by mmhamdy in https://github.com/google-deepmind/optax/pull/754
* Instructions to build the doc and option to build the docs without executing the notebooks. by copybara-service in https://github.com/google-deepmind/optax/pull/759
* Remove license statement from notebooks by renuka010 in https://github.com/google-deepmind/optax/pull/764
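
The nadam/nadamw solvers added in #723 plug into the standard Optax update loop; like adamw, nadamw needs the params passed to `update` for weight decay. A minimal sketch with illustrative hyperparameters:

```python
# Minimal sketch of a jitted training step using the nadamw solver added in
# #723, chained with gradient clipping; hyperparameter values are illustrative.
import jax
import jax.numpy as jnp
import optax

def loss_fn(params, x, y):
    return jnp.mean((x @ params["w"] - y) ** 2)

params = {"w": jnp.zeros(3)}
opt = optax.chain(optax.clip_by_global_norm(1.0), optax.nadamw(learning_rate=1e-3))
state = opt.init(params)

@jax.jit
def step(params, state, x, y):
    grads = jax.grad(loss_fn)(params, x, y)
    updates, state = opt.update(grads, state, params)  # params needed for weight decay
    return optax.apply_updates(params, updates), state

params, state = step(params, state, jnp.ones((8, 3)), jnp.ones(8))
```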

New Contributors
* amosyou made their first contribution in https://github.com/google-deepmind/optax/pull/724
* konstmish made their first contribution in https://github.com/google-deepmind/optax/pull/740
* renuka010 made their first contribution in https://github.com/google-deepmind/optax/pull/764

**Full Changelog**: https://github.com/google-deepmind/optax/compare/v0.1.8...v0.1.9
