What's Changed
* Beginning of 0.2.4 development by copybara-service in https://github.com/google-deepmind/optax/pull/1003
* Fix gallery display, add images by vroulet in https://github.com/google-deepmind/optax/pull/1005
* Fix docs for dog by jungtaekkim in https://github.com/google-deepmind/optax/pull/1008
* docs: remove `multi_normal` from utilities.rst by Abhinavcode13 in https://github.com/google-deepmind/optax/pull/1009
* Fix docs for dowg by jungtaekkim in https://github.com/google-deepmind/optax/pull/1013
* feat: add a mathematical description of the AdaGrad optimizer by Abhinavcode13 in https://github.com/google-deepmind/optax/pull/1011
* fix: refactor recent changes to the AdaGrad optimizer by Abhinavcode13 in https://github.com/google-deepmind/optax/pull/1016
* Make masking compatible with callable pytrees a la Equinox by copybara-service in https://github.com/google-deepmind/optax/pull/1015
* Enable zero learning rate for schedule-free optimization by copybara-service in https://github.com/google-deepmind/optax/pull/1018
* keep a local .pylintrc file by fabianp in https://github.com/google-deepmind/optax/pull/1024
* Add bias_correction and eps_in_sqrt options to rmsprop and associated transforms by copybara-service in https://github.com/google-deepmind/optax/pull/1019
* Replace `adam(b1=0)` with `rmsprop` for schedule_free by copybara-service in https://github.com/google-deepmind/optax/pull/1025
* Update init value for zakharov problem from 1e4 to 1e3 by copybara-service in https://github.com/google-deepmind/optax/pull/1027
* Fix typo by gil2rok in https://github.com/google-deepmind/optax/pull/1030
* Updated docs for `cosine_decay_schedule` by bhargavyagnik in https://github.com/google-deepmind/optax/pull/1032
* feat: add mathematical notation docs for the SM3 optimizer by Abhinavcode13 in https://github.com/google-deepmind/optax/pull/1012
* DOC: misc improvements in docstring of softmax_cross_entropy* by copybara-service in https://github.com/google-deepmind/optax/pull/1033
* add doctest to constant_schedule by fabianp in https://github.com/google-deepmind/optax/pull/1034
* Add axis and where arguments to loss functions (see the sketch after this list) by carlosgmartin in https://github.com/google-deepmind/optax/pull/912
* Fix doctest error in make_fenchel_young_loss by copybara-service in https://github.com/google-deepmind/optax/pull/1035
* add doctest for polynomial_schedule by fabianp in https://github.com/google-deepmind/optax/pull/1037
* add missing schedule_free_* methods (see the sketch after this list) by fabianp in https://github.com/google-deepmind/optax/pull/1043
* fix error in softmax_cross_entropy formula by fabianp in https://github.com/google-deepmind/optax/pull/1041
* Fix typo in formula of cosine_decay_schedule by fabianp in https://github.com/google-deepmind/optax/pull/1044
* schedule_free: fix broadcasting of scalar arrays to 1d arrays by n-gao in https://github.com/google-deepmind/optax/pull/1042
* Update polynomial_schedule doctest per vroulet's feedback by fabianp in https://github.com/google-deepmind/optax/pull/1045
* Fix linting schedule_free_test by copybara-service in https://github.com/google-deepmind/optax/pull/1048
* more robust tests by copybara-service in https://github.com/google-deepmind/optax/pull/1050
* Generalizes safe_int32_increment to safe_increment (see the sketch after this list) by copybara-service in https://github.com/google-deepmind/optax/pull/1054
* Add dtype option to tree_random_like by copybara-service in https://github.com/google-deepmind/optax/pull/1056
* Add double precision tests for safe_increment and fix warnings on float64_test.py by copybara-service in https://github.com/google-deepmind/optax/pull/1055
* Add optax.tree_utils.tree_random_split. by carlosgmartin in https://github.com/google-deepmind/optax/pull/1063
* Fix test.sh, which uses set -u, so that it works when JAX_VERSION is unset. by carlosgmartin in https://github.com/google-deepmind/optax/pull/1070
* Migrate from jax.tree_util legacy APIs to new jax.tree API. by carlosgmartin in https://github.com/google-deepmind/optax/pull/1066
* Ensure optimizers return updates of same dtype as params. by copybara-service in https://github.com/google-deepmind/optax/pull/1060
* Fix test.sh to not modify .pylintrc. by carlosgmartin in https://github.com/google-deepmind/optax/pull/1071
* Replace deprecated typing.Hashable with collections.abc.Hashable. by carlosgmartin in https://github.com/google-deepmind/optax/pull/1068
* Relax absolute tolerance for failing tests involving chex.assert_trees_all_close. by carlosgmartin in https://github.com/google-deepmind/optax/pull/1069
* Fix doctests by copybara-service in https://github.com/google-deepmind/optax/pull/1073
* Tidy up test.sh and make it clean up properly. by carlosgmartin in https://github.com/google-deepmind/optax/pull/1074
* Add missing initializer argument of 0 to tree_reduce in tree_vdot and tree_sum. by carlosgmartin in https://github.com/google-deepmind/optax/pull/1065
* bump chex version for #1076 by fabianp in https://github.com/google-deepmind/optax/pull/1078
* correct RST references by fabianp in https://github.com/google-deepmind/optax/pull/1079
* deprecate methods in `optax/monte_carlo`. by copybara-service in https://github.com/google-deepmind/optax/pull/1076
* schedule-free optimizer: ensure it's possible to donate both the state and the params by enolan in https://github.com/google-deepmind/optax/pull/1059
* add link to examples from docstring by fabianp in https://github.com/google-deepmind/optax/pull/1085
* Adopt US spelling for documentation and fix typos by miguelcsx in https://github.com/google-deepmind/optax/pull/1087
* Update docs: Note RMSprop usage instead of Adam for memory savings in… by nasyxx in https://github.com/google-deepmind/optax/pull/1086
* Add a perturbations module that can take pytrees as inputs by copybara-service in https://github.com/google-deepmind/optax/pull/827
* Fix initial step of scale_by_optimistic_gradient. by carlosgmartin in https://github.com/google-deepmind/optax/pull/1088
* Add Hungarian algorithm for the linear assignment problem (see the sketch after this list) by carlosgmartin in https://github.com/google-deepmind/optax/pull/1083
* Allow safe_increment to handle unsigned integers. by carlosgmartin in https://github.com/google-deepmind/optax/pull/1092
* Fix formatting issues with gallery entry for linear assignment problem. by carlosgmartin in https://github.com/google-deepmind/optax/pull/1095
* use sphinx references instead of hardcoded links. by fabianp in https://github.com/google-deepmind/optax/pull/1096
* Remove dtype safeguards by copybara-service in https://github.com/google-deepmind/optax/pull/1099
* cosmetic improvements to the perturbations module by fabianp in https://github.com/google-deepmind/optax/pull/1097
* update jax.tree.map to comply with jax 0.4.34 by a1302z in https://github.com/google-deepmind/optax/pull/1094
* Add Adan optimizer (see the sketch after this list) by carlosgmartin in https://github.com/google-deepmind/optax/pull/1090
* Fix typo in projection_simplex docstring. by copybara-service in https://github.com/google-deepmind/optax/pull/1105
* add config for link checking, and mark 429 (too many requests) as fine by fabianp in https://github.com/google-deepmind/optax/pull/1103
* Fix docstring for hungarian_algorithm. by carlosgmartin in https://github.com/google-deepmind/optax/pull/1102
* Add optax.optimistic_adam. by carlosgmartin in https://github.com/google-deepmind/optax/pull/1089
* Add projection_l1_sphere and projection_l1_ball (see the sketch after this list) by copybara-service in https://github.com/google-deepmind/optax/pull/1106
* Add projection_l2_sphere and projection_l2_ball. by copybara-service in https://github.com/google-deepmind/optax/pull/1114
* Add tree_max. by copybara-service in https://github.com/google-deepmind/optax/pull/1115
* Add projection_linf_ball. by copybara-service in https://github.com/google-deepmind/optax/pull/1117
* remove test that leaked jax tracers by copybara-service in https://github.com/google-deepmind/optax/pull/1123
* Add a mathematical description for Lion by aman2304 in https://github.com/google-deepmind/optax/pull/1121
* Fix the sign for the update in the math equation for nadam in the docs. by carlosgmartin in https://github.com/google-deepmind/optax/pull/1128
* Fix the `ntxent` loss by GrantMcConachie in https://github.com/google-deepmind/optax/pull/946
* Add Nesterov momentum to AdaBelief optimizer. by carlosgmartin in https://github.com/google-deepmind/optax/pull/1127
* fix: Coherent dtypes of updates with and without MultiSteps by hlzl in https://github.com/google-deepmind/optax/pull/1122
* Fix AdaBelief implementation. by carlosgmartin in https://github.com/google-deepmind/optax/pull/1130
* Revisiting linesearches and LBFGS (see the sketch after this list) by copybara-service in https://github.com/google-deepmind/optax/pull/1133
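
The schedule-free entries above (#1018, #1025, #1042, #1043, #1059) all touch the wrappers in `optax.contrib`. A minimal sketch of the usage pattern, assuming the `schedule_free_adamw` and `schedule_free_eval_params` interfaces exported there (see the API docs for exact signatures):

```python
import jax
import jax.numpy as jnp
import optax

def loss_fn(params):
  return jnp.sum(params['w'] ** 2)

params = {'w': jnp.ones(3)}
opt = optax.contrib.schedule_free_adamw(learning_rate=1e-3)
state = opt.init(params)

for _ in range(100):
  grads = jax.grad(loss_fn)(params)
  updates, state = opt.update(grads, state, params)
  params = optax.apply_updates(params, updates)

# Evaluation should use the schedule-free averaged iterate, not `params`.
eval_params = optax.contrib.schedule_free_eval_params(state, params)
```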
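#912 adds `axis` and `where` arguments to the loss functions. A hedged sketch, assuming `where` broadcasts against the inputs and masks entries out of the reduction:

```python
import jax.numpy as jnp
import optax

logits = jnp.array([[2.0, 0.5, -1.0],
                    [0.0, 1.5, 0.3]])
labels = jnp.array([[1.0, 0.0, 0.0],
                    [0.0, 1.0, 0.0]])

# `axis` names the class dimension; `where` drops masked entries from the
# reduction (here: ignore the last class, e.g. a padding class).
mask = jnp.array([True, True, False])
losses = optax.losses.softmax_cross_entropy(
    logits, labels, axis=-1, where=mask)
```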
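#1054, #1092, #1056, #1063, and #1115 extend the numeric and tree utilities. An illustrative sketch, assuming the names below are exported from `optax` and `optax.tree_utils`:

```python
import jax
import jax.numpy as jnp
import optax

params = {'b': jnp.zeros(3), 'w': jnp.ones((2, 3))}

# safe_increment generalizes safe_int32_increment: it increments a step
# counter but saturates at the dtype's maximum instead of overflowing,
# now also for unsigned integer counters.
step = optax.safe_increment(jnp.asarray(0, dtype=jnp.uint32))

key = jax.random.PRNGKey(0)

# tree_random_like samples a pytree shaped like `params`; the new `dtype`
# argument overrides the dtype of the sampled leaves.
noise = optax.tree_utils.tree_random_like(key, params, dtype=jnp.bfloat16)

# tree_random_split splits a key into one subkey per leaf of `params`.
keys = optax.tree_utils.tree_random_split(key, params)

# tree_max reduces a pytree to the maximum over its leaves.
largest = optax.tree_utils.tree_max(params)
```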
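#1106, #1114, and #1117 grow the projections module. A sketch assuming each projection takes a pytree and a radius `scale`:

```python
import jax.numpy as jnp
import optax

params = {'w': jnp.array([1.0, -2.0, 3.0]), 'b': jnp.array([0.5])}

# Project a pytree onto the l1 ball of radius 1.0; projection_l2_ball and
# projection_linf_ball follow the same pattern, and the *_sphere variants
# project onto the boundary rather than the ball.
projected = optax.projections.projection_l1_ball(params, scale=1.0)
```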
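#1083 adds the Hungarian algorithm for the linear assignment problem. A hedged sketch, assuming `optax.assignment.hungarian_algorithm` returns the matched row and column indices:

```python
import jax.numpy as jnp
import optax

# costs[i, j] is the cost of assigning worker i to job j.
costs = jnp.array([[4.0, 1.0, 3.0],
                   [2.0, 0.0, 5.0],
                   [3.0, 2.0, 2.0]])

i, j = optax.assignment.hungarian_algorithm(costs)
total_cost = costs[i, j].sum()  # minimal total assignment cost
```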
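#1090 and #1089 add the Adan and optimistic Adam optimizers; both are gradient transformations that drop into the usual init/update loop:

```python
import jax
import jax.numpy as jnp
import optax

def loss_fn(params):
  return jnp.sum((params['w'] - 1.0) ** 2)

params = {'w': jnp.zeros(3)}
opt = optax.adan(learning_rate=1e-2)  # or: optax.optimistic_adam(1e-2)
state = opt.init(params)

for _ in range(100):
  grads = jax.grad(loss_fn)(params)
  updates, state = opt.update(grads, state, params)
  params = optax.apply_updates(params, updates)
```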
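#1133 revisits the linesearches and LBFGS. A hedged sketch of the intended pattern, pairing `optax.lbfgs` with `optax.value_and_grad_from_state` so the linesearch can reuse values already stored in the optimizer state:

```python
import jax.numpy as jnp
import optax

def loss_fn(params):
  return jnp.sum((params - 1.0) ** 2)

params = jnp.zeros(3)
opt = optax.lbfgs()
state = opt.init(params)
value_and_grad = optax.value_and_grad_from_state(loss_fn)

for _ in range(10):
  value, grad = value_and_grad(params, state=state)
  updates, state = opt.update(
      grad, state, params, value=value, grad=grad, value_fn=loss_fn)
  params = optax.apply_updates(params, updates)
```
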
New Contributors
* vroulet made their first contribution in https://github.com/google-deepmind/optax/pull/1005
* jungtaekkim made their first contribution in https://github.com/google-deepmind/optax/pull/1008
* Abhinavcode13 made their first contribution in https://github.com/google-deepmind/optax/pull/1009
* gil2rok made their first contribution in https://github.com/google-deepmind/optax/pull/1030
* bhargavyagnik made their first contribution in https://github.com/google-deepmind/optax/pull/1032
* n-gao made their first contribution in https://github.com/google-deepmind/optax/pull/1042
* enolan made their first contribution in https://github.com/google-deepmind/optax/pull/1059
* miguelcsx made their first contribution in https://github.com/google-deepmind/optax/pull/1087
* a1302z made their first contribution in https://github.com/google-deepmind/optax/pull/1094
* aman2304 made their first contribution in https://github.com/google-deepmind/optax/pull/1121
* hlzl made their first contribution in https://github.com/google-deepmind/optax/pull/1122
**Full Changelog**: https://github.com/google-deepmind/optax/compare/v0.2.3...v0.2.4