Optax

Latest version: v0.2.3


0.0.5

Changelog

*Note: this is the first GitHub release of Optax. It includes all changes since the repo was created.*

[Full Changelog](https://github.com/deepmind/optax/compare/d0ba208a300d68a3f4e380ed2a0b70848380988b...HEAD)

**Implemented enhancements:**

- Implement lookahead optimiser [\#17](https://github.com/deepmind/optax/issues/17)
- Implement support for Yogi optimiser [\#9](https://github.com/deepmind/optax/issues/9)
- Implement rectified Adam [\#8](https://github.com/deepmind/optax/issues/8)
- Implement gradient centralisation [\#7](https://github.com/deepmind/optax/issues/7)
- Implement scaling by AdaBelief [\#6](https://github.com/deepmind/optax/issues/6)

**Closed issues:**

- Multiple optimizers using optax [\59](https://github.com/deepmind/optax/issues/59)
- Change masked wrapper to use mask\_fn instead of mask [\57](https://github.com/deepmind/optax/issues/57)
- Prevent creating unnecessary momentum variables [\52](https://github.com/deepmind/optax/issues/52)
- Implement Differentially Private Stochastic Gradient Descent [\50](https://github.com/deepmind/optax/issues/50)
- RMSProp does not match original Tensorflow impl [\49](https://github.com/deepmind/optax/issues/49)
- JITted Adam results in NaN when setting decay to integer 0 [\46](https://github.com/deepmind/optax/issues/46)
- Option to not decay bias with additive\_weight\_decay [\25](https://github.com/deepmind/optax/issues/25)
- Support specifying end\_value for exponential\_decay [\21](https://github.com/deepmind/optax/issues/21)
- Schedules for Non-Learning Rate Hyper-parameters [\20](https://github.com/deepmind/optax/issues/20)
- Implement OneCycle Learning Rate Schedule [\19](https://github.com/deepmind/optax/issues/19)
- adam does not learn? [\18](https://github.com/deepmind/optax/issues/18)
- Which JAX-based libraries is optax compatible with? [\14](https://github.com/deepmind/optax/issues/14)
- Manually setting the learning\_rate? [\4](https://github.com/deepmind/optax/issues/4)

**Merged pull requests:**

- Fix pylint errors. [\#73](https://github.com/deepmind/optax/pull/73) ([copybara-service[bot]](https://github.com/apps/copybara-service))
- Add PyPI release workflow and increment the version. [\#70](https://github.com/deepmind/optax/pull/70) ([copybara-service[bot]](https://github.com/apps/copybara-service))
- Add flax to requirements for tests. [\#69](https://github.com/deepmind/optax/pull/69) ([copybara-service[bot]](https://github.com/apps/copybara-service))
- Add first flax equivalence test. [\#68](https://github.com/deepmind/optax/pull/68) ([copybara-service[bot]](https://github.com/apps/copybara-service))
- Targets optional in l2loss and huberloss. [\#67](https://github.com/deepmind/optax/pull/67) ([copybara-service[bot]](https://github.com/apps/copybara-service))
- Add .pylintrc and run pylint checks in CI workflow. [\#66](https://github.com/deepmind/optax/pull/66) ([copybara-service[bot]](https://github.com/apps/copybara-service))
- Increase optax version [\#63](https://github.com/deepmind/optax/pull/63) ([copybara-service[bot]](https://github.com/apps/copybara-service))
- Add utilities for eigenvector and matrix inverse pth root computation. [\#62](https://github.com/deepmind/optax/pull/62) ([copybara-service[bot]](https://github.com/apps/copybara-service))
- Add Callable option to optax.masked. [\#60](https://github.com/deepmind/optax/pull/60) ([n2cholas](https://github.com/n2cholas))
- Increase optax version for PyPi release. [\#58](https://github.com/deepmind/optax/pull/58) ([copybara-service[bot]](https://github.com/apps/copybara-service))
- Add momentum and initial\_scale to RMSProp [\#55](https://github.com/deepmind/optax/pull/55) ([rwightman](https://github.com/rwightman))
- Prevent creating unnecessary momentum variables. [\#54](https://github.com/deepmind/optax/pull/54) ([n2cholas](https://github.com/n2cholas))
- Implement DPSGD [\#53](https://github.com/deepmind/optax/pull/53) ([n2cholas](https://github.com/n2cholas))
- Add inject\_hyperparams wrapper [\#48](https://github.com/deepmind/optax/pull/48) ([n2cholas](https://github.com/n2cholas))
- Format tests and parallelize pytest runs. [\#47](https://github.com/deepmind/optax/pull/47) ([copybara-service[bot]](https://github.com/apps/copybara-service))
- Provide a canonical implementation of canonical losses used in gradient based optimisation. [\#45](https://github.com/deepmind/optax/pull/45) ([copybara-service[bot]](https://github.com/apps/copybara-service))
- Expose optax transform's init and update function signatures to facilitate type annotation in user code. [\#44](https://github.com/deepmind/optax/pull/44) ([copybara-service[bot]](https://github.com/apps/copybara-service))
- Add a transformation and a transformation wrapper. [\#43](https://github.com/deepmind/optax/pull/43) ([copybara-service[bot]](https://github.com/apps/copybara-service))
- Update reference arxiv link. [\#41](https://github.com/deepmind/optax/pull/41) ([copybara-service[bot]](https://github.com/apps/copybara-service))
- Move equivalence tests to a separate file, as we will be adding more. [\#40](https://github.com/deepmind/optax/pull/40) ([copybara-service[bot]](https://github.com/apps/copybara-service))
- Optax: Add MNIST example with Adam optimizer and lookahead wrapper. [\#39](https://github.com/deepmind/optax/pull/39) ([copybara-service[bot]](https://github.com/apps/copybara-service))
- Optax: gradient transformation for non-negative parameters. [\#38](https://github.com/deepmind/optax/pull/38) ([copybara-service[bot]](https://github.com/apps/copybara-service))
- Aliases support LR schedules in addition to constant scalar LRs. [\#37](https://github.com/deepmind/optax/pull/37) ([copybara-service[bot]](https://github.com/apps/copybara-service))
- Optax: add datasets module for image classifier example. [\#36](https://github.com/deepmind/optax/pull/36) ([copybara-service[bot]](https://github.com/apps/copybara-service))
- Ensure the number of update functions and states is the same in chain. [\#34](https://github.com/deepmind/optax/pull/34) ([copybara-service[bot]](https://github.com/apps/copybara-service))
- Rename `additive_weight_decay` to `add_decayed_weights`. [\#33](https://github.com/deepmind/optax/pull/33) ([copybara-service[bot]](https://github.com/apps/copybara-service))
- Remove `scale_by_fromage`. [\#32](https://github.com/deepmind/optax/pull/32) ([copybara-service[bot]](https://github.com/apps/copybara-service))
- Add AGC to optax \_\_init\_\_ and add comment noting regarding 1D conv weights. [\#30](https://github.com/deepmind/optax/pull/30) ([copybara-service[bot]](https://github.com/apps/copybara-service))
- Clean nits to make loss and hk.transform\(\) slightly more clear. [\#29](https://github.com/deepmind/optax/pull/29) ([copybara-service[bot]](https://github.com/apps/copybara-service))
- Disable macos-latest tests \(to speed up CI\) and add CI status badge. [\#28](https://github.com/deepmind/optax/pull/28) ([copybara-service[bot]](https://github.com/apps/copybara-service))
- Add a mask wrapper. [\#27](https://github.com/deepmind/optax/pull/27) ([n2cholas](https://github.com/n2cholas))
- Support end\_value for exponential\_decay [\#26](https://github.com/deepmind/optax/pull/26) ([n2cholas](https://github.com/n2cholas))
- Add piecewise\_interpolate\_schedule, linear\_onecycle, and cos\_onecycle. [\#22](https://github.com/deepmind/optax/pull/22) ([n2cholas](https://github.com/n2cholas))
- Yogi [\#16](https://github.com/deepmind/optax/pull/16) ([joaogui1](https://github.com/joaogui1))
- Radam [\#15](https://github.com/deepmind/optax/pull/15) ([joaogui1](https://github.com/joaogui1))
- gradient centralization [\#13](https://github.com/deepmind/optax/pull/13) ([joaogui1](https://github.com/joaogui1))
- Fix haiku\_example.py [\#5](https://github.com/deepmind/optax/pull/5) ([asmith26](https://github.com/asmith26))



\* *This Changelog was automatically generated by [github_changelog_generator](https://github.com/github-changelog-generator/github-changelog-generator)*
