Optimtool

Latest version: v2.8.2


2.0

New Features:
- upgrade *Development Status* from Beta to Production/Stable.
- add *hybrid* methods, such as FISTA and Nesterov acceleration, to solve a new class of optimization problems.
- test and verify all executable algorithms in the newly developed component.

Hybrid Optimization:
```python
import optimtool.hybrid as oh
from optimtool.base import sp
x = sp.symbols("x1:3")
f = (2 - (sp.cos(x[0]) + sp.cos(x[1])) + (1 - sp.cos(x[0])) - sp.sin(x[0]))**2 + \
    (2 - (sp.cos(x[0]) + sp.cos(x[1])) + 2 * (1 - sp.cos(x[1])) - sp.sin(x[1]))**2
x_0 = (0.2, 0.2)  # randomly chosen initial point
oh.fista.normal(f, x, x_0, verbose=True, epsilon=1e-4)
```

See [tests](https://github.com/linjing-lab/optimtool/tree/v2.0/tests) and [examples](https://github.com/linjing-lab/optimtool/tree/v2.0/examples/doc) for more compatible usage; further issues are welcome for discussion.
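For intuition, the accelerated forward-backward scheme that the *hybrid* FISTA method is built on can be sketched in plain NumPy on a toy composite problem; everything below (the problem, names, and constants) is illustrative, not optimtool's implementation:

```python
import numpy as np

# minimize 0.5*||A x - b||^2 + lam*||x||_1 with FISTA (soft-threshold prox)
A = np.array([[3.0, 0.5], [0.5, 2.0]])
b = np.array([1.0, 1.0])
lam = 0.1
L = np.linalg.norm(A.T @ A, 2)            # Lipschitz constant of the smooth gradient
x = y = np.zeros(2)
t = 1.0
for _ in range(200):
    z = y - A.T @ (A @ y - b) / L         # forward gradient step on the smooth part
    x_new = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)  # prox of (lam/L)*||.||_1
    t_new = (1 + np.sqrt(1 + 4 * t * t)) / 2
    y = x_new + ((t - 1) / t_new) * (x_new - x)  # Nesterov extrapolation
    x, t = x_new, t_new
print(x)
```

At the minimizer the L1 optimality condition holds: the smooth gradient balances the subgradient of the L1 term on each nonzero coordinate.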

1.9

New Features:
- more robust detection of illegal input: support *FuncArray*, *ArgArray*, and *PointArray* via *optimtool._typing.py*.
- add *base.py* to expose *numpy*, *sympy*, and *matplotlib.pyplot* at the top level of optimtool for convenient operations.
- support printing iteration details (*point, f, k*) from any optimtool method via *verbose*, which defaults to *False*.
- rewrite the `__doc__` of each method in English, with more logical context for users running experiments.
- retest algorithms with configured parameters to prepare for the next version's *hybrid* component and the existing files.
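As a sketch of the kind of input flexibility the *PointArray*-style aliases describe, an initial point supplied as a scalar, list, or tuple can be normalized to one internal form. The helper below is purely illustrative, not optimtool's actual *_typing* code:

```python
def normalize_point(x_0):
    # PointArray-style input: accept a scalar, list, or tuple as the initial point
    if isinstance(x_0, (int, float)):
        return [float(x_0)]
    return [float(v) for v in x_0]

assert normalize_point(2) == [2.0]
assert normalize_point((1, 2, 3)) == [1.0, 2.0, 3.0]
assert normalize_point([0.5]) == [0.5]
```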

Import:
```python
import optimtool as oo
from optimtool.base import np, sp, plt  # see optimtool/base.py
```

See [tests](https://github.com/linjing-lab/optimtool/tree/v1.9/tests) and [examples](https://github.com/linjing-lab/optimtool/tree/v1.9/examples/doc) for more compatible usage; users can extend the *hybrid* methods following *_proxim.py* as needed.

1.8

Simple Case:
```python
import optimtool as oo
import sympy as sp
x1, x2, x3, x4 = sp.symbols("x1 x2 x3 x4")  # declare symbolic variables
f = (x1 - 1)**2 + (x2 - 1)**2 + (x3 - 1)**2 + (x1**2 + x2**2 + x3**2 + x4**2 - 0.25)**2
oo.unconstrain.gradient_descent.barzilar_borwein(f, [x1, x2, x3, x4], (1, 2, 3, 4))  # funcs, args, x_0
```

Bugs Fixed:
- update _kernel for more concise and comprehensive kernel selectors: kernel, linear_search, nonmonotonic_search.
- add a variable to set the unconstrained break precision inside constrained optimization methods, like penalty_quadratic.
```python
import optimtool.constrain as oc
import sympy as sp
x1, x2 = sp.symbols("x1 x2")
f = (x1 - 2)**2 + (x2 - 1)**2
c1 = x1 - x2 - 1
c2 = 0.25*x1**2 - x2 - 1
oc.mixequal.penalty_L1(f, (x1, x2), c1, c2, (1.5, 0.5), epsk=1e-4)  # use `epsk` to set the break epsilon of `kernel`
```
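For intuition, the penalty mechanism that methods like penalty_quadratic iterate on (minimize the objective plus an increasingly weighted constraint-violation term) can be sketched on a toy equality-constrained problem. The problem, step size, and schedule below are all illustrative, not optimtool's code:

```python
import numpy as np

# quadratic-penalty sketch: min x1^2 + x2^2  s.t.  x1 + x2 = 1  (optimum (0.5, 0.5))
def penalized_grad(x, sigma):
    c = x[0] + x[1] - 1.0                  # equality-constraint residual
    return 2 * x + sigma * c * np.ones(2)  # gradient of f + 0.5*sigma*c^2

x = np.zeros(2)
for sigma in [1.0, 10.0, 100.0]:           # gradually tighten the penalty
    for _ in range(2000):
        x = x - penalized_grad(x, sigma) / (2 + 2 * sigma)  # step = 1/Lipschitz
print(x)  # approaches the constrained optimum as sigma grows
```

Each inner loop solves the unconstrained penalized subproblem; raising sigma drives the constraint violation toward zero, which is why an inner break precision (like `epsk` above) matters.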

Introduction to hybrid (to be uploaded to optimtool in v2.5.0):
```python
delta = x_0 - tk * gradient  # gradient = f(x).jacobian; g(x) is not differentiable
```

proximity operators and iteration:
```python
x_0 = np.sign(delta) * np.maximum(np.abs(delta) - tk, 0)  # prox of the L1 norm (soft thresholding)
```

```python
norm = np.linalg.norm(delta)
x_0 = (1 - tk / norm) * delta if norm > tk else 0  # prox of the L2 norm
```

```python
x_0 = (delta + np.sqrt(delta**2 + 4 * tk)) / 2  # prox of -\sum_{i=1}^{n} ln(x_i), n = len(x_0)
```
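The soft-threshold operator above is easy to verify numerically; a minimal self-contained check in plain NumPy (the helper name is illustrative):

```python
import numpy as np

def prox_l1(delta, tk):
    # elementwise soft threshold: argmin_x 0.5*||x - delta||^2 + tk*||x||_1
    return np.sign(delta) * np.maximum(np.abs(delta) - tk, 0.0)

print(prox_l1(np.array([1.5, -0.2, 0.6]), 0.5))  # each entry shrinks toward 0 by tk
```

Note the use of `np.maximum` (elementwise) rather than `np.max` (a reduction), which would silently compute the wrong thing here.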


Reference:
- [tests](https://github.com/linjing-lab/optimtool/tree/v1.8/tests)
- [examples/doc](https://github.com/linjing-lab/optimtool/tree/v1.8/examples/doc)

1.7

Simple Case:
```python
import optimtool as oo
import sympy as sp
x1, x2, x3, x4 = sp.symbols("x1 x2 x3 x4")  # declare symbolic variables
f = (x1 - 1)**2 + (x2 - 1)**2 + (x3 - 1)**2 + (x1**2 + x2**2 + x3**2 + x4**2 - 0.25)**2
oo.unconstrain.gradient_descent.barzilar_borwein(f, [x1, x2, x3, x4], (1, 2, 3, 4))  # funcs, args, x_0
```

Bugs Fixed:
- update _convert/h2h: reduce the number of corrections applied to the hessian matrix.
```python
import optimtool.unconstrain as ou
ou.newton.modified(f, [x1, x2, x3, x4], (1, 2, 3, 4))  # funcs, args, x_0
```
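The correction being counted here is, in spirit, the modified-Newton shift that keeps the hessian positive definite before solving for the step. A standalone sketch of that idea (not optimtool's _convert/h2h code; `beta` is an assumed small floor):

```python
import numpy as np

def correct_hessian(H, beta=1e-3):
    # shift H by mu*I until all eigenvalues are positive (modified-Newton style)
    eigs = np.linalg.eigvalsh(H)
    mu = 0.0 if eigs.min() > 0 else beta - eigs.min()
    return H + mu * np.eye(H.shape[0])

H = np.array([[1.0, 2.0], [2.0, 1.0]])  # indefinite: eigenvalues 3 and -1
assert np.all(np.linalg.eigvalsh(correct_hessian(H)) > 0)
```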

Reference:
- [tests](https://github.com/linjing-lab/optimtool/tree/v1.7/tests)
- [examples/doc](https://github.com/linjing-lab/optimtool/tree/v1.7/examples/doc)

1.6

Simple Case:
```python
import optimtool as oo
import sympy as sp
x1, x2, x3, x4 = sp.symbols("x1 x2 x3 x4")  # declare symbolic variables
f = (x1 - 1)**2 + (x2 - 1)**2 + (x3 - 1)**2 + (x1**2 + x2**2 + x3**2 + x4**2 - 0.25)**2
oo.unconstrain.gradient_descent.barzilar_borwein(f, [x1, x2, x3, x4], (1, 2, 3, 4))  # funcs, args, x_0
```

Bugs Fixed:
- update _convert/h2h: replace the check that all eigenvalues of the hessian are > 0 with a rank-of-matrix == n check.
- simplify assignment when initializing the space for search, point, and f.
- reduce redundant assignment of iteration points in some methods, such as trust_region/steihaug_CG.
- select the trust_region method as the default configuration for constrained optimization.

Reference:
- [tests](https://github.com/linjing-lab/optimtool/tree/v1.6/tests)
- [examples/doc](https://github.com/linjing-lab/optimtool/tree/v1.6/examples/doc)

1.5

Simple Case:
```python
import optimtool as oo
import sympy as sp
x1, x2, x3, x4 = sp.symbols("x1 x2 x3 x4")  # declare symbolic variables
f = (x1 - 1)**2 + (x2 - 1)**2 + (x3 - 1)**2 + (x1**2 + x2**2 + x3**2 + x4**2 - 0.25)**2
oo.unconstrain.gradient_descent.barzilar_borwein(f, [x1, x2, x3, x4], (1, 2, 3, 4))  # funcs, args, x_0
```


Reference:
- [tests](https://github.com/linjing-lab/optimtool/tree/v1.5/tests)
- [examples/doc](https://github.com/linjing-lab/optimtool/tree/v1.5/examples/doc)

