Optimtool

Latest version: v2.8.2


2.6

Improved Details:
- optimized the calculation time of the delta value in newton_quasi methods by reusing delta (sk).
- reshaped dk returned by the conjugate method of the levenberg_marquardt algorithm to pass tests.
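The reuse described above can be illustrated with a standalone sketch of a quasi-Newton (BFGS) inverse-Hessian update, where sk is computed once and reused in every term. The function name and structure here are illustrative, not optimtool's internals:

```python
import numpy as np

def bfgs_inverse_update(H, x_old, x_new, g_old, g_new):
    """One BFGS inverse-Hessian update; sk is computed once and reused."""
    sk = x_new - x_old                      # the delta, reused in all terms below
    yk = g_new - g_old                      # gradient difference
    rho = 1.0 / float(yk @ sk)
    I = np.eye(len(sk))
    return (I - rho * np.outer(sk, yk)) @ H @ (I - rho * np.outer(yk, sk)) \
        + rho * np.outer(sk, sk)
```

A quick sanity check: the updated matrix satisfies the secant condition, i.e. `bfgs_inverse_update(...) @ yk` equals `sk`.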

2.5

Upgraded Traits:
- added a *p2t* function to validate the input value *x_0* in *optimtool.example.Lasso.py*.
- adjusted the break epsilon to *1e-4* in optimtool.hybrid.*, shortening algorithm execution time.

see [*test_hybrid.py*](https://github.com/linjing-lab/optimtool/blob/v2.5/test_hybrid.py), which solves the underlying issue:

$$
\min\ x^2+2xy+y^2+2x-2y \\
\mathrm{s.t.}\ x \geq 0,\ y \geq 0
$$

with the optimal point around (0, 1).
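A quick check of this optimum: completing the square gives

$$
f(x,y) = x^2+2xy+y^2+2x-2y = (x+y)^2+2x-2y,
$$

whose gradient $(2(x+y)+2,\ 2(x+y)-2)$ never vanishes, so the minimum lies on the boundary of the feasible set. On $x=0$, $f=y^2-2y$ is minimized at $y=1$ with value $-1$; on $y=0$, $f=x^2+2x \geq 0$ for $x \geq 0$. Hence the minimizer is $(0,1)$, and the KKT conditions hold there with multiplier $4$ on the active constraint $x \geq 0$.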

2.4

Improved Traits:
- added an assertion in *L_BFGS* to prevent users from setting a negative value for m, which is used in *double_loop*.
- added assertions in *example* to provide lightweight validation for the methods involved.

see [*test_L_BFGS.py*](https://github.com/linjing-lab/optimtool/blob/v2.4/test_L_BFGS.py), which tests the *L_BFGS* method.
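A minimal sketch of the kind of guard described above; the function name and message are hypothetical, not optimtool's actual code:

```python
def check_memory_size(m):
    # mirror of the described assertion: reject a non-positive memory size m
    # before it reaches the double-loop recursion
    assert isinstance(m, int) and m > 0, "m must be a positive integer"
    return m

print(check_memory_size(6))          # a valid value passes through unchanged
try:
    check_memory_size(-2)
except AssertionError as err:
    print("rejected:", err)          # a negative value raises AssertionError
```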

2.3

Fixed Bug:
- fixed the bug *numpy.core._exceptions._UFuncOutputCastingError: Cannot cast ufunc 'add' output from dtype('float64') to dtype('int64') with casting rule 'same_kind'*; manually setting the default parameter to a float also avoids this issue.

see [*test_lag.py*](https://github.com/linjing-lab/optimtool/blob/v2.3/test_lag.py), which solves the above issue in this version.
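The casting error is easy to reproduce outside the library; a minimal sketch in plain NumPy (unrelated to optimtool's internals) showing why a float default parameter avoids it:

```python
import numpy as np

x_0 = np.array([1, 0])            # integer dtype inferred from the literals
try:
    x_0 += np.array([0.1, 0.2])   # in-place 'add' cannot cast float64 to int64
except TypeError as err:          # _UFuncOutputCastingError subclasses TypeError
    print("casting error:", err)

x_0 = np.array([1.0, 0.0])        # float literals sidestep the cast entirely
x_0 += np.array([0.1, 0.2])
print(x_0)                        # [1.1 0.2]
```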

2.2

Fixed Bug:
- added an assertion `alpha > 0` in *_search/ZhangHanger* so that users who adjust default parameters get an AssertionError for invalid values.

New Example:
```python
import optimtool.unconstrain as ou
from optimtool.base import sp

# extended Rosenbrock function in four variables
x = sp.symbols("x1:5")
f = 100 * (x[1] - x[0]**2)**2 + \
    (1 - x[0])**2 + \
    100 * (x[3] - x[2]**2)**2 + \
    (1 - x[2])**2
x_0 = (-1.2, 1, -1.2, 1)
barzilar_borwein = ou.gradient_descent.barzilar_borwein
barzilar_borwein(f, x, x_0, verbose=True, method="ZhangHanger", c1=0.8, beta=0.8, eta=0.6)
```

see [tests](https://github.com/linjing-lab/optimtool/tree/v2.2/tests) and [examples](https://github.com/linjing-lab/optimtool/tree/v2.2/examples/doc) for fine-tuning default parameters of more algorithms.

2.1

Fixed Bug:
- the step size of Lasso in *example* should conform to the Lipschitz continuity condition, like *tk* in any module of the *hybrid* file.
- tested the *lagrange_augmented* algorithm and corrected the name of the inner gradient-norm epsilon default parameter.

```python
import optimtool.constrain as oc
from optimtool.base import sp

f, x1, x2 = sp.symbols("f x1 x2")
f = (x1 - 2)**2 + (x2 - 1)**2   # objective
c1 = x1 - x2 - 1                # equality constraint: c1 = 0
c2 = 0.25*x1**2 - x2 - 1        # inequality constraint: c2 <= 0
oc.mixequal.lagrange_augmentedm(f, (x1, x2), c1, c2, (1., 0.), verbose=True)
```

see [tests](https://github.com/linjing-lab/optimtool/tree/v2.1/tests) and [examples](https://github.com/linjing-lab/optimtool/tree/v2.1/examples/doc) for the full range of supported optimization methods.
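The Lipschitz condition mentioned in this release can be checked numerically; a minimal sketch in plain NumPy (with illustrative names, not optimtool's code) of picking a safe fixed step for the smooth part of a Lasso objective:

```python
import numpy as np

# for f(x) = 0.5 * ||Ax - b||^2 the gradient A^T (Ax - b) is Lipschitz
# with constant L = lambda_max(A^T A), so tk <= 1/L is a safe step size
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 5))
L = np.linalg.eigvalsh(A.T @ A).max()   # largest eigenvalue of A^T A
tk = 1.0 / L                            # fixed step satisfying the condition
print("Lipschitz constant:", L, "step size:", tk)
```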
