Optimtool

Latest version: v2.8.2


1.4

Simple Case:
```python
import sympy as sp
import optimtool as oo

x1, x2, x3, x4 = sp.symbols("x1 x2 x3 x4")  # declare symbolic variables
f = (x1 - 1)**2 + (x2 - 1)**2 + (x3 - 1)**2 + (x1**2 + x2**2 + x3**2 + x4**2 - 0.25)**2
oo.unconstrain.gradient_descent.barzilar_borwein(f, [x1, x2, x3, x4], (1, 2, 3, 4))  # funcs, args, x_0
```
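
The Barzilai-Borwein step size behind `barzilar_borwein` can be sketched on a simple quadratic. This is an illustrative implementation under my own assumptions, not optimtool's actual code:

```python
import numpy as np

def bb_gradient_descent(A, b, x0, iters=200):
    """Minimize 0.5 * x^T A x - b^T x with Barzilai-Borwein (BB1) step sizes."""
    x = x0.astype(float)
    g = A @ x - b                  # gradient of the quadratic
    alpha = 1e-2                   # conservative first step
    for _ in range(iters):
        if np.linalg.norm(g) < 1e-12:
            break
        x_new = x - alpha * g
        g_new = A @ x_new - b
        s, y = x_new - x, g_new - g
        alpha = (s @ s) / (s @ y)  # BB1 step: mimics 1/curvature along s
        x, g = x_new, g_new
    return x

A = np.diag([1.0, 2.0, 3.0])
b = np.ones(3)
x = bb_gradient_descent(A, b, np.zeros(3))  # converges to the solution of A x = b
```

Unlike a fixed-step gradient method, the step size here is recomputed each iteration from the last displacement `s` and gradient change `y`, which is what gives the method its fast non-monotone convergence.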

Added `FuncArray`, `ArgArray`, `PointArray`, `IterPointType`, and `OutputType` aliases to `typing`, and deleted the `functions/` folder. Many techniques were used to accelerate the methods; they are too numerous to enumerate here.

1.3

In [`v2.3.4`](https://pypi.org/project/optimtool/2.3.4/), a method was called as follows:
```python
import sympy as sp
import optimtool as oo

x1, x2, x3, x4 = sp.symbols("x1 x2 x3 x4")
f = (x1 - 1)**2 + (x2 - 1)**2 + (x3 - 1)**2 + (x1**2 + x2**2 + x3**2 + x4**2 - 0.25)**2
funcs = sp.Matrix([f])
args = sp.Matrix([x1, x2, x3, x4])
x_0 = (1, 2, 3, 4)
oo.unconstrain.gradient_descent.barzilar_borwein(funcs, args, x_0)
```

But in [`v2.3.5`](https://pypi.org/project/optimtool/2.3.5/), the same method is called as follows, which reduces the trouble of constructing data externally:
```python
import sympy as sp
import optimtool as oo

x1, x2, x3, x4 = sp.symbols("x1 x2 x3 x4")  # declare symbolic variables
f = (x1 - 1)**2 + (x2 - 1)**2 + (x3 - 1)**2 + (x1**2 + x2**2 + x3**2 + x4**2 - 0.25)**2
# funcs and args may each be a list, a tuple, or an sp.Matrix
oo.unconstrain.gradient_descent.barzilar_borwein(f, [x1, x2, x3, x4], (1, 2, 3, 4))  # funcs, args, x_0
```
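
Accepting all of these input shapes amounts to a small normalization step. A hypothetical sketch of the idea (`to_matrix` is not part of optimtool's API):

```python
import sympy as sp

def to_matrix(funcs):
    """Normalize an expression, list, tuple, or Matrix into a column sp.Matrix."""
    if isinstance(funcs, sp.Matrix):
        return funcs                  # already in canonical form
    if isinstance(funcs, (list, tuple)):
        return sp.Matrix(list(funcs)) # stack the entries into a column
    return sp.Matrix([funcs])         # wrap a single expression
```

With such a helper at the entry of each method, callers no longer need to build `sp.Matrix` objects themselves.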

The functional parameters of the built-in methods are similar to those of the `MATLAB Optimization Tool`, and more methods are supported.

1.2

- optimtool.unconstrain (more improvements and corrections)
- optimtool.constrain (same as v1.1)
- optimtool.example (same as v1.1)

Case for Unconstrain:
```python
import sympy as sp
from optimtool.unconstrain import nonlinear_least_square

x1, x2 = sp.symbols("x1 x2")
r1 = x1**3 - 2*x2**2 - 1
r2 = 2*x1 + x2 - 2
funcr = sp.Matrix([r1, r2])  # residuals wrapped with sympy.Matrix()
args = sp.Matrix([x1, x2])
x_0 = [2, 2]  # only accepts `list` data

nonlinear_least_square.gauss_newton(funcr, args, x_0)
nonlinear_least_square.levenberg_marquardt(funcr, args, x_0)
```
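
The Gauss-Newton idea behind the first call can be sketched independently of the library, using the same residuals and starting point; this is an illustrative implementation, not optimtool's code:

```python
import numpy as np
import sympy as sp

x1, x2 = sp.symbols("x1 x2")
residuals = sp.Matrix([x1**3 - 2*x2**2 - 1, 2*x1 + x2 - 2])
jacobian = residuals.jacobian([x1, x2])

# Turn the symbolic residuals and Jacobian into numeric callables.
r_fn = sp.lambdify((x1, x2), residuals, "numpy")
J_fn = sp.lambdify((x1, x2), jacobian, "numpy")

x = np.array([2.0, 2.0])  # same starting point as x_0 above
for _ in range(20):
    r = np.asarray(r_fn(*x), dtype=float).ravel()
    J = np.asarray(J_fn(*x), dtype=float)
    # Gauss-Newton step: solve the linearized least-squares subproblem.
    step, *_ = np.linalg.lstsq(J, -r, rcond=None)
    x = x + step
```

For this problem the iterates converge to `(1, 0)`, where both residuals vanish. Levenberg-Marquardt differs only in damping the linearized subproblem to stay robust far from the solution.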


Method Comparison:
![](https://github.com/linjing-lab/optimtool/blob/v1.2-pre/visualization%20algorithms/%E9%9D%9E%E7%BA%BF%E6%80%A7%E6%9C%80%E5%B0%8F%E4%BA%8C%E4%B9%98%E5%87%BD%E6%95%B0%E6%B5%8B%E8%AF%95.png)

You can refer to https://github.com/linjing-lab/optimtool/tree/v1.2-pre#readme for more details.

1.2-pre

1.1

- optimtool.unconstrain
- optimtool.constrain (may not be stable)
- optimtool.example (only contains `Lasso` problem)

Basic Construct Case:
```python
import sympy as sp

x1, x2, x3, x4 = sp.symbols("x1 x2 x3 x4")
f = (x1 - 1)**2 + (x2 - 1)**2 + (x3 - 1)**2 + (x1**2 + x2**2 + x3**2 + x4**2 - 0.25)**2
funcs = sp.Matrix([f])  # wrapped with sympy.Matrix([f])
args = sp.Matrix([x1, x2, x3, x4])
x_0 = [1, 2, 3, 4]  # only accepts `list`
```
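
Gradient-based methods built on objects like these need the derivative of `funcs` with respect to `args`, which sympy can produce directly. A small sketch, independent of the library:

```python
import sympy as sp

x1, x2, x3, x4 = sp.symbols("x1 x2 x3 x4")
f = (x1 - 1)**2 + (x2 - 1)**2 + (x3 - 1)**2 + (x1**2 + x2**2 + x3**2 + x4**2 - 0.25)**2
funcs = sp.Matrix([f])
args = sp.Matrix([x1, x2, x3, x4])

grad = funcs.jacobian(args)  # 1x4 row of partial derivatives of f
grad_at_x0 = grad.subs(dict(zip(args, [1, 2, 3, 4])))  # gradient at x_0
```

Evaluating the symbolic gradient at `x_0` is exactly the kind of step an iterative method repeats at every iterate.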

Method Comparison:
![](https://github.com/linjing-lab/optimtool/blob/v1.1/visualization%20algorithms/%E6%97%A0%E7%BA%A6%E6%9D%9F%E4%BC%98%E5%8C%96%E5%87%BD%E6%95%B0%E6%B5%8B%E8%AF%95.png)

You can refer to https://github.com/linjing-lab/optimtool/tree/v1.1#readme for more details.

1.0

How I Developed optimtool:
I started development around October 2021. Since I referred to a Chinese book published in December 2020, progress was relatively slow and involved a lot of testing and verification. The project was first hosted on a Chinese hosting platform at the end of December 2021 and later moved to GitHub for further development and iteration, so DeeGLMath and linjing-lab in the git log are the same person. (DeeGLMath is my blog account name and [PyPI](https://pypi.org/user/DeeGLMath/) user name; linjing-lab is my GitHub account name.)

Why Do I Develop optimtool:
I think mathematical optimization algorithms are relatively rare in many software packages, and I am a student in a mathematics department. I wanted to design a piece of software around my own ideas and to keep optimizing and updating it to create new features and properties.

What's My Core Direction:
I think it is critical to design an optimization toolbox for Python developers. Many classical methods can accelerate research on symbolic functions or loss functions. Moreover, optimization algorithms are a core strategy of machine learning.

You may refer to https://github.com/linjing-lab/optimtool/tree/v1.0#readme for more details; this version was my first successful optimization product.
