Zfit

Latest version: v0.24.1


2.1
===

Major Features and Improvements
-------------------------------

- Fixed a comparison in the graph caching (an implementation detail) that led to an error.

0.24.1
======

Bug fixes and small changes
---------------------------
- retain the order of some internal parameters to improve full reproducibility
- the Minuit minimizer could lack a covariance matrix despite advertising one, leading to an error in some cases

0.24.0
======

Upgrade to a new TensorFlow version.

Requirement changes
-------------------
- TensorFlow ~2.18
- TensorFlow Probability ~0.25

0.23.0
======

Major Features and Improvements
-------------------------------
- Minimizers can use the new ``SimpleLoss.from_any`` method, which allows other libraries to hook into the minimization.
  For example, using zfit-physics, minimizers can directly minimize a RooFit ``RooNllVar`` (as created by ``createNLL``, described `here <https://root.cern.ch/doc/master/classRooAbsPdf.html#a24b1afec4fd149e08967eac4285800de>`_).
- Added the well-performing ``LevenbergMarquardt`` minimizer, a new implementation of the Levenberg-Marquardt algorithm.
- New BFGS minimizer implementation from SciPy, ``ScipyBFGS``.
- Reactivated a few minimizers: ``ScipyDogleg``, ``ScipyNCG``, ``ScipyCOBYLA`` and ``ScipyNewtonCG``.
- Added the ``GeneralizedGauss`` PDF, in which the exponent is a parameter rather than fixed to two, taken from `tensorflow-probability <https://www.tensorflow.org/probability/api_docs/python/tfp/distributions/GeneralizedNormal>`_.
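
The Levenberg-Marquardt algorithm blends Gauss-Newton steps with gradient-descent-style damping. A minimal sketch of the idea for a one-parameter least-squares fit (illustrative only, not zfit's actual ``LevenbergMarquardt`` implementation; the function name here is hypothetical):

```python
def levenberg_marquardt_slope(xs, ys, a=0.0, lam=1e-3, steps=50):
    """Fit a in y = a*x by minimizing the sum of squared residuals."""

    def cost(a):
        return sum((y - a * x) ** 2 for x, y in zip(xs, ys))

    for _ in range(steps):
        # residuals r_i = y_i - a*x_i, Jacobian dr_i/da = -x_i
        jtj = sum(x * x for x in xs)                         # J^T J (a scalar here)
        jtr = -sum(x * (y - a * x) for x, y in zip(xs, ys))  # J^T r
        step = -jtr / (jtj + lam)                            # damped normal equation
        if cost(a + step) < cost(a):
            a += step       # accept: behave more like Gauss-Newton
            lam /= 10.0
        else:
            lam *= 10.0     # reject: increase damping, smaller steps
    return a

# Fit the slope of the exact line y = 2x
xs = [1.0, 2.0, 3.0, 4.0]
ys = [2.0, 4.0, 6.0, 8.0]
a_fit = levenberg_marquardt_slope(xs, ys)
```

The damping parameter ``lam`` interpolates between a pure Gauss-Newton step (small ``lam``) and a short gradient-descent-like step (large ``lam``), which is what makes the method robust far from the minimum.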

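The generalized normal distribution underlying ``GeneralizedGauss`` has density p(x) = beta / (2 * alpha * Gamma(1/beta)) * exp(-(|x - mu| / alpha)**beta); for beta = 2 it reduces to a Gaussian with sigma = alpha / sqrt(2). A small sketch of that density (illustrative only, not the zfit or tensorflow-probability implementation):

```python
import math

def generalized_normal_pdf(x, mu=0.0, alpha=1.0, beta=2.0):
    """Density of the generalized normal (generalized Gaussian) distribution."""
    norm = beta / (2.0 * alpha * math.gamma(1.0 / beta))
    return norm * math.exp(-((abs(x - mu) / alpha) ** beta))

# beta < 2 gives heavier tails than a Gaussian, beta > 2 lighter tails;
# at beta = 2, alpha = 1 this equals exp(-x**2) / sqrt(pi).
p = generalized_normal_pdf(0.7, mu=0.0, alpha=1.0, beta=2.0)
```
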
Breaking changes
------------------
- removed multiple, old deprecated methods and arguments


Deprecations
-------------
- use ``stepsize`` instead of ``step_size`` in the ``zfit.Parameter`` constructor

Bug fixes and small changes
---------------------------
- add the possibility to not JIT-compile by using ``force_eager`` in ``tf.function`` or by raising a ``z.DoNotCompile`` error
- ``SimpleLoss`` can now be added together with another ``SimpleLoss``
- ``get_params`` supports an ``autograd`` argument to filter parameters that do not support automatic differentiation.
  An object with parameters can advertise which of its parameters are differentiable (via ``autograd_params``); by
  default, all parameters are assumed to be differentiable, the same effect as ``True``. If autograd is performed on
  parameters that do not support it, an error is raised.
- Use Kahan summation for larger likelihoods by default to improve numerical stability
- Using the same ``zfit.Parameter`` for multiple arguments (e.g. to specify a width shared between the left and right
  side of a PDF) could cause a crash due to some internal caching. This is now fixed.
- Minimizers have now been renamed without the trailing ``V1``. The old names are still available but will be removed in the future.
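
Kahan (compensated) summation carries a running correction term that recovers low-order bits lost when adding floating-point numbers of very different magnitude. A minimal sketch of the technique (zfit's internal implementation may differ):

```python
def kahan_sum(values):
    """Sum floats with a compensation term for lost low-order bits."""
    total = 0.0
    comp = 0.0  # running compensation for rounding error
    for v in values:
        y = v - comp          # correct the next addend by the accumulated error
        t = total + y         # low-order bits of y may be lost here
        comp = (t - total) - y  # recover what was lost; algebraically zero
        total = t
    return total
```

For example, ``kahan_sum([0.1] * 10)`` returns exactly ``1.0``, whereas the naive ``sum([0.1] * 10)`` accumulates rounding error and does not.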

Experimental
------------

Requirement changes
-------------------

Thanks
------

0.22.0
======

Bug fixes and small changes
---------------------------
- change a truncated PDF with a yield to reflect dynamic changes in its shape

Requirement changes
-------------------
- Upgrade from Pydantic V1 to V2

0.21.1
======

Bug fixes and small changes
---------------------------
- the ``full`` argument for binned NLLs was not working properly and returned a partially optimized loss value.
- JIT-compile all methods of the loss (gradient, Hessian) to avoid recompilation every time. This can significantly
  speed up different minimizers.
