## What's Changed
* Fix `jit` issue with relative epsilon and copying epsilons by JTT94 in https://github.com/ott-jax/ott/pull/52
* Fix `scale_cost` bug by MUCDK in https://github.com/ott-jax/ott/pull/54
* Ensure 3 geometry objects are returned when copying epsilons by JTT94 in https://github.com/ott-jax/ott/pull/56
* Fix `__call__` logic of `LRSinkhorn` to prevent an uninterpretable error by MUCDK in https://github.com/ott-jax/ott/pull/57
* Project import generated by Copybara. by LaetitiaPapaxanthos in https://github.com/ott-jax/ott/pull/58
* Add general cost function support by JTT94 in https://github.com/ott-jax/ott/pull/61
* Fix logic for setting `converged` by zoepiran in https://github.com/ott-jax/ott/pull/62
* Fix `LRSinkhornOutput.apply` with batches by michalk8 in https://github.com/ott-jax/ott/pull/63
* Fix `is_all_geoms_lr` not returning by michalk8 in https://github.com/ott-jax/ott/pull/67
* Add quadratic problem scaling by michalk8 in https://github.com/ott-jax/ott/pull/66
* Fix LR tests, `fused_penalty` property, and tolerances by michalk8 in https://github.com/ott-jax/ott/pull/68
* Add first version of the SCOT demonstration notebook (Gromov-Wasserstein for multi-omics) by antoinebelloir in https://github.com/ott-jax/ott/pull/64
* Fix LR Gromov memory usage by michalk8 in https://github.com/ott-jax/ott/pull/70
* Fix `scale_cost` for `LRGeometry` (float/max-bound) by michalk8 in https://github.com/ott-jax/ott/pull/72
* Update README to use Markdown math by adrhill in https://github.com/ott-jax/ott/pull/75
* Fix neural dual notebook by lucaeyring in https://github.com/ott-jax/ott/pull/76
## New Contributors
* JTT94 made their first contribution in https://github.com/ott-jax/ott/pull/52
* MUCDK made their first contribution in https://github.com/ott-jax/ott/pull/54
* antoinebelloir made their first contribution in https://github.com/ott-jax/ott/pull/64
* adrhill made their first contribution in https://github.com/ott-jax/ott/pull/75
* lucaeyring made their first contribution in https://github.com/ott-jax/ott/pull/76
**Full Changelog**: https://github.com/ott-jax/ott/compare/0.2.5...0.2.6