We are glad to finally release PyDDA 1.0. In this version, we have fixed the problems noted in issues 58, 63, and 64. In addition, we have been working hard to enhance the optimizer inside PyDDA. As a result, we now provide the user with an option to use either Jax or TensorFlow (in addition to the current SciPy engine) to solve the optimization problem.
There are several advantages to the Jax- and TensorFlow-based engines that make us *highly* encourage their use. With Jax and TensorFlow, we are able to use automatic differentiation to calculate the gradients of the cost function. This makes the gradient calculation less susceptible to roundoff and boundary errors. In addition, Jax and TensorFlow both support CUDA-enabled GPUs, so if you have a GPU, PyDDA is now capable of harnessing it to dramatically speed up the wind retrieval calculation. Even on a CPU-based system, the TensorFlow-based engine typically converges faster than the original SciPy-based algorithm, so we strongly encourage users to try these two new engines in their retrievals.
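As a rough sketch of how engine selection looks, using the example grids that ship with PyDDA's test data (the `engine` keyword name is our assumption here; check the documentation for your version):

```python
import pyart
import pydda

# Read the example radar grids bundled with PyDDA's test data
berr_grid = pyart.io.read_grid(pydda.tests.EXAMPLE_BERR)
cpol_grid = pyart.io.read_grid(pydda.tests.EXAMPLE_CPOL)

# Start the retrieval from a zero initial wind field on the CPOL grid
u_init, v_init, w_init = pydda.initialization.make_constant_wind_field(
    cpol_grid, (0.0, 0.0, 0.0))

# Run the retrieval with the TensorFlow engine; "jax" and "scipy"
# (the default) are the other options
grids = pydda.retrieval.get_dd_wind_field(
    [berr_grid, cpol_grid], u_init, v_init, w_init,
    engine="tensorflow")
```

Everything else about the call stays the same, so switching engines should be a one-keyword change to an existing retrieval script.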
TensorFlow and Jax are optional dependencies. To use the TensorFlow functionality, you need both TensorFlow 2.6.0 and tensorflow-probability.
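For example, the optional dependencies can be added with `pip install tensorflow tensorflow-probability` and `pip install jax jaxlib` (the standard PyPI package names; GPU-enabled builds may require the platform-specific install instructions from the TensorFlow and Jax projects).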