Dynesty

Latest version: v2.1.4

2.0.2

Added
Changed
- When checkpointing is enabled, the dynamic sampler now always writes a checkpoint at the end of run_nested(), irrespective of checkpoint_time
- Equally weighted samples are now randomly shuffled (#408)
Fixed
- The live_points option was partially broken after the blob option was introduced: it required a tuple of 4 elements irrespective of whether the likelihood returns blobs. Now, if you use blob=True and want to provide live_points, you must provide 4 elements (u, v, logl, blobs); with blob=False you provide just 3 elements as before (u, v, logl), as sketched below
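
A rough sketch of the corrected behaviour, using a toy Gaussian problem; every name and value below (the likelihood, prior transform, nlive, etc.) is purely illustrative:

```python
import numpy as np
import dynesty

ndim, nlive = 2, 500
rng = np.random.default_rng(42)

def prior_transform(u):
    return 10.0 * u - 5.0              # map the unit cube to [-5, 5)

def loglike(x):
    return -0.5 * np.sum(x**2)         # toy Gaussian log-likelihood

# Pre-computed live points: unit-cube positions, transformed positions, log-likelihoods.
u = rng.uniform(size=(nlive, ndim))
v = np.array([prior_transform(ui) for ui in u])
logl = np.array([loglike(vi) for vi in v])

# blob=False (the default): live_points is the usual 3-element tuple (u, v, logl).
sampler = dynesty.NestedSampler(loglike, prior_transform, ndim,
                                nlive=nlive, live_points=(u, v, logl))

# With blob=True the tuple gains a fourth element holding the per-point blobs,
# i.e. live_points=(u, v, logl, blobs).
```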

2.0.1

Added
Changed
- Sped up sampling when a single thread is used and the likelihood is fast, by avoiding generating a random seed sequence (by segasai)
Fixed
- Fixed the non-working custom samplers (#401, #402; by ColmTalbot, segasai)
- Fixed broken resume when using the dynesty pool (#403; by segasai)

2.0.0

This is a major release with several significant improvements.
- The implementation of checkpoints to save progress and allow restarting of fits (see the sketch after this list). [See here](https://dynesty.readthedocs.io/en/latest/quickstart.html#checkpointing)
- A new, simple interface to obtain equally weighted samples directly from the results object. [See here](https://dynesty.readthedocs.io/en/latest/crashcourse.html)
- Likelihood functions can now return additional computed quantities (blobs) that will be saved together with the samples. [See here](https://dynesty.readthedocs.io/en/latest/quickstart.html#saving-auxialiary-information-from-log-likelihood-function)
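
For orientation, here is a minimal sketch of the first two features, following the keyword and method names in the linked documentation (checkpoint_file, restore(), resume, samples_equal(), importance_weights()); the toy Gaussian likelihood and all parameter choices are illustrative only:

```python
import numpy as np
import dynesty

ndim = 3

def prior_transform(u):
    return 10.0 * u - 5.0                     # map the unit cube to [-5, 5)

def loglike(x):
    return -0.5 * np.sum(x**2)                # toy Gaussian log-likelihood

sampler = dynesty.DynamicNestedSampler(loglike, prior_transform, ndim)

# Write a checkpoint file at regular intervals while sampling.
sampler.run_nested(checkpoint_file='dynesty.save')

# After an interruption (e.g. in a later session), restore the sampler and resume.
sampler = dynesty.DynamicNestedSampler.restore('dynesty.save')
sampler.run_nested(resume=True)

results = sampler.results
samples = results.samples_equal()             # equally weighted posterior samples
weights = results.importance_weights()        # importance weights of the raw samples
```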

*IMPORTANT* This release includes some major refactoring of the class structure in dynesty to implement checkpointing. While we haven't seen any breakage in our tests, it is possible that some of the more unusual workflows may be affected. Please submit a bug report on GitHub if you see anything that doesn't look correct. Also, while every effort was made to ensure that checkpointing works correctly, it is possible that some corner cases have been missed. Please report any issues you see.

Added
- The nested samplers can now be saved to and restored from a file using the .save()/.restore() interface (#386; by segasai)
- When sampling is performed using run_nested(), it is now possible to write checkpoints at regular intervals, allowing you to resume sampling if it was interrupted (#386; by segasai)
- The nested sampler results object now lets you retrieve the equally weighted samples directly with the results.samples_equal() method, as well as the importance weights through the .importance_weights() method (#390; by segasai)
- A multiprocessing pool specifically adapted for dynesty was added as dynesty.pool. It gives faster performance when the likelihood function, or the data/likelihood arguments, take a long time to pickle (#394; by segasai)
- Likelihood functions can now return auxiliary information (i.e. derived quantities) that will be preserved with the samples. This is done with the blob interface (#395; by segasai); a combined sketch of the pool and blob interfaces follows this list
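
The pool and blob interfaces added above can be combined roughly as in the sketch below; the worker count, the toy likelihood, and the 'blob' field used to read the auxiliary quantities back from the results are assumptions for illustration:

```python
import numpy as np
import dynesty
import dynesty.pool

ndim = 3

def prior_transform(u):
    return 10.0 * u - 5.0                     # map the unit cube to [-5, 5)

def loglike(x):
    # Return the log-likelihood plus an auxiliary quantity (the "blob"),
    # here simply the chi-square, which is stored together with the samples.
    chi2 = np.sum(x**2)
    return -0.5 * chi2, np.array([chi2])

if __name__ == '__main__':
    # dynesty.pool.Pool sends loglike/prior_transform to each worker once,
    # rather than pickling them with every likelihood call.
    with dynesty.pool.Pool(4, loglike, prior_transform) as pool:
        sampler = dynesty.NestedSampler(pool.loglike, pool.prior_transform,
                                        ndim, pool=pool, blob=True)
        sampler.run_nested()

    results = sampler.results
    blobs = results['blob']   # auxiliary quantities saved with the samples (field name assumed)
```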

Fixed
- Sampler.n_effective is no longer unnecessarily computed when sampling with an infinite limit on n_effective (#379; by edbennett)
- On rare occasions, dynamic nested sampling fits with maxiter set could fail with 'list index out of range' errors. This has been addressed (#392; by segasai)
- The Monte Carlo volume calculations by RadFriends/SupFriends/MultiEllipsoid were inaccurate (fixed in #398, #399; by segasai)

Changed
- Setting n_effective for Sampler.run_nested() and DynamicSampler.sample_initial(), and n_effective_init for DynamicSampler.run_nested(), is deprecated (#379; by edbennett)
- Slice sampling can now switch to the doubling interval-expansion algorithm from Neal (2003) if, at any point during sampling, the interval has been expanded more than 1000 times. This should help slice/rslice sampling of difficult posteriors (#382; by segasai); see the sketch after this list
- The accumulation of statistics used to tune the proposal distribution is now more robust when multi-threading/pools are used. Previously only statistics from every queue_size-th call were used and all others were discarded; now statistics are accumulated from all the parallel sampling calls. This should help sampling of complex distributions (#385; by segasai)
- The .update_proposal() function that updates the state of the samplers now has an additional keyword that allows either just accumulating statistics from repeated function calls or actually updating the proposal. This was needed to avoid losing information when queue_size > 1 (#385; by segasai)
- The ellipsoid bounding has been sped up by not using the Cholesky transform, and a check was added and a test expanded for possible numerical issues when sampling from multiple ellipsoids that could trigger 'assert q > 0' failures (#397; by segasai)
- The individual samplers now take as input a special namedtuple, SamplerArgument, rather than just a tuple (#400; by segasai).
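
For reference, the sketch below illustrates the generic doubling interval-expansion step from Neal (2003) referred to in the slice-sampling item above; it is a standalone illustration of the procedure, not dynesty's internal implementation:

```python
import numpy as np

rng = np.random.default_rng(1)

def doubling_interval(logf, x0, logy, w=1.0, max_doublings=10):
    """Expand a slice-sampling interval around x0 by repeated doubling
    (Neal 2003) until both ends lie outside the slice {x : logf(x) > logy}."""
    left = x0 - w * rng.uniform()
    right = left + w
    k = max_doublings
    while k > 0 and (logf(left) > logy or logf(right) > logy):
        if rng.uniform() < 0.5:
            left -= (right - left)     # double by extending the left edge
        else:
            right += (right - left)    # double by extending the right edge
        k -= 1
    return left, right

# Toy example: a standard normal log-density, sliced at the current point x0.
def logf(x):
    return -0.5 * x**2

x0 = 0.3
logy = logf(x0) + np.log(rng.uniform())    # slice level, below logf(x0)
print(doubling_interval(logf, x0, logy))
```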

1.2.3

Added
- The .copy() method was added back to the results class, as previous versions of dynesty had it.

Fixed
- Fixed a bug where you could not repeatedly pickle and unpickle a sampler
- Small speed-up of ellipsoidal sampling

1.2.2

Added

Fixed
- Fixed biased posteriors when using multi-ellipsoid bounds with the rslice and rwalk samplers; previously the chains did not satisfy detailed balance (issue #364). Original discovery of the problem and help by Colm Talbot. Somewhat slower performance may be seen for complex posteriors.
- Fixed an issue introduced in 1.2.1 where prior_transform returning a tuple or a list (rather than a numpy array) was not accepted; it now is.

1.2.1

Added

Fixed

- The arguments of prior_transform and the likelihood function are now explicitly copied, so sampling works even if those functions modify their argument vectors (#362); see the sketch after this list
- Fixed the compilation of the docs and updated them a bit
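
As a small illustration of the first fix, a prior transform that works on its argument in place, like the hypothetical one below, no longer corrupts the sampler's internal arrays because dynesty now passes an explicit copy:

```python
import numpy as np

def prior_transform(u):
    u *= 10.0          # modifies the argument vector in place
    u -= 5.0           # maps the unit cube to [-5, 5)
    return u

print(prior_transform(np.array([0.1, 0.9])))   # -> [-4.  4.]
```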
