NetKet

3.13

Bug Fixes

3.12

New Features
* Discrete Hilbert spaces now use a special {class}`nk.utils.StaticRange` object to store the local values that label the local degrees of freedom. This object is jax-friendly, can be converted to arrays, and allows for easy conversion between the local degrees of freedom and integers that can be used to index into arrays, and back. While these objects are not heavily used internally yet, in the future they will be used to simplify the implementation of operators and other objects [1732](https://github.com/netket/netket/issues/1732).
* Utilities to time the execution of the training loop are now provided; they give a coarse view of which part of the algorithm dominates the training cost. To use them, call `driver.run(..., timeit=True)` on any driver (see the sketch after this list).
* Added several new tensor-network ansätze to the `nk.models.tensor_networks` namespace. These replace the previous tensor-network implementations, which were de facto broken [1745](https://github.com/netket/netket/issues/1745).
* Added a jax implementation of the Bose-Hubbard operator, named {class}`netket.operator.BoseHubbardJax`, and split the numba implementation into a separate class [1773](https://github.com/netket/netket/issues/1773).
* NetKet now automatically sets the visible GPUs when running under MPI with GPUs, by enumerating the local GPUs and setting `jax_default_device` according to the local rank. This should allow users to avoid specifying `CUDA_VISIBLE_DEVICES` and local MPI ranks in their scripts. This behaviour is only activated when running under MPI, and is not used in the experimental sharding mode. To disable this functionality, set `NETKET_MPI_AUTODETECT_LOCAL_GPU=0` [1757](https://github.com/netket/netket/issues/1757).
* {class}`netket.experimental.models.Slater2nd` now also implements generalized Hartree-Fock, in addition to the restricted and unrestricted Hartree-Fock supported previously [1765](https://github.com/netket/netket/issues/1765).
* A new variational state computing the sum of multiple Slater determinants has been added, named {class}`nk.experimental.models.MultiSlater2nd`. This state has the same options as {class}`~netket.experimental.models.Slater2nd` [1765](https://github.com/netket/netket/issues/1765).
* Support for `jax>=0.4.27` [1801](https://github.com/netket/netket/issues/1801).
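A minimal, illustrative sketch of the new timing utility follows; the Hamiltonian, model and hyperparameters are arbitrary choices, and only the `timeit=True` flag is the new option:

```python
import netket as nk

# Toy setup: transverse-field Ising chain with an RBM ansatz.
hi = nk.hilbert.Spin(s=1 / 2, N=16)
ha = nk.operator.Ising(hilbert=hi, graph=nk.graph.Chain(16), h=1.0)
sa = nk.sampler.MetropolisLocal(hi)
vs = nk.vqs.MCState(sa, nk.models.RBM(alpha=1), n_samples=1024)

driver = nk.driver.VMC(ha, nk.optimizer.Sgd(learning_rate=0.05), variational_state=vs)
# timeit=True prints a coarse breakdown of where time is spent during the run.
driver.run(n_iter=100, timeit=True)
```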

Breaking Changes
* The `out` keyword of Discrete Hilbert indexing methods (`all_states`, `numbers_to_states` and `states_to_numbers`), deprecated in the last release, has been removed completely [1722](https://github.com/netket/netket/issues/1722).
* Homogeneous Hilbert spaces must now store the list of valid local values for the states in a {class}`nk.utils.StaticRange` object instead of a list of floats. The constructors have been updated accordingly. {class}`~nk.utils.StaticRange` is a range-like object that is jax-compatible and should from now on be used to index into local Hilbert spaces [1732](https://github.com/netket/netket/issues/1732).
* The `numbers_to_states` and `states_to_numbers` methods of {class}`netket.hilbert.DiscreteHilbert` must now be jax-jittable. Custom Hilbert spaces using non-jittable functions have to be adapted by wrapping the non-jittable logic in a {func}`jax.pure_callback` inside their `numbers_to_states`/`states_to_numbers` member functions (a sketch is shown after this list) [1748](https://github.com/netket/netket/issues/1748).
* {attr}`~netket.vqs.MCState.chunk_size` must now be set to an integer, and an error is raised immediately otherwise. This might break some code, but in general should lead to more informative error messages [1798](https://github.com/netket/netket/issues/1798).
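A minimal sketch of how such an adaptation might look, assuming a custom Hilbert space whose lookup is implemented in plain numpy. The helper names and the two-site output shape below are purely illustrative, not NetKet API:

```python
import jax
import jax.numpy as jnp
import numpy as np

def _numbers_to_states_numpy(numbers):
    # Arbitrary non-jittable numpy lookup, standing in for the old implementation.
    numbers = np.asarray(numbers)
    return np.stack([numbers % 2, numbers // 2], axis=-1).astype(np.float32)

def numbers_to_states(numbers):
    # Would be a member function of the custom DiscreteHilbert subclass.
    numbers = jnp.asarray(numbers)
    out_shape = jax.ShapeDtypeStruct((numbers.shape[0], 2), jnp.float32)
    # jax.pure_callback makes the numpy implementation callable from jitted code.
    return jax.pure_callback(_numbers_to_states_numpy, out_shape, numbers)
```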

Deprecations
* The method {func}`netket.nn.states_to_numbers` is now deprecated. Please use {meth}`~DiscreteHilbert.states_to_numbers` directly.

Improvements
* Rewrite the code for generating random states of `netket.hilbert.Fock` and `netket.hilbert.Spin` in Jax and jit the `init` and `reset` functions of `netket.sampler.MetropolisSampler` for better performance and improved compatibility with sharding [1721](https://github.com/netket/netket/pull/1721).
* Rewrite `netket.hilbert.index`, used by `HomogeneousHilbert` (including `Spin` and `Fock`), so that larger spaces with a sum constraint can be indexed. This can be useful for `netket.sampler.ExactSampler`, `netket.vqs.FullSumState` as well as for ED calculations (see the sketch after this list) [1720](https://github.com/netket/netket/pull/1720).
* Duplicating a `netket.vqs.MCState` now leads to perfectly deterministic, identical samples between the two copies of the same `MCState`, even if the sampler is changed. Previously, duplicating an `MCState` and changing the sampler on the two copies would lead to a completely random seed being used, and therefore different samples being generated. This change is needed to eventually achieve proper checkpointing of our calculations [1778](https://github.com/netket/netket/pull/1778).
* The methods converting Jax operators to another kind (such as `LocalOperator` to `PauliOperator`) now return the Jax version of those operators, if available [1781](https://github.com/netket/netket/pull/1781).
* Parallel Tempering samplers {class}`netket.experimental.sampler.MetropolisPt` now accept a distribution (`lin` or `log`) for the temperatures, or a custom array [1786](https://github.com/netket/netket/pull/1786).
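Relating to the constrained-space indexing entry above, a small sketch (with arbitrary sizes) of enumerating a particle-number-conserving Fock space and using it with `FullSumState`:

```python
import netket as nk

# Fock space of 8 modes with exactly 4 particles: a sum-constrained space.
hi = nk.hilbert.Fock(n_max=3, N=8, n_particles=4)
print(hi.n_states)  # the constrained space can now be enumerated/indexed

# FullSumState sums exactly over all basis states of the constrained space.
vs = nk.vqs.FullSumState(hi, nk.models.RBM(alpha=1, param_dtype=float))
```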

Finalized Deprecations
* Removed module function `netket.sampler.sample_next` that was deprecated in NetKet 3.3 (December 2021) [17XX](https://github.com/netket/netket/pull/17XX).

Internal changes
* Initialize the MetropolisSamplerState in a way that avoids recompilation when using sharding [1776](https://github.com/netket/netket/pull/1776).
* Wrap several functions in the samplers and operators with a `shard_map` to avoid unnecessary collective communication when doing batched indexing of sharded arrays [1777](https://github.com/netket/netket/pull/1777).
* Callbacks are now Pytrees and can be flattened/unflattened and serialized with flax [1666](https://github.com/netket/netket/pull/1666).

Bug Fixes
* Fixed the gradient of variational states w.r.t. complex parameters, which was missing a factor of 2. The learning rate needs to be halved to reproduce simulations made with previous versions of NetKet (see the note after this list) [1785](https://github.com/netket/netket/pull/1785).
* Fixed bug [1791](https://github.com/netket/netket/pull/1791), where `MetropolisHamiltonian` with jax operators was leaking tracers and crashing [#1792](https://github.com/netket/netket/pull/1792).
* The bug in the Parallel Tempering samplers was found and has now been fixed. In short, usages until now were most likely returning garbage samples, but not anymore! [1769](https://github.com/netket/netket/pull/1769).
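As an illustration of the gradient fix above: to reproduce an older complex-parameter simulation that used, say, a learning rate of 0.02, one would now use half that value (the numbers are arbitrary):

```python
import netket as nk

# optimizer = nk.optimizer.Sgd(learning_rate=0.02)  # value used with earlier NetKet versions
optimizer = nk.optimizer.Sgd(learning_rate=0.01)    # equivalent value after the gradient fix
```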

3.11.3

Bugfix release addressing the following issues:
* Fixes a bug where the conjugate of a fermionic operator was the conjugate-transpose, and the hermitian transpose `.H` was the identity. This could break code relying on complex-valued fermionic operators [1743](https://github.com/netket/netket/pull/1743).
* Fixed a bug when converting jax operators to qutip format (see the sketch after this list) [1749](https://github.com/netket/netket/pull/1749).
* Fixed an internal bug of `netket.utils.struct.Pytree`, where the cached properties' cache was not cleared when `replace` was used to copy and modify the Pytree [1750](https://github.com/netket/netket/pull/1750).
* Update upper bound on optax to `optax<0.3`, following the release of `optax` 0.2 [1751](https://github.com/netket/netket/pull/1751).
* Support QuTiP 5, released in March 2024 [1762](https://github.com/netket/netket/pull/1762).
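A small sketch of the conversion path touched by the qutip fix above; it assumes QuTiP is installed, and the operator is an arbitrary example:

```python
import netket as nk

hi = nk.hilbert.Spin(s=1 / 2, N=4)
# A sum of Pauli-X terms as a simple LocalOperator example.
op = sum(nk.operator.spin.sigmax(hi, i) for i in range(hi.size))

op_jax = op.to_jax_operator()   # jax-compatible version of the operator
qutip_op = op_jax.to_qutip()    # conversion to a qutip Qobj, fixed in this release
```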

3.11.2

Bugfix release to solve the following issues:
* Fix an error thrown by the `repr` method of errors raised by TDVP integrators.
* Fix a `repr` error of {class}`nk.sampler.rules.MultipleRules` [1729](https://github.com/netket/netket/pull/1729).
* Solve an issue where RK integrators could not be initialised with an integer `t0` initial time if `dt` was a float, as well as a wrong `repr` method leading to incomprehensible stack traces [1736](https://github.com/netket/netket/pull/1736).

3.11.1

Bugfix release to solve two issues:

* Fix `reset_chains=True` not working in `NETKET_EXPERIMENTAL_SHARDING` mode [1727](https://github.com/netket/netket/pull/1727).
* Fix unsolvable deprecation warning when using `DoubledHilbert` [1728](https://github.com/netket/netket/pull/1728).

3.11

This release supports Python 3.12 through the latest release of Numba, introduces several new jax-compatible operators and adds a new experimental way to distribute calculations among multiple GPUs without using MPI.

We have a few breaking changes as well: deprecations that were issued more than 18 months ago have now been finalized, most notably the `dtype` argument of several models and layers, some keywords of GCNNs, and setting the number of chains of exact samplers.

New Features

* Recurrent neural networks and layers have been added to `nkx.models` and `nkx.nn` [1305](https://github.com/netket/netket/pull/1305).
* Added experimental support for running NetKet on multiple jax devices (as an alternative to MPI). It is enabled by setting the environment variable/configuration flag `NETKET_EXPERIMENTAL_SHARDING=1`. Parallelization is achieved by distributing the Markov chains / samples equally across all available devices using [`jax.Array` sharding](https://jax.readthedocs.io/en/latest/notebooks/Distributed_arrays_and_automatic_parallelization.html). On GPU, multi-node setups are supported via [jax.distributed](https://jax.readthedocs.io/en/latest/multi_process.html), whereas on CPU it is limited to a single process, but several threads can be used by setting `XLA_FLAGS='--xla_force_host_platform_device_count=XX'` (see the sketch after this list) [#1511](https://github.com/netket/netket/pull/1511).
* {class}`netket.experimental.operator.FermionOperator2nd` is a new Jax-compatible implementation of fermionic operators. It can also be constructed starting from a standard fermionic operator by calling `operator.to_jax_operator()`, or used in combination with `pyscf` converters [1675](https://github.com/netket/netket/pull/1675), [#1684](https://github.com/netket/netket/pull/1684).
* {class}`netket.operator.LocalOperatorJax` is a new Jax-compatible implementation of local operators. It can also be constructed starting from a standard operator by calling `operator.to_jax_operator()` [1654](https://github.com/netket/netket/pull/1654).
* The logger interface has been formalised and documented in the abstract base class {class}`netket.logging.AbstractLog` [1665](https://github.com/netket/netket/pull/1665).
* The {class}`~netket.experimental.sampler.ParticleExchange` sampler and the corresponding rule {class}`~netket.experimental.sampler.rules.ParticleExchangeRule` have been added. They specialize {class}`~netket.sampler.ExchangeSampler` to fermionic spaces, in order to avoid proposing moves where the two exchanged sites have the same population [1683](https://github.com/netket/netket/issues/1683).
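A minimal sketch of enabling the experimental sharding mode described above, here emulating 8 devices on CPU; the flags must be set before importing jax/netket, and the device count is an arbitrary example:

```python
import os

# Enable the experimental multi-device mode and emulate 8 CPU devices.
os.environ["NETKET_EXPERIMENTAL_SHARDING"] = "1"
os.environ["XLA_FLAGS"] = "--xla_force_host_platform_device_count=8"

import jax
import netket as nk

print(jax.devices())  # Markov chains / samples will be distributed over these devices
```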

Breaking Changes

* The {class}`netket.models.Jastrow` wave-function now only has {math}`N (N-1)` variational parameters, instead of the {math}`N^2` redundant ones it had before. The saving and loading format has changed and won't be compatible with previous versions [1664](https://github.com/netket/netket/pull/1664).
* Finalize deprecations of some old methods in `nk.sampler` namespace (see original commit [1f77ad8267e16fe8b2b2641d1d48a0e7ae94832e](https://github.com/netket/netket/commit/1f77ad8267e16fe8b2b2641d1d48a0e7ae94832e))
* Finalize deprecations of 2D input to DenseSymm layers, which now raises an error, and of the `extra_bias` option of Equivariant Networks/GCNNs (see original commit [c61ea542e9d0f3e899d87a7471dea96d4f6b152d](https://github.com/netket/netket/commit/c61ea542e9d0f3e899d87a7471dea96d4f6b152d))
* Finalize deprecations of very old inputs/properties of Lattices (see original commit [0f6f520da9cb6afcd2361dd6fd029e7ad6a2693e](https://github.com/netket/netket/commit/0f6f520da9cb6afcd2361dd6fd029e7ad6a2693e))
* Finalize the deprecation of the `dtype=` attribute of several modules in `nk.nn` and `nk.models`, which had been printing an error since April 2022. You should update usages of `dtype=` to `param_dtype=` (see the sketch below) [1724](https://github.com/netket/netket/issues/1724)
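Updating old code is a one-line change; for example, with an RBM:

```python
import netket as nk

# model = nk.models.RBM(alpha=1, dtype=complex)      # old keyword, now removed
model = nk.models.RBM(alpha=1, param_dtype=complex)   # current keyword
```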


Deprecations

* `MetropolisSampler.n_sweeps` has been renamed to {attr}`~netket.sampler.MetropolisSampler.sweep_size` for clarity. Using `n_sweeps` when constructing the sampler now throws a deprecation warning; `sweep_size` should be used instead going forward (see the sketch after this list) [1657](https://github.com/netket/netket/issues/1657).
* Samplers and Metropolis rules defined as {func}`netket.utils.struct.dataclass` are deprecated because the base class is now a {class}`netket.utils.struct.Pytree`. The only change needed is to remove the dataclass decorator and define a standard `__init__` method [1653](https://github.com/netket/netket/issues/1653).
* The `out` keyword of Discrete Hilbert indexing methods (`all_states`, `numbers_to_states` and `states_to_numbers`) is deprecated and will be removed in the next release. Plan ahead and remove usages to avoid breaking your code 3 months from now [1725](https://github.com/netket/netket/issues/1725)!
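A small before/after sketch of the `sweep_size` rename; the Hilbert space is an arbitrary example:

```python
import netket as nk

hi = nk.hilbert.Spin(s=1 / 2, N=8)
# sa = nk.sampler.MetropolisLocal(hi, n_sweeps=hi.size)   # deprecated keyword
sa = nk.sampler.MetropolisLocal(hi, sweep_size=hi.size)   # new keyword
```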

Internal changes
* A new class, {class}`netket.utils.struct.Pytree`, can be used to create Pytrees for which inheritance automatically works and for which it is possible to define `__init__`. Several structures such as samplers and rules have been transitioned to this new interface instead of the old-style `struct.dataclass` (see the sketch after this list) [1653](https://github.com/netket/netket/issues/1653).
* The {class}`~netket.experimental.operator.FermionOperator2nd` and related classes now store the constant diagonal shift as another term instead of a completely special cased scalar value. The same operators now also respect the `cutoff` keyword argument more strictly [1686](https://github.com/netket/netket/issues/1686).
* Dtypes of the matrix elements of operators are now handled more correctly, and fewer warnings are raised when running NetKet in X32 mode. Moreover, operators like Ising now default to floating point dtype even if the coefficients are integers [1697](https://github.com/netket/netket/issues/1697).
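A minimal sketch of subclassing the new base class; the class and field names are illustrative, not NetKet API:

```python
import jax
import netket as nk

class ScaledProposal(nk.utils.struct.Pytree):
    """Toy Pytree: declared fields assigned in __init__ become pytree leaves."""

    scale: float

    def __init__(self, scale: float):
        self.scale = scale

rule = ScaledProposal(0.1)
print(jax.tree_util.tree_leaves(rule))  # [0.1]
```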

Bug Fixes
* Support multiplication of Discrete Operators by Sparse arrays [1661](https://github.com/netket/netket/issues/1661).
