NeuroBayes

Latest version: v0.0.12

0.0.12

_This release introduces partially Bayesian Transformers and neuron-level control over model stochasticity._

**Key Additions:**

**Partially Bayesian Transformers:** Transformer neural networks are at the heart of modern AI systems and are increasingly used in physical sciences. However, robust uncertainty quantification with Transformers remains challenging. While replacing all weights with probabilistic distributions and using advanced sampling techniques works for smaller networks, this approach is computationally prohibitive for Transformers. Our new partially Bayesian Transformer implementation allows you to selectively make specific modules (embedding, attention, etc.) probabilistic while keeping others deterministic, significantly reducing computational costs while still delivering reliable uncertainty quantification.
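
A minimal sketch of what module-level selection could look like, reusing the `PartialBNN` / `probabilistic_layer_names` pattern from the 0.0.9 notes further down. The `FlaxTransformer` class name, its constructor arguments, and the module names below are illustrative assumptions rather than the documented API; the layer-config viewer added in PR #40 can be used to look up the real names.

```python3
import numpy as np
import neurobayes as nb

# Toy regression data: 100 integer-token sequences of length 16
X_train = np.random.randint(0, 50, size=(100, 16))
y_train = np.random.randn(100)

# Hypothetical Transformer constructor -- class name and arguments are
# assumptions for illustration, not the package's documented signature
architecture = nb.FlaxTransformer(vocab_size=50, d_model=32, num_heads=4, num_layers=2)

# Make only selected modules probabilistic (names are illustrative;
# use the layer-config viewer from PR #40 to find the actual ones)
probabilistic_layer_names = ['Embed_0', 'Attention_1']

# Same PartialBNN pattern as in the 0.0.9 example below
model = nb.PartialBNN(architecture, probabilistic_layer_names=probabilistic_layer_names)
model.fit(X_train, y_train, num_warmup=1000, num_samples=1000)
```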

**Fine-grained Stochasticity Control:** Even with only some layers made probabilistic, training deep learning models can be resource-intensive. You can now specify exactly which weights in particular layers should be stochastic, providing finer control over the trade-off between computational cost and uncertainty quantification.
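
These notes do not show the exact interface for weight-level selection, so the snippet below is only an illustrative guess at how it might compose with the sketch above; the dict-valued `probabilistic_neurons` argument is a hypothetical placeholder, not the actual API.

```python3
# Hypothetical continuation of the sketch above: restrict stochasticity to a
# subset of neurons within a chosen layer. The `probabilistic_neurons`
# argument name and its dict format are assumptions for illustration only.
model = nb.PartialBNN(
    architecture,
    probabilistic_layer_names=['Attention_1'],
    probabilistic_neurons={'Attention_1': [0, 1, 2, 3]},  # first four neurons only
)
model.fit(X_train, y_train, num_warmup=1000, num_samples=1000)
```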

What's Changed
* Add layer-by-layer convergence diagnostics by ziatdinovmax in https://github.com/ziatdinovmax/NeuroBayes/pull/31
* Add a classifier option to convnets by ziatdinovmax in https://github.com/ziatdinovmax/NeuroBayes/pull/32
* Add basic Transformer (deterministic and partially Bayesian) by ziatdinovmax in https://github.com/ziatdinovmax/NeuroBayes/pull/33
* Add hybrid-layers for PBNNs by ziatdinovmax in https://github.com/ziatdinovmax/NeuroBayes/pull/36
* Fix classification option in Transformer by ziatdinovmax in https://github.com/ziatdinovmax/NeuroBayes/pull/38
* Partial attention by ziatdinovmax in https://github.com/ziatdinovmax/NeuroBayes/pull/39
* Add a simple option to view flax model layer configs by ziatdinovmax in https://github.com/ziatdinovmax/NeuroBayes/pull/40
* Ensure num of features is displayed properly for attention layers by ziatdinovmax in https://github.com/ziatdinovmax/NeuroBayes/pull/42


**Full Changelog**: https://github.com/ziatdinovmax/NeuroBayes/compare/0.0.10...0.0.12

0.0.10

Key updates:

* Classification Support for Full and Partial BNNs: The initial focus was on regression with (P)BNNs, since most tasks in the physical sciences deal with (quasi-)continuous variables, but it was brought to my attention that some research domains can benefit from classification capabilities. This update therefore introduces classification support. To help you get started, I've provided two toy-data examples, which can easily be generalized to real-world problems (a rough sketch is also shown after this list).
* Expanded SWA Options in JAX/Flax: This update enhances the Stochastic Weight Averaging options, providing more robust priors for both Full and Partial BNNs.
* Automatic Restart for HMC/NUTS: Now, HMC/NUTS for (P)BNNs can automatically restart in case of bad initializations, which helps during the autonomous exploration of parameter spaces in experiments and simulations.
* Additional Metrics for Active Learning and UQ: New metrics have been added to enhance the active learning and uncertainty quantification evaluation processes.
* Minor bug fixes, improved documentation, and more examples!
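
As a rough sketch of how the new classification path might be called: whether integer class labels and `target_dim` set to the number of classes are all that is needed is an assumption here, so treat the bundled toy-data examples as the authoritative usage.

```python3
import numpy as np
import neurobayes as nb

# Toy 3-class problem
X = np.random.randn(200, 4)
y = np.random.randint(0, 3, size=200)  # integer class labels

# Assumption: target_dim set to the number of classes; consult the
# packaged toy-data examples for the exact classification call
architecture = nb.FlaxMLP(hidden_dims=[32, 16], target_dim=3)

# Classification works for both full and partial BNNs per this release
model = nb.PartialBNN(architecture, probabilistic_layer_names=['Dense0', 'Dense2'])
model.fit(X, y, num_warmup=500, num_samples=500)
```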

Looking ahead, the next major step will be expanding Partial BNNs beyond the current MLP and ConvNet architectures to include RNNs, GNNs, and Transformers.

0.0.9

What's Changed
Add an option to specify which layers in the provided architecture will be treated as probabilistic. For example,
```python3
import neurobayes as nb

# Initialize NN architecture
architecture = nb.FlaxMLP(hidden_dims=[64, 32, 16, 8], target_dim=1)

# Make the first and output layers probabilistic and the rest deterministic
probabilistic_layer_names = ['Dense0', 'Dense4']

# Initialize and train a PBNN model
model = nb.PartialBNN(architecture, probabilistic_layer_names=probabilistic_layer_names)
model.fit(X_measured, y_measured, num_warmup=1000, num_samples=1000)
```



**Full Changelog**: https://github.com/ziatdinovmax/NeuroBayes/compare/0.0.7...0.0.9

0.0.7

What's Changed
* Add ConvNets by ziatdinovmax in https://github.com/ziatdinovmax/NeuroBayes/pull/10
* Add option to set a custom prior over pre-trained priors by ziatdinovmax in https://github.com/ziatdinovmax/NeuroBayes/pull/13
* Add example with heteroskedastic BNN and PBNN by ziatdinovmax in https://github.com/ziatdinovmax/NeuroBayes/pull/17

**Full Changelog**: https://github.com/ziatdinovmax/NeuroBayes/compare/0.0.5...0.0.7

0.0.5

What's Changed
* Flax nets by ziatdinovmax in https://github.com/ziatdinovmax/NeuroBayes/pull/2
* Trained priors by ziatdinovmax in https://github.com/ziatdinovmax/NeuroBayes/pull/4
* Jax version by ziatdinovmax in https://github.com/ziatdinovmax/NeuroBayes/pull/5


**Full Changelog**: https://github.com/ziatdinovmax/NeuroBayes/compare/0.0.2...0.0.5

0.0.2
