Baal

Latest version: v2.0.0


1.3.0

BaaL 1.3.0 is a release focused on UX.

Features

* Initial support for the HuggingFace Trainer, along with tutorials on using HuggingFace.
* Initial support for semi-supervised learning; we are eager to see what the community will do with such a powerful tool!
* Fixes in the ECE computation (see the ECE sketch below).
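
For reference, ECE (expected calibration error) bins predictions by confidence and averages the gap between each bin's accuracy and its mean confidence, weighted by bin size. Below is a minimal NumPy sketch of that formula, illustrative only and not BaaL's implementation:

```python
import numpy as np

def expected_calibration_error(probs, labels, n_bins=10):
    """probs: (N, C) predicted probabilities, labels: (N,) true class indices."""
    confidences = probs.max(axis=1)
    accuracies = (probs.argmax(axis=1) == labels).astype(float)
    edges = np.linspace(0.0, 1.0, n_bins + 1)
    ece = 0.0
    for lo, hi in zip(edges[:-1], edges[1:]):
        in_bin = (confidences > lo) & (confidences <= hi)
        if in_bin.any():
            # Weight |accuracy - confidence| by the fraction of samples in the bin.
            ece += in_bin.mean() * abs(accuracies[in_bin].mean() - confidences[in_bin].mean())
    return ece
```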

Documentation

The biggest change in this release is the new website along with tons of content.
1. Tutorial on Deep Ensembles (#94)
2. Tutorial on NLP Classification (#87)
3. Tutorial on visualisation.
4. Added a BaaL cheatsheet to translate equations to code easily.
5. Added a list of "Core papers" to get new users started in Bayesian deep learning.

1.2.1

Changelog

Features
* Initial support for ensembles. Example to come.
* Initial support for Pytorch Lightning. Example [here](https://github.com/ElementAI/baal/blob/master/experiments/pl_baal_example.py).

Bugfixes
* Fix BALD for binary classification
* Fix Random heuristic for generators
* Fix `to_cuda` for strings.
* Fix a bug where MCDropconnect would not work with DataParallel

Misc
* A warning is now raised when no layers are affected by `patch_layers` in MCDropout or MCDropconnect (see the sketch below).
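
For context on that warning: MC-Dropout patching swaps a model's dropout layers for variants that keep sampling masks at inference time, so a model with no dropout layers leaves nothing to patch. A plain-PyTorch sketch of the idea (not BaaL's actual patching code; names here are illustrative):

```python
import warnings
import torch.nn as nn
import torch.nn.functional as F

class AlwaysOnDropout(nn.Dropout):
    """Dropout that keeps sampling masks even when the model is in eval mode."""
    def forward(self, x):
        return F.dropout(x, p=self.p, training=True)

def patch_dropout_layers(module: nn.Module) -> bool:
    """Recursively swap nn.Dropout children; return True if anything changed."""
    changed = False
    for name, child in module.named_children():
        if isinstance(child, nn.Dropout):
            setattr(module, name, AlwaysOnDropout(p=child.p))
            changed = True
        else:
            changed = patch_dropout_layers(child) or changed
    return changed

model = nn.Sequential(nn.Linear(8, 8), nn.ReLU(), nn.Linear(8, 2))  # no Dropout
if not patch_dropout_layers(model):
    # Mirrors the new behaviour: patching a dropout-free model is a no-op.
    warnings.warn("No layers were patched; the model contains no Dropout layers.")
```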

1.2.0

* Add DirichletCalibration (Kull et al., 2019); see our [blog post](https://baal.readthedocs.io/en/latest/reports/dirichlet_calibration.html).
* Add ECE metrics for computing a model's calibration.
* Add support for multi-input/output models in `ModelWrapper`.
* Fix BatchBALD to be consistent with the official implementation.
* Add ConsistentDropout, where the masks used in MC-Dropout are the same for each input (see the sketch below).
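
To make the last point concrete, here is one way to read "the masks are the same for each input", sketched in plain PyTorch. BaaL's `ConsistentDropout` differs in its details, so treat this purely as an illustration of the concept:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class FixedMaskDropout(nn.Module):
    """Illustrative sketch: at eval time, sample one dropout mask and reuse it
    for every input, until the shape changes or reset_mask() is called."""
    def __init__(self, p=0.5):
        super().__init__()
        self.p = p
        self._mask = None

    def reset_mask(self):
        self._mask = None

    def forward(self, x):
        if self.training:
            return F.dropout(x, p=self.p, training=True)  # usual dropout in training
        if self._mask is None or self._mask.shape != x.shape:
            keep = 1.0 - self.p
            # The cached mask covers the whole (possibly replicated) batch, so MC
            # samples differ inside the batch but stay identical between inputs.
            self._mask = (torch.rand_like(x) < keep).float() / keep
        return x * self._mask
```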

Important notes
* BaaL is now part of the PyTorch Ecosystem!

1.1

Changelog

* Support for MC-Dropconnect (Mobiny et al., 2019).
* `ActiveLearningDataset` now has better support for attributes specific to the pool (see below).
* More flexible support for multi-input/output in `ModelWrapper`.
    * Can accept lists of inputs or outputs.
* QoL features on `ActiveLearningDataset`.
    * Can use a RandomState; `load_state_dict` was added.
* Add a `replicate_in_memory` flag to `ModelWrapper`.
    * If False, the MC iterations are done in a for-loop instead of building one large batch in memory (see the sketch after this list).
    * This means `predict_on_batch` does not take up more memory than e.g. `test_on_batch`.
* Add `patience` and `min_epoch_for_es` to `ModelWrapper.train_and_test_on_datasets`.
    * Allows early stopping.
* New [tutorial](https://baal.readthedocs.io/en/latest/sklearn_tutorial.html) on how to use BaaL with scikit-learn.
* Heuristics can now be combined for multi-output models (see `baal.active.heuristics.Combine`).
* Documentation fixes.
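
To illustrate the `replicate_in_memory` trade-off mentioned above, here is a plain-PyTorch sketch of the two ways of drawing MC samples. The function names are illustrative; BaaL's `ModelWrapper` handles this internally:

```python
import torch

def mc_predict_replicated(model, x, iterations):
    # replicate_in_memory=True style: one forward pass over the batch repeated
    # `iterations` times. Fast, but peak memory grows with `iterations`.
    big = x.repeat_interleave(iterations, dim=0)
    out = model(big)
    return out.reshape(x.shape[0], iterations, *out.shape[1:])

def mc_predict_looped(model, x, iterations):
    # replicate_in_memory=False style: one forward pass per MC iteration.
    # Same kind of result, but memory stays at a single-batch footprint.
    return torch.stack([model(x) for _ in range(iterations)], dim=1)

model = torch.nn.Sequential(torch.nn.Dropout(0.5), torch.nn.Linear(8, 2)).train()
x = torch.randn(4, 8)
print(mc_predict_looped(model, x, iterations=20).shape)  # torch.Size([4, 20, 2])
```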



New ActiveLearningDataset

To better support new tasks, `ActiveLearningDataset` now allows any attribute to be overridden when the pool is created.

**Example**:
```python
from PIL import Image
from torch.utils.data import Dataset
from torchvision.transforms import Compose, ToTensor, RandomHorizontalFlip
from baal.active.dataset import ActiveLearningDataset


class MyDataset(Dataset):
    def __init__(self):
        self.my_transforms = Compose([RandomHorizontalFlip(), ToTensor()])

    def __len__(self):
        return 10

    def __getitem__(self, idx):
        x = Image.open('an_image.png')
        return self.my_transforms(x)


al_dataset = ActiveLearningDataset(MyDataset(),
                                   pool_specifics={'my_transforms': ToTensor()})

# Now `pool.my_transforms` is `ToTensor()`
pool = al_dataset.pool
```
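
As a usage note, the pool is what predictions are typically run on when ranking samples, so keeping its transforms deterministic, as above, keeps acquisition scores stable. A minimal follow-up, assuming the usual `label_randomly` helper on `ActiveLearningDataset`:

```python
# Label a few items at random; the rest stay in the pool and are read
# through the overridden, augmentation-free `my_transforms`.
al_dataset.label_randomly(5)
pool = al_dataset.pool
print(len(al_dataset), len(pool))  # 5 labelled, 5 in the pool
```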
