Lava-dl

Latest version: v0.6.0


0.3.0

The [_lava-dl_](https://github.com/lava-nc/lava-dl) library version 0.3.0 now enables inference for trained spiking networks seamlessly on CPU or Loihi 2 backends and can leverage Loihi 2’s convolutional network compression and graded spike features for improved memory usage and performance.
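For intuition, graded spikes differ from classic binary spikes in that each event carries a multi-bit integer payload rather than a single bit. A framework-free sketch of the distinction (function names and the payload rule are illustrative, not the lava-dl API):

```python
def binary_spikes(potentials, threshold=1.0):
    """Classic binary spiking: emit 1 whenever the potential reaches threshold."""
    return [1 if v >= threshold else 0 for v in potentials]

def graded_spikes(potentials, threshold=1.0, max_payload=127):
    """Graded spiking (Loihi 2 style): each event carries an integer payload
    that grows with how far the potential exceeds the threshold."""
    return [max(0, min(int(v // threshold), max_payload)) for v in potentials]

v = [0.4, 1.2, 3.7, 9.9]
print(binary_spikes(v))  # [0, 1, 1, 1]
print(graded_spikes(v))  # [0, 1, 3, 9]
```

Because a single graded event can stand in for several binary ones, fewer events need to be routed, which is one source of the memory and performance gains mentioned above.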

New Features and Improvements
* Added Loihi 2 support in lava-dl NetX utilizing Loihi 2 convolution support and graded spikes (PR 88, 107).
* Added a tutorial demonstrating PilotNet application running on Intel Loihi 2 (PR 107).
* Added accelerated training of recurrent topologies in lava-dl SLAYER (PR 103).
* Added Transposed Convolution and Unpool support in lava-dl SLAYER (PR 80).

Bug Fixes and Other Changes
* Improved lava-dl SLAYER codebase with bugfixes and additional documentation (PR 78, 105).

Breaking Changes
- No breaking changes in this release

Known Issues
- Known issue when training lava-dl SLAYER with GPU on Windows machines.

What's Changed
* Remove unnecessary imports by Tobias-Fischer in https://github.com/lava-nc/lava-dl/pull/56
* Update mnist.py by uslumt in https://github.com/lava-nc/lava-dl/pull/71
* Updated notebooks with new hyperparameters and typo fixes by bamsumit in https://github.com/lava-nc/lava-dl/pull/76
* Add pilotnet integration tests by mgkwill in https://github.com/lava-nc/lava-dl/pull/79
* Slayer fixes by bamsumit in https://github.com/lava-nc/lava-dl/pull/78
* Changes to lava-dl to reflect api changes in lava 0.4.0 by bamsumit in https://github.com/lava-nc/lava-dl/pull/88
* Added ConvT and Unpool block for neurons by alexggener in https://github.com/lava-nc/lava-dl/pull/80
* Update ci-build.yml, Remove redundant poetry updates by mgkwill in https://github.com/lava-nc/lava-dl/pull/89
* Recurrent mechanic by timcheck in https://github.com/lava-nc/lava-dl/pull/103
* Bump nbconvert from 6.5.0 to 6.5.1 by dependabot in https://github.com/lava-nc/lava-dl/pull/93
* fix bug of `block.AbstractInput` by fangwei123456 in https://github.com/lava-nc/lava-dl/pull/105
* Loihi Tutorials by bamsumit in https://github.com/lava-nc/lava-dl/pull/107
* Add conda install instructions with intel-numpy by mgkwill in https://github.com/lava-nc/lava-dl/pull/91
* Version 0.3.0 by mgkwill in https://github.com/lava-nc/lava-dl/pull/109

New Contributors
* Tobias-Fischer made their first contribution in https://github.com/lava-nc/lava-dl/pull/56
* uslumt made their first contribution in https://github.com/lava-nc/lava-dl/pull/71
* alexggener made their first contribution in https://github.com/lava-nc/lava-dl/pull/80
* timcheck made their first contribution in https://github.com/lava-nc/lava-dl/pull/103
* dependabot made their first contribution in https://github.com/lava-nc/lava-dl/pull/93
* fangwei123456 made their first contribution in https://github.com/lava-nc/lava-dl/pull/105

**Full Changelog**: https://github.com/lava-nc/lava-dl/compare/v0.2.0...v0.3.0

0.2.0

The [_lava-dl_](https://github.com/lava-nc/lava-dl) library version 0.2.0 now supports automated generation of Lava processes for a trained network described by an hdf5 network configuration file, using our Network Exchange (NetX) library.

New Features and Improvements

* Released the Network Exchange (NetX) library to support automated creation of Lava processes for a deep network. We currently support the hdf5 network exchange format; support for more formats will be introduced in the future. ([PR 30](https://github.com/lava-nc/lava-dl/pull/30), [Issue #29](https://github.com/lava-nc/lava-dl/issues/29))
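Conceptually, NetX walks the layer records stored in the hdf5 file and instantiates one Lava process per layer. A minimal sketch of that idea, with a plain list of dictionaries standing in for the hdf5 tree (the keys, layer types, and builder here are illustrative, not the actual NetX schema):

```python
# Illustrative stand-in for the layer records an hdf5 network-exchange
# file might hold; the real NetX schema differs.
net_config = [
    {"type": "dense", "in": 200, "out": 128},
    {"type": "dense", "in": 128, "out": 10},
]

class DenseLayer:
    """Stand-in for a Lava process wrapping a dense synapse plus neurons."""
    def __init__(self, in_features, out_features):
        self.shape = (out_features, in_features)

def build_network(config):
    """Walk the layer records and instantiate one layer object per record."""
    builders = {"dense": lambda c: DenseLayer(c["in"], c["out"])}
    return [builders[c["type"]](c) for c in config]

layers = build_network(net_config)
print([layer.shape for layer in layers])  # [(128, 200), (10, 128)]
```

The real library performs this walk over hdf5 groups and produces connectable Lava processes, but the dispatch-on-layer-type pattern is the same.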

Bug Fixes and Other Changes
- Fixed bug with pre-hook quantization function on conv blocks ([PR13](https://github.com/lava-nc/lava-dl/pull/13))

Breaking Changes
- No breaking changes in this release

Known Issues
- Known issue when training lava-dl SLAYER with GPU on Windows machines.

What's Changed
* Create PULL_REQUEST_TEMPLATE.md & ISSUE_TEMPLATE.md by mgkwill in https://github.com/lava-nc/lava-dl/pull/27
* Hardware neuron parameters exchange and fixed precision instruction precision compatibility by bamsumit in https://github.com/lava-nc/lava-dl/pull/25
* Pilotnet link fix by bamsumit in https://github.com/lava-nc/lava-dl/pull/31
* Bugfix: CUBA neuron normalization applied to current state by bamsumit in https://github.com/lava-nc/lava-dl/pull/35
* Netx by bamsumit in https://github.com/lava-nc/lava-dl/pull/30
* Streamline PilotNet SNN notebook with RefPorts by bamsumit in https://github.com/lava-nc/lava-dl/pull/37
* Fix for failing tests/lava/lib/dl/netx/test_hdf5.py by bamsumit in https://github.com/lava-nc/lava-dl/pull/44
* Update ci-build.yml by mgkwill in https://github.com/lava-nc/lava-dl/pull/42
* Install by mgkwill in https://github.com/lava-nc/lava-dl/pull/45
* Lava Deep Learning 0.2.0 by mgkwill in https://github.com/lava-nc/lava-dl/pull/46
* Lava Deep Learning 0.2.0 - update lock by mgkwill in https://github.com/lava-nc/lava-dl/pull/47


**Full Changelog**: https://github.com/lava-nc/lava-dl/compare/v0.1.1...v0.2.0

0.1.1

Lava Deep Learning 0.1.1 is a bugfix point release.

Features and Improvements
* Added more content to tutorial_01, including tuning guidelines for the learning rates α and β of the QP solver.

Bug Fixes and Other Changes
* Fixed bug with pre-hook quantization function on conv blocks. (PR13)

Known Issues
* No known issues at this point

What's Changed
* Adding `__init__.py` to lava-dl/lava by awintel in https://github.com/lava-nc/lava-dl/pull/10
* Clean up of explicit namespace declaration by bamsumit in https://github.com/lava-nc/lava-dl/pull/11
* Fix Pool layer when pre_hook function is not None by valmat07 in https://github.com/lava-nc/lava-dl/pull/13

New Contributors
* awintel made their first contribution in https://github.com/lava-nc/lava-dl/pull/10
* valmat07 made their first contribution in https://github.com/lava-nc/lava-dl/pull/13

**Full Changelog**: https://github.com/lava-nc/lava-dl/compare/v0.1.0...v0.1.1

0.1.0

Lava Deep Learning Library
This first release of lava-dl under BSD-3 license provides two new modes of training deep event-based neural networks, either directly with SLAYER 2.0 or through hybrid ANN/SNN training using the Bootstrap module.

SLAYER 2.0 (lava.lib.dl.slayer) provides direct training of heterogeneous event-based computational blocks, with support for a variety of learnable neuron models, complex synaptic computation, arbitrary recurrent connections, and many more features. The API provides high-level building blocks that are fully autograd enabled, along with training utilities that make getting started with training SNNs extremely simple.
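SLAYER's canonical neuron is a two-state current-based (CUBA) leaky integrate-and-fire model. As a rough, framework-free illustration of the kind of dynamics such blocks compute (the decay constants and soft-reset scheme here are illustrative, not SLAYER's exact defaults):

```python
def cuba_lif(inputs, current_decay=0.5, voltage_decay=0.5, threshold=1.0):
    """Minimal two-state CUBA leaky integrate-and-fire dynamics:
    a leaky synaptic current state feeds a leaky membrane voltage
    state, which soft-resets when it fires."""
    current = voltage = 0.0
    spikes = []
    for x in inputs:
        current = (1.0 - current_decay) * current + x       # current state
        voltage = (1.0 - voltage_decay) * voltage + current  # voltage state
        spike = 1 if voltage >= threshold else 0
        voltage -= spike * threshold                         # soft reset
        spikes.append(spike)
    return spikes

print(cuba_lif([1.0, 0.0, 1.0, 0.0, 1.0]))  # [1, 0, 1, 0, 1]
```

In SLAYER the same update runs over batched tensors with learnable parameters and a surrogate gradient through the threshold, so the whole loop is autograd compatible.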

Bootstrap (lava.lib.dl.bootstrap) is a new training method for rate-coded SNNs. In contrast to prior ANN-to-SNN conversion schemes, it relies on an equivalent “shadow” ANN during training to maintain fast training speed while also dramatically accelerating SNN inference post-training with only a few spikes. Although Bootstrap is currently separate from SLAYER, its API mirrors the familiar SLAYER API, enabling fast hybrid ANN-SNN training with minimal performance loss in ANN-to-SNN conversion.
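The rate-coding equivalence Bootstrap exploits can be illustrated without any SNN framework: a spiking unit whose firing probability equals a clipped ReLU activation reproduces the ANN's output in expectation. This is only a minimal sketch of the principle; Bootstrap's dynamically estimated shadow ANN is more sophisticated than this:

```python
import random

def ann_activation(x):
    """Shadow ANN unit: ReLU clipped to [0, 1], interpreted as a firing rate."""
    return min(max(x, 0.0), 1.0)

def snn_measured_rate(x, steps=2000, seed=0):
    """Rate-coded SNN view of the same unit: spike each timestep with
    probability equal to the ANN activation, then measure the spike rate."""
    rng = random.Random(seed)
    p = ann_activation(x)
    spikes = sum(1 for _ in range(steps) if rng.random() < p)
    return spikes / steps

a = ann_activation(0.3)
r = snn_measured_rate(0.3)
# The measured spike rate tracks the ANN activation as steps grow.
assert abs(a - r) < 0.05
```

Because the two views agree in expectation, gradients computed on the cheap ANN side transfer to the spiking side, which is what makes the hybrid training fast.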

At this point in time, Lava processes cannot be trained directly with backpropagation. Therefore, we will soon release the Network Exchange (lava.lib.dl.netx) module for automatic generation of Lava processes from SLAYER or Bootstrap-trained networks. At that point, networks trained with SLAYER or Bootstrap can be executed in Lava.

Open-source contributions to these libraries are highly welcome. You are invited to extend the collection of neuron models supported by both SLAYER and Bootstrap. Check out the Neurons and Dynamics tutorial to learn how to create custom neuron models from the fundamental linear dynamics API.

New Features and Improvements
* lava.lib.dl.slayer is an extension of SLAYER for natively training a combination of different neuron models and architectures, including arbitrary recurrent connections. The library is fully autograd-compatible and includes custom CUDA acceleration when supported by the hardware.
* lava.lib.dl.bootstrap is a new method for accelerated training of rate-based SNNs using a dynamically estimated equivalent ANN, as well as hybrid training with fully spiking layers for low-latency rate-coded SNNs.

Bug Fixes and Other Changes
* This is the first release of lava-dl. No bug fixes or other changes.

Breaking Changes
* This is the first release of lava-dl. No breaking changes.

Known Issues
* No known issues at this point.

What's Changed
* Lava-DL Release v0.1.0 by bamsumit in https://github.com/lava-nc/lava-dl/pull/5

New Contributors
* bamsumit made their first contribution in https://github.com/lava-nc/lava-dl/pull/5
* mgkwill made their first contribution in https://github.com/lava-nc/lava-dl/pull/1
* mathisrichter made their first contribution in https://github.com/lava-nc/lava-dl/pull/6

**Full Changelog**: https://github.com/lava-nc/lava-dl/commits/v0.1.0
