Lava-nc

Latest version: v0.10.0


0.1.2

Lava Optimization 0.1.2 is a bugfix dot release.

Features and Improvements
* Expanded tutorial_01 with tuning guidelines for the QP solver's learning rates α and β

Bug Fixes and Other Changes
* Fixed a bug affecting the accuracy of the floating-point QP solver (Issue 9, PR69)
* Sped up QP solver execution by 50x by eliminating busy waiting on the CSP channel (Issue 16, lava PR87)
* Changed the solver interface to return solution values and removed default preconditioning in the solver (Issue 15, PR14, PR19)
* Corrected typos in the tutorials (Issue 15, PR14, PR19)

Breaking Changes
* No breaking changes in this release.

Known Issues
* No known issues at this point.

What's Changed
* Lava-optimization v0.1.1 Changes by ashishrao7 in https://github.com/lava-nc/lava-optimization/pull/14
* Fixed Location of tutorials folder and fixed broken links to Lava core by ashishrao7 in https://github.com/lava-nc/lava-optimization/pull/19
* Clean up of explicit namespace declaration by bamsumit in https://github.com/lava-nc/lava-optimization/pull/18
* Lava Optimization 0.1.2 by mgkwill in https://github.com/lava-nc/lava-optimization/pull/20


**Full Changelog**: https://github.com/lava-nc/lava-optimization/compare/v0.1.1...v0.1.2

0.1.1

Lava Constraint Optimization Library
The BSD-3 licensed lava-optimization library will soon include neuromorphic optimization solvers for linear (LP), quadratic (QP), mixed-integer linear (MILP), mixed-integer quadratic (MIQP) and quadratically constrained quadratic (QCQP) programming, as well as solvers for quadratically unconstrained binary optimization (QUBO) and constraint satisfaction problems (CSP). This first release includes a quadratic programming (QP) solver, following the general design principles of all future solvers in the library. As a first example to demonstrate the basic usage concepts of the solver, we provide a tutorial for solving a LASSO/sparse-coding problem.
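As an aside, the LASSO/sparse-coding problem mentioned above can be solved classically by proximal gradient descent (ISTA). The NumPy sketch below is a generic illustration of the kind of problem the tutorial targets, not the lava-optimization API; all function names and values here are hypothetical:

```python
import numpy as np

def soft_threshold(v, t):
    """Elementwise soft-thresholding, the proximal operator of t*||.||_1."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def ista(A, b, lam, n_iter=2000):
    """Solve min_x 0.5*||Ax - b||^2 + lam*||x||_1 by proximal gradient (ISTA)."""
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the smooth term
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ x - b)           # gradient of the least-squares term
        x = soft_threshold(x - grad / L, lam / L)
    return x

# Recover a sparse vector from noisy underdetermined measurements.
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 100))
x_true = np.zeros(100)
x_true[[5, 37, 80]] = [1.0, -2.0, 1.5]
b = A @ x_true + 0.01 * rng.standard_normal(50)
x_hat = ista(A, b, lam=0.1)
```

The l1 penalty drives most coefficients exactly to zero, which is what makes the recovered code sparse.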

This first version is implemented in floating-point arithmetic and executes on CPU only. Future releases will add further solvers, support for Loihi-compatible fixed-point arithmetic on CPU, and support for execution on Loihi platforms. The features and API of the solvers will be described at https://lava-nc.org/optimization.html. In the meantime, we appreciate any feedback on the API design and welcome contributions in areas such as pre-conditioning, pre-solving, heuristics, and meta-heuristics where such features are pertinent, as well as interfacing with other packages like PuLP.

New Features and Improvements
* lava.lib.optimization.solver.qp is a first implementation of QP solver dynamics in Lava, supporting QPs with equality and inequality constraints as well as unconstrained QPs.

Bug Fixes and Other Changes
* This is the first release of lava-optimization. No bug fixes or other changes.

Breaking Changes
* This is the first release of lava-optimization. No breaking changes.

Known Issues
* Floating-point solutions from the solver are slightly less precise than those of a NumPy implementation of the same dynamics. However, the behavior remains convergent towards the actual solution.
* The solver dynamics in Lava take longer to execute than the same dynamics in pure NumPy, owing to the preliminary message-passing implementation.
* The growth rate of the constraint-correction constant is currently set experimentally; principled tuning guidelines will be part of the next release. Without proper tuning, the solution will in most cases still show convergent behavior, but convergence may slow down or stop short of the optimal solution.
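To make the tuning issue concrete, the NumPy sketch below shows generic primal-dual (saddle-point) QP dynamics with a primal learning rate `alpha`, a dual rate `beta`, and a geometrically grown, capped constraint-correction constant. This illustrates the class of dynamics discussed, not the lava-optimization implementation; all names, rates, and the growth schedule are assumptions:

```python
import numpy as np

def qp_saddle_dynamics(Q, p, A, b, alpha=0.01, beta=0.01,
                       growth=1.0005, n_steps=10000):
    """Primal-dual gradient dynamics for min 0.5*x'Qx + p'x  s.t.  Ax = b.

    alpha: primal learning rate; beta: dual (constraint-correction) rate,
    grown geometrically each step and capped to keep the updates stable.
    """
    x = np.zeros(Q.shape[0])
    lam = np.zeros(A.shape[0])
    for _ in range(n_steps):
        x = x - alpha * (Q @ x + p + A.T @ lam)  # descend the Lagrangian in x
        lam = lam + beta * (A @ x - b)           # ascend in the multipliers
        beta = min(beta * growth, 1.0)           # growing constraint correction
    return x

# Tiny equality-constrained QP with a known solution:
# min 0.5*(x1^2 + x2^2)  s.t.  x1 + x2 = 1  ->  x* = (0.5, 0.5)
Q = np.eye(2)
p = np.zeros(2)
A = np.array([[1.0, 1.0]])
b = np.array([1.0])
x = qp_saddle_dynamics(Q, p, A, b)
```

If `beta` grows too slowly the constraint is enforced sluggishly and the iterate stalls away from the optimum; too fast and the dual updates overshoot, which is why the growth rate needs tuning.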

What's Changed
* V0.1.0 of Lava-optimization
* Removing __init__.py from lava/lib to avoid module clash across lava libraries by bamsumit in https://github.com/lava-nc/lava-optimization/pull/12
* Update version to 0.1.1 by mgkwill in https://github.com/lava-nc/lava-optimization/pull/13

New Contributors
* bamsumit made their first contribution in https://github.com/lava-nc/lava-optimization/pull/12
* mgkwill made their first contribution in https://github.com/lava-nc/lava-optimization/pull/1
* mathisrichter made their first contribution in https://github.com/lava-nc/lava-optimization/pull/8
* ashishrao7 made their first contribution in https://github.com/lava-nc/lava-optimization/pull/7

**Full Changelog**: https://github.com/lava-nc/lava-optimization/commits/v0.1.1

0.1.0

Lava Deep Learning Library
This first release of lava-dl under the BSD-3 license provides two new modes of training deep event-based neural networks: directly with SLAYER 2.0, or through hybrid ANN/SNN training using the Bootstrap module.

SLAYER 2.0 (lava.lib.dl.slayer) provides direct training of heterogeneous event-based computational blocks, with support for a variety of learnable neuron models, complex synaptic computation, arbitrary recurrent connections, and many other new features. The API provides high-level building blocks that are fully autograd-enabled, along with training utilities that make getting started with training SNNs extremely simple.

Bootstrap (lava.lib.dl.bootstrap) is a new training method for rate-coded SNNs. In contrast to prior ANN-to-SNN conversion schemes, it relies on an equivalent “shadow” ANN during training to maintain fast training speed while also dramatically accelerating SNN inference post-training with only a few spikes. Although Bootstrap is currently separate from SLAYER, its API mirrors the familiar SLAYER API, enabling fast hybrid ANN-SNN training with minimal performance loss in ANN-to-SNN conversion.
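As a toy illustration of rate coding itself (independent of the lava-dl API; the function name is hypothetical), the NumPy snippet below approximates a bounded ANN activation by the mean rate of a deterministic spike train. More timesteps, and hence more spikes, give a finer approximation, which is the trade-off Bootstrap targets:

```python
import numpy as np

def rate_code(activation, n_steps):
    """Approximate an activation in [0, 1] by a deterministic spike train:
    emit a spike whenever the accumulated activation crosses the next integer."""
    acc = np.cumsum(np.full(n_steps, activation))
    spikes = np.diff(np.floor(acc), prepend=0.0)
    return spikes  # 0/1 train whose mean rate approximates `activation`

a = 0.37                           # target ANN activation
coarse = rate_code(a, 10).mean()   # few spikes: coarse approximation
fine = rate_code(a, 1000).mean()   # many spikes: close to 0.37
```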

At this point in time, Lava processes cannot be trained directly with backpropagation. Therefore, we will soon release the Network Exchange (lava.lib.dl.netx) module for automatic generation of Lava processes from SLAYER or Bootstrap-trained networks. At that point, networks trained with SLAYER or Bootstrap can be executed in Lava.

Open-source contributions to these libraries are highly welcome. You are invited to extend the collection of neuron models supported by both SLAYER and Bootstrap. Check out the Neurons and Dynamics tutorial to learn how to create custom neuron models from the fundamental linear dynamics API.
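To give a flavor of what such a neuron model involves (a plain-Python sketch, not the lava-dl linear dynamics API; all parameter names and values are illustrative), a current-based leaky integrate-and-fire neuron is just two chained first-order linear filters plus a threshold-and-reset nonlinearity:

```python
def cuba_lif(spikes_in, w=1.0, current_decay=0.5,
             voltage_decay=0.2, threshold=1.0):
    """Current-based leaky integrate-and-fire neuron: input spikes drive a
    decaying synaptic current, which drives a decaying membrane voltage."""
    u = v = 0.0
    out = []
    for s in spikes_in:
        u = (1.0 - current_decay) * u + w * s  # synaptic current dynamics
        v = (1.0 - voltage_decay) * v + u      # membrane voltage dynamics
        if v >= threshold:
            out.append(1)
            v = 0.0                            # reset after a spike
        else:
            out.append(0)
    return out

out = cuba_lif([1, 0, 0, 1, 0, 0, 1, 0, 0, 0])
```

The two decay constants are exactly the kind of learnable parameters that the linear-dynamics formulation exposes to training.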

New Features and Improvements
* lava.lib.dl.slayer is an extension of SLAYER for natively training a combination of different neuron models and architectures including arbitrary recurrent connections. The library is fully autograd compatible with custom CUDA acceleration when supported by the hardware.
* lava.lib.dl.bootstrap is a new training method for accelerated training of rate-based SNNs using a dynamically estimated equivalent ANN, as well as hybrid training with fully spiking layers for low-latency rate-coded SNNs.

Bug Fixes and Other Changes
* This is the first release of lava-dl. No bug fixes or other changes.

Breaking Changes
* This is the first release of lava-dl. No breaking changes.

Known Issues
* No known issues at this point.

What's Changed
* Lava-DL Release v0.1.0 by bamsumit in https://github.com/lava-nc/lava-dl/pull/5

New Contributors
* bamsumit made their first contribution in https://github.com/lava-nc/lava-dl/pull/5
* mgkwill made their first contribution in https://github.com/lava-nc/lava-dl/pull/1
* mathisrichter made their first contribution in https://github.com/lava-nc/lava-dl/pull/6

**Full Changelog**: https://github.com/lava-nc/lava-dl/commits/v0.1.0
