Rockpool

Latest version: v2.9.1

2.4

Major changes

- `Linear...` modules no longer include a bias parameter by default (see the sketch below)
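
A minimal sketch of the new default behaviour follows. The `has_bias` constructor argument is an assumption here and may differ from the actual signature in the installed version.

```python
# Illustration of the new Linear default: no bias unless explicitly requested.
# The `has_bias` keyword is an assumption and may differ between versions.
from rockpool.nn.modules import Linear

lin_default = Linear((16, 8))                  # no bias parameter by default
lin_biased = Linear((16, 8), has_bias=True)    # request a bias explicitly

print("bias" in lin_default.parameters())      # expected: False
print("bias" in lin_biased.parameters())       # expected: True
```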

Added

- Support for Xylo SNN core v2 via `XyloSim`, including biases and quantisation support; mapping and deployment for Xylo SNN core v2 (SYNS61201)
- Added support for the Xylo-A2 test board, with audio recording support from the Xylo AFE (`AFESamna` and `XyloSamna`)
- Support for an LIF neuron with a trainable adaptive threshold (`aLIFTorch`), deployable to Xylo
- New module `BooleanState`, which maintains a boolean state
- Support for membrane potential training using `LIFExodus` (see the sketch below)
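
A rough sketch of membrane potential training with `LIFExodus` is shown below. It assumes the standard Rockpool calling convention `(output, state, record)` and a `"vmem"` key in the record dictionary; both are assumptions, and `sinabs.exodus` requires a CUDA device.

```python
# Rough sketch of membrane-potential training with LIFExodus.
# Assumptions: standard (output, state, record) return signature and a "vmem"
# record key; sinabs.exodus requires a CUDA device.
import torch
from rockpool.nn.modules import LIFExodus

mod = LIFExodus((4,)).to("cuda")                       # four LIF neurons
x = torch.rand((1, 100, 4), device="cuda")             # (batch, time, channels)
target_vmem = torch.zeros((1, 100, 4), device="cuda")  # illustrative target trace

out, state, rec = mod(x, record=True)                  # request recorded internal state
loss = torch.nn.functional.mse_loss(rec["vmem"], target_vmem)
loss.backward()                                        # gradients flow to the module parameters
```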

Changed

- Xylo package support for HW versioning (SYNS61300; SYNS61201)
- Ability to return events, membrane potentials or synaptic currents as output from `XyloSim` and `XyloSamna` (see the mapping sketch after this list)
- Enhanced the Xylo `mapper` to be more lenient about weight matrix sizes: missing weights are now assumed to be zero
- Xylo `mapper` is now more lenient about HW constraints, permitting larger numbers of input and output channels than supported by existing HDKs
- Xylo `mapper` supports a configurable maximum number of hidden and output neurons
- Running `black` is enforced by the CI pipeline
- `Linear...` modules now export bias parameters, if they are present
- `Linear...` modules now do not include bias parameters by default
- Xylo `mapper` now raises a warning if any linear weights have biases
- `LIFSlayer` renamed to `LIFExodus`, corresponding to `sinabs.exodus` library name change
- Periodic exponential surrogate function now supports training thresholds
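
The mapping, quantisation and simulation flow referred to above roughly follows the Xylo quick-start tutorial. The sketch below is indicative only: import paths and function names (`mapper`, `channel_quantize`, `config_from_specification`) have moved between Rockpool versions and should be checked against the installed release.

```python
# Indicative sketch of the map -> quantise -> simulate flow for Xylo.
# Import paths and keyword handling are assumptions based on the Xylo tutorials.
import numpy as np
from rockpool.nn.modules import LIF, Linear
from rockpool.nn.combinators import Sequential
from rockpool.devices import xylo as x
from rockpool.transform import quantize_methods as q

# A small reference network; any Rockpool network with a supported graph works
net = Sequential(Linear((2, 8)), LIF(8), Linear((8, 2)), LIF(2))

spec = x.mapper(net.as_graph())                   # design-rule check and resource mapping
spec.update(q.channel_quantize(**spec))           # per-channel post-training quantisation
config, is_valid, msg = x.config_from_specification(**spec)

sim = x.XyloSim.from_config(config)               # bit-accurate simulation of the Xylo core
out, _, rec = sim(np.zeros((100, 2), dtype=int), record=True)
print(rec.keys())                                 # recorded state may include Vmem / Isyn traces
```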

Fixed

- Fixes related to torch modules being moved to simulation devices
- Fixed an issue in `dropout.py`, where an `ImportError` was raised if Jax was missing
- Fixed an issue with `Constant` `torch` parameters, where `deepcopy` would raise an error
- Fixed issue with newer versions of torch; torch v1.12 is now supported
- Updated to support changes in the latest Jax API
- Fixed bug in `WavesenseNet`, where neuron class would not be checked properly
- Fixed bug in `channel_quantize`, where *un*quantized weights were returned instead of quantized weights

Deprecated

- `LIFSlayer` is now deprecated

2.3.1

Hotfix

- Improved the CI pipeline so that it is not blocked when sinabs.exodus cannot be installed
- Fixed UserWarning raised by some torch-backed modules
- Improved some unit tests

2.3

Added

- Standard dynamics introduced for LIF, Rate, Linear, Instant, ExpSyn. These are standardised across Jax, Torch and Numpy backends. We make efforts to guarantee identical dynamics for the standard modules across these backends, down to numerical precision
- LIF modules can now train thresholds and biases as well as time constants
- New `JaxODELIF` module, which implements a trainable LIF neuron following common dynamical equations for LIF neurons
- Added the WaveSense network architecture for temporal signal processing with SNNs. This is available in `rockpool.networks` and documented with a tutorial
- Introduced a new system for managing computational graphs and mapping them onto hardware architectures. This is documented in the Xylo quick-start tutorial, and in more detail in tutorials covering Computational Graphs and Graph Mapping. The mapping system performs design-rule checks for the Xylo HDK
- Included methods for post-training quantisation for Xylo, in `rockpool.transform`
- Added simulation of a divisive normalisation block for Xylo audio applications
- Added a `Residual` combinator, for convenient generation of networks with residual blocks
- Support for `sinabs` layers and Exodus
- `Module`, `JaxModule` and `TorchModule` provide auto-batching of input data. Input data shape is `(B, T, Nin)`, or `(T, Nin)` when only a single batch is provided (see the sketch after this list)
- Expanded documentation on parameters and type-hinting
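
A short sketch of the auto-batching convention, using the standard numpy-backed `LIF` module; the shapes follow the `(B, T, Nin)` / `(T, Nin)` convention described above, and the batched output shapes are an assumption based on that convention.

```python
# Sketch of auto-batching: both (T, Nin) and (B, T, Nin) inputs are accepted.
import numpy as np
from rockpool.nn.modules import LIF

mod = LIF(4)                                 # four standard LIF neurons, numpy backend

single_trial = np.random.rand(100, 4)        # (T, Nin): one trial
batched_trials = np.random.rand(8, 100, 4)   # (B, T, Nin): eight trials at once

out_single, _, _ = mod(single_trial)
out_batched, _, _ = mod(batched_trials)
print(out_single.shape, out_batched.shape)   # outputs carry a batch dimension
```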

Changed

- Python > 3.6 is now required
- Improved import handling when various computational back-ends are missing
- Updated for new versions of `samna`
- Renamed Cimulator -> XyloSim
- Better parameter handling and rockpool/torch parameter registration for Torch modules
- (Most) modules can accept batched input data
- Improved / additional documentation for Xylo

Fixed

- Improved type casting and device handling for Torch modules
- Fixed bug in Module, where `modules()` would return a non-ordered dict. This caused issues with `JaxModule`

Removed

- Removed several obsolete `Layer`s and `Network`s from Rockpool v1

2.2

Added

- Added support for the Xylo development kit in `.devices.xylo`, including several tutorials
- Added CTC loss implementations in `.training.ctc_loss`
- New trainable `torch` modules: `LIFTorch` and others in `.nn.modules.torch`, including an asynchronous delta modulator `UpDownTorch`
- Added `torch` training utilities and loss functions in `.training.torch_loss`
- New `TorchSequential` class to support the `Sequential` combinator for `torch` modules (see the sketch after this list)
- Added a `FFwdStackTorch` class to support `FFwdStack` combinator for `torch` modules
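
A minimal sketch of building a feed-forward spiking network from the new `torch`-backed modules with the `Sequential` combinator. Constructor arguments (shapes, and passing a plain integer as the neuron count) are illustrative and may differ between versions.

```python
# Minimal sketch: feed-forward SNN from torch-backed modules via Sequential.
# Constructor arguments are illustrative; check the installed version's API.
import torch
from rockpool.nn.modules import LinearTorch, LIFTorch
from rockpool.nn.combinators import Sequential

net = Sequential(
    LinearTorch((16, 32)),
    LIFTorch(32),
    LinearTorch((32, 4)),
    LIFTorch(4),
)

x = torch.rand(1, 100, 16)     # (batch, time, channels)
out, state, rec = net(x)       # spiking output, final state, recorded state dict
```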

Changed

- Existing `LIFTorch` module renamed to `LIFBitshiftTorch`; updated module to align better with Rockpool API
- Improvements to `.typehints` package
- `TorchModule` now raises an error if submodules are not `TorchModule`s

Fixed

- Updated LIF torch training tutorial to use new `LIFBitshiftTorch` module
- Improved installation instructions for `zsh`

2.1

Added

- 👹 Adversarial training of parameters using the *Jax* back-end, including a tutorial
- 🐰 "Easter" tutorial demonstrating an SNN trained to generate images
- 🔥 Torch tutorials for training non-spiking and spiking networks with Torch back-ends
- Added new method `nn.Module.timed()`, to automatically convert a module to a `TimedModule`
- New `LIFTorch` module that permits training of neuron and synaptic time constants in addition to other network parameters
- New `ExpSynTorch` module: exponential leak synapses with Torch back-end
- New `LinearTorch` module: linear model with Torch back-end
- New `LowPass` module: exponential smoothing with Torch back-end
- New `ExpSmoothJax` module: single time-constant exponential smoothing layer, supporting arbitrary transfer functions on output
- New `softmax` and `log_softmax` losses in `jax_loss` package
- New `utilities.jax_tree_utils` package containing useful parameter tree handling functions
- New `TSContinuous.to_clocked()` convenience method, to easily rasterise a continuous time series (see the sketch after this list)
- Alpha: Optional `_wrap_recorded_state()` method added to `nn.Module` base class, which supports wrapping recorded state dictionaries as `TimeSeries` objects, when using the high-level `TimeSeries` API
- Support for `add_events` flag for time-series wrapper class
- New Parameter dictionary classes to simplify conversion and handling of *Torch* and *Jax* module parameters
- Added an `astorch()` method to parameter dictionaries returned from `TorchModule`
- Improved type hinting
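
A brief sketch of the high-level `TimeSeries` workflow these entries enable is below. The `to_clocked()` argument name, the `TimedModule` return signature and the numeric values are assumptions for illustration.

```python
# Sketch of the high-level TimeSeries API: rasterise a continuous series with
# to_clocked(), or wrap a module as a TimedModule with .timed().
# Argument names and return signatures are assumptions for illustration.
import numpy as np
from rockpool.timeseries import TSContinuous
from rockpool.nn.modules import LIF

ts = TSContinuous(np.linspace(0.0, 1.0, 50), np.random.rand(50, 4))

# Path 1: rasterise explicitly and evolve a raw module on clocked samples
raster = ts.to_clocked(dt=1e-3)
out, _, _ = LIF(4, dt=1e-3)(raster)

# Path 2: wrap the module as a TimedModule and pass the TimeSeries directly
ts_out, state, rec = LIF(4, dt=1e-3).timed()(ts)
```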

Changed

- Old `LIFTorch` module renamed to `LIFBitshiftTorch`
- Kaiming and Xavier initialisation support for `Linear` modules
- `Linear` modules provide a bias by default
- Moved `filter_bank` package from V1 layers into `nn.modules`
- Update *Jax* requirement to > v2.13

Fixed

- Fixed *binder* links for tutorial notebooks
- Fixed bug in `Module` for multiple inheritance, where the incorrect `__repr__()` method would be called
- Fixed `TimedModuleWrapper.reset_state()` method
- Fixed axis limit bug in `TSEvent.plot()` method
- Removed page width constraint for docs
- Enable `FFExpSyn` module by making it independent of old `RRTrainedLayer`

Deprecated

- Removed `rpyc` dependency

Removed

2.0

- **New Rockpool API. Breaking change from v1.x**
- Documentation for new API
- Native support for Jax and Torch backends
- Many v1 Layers transferred
