Rockpool

Latest version: v2.8


2.3.1

Hotfix

- Improved CI pipeline so that the pipeline is not blocked when `sinabs.exodus` cannot be installed
- Fixed UserWarning raised by some torch-backed modules
- Improved some unit tests

2.3

Added

- Standard dynamics introduced for LIF, Rate, Linear, Instant, ExpSyn. These are standardised across Jax, Torch and Numpy backends. We make efforts to guarantee identical dynamics for the standard modules across these backends, down to numerical precision
- LIF modules can now train thresholds and biases as well as time constants
- New `JaxODELIF` module, which implements a trainable LIF neuron following common dynamical equations for LIF neurons
- Added the WaveSense network architecture for temporal signal processing with SNNs. This is available in `rockpool.networks`, and is documented with a tutorial
- Introduced a new system for managing computational graphs and mapping these graphs onto hardware architectures. This is documented in the Xylo quick-start tutorial, and in more detail in tutorials covering Computational Graphs and Graph Mapping. The mapping system performs design-rule checks for the Xylo HDK
- Included methods for post-training quantisation for Xylo, in `rockpool.transform`
- Added simulation of a divisive normalisation block for Xylo audio applications
- Added a `Residual` combinator, for convenient generation of networks with residual blocks
- Support for `sinabs` layers and Exodus
- `Module`, `JaxModule` and `TorchModule` provide a facility for auto-batching of input data. Input data has shape `(B, T, Nin)`, or `(T, Nin)` when only a single batch is provided (see the sketch after this list)
- Expanded documentation on parameters and type-hinting
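
A rough sketch of the auto-batching behaviour follows. This is illustrative only: the `LIF` module, its constructor arguments and the `(output, state, record)` return convention are assumptions about the Rockpool API, used here just to make the `(B, T, Nin)` / `(T, Nin)` shape handling concrete.

```python
# Illustrative sketch only -- module name, constructor arguments and return
# convention are assumptions, not excerpts from the Rockpool documentation.
import numpy as np
from rockpool.nn.modules import LIF

mod = LIF(8)  # hypothetical module with 8 neurons

# A single trial of shape (T, Nin) is accepted directly
out, state, record = mod(np.random.rand(100, 8))

# A batch of trials of shape (B, T, Nin) is handled automatically
out, state, record = mod(np.random.rand(4, 100, 8))
print(out.shape)  # expected to retain the leading batch dimension
```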

Changed

- Python > 3.6 is now required
- Improved import handling when various computational back-ends are missing
- Updated for new versions of `samna`
- Renamed `Cimulator` -> `XyloSim`
- Better parameter handling and rockpool/torch parameter registration for Torch modules
- (Most) modules can accept batched input data
- Improved / additional documentation for Xylo

Fixed

- Improved type casting and device handling for Torch modules
- Fixed bug in `Module`, where `modules()` would return a non-ordered dict. This caused issues with `JaxModule`

Removed

- Removed several obsolete `Layer`s and `Network`s from Rockpool v1

2.2

Added

- Added support for the Xylo development kit in `.devices.xylo`, including several tutorials
- Added CTC loss implementations in `.training.ctc_loss`
- New trainable `torch` modules: `LIFTorch` and others in `.nn.modules.torch`, including an asynchronous delta modulator `UpDownTorch`
- Added `torch` training utilities and loss functions in `.training.torch_loss`
- New `TorchSequential` class to support the `Sequential` combinator for `torch` modules (see the sketch after this list)
- Added a `FFwdStackTorch` class to support `FFwdStack` combinator for `torch` modules
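
The following sketch shows how the `Sequential` combinator might compose Torch-backed modules, as referenced above. It is illustrative only: the import paths, module shapes and constructor arguments are assumptions about the Rockpool API, not excerpts from its documentation.

```python
# Illustrative sketch only -- import paths, shapes and constructor arguments
# are assumptions about the Rockpool API.
import torch
from rockpool.nn.combinators import Sequential
from rockpool.nn.modules import LinearTorch, LIFTorch

# With all sub-modules Torch-backed, Sequential should produce a Torch-backed
# network (via TorchSequential), usable with standard torch training loops.
net = Sequential(
    LinearTorch((16, 8)),
    LIFTorch(8),
)

out, state, record = net(torch.rand(1, 100, 16))  # input shape (B, T, Nin)
```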

Changed

- Existing `LIFTorch` module renamed to `LIFBitshiftTorch`; updated module to align better with Rockpool API
- Improvements to `.typehints` package
- `TorchModule` now raises an error if submodules are not `TorchModule`s

Fixed

- Updated LIF torch training tutorial to use new `LIFBitshiftTorch` module
- Improved installation instructions for `zsh`

2.1

Added

- 👹 Adversarial training of parameters using the *Jax* back-end, including a tutorial
- 🐰 "Easter" tutorial demonstrating an SNN trained to generate images
- 🔥 Torch tutorials for training non-spiking and spiking networks with Torch back-ends
- Added new method `nn.Module.timed()` to automatically convert a module to a `TimedModule` (see the sketch after this list)
- New `LIFTorch` module that permits training of neuron and synaptic time constants in addition to other network parameters
- New `ExpSynTorch` module: exponential leak synapses with Torch back-end
- New `LinearTorch` module: linear model with Torch back-end
- New `LowPass` module: exponential smoothing with Torch back-end
- New `ExpSmoothJax` module: single time-constant exponential smoothing layer, supporting arbitrary transfer functions on output
- New `softmax` and `log_softmax` losses in `jax_loss` package
- New `utilities.jax_tree_utils` package containing useful parameter tree handling functions
- New `TSContinuous.to_clocked()` convenience method, to easily rasterise a continuous time series
- Alpha: Optional `_wrap_recorded_state()` method added to `nn.Module` base class, which supports wrapping recorded state dictionaries as `TimeSeries` objects, when using the high-level `TimeSeries` API
- Support for `add_events` flag for time-series wrapper class
- New Parameter dictionary classes to simplify conversion and handling of *Torch* and *Jax* module parameters
- Added `astorch()` method to parameter dictionaries returned from `TorchModule`
- Improved type hinting
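
A short sketch of where `nn.Module.timed()` and `TSContinuous.to_clocked()` fit, as referenced above. This is illustrative only; the argument names and return conventions are assumptions about the Rockpool API.

```python
# Illustrative sketch only -- argument names and return conventions are
# assumptions about the Rockpool API.
import numpy as np
from rockpool import TSContinuous
from rockpool.nn.modules import LIF

dt = 1e-3

# Wrap a clocked array as a continuous time series, then rasterise it back
ts_input = TSContinuous.from_clocked(np.random.rand(100, 4), dt=dt)
raster = ts_input.to_clocked(dt)  # expected shape (100, 4)

# timed() wraps a low-level module so it consumes and produces TimeSeries
tmod = LIF(4, dt=dt).timed()
ts_out, state, record = tmod(ts_input)
```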

Changed

- Old `LIFTorch` module renamed to `LIFBitshiftTorch`
- Kaiming and Xavier initialisation support for `Linear` modules
- `Linear` modules provide a bias by default
- Moved `filter_bank` package from V1 layers into `nn.modules`
- Updated *Jax* requirement to > v2.13

Fixed

- Fixed *binder* links for tutorial notebooks
- Fixed bug in `Module` for multiple inheritance, where the incorrect `__repr__()` method would be called
- Fixed `TimedModuleWrapper.reset_state()` method
- Fixed axis limit bug in `TSEvent.plot()` method
- Removed page width constraint for docs
- Enabled `FFExpSyn` module by making it independent of the old `RRTrainedLayer`

Deprecated

- Removed `rpyc` dependency

2.0

- **New Rockpool API. Breaking change from v1.x** (see the sketch after this list)
- Documentation for new API
- Native support for Jax and Torch backends
- Many v1 Layers transferred
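
For orientation, a minimal sketch of the v2-style functional API follows. The `Rate` module and the exact dictionary contents are assumptions; only the general `(output, state, record)` call pattern is the point.

```python
# Illustrative sketch only -- module choice and parameter names are assumptions.
import numpy as np
from rockpool.nn.modules import Rate

mod = Rate(4)

# Parameters and state are exposed as nested dictionaries
params = mod.parameters()
state = mod.state()

# Evolution is a plain call; internal traces are returned when record=True
out, new_state, record = mod(np.random.rand(50, 4), record=True)
```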

1.1.0.4

- Hotfix to remove references to ctxctl and aiCTX
- Hotfix to include NEST documentation in CI-built docs
- Hotfix to include change log in build docs
