v0.3.3
Published by mgkwill over 1 year ago
Full Changelog: https://github.com/lava-nc/lava-dl/compare/v0.3.2...v0.3.3
v0.3.2
Published by PhilippPlank almost 2 years ago
Full Changelog: https://github.com/lava-nc/lava-dl/compare/v0.3.1...v0.3.2
v0.3.1
Published by mathisrichter almost 2 years ago
October 31, 2022
The lava-dl library version 0.3.1 now includes additional deep SNN inference and benchmarking tutorials.
lava.lib.dl.slayer
Neuron normalization (#116)
Full Changelog: https://github.com/lava-nc/lava-dl/compare/v0.3.0...v0.3.1
[^1]: Intel Core i5-5257U with 32GB RAM, running Ubuntu 20.04.2 LTS with lava v0.5.1. Performance results are based on testing as of November 2022 and may not reflect all publicly available security updates. Results may vary.
v0.3.0
Published by mgkwill about 2 years ago
The lava-dl library version 0.3.0 now enables seamless inference for trained spiking networks on CPU or Loihi 2 backends, and can leverage Loihi 2's convolutional network compression and graded-spike features for improved memory usage and performance.
block.AbstractInput by @fangwei123456 in https://github.com/lava-nc/lava-dl/pull/105
Full Changelog: https://github.com/lava-nc/lava-dl/compare/v0.2.0...v0.3.0
v0.2.0
Published by mgkwill over 2 years ago
The lava-dl library version 0.2.0 now supports automated generation of Lava processes for a trained network described by an hdf5 network configuration, using our Network Exchange (NetX) library.
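A minimal sketch of this workflow, assuming lava and lava-dl are installed: the hdf5 file name `network.net` is a hypothetical placeholder for the configuration exported during training, and the run-config class names come from recent lava releases, so they may differ across versions.

```python
# Load a trained network description into Lava processes via NetX.
from lava.lib.dl import netx
from lava.magma.core.run_conditions import RunSteps
from lava.magma.core.run_configs import Loihi2SimCfg

# Builds a hierarchy of Lava processes, one per network layer,
# from the hdf5 configuration exported during training.
net = netx.hdf5.Network(net_config='network.net')
print(net)  # prints a summary of the generated layers

# After connecting source/sink processes to net.inp and net.out,
# the network can be executed on the chosen backend:
net.run(condition=RunSteps(num_steps=100), run_cfg=Loihi2SimCfg())
net.stop()
```

Swapping `Loihi2SimCfg` for a Loihi 2 hardware run configuration is what makes the same trained network run on either backend.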
Full Changelog: https://github.com/lava-nc/lava-dl/compare/v0.1.1...v0.2.0
v0.1.1
Published by mgkwill almost 3 years ago
Lava Deep Learning 0.1.1 is a bugfix dot release.
Full Changelog: https://github.com/lava-nc/lava-dl/compare/v0.1.0...v0.1.1
v0.1.0
Published by mgkwill almost 3 years ago
This first release of lava-dl under the BSD-3 license provides two new modes of training deep event-based neural networks: either directly with SLAYER 2.0, or through hybrid ANN/SNN training using the Bootstrap module.
SLAYER 2.0 (lava.lib.dl.slayer) provides direct training of heterogeneous event-based computational blocks, with support for a variety of learnable neuron models, complex synaptic computation, arbitrary recurrent connections, and many other new features. The API provides high-level building blocks that are fully autograd-enabled, along with training utilities that make getting started with training SNNs extremely simple.
Bootstrap (lava.lib.dl.bootstrap) is a new training method for rate-coded SNNs. In contrast to prior ANN-to-SNN conversion schemes, it relies on an equivalent “shadow” ANN during training to maintain fast training speed while also dramatically accelerating post-training SNN inference with only a few spikes. Although Bootstrap is currently separate from SLAYER, its API mirrors the familiar SLAYER API, enabling fast hybrid ANN-SNN training with minimal performance loss in ANN-to-SNN conversion.
At this point in time, Lava processes cannot be trained directly with backpropagation. Therefore, we will soon release the Network Exchange (lava.lib.dl.netx) module for automatic generation of Lava processes from SLAYER or Bootstrap-trained networks. At that point, networks trained with SLAYER or Bootstrap can be executed in Lava.
Open-source contributions to these libraries are highly welcome. You are invited to extend the collection of neuron models supported by both SLAYER and Bootstrap. Check out the Neurons and Dynamics tutorial to learn how to create custom neuron models from the fundamental linear dynamics API.
Full Changelog: https://github.com/lava-nc/lava-dl/commits/v0.1.0