Published by github-actions[bot] 3 months ago
Full Changelog: https://github.com/lava-nc/lava-dl/compare/v0.5.0...v0.6.0
November 9, 2023
Lava-dl SLAYER now has extended support for training and inference of video object detection networks, along with the associated pre- and post-processing utilities used for object detection. The object detection module is available as `lava.lib.dl.slayer.obd`. Its submodules are described below, followed by a brief usage sketch:
Module | Description |
---|---|
`obd.yolo_base` | The foundational model for YOLO object detection training, which can be used to build a variety of YOLO models. |
`obd.models` | Selected pre-trained YOLO SDNN models that can be fine-tuned for user-specific applications. |
`obd.dataset` | Object detection dataset library (will be progressively extended). |
`obd.bbox.metrics` | Modules to evaluate object detection models. |
`obd.{bbox, dataset}.utils` | Utilities for manipulating bounding boxes and processing datasets, including frame visualization and video export. |
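A brief, hedged sketch of the flow; the model name and its return format are assumptions, so consult `obd.models` for what actually ships:

```python
import torch
from lava.lib.dl.slayer import obd

# Hypothetical pre-trained model handle; check obd.models for the networks
# shipped with your lava-dl version.
model = obd.models.tiny_yolov3_str.Network(num_classes=11)

# SLAYER uses a time-last layout: batch, channel, height, width, time.
frames = torch.rand(1, 3, 448, 448, 4)
predictions = model(frames)  # raw YOLO predictions (format is an assumption)
```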
Extensive tutorials for object detection training and inference are also available.
In addition, the lava-dl SLAYER tutorials now include an XOR regression tutorial as a basic example for getting started with lava-dl training.
Finally, lava-dl SLAYER now supports the SpikeMoid loss, the official implementation of the spike-based loss introduced in Jurado et al., *Spikemoid: Updated Spike-based Loss Methods for Classification*, which enables more advanced tuning of SNNs for classification.
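A hedged usage sketch; the class name `slayer.loss.SpikeMoid` and its call signature are assumptions inferred from the release note, not the verified API:

```python
import torch
import lava.lib.dl.slayer as slayer

# Assumed loss class and signature; verify against slayer.loss in your install.
criterion = slayer.loss.SpikeMoid()

spikes = torch.rand(2, 10, 100, requires_grad=True)  # batch, classes, time
target = torch.randint(0, 2, (2, 10)).float()        # binary class targets
loss = criterion(spikes, target)
loss.backward()
```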
Lava-dl NetX now lets users configure inference of fully connected layers with sparse synapses instead of the default dense synapses. This allows the network to leverage the compression offered by sparse synapses if the fully connected weights are sparse enough. It is as simple as setting `sparse_fc_layer=True` when initializing a `netx.hdf5.Network`. `netx.hdf5.Network` also supports global control of the spike exponent (the fractional portion of the spike message) via the `spike_exp` keyword, which gives users finer-grained control over network behavior and can help avoid data overflow on Loihi hardware.
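A minimal sketch of both options, assuming a trained network description saved as `network.net` (the path and the `spike_exp` value are illustrative; the keyword names come from this release):

```python
from lava.lib.dl import netx

# Load a trained hdf5 network description as a Lava network.
net = netx.hdf5.Network(
    net_config='network.net',
    sparse_fc_layer=True,  # use sparse synapses for fully connected layers
    spike_exp=6,           # assumed value: fractional bits of the spike message
)
```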
In addition, lava-dl NetX now includes sequential modules in `netx.modules`. These allow the creation of PyTorch-style callable constructs whose behavior is described in a `forward` function. They can also carry out non-critical but expensive housekeeping between calls in a parallel thread, so the main execution flow is not blocked. `netx.modules.Quantize` and `netx.modules.Dequantize` come pre-built to provide consistent quantization and dequantization to/from the fixed-precision representation used by the NetX network. Their usage can be seen in the YOLO SDNN inference on Lava and Loihi tutorial.
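A hedged sketch of the intended round trip; the `exp` constructor argument is an assumption mirroring `spike_exp`, so check the shipped signatures:

```python
import numpy as np
from lava.lib.dl import netx

# Convert to/from the network's fixed-precision representation. The `exp`
# argument (number of fractional bits) is an assumption, not the documented API.
quantize = netx.modules.Quantize(exp=6)
dequantize = netx.modules.Dequantize(exp=6)

frame = np.random.rand(3, 448, 448)  # float input frame
fixed = quantize(frame)              # PyTorch-style call dispatches to forward
restored = dequantize(fixed)         # back to floating point
```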
Full Changelog: https://github.com/lava-nc/lava-dl/compare/v0.4.0...v0.5.0
Published by github-actions[bot] about 1 year ago
Full Changelog: https://github.com/lava-nc/lava-dl/compare/v0.3.3...v0.4.0
Published by mgkwill over 1 year ago
Full Changelog: https://github.com/lava-nc/lava-dl/compare/v0.3.2...v0.3.3
Published by PhilippPlank almost 2 years ago
Full Changelog: https://github.com/lava-nc/lava-dl/compare/v0.3.1...v0.3.2
Published by mathisrichter almost 2 years ago
October 31, 2022
The lava-dl library version 0.3.1 now includes additional deep SNN inference and benchmarking tutorials.
`lava.lib.dl.slayer` neuron normalization (#116)
Full Changelog: https://github.com/lava-nc/lava-dl/compare/v0.3.0...v0.3.1
Published by mgkwill about 2 years ago
The lava-dl library version 0.3.0 now enables inference for trained spiking networks seamlessly on CPU or Loihi 2 backends and can leverage Loihi 2’s convolutional network compression and graded spike features for improved memory usage and performance.
`block.AbstractInput` by @fangwei123456 in https://github.com/lava-nc/lava-dl/pull/105
Full Changelog: https://github.com/lava-nc/lava-dl/compare/v0.2.0...v0.3.0
Published by mgkwill over 2 years ago
The lava-dl library version 0.2.0 now supports automated generation of Lava processes for a trained network described by hdf5 network configuration using our Network Exchange (NetX) library.
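For example, a minimal sketch, assuming a network description saved to hdf5 during training (e.g. with a tutorial-style `export_hdf5` helper; the path is illustrative):

```python
from lava.lib.dl import netx

# Generate a hierarchical Lava process from an hdf5 network description.
net = netx.hdf5.Network(net_config='network.net')

print(net)              # summary of the generated layer processes
print(len(net.layers))  # one Lava process per layer in the description
```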
Full Changelog: https://github.com/lava-nc/lava-dl/compare/v0.1.1...v0.2.0
Published by mgkwill almost 3 years ago
Lava Deep Learning 0.1.1 is a bugfix dot release.
Full Changelog: https://github.com/lava-nc/lava-dl/compare/v0.1.0...v0.1.1
Published by mgkwill almost 3 years ago
This first release of lava-dl under BSD-3 license provides two new modes of training deep event-based neural networks, either directly with SLAYER 2.0 or through hybrid ANN/SNN training using the Bootstrap module.
SLAYER 2.0 (lava.lib.dl.slayer) provides direct training of heterogeneous event-based computational blocks, with support for a variety of learnable neuron models, complex synaptic computation, arbitrary recurrent connections, and many more new features. The API provides high-level building blocks that are fully autograd enabled, and training utilities that make getting started with training SNNs extremely simple.
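For example, a minimal fully connected SDNN in the style of the SLAYER tutorials (the neuron parameters and layer sizes are illustrative):

```python
import torch
import lava.lib.dl.slayer as slayer

# Illustrative CUBA-LIF neuron parameters; see the tutorials for tuned values.
neuron_params = {
    'threshold': 1.25,
    'current_decay': 0.25,
    'voltage_decay': 0.03,
    'requires_grad': True,  # make neuron dynamics learnable
}

class Network(torch.nn.Module):
    def __init__(self):
        super().__init__()
        self.blocks = torch.nn.ModuleList([
            slayer.block.cuba.Dense(neuron_params, 34 * 34 * 2, 512),
            slayer.block.cuba.Dense(neuron_params, 512, 10),
        ])

    def forward(self, spike):
        # Blocks are fully autograd enabled, so a standard PyTorch training
        # loop (loss, backward, optimizer step) applies unchanged.
        for block in self.blocks:
            spike = block(spike)
        return spike
```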
Bootstrap (lava.lib.dl.bootstrap) is a new training method for rate-coded SNNs. In contrast to prior ANN-to-SNN conversion schemes, it relies on an equivalent “shadow” ANN during training to maintain fast training speed while also dramatically accelerating SNN inference post-training, with only a few spikes. Although Bootstrap is currently separate from SLAYER, its API mirrors the familiar SLAYER API, enabling fast hybrid ANN-SNN training with minimal performance loss in ANN-to-SNN conversion.
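A sketch of the mirrored API, based on the Bootstrap MNIST tutorial; the scheduler class and the `mode` plumbing shown here are assumptions to verify against that tutorial:

```python
import torch
import lava.lib.dl.bootstrap as bootstrap

# Illustrative neuron parameters, mirroring the SLAYER example above.
neuron_params = {'threshold': 1.25, 'current_decay': 0.25, 'voltage_decay': 0.03}

class Network(torch.nn.Module):
    def __init__(self):
        super().__init__()
        self.blocks = torch.nn.ModuleList([
            bootstrap.block.cuba.Dense(neuron_params, 28 * 28, 512),
            bootstrap.block.cuba.Dense(neuron_params, 512, 10),
        ])

    def forward(self, x, mode):
        # Each block runs in ANN ("shadow") or SNN mode as the scheduler dictates.
        for block, m in zip(self.blocks, mode):
            x = block(x, mode=m)
        return x

# Assumed scheduler interface: it interleaves ANN and SNN training phases.
scheduler = bootstrap.routine.Scheduler()
```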
At this point in time, Lava processes cannot be trained directly with backpropagation. Therefore, we will soon release the Network Exchange (lava.lib.dl.netx) module for automatic generation of Lava processes from SLAYER or Bootstrap-trained networks. At that point, networks trained with SLAYER or Bootstrap can be executed in Lava.
Open-source contributions to these libraries are highly welcome. You are invited to extend the collection of neuron models supported by both SLAYER and Bootstrap. Check out the Neurons and Dynamics tutorial to learn how to create custom neuron models from the fundamental linear dynamics API.
Full Changelog: https://github.com/lava-nc/lava-dl/commits/v0.1.0