QReLU and m-QReLU: Two novel quantum activation functions for Deep Learning in TensorFlow, Keras, and PyTorch
Header-only library for using Keras (TensorFlow) models in C++.
PyTorch and TensorFlow implementation of NCP, LTC, and CfC wired neural models
Libraries for applying sparsification recipes to neural networks with a few lines of code, enabli...
Segmentation models with pretrained backbones. Keras and TensorFlow Keras.
Implementations of ResNet-18, ResNet-34, ResNet-50, ResNet-101, and ResNet-152 in TensorFlow 2, b...
ncnn is a high-performance neural network inference framework optimized for the mobile platform
Pytorch❤️ Keras 😋😋
A framework based on Tensorflow for running variational Monte-Carlo simulations of quantum many-b...
TensorFlow and Deep Learning Tutorials
hyper-sinh: An Accurate and Reliable Activation Function from Shallow to Deep Learning in TensorF...
Layer outputs and gradients in Keras. Made easy.
An easy-to-use library for GLU (Gated Linear Units) and GLU variants in TensorFlow.
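For context, the gating rule behind GLU (Dauphin et al., 2017) is simple enough to sketch directly: split the input in half along a feature axis and gate one half with the sigmoid of the other. The snippet below is a minimal NumPy illustration of that rule, not the listed library's API; the function names are mine.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def glu(x, axis=-1):
    # Gated Linear Unit: GLU(a, b) = a * sigmoid(b), where a and b
    # are the two halves of x along `axis`. The sigmoid gate controls
    # how much of each feature in `a` passes through.
    a, b = np.split(x, 2, axis=axis)
    return a * sigmoid(b)

x = np.array([[1.0, -2.0, 0.0, 3.0]])  # last dim 4 -> halves of size 2
y = glu(x)
print(y.shape)  # (1, 2): the output has half the gated dimension
```

GLU variants (ReGLU, GEGLU, SwiGLU, etc.) keep the same split-and-gate structure and only swap the sigmoid for another activation.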
Implements EvoNorms B0 and S0 as proposed in Evolving Normalization-Activation Layers.
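EvoNorm-S0 from that paper combines normalization and activation in one expression: x * sigmoid(v * x) divided by the grouped standard deviation, followed by a learned affine transform. Below is a hedged NumPy sketch of that formula for NCHW tensors; parameter shapes and names are illustrative, not the listed repo's interface.

```python
import numpy as np

def evonorm_s0(x, v, gamma, beta, groups=2, eps=1e-5):
    # EvoNorm-S0 (Liu et al., 2020):
    #   y = x * sigmoid(v * x) / group_std(x) * gamma + beta
    # x: (N, C, H, W); v, gamma, beta: (1, C, 1, 1) learned parameters.
    n, c, h, w = x.shape
    g = x.reshape(n, groups, c // groups, h, w)
    # Std over each group's channels and spatial positions.
    std = np.sqrt(g.var(axis=(2, 3, 4), keepdims=True) + eps)
    std = np.broadcast_to(std, g.shape).reshape(n, c, h, w)
    num = x / (1.0 + np.exp(-v * x)) * 1.0  # x * sigmoid(v * x)
    return num / std * gamma + beta

x = np.random.randn(2, 4, 3, 3)
v = np.ones((1, 4, 1, 1))
gamma = np.ones((1, 4, 1, 1))
beta = np.zeros((1, 4, 1, 1))
y = evonorm_s0(x, v, gamma, beta)
print(y.shape)  # (2, 4, 3, 3): same shape as the input
```

Because the nonlinearity and normalization are fused, EvoNorm layers replace a BatchNorm/GroupNorm + ReLU pair rather than slotting in beside one.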
PyTorch and TensorFlow/Keras image models with automatic weight conversions and equal API/impleme...
MMdnn is a set of tools to help users inter-operate among different deep learning frameworks. E.g...