Keras Attention Layer (Luong and Bahdanau scores).
Probing the representations of Vision Transformers.
Collection of custom layers and utility functions for Keras which are missing in the main framework.
Layer outputs and gradients in Keras, made easy.
Neural network visualization toolkit for tf.keras
This repository presents a Python-based implementation of the Transformer architecture on Keras T...
Keras Temporal Convolutional Network.
Attention block for Keras Functional models, TensorFlow backend only.
Adventure into using multi-attention recurrent neural networks for time-series (city traffic) for...
Implementation of the Transformer architecture described by Vaswani et al. in "Attention Is All You Need".
Keras implementation of the graph attention networks (GAT) by Veličković et al. (2017; https://ar...
Chinese text classification with Keras NLP: long texts, short sentences, multi-label classification, and sentence-pair similarity.
ktrain is a Python library that makes deep learning and AI more accessible and easier to apply
Keras implementation of the "Show, Attend and Tell" paper
Deep learning code and projects using Python.
A curated list of dedicated resources and applications