This repository contains various types of attention mechanisms, such as Bahdanau, soft, additive, and hierarchical attention, implemented in PyTorch, TensorFlow, and Keras.
A configurable encoder-decoder sequence-to-sequence model, built with TensorFlow.
Chapter 9: Attention and Memory-Augmented Networks
TensorFlow 2.0 tutorials on RNN-based architectures for text problems
Generate captions from images
A multi-layer bidirectional seq2seq chatbot with Bahdanau attention.
Master's project on image captioning using supervised deep learning methods
A simple, easy-to-understand NLP tutorial
A Seq2Seq model implemented in PyTorch, using Bahdanau attention and Luong attention.
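The two attention mechanisms named above differ only in how alignment energies are scored: Bahdanau attention is additive, while Luong's "general" variant is multiplicative. A minimal NumPy sketch with toy dimensions (the weight names `W_s`, `W_h`, `v`, and `W` follow common notation, not any specific repo's API):

```python
import numpy as np

def bahdanau_score(s, H, W_s, W_h, v):
    # Additive scoring: e_t = v^T tanh(W_s s + W_h h_t)
    return np.tanh(s @ W_s.T + H @ W_h.T) @ v

def luong_score(s, H, W):
    # Multiplicative ("general") scoring: e_t = s^T W h_t
    return H @ (s @ W)

def softmax(e):
    e = e - e.max()          # subtract max for numerical stability
    w = np.exp(e)
    return w / w.sum()

rng = np.random.default_rng(0)
T, d_enc, d_dec, d_att = 5, 6, 4, 8
H = rng.normal(size=(T, d_enc))        # encoder hidden states h_1..h_T
s = rng.normal(size=d_dec)             # current decoder state
W_s = rng.normal(size=(d_att, d_dec))  # projects the decoder state
W_h = rng.normal(size=(d_att, d_enc))  # projects the encoder states
v = rng.normal(size=d_att)             # scoring vector
W = rng.normal(size=(d_dec, d_enc))    # Luong bilinear matrix

a_bahdanau = softmax(bahdanau_score(s, H, W_s, W_h, v))
a_luong = softmax(luong_score(s, H, W))
context = a_bahdanau @ H               # attention-weighted context vector
```

Either set of weights yields a probability distribution over the `T` encoder states; the context vector is the corresponding weighted sum.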
Implementation of the paper "Neural Machine Translation by Jointly Learning to Align and Translate"
s-atmech is an independent open-source deep learning Python library that implements the attention mechanism as an RNN (recurrent neural network) layer in an encoder-decoder system (only Bahdanau attention is supported right now).
Image captioning is the process of generating a textual description of an image; it combines natural language processing and computer vision to generate the captions.
Implementation of a GRU-based encoder-decoder architecture with the Bahdanau attention mechanism for German-to-English machine translation.
A language translator based on a very simple NLP Transformer model, with an encoder, a decoder, and a Bahdanau attention layer in between, implemented in TensorFlow.
Sequence-to-sequence models with attention mechanisms in TensorFlow v2
Solution for the Quora Insincere Questions Classification Kaggle competition.
Natural Language Processing
Implemented an Encoder-Decoder model in TensorFlow, where ResNet-50 extracts features from the VizWiz-Captions image dataset and a GRU with Bahdanau attention generates captions.
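In a captioning decoder like the one described above, the attention weights over the CNN feature map produce a context vector that is fed to the GRU alongside the word embedding. A hypothetical sketch, assuming a 7x7 ResNet-50 feature map flattened to T = 49 vectors and using uniform weights as a stand-in for a learned Bahdanau alignment:

```python
import numpy as np

rng = np.random.default_rng(1)
T, d_feat, d_emb = 49, 2048, 256          # 7x7 grid, feature and embedding dims
features = rng.normal(size=(T, d_feat))   # CNN image features (one row per cell)
a = np.full(T, 1.0 / T)                   # attention weights (sum to 1)
context = a @ features                    # weighted sum -> context vector
word_emb = rng.normal(size=d_emb)         # embedding of the previous word
gru_input = np.concatenate([context, word_emb])  # input to the GRU cell
```

At each decoding step the weights `a` are recomputed from the current GRU state, so the decoder can attend to different image regions per generated word.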
This repository contains TensorFlow/Keras models for implementing an Encoder-Decoder architecture for sequence-to-sequence tasks. It includes components such as Encoder, Decoder, Embedding Layer, LSTM Layer, Attention Mechanism, and more.