In this article we will discuss a very interesting topic in natural language processing (NLP): Neural Machine Translation (NMT) using an attention model. Machine translation is the automatic translation of text from one language to another.
Here we will learn how to use a sequence-to-sequence (seq2seq) architecture with Bahdanau's attention mechanism for NMT.
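To make the idea concrete, here is a minimal NumPy sketch of Bahdanau (additive) attention. All sizes and weight matrices below are hypothetical stand-ins for learned parameters; the point is only the scoring, softmax, and context-vector steps.

```python
import numpy as np

# Hypothetical sizes: T encoder steps, encoder/decoder hidden dims,
# and a shared attention dimension. These are illustrative only.
rng = np.random.default_rng(0)
T, h_enc, h_dec, h_att = 5, 8, 8, 10

enc_states = rng.normal(size=(T, h_enc))   # encoder hidden states h_1..h_T
dec_state = rng.normal(size=(h_dec,))      # previous decoder state s_{t-1}

# "Learned" parameters, randomly initialised here for illustration.
W_a = rng.normal(size=(h_att, h_dec))
U_a = rng.normal(size=(h_att, h_enc))
v_a = rng.normal(size=(h_att,))

# Alignment scores: e_i = v_a^T tanh(W_a s_{t-1} + U_a h_i)
scores = np.tanh(enc_states @ U_a.T + dec_state @ W_a.T) @ v_a

# Softmax over encoder positions gives the attention weights.
weights = np.exp(scores - scores.max())
weights /= weights.sum()

# Context vector: attention-weighted sum of encoder states.
context = weights @ enc_states
print(context.shape)  # (8,)
```

The decoder then consumes this context vector together with its own state to predict the next target word, which is what lets it focus on different source positions at each step.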
This article assumes some background in recurrent neural networks. Before going through the code, we will briefly discuss the bidirectional LSTM and the attention mechanism.
If you understand LSTMs, then the bidirectional variant is quite simple. In a bidirectional network you can use a simple RNN (Recurrent Neural Network), a GRU (Gated Recurrent Unit), or an LSTM (Long Short-Term Memory). I am going to use an LSTM in this article.
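The core idea can be sketched in a few lines of NumPy. For brevity, a toy tanh RNN cell stands in for the LSTM here (the article itself uses an LSTM); the bidirectional part is identical either way: run the sequence forward and backward, then concatenate the two hidden states at each time step.

```python
import numpy as np

# Hypothetical toy sizes and randomly initialised weights, for illustration.
rng = np.random.default_rng(0)
T, d_in, d_hid = 4, 3, 5

x = rng.normal(size=(T, d_in))       # input sequence of T steps
W = rng.normal(size=(d_hid, d_in))   # input-to-hidden weights
U = rng.normal(size=(d_hid, d_hid))  # hidden-to-hidden (recurrent) weights

def run(seq):
    """One directional pass of a simple tanh RNN cell."""
    h = np.zeros(d_hid)
    outs = []
    for x_t in seq:
        h = np.tanh(W @ x_t + U @ h)
        outs.append(h)
    return outs

fwd = run(x)               # left-to-right pass
bwd = run(x[::-1])[::-1]   # right-to-left pass, re-aligned to time order

# Each step's representation now sees both past and future context.
h_bi = [np.concatenate([f, b]) for f, b in zip(fwd, bwd)]
print(h_bi[0].shape)  # (10,) = forward + backward hidden sizes
```

In practice a framework handles this for you (e.g. a bidirectional flag or wrapper around the recurrent layer), but the output is the same: a per-step vector twice the size of a single direction's hidden state.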
#machine-translation #nlp #machine-learning #deep-learning