Recurrent Neural Networks (RNNs) suffer from short-term memory: if a sequence is long enough, an RNN has trouble carrying information from earlier time steps to later ones, which may force it to leave out important information from the beginning.

More specifically, during backpropagation, recurrent neural networks suffer from vanishing gradients (gradients are the values used to update a neural network's weights). This problem occurs when the gradient shrinks as it backpropagates through time, as the sketch below illustrates.
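To make this concrete, here is a tiny numeric sketch (not from the article): it collapses the recurrent Jacobian into a single scalar weight, which is a simplification, but it shows how the repeated multiplication performed by backpropagation through time drives the gradient toward zero.

```python
# Illustrative sketch: backpropagating through T time steps multiplies the
# gradient by the recurrent Jacobian at every step. Standing in for that
# Jacobian with a scalar weight below 1, the product shrinks exponentially.
T = 50                       # number of time steps to backpropagate through
w_recurrent = 0.9            # hypothetical recurrent weight magnitude
gradient = 1.0               # gradient arriving at the last time step

for _ in range(T):
    gradient *= w_recurrent  # one step of backpropagation through time

print(f"Gradient after {T} steps: {gradient:.6f}")  # ~0.005: nearly vanished
```

With the roles reversed (a weight above 1), the same loop illustrates the mirror-image problem of exploding gradients.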

But First, What is an RNN?

A Recurrent Neural Network (RNN) is a type of neural network where the output from the previous step is fed as input to the current step. In other words, it turns independent activations into dependent ones by sharing the same weights and biases across all time steps, which keeps the parameter count from growing with sequence length and lets the network memorize each previous output by feeding it into the next hidden state.
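To illustrate that definition, here is a minimal vanilla RNN step in NumPy (a sketch, not the article's implementation; the names W_xh, W_hh, and b_h are hypothetical): the same weights are reused at every time step, and the previous hidden state is fed back in alongside the new input.

```python
import numpy as np

def rnn_step(x_t, h_prev, W_xh, W_hh, b_h):
    """One time step of a vanilla RNN: the previous hidden state h_prev is
    fed back in, and the same weights are reused at every step."""
    return np.tanh(x_t @ W_xh + h_prev @ W_hh + b_h)

rng = np.random.default_rng(0)
input_size, hidden_size, seq_len = 4, 8, 10
W_xh = rng.normal(scale=0.1, size=(input_size, hidden_size))   # input weights
W_hh = rng.normal(scale=0.1, size=(hidden_size, hidden_size))  # recurrent weights
b_h = np.zeros(hidden_size)                                    # shared bias

h = np.zeros(hidden_size)                  # initial hidden state
for x_t in rng.normal(size=(seq_len, input_size)):
    h = rnn_step(x_t, h, W_xh, W_hh, b_h)  # previous state feeds the next step
print(h.shape)  # (8,)
```

Note that the loop applies the same `rnn_step` with the same weights at every position, which is exactly the weight sharing described above.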

