In an ANN, the equation during forward propagation is
Y = W.X + b.
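For reference, here is a minimal NumPy sketch of that dense forward pass; the shapes and values are arbitrary, chosen only to illustrate the equation above:

```python
import numpy as np

# Illustrative ANN forward step: Y = W.X + b (shapes chosen arbitrarily)
X = np.random.randn(4)       # input vector with 4 features
W = np.random.randn(3, 4)    # weight matrix mapping 4 inputs to 3 outputs
b = np.random.randn(3)       # bias vector

Y = W @ X + b                # single linear forward-propagation step
print(Y.shape)               # (3,)
```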
What is the corresponding equation during forward propagation for an RNN, given that it also involves a hidden state carried across time steps?
What is the difference between an ANN and an RNN in terms of back propagation?
Also, what is the difference in functionality between Dropout in an ANN and recurrent_dropout in an RNN?
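To show where these two options live, here is a hedged Keras sketch (layer sizes, rates, and input shapes are arbitrary assumptions, used only to point at the Dropout layer versus the dropout/recurrent_dropout arguments):

```python
from tensorflow.keras.layers import Dense, Dropout, LSTM, Input
from tensorflow.keras.models import Sequential

# Feedforward (ANN) model: Dropout is a separate layer applied to activations.
ann = Sequential([
    Input(shape=(10,)),
    Dense(32, activation="relu"),
    Dropout(0.2),                      # randomly zeroes 20% of the Dense outputs during training
    Dense(1, activation="sigmoid"),
])

# Recurrent (RNN) model: dropout and recurrent_dropout are arguments of the recurrent layer.
rnn = Sequential([
    Input(shape=(20, 10)),             # 20 time steps, 10 features per step
    LSTM(32, dropout=0.2, recurrent_dropout=0.2),
    Dense(1, activation="sigmoid"),
])
```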
Are there any other key differences between ANN and RNN?