
Some examples of important design patterns for recurrent neural networks include the following:

Models that have recurrent connections from their outputs leading back into the model may be trained with teacher forcing. Teacher forcing is a procedure that emerges from the maximum likelihood criterion: during training, the model receives the ground-truth output y(t) as input at time t + 1, rather than feeding back its own prediction.
At test time, the true output y(t) is not available, so the model's own prediction ŷ(t) is fed back in its place.
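The train/test asymmetry above can be sketched in NumPy. This is a minimal illustration, not a full implementation: the toy dimensions, the parameter names (U, R, V), and the single-step update are all hypothetical, chosen only to show the difference between feeding back the ground truth and feeding back the model's own prediction.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy dimensions: input, hidden, and output sizes.
n_in, n_h, n_out = 3, 5, 3

# Parameters of a simple model with output-to-hidden recurrence.
U = rng.normal(scale=0.1, size=(n_h, n_in))   # input -> hidden
R = rng.normal(scale=0.1, size=(n_h, n_out))  # previous output -> hidden
V = rng.normal(scale=0.1, size=(n_out, n_h))  # hidden -> output
b = np.zeros(n_h)
c = np.zeros(n_out)

def step(x_t, y_prev):
    """One time step: the hidden state sees x(t) and the previous output."""
    h = np.tanh(U @ x_t + R @ y_prev + b)
    return V @ h + c

def forward(xs, ys_true, teacher_forcing=True):
    """Run the model over a sequence.

    With teacher forcing, the ground-truth output y(t) is fed back as
    input at time t + 1; otherwise the model's own prediction is.
    """
    y_prev = np.zeros(n_out)
    preds = []
    for t, x_t in enumerate(xs):
        y_hat = step(x_t, y_prev)
        preds.append(y_hat)
        y_prev = ys_true[t] if teacher_forcing else y_hat
    return np.stack(preds)

xs = rng.normal(size=(4, n_in))   # length-4 input sequence
ys = rng.normal(size=(4, n_out))  # ground-truth output sequence
preds_train = forward(xs, ys, teacher_forcing=True)   # training regime
preds_test = forward(xs, ys, teacher_forcing=False)   # test regime
print(preds_train.shape)  # (4, 3)
```

Note that the two regimes agree at t = 0 (both start from the same initial feedback) and then diverge, which is the mismatch that motivates techniques such as scheduled sampling.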
An RNN can take a sequence of vectors x as input; it can also take a single fixed-size vector as input:

When the input x is a sequence:
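For the sequence-input case, the standard hidden-state recurrence h(t) = tanh(W h(t-1) + U x(t) + b) can be sketched as follows. The dimensions and parameter names here are illustrative assumptions, not taken from the text.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical toy dimensions: input size and hidden size.
n_in, n_h = 3, 5

W = rng.normal(scale=0.1, size=(n_h, n_h))   # hidden -> hidden
U = rng.normal(scale=0.1, size=(n_h, n_in))  # input -> hidden
b = np.zeros(n_h)

def rnn_forward(xs, h0=None):
    """Apply h(t) = tanh(W h(t-1) + U x(t) + b) across an input sequence."""
    h = np.zeros(n_h) if h0 is None else h0
    hs = []
    for x_t in xs:
        h = np.tanh(W @ h + U @ x_t + b)
        hs.append(h)
    return np.stack(hs)

xs = rng.normal(size=(6, n_in))  # sequence of 6 input vectors
hs = rnn_forward(xs)
print(hs.shape)  # (6, 5)
```

The same weights W, U, and b are reused at every time step, which is what lets the recurrence handle sequences of any length.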
