In natural language, sentences carry sequential information, but a traditional feedforward neural network can't capture it: at any given time step, the network has no access to information from earlier steps. For a model to capture sequential information, it needs to "remember" previous inputs, which is only possible if we let earlier hidden states flow through to the current state. This is where the Recurrent Neural Network (RNN) comes in. As the figure below shows, at time step t, the neuron's output depends on both the current input and the hidden state from time step t − 1. Over time, information from time step 0 can therefore flow all the way through to time step 5, allowing information to persist and the network to capture the sequential structure of the sentence.
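The recurrence can be sketched in a few lines of NumPy. This is a minimal illustration, not a trainable implementation: the weight names, dimensions, and random inputs are all made up for the example. The key point is that the same step function is applied at every time step, and the hidden state `h` passed between steps is what lets information from step 0 reach step 5.

```python
import numpy as np

def rnn_step(x_t, h_prev, W_xh, W_hh, b_h):
    # The new hidden state depends on the current input x_t
    # AND the previous hidden state h_prev -- this is the recurrence.
    return np.tanh(x_t @ W_xh + h_prev @ W_hh + b_h)

rng = np.random.default_rng(0)
input_dim, hidden_dim, seq_len = 4, 3, 6  # illustrative sizes

# Shared weights, reused at every time step
W_xh = rng.standard_normal((input_dim, hidden_dim)) * 0.1
W_hh = rng.standard_normal((hidden_dim, hidden_dim)) * 0.1
b_h = np.zeros(hidden_dim)

h = np.zeros(hidden_dim)  # initial hidden state before time step 0
inputs = rng.standard_normal((seq_len, input_dim))  # stand-in for word vectors

for t in range(seq_len):
    h = rnn_step(inputs[t], h, W_xh, W_hh, b_h)

# h now carries (compressed) information from time steps 0 through 5
print(h.shape)
```

Because `h` is fed back into `rnn_step` at every iteration, each hidden state is a function of the entire input history so far, which is exactly the "memory" described above.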

Ryan

Data Scientist