
Recurrent Neural Networks and Long Short-Term Memory

Recurrent Neural Network

In traditional neural networks, all the inputs and outputs are independent of each other. But in cases like predicting the next word of a sentence, the previous words are required, and hence there is a need to remember them. Thus the Recurrent Neural Network (RNN) came into existence. RNNs are networks with loops in them, allowing information to persist.

Figure: Recurrent Neural Network Architecture

In the diagram above, the network takes $x_t$ as input and outputs $y_t$. The hidden layer applies a formula to the current input as well as the previous state to get the current state $h_t$, where $l_1$, $r_1$, and $l_2$ denote learned linear transformations. The formula for the current state can be written as:

$h_t = \tanh(l_1(x_t) + r_1(h_{t-1}))$

The output can then be calculated as:

$y_t = l_2(h_t)$

This chain-like nature reveals that recurrent neural networks are intimately related to sequences and lists. They are the natural neural network architecture to use for such data.
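To make the recurrence concrete, here is a minimal NumPy sketch of the update above. The weight matrices W_l1, W_r1, and W_l2 stand in for the linear maps $l_1$, $r_1$, and $l_2$ (biases omitted for brevity); the layer sizes and initialization are illustrative assumptions, not details from the original post.

import numpy as np

class VanillaRNN:
    """A minimal vanilla RNN cell, assuming l1, r1, l2 are plain
    weight matrices. Sizes and init scale are illustrative."""

    def __init__(self, input_size, hidden_size, output_size, seed=0):
        rng = np.random.default_rng(seed)
        # l1: input -> hidden, r1: hidden -> hidden, l2: hidden -> output
        self.W_l1 = rng.normal(scale=0.1, size=(hidden_size, input_size))
        self.W_r1 = rng.normal(scale=0.1, size=(hidden_size, hidden_size))
        self.W_l2 = rng.normal(scale=0.1, size=(output_size, hidden_size))

    def step(self, x_t, h_prev):
        # h_t = tanh(l1(x_t) + r1(h_{t-1}))
        h_t = np.tanh(self.W_l1 @ x_t + self.W_r1 @ h_prev)
        # y_t = l2(h_t)
        y_t = self.W_l2 @ h_t
        return h_t, y_t

    def forward(self, xs):
        # Process a whole sequence, carrying the hidden state forward;
        # this loop is the "chain" that lets information persist.
        h_t = np.zeros(self.W_r1.shape[0])
        ys = []
        for x_t in xs:
            h_t, y_t = self.step(x_t, h_t)
            ys.append(y_t)
        return ys

# Example: a sequence of five random 3-dimensional inputs.
rnn = VanillaRNN(input_size=3, hidden_size=4, output_size=2)
outputs = rnn.forward([np.random.randn(3) for _ in range(5)])
print(len(outputs), outputs[0].shape)  # 5 (2,)

Note that the same three weight matrices are reused at every time step; only the hidden state $h_t$ changes as the sequence is consumed, which is exactly what the loop in the architecture diagram depicts.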