RNN (Recurrent Neural Network)

How does RNN remember past data?
Can anyone explain the core conceptual difference between an ANN and an RNN?

Answers to both questions:

  1. An RNN uses a memory mechanism: a hidden state, combined with weight sharing across time steps. Whenever the next state is computed, the network takes the previous data points into account (this is what allows an RNN to remember) and uses that information to predict the current output. To follow this fully, you need to understand forward and backward propagation. For more detail, the Sequence Models course explains all of this very clearly. A minimal sketch of one recurrent step follows after this list.
  2. The basic difference between an ANN and an RNN is their structure. An ANN follows a feedforward path, i.e., data flows in one direction, whereas an RNN uses a recurrent structure, with a loop that feeds the hidden state back in at each step.
    Hope this clears up the basic concept.
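
To make the hidden-state idea concrete, here is a minimal NumPy sketch of a vanilla RNN forward pass. The input size, hidden size, and sequence length are arbitrary values chosen for illustration, not anything from the course:

```python
import numpy as np

rng = np.random.default_rng(0)
input_size, hidden_size = 3, 4  # assumed sizes, purely for illustration

# The SAME weights are reused at every time step (weight sharing).
W_xh = rng.normal(size=(hidden_size, input_size))   # input -> hidden
W_hh = rng.normal(size=(hidden_size, hidden_size))  # hidden -> hidden
b_h = np.zeros(hidden_size)

def rnn_step(x_t, h_prev):
    """One recurrent step: h_t = tanh(W_xh @ x_t + W_hh @ h_prev + b)."""
    return np.tanh(W_xh @ x_t + W_hh @ h_prev + b_h)

# Process a sequence of 5 time steps; h carries information forward,
# which is how the network "remembers" earlier inputs.
h = np.zeros(hidden_size)
for x_t in rng.normal(size=(5, input_size)):
    h = rnn_step(x_t, h)
print(h)
```

Note how `h` is the only thing carried between steps, while `W_xh` and `W_hh` never change within the sequence; that reuse is the weight sharing mentioned above.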

Essentially, an ANN uses flat 2D weight matrices, and the same weights are applied to every example.

In comparison, an RNN's input gains an additional dimension (time), but the weights do not: the same weight matrices are reused at every step of the sequence, while a hidden state carries information from one step to the next.
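
Here is a hedged shape comparison, with the batch, feature, hidden, and time sizes all invented for illustration:

```python
import numpy as np

batch, features, hidden, time_steps = 32, 10, 16, 8  # assumed sizes

# Feedforward ANN: one 2D weight matrix, applied to every example alike.
x_ann = np.zeros((batch, features))      # (batch, features)
W_ann = np.zeros((features, hidden))     # shared across all examples
out_ann = x_ann @ W_ann                  # (batch, hidden)

# RNN: the data gains a time dimension, the weights do not.
x_rnn = np.zeros((batch, time_steps, features))  # (batch, time, features)
W_xh = np.zeros((features, hidden))              # input -> hidden
W_hh = np.zeros((hidden, hidden))                # hidden -> hidden (the loop)

h = np.zeros((batch, hidden))
for t in range(time_steps):
    # The same W_xh and W_hh are reused at every time step.
    h = np.tanh(x_rnn[:, t, :] @ W_xh + h @ W_hh)
```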


Like a time=series problem, I imagine.

This is true.

They are used for any situation that involves a series of events. Language models are a big one (each word is an item in the sequence).
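
For instance, a sentence becomes a sequence the RNN consumes one word per time step. The tiny vocabulary and sentence below are made up purely for illustration:

```python
import numpy as np

# A made-up three-word vocabulary and sentence.
vocab = {"the": 0, "cat": 1, "sat": 2}
sentence = ["the", "cat", "sat"]

# One-hot encode each word: the result is a (time_steps, vocab_size)
# array with one row per time step, exactly the shape an RNN loops over.
seq = np.eye(len(vocab))[[vocab[w] for w in sentence]]
print(seq.shape)  # (3, 3)
```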

@Dr.Stone @TMosh If you are talking about sequence data, then the sequence may or may not be a time series.

Let's consider the example of a speech signal. Here we can say it is a time series. Let's try to understand this from a signal-processing point of view.
When you write
time = series
it reads as if time is on the x-axis and the y-axis represents its amplitude (or we could say "series" refers to the data points). For casual understanding it is okay to write time=series, but whenever we refer to such a signal we say "time series" and generally avoid "time=series". It makes a difference.

So we always write the equation as
x(t) = signal amplitude as a function of time t
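
As a small sketch of what such a signal looks like numerically; the 1 kHz sample rate and 5 Hz sine wave are assumed values chosen just for this example:

```python
import numpy as np

fs = 1000                      # assumed sample rate: 1000 samples per second
t = np.arange(0, 1, 1 / fs)    # time, the x-axis
x = np.sin(2 * np.pi * 5 * t)  # x(t): amplitude as a function of time, the y-axis
print(t[:3], x[:3])
```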

I hope you got your answer.

I agree, sequences do not have to be time series.


10 billion percent! time = series was a typo; glad to get that clarified. Instead of "time series" it's really sequential data in general, with time series being just one instance.
