W1 lecture on Bi-directional RNN

I am confused about this picture from the bidirectional RNN lecture. The left-to-right indices for h-> I understand: t0 up to T-1, increasing by 1. But the right-to-left indices for <-h I do not understand: t’0, then T, then … then t3, then t2… I think there is a typo by the TA. Maybe you could comment, and if I am right to be confused, perhaps someone could post a corrected picture.
Thank you.

Hello @Dennis_Sinitsky

I don’t know if you have taken the Deep Learning Specialization, so I am sharing a video.

Basically, the left-to-right pass (t0 to t3) computes the network in the forward direction, whereas the other pass covers the same time steps in the backward direction (t3 down to t0). This lets the model predict a word using context from both the forward and the backward direction.

For example, take a sentence: “Dennis asked a doubt on DeepLearning.AI.”

To figure out whether “Deep” refers to DeepLearning.AI, the model can draw information from both directions, provided the whole input sequence is available (which is the main disadvantage of a bidirectional RNN: it needs the complete sequence before it can make a prediction).
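To make the two passes concrete, here is a minimal NumPy sketch of a bidirectional RNN forward step. It is illustrative only, not the lecture's code: the weight names (W_f, U_f, W_b, U_b) and the tanh cell are my assumptions. The key point is that both loops visit the same time indices t = 0..T-1, just in opposite orders, and each position ends up with a forward state (left context) and a backward state (right context).

```python
import numpy as np

def bidirectional_rnn(x, W_f, U_f, W_b, U_b):
    """Sketch of a bidirectional RNN (assumed simple tanh cells, no biases).

    x: (T, input_dim) input sequence.
    Returns (T, 2*hidden): forward and backward states concatenated per step.
    """
    T = x.shape[0]
    hidden = W_f.shape[0]
    h_fwd = np.zeros((T, hidden))
    h_bwd = np.zeros((T, hidden))

    # Forward pass: visits t = 0, 1, ..., T-1.
    h = np.zeros(hidden)
    for t in range(T):
        h = np.tanh(W_f @ h + U_f @ x[t])
        h_fwd[t] = h

    # Backward pass: same time indices, visited in reverse (T-1 down to 0).
    h = np.zeros(hidden)
    for t in reversed(range(T)):
        h = np.tanh(W_b @ h + U_b @ x[t])
        h_bwd[t] = h

    # Position t now sees its left context (h_fwd[t]) and right context (h_bwd[t]).
    return np.concatenate([h_fwd, h_bwd], axis=1)
```

Note that the backward loop cannot produce h_bwd[0] until it has consumed the entire sequence, which is exactly the limitation mentioned above.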

The video I shared explains it properly. Feel free to ask any doubt.


Hi Deepti,
I took the DL Specialization (shame on me… :slight_smile: )
Now I understand the meaning of the notation; all good.
Thank you


Hello @Dennis_Sinitsky

Nothing to be ashamed of in the learning journey!!!
I believe in learning like a kid and exploring knowledge like a miner.
Even I forgot to put salt in my food yesterday :joy::crazy_face:

Just kidding!!!
Keep learning!!!