LSTM in pictures (Week 1) - computing a&lt;t&gt;

Why, on slide 38 of “LSTM in pictures”, is the formula to calculate a&lt;t&gt; given as Gamma_output * c, when the diagram suggests that we apply a tanh function to c before it is multiplied by the output of the output gate (which I assume is Gamma_output)?
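To spell out the two candidate equations side by side (my own rendering of the slide/diagram notation, with * denoting elementwise multiplication):

$$a^{\langle t \rangle} = \Gamma_o * c^{\langle t \rangle} \quad \text{(slide)} \qquad \text{vs.} \qquad a^{\langle t \rangle} = \Gamma_o * \tanh\left(c^{\langle t \rangle}\right) \quad \text{(diagram)}$$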

The one involving the tanh is the right one. You’ll implement that in the assignment as well.
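For anyone who wants to sanity-check this, here is a minimal numpy sketch of one LSTM cell step with the tanh in place, following the lecture notation. The parameter names (Wf, bf, Wu, bu, Wc, bc, Wo, bo) are my own shorthand, not necessarily what the assignment uses:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_cell_step(xt, a_prev, c_prev, p):
    """One LSTM time step.

    xt: input x<t>, shape (n_x, m)
    a_prev, c_prev: a<t-1> and c<t-1>, shape (n_a, m)
    p: dict of weights/biases (names are illustrative only)
    """
    concat = np.concatenate([a_prev, xt], axis=0)    # stack [a<t-1>; x<t>]

    gamma_f = sigmoid(p["Wf"] @ concat + p["bf"])    # forget gate
    gamma_u = sigmoid(p["Wu"] @ concat + p["bu"])    # update gate
    c_tilde = np.tanh(p["Wc"] @ concat + p["bc"])    # candidate cell state
    c_next  = gamma_f * c_prev + gamma_u * c_tilde   # c<t>
    gamma_o = sigmoid(p["Wo"] @ concat + p["bo"])    # output gate

    # The point of this thread: tanh is applied to c<t> BEFORE the
    # elementwise multiply with the output gate.
    a_next = gamma_o * np.tanh(c_next)               # a<t> = Gamma_o * tanh(c<t>)
    return a_next, c_next
```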

Here is the screenshot for the above post. The formula needs to be clarified or corrected, as there is clearly no mention of a tanh.

See this lecture https://www.coursera.org/learn/nlp-sequence-models/lecture/KXoay/long-short-term-memory-lstm at 4:32 for the tanh. Can you pinpoint where you’re seeing a version without the tanh?

On slide 38 of “LSTM in pictures”, the formula to calculate a&lt;t&gt; is given as Gamma_output (in red) * c (in green). There is no tanh (in yellow) applied to the output of c (in green).

On the diagram, however, the output of this tanh function is multiplied with Gamma_output to give us a&lt;t&gt;.

Which of the equations is the tanh on the diagram referring to?

Having checked the programming assignment, I see there is a discrepancy between the lecture slide (above) and the programming assignment, which matches what is in the diagram.

I think it’s worth adding a note after the lecture slides to advise students what the correct formula is, or at least that there are different ways to compute a&lt;t&gt;.

The slide you’re showing doesn’t seem to exist in the video. @paulinpaloalto could you please file a ticket to fix the slide? Thanks.

Balaji, I think you must be looking in the wrong course. This is not from the NLP course. Kaiju filed this under DLS Course 5, which is where the slide occurs. I’ll file a bug about the error. Thanks for reporting it!

@paulinpaloalto https://www.coursera.org/learn/nlp-sequence-models?specialization=deep-learning is the 5th course in DLS. Am I missing something?

Oh, sorry, you’re right. I was misled by the “nlp” in the URL and missed the “deep-learning” at the end. The slide Kaiju is referring to is in the lecture notes for C5 W1. It looks like the same mistake exists on slides 37 and 38. Note that it’s wrong in the lecture at 4:09, but then in the next slide, the one you point out at 4:32, Prof Ng has it correct. Also note that they do supply a correction reading item about the lectures that covers exactly this point.


No worries. Thank you.