C4W1_Assignment exercise 3 - decoder

Hi @pongyuenlam

Your logits shape is missing a unit when compared to the expected output.

So first I would check whether the two lines below, in the `def call` of the Decoder layer, are implemented correctly:

Do a pass through the post attention LSTM:

x = ???(x)

Make sure you call this with the correct post-attention LSTM, which would be the RNN that comes after the attention block.

Compute the logits:

logits = ???(x)

For this you are supposed to use the Dense layer with log softmax activation. Which one would that be?
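
For orientation only, here is a minimal sketch of what these two lines usually look like, assuming the layers were stored in `__init__` under the hypothetical names `post_attention_rnn` and `output_layer` (your notebook's names may differ):

```python
# pass through the post attention LSTM, i.e. the RNN defined after
# the attention block (hypothetical attribute name)
x = self.post_attention_rnn(x)

# compute the logits with the Dense layer that has log softmax
# activation (hypothetical attribute name)
logits = self.output_layer(x)
```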

If the two lines above are already correct, then check the two lines below, where these layers are defined in `__init__`:

  1. The RNN after attention: did you set the units correctly, and what should `return_sequences` be?
  2. The Dense layer with log softmax activation: here the number of units is not `units`; check the instructions section for this layer. It should have the same number of units as the size of the vocabulary, since you expect it to compute the logits for every possible word in the vocabulary. (See the sketch right after this list.)
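
As a sketch of points 1 and 2, assuming TensorFlow/Keras and the same hypothetical attribute names as above, the two layer definitions would look roughly like this:

```python
import tensorflow as tf

# inside Decoder.__init__(self, vocab_size, units):

# RNN after attention: uses the model's `units`, and must return the
# full sequence (one hidden state per time step) so the Dense layer
# can compute logits for every position, not just the last one
self.post_attention_rnn = tf.keras.layers.LSTM(units, return_sequences=True)

# Dense layer with log softmax activation: `vocab_size` units, not
# `units`, so it produces one logit per word in the vocabulary
self.output_layer = tf.keras.layers.Dense(
    vocab_size, activation=tf.nn.log_softmax
)
```

With `return_sequences=True` and `vocab_size` output units, the logits should come out with shape `(batch, target_length, vocab_size)`, which is what the expected output check is comparing against.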

If all of these are as mentioned, then DM me a screenshot of the graded cell for the Decoder; chances are your previous cell won't have matching units.

Regards
DP
