C4W1 Assignment - Decoder part - Dimension problem


After passing 'context' to x = self.embedding(context), the output becomes 4-dimensional, which is in turn not accepted by self.pre_attention_rnn(x, initial_state=state). I have no idea why this is happening.
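To illustrate the symptom, here is a minimal sketch (made-up sizes, not the graded code): an Embedding layer always appends one axis of size output_dim, so feeding it a tensor that is already 3-D yields a 4-D output.

```python
import tensorflow as tf

# Placeholder sizes, not the assignment's code: Embedding appends one
# axis of size output_dim to whatever tensor it receives.
embedding = tf.keras.layers.Embedding(input_dim=1000, output_dim=64)

ids = tf.zeros((32, 10), dtype=tf.int32)        # token ids: (batch, seq)
print(embedding(ids).shape)                     # (32, 10, 64): 3-D, what the LSTM expects

wrong = tf.zeros((32, 10, 64), dtype=tf.int32)  # already 3-D, e.g. an already-embedded tensor
print(embedding(wrong).shape)                   # (32, 10, 64, 64): 4-D, rejected by the LSTM
```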

Problems of this kind are difficult to debug without looking at the actual code. Here are my guesses (a quick shape-check sketch follows the list):

  1. The matrix operations leading to layer lstm_245 are incorrect - mostly because matrix multiplication is not commutative (in general, AB ≠ BA)
  2. Ordering of the layers is incorrect
  3. You missed a .sum(axis=-1) somewhere
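Whichever of these it is, printing the shape after every step will show where the extra axis first appears. A hedged sketch with placeholder layer sizes, not the assignment's graded code:

```python
import tensorflow as tf

# Placeholder sizes: trace the shape through each decoder step to find
# where the extra axis is introduced.
embedding = tf.keras.layers.Embedding(input_dim=1000, output_dim=64)
pre_attention_rnn = tf.keras.layers.LSTM(units=64, return_sequences=True,
                                         return_state=True)

target = tf.zeros((32, 10), dtype=tf.int32)  # token ids: (batch, seq)
print("token ids:      ", target.shape)      # (32, 10)
x = embedding(target)
print("after embedding:", x.shape)           # (32, 10, 64): must be 3-D here
x, h, c = pre_attention_rnn(x)
print("after pre-attention RNN:", x.shape)   # (32, 10, 64)
```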

Were you able to pass all unit tests leading up to this cell? If yes, were you able to get the tests to pass starting from a blank session (click Kernel → Restart and Run All Cells)? If none of these steps helps, can you DM me the relevant error and code chunk?

Check your arguments, in particular return_sequences, in the RNN layer after attention. You have probably missed one of them.
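For reference, a short sketch of what that argument changes (placeholder sizes, not the graded code): without return_sequences=True an LSTM keeps only the last timestep, which breaks any downstream layer that expects the whole sequence.

```python
import tensorflow as tf

x = tf.zeros((32, 10, 64))  # (batch, seq, features): placeholder shapes

# Default: only the final timestep is returned.
last_only = tf.keras.layers.LSTM(units=128)(x)
print(last_only.shape)      # (32, 128): the sequence axis is gone

# With return_sequences=True: one output per timestep.
full_seq = tf.keras.layers.LSTM(units=128, return_sequences=True)(x)
print(full_seq.shape)       # (32, 10, 128)
```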

Use the search option; you will find many similar threads.