Error in Week 3, Assignment 1, Exercise 2

I am getting a dimension-mismatch error: the shapes of the layers are not compatible with one another.

The traceback points to a context line, but I can't figure out what is wrong there.

Help would be appreciated!

Hey @stirom, your implementation of “a” is incorrect. You are passing in neither the correct input nor the correct hidden-state size.

Thanks for replying. Unfortunately, I still don't fully understand what I am doing wrong.
My thinking is that the number of units in the pre-attention layer is equal to n_a (units = n_a), and that X is the input passed to this pre-attention layer. I can't really think of anything other than X to pass to the Bidirectional LSTM. Could you please give another hint?

Hi @stirom ,

The units value you set for the LSTM is incorrect. Tx is the length of the input sequence; what LSTM needs is the size of its hidden state, so that should be n_a.
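
To make this concrete, here is a minimal sketch of the pre-attention layer, assuming the standard Keras `Bidirectional(LSTM(...))` pattern used in this assignment. The sizes `Tx`, `n_x`, and `n_a` below are placeholder values for illustration, not the notebook's actual numbers:

```python
from tensorflow.keras.layers import Input, Bidirectional, LSTM

# Placeholder sizes for illustration only; the notebook defines its own.
Tx = 30    # number of time steps in the input sequence
n_x = 37   # size of each input vector at a time step
n_a = 32   # hidden-state size of the pre-attention LSTM

X = Input(shape=(Tx, n_x))

# units is the hidden-state size n_a, not the sequence length Tx.
# return_sequences=True returns the activation at every time step,
# so the attention mechanism can attend over the whole sequence.
a = Bidirectional(LSTM(units=n_a, return_sequences=True))(X)

# a has shape (batch, Tx, 2 * n_a): the forward and backward hidden
# states are concatenated along the last axis.
```

If `units` is set to Tx instead of n_a, the output's feature dimension becomes 2 * Tx rather than 2 * n_a, and any downstream layer expecting 2 * n_a features will raise a shape-mismatch error like the one described above.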
