C5 W2 A2 Ex 5 - Error in test

Dear mentor, can you please advise?

Test failed
Expected value

['LSTM', (None, 4, 128), 67072, (None, 4, 2), 'tanh', True]

does not match the input value:

['LSTM', (None, 128), 67072, (None, 4, 2), 'tanh', False]

AssertionError                            Traceback (most recent call last)
in
     28
     29
---> 30 Emojify_V2_test(Emojify_V2)

in Emojify_V2_test(target)
     25
     26     expectedModel = [['InputLayer', [(None, 4)], 0], ['Embedding', (None, 4, 2), 30], ['LSTM', (None, 4, 128), 67072, (None, 4, 2), 'tanh', True], ['Dropout', (None, 4, 128), 0, 0.5], ['LSTM', (None, 128), 131584, (None, 4, 128), 'tanh', False], ['Dropout', (None, 128), 0, 0.5], ['Dense', (None, 5), 645, 'linear'], ['Activation', (None, 5), 0]]
---> 27     comparator(summary(model), expectedModel)
     28
     29

~/work/W2A2/test_utils.py in comparator(learner, instructor)
     21             "\n\n does not match the input value: \n\n",
     22             colored(f"{a}", "red"))
---> 23         raise AssertionError("Error in test")
     24     print(colored("All tests passed!", "green"))
     25

AssertionError: Error in test

Please check the value of the return_sequences argument of each LSTM layer.

First LSTM: return_sequences=True
Second LSTM: return_sequences=False
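The effect of return_sequences on the output shape can be sketched in plain Python (no TensorFlow needed); the helper function below is just an illustration, using the shapes from the expected summary:

```python
def lstm_output_shape(input_shape, units, return_sequences):
    """Shape inference for a Keras-style LSTM layer.

    input_shape: (batch, timesteps, features); batch may be None.
    """
    batch, timesteps, _ = input_shape
    if return_sequences:
        # One hidden state per timestep: the next LSTM still sees a sequence.
        return (batch, timesteps, units)
    # Only the final hidden state: the sequence dimension is dropped.
    return (batch, units)

# First LSTM: keep the full sequence for the second LSTM.
print(lstm_output_shape((None, 4, 2), 128, return_sequences=True))    # (None, 4, 128)
# Second LSTM: collapse to the last hidden state before Dropout/Dense.
print(lstm_output_shape((None, 4, 128), 128, return_sequences=False))  # (None, 128)
```

This is why the test expects (None, 4, 128) for the first LSTM and (None, 128) for the second.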

I think you are propagating embedding_layer, not embeddings, through the LSTM layer with the 128-dimensional hidden state. It must be embeddings.

I had embeddings inside the final parentheses of both LSTMs. OK, I guess it should have been (X) on the second one, since I have to propagate X through the 2nd LSTM.

Yes.
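For anyone hitting the same error, here is a minimal sketch of the intended wiring. The sizes are placeholders (vocab of 5 words, embedding dimension 2, max length 4, 5 classes), not the assignment's real pretrained GloVe embedding layer, and it is a sketch of the pattern rather than the graded solution:

```python
from tensorflow.keras.layers import Input, Embedding, LSTM, Dropout, Dense, Activation
from tensorflow.keras.models import Model

sentence_indices = Input(shape=(4,), dtype='int32')

# The Embedding layer is called on the input tensor; `embeddings` is the
# resulting tensor, while the layer object itself is not what gets propagated.
embeddings = Embedding(input_dim=5, output_dim=2)(sentence_indices)

# First LSTM receives `embeddings` and returns the whole sequence.
X = LSTM(128, return_sequences=True)(embeddings)   # (None, 4, 128)
X = Dropout(0.5)(X)
# Second LSTM receives X and returns only the last hidden state.
X = LSTM(128, return_sequences=False)(X)           # (None, 128)
X = Dropout(0.5)(X)
X = Dense(5)(X)                                    # linear Dense, as in the summary
X = Activation('softmax')(X)

model = Model(inputs=sentence_indices, outputs=X)
```

The key point is that each layer call takes the previous tensor (embeddings, then X), which is exactly the fix discussed above.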