C5_W2_A2 Emojify

Hi, I’m having some trouble with the last exercise of the Emojify assignment. I passed all the previous exercises, but on the last one I get this error:

Test failed
Expected value

['LSTM', (None, 4, 128), 67072, (None, 4, 2), 'tanh', True]

does not match the input value:

['LSTM', (None, 128), 67072, (None, 4, 2), 'tanh', False]

AssertionError Traceback (most recent call last)
—> 24 Emojify_V2_test(Emojify_V2)

in Emojify_V2_test(target)
20 expectedModel = [['InputLayer', [(None, 4)], 0], ['Embedding', (None, 4, 2), 30], ['LSTM', (None, 4, 128), 67072, (None, 4, 2), 'tanh', True], ['Dropout', (None, 4, 128), 0, 0.5], ['LSTM', (None, 128), 131584, (None, 4, 128), 'tanh', False], ['Dropout', (None, 128), 0, 0.5], ['Dense', (None, 5), 645, 'linear'], ['Activation', (None, 5), 0]]
—> 21 comparator(summary(model), expectedModel)

~/work/W2A2/test_utils.py in comparator(learner, instructor)
21 "\n\n does not match the input value: \n\n",
22 colored(f"{a}", "red"))
—> 23 raise AssertionError("Error in test")
24 print(colored("All tests passed!", "green"))

AssertionError: Error in test

I’m quite sure the error is in the way I call the first LSTM layer, but I don’t understand what I’m doing wrong. This is the first part of my code:

{mentor edit: code removed}

Can you PM me the notebook in PDF form?

There are two LSTM layers in that function. You could have an error in either one, or any of the code between them.

Please don’t post your code on the Forum. That breaks the course Honor Code.

Hi, I’m also having a similar problem. Can you please take a look at my notebook (Lab ID: tixmogtm) and help me identify what’s wrong? Thank you.

I will surely have a look. Can you please PM me your notebook in PDF format, as that would be easier for me to access?
Thanks and Regards,
Mayank Ghogale


Although this may already be solved, the original post gives a clue to the possible solution. The post says:

Expected value

['LSTM', (None, 4, 128), 67072, (None, 4, 2), 'tanh', True]

does not match the input value:

['LSTM', (None, 128), 67072, (None, 4, 2), 'tanh', False]

It may be that the LSTM layer in question was set to return only its last hidden state instead of the full sequence of hidden states: the expected output shape is (None, 4, 128), one 128-unit state per time step, but the actual shape is (None, 128), just the final state.

I would check the last element of each LSTM entry in that list (the `return_sequences` flag) to make sure the appropriate truth value is being passed to each LSTM layer.
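To see what that flag does in isolation (without reproducing any assignment code), here is a minimal sketch assuming `tensorflow.keras`, using the same toy dimensions as the test output above (4 time steps, 2 features, 128 units):

```python
from tensorflow.keras.layers import Input, LSTM

# A sequence of 4 time steps, each with 2 features (like the tiny
# embedding in the test's expected model).
x = Input(shape=(4, 2))

# return_sequences=True: one hidden state per time step -> (None, 4, 128).
# This is what the next LSTM layer in a stack needs as input.
seq = LSTM(128, return_sequences=True)(x)

# return_sequences=False (the default): only the final hidden state
# -> (None, 128), which is what you feed into a Dense classifier.
last = LSTM(128, return_sequences=False)(x)

print(seq.shape, last.shape)
```

The mismatch in the error above is exactly the difference between these two shapes, so passing `True` where a subsequent LSTM expects a full sequence (and `False` before the final Dense layer) resolves it.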

Hopefully this helps someone.


Thank you, Juan and Mayank.

I just restarted my kernel and re-ran the whole notebook, and it worked (without any change). So I guess it was some unrelated issue. But I appreciate your help.