Neural_machine_translation_with_attention Ex2: "Error in test. The lists contain a different number of elements"

Hi all. I cannot understand why this is happening:

expected_summary = [['InputLayer', [(None, 30, 37)], 0],
['InputLayer', [(None, 64)], 0],
['Bidirectional', (None, 30, 64), 17920],
['RepeatVector', (None, 30, 64), 0, 30],
['Concatenate', (None, 30, 128), 0],
['Dense', (None, 30, 10), 1290, 'tanh'],
['Dense', (None, 30, 1), 11, 'relu'],
['Activation', (None, 30, 1), 0],
['Dot', (None, 1, 64), 0],
['InputLayer', [(None, 64)], 0],
['LSTM', [(None, 64), (None, 64), (None, 64)], 33024, [(None, 1, 64), (None, 64), (None, 64)], 'tanh'],
['Dense', (None, 11), 715, 'softmax']]

summary of model = [['InputLayer', [(None, 30, 37)], 0],
['InputLayer', [(None, 64)], 0],
['Bidirectional', (None, 30, 64), 17920],
['RepeatVector', (None, 30, 64), 0, 30],
['Concatenate', (None, 30, 128), 0],
['Dense', (None, 30, 10), 1290, 'tanh'],
['Dense', (None, 30, 1), 11, 'relu'],
['Activation', (None, 30, 1), 0],
['Dot', (None, 1, 64), 0],
['InputLayer', [(None, 64)], 0],
['LSTM', [(None, 64), (None, 64), (None, 64)], 33024, [(None, 1, 64), (None, 64), (None, 64)], 'tanh'],
['RepeatVector', (None, 30, 64), 0, 30],
['Concatenate', (None, 30, 128), 0],
['RepeatVector', (None, 30, 64), 0, 30],
['Concatenate', (None, 30, 128), 0],
['RepeatVector', (None, 30, 64), 0, 30],
['Concatenate', (None, 30, 128), 0],
['RepeatVector', (None, 30, 64), 0, 30],
['Concatenate', (None, 30, 128), 0],
['RepeatVector', (None, 30, 64), 0, 30],
['Concatenate', (None, 30, 128), 0],
['RepeatVector', (None, 30, 64), 0, 30],
['Concatenate', (None, 30, 128), 0],
['RepeatVector', (None, 30, 64), 0, 30],
['Concatenate', (None, 30, 128), 0],
['RepeatVector', (None, 30, 64), 0, 30],
['Concatenate', (None, 30, 128), 0],
['RepeatVector', (None, 30, 64), 0, 30],
['Concatenate', (None, 30, 128), 0],
['Dense', (None, 11), 715, 'softmax']]

AssertionError                            Traceback (most recent call last)
<ipython-input-...> in <module>
     34
     35
---> 36 modelf_test(modelf)

<ipython-input-...> in modelf_test(target)
     31     assert len(model.outputs) == 10, f"Wrong output shape. Expected 10 != {len(model.outputs)}"
     32
---> 33     comparator(summary(model), expected_summary)
     34
     35

~/work/W3A1/test_utils.py in comparator(learner, instructor)
     16 def comparator(learner, instructor):
     17     if len(learner) != len(instructor):
---> 18         raise AssertionError("Error in test. The lists contain a different number of elements")
     19     for index, a in enumerate(instructor):
     20         b = learner[index]

AssertionError: Error in test. The lists contain a different number of elements

I hope you sorted this out, but in case you haven't:

Check all of your previously defined functions carefully,
and make sure you double-check the provided functions and use them where directed.

I had the exact same issue.

I have carefully checked all the functions and double-checked the provided functions for an hour, yet I still got this exact error, except that I got 10 extra ['Dense', (None, 11), 715, 'softmax'] entries at the end.

My issue was related to the one_step_attention() function: I was not using the provided layer objects defined directly above it.
My symptom was the same as in your original post, with RepeatVector and Concatenate repeated several times. A sketch of the intended pattern is below.
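For reference, here is a minimal sketch of that pattern, assuming the shared layer objects (repeator, concatenator, densor1, densor2, activator, dotor) that the notebook defines once, just above one_step_attention(). Sizes and names here are illustrative, not the graded solution. The key point is that these layers are created once at module level, so the very same objects (and weights) are reused at every time step; creating new RepeatVector/Concatenate layers inside the function is what produces the duplicated rows in summary():

```python
import tensorflow as tf
from tensorflow.keras.layers import (Activation, Concatenate, Dense, Dot,
                                     RepeatVector)

Tx = 30  # encoder sequence length, as in the assignment

# Shared layers: created ONCE, outside one_step_attention(), so Keras reuses
# the same layer objects at every decoder time step.
repeator = RepeatVector(Tx)
concatenator = Concatenate(axis=-1)
densor1 = Dense(10, activation="tanh")
densor2 = Dense(1, activation="relu")
# The notebook supplies a custom softmax over the time axis (axis=1);
# tf.nn.softmax with axis=1 plays the same role in this sketch.
activator = Activation(lambda x: tf.nn.softmax(x, axis=1), name="attention_weights")
dotor = Dot(axes=1)

def one_step_attention(a, s_prev):
    """One attention step: a is (m, Tx, 2*n_a), s_prev is (m, n_s)."""
    s_prev = repeator(s_prev)           # -> (m, Tx, n_s)
    concat = concatenator([a, s_prev])  # -> (m, Tx, 2*n_a + n_s)
    e = densor1(concat)                 # -> (m, Tx, 10)
    energies = densor2(e)               # -> (m, Tx, 1)
    alphas = activator(energies)        # attention weights over the Tx axis
    context = dotor([alphas, a])        # -> (m, 1, 2*n_a)
    return context
```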


Were you able to sort this out?
I am having the same issue.


Did you sort it out? I'm having the same issue too.

Hi, yes. I think I was re-creating the Dense layer manually inside the loop instead of using the "output_layer" defined earlier, in step 2C. Roughly, the difference looks like the sketch below.
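A hedged illustration of that fix (the vocabulary size and the output_layer name follow the notebook; decode_step is a made-up wrapper just for this sketch):

```python
from tensorflow.keras.layers import Dense

machine_vocab_size = 11  # output vocabulary size from the assignment

# Defined ONCE, before modelf(), and reused at every time step
output_layer = Dense(machine_vocab_size, activation="softmax")

def decode_step(s):
    # Wrong: Dense(machine_vocab_size, activation="softmax")(s) here would
    # create a brand-new layer on every call, adding Ty extra Dense rows
    # to model.summary().
    # Right: reuse the single shared layer.
    return output_layer(s)
```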

Thanks. I found the issue was caused by a mistake I made in the previous exercise's one_step_attention() function, specifically in the 'energies' variable. Even though the tests passed for exercise 1, that mistake carried over into exercise 2.

Hi all,

My final Dense layer is repeated 10 times. Does anyone know why this might be happening, please?

Welcome to the community!

Please see the picture in that assignment.

There are multiple output units corresponding to the number of output characters, i.e., time steps. In our assignment this is T_y, and it is set to 10 to produce "YYYY-MM-DD", 10 characters.
Each unit consists of multiple layers: attention, post-attention LSTM, softmax, and so on.

If you look at modelf(), there is a loop, (for t in range(Ty)), which creates a set of one_step_attention, post-activation LSTM, and Dense-with-softmax layers for each time step. Since T_y = 10, you will see 10 Dense layers in model.summary().
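Roughly, that structure looks like the following sketch (illustrative sizes and names, reusing one_step_attention() from earlier in the thread; not the graded solution):

```python
from tensorflow.keras.layers import Bidirectional, Dense, Input, LSTM
from tensorflow.keras.models import Model

Tx, Ty = 30, 10            # input / output sequence lengths
n_a, n_s = 32, 64          # encoder Bi-LSTM and post-attention LSTM sizes
human_vocab_size = 37
machine_vocab_size = 11

# Shared decoder layers, created once so the same weights serve all Ty steps
post_activation_LSTM_cell = LSTM(n_s, return_state=True)
output_layer = Dense(machine_vocab_size, activation="softmax")

X = Input(shape=(Tx, human_vocab_size))
s0 = Input(shape=(n_s,), name="s0")
c0 = Input(shape=(n_s,), name="c0")
s, c = s0, c0

# Encoder: Bi-LSTM over the whole input sequence -> (m, Tx, 2*n_a)
a = Bidirectional(LSTM(n_a, return_sequences=True))(X)

outputs = []
for t in range(Ty):                     # one decoder unit per output character
    context = one_step_attention(a, s)  # attend over all Tx encoder states
    s, _, c = post_activation_LSTM_cell(context, initial_state=[s, c])
    out = output_layer(s)               # shared softmax Dense, reused each step
    outputs.append(out)                 # collect ALL Ty outputs

model = Model(inputs=[X, s0, c0], outputs=outputs)
```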

Hi, thanks for the reply. Perhaps I should have been clearer: my problem is that they are all stacked at the end, as in the summary in my original post.

@anon57530071 I have solved it now. I was calling Keras layers directly, but when I replaced those calls with the provided functions (concatenator, etc.) everything worked fine.

Great! Thank you for letting us know.

I was getting 10 != 1.
It turns out that the model's "outputs" argument should be the outputs list, not out. I thought it should be out, since out is defined as the result of the output layer, but you have to provide ALL of the out tensors that you appended to outputs.
Hope this helps someone :smiley:
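In code, continuing the sketch from earlier in the thread (X, s0, c0, outputs, and out are the tensors built in the modelf() loop above; names are illustrative):

```python
from tensorflow.keras.models import Model

# Right: pass the full list of Ty tensors collected in the loop,
# so len(model.outputs) == 10 and the test's output check passes.
model = Model(inputs=[X, s0, c0], outputs=outputs)

# Wrong: passing only the last step's tensor gives a single output,
# which triggers "Wrong output shape. Expected 10 != 1".
# model = Model(inputs=[X, s0, c0], outputs=out)
```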