Help! modelf in DLS Course 5: Neural_machine_translation_with_attention_v4a

AssertionError: Error in test. The lists contain a different number of elements

The 2nd layer of the expected summary is "InputLayer", but in my result that layer appears after the "Dot" layer, and "RepeatVector", "Concatenate", etc. don't appear at all.
Of course, I do call "context = one_step_attention(a, s)".

Why? How can I solve this problem?

<The result of "print(summary(model))">
[['InputLayer', [(None, 30, 37)], 0]
, ['Bidirectional', (None, 30, 64), 17920]
, ['Dense', (None, 30, 1), 65, 'relu']
, ['Activation', (None, 30, 1), 0]
, ['Dot', (None, 1, 64), 0]
, ['InputLayer', [(None, 64)], 0]
, ['InputLayer', [(None, 64)], 0]
, ['LSTM', [(None, 64), (None, 64), (None, 64)], 33024, [(None, 1, 64), (None, 64), (None, 64)], 'tanh']
, ['Dense', (None, 11), 715, 'softmax']]

expected_summary = [['InputLayer', [(None, 30, 37)], 0], ['InputLayer', [(None, 64)], 0], ['Bidirectional', (None, 30, 64), 17920], ['RepeatVector', (None, 30, 64), 0, 30], ['Concatenate', (None, 30, 128), 0], ['Dense', (None, 30, 10), 1290, 'tanh'], ['Dense', (None, 30, 1), 11, 'relu'], ['Activation', (None, 30, 1), 0], ['Dot', (None, 1, 64), 0], ['InputLayer', [(None, 64)], 0], ['LSTM',[(None, 64), (None, 64), (None, 64)], 33024,[(None, 1, 64), (None, 64), (None, 64)],'tanh'], ['Dense', (None, 11), 715, 'softmax']]

Hey @Noboru_Akiyama,
Your summary is indeed very different from the expected summary. Can you please DM me your code for the modelf function, so that I can take a look and figure out exactly where the issue resides?

As your output suggests, the issue most likely lies in the one_step_attention function, since all the layers involved in that function seem to be missing.

Cheers,
Elemento


Hey @Noboru_Akiyama,
Your modelf function is completely correct. The issue seems to lie in your implementation of the one_step_attention function. Can you please DM that as well?

Cheers,
Elemento


Hey @Noboru_Akiyama,
Please check the following line of code:

# Use densor1 to propagate concat through a small fully-connected neural network to compute the "intermediate energies" variable e. (≈1 lines)
e = densor1(a)

Can you figure out the issue after reading the comment? Also, take another look at the attention mechanism to compare "what you have to implement" with "what you have implemented". Let me know if this hint helps.

Cheers,
Elemento

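[Editor's note] The hint above is that the "intermediate energies" should be computed from the concatenation of the encoder states a and the repeated previous decoder state s_prev, not from a alone. Since the graded Keras code can't be posted, here is a minimal NumPy sketch of what one attention step computes; the shapes mirror the notebook (Tx=30, 64 Bi-LSTM features, n_s=64), and the random weight matrices are made-up stand-ins for densor1/densor2:

```python
import numpy as np

# Assumed shapes, mirroring the notebook: Tx=30 encoder steps,
# 64 Bi-LSTM output features, decoder state size n_s=64.
Tx, n_feat, n_s = 30, 64, 64
rng = np.random.default_rng(0)

a = rng.standard_normal((Tx, n_feat))   # encoder hidden states, one per input step
s_prev = rng.standard_normal(n_s)       # previous decoder hidden state

# 1. Repeat s_prev Tx times and concatenate with a
#    (the RepeatVector and Concatenate layers).
s_rep = np.tile(s_prev, (Tx, 1))                    # (Tx, n_s)
concat = np.concatenate([a, s_rep], axis=-1)        # (Tx, n_feat + n_s)

# 2. Small dense network applied to `concat` -- NOT to `a` alone --
#    to get the intermediate energies e (densor1 with tanh, densor2 with relu).
W1 = rng.standard_normal((n_feat + n_s, 10)) * 0.1  # stand-in for densor1 weights
W2 = rng.standard_normal((10, 1)) * 0.1             # stand-in for densor2 weights
e = np.maximum(np.tanh(concat @ W1) @ W2, 0.0)      # (Tx, 1)

# 3. Softmax over the Tx axis gives the attention weights alphas.
alphas = np.exp(e) / np.exp(e).sum(axis=0)          # (Tx, 1), sums to 1

# 4. The context is the attention-weighted sum of the encoder states
#    (the Dot layer in the model summary).
context = (alphas * a).sum(axis=0)                  # (n_feat,)
```

Feeding `a` instead of `concat` into the dense network is exactly what makes RepeatVector and Concatenate disappear from the model summary.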

Hi @Elemento

Thank you very, very much.
I spent dozens of hours on this problem. Thanks to your advice, I was finally able to solve it.

The arguments to the densor1 and densor2 functions were wrong.

Thank you again.

Hey @Noboru_Akiyama,
I am glad I could help.

Cheers,
Elemento
