When building the model, nine extra RepeatVector layers appear. They do not seem to disturb the model, but the grader claims the model summary is incorrect.
My model summary looks as follows:
[['InputLayer', [(None, 30, 37)], 0],
 ['InputLayer', [(None, 64)], 0],
 ['Bidirectional', (None, 30, 64), 17920],
 ['RepeatVector', (None, 30, 64), 0, 30],
 ['Concatenate', (None, 30, 128), 0],
 ['Dense', (None, 30, 10), 1290, 'tanh'],
 ['Dense', (None, 30, 1), 11, 'relu'],
 ['Activation', (None, 30, 1), 0],
 ['Dot', (None, 1, 64), 0],
 ['InputLayer', [(None, 64)], 0],
 ['LSTM', [(None, 64), (None, 64), (None, 64)], 33024, [(None, 1, 64), (None, 64), (None, 64)], 'tanh'],
 ['RepeatVector', (None, 30, 64), 0, 30],
 ['RepeatVector', (None, 30, 64), 0, 30],
 ['RepeatVector', (None, 30, 64), 0, 30],
 ['RepeatVector', (None, 30, 64), 0, 30],
 ['RepeatVector', (None, 30, 64), 0, 30],
 ['RepeatVector', (None, 30, 64), 0, 30],
 ['RepeatVector', (None, 30, 64), 0, 30],
 ['RepeatVector', (None, 30, 64), 0, 30],
 ['RepeatVector', (None, 30, 64), 0, 30],
 ['Dense', (None, 11), 715, 'softmax']]
When I ignore the summary difference and continue, the model trains and fits as expected. I have no idea where these RepeatVector layers are created. Can you give me a hint?
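For context, here is a minimal sketch (not my actual assignment code, just an illustration of my suspicion) of how duplicate RepeatVector entries can show up in a Keras functional model: if the layer is constructed inside the per-timestep helper, each call creates a brand-new layer object, whereas a layer defined once and reused appears only once in the summary. All names (`build_model`, `Tx`, `Ty`, etc.) are made up for this sketch.

```python
import tensorflow as tf
from tensorflow.keras.layers import Input, RepeatVector, Concatenate
from tensorflow.keras.models import Model

Tx, n_s, Ty = 30, 64, 10

# Shared layer: constructed ONCE, then called Ty times below.
repeator = RepeatVector(Tx)

def build_model(shared):
    """Toy model that repeats a state vector once per output timestep.

    shared=True  -> reuse the single module-level RepeatVector (1 summary entry)
    shared=False -> construct a new RepeatVector per timestep (Ty summary entries)
    """
    a = Input(shape=(Tx, n_s))   # encoder activations
    s = Input(shape=(n_s,))      # decoder state
    outputs = []
    for _ in range(Ty):
        rep = repeator if shared else RepeatVector(Tx)  # new object each loop!
        outputs.append(Concatenate()([a, rep(s)]))
    return Model(inputs=[a, s], outputs=outputs)

n_shared = sum(isinstance(l, RepeatVector) for l in build_model(True).layers)
n_local = sum(isinstance(l, RepeatVector) for l in build_model(False).layers)
print(n_shared, n_local)  # shared: 1 layer; per-loop construction: Ty layers
```

If this pattern is the cause, the extra entries in my summary would mean a RepeatVector is being instantiated inside the loop over output timesteps instead of being defined once as a shared layer.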
Regards,
Andreas