I am working on W3A1. All the tests passed, but when I run
model.fit([Xoh, s0, c0], outputs, epochs=1, batch_size=100)
I get the following error. I am not sure what is going wrong. How can I debug it?
ValueError: The two structures don't have the same sequence length. Input structure has length 10, while shallow structure has length 1.
And my model summary is:
Model: "functional_5"
__________________________________________________________________________________________________
Layer (type) Output Shape Param # Connected to
==================================================================================================
input_3 (InputLayer) [(None, 30, 37)] 0
__________________________________________________________________________________________________
s0 (InputLayer) [(None, 64)] 0
__________________________________________________________________________________________________
bidirectional_2 (Bidirectional) (None, 30, 64) 17920 input_3[0][0]
__________________________________________________________________________________________________
repeat_vector (RepeatVector) (None, 30, 64) 0 s0[0][0]
lstm[20][1]
lstm[21][1]
lstm[22][1]
lstm[23][1]
lstm[24][1]
lstm[25][1]
lstm[26][1]
lstm[27][1]
lstm[28][1]
__________________________________________________________________________________________________
concatenate (Concatenate) (None, 30, 128) 0 bidirectional_2[0][0]
repeat_vector[20][0]
bidirectional_2[0][0]
repeat_vector[21][0]
bidirectional_2[0][0]
repeat_vector[22][0]
bidirectional_2[0][0]
repeat_vector[23][0]
bidirectional_2[0][0]
repeat_vector[24][0]
bidirectional_2[0][0]
repeat_vector[25][0]
bidirectional_2[0][0]
repeat_vector[26][0]
bidirectional_2[0][0]
repeat_vector[27][0]
bidirectional_2[0][0]
repeat_vector[28][0]
bidirectional_2[0][0]
repeat_vector[29][0]
__________________________________________________________________________________________________
dense (Dense) (None, 30, 10) 1290 concatenate[20][0]
concatenate[21][0]
concatenate[22][0]
concatenate[23][0]
concatenate[24][0]
concatenate[25][0]
concatenate[26][0]
concatenate[27][0]
concatenate[28][0]
concatenate[29][0]
__________________________________________________________________________________________________
dense_1 (Dense) (None, 30, 1) 11 dense[20][0]
dense[21][0]
dense[22][0]
dense[23][0]
dense[24][0]
dense[25][0]
dense[26][0]
dense[27][0]
dense[28][0]
dense[29][0]
__________________________________________________________________________________________________
attention_weights (Activation) (None, 30, 1) 0 dense_1[20][0]
dense_1[21][0]
dense_1[22][0]
dense_1[23][0]
dense_1[24][0]
dense_1[25][0]
dense_1[26][0]
dense_1[27][0]
dense_1[28][0]
dense_1[29][0]
__________________________________________________________________________________________________
dot (Dot) (None, 1, 64) 0 attention_weights[20][0]
bidirectional_2[0][0]
attention_weights[21][0]
bidirectional_2[0][0]
attention_weights[22][0]
bidirectional_2[0][0]
attention_weights[23][0]
bidirectional_2[0][0]
attention_weights[24][0]
bidirectional_2[0][0]
attention_weights[25][0]
bidirectional_2[0][0]
attention_weights[26][0]
bidirectional_2[0][0]
attention_weights[27][0]
bidirectional_2[0][0]
attention_weights[28][0]
bidirectional_2[0][0]
attention_weights[29][0]
bidirectional_2[0][0]
__________________________________________________________________________________________________
c0 (InputLayer) [(None, 64)] 0
__________________________________________________________________________________________________
lstm (LSTM) [(None, 64), (None, 33024 dot[20][0]
s0[0][0]
c0[0][0]
dot[21][0]
lstm[20][1]
lstm[20][2]
dot[22][0]
lstm[21][1]
lstm[21][2]
dot[23][0]
lstm[22][1]
lstm[22][2]
dot[24][0]
lstm[23][1]
lstm[23][2]
dot[25][0]
lstm[24][1]
lstm[24][2]
dot[26][0]
lstm[25][1]
lstm[25][2]
dot[27][0]
lstm[26][1]
lstm[26][2]
dot[28][0]
lstm[27][1]
lstm[27][2]
dot[29][0]
lstm[28][1]
lstm[28][2]
__________________________________________________________________________________________________
dense_2 (Dense) (None, 11) 715 lstm[20][1]
lstm[21][1]
lstm[22][1]
lstm[23][1]
lstm[24][1]
lstm[25][1]
lstm[26][1]
lstm[27][1]
lstm[28][1]
lstm[29][1]
==================================================================================================
Total params: 52,960
Trainable params: 52,960
Non-trainable params: 0
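The error text suggests a structure mismatch between what the model returns and what fit() received as targets: 10 target arrays versus 1 model output. As a first debugging step (a sketch only, using a hypothetical helper rather than the notebook's actual variables), comparing `len(model.outputs)` against `len(outputs)` in the notebook would confirm whether the model was built with a single output instead of the expected list of Ty = 10:

```python
# Hypothetical sanity check mirroring the ValueError: the model's output
# structure has length 1, while fit() received a list of 10 target arrays.
def check_output_structure(n_model_outputs, targets):
    """Compare the model's output count against the target list length.

    In the notebook, n_model_outputs would be len(model.outputs) and
    targets would be the `outputs` list passed to model.fit().
    """
    n_targets = len(targets)
    if n_model_outputs != n_targets:
        return (f"Mismatch: model returns {n_model_outputs} tensor(s) "
                f"but fit() received {n_targets} target arrays")
    return "OK"

# With the numbers from the error message (1 output vs 10 targets):
print(check_output_structure(1, [None] * 10))
```

If this reports a mismatch, the usual cause in this assignment is building the Model with a single tensor (e.g. the last `out` from the loop) instead of appending every step's output to a list and passing that list as `outputs`.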