Hello guys,
Could you please take a look at the error message in the screenshot below and tell me what might be the problem with my code? I guess the problem may come from the way I applied the Dense layer to the hidden state output of the post-attention LSTM. This is what I did:
Dense(units=n_a, activation='softmax')(s)
Thank you,
Hi,
It looks like you are concatenating two layers in your model that have different shapes, which is why you are seeing this error. You might want to check how your layers are organized, or, if you could post your whole network architecture, people might be better able to understand and diagnose the problem. Good luck!
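As an illustration of that failure mode, here is a minimal sketch (not the assignment code; the shapes are made up): Concatenate refuses inputs whose non-concatenation axes disagree, and repeating the state along the time axis first makes the shapes compatible.

```python
from tensorflow.keras.layers import Input, Concatenate, RepeatVector

# Hypothetical shapes, just for illustration.
a = Input(shape=(30, 64))    # e.g. encoder activations over 30 time steps
s = Input(shape=(1, 128))    # a state whose time axis does not match

try:
    Concatenate(axis=-1)([a, s])   # time axes (30 vs 1) disagree
except ValueError as err:
    # "A Concatenate layer requires inputs with matching shapes
    #  except for the concatenation axis..."
    print(err)

# Repeating the state along the time axis first resolves the mismatch:
s_vec = Input(shape=(128,))
s_rep = RepeatVector(30)(s_vec)           # (None, 30, 128)
ok = Concatenate(axis=-1)([a, s_rep])     # (None, 30, 192)
```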
Hi @theskywalker15,
Thanks for your reply.
In the modelf() function, we don't explicitly call Concatenate. We use concatenator in the first exercise to concatenate a and s_prev, and I got the first exercise right.
In the second exercise, we invoke the function created in the first exercise to perform one step of the attention mechanism and compute the context vector. I did it this way: one_step_attention(a, s) (which implicitly uses the concatenator at one of its stages), and I think I passed the correct arguments. I'm still stuck.
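For other readers hitting the same wall, here is a generic sketch of a one-step attention block of this shape (an illustration only, not graded solution code; the dimensions Tx, n_a, n_s and the small Dense layers are assumptions for the example):

```python
from tensorflow.keras.layers import (RepeatVector, Concatenate,
                                     Dense, Softmax, Dot)

Tx, n_a, n_s = 30, 32, 64   # assumed dimensions, for illustration only

# Shared layers, defined once so the same weights are reused at every
# decoder time step.
repeator = RepeatVector(Tx)
concatenator = Concatenate(axis=-1)
densor1 = Dense(10, activation="tanh")
densor2 = Dense(1, activation="relu")
activator = Softmax(axis=1)   # softmax over the Tx time axis, not the last axis
dotor = Dot(axes=1)

def one_step_attention_sketch(a, s_prev):
    """a: (None, Tx, 2*n_a) encoder activations; s_prev: (None, n_s)."""
    s_prev = repeator(s_prev)             # (None, Tx, n_s)
    concat = concatenator([a, s_prev])    # (None, Tx, 2*n_a + n_s)
    e = densor1(concat)                   # (None, Tx, 10)
    energies = densor2(e)                 # (None, Tx, 1)
    alphas = activator(energies)          # attention weights, sum to 1 over Tx
    context = dotor([alphas, a])          # (None, 1, 2*n_a)
    return context
```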
Also, it's against the rules to post personal code here in the forum.
I finally got it!
It turns out I was redefining the Dense layer with an incorrect value for the units argument (the right one being len(machine_vocab)), instead of using the predefined global variable output_layer.
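For anyone who lands here with the same bug, a minimal sketch of the fix (machine_vocab and n_s are stand-ins here; the real ones come from the notebook):

```python
from tensorflow.keras.layers import Input, Dense

# Stand-ins, assumed for illustration; the real values come from the notebook.
machine_vocab = {str(i): i for i in range(10)}  # hypothetical vocabulary
n_s = 64                                        # post-attention LSTM state size

# Define the output layer once, at global scope, so every decoder time step
# shares the same weights:
output_layer = Dense(len(machine_vocab), activation="softmax")

# Inside modelf(), apply the shared layer to the post-attention hidden state s,
# instead of creating a fresh Dense(units=n_a, ...) at each step:
s = Input(shape=(n_s,))
out = output_layer(s)   # shape: (None, len(machine_vocab))
```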