I am getting this error; how do I solve it?
I used (Q1, enc_output, enc_output, padding_mask, return_attention_scores=True) as the inputs to the self.mha2 layer.
Regards, Utkarsh.
Thank you.
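For context, the second attention block in the decoder layer is cross-attention: the query comes from the decoder side and the key/value come from the encoder output. Below is a minimal smoke-test sketch with tf.keras.layers.MultiHeadAttention; the shapes and mask are made up for illustration and are not the notebook's values.

```python
import tensorflow as tf

# Made-up shapes for a quick smoke test; not the assignment's values.
batch, tgt_len, src_len, d_model, num_heads = 2, 5, 7, 16, 4

mha2 = tf.keras.layers.MultiHeadAttention(num_heads=num_heads, key_dim=d_model)

Q1 = tf.random.uniform((batch, tgt_len, d_model))          # output of the first (self-attention) block
enc_output = tf.random.uniform((batch, src_len, d_model))  # encoder output
padding_mask = tf.ones((batch, 1, src_len))                # 1 = attend, 0 = masked out

# Cross-attention: the query comes from the decoder (Q1), the key and value
# from the encoder output. Passing the mask by keyword avoids any ambiguity
# about positional argument order across TF/Keras versions.
attn_out, attn_scores = mha2(
    Q1, enc_output, enc_output,
    attention_mask=padding_mask,
    return_attention_scores=True,
)

print(attn_out.shape)     # (2, 5, 16)   -> (batch, tgt_len, d_model)
print(attn_scores.shape)  # (2, 4, 5, 7) -> (batch, num_heads, tgt_len, src_len)
```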
Two ideas:
Sir, I have personally sent you my code, please go through it…
^this literally made me laugh out loud. Like “bro incoming message, go through it, report back, pronto”
hey, how did you solve this error? I am getting the same error here.
Hello,
In UNQ_C6 (the layer implementation), I believe I've entered the correct code but I'm still getting the error below. It makes me think my inputs (query, key, and value) to self.mha1 might be wrong, but I don't see how. Any help or clues would be much appreciated!
Regards,
Bharti
In this assignment, these three are the same thing.
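In other words, the first block is self-attention over the decoder input, so the same tensor goes in as query, key, and value. A minimal sketch under that assumption, with made-up shapes and a standard lower-triangular look-ahead mask (not the notebook's code):

```python
import tensorflow as tf

batch, tgt_len, d_model, num_heads = 2, 5, 16, 4  # made-up shapes for illustration

mha1 = tf.keras.layers.MultiHeadAttention(num_heads=num_heads, key_dim=d_model)

x = tf.random.uniform((batch, tgt_len, d_model))  # decoder input (already embedded)

# Lower-triangular look-ahead mask: position t may attend to positions <= t.
# In Keras' MultiHeadAttention, 1 means "attend" and 0 means "masked out".
look_ahead_mask = tf.linalg.band_part(tf.ones((tgt_len, tgt_len)), -1, 0)[tf.newaxis, ...]

# Self-attention: query, key, and value are all the same tensor x.
attn1, attn_weights1 = mha1(
    x, x, x,
    attention_mask=look_ahead_mask,
    return_attention_scores=True,
)

print(attn1.shape)  # (2, 5, 16) -> (batch, tgt_len, d_model)
```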
Thanks for that tip, @saifkhanengr. That's what I've done in my code, but the problem persists. I'm sharing a screenshot of my code via DM; I'd be grateful for any advice.
Thank you for sending me your code. Your layernorm2 line is incorrect.
Hint: What name is used for the output of the first block after normalization?
Thanks a ton! That worked.
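For anyone hitting the same error: the hint points at the residual connection of the second block. The skip connection that feeds layernorm2 uses the normalized output of the first block (the Q1 mentioned above), not x and not the raw attention output. Below is a generic sketch of that wiring, using standalone layers and made-up shapes, with dropout, masks, and the feed-forward block omitted for brevity; it is the standard Transformer decoder-layer pattern, not the graded code.

```python
import tensorflow as tf

d_model, num_heads = 16, 4  # made-up sizes for illustration
mha1 = tf.keras.layers.MultiHeadAttention(num_heads=num_heads, key_dim=d_model)
mha2 = tf.keras.layers.MultiHeadAttention(num_heads=num_heads, key_dim=d_model)
layernorm1 = tf.keras.layers.LayerNormalization(epsilon=1e-6)
layernorm2 = tf.keras.layers.LayerNormalization(epsilon=1e-6)

x = tf.random.uniform((2, 5, d_model))           # decoder input
enc_output = tf.random.uniform((2, 7, d_model))  # encoder output

# Block 1: (masked) self-attention + residual + layer norm.
mult_attn_out1, _ = mha1(x, x, x, return_attention_scores=True)
Q1 = layernorm1(mult_attn_out1 + x)  # <- the output of the first block after normalization

# Block 2: cross-attention + residual + layer norm.
# The skip connection adds Q1, not x and not mult_attn_out1.
mult_attn_out2, _ = mha2(Q1, enc_output, enc_output, return_attention_scores=True)
out2 = layernorm2(mult_attn_out2 + Q1)

print(out2.shape)  # (2, 5, 16)
```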