Question C9: Unable to run model inference

Hi,

I get the following error even though I believe I have entered all the correct values:

TypeError: add got incompatible shapes for broadcasting: (1, 10000, 512), (1, 4096, 512).

If I replace padded_length with 4096, inference runs successfully, so the error seems related to the maximum token length the model was trained on. However, my padded_length is set to 10000.
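
Here is a minimal sketch of what I think is happening, assuming the failing add is the positional-encoding addition inside the model (the variable names and shapes below are illustrative, not the actual course code):

```python
import jax.numpy as jnp

d_model = 512
trained_max_len = 4096   # sequence length the model's positional encodings cover
padded_length = 10000    # the length I am padding my input to

embeddings = jnp.zeros((1, padded_length, d_model))      # shape (1, 10000, 512)
pos_encoding = jnp.zeros((1, trained_max_len, d_model))  # shape (1, 4096, 512)

# Raises: TypeError: add got incompatible shapes for broadcasting:
# (1, 10000, 512), (1, 4096, 512).
out = embeddings + pos_encoding
```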

How do I fix the issue?

Hi @danieldacruz7,

Your padded_length and padded values are incorrect. The model expects inputs padded to the sequence length it was trained on (4096), so setting padded_length to 10000 produces an input it cannot handle.
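
A minimal sketch of the fix, assuming the trained maximum length is 4096 (pad_tokens and pad_id are illustrative names, not the assignment's code):

```python
trained_max_len = 4096  # maximum sequence length the model was trained on

def pad_tokens(tokens, pad_id=0, max_len=trained_max_len):
    """Pad (or truncate) a token list to the length the model expects."""
    tokens = tokens[:max_len]  # never exceed the trained length
    return tokens + [pad_id] * (max_len - len(tokens))

padded = pad_tokens([12, 57, 3])
padded_length = len(padded)  # 4096, matching the model's positional encodings
```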

Best,
Mubsi