The Encoder class was throwing an error when self.pos_encoding was invoked in ‘call’. I had to use positional_encoding(…) directly inside ‘call’ to make it work. Please check whether there is a genuine error.
Please share a screenshot of the error you are encountering, without including any part of the graded cell code.
Regards
DP
You should not have to do that, of course. Note that they gave you the code to invoke positional_encoding in the __init__ method to create self.pos_encoding. Please read over the code in __init__ to see what is done there. One thing to keep in mind is the types of the various objects that are created there: self.pos_encoding is a tensor, but self.embedding is a function, right? You must invoke it with an input.
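To make that type distinction concrete, here is a minimal, self-contained sketch. It is not the graded assignment code; the class name, sizes, and helper are assumed purely for illustration. The embedding is a Keras layer you must call on the input, while the positional encoding is a plain tensor built once in __init__ and then only sliced and added in call.

```python
import numpy as np
import tensorflow as tf

def positional_encoding(positions, d_model):
    # Standard sinusoidal positional encoding, shape (1, positions, d_model).
    pos = np.arange(positions)[:, np.newaxis]            # (positions, 1)
    i = np.arange(d_model)[np.newaxis, :]                # (1, d_model)
    angle_rates = 1 / np.power(10000, (2 * (i // 2)) / np.float32(d_model))
    angle_rads = pos * angle_rates                       # (positions, d_model)
    angle_rads[:, 0::2] = np.sin(angle_rads[:, 0::2])    # sines on even indices
    angle_rads[:, 1::2] = np.cos(angle_rads[:, 1::2])    # cosines on odd indices
    return tf.cast(angle_rads[np.newaxis, ...], dtype=tf.float32)

class ToyEncoder(tf.keras.layers.Layer):
    def __init__(self, vocab_size, d_model, maximum_position_encoding):
        super().__init__()
        self.d_model = d_model
        # A layer: must be *called* on the input ids inside call().
        self.embedding = tf.keras.layers.Embedding(vocab_size, d_model)
        # A plain tensor: computed once here, only indexed/added inside call().
        self.pos_encoding = positional_encoding(maximum_position_encoding, d_model)

    def call(self, x):
        seq_len = tf.shape(x)[1]
        x = self.embedding(x)                             # invoke the layer
        x *= tf.math.sqrt(tf.cast(self.d_model, tf.float32))
        x += self.pos_encoding[:, :seq_len, :]            # slice the tensor
        return x
```

Note that in this sketch self.pos_encoding is never called like a function; it is only indexed and added to the embedded input.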
But if the above suggestions don’t help, then the next step is what Deepti just mentioned: please show us the error trace you are getting. Note that we do not have the magic superpower to just look at your notebook. Or if your question is “are the tests broken”, the answer is “they work fine for me”.
These courses have been in operation for several years and have been taken by literally thousands of students before you, so they are pretty well debugged by this point. It’s not impossible that you’ve discovered some new subtle bug that no-one before you has tripped over, but the probability of that is pretty low. So your first assumption when you hit a problem like this should be “I need to debug my code a bit more”.
Here is the error when self.pos_encoding was invoked in ‘call’. I was able to work around it by directly using positional_encoding(…) inside ‘call’.
I have added the error message above that was flagged when self.pos_encoding was invoked in ‘call’. For now, I directly invoked positional_encoding(…) inside ‘call’ to make it work.
Are you sure you didn’t modify code that was outside the “YOUR CODE HERE” segments?
If you’ve gotten things to work and pass the tests, maybe this isn’t worth spending more mental energy on, but I did not have to make the change that you describe in order to get this to work.
So one strategy is just to cruise ahead and work through the rest of the assignment and hope that there are no other ill effects further down the line from your different implementation strategy.
We need to see the complete error, not just the header or footer of the error description. Also, as the other mentor mentioned, editing any part outside the ### START CODE HERE ### and ### END CODE HERE ### markers might make your exercise’s unit test pass, but it can cause failures in test cells further down or in your submission grade.
Regards
DP
@paulinpaloalto, @Deepti_Prasad I did not modify any code outside “START CODE HERE” and “END CODE HERE”.
Yes, I was able to complete the entire programming assignment yesterday with the modified strategy of directly invoking positional_encoding(…) inside ‘call’. I thought I would post this error for the benefit of others who may encounter the same issue.
Happy to see such a helpful community. I gained good insights into deep learning by completing this 5-course series - link
That should not be required. Somewhere in your code, you’re making two offsetting mistakes.
Congratulations on completing all 5 courses! It’s great to hear that you found them useful. There is quite a lot of interesting material there and Prof Ng does a really excellent job of presenting it all (IMO anyway).
I had the same error and Gireesh’s solution worked for me. I think it has something to do with the value of maximum_position_encoding in the init function, but I don’t know for sure.
Maybe it’s worth taking a look at your code with the hope that we can get a real understanding of the nature of the issue, since it seems like people hit it every few months. We can’t do that on a public thread, but please check your DMs for a message from me.
After looking at the code, I think the issue is just that we need to be careful when using the self.pos_encoding tensor. It turns out you need to use selective indexing on the second axis to get a correctly sized tensor. If you just use it “as is”, it is too large.
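As a hedged illustration of that slicing (the shapes below are made up for the example and are not taken from the notebook): the positional encoding tensor built from maximum_position_encoding spans the maximum length, so it has to be sliced on its second axis down to the actual sequence length before being added.

```python
import tensorflow as tf

# Illustrative shapes only; values are assumed for the example.
batch, seq_len, d_model = 2, 5, 16
maximum_position_encoding = 50

x = tf.random.uniform((batch, seq_len, d_model))               # embedded input
pos_encoding = tf.random.uniform((1, maximum_position_encoding, d_model))

# Wrong: shapes (2, 5, 16) and (1, 50, 16) do not broadcast.
# x += pos_encoding

# Right: slice the second (position) axis to the actual sequence length.
x += pos_encoding[:, :seq_len, :]                              # (1, 5, 16) broadcasts
print(x.shape)                                                 # (2, 5, 16)
```

Using the full tensor "as is" only happens to work when the input sequence length equals maximum_position_encoding, which is why the mismatch can go unnoticed until the tests use a shorter sequence.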