There seems to be a floating-point precision issue in grading the test_model function. Here are my results. Note that if these values were rounded to the first decimal place (and in some cases to the second), they would all be correct.
Your log perplexity does not match with expected value.
Check if you are getting rid of the padding or checking if the target equals 0.
If your result is an array instead of a float, check if you are using numpy to perform the sums.
Expected value near: 1.7646706.
Got 1.7582136392593384.
Your perplexity does not match with expected.
Expected value near: 5.839648246765137.
Got 5.802063465118408.
Your log perplexity does not match with expected value.
Check if you are getting rid of the padding or checking if the target equals 0.
If your result is an array instead of a float, check if you are using numpy to perform the sums.
Expected value near: 1.5336857.
Got 1.520787000656128.
Your perplexity does not match with expected.
Expected value near: 4.635229110717773.
Got 4.575824737548828.
Your log perplexity does not match with expected value.
Check if you are getting rid of the padding or checking if the target equals 0.
If your result is an array instead of a float, check if you are using numpy to perform the sums.
Expected value near: 1.5870862.
Got 1.5890307426452637.
Your perplexity does not match with expected.
Expected value near: 4.889481067657471.
Got 4.898998260498047.
The result output from your test_model function is compared with the expected value with an absolute tolerance of 1e-5. Your code should be fixed. You can see details about the test in w2_unittest.py.
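To make the hints above concrete, here is a minimal NumPy sketch of a masked log-perplexity calculation. It assumes preds holds log-probabilities of shape (batch, seq_len, vocab_size) and target holds integer token ids with 0 as the padding id; the function name log_perplexity and the details of the masking are illustrative assumptions, not the assignment's actual test_model.

```python
import numpy as np

def log_perplexity(preds, target):
    """Hypothetical sketch, not the course's actual test_model.

    preds:  log-probabilities, shape (batch, seq_len, vocab_size)
    target: integer token ids,  shape (batch, seq_len), where 0 is padding
    """
    # Log-probability the model assigned to each target token.
    log_p = np.take_along_axis(preds, target[..., None], axis=-1).squeeze(-1)

    # Mask so that padding positions (target == 0) do not count.
    non_pad = 1.0 - np.equal(target, 0).astype(np.float32)

    # np.sum reduces everything to one scalar; leaving out the reduction
    # (or summing per row) produces the "result is an array" symptom.
    mean_log_p = np.sum(log_p * non_pad) / np.sum(non_pad)
    return -mean_log_p
```

Perplexity is then np.exp of this value (e.g. exp(1.7646706) ≈ 5.8396, matching the expected pair above). The grader then does the equivalent of np.isclose(got, expected, atol=1e-5), and a gap like 1.7646706 − 1.7582136 ≈ 6.5e-3 is several orders of magnitude above that tolerance (and above ordinary float32 noise), so a mismatch of this size usually points at the masking or the averaging rather than at precision, which is why agreement only to one or two decimals is not enough to pass.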
It seems I am now hitting a similar issue… although maybe it is an issue with the model itself?
Your log perplexity does not match with expected value.
Check if you are getting rid of the padding or checking if the target equals 0.
If your result is an array instead of a float, check if you are using numpy to perform the sums.
Expected value near: 1.7646706.
Got 1.7582136392593384.
Your perplexity does not match with expected.
Expected value near: 5.839648246765137.
Got 5.802063465118408.
Your log perplexity does not match with expected value.
Check if you are getting rid of the padding or checking if the target equals 0.
If your result is an array instead of a float, check if you are using numpy to perform the sums.
Expected value near: 1.5336857.
Got 1.520787000656128.
Your perplexity does not match with expected.
Expected value near: 4.635229110717773.
Got 4.575824737548828.
Your log perplexity does not match with expected value.
Check if you are getting rid of the padding or checking if the target equals 0.
If your result is an array instead of a float, check if you are using numpy to perform the sums.
Expected value near: 1.5870862.
Got 1.5890307426452637.
Your perplexity does not match with expected.
Expected value near: 4.889481067657471.
Got 4.898998260498047.
3 Tests passed
6 Tests failed
I'm struggling to spot the error; any help would be much appreciated.
I don’t have access to the latest version of the course. Please find an NLP mentor to help you out.