NLP C2W3 EX10 Perplexity Tests

I have read all the previous topics opened on this exercise and couldn't find one similar to my case.
My solution gives the correct perplexity for the sample data:
Perplexity for first train sample: 2.8040
Perplexity for test sample: 3.9654
Expected Output

Perplexity for first train sample: 2.8040
Perplexity for test sample: 3.9654

So far so good. However, 2 of the 4 tests are failing:
Wrong perplexity value.
Expected: 6.137396479150367
Got: 10.321829396616625.
Wrong perplexity value.
Expected: 5.0931554910158665
Got: 9.382534044133097.
2 Tests passed
2 Tests failed
I also found this note:
Note: If your sentence is really long, there will be underflow when multiplying many fractions.

  • To handle longer sentences, modify your implementation to take the sum of the log of the probabilities.

Does this imply that there should be a special case in the probability calculation for longer sentences? The code was already provided, it used the cumulative product rather than the sum of the logs, and we only had to plug in a few lines.
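For reference, the note is describing the standard log-space trick, not a conditional branch. Here's a hedged sketch (not the assignment's exact code; the probabilities are made up) showing why a cumulative product underflows for long sentences while the sum of logs stays finite:

```python
import numpy as np

# Perplexity of an N-token sentence is PP = (prod p_i)^(-1/N).
# Equivalently, in log space: PP = exp(-(1/N) * sum(log p_i)).
probs = [0.1] * 400  # 400 tokens, each with (toy) probability 0.1

# Naive cumulative product underflows to exactly 0.0 in float64,
# since 0.1**400 = 1e-400 is below the smallest representable float:
product = np.prod(probs)
print(product)  # 0.0
# product ** (-1 / len(probs))  # 0.0 to a negative power -> inf/error

# Log-space version: average negative log-probability, then exponentiate.
log_pp = -np.mean(np.log(probs))  # (1/N) * sum(-log p_i)
pp = np.exp(log_pp)
print(pp)  # ~10.0, the true perplexity when every p_i = 0.1
```

With only the short sample sentences in the notebook the product form happens to work, which is why both approaches can pass the visible checks while only the log-space one is numerically safe in general.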

I am perplexed.

Thanks in advance.


Hi @smirkmkd,

Can you send me a screenshot of your graded calculate_perplexity function code by personal DM? Click on my name, then Message, to send the screenshot.

Regards
Deepti

Hi Deepti, I managed to find my error. It was in the previous exercise, estimate_probability.
n_plus1_gram = (''.join(previous_n_gram), word)
This was my line for n_plus1_gram. It worked fine for that exercise but failed the tests for the perplexity exercise. Fixing it made the tests pass.
I don't know if I am allowed to share the corrected line, but this topic can be closed now.
Thank you.
