NLP Course 4, Week 1, programming assignment: Strange behaviour in "next_symbol" function

I had posted a partial implementation of my next_symbol code here.

(Solution code removed, as posting it publicly is against the honour code of this community)

When I run w1_unittest.test_next_symbol(next_symbol, NMTAttn), the code never reaches “020”, and I don’t get any errors either; or at least the print statement isn’t producing any output. I can usually figure this sort of thing out with a debugger, but installing Trax locally is a nightmare and I don’t see a debug option in the Coursera notebook.

Any suggestions or ideas about what is going on?

Thanks in advance

I have not gotten past NLP Course 1, so I am not familiar with that code.

My theory is that something is wrong with your earlier computations and padded_length ends up being an outrageously large value. That would explain why it never prints “020”: the padding operation is allocating a huge amount of memory, which makes it take far too long.

Of course there’s an easy way to confirm or refute that theory: instead of just printing “010” at that step, why not also print padded_length, just out of curiosity? :nerd_face:
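
For example, a quick check along these lines (the names cur_output_tokens and padded_length are only illustrative; use whatever your implementation actually calls them):

```python
# Sanity check right before the padding step: a correct padded_length
# should be a small power of two, not a number in the millions.
print("token length:", len(cur_output_tokens))
print("padded_length:", padded_length)
```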

Hi,
I think you should double-check your padded_length and padded computations. I suggest using np (numpy) functions when computing padded_length.
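
In case it helps, here is a generic sketch of zero-padding a token list up to the next power of two with numpy. It is only meant to illustrate the general pattern and the numpy calls involved; the function name and variables are made up, and it is not the assignment's code.

```python
import numpy as np

def pad_to_next_power_of_two(tokens):
    # Illustrative only: zero-pad a list of token ids so that its
    # length becomes the next power of two.
    token_length = len(tokens)
    padded_length = 2 ** int(np.ceil(np.log2(token_length + 1)))
    padded = tokens + [0] * (padded_length - token_length)
    return padded, padded_length

# Example: pad_to_next_power_of_two([4, 7, 9]) returns ([4, 7, 9, 0], 4).
```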