Does anyone know what’s wrong with the X variable here? BTW, I set idx = j % len(examples).
I don’t think the value of idx is the issue here. Where does it say to directly call rnn_forward from model? In my code, I’m calling optimize there.
Also note that you filed this under DLS Resources, but it’s the Dinosaur Names assignment from Week 1 of DLS C5, right? I moved the thread for you.
Actually now that I think about it a bit harder, optimize calls rnn_forward with the same arguments, so maybe there is more of a problem here than just the call graph. Notice that the length of your Y value is less than the length of your X value the way you wrote it. I appended a newline to the Y value according to the instructions.
Also note that the line that “throws” is not referencing X, but Y, right?
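To make that length point concrete, here is a toy sketch; the char_to_ix mapping below is invented for illustration, and I’m assuming X is built by prepending None to the full index list, which is what the notebook intends as far as I can tell:

# Toy illustration of the length mismatch (made-up mapping, not the notebook's vocabulary)
char_to_ix = {'\n': 0, 'a': 1, 'b': 2, 'c': 3}
single_example_ix = [char_to_ix[c] for c in "abc"]   # [1, 2, 3]
ix_newline = char_to_ix['\n']                        # 0

X = [None] + single_example_ix                       # [None, 1, 2, 3] -> length 4
Y_as_written = single_example_ix[1:] + [ix_newline]  # [2, 3, 0]       -> length 3, one short
print(len(X), len(Y_as_written))                     # prints: 4 3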
The instructions say to use rnn_forward() here. The Y variable is set to Y = single_example_ix[1:] + [ix_newline] … I guess that’s not correct. It says to use the integer representation of ix_newline. Perhaps ix_newline = char_to_ix['\n'] is wrong?
Sorry, but you are reading too much into that statement about rnn_forward. It is making a point just about the format of the data. If you look at the logic of optimize, it calls rnn_forward of course. And here is the comment they give you in the model template code:
# Perform one optimization step: Forward-prop -> Backward-prop -> Clip -> Update parameters
# Choose a learning rate of 0.01
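To illustrate that flow, here is a rough sketch of the call structure only; the helper bodies are placeholders rather than the course implementations, and the exact signatures may differ a bit from your copy of the notebook:

# Rough sketch of the call structure only -- the helper bodies are stand-ins,
# not the real course code.
def rnn_forward(X, Y, a_prev, parameters):
    return 0.0, None                      # placeholder for (loss, cache)

def rnn_backward(X, Y, parameters, cache):
    return {}, {len(X) - 1: None}         # placeholder for (gradients, a)

def clip(gradients, maxValue):
    return gradients                      # placeholder

def update_parameters(parameters, gradients, learning_rate):
    return parameters                     # placeholder

def optimize(X, Y, a_prev, parameters, learning_rate=0.01):
    # Forward-prop -> Backward-prop -> Clip -> Update parameters
    loss, cache = rnn_forward(X, Y, a_prev, parameters)   # rnn_forward is called here
    gradients, a = rnn_backward(X, Y, parameters, cache)
    gradients = clip(gradients, maxValue=5)
    parameters = update_parameters(parameters, gradients, learning_rate)
    return loss, gradients, a[len(X) - 1]

# Inside model(), the training loop just calls optimize() once per iteration;
# it never needs to touch rnn_forward directly.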
On the point about Y, yes, you need to append the index of newline. But you didn’t in the code we can see in the exception trace you show above.
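For comparison, here is one way the lengths line up once the newline index is appended to the full shifted sequence; again just a hedged toy sketch, not the graded code:

# Hedged sketch: Y is X shifted left by one, with the newline index appended,
# so the two lists end up the same length (toy mapping, not the notebook's vocabulary)
char_to_ix = {'\n': 0, 'a': 1, 'b': 2, 'c': 3}
single_example_ix = [char_to_ix[c] for c in "abc"]
ix_newline = char_to_ix['\n']        # integer representation of the newline character

X = [None] + single_example_ix       # [None, 1, 2, 3]
Y = X[1:] + [ix_newline]             # [1, 2, 3, 0] -- same length as X
print(len(X) == len(Y))              # prints: True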
Ok, finally got it, thanks for your help… X didn’t require any function call at all, just as you said. Makes total sense now, cheers!