I am confused here… I am unable to execute this cell. Can someone assist, please?
I am getting the following error:
'NoneType' object is not subscriptable
I am not sure what is causing it. I think it’s related to predictions, and that could be a side effect of the other code blocks. First of all, I am not able to understand how to fix idx. Could you tell me how to do it?
It looks like you have not filled in some required code somewhere, and the result is that a “None” value is being propagated through your computation until the above error is thrown. The statement that throws the exception should make it clear which variable the message is talking about. Note that it is not “idx”, right? That is the value of the index. What the message is telling you is that the variable you are indexing with idx has type “None”. Now you need to track backwards to figure out why it is “None” instead of the valid array or list or whatever else is expected there.
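As a minimal sketch of how that happens (the names here are made up for illustration, not taken from the assignment):

```python
def compute_predictions():
    # A function with no return statement implicitly returns None
    pass

predictions = compute_predictions()   # predictions is now None
idx = 0
value = predictions[idx]              # TypeError: 'NoneType' object is not subscriptable
```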
This is how debugging works: the first order of business is to understand what the error message is telling you. Then you need to work backwards to figure out how it happened.
If that’s not enough to get you to the next level, then you could try showing us the actual exception trace instead of just the error message. Seeing the line of code that actually threw the error would be pretty informative. Please don’t just post your full source code, but it’s fine to post the exception trace.
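When the failing line subscripts more than one variable, a quick way to work backwards is to print the type of every name involved; here is a throwaway sketch with hypothetical names:

```python
# Pretend "table" came from an unfilled code block and is still None.
table = None
idx = 3

# Check each name used on the failing line; the one reported as
# <class 'NoneType'> is the one to trace back to its assignment.
for name, value in [("table", table), ("idx", idx)]:
    print(name, "->", type(value))
```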
Hi, @paulinpaloalto. I tried debugging but was unable to find the exact reason.
Here is the error:
```
TypeError                                 Traceback (most recent call last)
<ipython-input-...> in <module>
----> 1 parameters, last_name = model(data.split("\n"), ix_to_char, char_to_ix, 22001, verbose = True)
      2
      3 assert last_name == 'Trodonosaurus\n', "Wrong expected output"
      4 print("\033[92mAll tests passed!")

<ipython-input-...> in model(data_x, ix_to_char, char_to_ix, num_iterations, n_a, dino_names, vocab_size, verbose)
     62     # Perform one optimization step: Forward-prop -> Backward-prop -> Clip -> Update parameters
     63     # Choose a learning rate of 0.01
---> 64     curr_loss, gradients, a_prev = optimize(X, Y, a_prev, parameters, learning_rate = 0.01)
     65
     66     ### END CODE HERE ###

<ipython-input-...> in optimize(X, Y, a_prev, parameters, learning_rate)
     32
     33     # Forward propagate through time (≈1 line)
---> 34     loss, cache = rnn_forward(X, Y, a_prev, parameters)
     35
     36     # Backpropagate through time (≈1 line)

~/work/W1A2/utils.py in rnn_forward(X, Y, a0, parameters, vocab_size)
    100
    101     # Update the loss by substracting the cross-entropy term of this time-step from it.
--> 102     loss -= np.log(y_hat[t][Y[t],0])
    103
    104     cache = (y_hat, a, x)

TypeError: 'NoneType' object is not subscriptable
```
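Reading the failing line: loss -= np.log(y_hat[t][Y[t],0]) subscripts both y_hat and Y, so either one could be the None. A self-contained toy that reproduces the same failure when Y is never filled in (the names mirror the trace; the values are made up):

```python
import numpy as np

vocab_size = 5
y_hat = {0: np.full((vocab_size, 1), 1.0 / vocab_size)}  # toy softmax output for time-step 0
Y = None                                                  # the label list was never assigned
t = 0
loss = 0.0
loss -= np.log(y_hat[t][Y[t], 0])  # TypeError: 'NoneType' object is not subscriptable
```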
I’m getting this same error (same stack trace). Was a solution ever found?
I figured it out. It was related to Y; the solution was in the linked post.
Yes, I figured out the error. It worked exactly as described above.
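For future readers, a minimal sketch of what the fix typically looks like, assuming the standard setup of this assignment where X and Y are built inside the training loop of model(). The actual fix is not quoted in this thread, so treat the details as an assumption:

```python
# Hypothetical reconstruction using the assignment's usual names
# (examples, idx, char_to_ix), with stub data so the sketch runs on its own.
examples = ["trodonosaurus"]
char_to_ix = {ch: i for i, ch in enumerate("\nabcdefghijklmnopqrstuvwxyz")}
idx = 0

single_example_chars = examples[idx]                          # one name from the dataset
X = [None] + [char_to_ix[ch] for ch in single_example_chars]  # None marks the zero input at t=0
Y = X[1:] + [char_to_ix["\n"]]                                # labels are the inputs shifted left, plus newline

# If Y were left as None instead, rnn_forward() would later fail with
# exactly the "'NoneType' object is not subscriptable" error above.
```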