Random handwritten digits are not just 0 and 1

I’m trying to run Week 1’s assignment on a different environment, and I’m having an issue. There’s a part of the exercise which states:

‘Let’s compare the predictions vs the labels for a random sample of 64 digits. This takes a moment to run.’

When I run this code in the Coursera environment, the random digits that come out are all 0s and 1s, whereas when I run it in the other environment, I'm getting all digits 0 to 9.

I believe this is the line that picks the indices. How is it filtering for 0s and 1s?

# Select random indices
random_index = np.random.randint(m)
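(For context: np.random.randint(m) just draws one integer uniformly from [0, m); it does no filtering by label, so any restriction to 0s and 1s has to come from the data itself. A quick sanity check, using 10 as a stand-in for m:)

```python
import numpy as np

# np.random.randint(m) returns a single integer in [0, m), uniformly at random.
# It never looks at labels, so it cannot filter for 0s and 1s on its own.
rng_samples = [np.random.randint(10) for _ in range(1000)]

assert all(0 <= i < 10 for i in rng_samples)  # every draw is a valid index
```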

Another problem (which I’m hoping is related): all my yhats are coming out as 1. What am I doing wrong?

Thank you!

Posting the full code here in case it helps (although it's exactly the same as what's already in the cell before):

import warnings
warnings.simplefilter(action='ignore', category=FutureWarning)
# You do not need to modify anything in this cell

m, n = X.shape

fig, axes = plt.subplots(8,8, figsize=(8,8))
fig.tight_layout(pad=0.1,rect=[0, 0.03, 1, 0.92]) #[left, bottom, right, top]

for i,ax in enumerate(axes.flat):
    # Select random indices
    random_index = np.random.randint(m)
    
    # Select rows corresponding to the random indices and
    # reshape the image
    X_random_reshaped = X[random_index].reshape((20,20)).T
    
    # Display the image
    ax.imshow(X_random_reshaped, cmap='gray')
    
    # Predict using the Neural Network
    prediction = model.predict(X[random_index].reshape(1,400))
    if prediction >= 0.5:
        yhat = 1
    else:
        yhat = 0
    
    # Display the label above the image
    ax.set_title(f"{y[random_index,0]},{yhat}")
    ax.set_axis_off()
fig.suptitle("Label, yhat", fontsize=16)
plt.show()

Sorry, I just realized my mistake. The dataset in Coursera has 1,000 examples because the load_data() function is loading a subset of the data, while my np.load() is bringing in all 5,000 examples. I'll filter my X according to the label, and I believe that should solve it!
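(For anyone else who hits this, a minimal sketch of that filter with boolean masking, assuming X is (5000, 400) and y is (5000, 1) as in the full dataset; the arrays below are stand-ins for the real data:)

```python
import numpy as np

# Stand-in data with the same shapes as the full 5,000-example dataset.
X = np.random.rand(5000, 400)
y = np.random.randint(0, 10, (5000, 1))

# Keep only the rows whose label is 0 or 1, matching load_data()'s subset.
mask = (y[:, 0] == 0) | (y[:, 0] == 1)
X_subset, y_subset = X[mask], y[mask]
```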

Hey @ian.pisano

Exactly! In your own exploration, figuring out how to solve it yourself is what matters most!

Keep it up!
Raymond
