Question about a common mistake of using dropout during testing


It mentions in the notebook that - "A common mistake when using dropout is to use it both in training and testing. You should use dropout (randomly eliminate nodes) only in training. "

→ Does this mean that people mistakenly use dropout to train a model with a testing dataset…? I am not sure if I understand the situation under which this mistake occurs.

parameters = model(train_X, train_Y, keep_prob = 0.86, learning_rate = 0.3)

print ("On the train set:")
predictions_train = predict(train_X, train_Y, parameters)
print ("On the test set:")
predictions_test = predict(test_X, test_Y, parameters)

I appreciate your time,

I think what was meant here is the following:

  • You use dropout during training. Each pass has a different random set of nodes being ignored, which in the end reduces reliance on any few "key" inputs. After multiple epochs you still end up with trained weights for all nodes, though.

  • However, during testing you use all of the weights. This is also what you would do in operation on real data. Dropout is only used for training purposes.
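To make the two bullets above concrete, here is a minimal sketch of a forward pass with inverted dropout; the layer sizes and function name are just illustrative, not the course's actual code:

```python
import numpy as np

def forward(X, W1, b1, W2, b2, keep_prob=1.0, training=False):
    """Two-layer forward pass. Inverted dropout is applied to the
    hidden layer only when training=True; at test time the full set
    of weights is used and keep_prob is ignored."""
    A1 = np.maximum(0, W1 @ X + b1)                  # ReLU hidden layer
    if training and keep_prob < 1.0:
        D1 = np.random.rand(*A1.shape) < keep_prob   # random dropout mask
        A1 = A1 * D1 / keep_prob                     # scale up so expected activation is unchanged
    Z2 = W2 @ A1 + b2
    return 1.0 / (1.0 + np.exp(-Z2))                 # sigmoid output
```

At training time you would call `forward(X, ..., keep_prob=0.86, training=True)`; at test time simply `forward(X, ...)`, which uses every node and is deterministic.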

Anyway, this is what I took from the description. Does this make sense?


Thank you for your reply!!! (I apologize for the late reply, I was thinking it through a bit more…)

So I guess I am still confused about the code.

You train the model using the training data and get the optimal parameters with this line of code → parameters = model(train_X, train_Y, keep_prob = 0.86, learning_rate = 0.3).
Then, you make predictions on the train and test set using the predict function.
However, the predict function does not have an argument that allows for dropout (keep_prob) so I am not sure how someone would mistakenly use dropout during testing.

Sorry if my question does not make sense…

I would say that with the given code, it is hard to make that mistake. The message in the lecture, to me, was that if you write everything from scratch (i.e. your own predict function), you should not allow for a dropout parameter (keep_prob) in your predictions.
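To illustrate what that mistake would look like in a from-scratch predict function (everything below is a toy sketch with made-up names, not the assignment code):

```python
import numpy as np

rng = np.random.default_rng(0)
W = rng.standard_normal((1, 8))      # toy output-layer weights
A = rng.standard_normal((8, 100))    # pretend hidden activations for 100 examples

def predict_right(A, W):
    # Correct: the full network is used -- predictions are deterministic.
    return W @ A

def predict_wrong(A, W, keep_prob=0.86):
    # MISTAKE: a dropout mask is still applied at prediction time,
    # so each call randomly ignores some hidden units.
    D = rng.random(A.shape) < keep_prob
    return W @ (A * D / keep_prob)
```

Because `predict_wrong` accepts a keep_prob argument, two calls on the same input give different answers, which is exactly why a hand-written predict function should not offer one.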


Thank you for the clarification!