Inverted Dropout

As with all forms of regularization (L2, dropout, ...), dropout is applied only during training. If the 1/keep_prob scaling factor stays in the code, remember that keep_prob is set to 1 whenever you are not actually doing dropout. That is how you disable dropout when it is part of the code, so that it does not happen at test time: pass a value < 1 during training, and pass 1 in all other cases.
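
Here is a minimal NumPy sketch of that idea. The function name `dropout_forward` and the parameter layout are illustrative, not from the original post; the point is that the same code path handles both cases, since with keep_prob = 1 the mask is all ones and the division by 1 is a no-op.

```python
import numpy as np

def dropout_forward(a, keep_prob=1.0):
    """Inverted dropout: zero out units with probability 1 - keep_prob,
    then scale survivors by 1/keep_prob so the expected activation is
    unchanged. Pass keep_prob=1.0 to disable dropout (e.g., at test time)."""
    mask = (np.random.rand(*a.shape) < keep_prob).astype(a.dtype)
    return a * mask / keep_prob

a = np.ones((3, 4))
a_train = dropout_forward(a, keep_prob=0.8)  # training: ~20% dropped, survivors scaled to 1.25
a_test  = dropout_forward(a, keep_prob=1.0)  # test: identity, dropout effectively off
```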