I have a question about the dropout method for regularization.
We use a probability that indicates the chance of a particular neuron being dropped in a given iteration. My question is: isn't it entirely possible that in some iteration an entire layer gets dropped out, i.e., all the neurons of that layer are removed at once? What happens if something like this occurs?
Hello, @subhrajitm20,
Yes, it's possible, and the TensorFlow (and PyTorch) implementations give it no special treatment: a3 (a^{[3]}) will simply be all zeros, and the forward pass carries on as usual.
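Here is a minimal NumPy sketch (not the framework internals) of inverted dropout as taught in the course; the variable names a3, d3, and keep_prob follow the course notation, and the all-dropped mask is forced by hand just to illustrate the case you asked about:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy layer 3: 4 hidden units, batch of 3 examples.
a3 = rng.standard_normal((4, 3))       # activations of layer 3
keep_prob = 0.5

# Inverted dropout: sample a keep-mask, zero out dropped units, rescale.
d3 = rng.random(a3.shape) < keep_prob
a3_kept = (a3 * d3) / keep_prob

# Force the unlucky case: every unit in the layer is dropped.
d3_all_dropped = np.zeros_like(a3, dtype=bool)
a3_dropped = (a3 * d3_all_dropped) / keep_prob
print(a3_dropped)                      # all zeros -- no error is raised

# The next layer's pre-activation then reduces to just its bias:
W4 = rng.standard_normal((2, 4))
b4 = rng.standard_normal((2, 1))
z4 = W4 @ a3_dropped + b4
print(z4)                              # each column equals b4
```

As the sketch shows, nothing breaks: the zeros just propagate, so the next layer's z^{[4]} collapses to its bias for that iteration. Note also that with independent drops the probability of losing all n units of a layer in one pass is (1 - keep_prob)^n, which is vanishingly small for realistic layer widths.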
Cheers,
Raymond