Dropout Technique

I am a bit confused about the dropout implementation. In a single iteration there are m training examples; is dropout applied to each example separately? For instance, if we deactivate the first two hidden units in layer one for the first example, do we have to deactivate the same first two units in layer one for the second example in the same iteration, or will different examples have different hidden units deactivated within the same iteration?
In the course it is mentioned that different iterations will have different dropped-out neurons, but the case of different examples within the same iteration is not mentioned.

Hi, @Vinayak.

Neurons are dropped randomly and independently for each training example. The implementation is vectorized, but don't let that confuse you: the dropout mask has the same shape as the activation matrix, so each column (each example) gets its own random pattern.

I’m sure it will become clear once you implement forward_propagation_with_dropout in the lab :slight_smile:
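To make this concrete, here is a minimal sketch of inverted dropout (the variant used in the course). The names `A1`, `D1`, and `keep_prob` follow the lab's conventions; the shapes `(n_units, m)` are illustrative. Because the mask is drawn with the same shape as the activations, every column of `D1` is an independent draw, i.e. each training example gets its own set of dropped units:

```python
import numpy as np

np.random.seed(1)

keep_prob = 0.8
n_units, m = 4, 3  # 4 hidden units, 3 training examples (illustrative sizes)

# Activations of layer 1: one column per training example
A1 = np.random.randn(n_units, m)

# Inverted dropout: the mask D1 has the same shape as A1,
# so each column (= each example) is masked independently.
D1 = (np.random.rand(n_units, m) < keep_prob).astype(int)
A1 = A1 * D1          # zero out the dropped units
A1 = A1 / keep_prob   # scale up so the expected activation is unchanged

print(D1)  # the columns are independent draws, so different
           # examples generally have different units dropped
```

The division by `keep_prob` is what makes this "inverted" dropout: it keeps the expected value of the activations the same, so no rescaling is needed at test time.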