The technique of Dropout was discussed in the context of fully connected layers, but it then appears without explanation in the U-Net assignment, in a convolutional setting. Could you please add some discussion of Dropout in that context?
The original paper states the following:
“Drop-out layers at the end of the contracting path perform further implicit data augmentation”.
This is subtle and points to a function of dropout layers beyond plain regularization. Perhaps this is why it is left out of the discussion at this point.
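For concreteness, here is a minimal sketch of dropout in a convolutional contracting-path block, assuming a PyTorch implementation (the class name `ContractingBlock` and the layer sizes are illustrative, not taken from the assignment). One point worth discussing: in convolutional layers one can use ordinary `nn.Dropout`, which zeroes individual activations, or `nn.Dropout2d` ("spatial" dropout), which zeroes entire feature maps; because neighboring activations in a feature map are strongly correlated, spatial dropout is often the more meaningful choice.

```python
import torch
import torch.nn as nn

class ContractingBlock(nn.Module):
    """Illustrative U-Net-style contracting block with dropout at the end."""

    def __init__(self, in_ch, out_ch, p=0.5):
        super().__init__()
        self.block = nn.Sequential(
            nn.Conv2d(in_ch, out_ch, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(out_ch, out_ch, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            # Spatial dropout: zeroes whole channels during training,
            # acts as the identity in eval mode.
            nn.Dropout2d(p=p),
        )

    def forward(self, x):
        return self.block(x)

block = ContractingBlock(1, 8)
x = torch.randn(2, 1, 32, 32)

block.train()
y_train = block(x)  # dropout active: some channels are zeroed
block.eval()
y_eval = block(x)   # dropout disabled at inference
```

Since the dropped channels differ on every training-mode forward pass, the network effectively sees perturbed versions of the same input, which is one way to read the paper's "implicit data augmentation" remark.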