I’m in Week 1 of the Advanced Learning Algorithms course. I might have missed the definition of a dense layer, so I want to ask whether my understanding is correct.
A dense layer is a layer that connects to every neuron of its preceding layer — in other words, it uses all the outputs from the preceding layer?
Yes, your understanding of dense layers is correct! A dense layer, also known as a fully connected layer, connects every neuron in its preceding layer to every neuron in the dense layer. This means that each neuron in a dense layer receives input from all the neurons in the previous layer.
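To make that concrete, here is a minimal NumPy sketch of a single dense layer (the function name, sizes, and tanh activation are just illustrative choices, not from the course): each unit computes a weighted sum over the *entire* previous activation vector, which is exactly the "uses all the outputs from the preceding layer" property you described.

```python
import numpy as np

def dense(a_prev, W, b, activation=np.tanh):
    # z = W @ a_prev + b: every row of W multiplies the FULL previous
    # activation vector, so each unit sees all outputs of the prior layer.
    z = W @ a_prev + b
    return activation(z)

# Hypothetical sizes: 4 outputs from the previous layer, 3 units in this layer.
rng = np.random.default_rng(0)
W = rng.standard_normal((3, 4))  # one row of weights per unit, one column per input
b = np.zeros(3)
a_prev = rng.standard_normal(4)

a = dense(a_prev, W, b)
print(a.shape)  # 3 activations, each computed from all 4 previous outputs
```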
Well, dense layers themselves do not always cause overfitting. Whether they contribute to overfitting depends on various factors, including the specific problem, the dataset, the size of the neural network, and so on. That’s why we often add dropout layers (a regularization technique) to prevent overfitting.
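As a rough sketch of how dropout regularizes a dense layer (this is the common "inverted dropout" formulation, written from scratch rather than taken from the course materials): during training, a random fraction of activations is zeroed and the survivors are rescaled so the expected activation is unchanged at inference time.

```python
import numpy as np

def dropout(a, rate, rng, training=True):
    # During training, zero a fraction `rate` of the activations and scale
    # the survivors by 1/(1 - rate) ("inverted dropout") so the expected
    # value matches what the layer produces at inference time.
    if not training:
        return a
    keep = rng.random(a.shape) >= rate
    return a * keep / (1.0 - rate)

rng = np.random.default_rng(0)
a = np.ones(1000)
dropped = dropout(a, rate=0.5, rng=rng)
# Roughly half the activations are zeroed, yet the mean stays near 1.0.
print((dropped == 0).mean(), dropped.mean())
```

Because each forward pass randomly disables different units, no single neuron in the dense layer can rely on any one input, which discourages the co-adaptation that drives overfitting.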