Exploring Dropouts

Week 3 Video: Exploring Dropouts, at timestamp 1:27. In the graph, the validation accuracy already seems higher than the training accuracy at epoch 0, and it seems to decrease over epochs instead of increasing. Why?

Secondly, does this mean that manually training on the data is insignificant in this scenario?


As you have seen in the course, dropout is used to prevent overfitting. Validation accuracy can start out higher than training accuracy because dropout is only active during training: training accuracy is measured with some units randomly disabled, while validation accuracy is measured with the full network. Validation accuracy can then decrease over some training epochs because you are predicting on data the model hasn't seen, and the model may begin to overfit. Manually training on the data is applicable too, but use a network with few nodes to prevent the model from overfitting.
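To see why training and validation accuracy are measured under different conditions, here is a minimal sketch of inverted dropout in pure Python (frameworks such as Keras or PyTorch do this internally; the function name and parameters are illustrative, not the course's code). During training, each unit is kept with probability `keep_prob` and scaled by `1/keep_prob`; at validation/test time the layer acts as an identity, so the full network is used:

```python
import random

def dropout(activations, keep_prob, training):
    """Inverted dropout: randomly zero units during training only."""
    if not training:
        return list(activations)          # evaluation: no units dropped
    return [a / keep_prob if random.random() < keep_prob else 0.0
            for a in activations]

acts = [1.0] * 1000
train_out = dropout(acts, keep_prob=0.5, training=True)
eval_out = dropout(acts, keep_prob=0.5, training=False)

print(sum(v == 0.0 for v in train_out))  # roughly half the units are zeroed
print(eval_out == acts)                  # True: identity at evaluation time
```

Because the training-mode output has roughly half its units zeroed, the training curve reflects a handicapped network, while the validation curve reflects the full one; that gap is what makes validation accuracy look higher early on.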