Hi @Anatoliy_Elsukov
It seems your model is overfitting. When you add dropout with a 20% rate, it introduces a regularization effect by randomly dropping neurons during training. This encourages the network to learn more robust and generalizable representations. Dropout can also cause fluctuations in the accuracy values, because the model effectively sees a different sub-network at each training step and may focus on different features in different iterations.
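For reference, here is a minimal sketch of what adding that dropout layer looks like in Keras (the layer sizes here are placeholders, not the ones from your assignment):

```python
import tensorflow as tf

# Minimal sketch: Dropout(0.2) randomly zeroes 20% of the previous
# layer's activations on each training step; it is a no-op at inference.
model = tf.keras.Sequential([
    tf.keras.layers.Dense(128, activation='relu'),
    tf.keras.layers.Dropout(0.2),  # 20% dropout rate, as in your model
    tf.keras.layers.Dense(10, activation='softmax'),
])
```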
Hi @carlosrl
Thank you for your response, but it is hard to call the model overfitted: validation accuracy is higher than training accuracy. In any case, the plots are very close and I don't see any benefit from dropout in this model.
Hello @Anatoliy_Elsukov! You are right that dropout doesn't help much in this case. But try using a dropout rate of 0.5 instead of 0.2, then comment out the dropout layer entirely and compare the results. Maybe you'll see a significant difference.
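A minimal sketch of that experiment, assuming a Keras setup; the layer sizes are placeholders and `x_train` / `y_train` stand in for whatever dataset your assignment uses:

```python
import tensorflow as tf

def build_model(dropout_rate=None):
    """Build the same model with or without a dropout layer."""
    layers = [tf.keras.layers.Dense(128, activation='relu')]
    if dropout_rate is not None:
        layers.append(tf.keras.layers.Dropout(dropout_rate))
    layers.append(tf.keras.layers.Dense(10, activation='softmax'))
    model = tf.keras.Sequential(layers)
    model.compile(optimizer='adam',
                  loss='sparse_categorical_crossentropy',
                  metrics=['accuracy'])
    return model

# Train both variants on the same data and compare the curves:
# hist_drop = build_model(0.5).fit(x_train, y_train, validation_split=0.1, epochs=10)
# hist_none = build_model(None).fit(x_train, y_train, validation_split=0.1, epochs=10)
# Then plot hist_drop.history['val_accuracy'] against hist_none.history['val_accuracy'].
```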
PS: Do this experiment after submitting your assignment.