- Question 8: Increasing the parameter keep_prob from (say) 0.5 to 0.6 will likely cause the following: (Check the two that apply)
  - Increasing the regularization effect
  - Reducing the regularization effect
  - Causing the neural network to end up with a higher training set error
  - Causing the neural network to end up with a lower training set error
Could anyone please explain how it can lead to a decrease in regularisation, and how it can lead to a lower training set error?
Hi @ajaykumar3456, from your question you seem to have a good understanding of how keep_prob works. What will happen when you decrease keep_prob from 0.5 to, say, 0.4?
When keep_prob decreases to 0.4, we eliminate more neurons in that particular layer on each training iteration.
Correct me if I am wrong.
If we are eliminating more neurons, then more regularisation is being applied, because the network is being simplified.
So what will happen the other way around, when you increase keep_prob?
If you increase keep_prob, you allow more neurons to stay active in the layer, so less regularisation is applied. At the same time, because the network keeps more of its capacity and is less constrained by regularisation, it can fit the training data more closely, so the training set error will be lower.
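To make this concrete, here is a minimal sketch of inverted dropout in numpy (the function name `apply_dropout` and the toy activations are my own illustration, not code from the course). It shows that a higher keep_prob leaves a larger fraction of neurons active, which is why the regularisation effect weakens:

```python
import numpy as np

def apply_dropout(a, keep_prob, rng):
    """Inverted dropout: zero out each neuron with probability
    (1 - keep_prob), then divide by keep_prob so the expected
    activation value is unchanged."""
    mask = rng.random(a.shape) < keep_prob  # True = neuron kept
    return (a * mask) / keep_prob

rng = np.random.default_rng(0)
a = np.ones((10000, 1))  # toy layer activations, all ones

# Higher keep_prob -> more neurons survive -> weaker regularisation
frac_kept_05 = np.mean(apply_dropout(a, 0.5, rng) > 0)  # ~0.5 of neurons kept
frac_kept_06 = np.mean(apply_dropout(a, 0.6, rng) > 0)  # ~0.6 of neurons kept
```

The division by keep_prob (the "inverted" part) keeps the expected value of the activations the same, so no extra scaling is needed at test time.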
@sjfischer Please correct me if I am wrong.
And thanks for the way you clarify doubts by giving hints.