Dropout: Why divide by keep prob?
Course Q&A - Deep Learning Specialization - Improving Deep Neural Networks: Hyperparameter Tuning, Regularization and Optimization
paulinpaloalto
July 10, 2021, 11:19pm
Please see this recent thread for a pretty thorough discussion of this issue.
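Since the question is about why inverted dropout divides the activations by keep_prob, here is a minimal NumPy sketch (not taken from the linked thread; the names A and keep_prob follow the course's notation) illustrating that the division keeps the expected activation values roughly unchanged, so no extra scaling is needed at test time:

```python
import numpy as np

# Minimal sketch of inverted dropout, assuming A holds a layer's activations
# and keep_prob is the probability of keeping each unit.
def inverted_dropout(A, keep_prob):
    # Keep each unit independently with probability keep_prob.
    mask = (np.random.rand(*A.shape) < keep_prob).astype(A.dtype)
    A = A * mask              # zero out roughly (1 - keep_prob) of the units
    return A / keep_prob      # rescale so the expected activation is unchanged

# Quick check: the mean activation is approximately preserved.
A = np.ones((1000, 1000))
print(A.mean(), inverted_dropout(A, keep_prob=0.8).mean())  # both ~1.0
```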
Related topics (Improving Deep Neural Networks: Hyperparameter Tuning, Regularization and Optimization):
- Doubt related to Inverted Dropout Technique (2 replies, 826 views, February 16, 2023)
- Regularization by Inverted Dropout (1 reply, 690 views, August 12, 2021)
- Inverted dropout Intuition? (3 replies, 691 views, May 24, 2022)
- Dropout scaling fix (division by keep_prob) (3 replies, 681 views, September 28, 2022)
- Expected Value Stays the Same After Scaling In Dropout (2 replies, 570 views, October 4, 2021)