Doubt related to Inverted Dropout Technique

There are a couple of additional, slightly more subtle points about how they have us build dropout in the assignment. As they usually do here, they set the random seed to a fixed value, so that the test cases and the graders are easy to write. But that means we're literally dropping the exact same neurons on every iteration, which is not how dropout really works. Here's a thread which discusses that point and even includes some experiments.
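Just to make the mechanics concrete, here's a minimal NumPy sketch of inverted dropout (the function name, shapes, and RNG handling are my own for illustration, not the assignment's code): with a fresh random stream the mask changes on every call, whereas re-seeding to the same fixed value before each call would reproduce the identical mask every time, which is the behavior the fixed seed creates.

```python
import numpy as np

def inverted_dropout_forward(A, keep_prob, rng):
    """Apply inverted dropout to activations A, scaling survivors by 1/keep_prob."""
    # Fresh random mask each call: a different set of neurons is dropped each iteration
    D = rng.uniform(size=A.shape) < keep_prob
    A_dropped = (A * D) / keep_prob  # scale up so the expected activation is unchanged
    return A_dropped, D

# With one ongoing RNG stream the mask differs between iterations (real dropout).
# Re-seeding to a fixed value before every call would freeze the mask instead.
rng = np.random.default_rng()
A = np.ones((4, 3))
_, mask1 = inverted_dropout_forward(A, keep_prob=0.8, rng=rng)
_, mask2 = inverted_dropout_forward(A, keep_prob=0.8, rng=rng)
print(np.array_equal(mask1, mask2))  # usually False: different neurons dropped each time
```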

The other point is: if you are dropping the same neurons every time, isn't that equivalent to just defining a smaller network to start with and dispensing with all the dropout business? Maybe, but here's a thread which discusses that point a bit more.
