Week 1 ex 2 forward_propagation_with_dropout

I am stuck on ex 2 of week 1. I'm sure my code is correct, but the autograder passes one test and fails the other.

I’ve looked at all the other threads, and:

  • I do use np.random.rand, not randn
  • I pass shape[0] and shape[1] as the arguments when generating the mask
  • I use less than (<), not greater than, when building the mask matrix
  • I use np.multiply for A1
  • I do divide by keep_prob to scale
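For anyone cross-checking their own attempt against that list, here is a rough sketch of the four-step inverted-dropout recipe the bullets describe, applied to one hidden activation. The variable names (`A1`, `D1`, `keep_prob`) follow the usual notebook conventions; the example values are made up:

```python
import numpy as np

np.random.seed(1)
keep_prob = 0.8
A1 = np.random.randn(4, 5)  # stand-in for the layer-1 activations

# Step 1: uniform random matrix with the same shape as A1 (rand, not randn)
D1 = np.random.rand(A1.shape[0], A1.shape[1])
# Step 2: "less than" turns it into a 0/1 keep mask
D1 = (D1 < keep_prob).astype(int)
# Step 3: zero out the dropped units
A1 = np.multiply(A1, D1)
# Step 4: scale up by keep_prob so expected activations are unchanged
A1 = A1 / keep_prob
```

If all four steps match your code, the bug is most likely somewhere outside this block.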

What is the “if all else fails, break glass” method the other thread mentions?

The emergency method is to share the source code using Direct Messages (DMs), which are private conversations. I'll send you a DM to initiate the process.

Everything you say in your description sounds correct. The only concern is that there are two layers and both need to have dropout applied, right? Not just A1, but also A2. But if you got all the other stuff right, I’m sure you got that as well. The comments in the template are pretty clear.
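To make the "both layers" point concrete, here is a rough sketch of what a full three-layer forward pass with dropout on both hidden activations looks like. This is not the notebook's actual solution; the parameter naming and layer sizes are assumptions for illustration:

```python
import numpy as np

def forward_with_dropout(X, params, keep_prob=0.7):
    """Sketch: 3-layer forward prop with inverted dropout on A1 AND A2."""
    W1, b1 = params["W1"], params["b1"]
    W2, b2 = params["W2"], params["b2"]
    W3, b3 = params["W3"], params["b3"]

    relu = lambda z: np.maximum(0, z)
    sigmoid = lambda z: 1 / (1 + np.exp(-z))

    A1 = relu(W1 @ X + b1)
    D1 = (np.random.rand(*A1.shape) < keep_prob).astype(int)
    A1 = A1 * D1 / keep_prob            # dropout on hidden layer 1

    A2 = relu(W2 @ A1 + b2)
    D2 = (np.random.rand(*A2.shape) < keep_prob).astype(int)
    A2 = A2 * D2 / keep_prob            # dropout on hidden layer 2 as well

    A3 = sigmoid(W3 @ A2 + b3)          # no dropout on the output layer
    return A3
```

The easy mistake is writing the four mask lines once for A1 and forgetting to repeat them for A2.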

For anyone else scouring the forums: my bug was that I had accidentally modified something outside the "YOUR CODE HERE" block and never noticed the inadvertent change. Thanks to Paul for spotting that for me and getting me out of that bang-my-head-on-the-keyboard loop.

Yes, it was a tough one to spot: I read over the core solution code about 5 times and couldn't see anything wrong. But then I read the whole function from top to bottom and finally saw the error. As a general matter, it's not against the rules to add or modify code outside of the "YOUR CODE HERE" sections, but you need to make sure you know what you're doing when you "go there". :grinning:

The grader doesn’t examine your source code: it just calls your functions and checks the return values.