Problem with the Initialization Assignment in C2 W1

Hello all, from Ethiopia.
I am taking the course "Improving Deep Neural Networks: Hyperparameter Tuning, Regularization and Optimization". While working on the Week 1 programming assignment, Exercise 2, I filled in the missing code with

{moderator edit - solution code removed}

But it gives me the following output and error message:
2 Tests
1 Tests Failed
AssertionError: Not all tests were passed for initialize_parameters_random. Check your equations and avoid using global variables inside the function.
I could not identify where the global variable is.
Thank you!

What is the name of the notebook file, and what function are you working on?

Note that the error message doesn’t say you’re using a global variable. It just points out a common mistake, so you can check whether you’re using a global variable.

Please take another look at the instructions for that function. They explicitly describe how to initialize the weight and bias values, and the way you have done it does not agree with them. You are correct that the W values use np.random.randn, but the factor you multiply them by is different from what the instructions say. And for the b values, they tell you to use np.zeros, not np.random.randn.
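To make the general pattern concrete (without posting the assignment's solution), here is a minimal sketch of random weight initialization with zero biases. The scaling factor `scale`, the seed, and the function signature are placeholders I chose for illustration; use whatever the notebook's instructions actually specify:

```python
import numpy as np

def initialize_parameters_random(layer_dims, scale=10.0):
    # Generic sketch: W values drawn from np.random.randn and scaled
    # by a constant factor; b values created with np.zeros.
    # `scale` and the seed are illustrative assumptions, not the
    # assignment's required values.
    np.random.seed(3)
    parameters = {}
    L = len(layer_dims)
    for l in range(1, L):
        parameters["W" + str(l)] = np.random.randn(layer_dims[l], layer_dims[l - 1]) * scale
        parameters["b" + str(l)] = np.zeros((layer_dims[l], 1))
    return parameters

params = initialize_parameters_random([3, 2, 1])
print(params["W1"].shape)  # (2, 3)
print(params["b1"])        # a column of zeros, shape (2, 1)
```

The key point is the asymmetry: only the weights are random (to break symmetry between units), while the biases start at zero.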

Thank you so much. You have detected where the mistake was!!

I’m glad to hear that the answer was helpful. Not to put too fine a point on it, but please make sure that the “meta” lesson is learned there: the instructions in the notebooks here are very thorough. In some cases it almost feels like they go a bit too far: e.g. in this case they almost literally wrote out the solution for you in the instructions. So in general it’s not a good strategy to “save time” by not reading the instructions carefully: you usually end up wasting a lot more time than you “saved”. Onward! :nerd_face: