Week 3, Programming Assignment, Exercise 8

Hi everyone,
When running nn_model(), the cost isn't changing across iterations, and I am getting this error. What do you think is causing the problem?
Thanks in advance.

If the cost doesn’t change, then the first place I would look is at your “update parameters” logic.

Notice that your b1 and b2 values are still zero, so I’ll bet your W values have not changed from their original values either. So why are they not changing? If your update_parameters function passed its test case, then there must be something wrong with the way you are calling it. That logic should be in nn_model.
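As a minimal sketch of the calling pattern (the numbers and the one-line update_parameters here are made up, just to show the idea): the crucial detail in nn_model is assigning the function’s return value back to parameters on every iteration.

```python
# Simplified stand-in for the assignment's update_parameters, just to
# illustrate the calling pattern; the real one updates W1, b1, W2, b2.
def update_parameters(parameters, grads, learning_rate=1.2):
    # gradient-descent step on every parameter
    return {key: parameters[key] - learning_rate * grads["d" + key]
            for key in parameters}

parameters = {"W1": 0.5, "b1": 0.0}
grads = {"dW1": 0.1, "db1": 0.2}

for _ in range(3):
    # Without this assignment, `parameters` never changes and the
    # cost stays constant from iteration to iteration.
    parameters = update_parameters(parameters, grads)

print(parameters)  # W1 and b1 have moved away from their initial values
```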

1 Like

The logic for updating the weights by subtracting the gradients may not be implemented correctly. You should have something like W1 = W1 - learning_rate * gradient(wrt W1)
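In NumPy form (the shapes, values, and learning rate below are made up), that same step applied to all four parameters looks like:

```python
import numpy as np

# Gradient-descent step: each parameter moves against its gradient.
learning_rate = 1.2
parameters = {"W1": np.ones((4, 2)), "b1": np.zeros((4, 1)),
              "W2": np.ones((1, 4)), "b2": np.zeros((1, 1))}
grads = {"dW1": np.full((4, 2), 0.5), "db1": np.full((4, 1), 0.5),
         "dW2": np.full((1, 4), 0.5), "db2": np.full((1, 1), 0.5)}

for key in parameters:
    # theta = theta - learning_rate * d_theta
    parameters[key] = parameters[key] - learning_rate * grads["d" + key]

print(parameters["W1"][0, 0])  # 1 - 1.2 * 0.5 = 0.4
```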

1 Like

One thing that can cause this behavior is misspelling the variable name that receives the relevant return value from update_parameters. The updated values are passed back in a dictionary.
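A toy illustration of that pitfall (the one-parameter update_parameters here is hypothetical): if the name receiving the returned dictionary is misspelled, the original parameters are silently left untouched.

```python
# Hypothetical one-parameter version, just to show the failure mode.
def update_parameters(parameters, grads, learning_rate=0.1):
    return {"W1": parameters["W1"] - learning_rate * grads["dW1"]}

parameters = {"W1": 1.0}
grads = {"dW1": 1.0}

# Misspelled target: `paramters` receives the update, while the
# `parameters` dict used by the rest of the loop never changes.
paramters = update_parameters(parameters, grads)

print(parameters["W1"])  # still 1.0 -- the symptom described above
```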

Hi, I had exactly the same issue and solved it by simply un-indenting all of the function calls inside nn_model.

Yes, indentation is a key part of the syntax of python. Changing it fundamentally changes the meaning of the code. Getting it right is essential!
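A small demonstration of why: the same two statements produce different results depending only on how the second one is indented.

```python
def inside_loop():
    s = 0
    for i in range(3):
        s += i
        s *= 2   # indented: runs on every iteration of the loop
    return s

def after_loop():
    s = 0
    for i in range(3):
        s += i
    s *= 2       # un-indented: runs once, after the loop finishes
    return s

print(inside_loop(), after_loop())  # same statements, different results
```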

Hello everyone,
I also have an issue running nn_model(). I get an assertion error for the shape of A2, even though that shape had already been verified as fine in the previous function, forward_propagation.
What do you think I am missing here?


A perfectly correct function can still throw errors if you pass it mismatched or incorrect parameters. So the bug is somewhere else. Most likely it’s in your nn_model logic, since all the other routines passed their tests. What is the actual shape you are getting for A2 in the failing case? What should it be, and why is it different? Then work backwards …
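One way to investigate, sketched here with made-up layer sizes (tanh hidden layer, sigmoid output, as in this assignment), is to run the forward pass on a tiny input and check the shape of A2 directly; it should be (n_y, m), one row per output unit and one column per example.

```python
import numpy as np

# Made-up sizes: 2 input features, 4 hidden units, 1 output, 3 examples.
n_x, n_h, n_y, m = 2, 4, 1, 3
X = np.random.randn(n_x, m)
W1 = np.random.randn(n_h, n_x); b1 = np.zeros((n_h, 1))
W2 = np.random.randn(n_y, n_h); b2 = np.zeros((n_y, 1))

A1 = np.tanh(W1 @ X + b1)                # hidden layer activations
A2 = 1 / (1 + np.exp(-(W2 @ A1 + b2)))   # sigmoid output layer

print(A2.shape)  # expect (1, 3) here: (n_y, m)
```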

Of course there is a small caveat here: it’s possible that the test cases for some of the earlier functions didn’t catch all possible bugs.

1 Like

Thanks a lot, the issue indeed came from the initialization of the parameters: I had put the inputs in the wrong order…
(I tried to print the shape of A2 in nn_model, but the print wasn’t appearing, since the assertion was directly in the forward_propagation function and thus stopped the algorithm before the print was reached.
So today, after a little thought, I put a print(A2.shape) in forward_propagation… I feel bad for thinking of it so late… :upside_down_face:)
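For anyone else hitting this, here is a hypothetical initialize_parameters with the assignment’s (n_x, n_h, n_y) argument order, showing how swapping the arguments silently produces wrongly shaped weight matrices that only blow up later in forward propagation.

```python
import numpy as np

# Hypothetical sketch of initialize_parameters; the argument ORDER is
# the point: nothing fails here, the shapes are just wrong.
def initialize_parameters(n_x, n_h, n_y):
    return {"W1": np.random.randn(n_h, n_x) * 0.01,
            "b1": np.zeros((n_h, 1)),
            "W2": np.random.randn(n_y, n_h) * 0.01,
            "b2": np.zeros((n_y, 1))}

n_x, n_h, n_y = 2, 4, 1
good = initialize_parameters(n_x, n_h, n_y)
bad = initialize_parameters(n_y, n_h, n_x)   # inputs in the wrong order

print(good["W2"].shape, bad["W2"].shape)  # (1, 4) vs (2, 4)
```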