Week 4 assignment 2 initialize_parameters

Hi,
I had a problem with the first exercise (the one that takes layers_dims): it passed only two tests and failed the other two. From reading the other topics and threads here, I found out that it's something in the initialization of the parameters, but I still can't figure it out.
I know I shouldn't pass hard-coded values or make it the same as in Week 3.
I would really appreciate a hint or an example of how it should be done.
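For anyone landing here later, the usual shape of a two-layer initialization looks roughly like this. This is only a sketch, not the graded solution: the argument names `n_x`, `n_h`, `n_y` and the fixed seed are assumptions based on the course convention, and the `0.01` scale is the one used in earlier weeks.

```python
import numpy as np

def initialize_parameters(n_x, n_h, n_y):
    # Sketch of the course convention (assumed): weight matrices have shape
    # (units_out, units_in), weights are small random values, biases are zero.
    np.random.seed(1)  # assumption: the course tests fix a seed like this
    W1 = np.random.randn(n_h, n_x) * 0.01
    b1 = np.zeros((n_h, 1))
    W2 = np.random.randn(n_y, n_h) * 0.01
    b2 = np.zeros((n_y, 1))
    return {"W1": W1, "b1": b1, "W2": W2, "b2": b2}
```

The key point is that nothing here is hard-coded to a particular dataset: all shapes come from the function's arguments, and nothing is read from global variables.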
Here is the error I get:

Cost after iteration 1: 0.6925272135228361
Cost after first iteration: 0.693049735659989
Cost after iteration 1: 0.6919261002473156
Cost after iteration 1: 0.6919261002473156
Cost after iteration 1: 0.6919261002473156
Error: Wrong output for variable W1.
Error: Wrong output for variable b1.
Error: Wrong output for variable W2.
Error: Wrong output for variable b2.
Cost after iteration 2: 0.6768602138414122
Error: Wrong output for variable W1.
Error: Wrong output for variable b1.
Error: Wrong output for variable W2.
Error: Wrong output for variable b2.
 2  Tests passed
 2  Tests failed
---------------------------------------------------------------------------
AssertionError                            Traceback (most recent call last)
<ipython-input-17-f9ec5304d38d> in <module>
      3 print("Cost after first iteration: " + str(costs[0]))
      4 
----> 5 two_layer_model_test(two_layer_model)

~/work/release/W4A2/public_tests.py in two_layer_model_test(target)
     75     ]
     76 
---> 77     multiple_test(test_cases, target)
     78 
     79 

~/work/release/W4A2/test_utils.py in multiple_test(test_cases, target)
    140         print('\033[92m', success," Tests passed")
    141         print('\033[91m', len(test_cases) - success, " Tests failed")
--> 142         raise AssertionError("Not all tests were passed for {}. Check your equations and avoid using global variables inside the function.".format(target.__name__))
    143 

AssertionError: Not all tests were passed for two_layer_model. Check your equations and avoid using global variables inside the function.

Hello Ahmed Mohammed,

Welcome to the community.

It's a two_layer_model, and you are passing the test for the first iteration but failing for the second. How could that happen?

Check your implementation of linear_activation_backward.

You should first compute dA1, dW2 & db2 with the sigmoid activation, and then dA0, dW1 & db1 with the relu activation, to get the final output.


Thank you very much for your help, Rashmi. I spent about 12 hours on this, and you solved it in a minute.

Thank you again!