DNN Course 1, Week 4, A2: wrong output for variable W1

Cost after iteration 1: 0.6925272135228361
Cost after first iteration: 0.693049735659989
Cost after iteration 1: 0.6919261002473156
Cost after iteration 1: 0.6919261002473156
Cost after iteration 1: 0.6919261002473156
Error: Wrong output for variable W1.
Error: Wrong output for variable b1.
Error: Wrong output for variable W2.
Error: Wrong output for variable b2.
Cost after iteration 2: 0.6768602138414122
Error: Wrong output for variable W1.
Error: Wrong output for variable b1.
Error: Wrong output for variable W2.
Error: Wrong output for variable b2.
2 Tests passed
2 Tests failed

AssertionError Traceback (most recent call last)
in
3 print("Cost after first iteration: " + str(costs[0]))
4
----> 5 two_layer_model_test(two_layer_model)

~/work/release/W4A2/public_tests.py in two_layer_model_test(target)
75 ]
76
---> 77 multiple_test(test_cases, target)
78
79

~/work/release/W4A2/test_utils.py in multiple_test(test_cases, target)
140 print('\033[92m', success, " Tests passed")
141 print('\033[91m', len(test_cases) - success, " Tests failed")
---> 142 raise AssertionError("Not all tests were passed for {}. Check your equations and avoid using global variables inside the function.".format(target.__name__))
143

AssertionError: Not all tests were passed for two_layer_model. Check your equations and avoid using global variables inside the function.

I have gone through all the posts on this topic. I have not used my A1 function here, and I have not made any incorrect function parameter call for `learning_rate`. I'm not sure what new bug is present here; any help would be appreciated! BTW, my L-layer model in A2 works just fine; it's only the two-layer model that has this issue.

Well, there are lots of details to get right in two_layer_model. Are you sure you used the correct activation functions in the right places? E.g. in backward prop, the order is reversed, right?

And are you sure you did not use the "Deep" version of the initialization routine for the two layer case?
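To illustrate the point about initialization: the two-layer routine takes explicit sizes `(n_x, n_h, n_y)`, whereas the deep version takes a list of layer dimensions (and, in some versions of the assignment, uses a different scaling). A minimal sketch of the two-layer case, re-implemented here for illustration rather than copied from the course code:

```python
import numpy as np

def initialize_parameters(n_x, n_h, n_y, seed=1):
    """Two-layer initializer sketch: small random weights, zero biases.
    Shapes follow the assignment's convention W[l]: (n[l], n[l-1])."""
    rng = np.random.default_rng(seed)
    return {
        "W1": rng.standard_normal((n_h, n_x)) * 0.01,
        "b1": np.zeros((n_h, 1)),
        "W2": rng.standard_normal((n_y, n_h)) * 0.01,
        "b2": np.zeros((n_y, 1)),
    }

params = initialize_parameters(12288, 7, 1)
print(params["W1"].shape)  # (7, 12288)
```

Checking `W1.shape` against `(n_h, n_x)` is a quick way to confirm you called the two-layer initializer and not the deep one.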

Hello, thanks for the help.
This is what was incorrect in my code: during back prop, to calculate dA0 and dA1 I was supposed to use "sigmoid" first and "relu" later, and I had this in reverse. Thanks Paul! Now it works.
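For anyone hitting the same bug, here is a minimal sketch of that ordering: the sigmoid derivative applies to the output layer (layer 2) and the relu derivative to the hidden layer (layer 1), so in backward prop sigmoid comes first. Variable names follow the assignment's conventions, but the code is a simplified stand-in, not the course's exact helpers:

```python
import numpy as np

def sigmoid(Z):
    return 1 / (1 + np.exp(-Z))

def relu(Z):
    return np.maximum(0, Z)

rng = np.random.default_rng(0)
X = rng.standard_normal((4, 5))                   # 4 features, 5 examples
Y = (rng.random((1, 5)) > 0.5).astype(float)
W1, b1 = rng.standard_normal((3, 4)) * 0.01, np.zeros((3, 1))
W2, b2 = rng.standard_normal((1, 3)) * 0.01, np.zeros((1, 1))

# Forward pass: relu on layer 1, sigmoid on layer 2.
Z1 = W1 @ X + b1; A1 = relu(Z1)
Z2 = W2 @ A1 + b2; A2 = sigmoid(Z2)

m = X.shape[1]
dA2 = -(np.divide(Y, A2) - np.divide(1 - Y, 1 - A2))

# Backward pass: sigmoid backward FIRST (layer 2)...
dZ2 = dA2 * A2 * (1 - A2)
dW2 = dZ2 @ A1.T / m
db2 = dZ2.sum(axis=1, keepdims=True) / m
dA1 = W2.T @ dZ2

# ...then relu backward (layer 1).
dZ1 = dA1 * (Z1 > 0)
dW1 = dZ1 @ X.T / m
db1 = dZ1.sum(axis=1, keepdims=True) / m

print(dW1.shape, dW2.shape)  # (3, 4) (1, 3)
```

Swapping the two derivative steps produces gradients with plausible shapes but wrong values, which is exactly why the cost looked reasonable while the `W1`/`b1`/`W2`/`b2` checks failed.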