W4_A1_Ex-8/9_Linear_activation_backward & L_model_backward

I have problems with exercise 8 ( linear_activation_backward(dA, cache, activation) ) and exercise 9 ( L_model_backward(AL, Y, caches) ). I've checked the formulas and my code, and everything seems to work fine, but the grader says I have the wrong results. Can you please help me?

EXERCISE 8:

{Moderator’s Edit: Solution Code Removed}

The grader returns:

With sigmoid: dA_prev = [[-0.04591835 -0.00062194]
 [-0.0394537  -0.00053438]
 [ 0.02393479  0.00032418]]
With sigmoid: dW = [[-0.04587658 -0.03916771  0.01054726]]
With sigmoid: db = [[0.02199546]]
With relu: dA_prev = [[-0.18375266  0.        ]
 [-0.1578829   0.        ]
 [ 0.09578046  0.        ]]
With relu: dW = [[-0.18551486 -0.15574832  0.04367201]]
With relu: db = [[0.08684355]]
Error: Wrong output with sigmoid activation for variable 0.
Error: Wrong output with sigmoid activation for variable 1.
Error: Wrong output with sigmoid activation for variable 2.
Error: Wrong output with relu activation for variable 0.
Error: Wrong output with relu activation for variable 1.
Error: Wrong output with relu activation for variable 2.
 4  Tests passed
 2  Tests failed

EXERCISE 9:

{Moderator’s Edit: Solution Code Removed}

The grader returns:

dA0 = [[-0.320042    0.        ]
 [ 0.35133184  0.        ]
 [-0.04123361  0.        ]]
dA1 = [[-0.07219589 -0.78109842]
 [ 0.07925433  0.85746479]
 [-0.00930158 -0.10063526]]
dW1 = [[ 0.30884531 -0.09790237 -0.37807723]]
dW2 = [[-0.40489078 -0.32867521 -0.43766069]]
db1 = [[0.15628947]]
db2 = [[0.41669817]]
Error: Wrong shape for variable dA0.
Error: Wrong shape for variable dW1.
Error: Wrong shape for variable db1.
Error: Wrong output for variable dA1.
Error: Wrong output for variable dW2.
Error: Wrong output for variable db2.
Error: Wrong output for variable dA0.
Error: Wrong output for variable dW1.
Error: Wrong output for variable db1.
 1  Tests passed
 2  Tests failed

Hey @jmstf94,
Welcome to the community. In your implementation of linear_activation_backward, I think you may have missed an important point. As mentioned in the markdown, sigmoid_backward and relu_backward compute dZ for you; you don't have to do anything else with their result. If you want, you can check out their implementations in the dnn_utils.py file.
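To illustrate the point (this is a minimal sketch, not the assignment's solution code: the real sigmoid_backward and relu_backward live in dnn_utils.py, and these re-implementations only mirror their documented behavior of returning dZ directly), the pattern looks like this:

```python
import numpy as np

def sigmoid_backward(dA, activation_cache):
    # Returns dZ directly: dZ = dA * sigma'(Z)
    Z = activation_cache
    s = 1 / (1 + np.exp(-Z))
    return dA * s * (1 - s)

def relu_backward(dA, activation_cache):
    # Returns dZ directly: gradient is zero wherever Z <= 0
    Z = activation_cache
    dZ = np.array(dA, copy=True)
    dZ[Z <= 0] = 0
    return dZ

def linear_backward(dZ, linear_cache):
    A_prev, W, b = linear_cache
    m = A_prev.shape[1]
    dW = (1 / m) * dZ @ A_prev.T
    db = (1 / m) * np.sum(dZ, axis=1, keepdims=True)
    dA_prev = W.T @ dZ
    return dA_prev, dW, db

def linear_activation_backward(dA, cache, activation):
    linear_cache, activation_cache = cache
    # Key point: the *_backward helpers already hand back dZ,
    # so their result goes straight into linear_backward --
    # no extra multiplication afterwards.
    if activation == "relu":
        dZ = relu_backward(dA, activation_cache)
    elif activation == "sigmoid":
        dZ = sigmoid_backward(dA, activation_cache)
    return linear_backward(dZ, linear_cache)
```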

Now, how about you try to debug Exercise 9 on your own, and if you still face an issue, feel free to let us know.
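One general debugging tip that doesn't give anything away: the grader's "Wrong shape" errors mean a gradient doesn't match the shape of the quantity it differentiates. A hedged sketch of a shape check you could call after your backward pass (the parameters/grads dictionary key names here are just the course's usual convention, assumed for illustration):

```python
import numpy as np

def check_grad_shapes(grads, parameters, L):
    # Every dW[l] must match W[l], and every db[l] must match b[l].
    # Running this after L_model_backward catches shape bugs before
    # worrying about numerical values.
    for l in range(1, L + 1):
        assert grads["dW" + str(l)].shape == parameters["W" + str(l)].shape, \
            f"dW{l} has shape {grads['dW' + str(l)].shape}, " \
            f"expected {parameters['W' + str(l)].shape}"
        assert grads["db" + str(l)].shape == parameters["b" + str(l)].shape, \
            f"db{l} has shape {grads['db' + str(l)].shape}, " \
            f"expected {parameters['b' + str(l)].shape}"
```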

P.S. - Posting solution codes publicly is strictly against the community guidelines. Please refrain from doing so in the future.

Cheers,
Elemento