W4_A1_Ex-9_L_model backward

In the last exercise, I ran into an issue. I took my time trying to figure it out, but with no luck. Here is what I did and what the grader said.

{moderator edit - solution code removed}

You are always using dAL as the input to linear_activation_backward. That is correct for the output layer, but not for the hidden layers.
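
To illustrate the intended structure, here is a minimal sketch, assuming the notebook's `linear_activation_backward(dA, cache, activation)` signature and the usual `grads` dictionary convention (names and details here are illustrative, not the graded solution). Only the output-layer step consumes `dAL`; each hidden layer consumes the `dA` produced by the layer after it:

```python
import numpy as np

# Illustrative sketch only, assuming the notebook defines
# linear_activation_backward(dA, cache, activation) and that `caches`
# comes from L_model_forward. Not the graded solution verbatim.
def L_model_backward_sketch(AL, Y, caches):
    grads = {}
    L = len(caches)  # number of layers
    # Derivative of the cross-entropy cost with respect to AL
    dAL = -(np.divide(Y, AL) - np.divide(1 - Y, 1 - AL))

    # Output layer: dAL is the right input here, and only here
    grads["dA" + str(L - 1)], grads["dW" + str(L)], grads["db" + str(L)] = \
        linear_activation_backward(dAL, caches[L - 1], activation="sigmoid")

    # Hidden layers: feed in the dA produced by the layer above, not dAL
    for l in reversed(range(L - 1)):
        dA_prev, dW, db = linear_activation_backward(
            grads["dA" + str(l + 1)], caches[l], activation="relu")
        grads["dA" + str(l)] = dA_prev
        grads["dW" + str(l + 1)] = dW
        grads["db" + str(l + 1)] = db
    return grads
```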

Note that it is a general principle of debugging that a perfectly correct function can still throw errors if you pass it incorrect arguments. That is what has happened here: the error is thrown in relu_backward, which you didn't write. But that does not mean it isn't your mistake. The mistake is in your higher-level code, which results in wrong (mismatched) values getting passed down to relu_backward. When you debug, you have to start from the point of the error: what exactly is wrong? Then trace backwards up the call stack to figure out why.

For debugging it will help to see the source code for relu_backward to understand what is happening. You can click “File → Open” and find the imported file. There is a topic about how to do that on the DLS FAQ Thread.
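
For reference, here is a rough sketch of what relu_backward typically looks like in the course's utility file (an assumption on my part; open the imported file as described above to see the actual code). It shows why a mismatched `dA` argument from the caller makes this otherwise correct function fail:

```python
import numpy as np

# Rough sketch (assumption) of the course's relu_backward; check the
# imported utility file via "File -> Open" for the real implementation.
def relu_backward(dA, cache):
    Z = cache                      # pre-activation values saved during forward prop
    dZ = np.array(dA, copy=True)
    dZ[Z <= 0] = 0                 # ReLU gradient is 0 wherever Z <= 0; this
                                   # indexing fails if dA and Z shapes mismatch
    assert dZ.shape == Z.shape
    return dZ
```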