W4_A1_Ex-9_L_Model_Backward_Function

The point of the error is that the arguments you passed down to relu_backward don’t match in shape: the dA is a different shape than the Z value in the cache. So how could that happen? Notice that you are always passing dAL for the dA argument when you call linear_activation_backward from L_model_backward. That only works at the output layer, right? At the hidden layers you need to pass the dA value produced by the layer above.
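To make the failure mode concrete, here is a minimal sketch of the backward loop with simplified stand-ins for the course helpers (these `relu_backward` and `linear_activation_backward` functions are illustrative reimplementations, not the graded code, and the layer sizes are made up for the example):

```python
import numpy as np

np.random.seed(0)

def relu_backward(dA, Z):
    # dA must have the same shape as the cached Z; a mismatch here is
    # exactly the symptom described in the post.
    assert dA.shape == Z.shape, f"shape mismatch: {dA.shape} vs {Z.shape}"
    dZ = np.array(dA, copy=True)
    dZ[Z <= 0] = 0
    return dZ

def linear_activation_backward(dA, cache):
    (A_prev, W, b), Z = cache
    dZ = relu_backward(dA, Z)
    m = A_prev.shape[1]
    dW = dZ @ A_prev.T / m
    db = np.sum(dZ, axis=1, keepdims=True) / m
    dA_prev = W.T @ dZ
    return dA_prev, dW, db

# Forward pass through two ReLU layers of different sizes,
# saving the caches the way the exercise does.
m = 5
A0 = np.random.randn(6, m)
W1, b1 = np.random.randn(4, 6), np.zeros((4, 1))
Z1 = W1 @ A0 + b1
A1 = np.maximum(0, Z1)
W2, b2 = np.random.randn(3, 4), np.zeros((3, 1))
Z2 = W2 @ A1 + b2
caches = [((A0, W1, b1), Z1), ((A1, W2, b2), Z2)]

dAL = np.random.randn(3, m)  # gradient at the output layer, shape (3, m)

# Correct loop: each layer receives the dA produced by the layer above it.
dA = dAL
for l in reversed(range(2)):
    dA, dW, db = linear_activation_backward(dA, caches[l])

# Buggy version: calling linear_activation_backward(dAL, caches[l]) at every
# layer would hand the (3, m) dAL to layer 1, whose cached Z1 is (4, m) --
# which triggers the relu_backward shape error two levels down the stack.
```

The bug is invisible at the output layer because there `dA` and `dAL` happen to be the same thing; it only blows up once the loop reaches a hidden layer with a different size.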

This is a classic example of how debugging works: the error is thrown two levels down the call stack in a routine that was given to you, so you can assume that routine is correct. That means you must have passed it bad arguments. So you start by working out what the error message means, then track backwards up the call stack to figure out where the real problem is: where did the bad value come from?
