Facing Error in L_model_backward - Relu part
I have tried A_prev_temp instead of AL, but the error is still there. Please guide me.
There are a couple of problems there: the dA input is dAL only for the output (sigmoid) layer, but then it is different after that, right? It is different for every hidden layer, because each layer takes the dA_prev produced by the layer above it: that's the point.
Also note that you’re passing the wrong value for the cache there.
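Concretely, the backward loop should look roughly like this. This is only a minimal sketch, assuming the helper functions (linear_activation_backward) and the caches list defined earlier in the assignment notebook; your variable names may differ. The key points are that dAL is used only once, for the output layer, and that every layer is passed its own caches[l] entry:

```python
import numpy as np

def L_model_backward(AL, Y, caches):
    """Sketch of the loop structure, not the official solution."""
    grads = {}
    L = len(caches)               # number of layers
    Y = Y.reshape(AL.shape)

    # Derivative of the cost with respect to AL: only the output layer uses this.
    dAL = -(np.divide(Y, AL) - np.divide(1 - Y, 1 - AL))

    # Output (sigmoid) layer: pass dAL and the LAST cache, caches[L - 1].
    current_cache = caches[L - 1]
    grads["dA" + str(L - 1)], grads["dW" + str(L)], grads["db" + str(L)] = \
        linear_activation_backward(dAL, current_cache, activation="sigmoid")

    # Hidden (ReLU) layers: the dA input is the dA from the layer above,
    # not dAL, and the cache is caches[l] for THIS layer.
    for l in reversed(range(L - 1)):
        current_cache = caches[l]
        dA_prev_temp, dW_temp, db_temp = linear_activation_backward(
            grads["dA" + str(l + 1)], current_cache, activation="relu")
        grads["dA" + str(l)] = dA_prev_temp
        grads["dW" + str(l + 1)] = dW_temp
        grads["db" + str(l + 1)] = db_temp

    return grads
```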
The problem is that you are passing the wrong value for the “cache” parameter to linear_activation_backward: that is why it throws that error down in sigmoid_backward.
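To see why a wrong cache only blows up once you are inside sigmoid_backward, here is a rough sketch of what linear_activation_backward does with that argument, assuming (as in the notebook) that each caches[l] is a (linear_cache, activation_cache) tuple and that the sigmoid_backward, relu_backward, and linear_backward helpers were defined earlier in the assignment:

```python
def linear_activation_backward(dA, cache, activation):
    # If the wrong cache is passed in, this unpacking either fails outright
    # or hands the activation helpers data from the wrong layer.
    linear_cache, activation_cache = cache

    if activation == "sigmoid":
        # sigmoid_backward needs the activation cache stored for THIS layer;
        # with the wrong cache it gets mismatched shapes and raises the error you saw.
        dZ = sigmoid_backward(dA, activation_cache)
    elif activation == "relu":
        dZ = relu_backward(dA, activation_cache)

    return linear_backward(dZ, linear_cache)
```

So the fix is not to change dAL itself, but to make sure each call receives the cache for the layer it is actually processing.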