DLS Course 1 week 4 assignment 1 exercise 9 - L_model_backward

Facing an error in L_model_backward, in the ReLU part.
I have tried A_prev_temp instead of AL, but the error is still there. Please guide me.

There are several problems there: the dA input is dAL only for the output layer, but then it's different after that, right? It's different in every layer: that's the point. Each hidden layer receives the dA_prev output produced by the layer above it.

Also note that you’re passing the wrong value for the cache there.
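To make both points concrete, here is a minimal, self-contained sketch of the backward loop. The helper names (`linear_activation_backward`, `sigmoid_backward`, etc.) follow the assignment's usual signatures, but the bodies below are my own simplified stand-ins, not the official solution. Note how dAL is used only once, for the output (sigmoid) layer with the last cache, and each ReLU layer then consumes the dA produced by the layer above, paired with that layer's own cache:

```python
import numpy as np

def relu_backward(dA, activation_cache):
    Z = activation_cache          # Z stored during the forward pass
    return dA * (Z > 0)           # ReLU gradient: pass dA where Z > 0

def sigmoid_backward(dA, activation_cache):
    Z = activation_cache
    s = 1 / (1 + np.exp(-Z))
    return dA * s * (1 - s)       # sigmoid gradient

def linear_backward(dZ, linear_cache):
    A_prev, W, b = linear_cache
    m = A_prev.shape[1]
    dW = dZ @ A_prev.T / m
    db = np.sum(dZ, axis=1, keepdims=True) / m
    dA_prev = W.T @ dZ            # gradient passed down to the previous layer
    return dA_prev, dW, db

def linear_activation_backward(dA, cache, activation):
    linear_cache, activation_cache = cache   # each cache is a 2-tuple
    if activation == "relu":
        dZ = relu_backward(dA, activation_cache)
    else:
        dZ = sigmoid_backward(dA, activation_cache)
    return linear_backward(dZ, linear_cache)

def L_model_backward(AL, Y, caches):
    grads = {}
    L = len(caches)
    dAL = -(np.divide(Y, AL) - np.divide(1 - Y, 1 - AL))
    # Output layer: dAL goes in here, with the LAST cache, caches[L-1]
    grads[f"dA{L-1}"], grads[f"dW{L}"], grads[f"db{L}"] = \
        linear_activation_backward(dAL, caches[L - 1], "sigmoid")
    # Hidden layers: the dA input is grads["dA<l+1>"] from the layer above,
    # NOT dAL, and the cache is caches[l] for layer l + 1
    for l in reversed(range(L - 1)):
        dA_prev, dW, db = linear_activation_backward(
            grads[f"dA{l+1}"], caches[l], "relu")
        grads[f"dA{l}"] = dA_prev
        grads[f"dW{l+1}"] = dW
        grads[f"db{l+1}"] = db
    return grads
```

Running a tiny two-layer forward pass by hand and feeding its caches into this function produces gradients whose shapes match the corresponding parameters, which is a quick sanity check for the loop indices.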


Hey, I'm facing an error in the sigmoid part. I don't understand the TypeError.

The problem is that you are passing the wrong value for the “cache” parameter to linear_activation_backward: that is why it throws that error down in sigmoid_backward.
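Here is a small illustrative sketch of why the wrong cache argument surfaces as a TypeError inside sigmoid_backward. The names and cache layout are assumptions based on the assignment's conventions (each entry of `caches` is a `(linear_cache, activation_cache)` tuple); the point is only that passing the whole `caches` list, instead of one layer's cache tuple, hands sigmoid_backward a list where it expects the stored Z array:

```python
import numpy as np

# Simplified stand-in for the assignment's helper (not the official code):
def sigmoid_backward(dA, activation_cache):
    Z = activation_cache           # expects the Z array from the forward pass
    s = 1 / (1 + np.exp(-Z))       # unary minus on a list raises TypeError
    return dA * s * (1 - s)

Z2 = np.array([[0.5, -1.0]])
dA = np.array([[0.1, 0.2]])
linear_cache = None                # real cache would hold (A_prev, W, b)
caches = [(None, np.zeros((1, 2))), (linear_cache, Z2)]

L = len(caches)
current_cache = caches[L - 1]      # correct: ONE layer's (linear, activation) cache
_, activation_cache = current_cache
dZ = sigmoid_backward(dA, activation_cache)   # works: shapes match dA

try:
    sigmoid_backward(dA, caches)   # wrong: whole list instead of one layer's cache
except TypeError as e:
    print("TypeError:", e)
```

So the fix is to index the last layer's cache (e.g. `caches[L - 1]`) before calling linear_activation_backward, rather than passing the full list.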
