Dear all,
I have a problem with exercise 9 of the week 4 assignment, where I am getting the error related to relu_backward as attached. How do I solve this?
Your input argument to the `linear_activation_backward` function (inside `L_model_backward`) is incorrect. It should not be `dAL` in the case of `relu`. Think about it: you have to loop over the hidden layers.
Hi @Sheila_Murunga,
Looking at the error traceback, it suggests that the call to `linear_activation_backward()` using `'relu'` as the activation function is incorrect. If you refer back to the diagram that shows the backward pass, you'll see the activation function for the last layer is `'sigmoid'`, and only the hidden layers use `'relu'`.
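To make the loop structure concrete, here is a minimal sketch of how `L_model_backward` is typically organized: the last layer is handled once with `'sigmoid'` and `dAL`, then the hidden layers are handled in a reversed loop with `'relu'`, each fed the `dA` produced by the layer above it (not `dAL`). The helper functions below are simplified stand-ins for the assignment's graded functions, just to illustrate the pattern; the real assignment's cache format and signatures differ in detail.

```python
import numpy as np

# --- Simplified stand-ins (hypothetical, for illustration only) ---

def relu_backward(dA, Z):
    # Gradient of relu: pass dA through where Z > 0, zero elsewhere.
    dZ = dA.copy()
    dZ[Z <= 0] = 0
    return dZ

def sigmoid_backward(dA, Z):
    # Gradient of sigmoid: dA * s * (1 - s).
    s = 1 / (1 + np.exp(-Z))
    return dA * s * (1 - s)

def linear_backward(dZ, A_prev, W):
    # Gradients of the linear step Z = W @ A_prev + b.
    m = A_prev.shape[1]
    dW = dZ @ A_prev.T / m
    db = dZ.sum(axis=1, keepdims=True) / m
    dA_prev = W.T @ dZ
    return dA_prev, dW, db

def linear_activation_backward(dA, cache, activation):
    A_prev, W, Z = cache  # simplified cache: one tuple per layer
    if activation == "relu":
        dZ = relu_backward(dA, Z)
    else:
        dZ = sigmoid_backward(dA, Z)
    return linear_backward(dZ, A_prev, W)

# --- The loop structure the replies above describe ---

def L_model_backward(AL, Y, caches):
    grads = {}
    L = len(caches)
    dAL = -(np.divide(Y, AL) - np.divide(1 - Y, 1 - AL))

    # Last layer (layer L): sigmoid, and this is the ONLY place dAL is used.
    grads[f"dA{L-1}"], grads[f"dW{L}"], grads[f"db{L}"] = \
        linear_activation_backward(dAL, caches[L - 1], "sigmoid")

    # Hidden layers L-1 .. 1: relu, each fed the dA from the layer above,
    # not dAL -- passing dAL here is what triggers the relu_backward error.
    for l in reversed(range(1, L)):
        dA_prev, dW, db = linear_activation_backward(
            grads[f"dA{l}"], caches[l - 1], "relu")
        grads[f"dA{l-1}"], grads[f"dW{l}"], grads[f"db{l}"] = dA_prev, dW, db

    return grads
```

The key point is the argument in the loop: `grads[f"dA{l}"]`, the gradient produced by the layer above, is what gets passed to the relu call.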