Exercise 9 - L_model_backward
I am not really sure what I am supposed to do in this part of the exercise. I have attached a screenshot of the specific part below.
You need to call one of the functions that you wrote earlier: linear_activation_backward. Of course, that requires understanding how that function works: what arguments it expects and what values it returns. The comments give you some hints, at least about the return values. Please start by reviewing how linear_activation_backward works and then read the instructions for L_model_backward.
The high-level description is that backpropagation is the mirror image of forward propagation: you start with the output layer and then loop backwards through the hidden layers to compute the gradients for the W^{[l]} and b^{[l]} values at every layer.
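To make that loop structure concrete, here is a rough sketch (not the graded code) of how the backward pass can be organized. It assumes the conventions from the earlier exercises: linear_activation_backward(dA, cache, activation) returns (dA_prev, dW, db), the caches list comes from L_model_forward, the output layer uses sigmoid, and the hidden layers use relu. Your notebook's exact variable names and gradient-dictionary keys may differ, so treat this only as an illustration of the flow.

```python
import numpy as np

def L_model_backward_sketch(AL, Y, caches):
    """Sketch of the backward loop: output layer first, then hidden layers in reverse."""
    grads = {}
    L = len(caches)              # number of layers
    Y = Y.reshape(AL.shape)      # make Y the same shape as AL

    # Derivative of the cross-entropy cost with respect to AL
    dAL = -(np.divide(Y, AL) - np.divide(1 - Y, 1 - AL))

    # Layer L: sigmoid activation
    current_cache = caches[L - 1]
    dA_prev, dW, db = linear_activation_backward(dAL, current_cache,
                                                 activation="sigmoid")
    grads["dA" + str(L - 1)] = dA_prev
    grads["dW" + str(L)] = dW
    grads["db" + str(L)] = db

    # Layers L-1 down to 1: relu activation, looping backwards
    for l in reversed(range(L - 1)):
        current_cache = caches[l]
        dA_prev, dW, db = linear_activation_backward(grads["dA" + str(l + 1)],
                                                     current_cache,
                                                     activation="relu")
        grads["dA" + str(l)] = dA_prev
        grads["dW" + str(l + 1)] = dW
        grads["db" + str(l + 1)] = db

    return grads
```

The key point is that each pass through the loop reuses linear_activation_backward: you feed it the dA from the layer above plus that layer's cache, and it hands back the three gradients you store for the update step.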