Forward_propagation_n behaviour in gradient checking [i]

In the Week 1 Gradient Checking assignment, forward_propagation_n is called inside a loop over i in range(num_parameters).

I was wondering: for each i, it does a full round of forward propagation, but then only the i-th entry of gradapprox is updated. Isn't this wasteful, considering all the other derivatives computed in forward_propagation_n are left unused?

No derivatives are computed inside forward_propagation_n at all; it just runs the forward pass and returns the cost. Gradient checking nudges one parameter at a time and uses the two resulting costs to form the numerical estimate gradapprox[i] = (J_plus - J_minus) / (2 * epsilon), which is then compared against the gradient from backprop. So nothing that the loop needs is being thrown away.
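To make that concrete, here is a minimal sketch of the gradient-checking loop. It is not the assignment's exact code: I'm assuming the parameters have already been flattened into a single vector and that forward_propagation_n(X, Y, theta) returns (cost, cache). The only thing taken from each forward pass is the scalar cost.

```python
import numpy as np

def gradient_check_n(parameters_values, gradient, forward_propagation_n,
                     X, Y, epsilon=1e-7):
    """Sketch of two-sided numerical gradient checking.

    parameters_values : flattened parameter vector, shape (n, 1)
    gradient          : flattened backprop gradient, shape (n, 1)
    forward_propagation_n(X, Y, theta) -> (cost, cache)  [assumed signature]
    """
    num_parameters = parameters_values.shape[0]
    gradapprox = np.zeros((num_parameters, 1))

    for i in range(num_parameters):
        # Nudge only the i-th parameter up by epsilon and get the cost
        theta_plus = np.copy(parameters_values)
        theta_plus[i] += epsilon
        J_plus, _ = forward_propagation_n(X, Y, theta_plus)

        # Nudge the same parameter down by epsilon and get the cost
        theta_minus = np.copy(parameters_values)
        theta_minus[i] -= epsilon
        J_minus, _ = forward_propagation_n(X, Y, theta_minus)

        # Two-sided difference: only the two scalar costs are used
        gradapprox[i] = (J_plus - J_minus) / (2 * epsilon)

    # Relative difference between backprop gradient and numerical estimate
    numerator = np.linalg.norm(gradient - gradapprox)
    denominator = np.linalg.norm(gradient) + np.linalg.norm(gradapprox)
    return numerator / denominator


if __name__ == "__main__":
    # Toy sanity check (hypothetical cost, not the assignment's network):
    # J(theta) = 0.5 * ||theta||^2, whose analytic gradient is theta itself.
    def toy_forward(X, Y, theta):
        return 0.5 * float(np.sum(theta ** 2)), None

    theta = np.random.randn(5, 1)
    grad = theta  # analytic gradient for the toy cost
    print(gradient_check_n(theta, grad, toy_forward, X=None, Y=None))
    # prints a very small number (roughly 1e-9 or less) when the gradient matches
```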
