Week 1 Gradient Checking

In the function “def gradient_check(x, theta, epsilon=1e-7, print_msg=False):”

I’m not actually solving the “grad” line. I just put grad = gradapprox and all the tests pass :slight_smile: . Kindly assist me with how to tackle this problem.

Hi, @hark99.

That function returns the difference between the gradient computed by backward_propagation and the approximated gradient. If you set them to the same value, the difference will be 0. That’s why the tests passed.

The comments tell you how to compute grad (see the sketch after them):

# Check if gradapprox is close enough to the output of backward_propagation()
#(approx. 1 line)
# grad =
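
For reference, a minimal sketch of that one line, assuming the backward_propagation(x, theta) helper defined earlier in the notebook, might look like:

grad = backward_propagation(x, theta)  # analytic gradient, to be compared against gradapprox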

Do I have to calculate the derivative of ‘gradapprox’, or something else, after setting
J_plus = (theta + epsilon) * x?

I am still facing issues calculating the gradient to compare with gradapprox. Give me a hint if possible.

@nramon
I am not sure how to write the code for “Check if gradapprox is close enough to the output of backward_propagation()”. Can you please guide me?

Hi @sima_ranjbari,

You have already computed gradapprox, right? Now call backward_propagation() (you are given x and theta) and follow these steps (from the notebook) to compute the difference, as sketched after the list:

  1. Compute the numerator using np.linalg.norm (numerator = ...).
  2. Compute the denominator. You will need to call np.linalg.norm twice (denominator = ...).
  3. Divide them (difference = ...).
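
A minimal sketch of those three steps, assuming grad and gradapprox have already been computed in the same function, might look like:

import numpy as np

numerator = np.linalg.norm(grad - gradapprox)                    # ||grad - gradapprox||
denominator = np.linalg.norm(grad) + np.linalg.norm(gradapprox)  # ||grad|| + ||gradapprox||
difference = numerator / denominator                             # relative difference

With epsilon = 1e-7, a correct backward_propagation typically gives a difference on the order of 1e-7 or smaller.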

Let me know if you need help with a specific step. Good luck :slight_smile: