Course 2 - Week 1 - Assignment 3 - gradient_check_n

Although I have passed all the tests in all the exercises of the Gradient Checking programming assignment (Week 1, Assignment 3), I am getting a grade of 60/100!

Can you post the output of the grader as text?

Of course! This is the output:
[ValidateApp | INFO] Validating '/home/jovyan/work/submitted/courseraLearner/W1A3/Gradient_Checking.ipynb'
[ValidateApp | INFO] Executing notebook with kernel: python3
Tests failed on 1 cell(s)! These tests could be hidden. Please check your submission.

The following cell failed:

X, Y, parameters = gradient_check_n_test_case()

cost, cache = forward_propagation_n(X, Y, parameters)
gradients = backward_propagation_n(X, Y, cache)
difference = gradient_check_n(parameters, gradients, X, Y, 1e-7, True)
expected_values = [0.2850931567761623, 1.1890913024229996e-07]
assert not(type(difference) == np.ndarray), "You are not using np.linalg.norm for n...
assert np.any(np.isclose(difference, expected_values)), "Wrong value. It is not one...

The error was:

---------------------------------------------------------------------------
AssertionError                            Traceback (most recent call last)
<ipython-input-11-c57ee5e9e05a> in <module>
      6 expected_values = [0.2850931567761623, 1.1890913024229996e-07]
      7 assert not(type(difference) == np.ndarray), "You are not using np.linalg.no...
----> 8 assert np.any(np.isclose(difference, expected_values)), "Wrong value. It is...

AssertionError: Wrong value. It is not one of the expected values

So maybe you have a bug in forward_propagation_n, backward_propagation_n, or gradient_check_n.
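For context, the difference that cell checks is the standard relative error between your analytic gradients and a centered-difference approximation. A minimal sketch of that final step (illustrative names, not the graded code; note that it returns a scalar, which is exactly what the np.linalg.norm assert is probing for):

    import numpy as np

    def relative_difference(grad, gradapprox):
        # difference = ||grad - gradapprox||_2 / (||grad||_2 + ||gradapprox||_2)
        numerator = np.linalg.norm(grad - gradapprox)
        denominator = np.linalg.norm(grad) + np.linalg.norm(gradapprox)
        return numerator / denominator

If I recall the exercise correctly, the two expected values correspond to the two legitimate states of the notebook: roughly 1.19e-07 once the deliberately planted bugs in backward_propagation_n have been fixed, and roughly 0.285 while they are still in place. Any other value points to an unintended bug in one of the three functions.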

Can you post your implementation of gradient_check_n? We can start there.

Give me forward_propagation_n and backward_propagation_n as well :slight_smile:

When I submit the assignment, I pass with a score of 100/100, because gradient_check_n_test doesn’t run:

[ValidateApp | INFO] Validating '/home/jovyan/work/submitted/courseraLearner/W1A3/Gradient_Checking.ipynb'
[ValidateApp | INFO] Executing notebook with kernel: python3
Success! Your notebook passes all the tests.
==========================================================================================
The following cell failed:

    X, Y, parameters = gradient_check_n_test_case()
    
    cost, cache = forward_propagation_n(X, Y, parameters)
    gradients = backward_propagation_n(X, Y, cache)
    difference = gradient_check_n(parameters, gradients, X, Y, 1e-7, True)
    assert not(type(difference) == np.ndarray), "You are not using np.linalg.norm for n...
    
    gradient_check_n_test(gradient_check_n, parameters, gradients, X, Y)

The error was:

    ---------------------------------------------------------------------------
    NameError                                 Traceback (most recent call last)
    <ipython-input-11-0b57cb811a44> in <module>
          6 assert not(type(difference) == np.ndarray), "You are not using np.linalg.no...
          7 
    ----> 8 gradient_check_n_test(gradient_check_n, parameters, gradients, X, Y)
    
    NameError: name 'gradient_check_n_test' is not defined

We will investigate this further. In the meantime, you can save your current solution, refresh your workspace, copy-paste your solution into the latest version of the assignment, and submit again. Be sure to delete any other files that might be visible.

Follow the instructions here:

1 Like

Thank you for everything. I’m going to try it :pray:

I see now that I, too, was running an old version of the lab. After refreshing, I no longer have any errors:

[ValidateApp | INFO] Validating '/home/jovyan/work/submitted/courseraLearner/W1A3/Gradient_Checking.ipynb'
[ValidateApp | INFO] Executing notebook with kernel: python3
Success! Your notebook passes all the tests.

Please let me know if a refresh worked for you. If it did, please also remove all solution code from this post :slight_smile:

Okay, I will keep you informed.

1 Like

It’s me again :sweat_smile:
Unfortunately, refreshing the workspace didn’t work, and I don’t know what I should do!

Download your notebook and upload it here as an attachment, and I will have a look.

1 Like

Thank you so much for your kind reply. This is my notebook.

Edit: Removed notebook

Okay, I finally found it :smiley:

First of all,

backward_propagation(x, theta) can be implemented in a much simpler way. Try and see how you can simplify the gradient.
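For future readers, here is how simple the 1-D gradient gets. A minimal sketch, assuming (as in the notebook) that forward_propagation computes J(theta) = theta * x:

    def backward_propagation(x, theta):
        # With J(theta) = theta * x, the derivative dJ/dtheta is just x.
        dtheta = x
        return dtheta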

Secondly, check again how you calculate theta_minus[i].
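The theta_minus perturbation should mirror the theta_plus one exactly, except for the sign. A sketch of the loop body, assuming the notebook's usual names (parameters_values, epsilon, vector_to_dictionary):

    # Perturb only the i-th component downwards; all other entries stay untouched.
    theta_minus = np.copy(parameters_values)
    theta_minus[i] = theta_minus[i] - epsilon  # common slip: keeping "+ epsilon" copied over from theta_plus
    J_minus[i], _ = forward_propagation_n(X, Y, vector_to_dictionary(theta_minus))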

1 Like

I cannot thank you enough for helping me :white_heart: