The Week 3 assignment on gradient checking is not working. I have tried everything to get the code functions right, but it is still failing, especially at the X, Y gradient check test case; the cost, cache = forward_propagation_n(...) line is not working at all.
Gradient checking is the 3rd assignment in Week 1 of DLS Course 2. Please show us the failure outputs that you are getting from whatever test cases fail.
I got this output when I run this test code:

X, Y, parameters = gradient_check_n_test_case()
cost, cache = forward_propagation_n(X, Y, parameters)
gradients = backward_propagation_n(X, Y, cache)
difference = gradient_check_n(parameters, gradients, X, Y, 1e-7, True)
expected_values = [0.2850931567761623, 1.1890913024229996e-07]
assert not(type(difference) == np.ndarray), "You are not using np.linalg.norm for numerator or denominator"
assert np.any(np.isclose(difference, expected_values)), "Wrong value. It is not one of the expected values"
AttributeError                            Traceback (most recent call last)
<ipython-input> in <module>
      3 cost, cache = forward_propagation_n(X, Y, parameters)
      4 gradients = backward_propagation_n(X, Y, cache)
----> 5 difference = gradient_check_n(parameters, gradients, X, Y, 1e-7, True)
      6 expected_values = [0.2850931567761623, 1.1890913024229996e-07]
      7 assert not(type(difference) == np.ndarray), "You are not using np.linalg.norm for numerator or denominator"

<ipython-input> in gradient_check_n(parameters, gradients, X, Y, epsilon, print_msg)
     27         theta_plus = np.copy(parameters_values)        # Step 1
     28         theta_plus[i] += epsilon                       # Step 2
---> 29         J_plus[i], _ = forward_propagation_n(vector_to_dictionary(theta_plus), X, Y)   # Step 3
     30
     31         # Compute J_minus[i]

<ipython-input> in forward_propagation_n(X, Y, parameters)
     21
     22     # retrieve parameters
---> 23     m = X.shape[1]
     24     W1 = parameters["W1"]
     25     b1 = parameters["b1"]

AttributeError: 'dict' object has no attribute 'shape'
@Ravi_Teja6, there is no gradient checking assignment in Week 3. I wasted quite a bit of time searching Week 3 trying to find it.
When you post on the forum, please try to be accurate.
The order of the function arguments here is incorrect.
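To make that concrete, here is a minimal sketch, not the full graded cell: your own traceback shows the function is defined as forward_propagation_n(X, Y, parameters), so inside gradient_check_n the perturbed parameters dictionary has to go in the third position, not the first:

# Wrong: the dictionary gets bound to X, which is why X.shape raises AttributeError
# J_plus[i], _ = forward_propagation_n(vector_to_dictionary(theta_plus), X, Y)

# Matching the signature forward_propagation_n(X, Y, parameters):
J_plus[i], _ = forward_propagation_n(X, Y, vector_to_dictionary(theta_plus))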
I am sorry, it is the Week 1 programming assignment on gradient checking. I am extremely sorry for giving the wrong details.
That’s the same error. It appears you didn’t make any changes.
Could you please help me out with this code? This is the only assignment I have left in the entire course.
Tom pointed you exactly to the code that is wrong and even told you what is wrong with it. Please check the definition of the function forward_propagation_n.
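For reference, the difference that gradient_check_n should return is the standard relative error between the analytic gradient and the two-sided numerical approximation. Here is a minimal sketch, assuming the notebook's usual variable names (grad, gradapprox, epsilon are as defined in the assignment; the helper name relative_difference is just for illustration):

import numpy as np

# Two-sided approximation of one component of the gradient:
# gradapprox[i] = (J_plus[i] - J_minus[i]) / (2 * epsilon)

def relative_difference(grad, gradapprox):
    # np.linalg.norm returns a scalar L2 norm, which is why the test
    # asserts that difference is not an np.ndarray
    numerator = np.linalg.norm(grad - gradapprox)
    denominator = np.linalg.norm(grad) + np.linalg.norm(gradapprox)
    return numerator / denominator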
Hi sir, I got the answer by rectifying the mistake, but when I re-submit the assignment it shows that I failed, with "There is a mistake in the backward propagation! difference = 0.28509316" at the end. What should I do now?
I changed the code a little after researching forward propagation. Could you check whether there is anything else I need to change? The autograder is not letting me finish this course, and it is not letting me change the last code cell, the one with the gradient check test case.
I don't know why.

[screenshots of the grader feedback]
I got the nbgrader showing this even though I tried a different line of code, and that does not seem fair.
Hello, @Ravi_Teja6,
This is my result for that part, and it passed the grader:

[screenshot of the passing output]
Please follow these two steps:
- Get the latest version of the notebook. Your screenshot tells me you might not be using the latest version for your submission; for example, mine shows far more decimal places than yours. We should first use the latest notebook to rule out any unexpected behavior. Check this FAQ for how to get a clean copy.
- After submitting with the latest version, if there is still an error, please share the full error message. For example, your last screenshot does not contain the part below the following: [screenshot]
Let us know.
Cheers,
Raymond
I have done this in the Jupyter notebook from Coursera Labs (Launch Lab). I can show you the complete error, and I can share the whole document.
{moderator edit - solution code removed}
I got the answer now. I changed the code and now it's working. Thank you so much, sir.