I’m getting a failed assertion for this problem.
I followed the steps to get the cost and gradient calculations (my tests pass for the propagate function). I also followed the instructions for updating the parameters using the update rule θ = θ − α dθ.
I’m stuck because my propagate function seems to be correct, since its test passed, but I’m not passing the tests for optimize. None of my function’s outputs match the expected outputs from the test_optimize() function.
3 Likes
found the error in my code! tests pass now
2 Likes
A perfectly correct subroutine can still give you wrong answers if you pass it incorrect parameters. The problem is not in your propagate function; it is in the new logic you added in optimize. Now you need to find the error. Notice that your first cost value agrees, but the second one does not. So most likely the “update parameters” logic is wrong, and that is new code in optimize, right?
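To make the update step concrete, here is a minimal sketch of what “update parameters” means. This is not the assignment’s actual code; the names w, b, dw, db just follow the conventions used in this thread, and the gradients here are made-up numbers for illustration.

```python
import numpy as np

def update_parameters(w, b, dw, db, learning_rate):
    """One gradient-descent step: theta = theta - alpha * dtheta."""
    w = w - learning_rate * dw  # subtract the gradient, never add it
    b = b - learning_rate * db
    return w, b

# One step with illustrative values and the default learning rate 0.009
w, b = update_parameters(np.array([[1.0], [2.0]]), 0.5,
                         np.array([[0.1], [-0.2]]), 0.05,
                         learning_rate=0.009)
print(w)  # -> [[0.9991], [2.0018]]
print(b)  # -> 0.49955
```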
2 Likes
Oh, sorry, my previous reply was before I saw your update. Congrats on finding the issue on your own power!
2 Likes
I’m really stuck here. I am applying θ = θ − α·dθ to the relevant parameters and am evidently making the same mistake as the OP, but I can’t see what it is, and I have gone back and watched the relevant material. Is there perhaps an intuition I am missing?
2 Likes
Could you please paste the traceback of your error, just like Chris did?
2 Likes
It’s exactly the same.
Do you mean the values for w, b, dw and db are the same as well?
2 Likes
As we chatted, all values for w, b, dw and db are correct. Even the cost for the first iteration is correct. From an algorithmic viewpoint, it seems that you are correct.
The test program uses two patterns. The first test uses the default values for num_iterations, learning_rate and the other arguments, but the second test changes them. For example, it sets the learning rate to 0.1 (not 0.009). That is the reason for the error in the second test.
Please make sure that you do not use hard-coded values in your code. That may be OK for one particular test, but not for general testing. (The grader definitely changes those values…)
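In other words, inside the loop you should only ever touch the arguments that were passed in. A hypothetical skeleton, with stand-in gradients instead of the assignment’s real propagate call, just to illustrate the point:

```python
def optimize_sketch(w, b, num_iterations=100, learning_rate=0.009):
    # Use the arguments; never a hard-coded literal like 0.009 in the loop.
    for i in range(num_iterations):
        dw, db = 0.1, 0.05            # stand-in gradients for illustration
        w = w - learning_rate * dw    # the parameter, not a magic number
        b = b - learning_rate * db
    return w, b

# The grader can now call it with different hyperparameters:
w1, b1 = optimize_sketch(1.0, 0.5, num_iterations=10, learning_rate=0.1)
print(w1)  # -> ~0.9
print(b1)  # -> ~0.45
```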
1 Like
Hi there! I encountered the same problem and changed the 0.009 to learning_rate in my code, but it is still not working, and the output looks like this. Do you have any idea how to resolve this?
1 Like
The point is that you should not have to set the learning rate to any specific value in your code: just use the value that was passed in as an argument (parameter).
1 Like
Also notice that the reason the assertion failed is that your cost value after 100 iterations is NaN instead of the correct value. Your initial cost value was correct, which means the problem is not in your computation of the cost, but more likely in your “update parameters” logic. E.g. did you add the updates instead of subtracting them?
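A toy demonstration of why the sign matters, minimizing f(w) = w² (whose gradient is 2w); this is unrelated to the assignment’s actual code, but it shows how a flipped sign makes the iterate blow up instead of converge:

```python
import numpy as np

lr = 0.1
w_good = np.float64(5.0)   # gradient descent: w = w - lr * grad
w_bad  = np.float64(5.0)   # sign bug:         w = w + lr * grad

with np.errstate(over="ignore"):
    for _ in range(5000):
        grad_good = 2 * w_good            # d/dw of w**2
        grad_bad = 2 * w_bad
        w_good = w_good - lr * grad_good  # shrinks by factor 0.8 each step
        w_bad = w_bad + lr * grad_bad     # grows by factor 1.2 each step

print(w_good)           # -> essentially 0 (converged)
print(np.isinf(w_bad))  # -> True (overflowed; further math yields NaN)
```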
2 Likes
Thanks, this response was helpful. I had the same error because I was setting a specific value for the learning rate, something like 0.5*dq, which gave me wrong values for the costs.
1 Like
Hi, I have the same issue, although with different values. The first cost value is correct, the second is not. I’ve checked for hard-coded values (there are none), and the w, b update rule seems fine. Where else could I look? The traceback is:
w = [[0.73906337]
[2.06737855]]
b = 0.6073747140171403
dw = [[ 0.30572597]
[-0.07835381]]
db = -0.1529282379068634
Costs = [array(0.15900538)]
AssertionError                            Traceback (most recent call last)
in
      7 print("Costs = " + str(costs))
      8
----> 9 optimize_test(optimize)

~/work/release/W2A2/public_tests.py in optimize_test(target)
     73     assert type(costs) == list, "Wrong type for costs. It must be a list"
     74     assert len(costs) == 2, f"Wrong length for costs. {len(costs)} != 2"
---> 75     assert np.allclose(costs, expected_cost), f"Wrong values for costs. {costs} != {expected_cost}"
     76
     77     assert type(grads['dw']) == np.ndarray, f"Wrong type for grads['dw']. {type(grads['dw'])} != np.ndarray"

AssertionError: Wrong values for costs. [array(5.80154532), array(0.58923175)] != [5.80154532, 0.31057104]
1 Like
Ok. I trust that you are not hard-coding and that the update rule is correct, as you mentioned. So, have you passed the propagate_test?
1 Like
Yes, all tests before this part are ok
1 Like
Alright. Send me your code for the optimize function in a private message: click my name, then Message.
1 Like