Dear tutor,
It seems I get exactly the same numbers as expected, but I got the following error.
Also, it prints the iteration cost no matter whether print_cost is set to False or True.
Thanks.
Cost after iteration 0: 0.693086
Cost after iteration 1000: 0.000220
Cost after iteration 2000: 0.000108
Cost after iteration 3000: 0.000072
Cost after iteration 4000: 0.000054
Cost after iteration 5000: 0.000043
Cost after iteration 6000: 0.000036
Cost after iteration 7000: 0.000030
Cost after iteration 8000: 0.000027
Cost after iteration 9000: 0.000024
W1 = [[ 0.71392202 1.31281102]
[-0.76411243 -1.41967065]
[-0.75040545 -1.38857337]
[ 0.56495575 1.04857776]]
b1 = [[-0.0073536 ]
[ 0.01534663]
[ 0.01262938]
[ 0.00218135]]
W2 = [[ 2.82545815 -3.3063945 -3.16116615 1.8549574 ]]
b2 = [[0.00393452]]
```
AssertionError                            Traceback (most recent call last)
<ipython-input-...> in <module>
----> 1 nn_model_test(nn_model)

~/work/release/W3A1/public_tests.py in nn_model_test(target)
    292     assert output["b2"].shape == expected_output["b2"].shape, f"Wrong shape for b2."
    293
--> 294     assert np.allclose(output["W1"], expected_output["W1"]), "Wrong values for W1"
    295     assert np.allclose(output["b1"], expected_output["b1"]), "Wrong values for b1"
    296     assert np.allclose(output["W2"], expected_output["W2"]), "Wrong values for W2"
```
Since the learning rate is not a given input, and the code will not run without one, ideally the learning rate should be a given input to the wrapper function.
Since learning_rate has a default value of 1.2, if we don’t explicitly provide a value for this parameter when invoking update_parameters, the value will be 1.2. If the test code has a custom implementation of update_parameters with the learning rate set to something other than 1.2, your implementation of nn_model will fail.
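To illustrate the default-argument behavior, here is a minimal, self-contained sketch. This update_parameters is a toy stand-in that mimics only the default argument, not the notebook's real implementation:

```python
# Toy stand-in for update_parameters, mimicking only its default argument.
def update_parameters(w, dw, learning_rate=1.2):
    # One gradient-descent step: w := w - learning_rate * dw
    return w - learning_rate * dw

print(update_parameters(1.0, 0.5))                      # 0.4   -> default 1.2 was used
print(update_parameters(1.0, 0.5, learning_rate=0.01))  # 0.995 -> explicit value wins
```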
Very good point; I have deleted the learning rate.
However, I still get the same error.
The key issue is that my output is exactly the same as the “Expected output”, yet the error says the W1 values do not match.
Thanks & Regards
```python
# GRADED FUNCTION: nn_model

def nn_model(X, Y, n_h, num_iterations=50000, print_cost=False):
    """
    Arguments:
    X -- dataset of shape (2, number of examples)
    Y -- labels of shape (1, number of examples)
    n_h -- size of the hidden layer
    num_iterations -- number of iterations in the gradient descent loop
    print_cost -- if True, print the cost every 1000 iterations

    Returns:
    parameters -- parameters learnt by the model. They can then be used to predict.
    """
```
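As an aside on the print_cost symptom reported at the top of the thread: per the docstring, the cost should be printed only every 1000 iterations and only when print_cost is True, so both conditions must appear in the guard. A minimal sketch (demo_loop and the placeholder cost are illustrative only, not the graded solution):

```python
def demo_loop(num_iterations=10000, print_cost=False):
    for i in range(num_iterations):
        cost = 1.0 / (i + 1.0)  # placeholder for compute_cost
        # If the print is not guarded by print_cost, it fires no matter
        # what the caller passed -- the symptom reported above.
        if print_cost and i % 1000 == 0:
            print("Cost after iteration %i: %f" % (i, cost))
```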
Well, now you are not passing the learning rate to update_parameters at all, so you always end up with the default declared in that function, even if a different value were passed into nn_model at the top level. When I tried that, it passed the test cases in the notebook, but it is logically incorrect: it only works because the test case happens to use the default value of the learning rate.
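Concretely, the difference looks like this (a self-contained toy; the names mirror the assignment, but the bodies are fake):

```python
def update_parameters(w, dw, learning_rate=1.2):
    return w - learning_rate * dw

def nn_model_buggy(w, dw, learning_rate=0.01):
    # BUG: learning_rate is silently dropped; the callee's default (1.2)
    # is used no matter what the caller requested.
    return update_parameters(w, dw)

def nn_model_correct(w, dw, learning_rate=0.01):
    # Forward the caller's value explicitly.
    return update_parameters(w, dw, learning_rate=learning_rate)

print(nn_model_buggy(1.0, 0.5, learning_rate=0.01))    # 0.4   (wrong: 1.2 used)
print(nn_model_correct(1.0, 0.5, learning_rate=0.01))  # 0.995 (0.01 honored)
```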
In the entire script, I cannot see where the learning rate is assigned.
I am not allowed to assign the learning rate to 1.2
I am not allowed to use the default learning rate.
Could you please tell me the main cause: my code output matches the expected output, yet it gives the error that W1 is incorrect?
Thanks. If you let me know the root cause, it may be easier for me to debug.
Sorry, but I can’t tell that from the evidence here on this thread. I will send you a DM about how to proceed.
But a general recommendation would be that you spend some time reading about how “keyword parameters” work in Python. Please try googling “python keyword parameters” and find some good tutorials on that. I’m not sure that is what is causing your problem here, but it’s an important part of Python syntax that, it sounds like, is not yet clear in your mind.
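For example, here is a quick refresher on how Python binds positional, default, and keyword-only arguments:

```python
def f(a, b=2, *, c=3):
    return (a, b, c)

print(f(1))            # (1, 2, 3) -- b and c fall back to their defaults
print(f(1, 5))         # (1, 5, 3) -- b bound positionally
print(f(1, c=7))       # (1, 2, 7) -- c is keyword-only (declared after the *)
print(f(1, b=5, c=7))  # (1, 5, 7) -- keyword arguments may appear in any order
```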