I’m currently attempting the [Week 2 Logistic Regression with a Neural Network Mindset] programming assignment, but have hit a snag in [Exercise 8 - Model].
I keep getting the error below, even though all the previous tests passed. On further debugging, I’ve noticed that the number of iterations is being set to 50 and the learning rate to 0.0001, even though their defaults in the model() signature are 2000 and 0.5 respectively. My fear is that the function’s arguments are being overwritten somehow, possibly even ‘corrupting’ the training and test data sets themselves. Has anyone run into this before?
Number of iterations is 50
Learning rate is 0.0001
Number of iterations is 50
Learning rate is 0.0001
Iteration 0
Cost after iteration 0: 0.693147
train accuracy: 66.66666666666667 %
test accuracy: 66.66666666666667 %
---------------------------------------------------------------------------
AssertionError Traceback (most recent call last)
<ipython-input-58-7f17a31b22cb> in <module>
----> 1 model_test(model)
~/work/release/W2A2/public_tests.py in model_test(target)
113 assert type(d['costs']) == list, f"Wrong type for d['costs']. {type(d['costs'])} != list"
114 assert len(d['costs']) == 1, f"Wrong length for d['costs']. {len(d['costs'])} != 1"
--> 115 assert np.allclose(d['costs'], expected_output['costs']), f"Wrong values for pred. {d['costs']} != {expected_output['costs']}"
116
117 assert type(d['w']) == np.ndarray, f"Wrong type for d['w']. {type(d['w'])} != np.ndarray"
AssertionError: Wrong values for pred. [array(0.80773563)] != [array(0.69314718)]
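For what it’s worth, the expected value 0.69314718 in the assertion looks like log(2), i.e. the cross-entropy cost at iteration 0 when w and b are initialised to zero (every activation is sigmoid(0) = 0.5), and the test also asserts that the costs list has length 1. A quick sanity check using nothing but numpy:

import numpy as np

# With zero-initialised w and b, every activation is sigmoid(0) = 0.5,
# so the cross-entropy cost at iteration 0 is -log(0.5) = log(2), whatever the labels are.
print(np.log(2))  # 0.6931471805599453 -> matches expected_output['costs'] in the assertion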
Appreciate the response! I’ve used signature() to obtain the signatures of the propagate(), optimize() and model() functions, but I can’t find anything out of place… (the snippet I used is just below the output).
The strange thing is that I’m printing the number of iterations from inside model() itself, before any call to optimize() is made!
Signature of model function: (X_train, Y_train, X_test, Y_test, num_iterations=2000, learning_rate=0.5, print_cost=True)
Signature of optimize function: (w, b, X, Y, num_iterations, learning_rate, print_cost)
Signature of propagate function: (w, b, X, Y)
Number of iterations is : 50
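For reference, the signatures above were obtained along these lines (standard library inspect; the extra print of num_iterations sits at the top of my model() implementation):

from inspect import signature

# Print the signatures of the three graded functions to check for tampered arguments or defaults
print("Signature of model function:", signature(model))
print("Signature of optimize function:", signature(optimize))
print("Signature of propagate function:", signature(propagate))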
Understood. Are you passing all 7 parameters to optimize() inside model()? The catch is that, because optimize() has default values for some of its arguments, it will still run even if you don’t pass all 7 parameters; it will just silently fall back to those defaults and behave strangely. See the sketch below.
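Roughly what the call inside model() should look like, forwarding every hyperparameter explicitly by keyword rather than relying on optimize()’s defaults (the variable names follow the assignment template, so treat this as a sketch rather than the official solution):

# Inside model(): forward the hyperparameters you received; don't rely on optimize()'s defaults
params, grads, costs = optimize(w, b, X_train, Y_train,
                                num_iterations=num_iterations,
                                learning_rate=learning_rate,
                                print_cost=print_cost)

If only the first few arguments are passed positionally, optimize() quietly uses its own defaults for the rest, and whatever the test passes to model() never reaches the training loop.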
All the parameters provided to model() are being passed to optimize() without any modification.
Is it perhaps something to do with the ‘model’ argument being passed to model_test()? I assumed that this was a dictionary containing all the necessary parameters, but maybe I’m missing something?
model_test(model)
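From the traceback it looks as though model_test() receives the model function itself and calls it with the test’s own hyperparameters, which would explain the 50 and 0.0001 prints. A hypothetical sketch of what I imagine the harness does (the real public_tests.py will of course differ in its data and expected values):

import numpy as np

def model_test(target):
    # Hypothetical stand-in data; the real test uses its own fixed arrays
    np.random.seed(0)
    X = np.random.randn(4, 7)                    # 4 features, 7 training examples
    Y = (np.random.rand(1, 7) > 0.5) * 1.0
    x_test = np.random.randn(4, 3)
    y_test = (np.random.rand(1, 3) > 0.5) * 1.0

    # The harness deliberately passes tiny hyperparameters, hence the 50 / 0.0001 prints
    d = target(X, Y, x_test, y_test, num_iterations=50, learning_rate=0.0001)

    # With only 50 iterations, d['costs'] should hold just the iteration-0 cost
    assert len(d['costs']) == 1
    assert np.allclose(d['costs'], [np.log(2)])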
I’m fairly sure the problem is related to this model_test(model) call. When I run the next cell (In [55]), which calls model() once again, this time with different parameters, it works perfectly; the full output is below, followed by a sketch of that call.
Signature of model function: (X_train, Y_train, X_test, Y_test, num_iterations=2000, learning_rate=0.5, print_cost=True)
Signature of optimize function: (w, b, X, Y, num_iterations, learning_rate, print_cost)
Signature of propagate function: (w, b, X, Y)
Number of iterations is : 2000
Cost after iteration 0: 0.693147
Cost after iteration 100: 0.584508
Cost after iteration 200: 0.466949
Cost after iteration 300: 0.376007
Cost after iteration 400: 0.331463
Cost after iteration 500: 0.303273
Cost after iteration 600: 0.279880
Cost after iteration 700: 0.260042
Cost after iteration 800: 0.242941
Cost after iteration 900: 0.228004
Cost after iteration 1000: 0.214820
Cost after iteration 1100: 0.203078
Cost after iteration 1200: 0.192544
Cost after iteration 1300: 0.183033
Cost after iteration 1400: 0.174399
Cost after iteration 1500: 0.166521
Cost after iteration 1600: 0.159305
Cost after iteration 1700: 0.152667
Cost after iteration 1800: 0.146542
Cost after iteration 1900: 0.140872
train accuracy: 99.04306220095694 %
test accuracy: 70.0 %
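For completeness, the direct call that produces the run above is along these lines; num_iterations=2000 matches the printout, but the learning rate isn’t printed, so 0.005 and the variable names are assumptions taken from the standard notebook rather than facts about my run:

# Direct call from the notebook cell (In [55]); learning_rate=0.005 and the
# variable names here are assumptions, since only num_iterations is printed above.
logistic_regression_model = model(train_set_x, train_set_y, test_set_x, test_set_y,
                                  num_iterations=2000, learning_rate=0.005, print_cost=True)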