I have passed all the preceding tests, but am stuck on Exercise 8, where I get an error saying that my values for d['w'], and possibly d['b'], are wrong. I think that I have the line:
{moderator edit - solution code removed}
right, but I don't know how to get the returned params, grads, and costs into the returned dictionary d.
Any help would be deeply appreciated.
Larry Hunsicker
```
AssertionError                            Traceback (most recent call last)
<ipython-input> in <module>
      1 from public_tests import *
      2
----> 3 model_test(model)

~/work/release/W2A2/public_tests.py in model_test(target)
    131     assert type(d['w']) == np.ndarray, f"Wrong type for d['w']. {type(d['w'])} != np.ndarray"
    132     assert d['w'].shape == (X.shape[0], 1), f"Wrong shape for d['w']. {d['w'].shape} != {(X.shape[0], 1)}"
--> 133     assert np.allclose(d['w'], expected_output['w']), f"Wrong values for d['w']. {d['w']} != {expected_output['w']}"
    134
    135     assert np.allclose(d['b'], expected_output['b']), f"Wrong values for d['b']. {d['b']} != {expected_output['b']}"

AssertionError: Wrong values for d['w']. [[0.]
 [0.]
 [0.]
 [0.]] != [[ 0.08639757]
 [-0.08231268]
 [-0.11798927]
 [ 0.12866053]]
```
The way you are calling `optimize` is not correct. Think about the implications for the learning rate and number of iterations. Those are passed in to `model` at the top level, but you are not passing them to `optimize`. That means you will end up using the default values declared in the definition of the `optimize` function. What if those are different from the actual values being passed to `model`?
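Here is a minimal sketch of that pitfall with purely hypothetical functions (not the assignment's actual code): if the outer function does not forward its arguments, the inner function silently falls back to its own defaults.

```python
# Hypothetical illustration of the default-argument pitfall; not assignment code.

def inner(num_iterations=100, learning_rate=0.009):
    # Stand-in for a training routine: with too few iterations or the
    # wrong learning rate, the result can be far from what you expect.
    return num_iterations * learning_rate

def outer_wrong(num_iterations=2000, learning_rate=0.5):
    # Bug: inner() silently uses ITS OWN defaults (100 and 0.009),
    # ignoring the values that were passed to outer_wrong.
    return inner()

def outer_right(num_iterations=2000, learning_rate=0.5):
    # Fix: forward the outer function's arguments explicitly.
    return inner(num_iterations=num_iterations, learning_rate=learning_rate)

print(outer_wrong())   # 0.9    <- defaults used, not what you asked for
print(outer_right())   # 1000.0 <- forwarded values used
```

The same thing happens in this exercise if `model` does not forward its hyperparameters to `optimize`: gradient descent barely runs, which is why the resulting w is still all zeros.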
In terms of what to do with the return values from `optimize`, note that those are dictionaries, right? We've seen examples earlier of how to extract values from dictionaries. The provided template code gives you the logic for creating the final returned dictionary d, but you need to extract the returned values and put them in the correct variables to correspond to how that template code works.
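As a sketch of the dictionary-extraction idea (the values and surrounding code here are made up for illustration; only the key names "w" and "b" follow the assignment's convention):

```python
import numpy as np

# Suppose the returned dictionary looks something like this (illustrative values):
params = {"w": np.array([[0.1], [0.2]]), "b": 1.5}

# Extract the learned parameters into plain variables so the template
# code that builds the returned dictionary d can use them.
w = params["w"]
b = params["b"]
```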
Many thanks for your very prompt reply. I think that I understand your first point: I need to pass all the needed values, including num_iterations, learning_rate, and print_cost, through from the outer function's parameters. I am assuming that all these parameter values are available in the scope of the outer function. So I edited that call to:
{moderator edit - solution code removed}
I assume that now the optimize function returns three objects to the outer function’s namespace: the dictionaries “params” and “grads”, and the vector “costs”. (Sorry to call it a vector. I am an R programmer.) If I were allowed five or six lines, I could call the optimize function, and then I think that I could assign the w and b values from the params dictionary. But the surrounding text suggests that I should be able to do both of these steps in a single line of code, and that has stumped me. If there is “an example earlier” that would explain how I can do the above steps in a single line of code, I would appreciate your pointing me to that code.
Thanks, again, for your prompt reply.
Larry
I'm not sure what they mean by "one line of code", but all those comments are purely advisory. Also note that the grader doesn't actually look at your source code: it only cares about the return values of your functions. You need at least two lines of code there: one line to retrieve the value of w from the `params` dictionary and one to retrieve the value of b from `params`. Well, if you're counting the invocation of `optimize`, that makes 3 lines. You also need to invoke `predict` a couple of times.
Your invocation of `optimize` looks correct now, although you don't really need the equal signs in the parameter values. I knew R once upon a time, but it's so long ago that I don't remember how they deal with the concept of "scope". You might want to spend just a few minutes reading up on the results of a google search like "variable scope python functions".
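For the line counting, here is an illustrative sketch (hypothetical stand-in values, not assignment code) showing both the two-statement extraction and how tuple assignment collapses it into one line:

```python
# Illustrative stand-in for the dictionary that would come back from the call.
params = {"w": [[0.1], [0.2]], "b": 1.5}

# Two separate extractions...
w = params["w"]
b = params["b"]

# ...or the same thing in one line via tuple assignment:
w, b = params["w"], params["b"]
```

Either form works; tuple assignment is just the idiomatic Python way to bind several dictionary values in a single statement.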
Thanks, Paul. With the correction of the optimize call and use of a few more than 4 lines of code, I seem to have got the right answers.
Nice work! Onward! There is plenty of interesting material ahead. 