Regarding the Week 2 assignment (Exercise 5 - propagate), would someone kindly let me know what I did wrong? Here is what I have:
A = sigmoid(np.dot(w.T, X) + b)
dw = (1/m_train) * np.dot(X, (A - Y).T)
When I run the code, it complains: f"Wrong values for grads['dw']. {grads['dw']} != {expected_dw}"
Any idea? Thanks!
What is the value of m_train there? If that is a global variable, not defined in the scope of your propagate function, that will cause problems. All functions need to be “self-contained”, meaning that they only reference local variables, which includes the formal parameters.
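Here is a minimal sketch of that failure mode (the names bad_mean and good_mean are hypothetical, just for illustration):

import numpy as np

m_train = 5  # a global that happens to exist in the notebook

def bad_mean(X):
    # Bug: silently uses the global m_train instead of X's own size,
    # so it breaks whenever X has a different number of columns
    return np.sum(X) / m_train

def good_mean(X):
    # Correct: derive m from the argument itself
    m = X.shape[1]
    return np.sum(X) / m

X_test = np.ones((1, 3))   # 3 examples, not 5
print(bad_mean(X_test))    # 0.6 -- wrong
print(good_mean(X_test))   # 1.0 -- right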
I appreciate your response. Thanks!
m_train = train_set_x_orig.shape[0], which I defined in Part 1 (Exercise 1) above. I copied and pasted it again for Exercise 5 below, re-ran the cell, and still had the same issue as shown above. This is pretty odd.
What does train_set_x_orig have to do with the particular test case for propagate? The whole point is that propagate is a self-contained, general-purpose function that works with any inputs, right?
If you are not familiar with the concept of variable “scope”, you should really take a Python course before you continue here. Does it make sense when I said above that propagate should only reference “local” variables?
For Exercise 5, I need to compute dw = (1/m) * np.dot(X, (A - Y).T). If I’m not mistaken, m is the number of training examples, which we already computed in Part 1 as m_train, and somehow I thought m_train was a global variable. I missed the m = X.shape[1] that I should have used instead. Still, the confusion I’m having right now is: what is the difference between m = X.shape[1] and m_train from Part 1 above? Are they both the number of training examples?
Yes, but the point is that the X is different. The X is the one passed into propagate, not some other value that happens to be hanging around in the notebook’s global scope, right?
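To make that concrete, here is a sketch of the shape logic (not the assignment’s exact propagate; the test values are made up):

import numpy as np

def sigmoid(z):
    return 1 / (1 + np.exp(-z))

def propagate_sketch(w, b, X, Y):
    # m must come from the X that was passed in, so the function
    # works no matter how many examples the caller provides
    m = X.shape[1]
    A = sigmoid(np.dot(w.T, X) + b)
    dw = (1 / m) * np.dot(X, (A - Y).T)
    db = (1 / m) * np.sum(A - Y)
    return dw, db

# The test case can use any m it likes; here m = 3, which need not
# match the m_train computed earlier in the notebook
w = np.array([[1.0], [2.0]])
b = 1.5
X = np.array([[1.0, -2.0, 0.5], [3.0, 0.5, -1.0]])
Y = np.array([[1, 1, 0]])
dw, db = propagate_sketch(w, b, X, Y)
print(dw.shape)   # (2, 1), matching w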
Thanks! Perhaps it would help everyone if there were a printable PDF version of the assignment in addition to this online notebook, because scrolling up and down multiple times through such a long assignment makes it hard to see the whole thing; in industry, people tend to use pagination instead. I took the Machine Learning course with Andrew seven years ago and had a great time. While much of the material in this course is the same as in Machine Learning, on review I find this assignment quite confusing: the individual exercises seem like they are part of one big assignment, but they are not. Anyway, I’m still on Exercise 5 (propagate). Would you please let me know what I did wrong in the cost function:
A = sigmoid(np.dot(w.T, X) + b)
content = np.dot(Y.T, np.log(A)) + np.dot((1-Y).T, np.log(1-A))
cost = (-1/m) * np.sum(content)
I got an assertion error: Wrong values for cost. 12.519827193591215 != 2.0424567983978403
Thanks a lot!
The individual functions here are part of one big assignment: check the model function later in the assignment, which integrates the earlier functions. The notebooks are a newer way to package things that interleaves instructions and explanations with the code, but it does take a little getting used to. You can print out the contents of the notebook if you find that helpful: the “File → Download” command lets you select an output format like HTML or PDF.
The problem with your cost is that you are dotting an m x 1 vector with a 1 x m vector, so you wind up with an m x m matrix, which is not what you want. Try this and watch what happens:
import numpy as np

v = np.array([[1, 2, 3, 4]])   # a 1 x 4 row vector
print(v)
print(1**2 + 2**2 + 3**2 + 4**2)    # the sum of squares we actually want
print(np.dot(v, v.T))    # (1 x 4) . (4 x 1) -> 1 x 1, the correct reduction
print(np.dot(v.T, v))    # (4 x 1) . (1 x 4) -> 4 x 4 outer product
print(np.sum(np.dot(v.T, v)))    # summing the outer product: 100, not 30
When I run that, here is what I get:
[[1 2 3 4]]
30
[[30]]
[[ 1 2 3 4]
[ 2 4 6 8]
[ 3 6 9 12]
[ 4 8 12 16]]
100
The result you get from this should look familiar:
print(np.dot(v.T,v))
[[ 1 2 3 4]
[ 2 4 6 8]
[ 3 6 9 12]
[ 4 8 12 16]]
It’s the multiplication table for the numbers 1 to 4. So it’s clear that when you add that up, the answer has nothing to do with what we are actually trying to compute here.
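For comparison, one correct way to write the cost keeps everything 1 x m and reduces with an elementwise product (a sketch, not necessarily the only accepted form):

import numpy as np

def cost_sketch(A, Y):
    # A and Y are both 1 x m, so Y * np.log(A) is elementwise (1 x m)
    # and np.sum collapses it to the scalar we actually want
    m = Y.shape[1]
    return (-1 / m) * np.sum(Y * np.log(A) + (1 - Y) * np.log(1 - A))

# If you prefer dot products, keep the orientation (1 x m) . (m x 1):
# (-1 / m) * (np.dot(np.log(A), Y.T) + np.dot(np.log(1 - A), (1 - Y).T))
# which yields a 1 x 1 array you would still need to squeeze to a scalar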
Thanks for everything! I already submitted this assignment and got a grade. I appreciate your assistance!