Here are my results from the test cell for propagate:
m = 3
A = [[0.99979657 0.62245933 0.00273196]]
dw = [[ 0.25071532]
[-0.06604096]]
db = -0.1250040450043965
cost = 0.15900537707692405
m = 4
A = [[0.99849882 0.99979657 0.15446527 0.99966465]]
All tests passed!
My results match the “Expected Values” in the version of the notebook that I have. I’m guessing maybe you have an old version of the notebook. There is a topic on the FAQ Thread about how to refresh to the latest version. Make sure to delete the “dot py” files like public_tests.py when you do the “refresh”.
You should be using np.log, not just plain log. You did it correctly for the (1 - A) term.
One other thing to check is your parentheses. I think the np.sum and the factor of -\displaystyle \frac {1}{m} only apply to the first term the way you have written the code. That will not end well.
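To make that concrete, here is a minimal sketch of a correctly parenthesized cost line, assuming A and Y are 1 x m numpy arrays and m is the number of examples (an illustration of the structure only, not necessarily character-for-character what your notebook expects):

# np.log on both terms, and the np.sum plus the -1/m factor wrap the ENTIRE expression
cost = -(1 / m) * np.sum(Y * np.log(A) + (1 - Y) * np.log(1 - A))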
It looks like something has gone wrong in your code that results in the grads variable returned by the propagate function being a python string instead of a python dictionary.
Put the following print statement right before the “return” statement in your propagate function:
print(f"type(grads) = {type(grads)}")
When I do that, here’s what I get:
type(grads) = <class 'dict'>
Note that the code that stores your dw and db values in the grads dictionary was given to you and you should not have needed to change it. Here’s what it looks like in the default notebook:
grads = {"dw": dw,
"db": db}
return grads, cost
If your code looks different than that, then how did that happen?
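Just as a hypothetical illustration of how that can happen (I cannot see your actual code): if the dictionary literal somehow ends up inside quotes, grads becomes a string instead of a dict:

grads = '{"dw": dw, "db": db}'            # the quotes make this a str, not a dict
print(f"type(grads) = {type(grads)}")     # type(grads) = <class 'str'>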
For the inputs below:
w = np.array([[1.], [2]])
b = 1.5
X = np.array([[1., -2., -1.], [3., 0.5, -3.2]])
Y = np.array([[1, 1, 0]])
The output is:
dw = [[ 0.75214595]
[-0.19812289]]
db = -0.37501213501318953
cost = 0.15900537707692405
The value of the A matrix matches exactly and the cost is also correct, but dw and db do not match. Could you please help?
Here are my results (I added a print statement in the logic to show A):
A = [[0.99979657 0.62245933 0.00273196]]
dw = [[ 0.25071532]
[-0.06604096]]
db = -0.1250040450043965
cost = 0.15900537707692405
But I guess it makes sense that since your cost value matches, your A is pretty much guaranteed to be correct. So it is only the dw and db values that are wrong. That gives you a pretty good clue where to look: compare your logic to the math formulas shown in the text. There aren’t that many moving parts there.
Here’s one clue: take your dw[0] value and divide it by my dw[0] value. The ratio is 0.75214595 / 0.25071532 ≈ 3, which happens to be exactly m for this test case, and the same ratio shows up in the db values. That should tell you which factor is missing from your gradient code.
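For reference, these are the standard logistic regression gradient formulas (the same ones shown in the notebook text), and note that the \frac{1}{m} factor appears in both:

\displaystyle \frac{\partial J}{\partial w} = \frac{1}{m} X (A - Y)^T, \qquad \frac{\partial J}{\partial b} = \frac{1}{m} \sum_{i=1}^{m} \left( a^{(i)} - y^{(i)} \right)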
Note that the mentors do not have the superpower to examine your notebooks, so I can only reason from the results you show. It’s clear that your cost code generates different values on both of the test cases there, but it looks like everything else is correct. So you need to carefully compare your implementation of the cost to the math formula. If you used np.dot, make sure that you used the transpose correctly. You have two 1 x m vectors, right? So you need to dot 1 x m with m x 1 in order to get a 1 x 1 output. If you dot m x 1 with 1 x m, you get m x m, and summing that does not give the same result. Here’s a thread about that.
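Here is a quick way to see that shape issue for yourself, using two made-up 1 x m vectors (hypothetical values, just to demonstrate the orientation of the np.dot arguments):

import numpy as np
Y = np.array([[1., 1., 0.]])            # shape (1, 3)
logA = np.array([[-0.1, -0.2, -0.3]])   # shape (1, 3)
print(np.dot(Y, logA.T).shape)   # (1, 1) -- a single scalar-like result, which is what you want
print(np.dot(Y.T, logA).shape)   # (3, 3) -- summing this 3 x 3 matrix gives a different (wrong) answer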
I’m getting this error in Q5 - Merge all functions into a model:
Wrong values for d['w']. [[ 0.14449502]
[-0.1429235 ]
[-0.19867517]
[ 0.21265053]] != [[ 0.08639757]
[-0.08231268]
[-0.11798927]
[ 0.12866053]]
The usual cause for that error is “hard-coding” the values of learning rate and number of iterations when you call optimize from model. If there are equal signs mixed in with your parameters, that is a mistake. Or if you omit those parameters, that also counts as “hard-coding”, since it means you are using the default values defined with the optimize function and ignoring the actual values that are requested in the original call to model.
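For example, assuming the notebook’s optimize has a signature along the lines of optimize(w, b, X, Y, num_iterations=100, learning_rate=0.009, print_cost=False) (check your own copy for the exact parameter names), the call inside model should forward the values that model itself received:

# correct: pass through the arguments that model() was given
params, grads, costs = optimize(w, b, X_train, Y_train, num_iterations, learning_rate, print_cost)

# "hard-coding" mistakes that produce the wrong d['w'] values:
# params, grads, costs = optimize(w, b, X_train, Y_train)                          # omits them, so the defaults get used
# params, grads, costs = optimize(w, b, X_train, Y_train, 2000, 0.5, False)        # literal values instead of the requested ones
# params, grads, costs = optimize(w, b, X_train, Y_train, num_iterations=2000)     # equal sign with a literal value is still hard-coding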
When you get an error at the beginning of a line like that, it means the error is on the previous line. Check the parentheses on the “cost” line. You’ll find that they don’t match: there are 11 open parens, but only 9 close parens, which is what causes that error. Note that the editor in the notebook is python syntax aware: click on a paren and it will highlight the matching one. Or not …
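As a generic illustration of that failure mode (not your actual code), an unbalanced cost line makes Python report the SyntaxError at the start of whatever line comes next:

cost = -(1 / m) * np.sum(Y * np.log(A) + (1 - Y) * np.log(1 - A)   # one ')' missing here
print(cost)   # <- Python flags the error at the beginning of this line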