Incorrect value for cost in Logistic_Regression_with_a_Neural_Network_mindset

[https://www.coursera.org/learn/neural-networks-deep-learning/programming/thQd4/logistic-regression-with-a-neural-network-mindset/lab](https://www.coursera.org/learn/neural-networks-deep-learning/programming/thQd4/logistic-regression-with-a-neural-network-mindset/lab)

I am getting an incorrect value for the cost in Exercise 6. Can you please let me know what I am doing wrong here?

The Exercise 5 outputs were verified to be correct. Below is my implementation of the cost:

{moderator edit - solution code removed}

Welcome to the forum @reeshav!

My advice is to check your usage of np.sum() and/or np.dot() in your computation of the variable cost in Exercise 5.
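Not the assignment's solution code, just a generic sketch (with made-up `A` and `Y` arrays) of the pattern that hint is pointing at, showing how `np.sum()` and `np.dot()` can each be used to compute the cross-entropy cost:

```python
import numpy as np

# Illustration only: the cross-entropy cost
# J = -(1/m) * sum( Y*log(A) + (1-Y)*log(1-A) )
# for A and Y of shape (1, m) can be written with np.sum over an elementwise product ...
A = np.array([[0.8, 0.9, 0.4]])  # made-up activations, shape (1, 3)
Y = np.array([[1, 1, 0]])        # made-up labels, shape (1, 3)
m = Y.shape[1]

cost_sum = -(1 / m) * np.sum(Y * np.log(A) + (1 - Y) * np.log(1 - A))

# ... or with np.dot, which needs a transpose so the shapes line up as (1, m) x (m, 1):
cost_dot = -(1 / m) * (np.dot(Y, np.log(A).T) + np.dot(1 - Y, np.log(1 - A).T))

print(cost_sum, float(np.squeeze(cost_dot)))  # both print the same scalar
```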

Let me know what you find!

The code you show looks correct. In fact, if you look carefully at the failed assertion message, you will see that your first value of the cost (the one at 0 iterations) agrees with the expected value. The one that is wildly wrong is your second value (at 100 iterations). So your basic cost computation is correct, but there must be something wrong with either your back propagation or the way you are updating the parameters. Actually, the back prop calculations look correct as well. I'll bet you're doing something like using "+" instead of "-" in the parameter update; see the sketch below. Notice that your cost is growing rather than shrinking. Hmmmm. :face_with_monocle:
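Here is a minimal sketch, with made-up weights and gradients, of what the gradient descent update step should look like; the sign in front of the learning rate is the key point:

```python
import numpy as np

# Minimal sketch with made-up values: gradient descent must SUBTRACT the
# scaled gradients, otherwise the cost grows with every iteration.
w = np.array([[0.5], [-0.3]])    # hypothetical weights, shape (2, 1)
b = 0.1
dw = np.array([[0.02], [0.01]])  # hypothetical gradients from back propagation
db = 0.005
learning_rate = 0.01

w = w - learning_rate * dw       # correct: step against the gradient
b = b - learning_rate * db
# w = w + learning_rate * dw     # wrong sign: the cost would increase instead of decrease
```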

Hi, I have the same problem. I don't know what's wrong and I'm getting very small values. However, I got the correct db value.


My guess is that you do not have the latest version of the notebook. There was a version with a bug in it that was fixed maybe 36 hours ago, but perhaps you haven't closed and reopened the notebook since the fix, or maybe they didn't make it a "force update".

There is a topic on the FAQ Thread about how to get a clean copy. Then carefully “copy/paste” over just your solution code to the new version and see if that fixes the issue.

It’s working. Thanks!

Hi, I'm having a problem with the cost function. My code seems to be fine, but it doesn't match the expected output.

If you have two row vectors, please note that v^T \cdot w and v \cdot w^T do not give you the same result. The way you have implemented your cost computation is incorrect. The same mistake is present in the original post on this thread, but I missed it the first time around. Please see below for a link that discusses the issue in more detail.

Here’s another recent thread which shows examples of the difference between the two ways of doing the transpose + dot product.
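For a quick illustration, here is a small sketch with two made-up row vectors showing the shapes you get from each ordering of the transpose and dot product:

```python
import numpy as np

# Two made-up row vectors, both of shape (1, 3)
v = np.array([[1.0, 2.0, 3.0]])
w = np.array([[4.0, 5.0, 6.0]])

inner = np.dot(v, w.T)  # (1, 3) x (3, 1) -> (1, 1): the scalar inner product, [[32.]]
outer = np.dot(v.T, w)  # (3, 1) x (1, 3) -> (3, 3): the outer product matrix

print(inner.shape, outer.shape)  # (1, 1) (3, 3)
```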
