What is wrong here? Linear Regression Practice Lab Week 2


What is wrong here? Is the error due to the values of alpha and the number of iterations chosen, or due to the initial w and b values?

Should I get these exact expected values of w and b? If so, how?

The problem seems to be that your gradient descent function isn't updating the w and b values; that's what is causing the cost to never decrease.

However, you are getting new w and b values when the function ends, so perhaps that's not the issue.

Or it could be a problem in the compute cost function, if it isn't using the w and b values that are being passed to it.
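For reference, here is a minimal sketch of what the mentor is describing: a cost function that uses only the w and b it receives, and a descent loop that actually updates both parameters each iteration. Function and variable names (compute_cost, compute_gradient, alpha) are assumptions based on the usual lab naming, not the poster's actual code.

```python
import numpy as np

def compute_cost(x, y, w, b):
    # Squared-error cost. It must use the w and b passed in --
    # if it reads stale globals instead, the reported cost never changes.
    m = x.shape[0]
    return np.sum((w * x + b - y) ** 2) / (2 * m)

def compute_gradient(x, y, w, b):
    # Partial derivatives of the cost with respect to w and b.
    m = x.shape[0]
    err = w * x + b - y
    dj_dw = np.sum(err * x) / m
    dj_db = np.sum(err) / m
    return dj_dw, dj_db

def gradient_descent(x, y, w, b, alpha, num_iters):
    for _ in range(num_iters):
        dj_dw, dj_db = compute_gradient(x, y, w, b)
        # If these two updates are missing, the cost stays flat
        # for every iteration, as described above.
        w -= alpha * dj_dw
        b -= alpha * dj_db
    return w, b

# Toy data generated from y = 2x + 1, so the fit should recover w=2, b=1.
x = np.array([1.0, 2.0, 3.0, 4.0])
y = 2.0 * x + 1.0
w, b = gradient_descent(x, y, 0.0, 0.0, 0.05, 5000)
print(w, b)  # should approach 2.0 and 1.0
```

With a working loop like this, printing compute_cost every few hundred iterations should show a steadily decreasing value.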

I believe my compute cost function is working fine, because running it gave an all-tests-passed output, so that's not the issue.
What else do you think I can do to solve this?

Your cost values did not change with the iterations.

So there is either a problem with your cost function, or a problem with the weight and bias values being passed to it from the gradient descent function.

I got the solution. I found that my gradient computation returned its values in reverse, i.e. dj/dw had the value of dj/db and vice versa. The tests passed it anyway, so I moved on without thinking much about it. After swapping them back, the output came out right and I was able to proceed further.
Thanks anyway :thanks: :thanks:
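To illustrate the bug described above, here is a hedged sketch of what returning the two gradients in reverse order looks like. The function names and toy data are hypothetical, not the poster's actual lab code; the point is only that the caller unpacks `dj_dw, dj_db = ...`, so a swapped return silently applies each gradient to the wrong parameter.

```python
import numpy as np

def compute_gradient(x, y, w, b):
    # Correct version: returns (dj_dw, dj_db) in that order.
    m = x.shape[0]
    err = w * x + b - y
    dj_dw = np.sum(err * x) / m
    dj_db = np.sum(err) / m
    return dj_dw, dj_db

def compute_gradient_swapped(x, y, w, b):
    # The bug: same values, wrong order. The caller's
    # `dj_dw, dj_db = compute_gradient_swapped(...)` then updates
    # w with dj_db and b with dj_dw, so descent goes wrong.
    m = x.shape[0]
    err = w * x + b - y
    dj_dw = np.sum(err * x) / m
    dj_db = np.sum(err) / m
    return dj_db, dj_dw  # <-- reversed

x = np.array([1.0, 2.0, 3.0])
y = np.array([2.0, 4.0, 6.0])
print(compute_gradient(x, y, 0.0, 0.0))          # (dj_dw, dj_db)
print(compute_gradient_swapped(x, y, 0.0, 0.0))  # same two numbers, swapped
```

Note the swapped version can still pass a careless check, since both correct values are present, just in the wrong positions, which matches what happened here.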

Thanks for your report.