C1_W2_Linear_Regression (value of w in gradient descent)

In the C1_W2_Linear_Regression lab, the value of w initially appears to be w = 75.203.
Why is this initial value (w = 75.203) not used in the gradient descent algorithm? In the gradient descent function, they set initial_w = 0 and initial_b = 0 instead.

Hello @Ranjeet_Kumbhar,

Welcome to the DeepLearning.AI community.

The value 75.203 is the cost computed for w = 2 (initial_w) and b = 1 (initial_b); it is not the value of initial_w itself.
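To make the distinction concrete, here is a minimal sketch of the cost function from the lab, evaluated on a tiny hypothetical dataset (not the lab's actual data, so the numbers below will not match 75.203). The point is that `compute_cost(x, y, w, b)` returns a single cost value for a given (w, b) pair, while `initial_w` and `initial_b` are the separate starting parameters handed to gradient descent:

```python
import numpy as np

def compute_cost(x, y, w, b):
    # Mean squared error cost: J(w, b) = (1 / (2m)) * sum((w*x_i + b - y_i)^2)
    m = x.shape[0]
    return np.sum((w * x + b - y) ** 2) / (2 * m)

# Hypothetical toy data, NOT the lab's dataset
x = np.array([1.0, 2.0])
y = np.array([300.0, 500.0])

# A cost evaluated at w=2, b=1 (in the lab, this evaluation produced 75.203)
cost_at_w2_b1 = compute_cost(x, y, w=2.0, b=1.0)

# The parameters that gradient descent actually starts from are separate
initial_w = 0.0
initial_b = 0.0
cost_at_start = compute_cost(x, y, initial_w, initial_b)

print(cost_at_w2_b1, cost_at_start)
```

So 75.203 was never a parameter value, just the output of the cost function at one particular (w, b).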

Yeah, I got it now. Thanks for your reply!