{ moderator edit: code removed}
# Initialize parameters
w_init = 0
b_init = 0
# Some gradient descent settings
iterations = 10000
tmp_alpha = 1.0e-2
# Run gradient descent
w_final, b_final, j_hist, p_hist = gradient_descent(x_train, y_train, w_init, b_init, tmp_alpha, iterations,
compute_cost, compute_gradient)
print(f"(w,b) found by gradient descent: ({w_final:8.4f},{b_final:8.4f})")
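For context, the definitions the moderator removed above are the cost and gradient helpers that gradient_descent is called with. The sketch below is my own reconstruction assuming the standard squared-error cost for univariate linear regression; it is not the removed code itself:

import numpy as np

def compute_cost(x, y, w, b):
    # Squared-error cost: (1 / 2m) * sum((w*x + b - y)^2)  (assumed form)
    m = x.shape[0]
    return np.sum((w * x + b - y) ** 2) / (2 * m)

def compute_gradient(x, y, w, b):
    # Partial derivatives of the assumed cost with respect to w and b
    m = x.shape[0]
    err = w * x + b - y
    dj_dw = np.sum(err * x) / m
    dj_db = np.sum(err) / m
    return dj_dw, dj_db

def gradient_descent(x, y, w_in, b_in, alpha, num_iters, cost_function, gradient_function):
    # Batch gradient descent; records cost and (w, b) history for the plots below
    w, b = w_in, b_in
    J_hist, p_hist = [], []
    for _ in range(num_iters):
        dj_dw, dj_db = gradient_function(x, y, w, b)
        w = w - alpha * dj_dw
        b = b - alpha * dj_db
        J_hist.append(cost_function(x, y, w, b))
        p_hist.append([w, b])
    return w, b, J_hist, p_hist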
# Cost versus iterations of gradient descent
"""
A plot of cost versus iterations is a useful measure of progress in gradient descent.
"""
# Plot cost versus iteration
fig, (ax1, ax2) = plt.subplots(1, 2, constrained_layout=True, figsize=(12, 4))
ax1.plot(J_hist[:100])
ax2.plot(1000 + np.arange(len(J_hist[1000:])), J_hist[1000:])
ax1.set_title("Cost vs. iteration (start)")
ax2.set_title("Cost vs. iteration (end)")
ax1.set_ylabel('Cost')
ax2.set_ylabel('Cost')
ax1.set_xlabel('iteration step')
ax2.set_xlabel('iteration step')
plt.show()
# Predictions
print(f"1000 sqft house prediction {w_final * 1.0 + b_final:0.1f} Thousand dollars")
print(f"1200 sqft house prediction {w_final * 1.2 + b_final:0.1f} Thousand dollars")
print(f"2000 sqft house prediction {w_final * 2.0 + b_final:0.1f} Thousand dollars")
# Plotting
fig, ax = plt.subplots(1, 1, figsize=(12, 6))
plt_contour_wgrad(x_train, y_train, p_hist, ax)
# Zooming in
fig, ax = plt.subplots(1, 1, figsize=(12, 4))
plt_contour_wgrad(x_train, y_train, p_hist, ax, w_range=[180, 220, 0.5], b_range=[80, 120, 0.5],
contours=[1, 5, 10, 20], resolution=0.5)
********** ERROR *****************
Traceback (most recent call last):
File "C:\Users\DJ\PycharmProjects\pythonProject2\C1_W1_Lab04_Gradient_Descent_Soln.py", line 117, in <module>
ax1.plot(J_hist[:100])
^^^^^^
NameError: name 'J_hist' is not defined. Did you mean: 'j_hist'?
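The traceback itself points at the mismatch: the call above unpacks the histories into lowercase j_hist and p_hist, but the plotting code reads uppercase J_hist. Making the two names agree (in either direction) should clear the NameError; for example, unpacking with the capitalization the plots expect:

# Unpack with the same name the plotting code uses (J_hist rather than j_hist)
w_final, b_final, J_hist, p_hist = gradient_descent(x_train, y_train, w_init, b_init, tmp_alpha,
                                                    iterations, compute_cost, compute_gradient)

Equivalently, the two plotting lines could be changed to use j_hist instead.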