[C1_W3_Logistic_regression] What does plot_decision_boundary() do in detail?

Dear Mentors,

  1. I do not understand the yellow-highlighted line in the picture here. Please help me understand it.
  2. I do not understand why this function must be split into two cases like this.

Thank you so much. Have a good day <3


def plot_decision_boundary(w, b, X, y):
    # Credit to dibgerge on GitHub for this plotting code

    plot_data(X[:, 0:2], y)

    if X.shape[1] <= 2:
        # Case 1: exactly two features -> the boundary is a straight line
        plot_x = np.array([min(X[:, 0]), max(X[:, 0])])
        plot_y = (-1. / w[1]) * (w[0] * plot_x + b)
        plt.plot(plot_x, plot_y, c="b")
    else:
        # Case 2: more than two (mapped) features -> draw the boundary
        # as a contour of the model output over a grid
        u = np.linspace(-1, 1.5, 50)
        v = np.linspace(-1, 1.5, 50)
        z = np.zeros((len(u), len(v)))

        # Evaluate z = sigmoid(w . map_feature(u, v) + b) over the grid
        for i in range(len(u)):
            for j in range(len(v)):
                z[i, j] = sig(np.dot(map_feature(u[i], v[j]), w) + b)

        # important to transpose z before calling contour
        z = z.T

        # Plot the level curve z = 0.5, which is the decision boundary
        plt.contour(u, v, z, levels=[0.5], colors="g")

The first case is used for 2D plots, where you can draw the hypothesis as a straight line.

The yellow-highlighted code is a bit of algebra that computes the vertical-axis value for each horizontal-axis value, under the condition that f_wb is zero (i.e., the sigmoid output is exactly 0.5). Setting w[0]*x0 + w[1]*x1 + b = 0 and solving for x1 gives x1 = (-1 / w[1]) * (w[0]*x0 + b), which is exactly that line of code.
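To see that the highlighted algebra really lands on the decision boundary, here is a small sketch (the weights `w` and `b` are made-up values, just for illustration) that computes two points with that formula and checks that the model output there is 0.5:

```python
import numpy as np

def sigmoid(z):
    """Logistic function: maps f_wb to a probability in (0, 1)."""
    return 1 / (1 + np.exp(-z))

# Hypothetical parameters, chosen only to demonstrate the algebra
w = np.array([1.0, 2.0])
b = -3.0

# Pick two x0 values and solve w[0]*x0 + w[1]*x1 + b = 0 for x1,
# exactly as the highlighted line does:
plot_x = np.array([0.0, 4.0])
plot_y = (-1.0 / w[1]) * (w[0] * plot_x + b)

# Every (x0, x1) point on this line gives f_wb = 0, so sigmoid(f_wb) = 0.5
f_wb = w[0] * plot_x + w[1] * plot_y + b
print(f_wb)           # [0. 0.]
print(sigmoid(f_wb))  # [0.5 0.5]
```

So the blue line joins two points where the model is exactly on the fence between the two classes.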

The second case is used when there are more than two features (for example, after map_feature adds polynomial terms). The boundary is then no longer a straight line in the original two-feature plane, so it is drawn as a contour plot: the model output is evaluated over a grid, and the level curve where it equals 0.5 is the decision boundary.
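Here is a self-contained sketch of that second case. The course's `map_feature` and trained weights are not reproduced here, so `map_feature_2` is a toy degree-2 feature map and `w`, `b` are hand-picked values that happen to produce a circular boundary x1² + x2² = 1:

```python
import numpy as np
import matplotlib
matplotlib.use("Agg")  # draw without a display
import matplotlib.pyplot as plt

def sigmoid(z):
    return 1 / (1 + np.exp(-z))

def map_feature_2(x1, x2):
    """Toy stand-in for the course's map_feature: degree-2 terms only."""
    return np.array([x1, x2, x1**2, x1 * x2, x2**2])

# Hypothetical weights: only x1^2 and x2^2 matter, boundary is a circle
w = np.array([0.0, 0.0, 1.0, 0.0, 1.0])
b = -1.0

u = np.linspace(-1.5, 1.5, 50)
v = np.linspace(-1.5, 1.5, 50)
z = np.zeros((len(u), len(v)))

# z[i, j] is the model output at grid point (u[i], v[j])
for i in range(len(u)):
    for j in range(len(v)):
        z[i, j] = sigmoid(np.dot(map_feature_2(u[i], v[j]), w) + b)

# contour expects the first axis of z to run along the y-axis (v),
# but the loop above filled it with u along the first axis -> transpose
z = z.T

cs = plt.contour(u, v, z, levels=[0.5], colors="g")
```

Inside the circle the output is below 0.5, outside it is above, and `plt.contour(..., levels=[0.5])` traces exactly the curve where the two regions meet. That is also why the transpose matters: without it, the axes would be swapped and the contour would be drawn mirrored.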
