I have the following questions about the assignment on logistic regression:
- Section 2.6, "Learning parameters using gradient descent", uses theta. What does theta represent here?
- Could someone explain what X[positive, 0] and X[positive, 1] do in the plot_data function in utils.py? I read in the Pyplot tutorial — Matplotlib 3.7.2 documentation that corresponding values of the two arrays are paired up, e.g. plt.plot([1,2,3,4], [1,4,9,16], 'ro') plots the points (1,1), (2,4), (3,9), (4,16), but I don't quite understand how that works in the function below:
def plot_data(X, y, pos_label="y=1", neg_label="y=0"):
    positive = y == 1
    negative = y == 0

    # Plot examples
    plt.plot(X[positive, 0], X[positive, 1], 'k+', label=pos_label)
    plt.plot(X[negative, 0], X[negative, 1], 'yo', label=neg_label)
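To make my question concrete, here is a small toy example of my current understanding (the data is made up, not the assignment's dataset; I'm assuming X is an (m, 2) NumPy array of features and y is an (m,) array of 0/1 labels). Is this roughly what is going on?

import numpy as np
import matplotlib.pyplot as plt

# Toy data: 3 examples with 2 features each
X = np.array([[1.0, 4.0],
              [2.0, 5.0],
              [3.0, 6.0]])
y = np.array([1, 0, 1])

positive = y == 1        # boolean mask: array([ True, False,  True])
print(X[positive, 0])    # column 0 of the rows where y == 1 -> [1. 3.]
print(X[positive, 1])    # column 1 of the rows where y == 1 -> [4. 6.]

# So plt.plot pairs these element-wise and plots the points (1, 4) and (3, 6)?
plt.plot(X[positive, 0], X[positive, 1], 'k+')
plt.show()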