Decision Boundary in Logistic Regression

I imagine that the true class should always be above the threshold and the false class below it. So the true class is defined for z greater than the value corresponding to the threshold, and the false class for z less than that value. If this is true, how is the true class inside the decision boundary here and the false class outside? I feel everything is reversed!

Hello @Abdelrahman_Osama2,

In fact, if we look only at the equation on the slide you have shared here (no lecture video, no \hat{y} = 0 or \hat{y} = 1 labels on the graphs, nothing else), we can't tell whether the inside or the outside is for the True class. As you said, it is about whether z is above or below the threshold, and if we take the 50% probability threshold, which corresponds to z = 0, then from the equation alone there is no way to tell which side is for the True class.

Nor can we decide based on which side has the larger x_1 or x_2 values, which becomes clear if we consider the following cases:

  1. if the weights are all positive and the bias is zero, then larger x values tend to result in a larger z
  2. if the weights are all negative and the bias is zero, then larger x values tend to result in a smaller z

Therefore, without knowing the exact values of the weights and the bias, we can only take the lecture's word for it.
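
For instance, here is a minimal sketch of those two cases (the point and the weight values are made up for illustration, not taken from the lecture). Flipping the signs of the weights flips the sign of z, and therefore which class the very same point is assigned under the z = 0 threshold:

```python
import numpy as np

def predict(x, w, b):
    """Logistic-regression prediction with the usual z = 0 (50%) threshold."""
    z = np.dot(w, x) + b
    return float(z), int(z >= 0)   # (z, y_hat)

x = np.array([2.0, 3.0])           # a point with fairly large feature values

# Case 1: all-positive weights, zero bias -> this point lands on the y_hat = 1 side
print(predict(x, np.array([1.0, 1.0]), 0.0))    # (5.0, 1)

# Case 2: all-negative weights, zero bias -> the same point now gets y_hat = 0
print(predict(x, np.array([-1.0, -1.0]), 0.0))  # (-5.0, 0)
```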

However, I can work backward to determine the sign of the bias term, given that (1) \hat{y} = 1 is said to be inside the boundary and (2) we were using z = 0 as the threshold: the bias has a positive value. I know b is positive by setting all the x's to zero. Do you get what I mean?
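
To spell that out (purely as an illustration, and assuming the origin, where all the x's are zero, lies inside the plotted boundary):

$$ z(0, \dots, 0) = w_1 \cdot 0 + w_2 \cdot 0 + \dots + b = b $$

Since inside the boundary we have \hat{y} = 1, i.e. z \ge 0 under the z = 0 threshold, evaluating z at the origin forces b \ge 0, and in fact b > 0 if the origin is not sitting exactly on the boundary.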

Cheers,
Raymond

This (the True class ending up inside the boundary) happens easily if one of the weight values is negative.
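
For example, here is a quick sketch of such a case (the squared features, weights, and bias are assumed for illustration, not read off the slide): with w_1 = w_2 = -1 on x_1^2 and x_2^2 and b = 1, the boundary z = 0 is the unit circle, and it is the inside of the circle that gets \hat{y} = 1:

```python
# Illustrative setup only: z = -x1^2 - x2^2 + 1, thresholded at z = 0.
def predict_circle(x1, x2, w1=-1.0, w2=-1.0, b=1.0):
    z = w1 * x1**2 + w2 * x2**2 + b
    return z, int(z >= 0)          # (z, y_hat)

print(predict_circle(0.0, 0.0))    # origin, inside the circle: z = 1.0  -> y_hat = 1
print(predict_circle(2.0, 0.0))    # well outside the circle:   z = -3.0 -> y_hat = 0
```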

Thanks so much. I got it now