Does linear regression with a single feature always lead to an underfit model?
Does quadratic regression with a single feature always lead to a ‘just-right’ model?

This question came to mind because the graph shapes of those functions are identical or similar to the fits of those models. Professor Andrew taught us the concept using this similarity.

If the answers to my questions are yes, then I suppose that a linear equation with multiple features can lead to an overfit or a just-right model, and a quadratic model with multiple features can lead to an overfit or even an underfit model.

You have been thinking about this topic for some time. Let me know if the below gives you a different view.

I think your “quadratic regression with a single feature” means y = b + w_1x + w_2x^2.

First, it is okay to say that a model is more likely to overfit with more polynomial terms, and more likely to underfit with fewer terms. I agree with Tom that we should never say always here, because being underfit or overfit is relative to the data. You need to tell us how the data behave before there is any chance to comment on whether a model is underfit or overfit.

Generally speaking, with more polynomial terms, there will be more trainable weights. With more trainable weights, there will be more freedom for it to fit itself to the training data, and thus a higher chance of overfitting. When we say “overfitting”, we are saying “overfitting to the training data”. Therefore, the more freedom it has, the more likely to overfit.
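To make this concrete, here is a small sketch (the dataset, noise level, and choice of degrees are made up for illustration, not from the course): we generate noisy data from a quadratic curve, then fit polynomials of degree 1, 2, and 9 with NumPy. More terms means more trainable weights, and you can watch the training error keep shrinking as the model gains freedom to bend itself to the training points.

```python
import numpy as np

# Synthetic data from a quadratic curve plus noise (illustrative only)
rng = np.random.default_rng(0)
x = np.linspace(-1.0, 1.0, 20)
y = 1.0 + 2.0 * x + 3.0 * x**2 + rng.normal(0.0, 0.3, size=x.shape)

train_mse = {}
for degree in (1, 2, 9):
    # A degree-d polynomial fit has d + 1 trainable weights
    coeffs = np.polyfit(x, y, degree)
    preds = np.polyval(coeffs, x)
    train_mse[degree] = np.mean((y - preds) ** 2)
    print(f"degree {degree}: {degree + 1} weights, "
          f"train MSE = {train_mse[degree]:.4f}")
```

The degree-9 model will always show the lowest training error because its hypothesis class contains the lower-degree ones, but that low training error is exactly the "fitting itself to the training data" we worry about; on fresh test data, the degree-2 model would typically do better here.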

Cheers,
Raymond

PS: in your first post, you said "the graph shapes of those functions", "this question", and "this similarity". However, we may not be able to guess what they are. What are "those", "this", and "this"? If there is something you want to refer to, you may want to show it to us too.