Overfit, underfit, just right: linear, quadratic, polynomial

Hi @farhana_hossain,

In addition to Tom’s great reply:

  1. Polynomial fits tend to oscillate, especially in the boundary / edge regions of the range covered by your labels (this is known as Runge’s phenomenon; see the sketch below for a small demonstration).

  2. This statement is not generally true:

The statement is quite true for linear models, where you have one weight for each feature plus the bias. So in 1D linear regression you have two parameters (the bias and the weight for the gradient / slope).

But especially in rather complex nonlinear models with hidden layers, like deep neural networks, you can have way more weights than input features.
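To make point 1 visible in numbers, here is a minimal toy sketch of my own (assuming numpy; the sine target, noise level, and degrees are made up for illustration). It fits the same noisy 1D data with a degree-3 and a degree-12 polynomial and compares the error near the edges of the labeled range:

```python
import numpy as np

rng = np.random.default_rng(0)

# 15 noisy samples of a sine on [-1, 1], the region where we "have labels"
x_train = np.linspace(-1, 1, 15)
y_train = np.sin(np.pi * x_train) + rng.normal(0, 0.1, x_train.shape)

x_test = np.linspace(-1, 1, 200)
y_true = np.sin(np.pi * x_test)
edges = np.abs(x_test) > 0.9  # the outer 10% on each side of the labeled range

for degree in (3, 12):
    coeffs = np.polyfit(x_train, y_train, deg=degree)
    y_pred = np.polyval(coeffs, x_test)
    edge_rmse = np.sqrt(np.mean((y_pred[edges] - y_true[edges]) ** 2))
    center_rmse = np.sqrt(np.mean((y_pred[~edges] - y_true[~edges]) ** 2))
    print(f"degree {degree:2d}: edge RMSE = {edge_rmse:.3f}, "
          f"center RMSE = {center_rmse:.3f}")
```

The degree-12 fit typically tracks the noise and wiggles hardest near the edges at ±1, which is exactly the boundary effect from point 1 (the exact numbers depend on the noise seed).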

Also: important aspects to consider in the underfitting / overfitting discussion are often:

  • the capacity of the model (e.g. shaped by the number of model weights; see the sketch after this list)
  • the complexity of the relationship you want to model (e.g. a challenging nonlinear classification task)
  • the data type (e.g. unstructured video data)
  • the amount of data and its ratio to the model capacity mentioned above: the more data you have and the more complex the task you want to solve (e.g. classifying unstructured video data), the more model capacity (= more weights or more hidden layers) you can allow, and indeed are going to need, to solve your business problem.
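To put a number on the capacity bullet (and on point 2 above), here is a minimal sketch in plain Python; the feature count and layer sizes are hypothetical, chosen only for illustration:

```python
# Parameter counts: a linear model vs. a small fully connected network.
n_features = 4

# Linear regression: one weight per feature + the bias.
linear_params = n_features + 1
print(f"linear model: {linear_params} parameters")  # 5

# A small MLP with two hidden layers of 64 units each (hypothetical sizes):
# each layer contributes (n_in * n_out) weights + n_out biases.
layer_sizes = [n_features, 64, 64, 1]
mlp_params = sum(n_in * n_out + n_out
                 for n_in, n_out in zip(layer_sizes, layer_sizes[1:]))
print(f"small MLP: {mlp_params} parameters")  # 320 + 4160 + 65 = 4545
```

Even this tiny network has over a thousand times more parameters than input features, which is why capacity, not the raw feature count, is the quantity to weigh against your amount of data.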

Here is some more info if you want to dive deeper into the topic:

Happy learning!

Best regards
Christian
