Difference between L2 Regularisation and Inverted Dropout

Could you please explain the difference between the internal mechanisms of L2 regularisation and inverted dropout?
Thanks

Hi, @ajaykumar3456.

May I ask what you mean by that?

The details of L2 regularization and dropout are explained in the lectures I linked to. By the time you implement them, their internal mechanisms should be quite clear :slight_smile:
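To make the contrast concrete, here is a minimal NumPy sketch of where each technique acts, assuming the usual course conventions (`lambd` for the regularization strength, `keep_prob` for dropout); the function names and shapes are illustrative, not the assignment's exact code. L2 regularization modifies the cost and the weight gradients, while inverted dropout modifies the activations during training only:

```python
import numpy as np

rng = np.random.default_rng(0)

# --- L2 regularization: acts on the COST and the GRADIENTS ---
# Adds (lambd / (2*m)) * sum of squared weights to the cost, which
# contributes (lambd / m) * W to each weight gradient ("weight decay").
def l2_cost_term(weights, lambd, m):
    return (lambd / (2 * m)) * sum(np.sum(W ** 2) for W in weights)

def l2_grad(dW, W, lambd, m):
    # The extra term shrinks every weight toward zero at each update.
    return dW + (lambd / m) * W

# --- Inverted dropout: acts on the ACTIVATIONS during training ---
# Randomly zero units, then divide by keep_prob so the expected
# activation is unchanged; at test time nothing is applied at all.
def inverted_dropout(A, keep_prob):
    mask = (rng.random(A.shape) < keep_prob).astype(A.dtype)
    return (A * mask) / keep_prob, mask

weights = [rng.standard_normal((3, 4)), rng.standard_normal((2, 3))]
print(l2_cost_term(weights, lambd=0.7, m=100))
A_drop, mask = inverted_dropout(rng.standard_normal((4, 5)), keep_prob=0.8)
print(A_drop.shape)
```

So the "internal mechanisms" differ in what they touch: L2 is a deterministic penalty on the weights, while inverted dropout is a random mask on the activations, rescaled so the downstream layer sees the same expected input.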

I need to ask the same question.
Does L2 regularization move the model from high variance toward high bias (as in logistic regression), since the same nodes are used in every iteration?
And does inverted dropout move it from high variance to "just right"?

Hi, @Hendawy.

Both regularization techniques can increase bias. But it’s possible to reduce variance without hurting bias much.
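One way to see why inverted dropout can cut variance without systematically shifting the network's outputs: dividing by `keep_prob` keeps the expected activation equal to the original value. A quick NumPy check (the `keep_prob` value and activation constant are illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)
keep_prob = 0.8
A = np.full((1, 5), 2.0)  # constant activations, so the average is easy to read

# Average many dropout masks: E[(A * mask) / keep_prob] should equal A.
avg = np.mean(
    [(A * (rng.random(A.shape) < keep_prob)) / keep_prob
     for _ in range(100_000)],
    axis=0,
)
print(avg)  # each entry is close to the original value 2.0
```

The randomness of the mask is what discourages any single unit from being relied on too heavily, which is the variance-reduction effect.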

Hope you’re enjoying the course :slight_smile:
