Algorithms' robustness against errors

Andrew mentioned in lecture that DL algorithms are robust against random errors such as misclicks or human mislabeling, but that they are not as robust against systematic errors. Can anyone explain the reason behind this, and the statistics or theory behind it?

Thanks!

Hey @derickwh,

The general idea is that a NN is a tool for learning patterns in the data, and systematic errors have a pattern of their own. Random errors are roughly zero-mean noise: they hit examples independently and in no consistent direction, so with enough data they tend to average out and the network still recovers the underlying signal. Systematic errors, by contrast, are consistent across examples, so the NN can't distinguish them from genuine features and ends up fitting the error pattern as if it were part of the signal.
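Here is a minimal sketch (not from the course, just an illustration) that contrasts the two kinds of label noise on a toy binary classification problem with scikit-learn. The data, the 20% noise rate, and the "borderline labeler" rule are all made-up assumptions for the example:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Toy data: the true label depends only on the first feature.
X = rng.normal(size=(5000, 2))
y = (X[:, 0] > 0).astype(int)

def clean_accuracy(train_labels):
    """Fit on (possibly noisy) labels, evaluate against the clean labels."""
    model = LogisticRegression().fit(X, train_labels)
    return model.score(X, y)

# Random errors: flip ~20% of labels chosen uniformly at random.
y_random = y.copy()
flip = rng.random(len(y)) < 0.2
y_random[flip] = 1 - y_random[flip]

# Systematic errors: a hypothetical labeler who consistently marks
# borderline positives (0 < x0 < 0.5, also ~20% of examples) as negative.
y_system = y.copy()
y_system[(X[:, 0] > 0) & (X[:, 0] < 0.5)] = 0

print("clean labels      :", clean_accuracy(y))
print("random 20% noise  :", clean_accuracy(y_random))  # stays close to clean
print("systematic errors :", clean_accuracy(y_system))  # boundary shifts, accuracy drops
```

With random flips the noise is symmetric, so the learned decision boundary stays near the true one. With the systematic rule, the mistakes all point the same way, so the model learns a shifted boundary; the consistent error looks to it like a real pattern in the data.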
