As data increases, performance improves

So far, I am a bit confused about the following part of Andrew's video "Why is Deep Learning taking off?", at time "4:20 - 5:25".

My understanding so far is:
As the relative ordering of the neural network algorithms is not well defined, it takes time at the beginning to process and generate an outcome, but as time goes on, its performance increases.

So the end result is:
For a smaller dataset, it seems to take a huge amount of time to process, but for large data, its performance increases as time goes on.

[Is my understanding okay?]

Hello @Md_Mojammel_Haque,

Welcome to the first week of the DLS, and to this community!

To begin with, it has nothing to do with time, processing time, or training time. It is not about time at all. It is about how well we know the performance ranking among these 4 algorithms: large NN, medium NN, small NN, and a traditional learning algorithm.

As the relative ordering of Neural Network Algorithm is not well defined

The above line means that when the amount of data is small, nobody can say exactly which of the 4 algorithms performs better than which. One reason we can't determine which should be the best, which next, and which would be the worst is that, as the lecture said,

it is often up to your skill at hand engineering features that determines the performance

However, when we have a lot of data, it becomes clear that a large NN can outperform a medium NN, a medium NN can outperform a small NN, and a small NN can outperform a traditional learning algorithm.

With what I have said, please watch that part of the video again and see if everything makes sense.

Cheers,
Raymond


In addition to @rmwkwok‘s great reply.

Visualizations like this one are often used to explain this effect graphically:

[Figure: model performance vs. amount of data for NNs of different sizes and a traditional algorithm — Source]
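If you want to get a feel for this kind of curve yourself, here is a minimal sketch (my own illustration, not from the course or the figure above) that trains a traditional algorithm and two neural networks of different sizes on growing amounts of synthetic data and plots their held-out accuracy. The dataset, model sizes, and training-set sizes are arbitrary choices for illustration; on a simple synthetic task the gap between models can be small, whereas on large-scale real tasks (speech, images) it is much more pronounced.

```python
import numpy as np
import matplotlib.pyplot as plt
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

# Synthetic classification data; the generator settings are arbitrary.
X, y = make_classification(n_samples=20000, n_features=20, n_informative=10,
                           random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25,
                                                    random_state=0)

# One "traditional" algorithm and two NNs of different sizes (illustrative choices).
models = {
    "traditional (logistic regression)": LogisticRegression(max_iter=1000),
    "small NN": MLPClassifier(hidden_layer_sizes=(8,), max_iter=500, random_state=0),
    "large NN": MLPClassifier(hidden_layer_sizes=(128, 128), max_iter=500, random_state=0),
}

train_sizes = [100, 300, 1000, 3000, 10000, len(X_train)]
for name, model in models.items():
    scores = []
    for m in train_sizes:
        model.fit(X_train[:m], y_train[:m])          # train on the first m examples
        scores.append(model.score(X_test, y_test))   # accuracy on the held-out test set
    plt.plot(train_sizes, scores, marker="o", label=name)

plt.xscale("log")
plt.xlabel("amount of labeled training data")
plt.ylabel("test accuracy")
plt.title("Performance vs. amount of data (illustrative)")
plt.legend()
plt.show()
```

The exact numbers don't matter; the interesting thing to look at is whether the ranking between the three curves is stable with only a few hundred examples and how it evolves as the training set grows, which is exactly the point Raymond made above.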

This thread could be of interest to you: Do traditional algorithms perform better than CNN? - #2 by Christian_Simonis

Best regards
Christian