Difference in tools between Modern AI and Traditional AI

As per the course, Traditional AI and Modern AI differ in their use of artificial neural networks. I would like to know: is this the only difference? The course explained that artificial intelligence today comprises tools related to machine learning and deep learning (neural networks), and intersects with data science tools. So, what tools contributed to Traditional AI? Based on my limited search, I read that it was mostly if-then-else rules and fuzzy logic.


Hi @Siddhant_Sharma

Often things are not absolutely black or white, but let me try to answer your question:

Classic ML tooling often involves libraries like scikit-learn (a Python library), or Julia for physics-informed AI; see also this thread:
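To make the scikit-learn point concrete, here is a minimal sketch of classic ML tooling: training a small decision tree on the built-in iris dataset. The dataset, model, and hyperparameters are just illustrative choices, not anything specific from the course or the linked thread.

```python
# Classic ML with scikit-learn: fit a shallow decision tree on iris
# and measure held-out accuracy. All settings here are illustrative.
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0
)

clf = DecisionTreeClassifier(max_depth=3, random_state=0)
clf.fit(X_train, y_train)
acc = clf.score(X_test, y_test)  # fraction of correct test predictions
```

Note that this workflow (features in, fitted model out, no neural network involved) is exactly the kind of "classic ML" the distinction in your question is about.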
I also agree with you that some classic control-theoretic models, such as observer approaches (Luenberger observers, Kalman filters, particle filters), as well as fuzzy logic, are often associated with classic AI, as are recommender systems. Control-theoretic modeling and robotics in particular often involve, or originate from, MATLAB/Simulink or C/C++.
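As an example of one of those observer approaches, here is a minimal sketch of a 1-D Kalman filter estimating a constant value from noisy measurements. The noise parameters and the constant-signal model are simplifying assumptions made purely for illustration.

```python
import numpy as np

# Minimal 1-D Kalman filter sketch: estimate a constant signal from
# noisy measurements. q (process noise) and r (measurement noise
# variance) are illustrative assumptions.
def kalman_1d(measurements, q=1e-4, r=0.25, x0=0.0, p0=1.0):
    x, p = x0, p0
    estimates = []
    for z in measurements:
        # predict: constant model, state unchanged, uncertainty grows
        p = p + q
        # update: blend prediction and measurement via the Kalman gain
        k = p / (p + r)
        x = x + k * (z - x)
        p = (1.0 - k) * p
        estimates.append(x)
    return estimates

rng = np.random.default_rng(0)
true_value = 5.0
noisy = true_value + rng.normal(0.0, 0.5, size=200)  # std 0.5 noise
est = kalman_1d(noisy)
```

No learning from data is happening here: the model structure and noise assumptions are specified up front, which is characteristic of this classic, model-driven style of AI/estimation.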

Deep learning frameworks are often Python-based and leverage libraries like Keras, JAX, PyTorch, etc., together with big data. When such big data is available, it can be combined with cloud-native approaches to manage it well without having to invest too much in infrastructure (CapEx) upfront. For example, Databricks is a popular platform that utilizes Spark for distributed data processing.
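To show what those frameworks automate under the hood, here is a forward pass of a tiny two-layer neural network written by hand in plain NumPy. The layer sizes, weight scales, and batch size are arbitrary illustrative assumptions; frameworks like PyTorch or Keras add automatic differentiation, GPU execution, and training loops on top of exactly this kind of computation.

```python
import numpy as np

# Hand-written forward pass of a tiny two-layer network (what deep
# learning frameworks automate). All shapes/weights are illustrative.
rng = np.random.default_rng(42)

def relu(x):
    return np.maximum(0.0, x)

x = rng.normal(size=(8, 4))            # batch of 8 inputs, 4 features
W1 = rng.normal(size=(4, 16)) * 0.1    # first layer weights
b1 = np.zeros(16)
W2 = rng.normal(size=(16, 3)) * 0.1    # second layer weights
b2 = np.zeros(3)

hidden = relu(x @ W1 + b1)             # affine map + nonlinearity
logits = hidden @ W2 + b2              # affine map to 3 output classes

# softmax: turn logits into a probability distribution per row
probs = np.exp(logits - logits.max(axis=1, keepdims=True))
probs /= probs.sum(axis=1, keepdims=True)
```

The "number of learnable parameters" mentioned below is simply the total size of W1, b1, W2, b2 here; modern deep learning scales this same structure up to millions or billions of parameters.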

Here is a thread that could be interesting for you:

So in conclusion, you basically have two approaches:

Both approaches are possible with neural networks in general, but the architectures, the amount of data, and the number of learnable parameters are completely different.

Hope that helps!

Best regards