Are the traditional NLP methods taught in Natural Language Processing with Classification and Vector Spaces and Natural Language Processing with Probabilistic Models still relevant today?

Are the traditional NLP methods taught in Natural Language Processing with Classification and Vector Spaces and Natural Language Processing with Probabilistic Models (n-grams, CBOW) still relevant today, or worth giving your time to, when we know their limitations and they may not be used in production in organizations (or asked about in interviews)? Meanwhile, sequential models, deep learning, and transformers have already grown exponentially, and there is already a lot to learn there.

Hi @faber_soaks

Yes, they are still used: they are cheap to train and customize, they are fast, and if you have a simple problem, that is all you need.
They also make a simple baseline to compare against a full-blown, many-billion-parameter transformer model: how much better is the metric you care about with the giant model?
In other words, there’s no need for a bulldozer if you want to plant a flower in your garden.
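For a concrete picture of what such a baseline can look like, here is a minimal sketch using scikit-learn's TF-IDF n-gram features with logistic regression; the toy `texts` and `labels` are hypothetical placeholders for your own data, not anything from the course.

```python
# A minimal sketch of a "cheap baseline", assuming scikit-learn is installed;
# the toy texts/labels below are hypothetical placeholders.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

texts = ["great movie", "terrible plot", "loved it", "waste of time"]
labels = [1, 0, 1, 0]  # 1 = positive, 0 = negative

# Unigram + bigram TF-IDF features feeding a linear classifier:
# trains in seconds on a laptop and gives a reference score to
# compare against a large transformer on the metric you care about.
baseline = make_pipeline(
    TfidfVectorizer(ngram_range=(1, 2)),
    LogisticRegression(max_iter=1000),
)
baseline.fit(texts, labels)
print(baseline.predict(["loved the plot"]))
```

If a baseline like this already hits the accuracy you need, a transformer may only add cost and latency.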

Cheers

Note that the NLP course is “intermediate” level. It’s not necessarily going to get you a job, but it is a good foundation for the advanced skills and the most recently invented methods.