Multi-Class Classification videos should be in Course 1 of the DLS

Hi, I don’t know if this is the right place to post this, but couldn’t find the “DLS Feedback” category. I’m working through the Specialization, and finished Course 1 a few weeks ago.

Eager to put what I’d learned into practice, I started coding a neural network library with only numpy and matplotlib, to see if it could handle the MNIST dataset. The thing is, I ran into a lot of trouble because I jumped right in with only what I’d learned from Course 1 of the Specialization, and since Course 2 seemed to be just about tuning hyperparameters and making the network learn faster and more efficiently, I didn’t plan on starting it until I had finished my “homemade” NN library.

After a lot of reading about the softmax function and the multi-class cross-entropy loss, both of which are generalizations of the ones we learned in Course 1, I finally got my neural network working: it could learn the MNIST dataset and correctly classify most of the digits it encountered at test time.
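For anyone curious, the two generalizations mentioned above can be sketched in plain numpy. This is just a minimal illustration (the function names and the tiny example are my own, not from the course or my library):

```python
import numpy as np

def softmax(z):
    # Subtract the row-wise max before exponentiating for numerical stability;
    # this doesn't change the result because softmax is shift-invariant.
    z = z - np.max(z, axis=1, keepdims=True)
    e = np.exp(z)
    return e / np.sum(e, axis=1, keepdims=True)

def categorical_cross_entropy(y_true, y_pred, eps=1e-12):
    # y_true: one-hot labels, shape (m, C); y_pred: softmax outputs, shape (m, C).
    # Clip predictions away from 0 so the log never blows up.
    y_pred = np.clip(y_pred, eps, 1.0)
    return -np.mean(np.sum(y_true * np.log(y_pred), axis=1))

# Tiny example: 2 samples, 3 classes
logits = np.array([[2.0, 1.0, 0.1],
                   [0.5, 2.5, 0.3]])
probs = softmax(logits)               # each row sums to 1
labels = np.array([[1, 0, 0],
                   [0, 1, 0]])        # one-hot ground truth
loss = categorical_cross_entropy(labels, probs)
```

With two classes, the same formulas collapse to the sigmoid and binary cross-entropy from Course 1, which is why they feel like core building blocks rather than an add-on.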

Now, three weeks later, I’m about to finish Course 2 of the Specialization, and imagine my surprise when I saw that the last few videos are all about multi-class classification! IMHO the topic feels out of place relative to the rest of the course, and it would be of greater value to the Specialization as a whole if it were in the first course, so you could get a fuller picture of the classical neural networks you’ll build later on. Without the multi-class lectures, Course 1 doesn’t truly cover the basics, in my opinion: the softmax function and the cross-entropy loss for more than two classes are core building blocks of neural networks, and they shouldn’t be relegated to an epilogue of the second course, which is oriented toward tuning and making your network more efficient.

That being said, I’m loving Course 2 so far, and I’m looking forward to going through the remaining ones in the specialization.


Hi, @rsuriano.

Thank you for the feedback. Glad to hear you’re enjoying the course and running your own experiments.

I’m not sure what the best place to introduce categorical cross-entropy would be, but <spoiler>you’ll still see more loss functions throughout the specialization</spoiler> :slight_smile:

Happy learning!

Hi @nramon, thanks for the answer, looking forward to learning about those new loss functions then :sweat_smile:
