In Course 1 we’re not explicitly taught the math behind NNs and why Z = WX + b, etc. Are we expected to learn this on our own, or will it be taught in later courses in this specialization?
Hi, @Narayan. The math that you describe is mostly elementary linear algebra with some probability and statistics. There is some coverage of this in the videos (e.g., matrix multiplication and addition) as well as in the textual material accompanying the notebook exercises. Without foreknowledge of these concepts, the going will be rougher, as more of your time will be absorbed by independent research and pencil-and-paper practice of the mathematical methods. Course 1 is your best chance to learn these.
Thanks for the quick reply! I know all of the concepts that you’ve mentioned above. What I wanted to know is why we multiply W and X. How did we derive these equations? Is there another way to write them? Basically, I’m curious about the workings of neurons and how exactly the mathematical functions interact with the image data that we’re training on.
Got it! The equations that comprise a feedforward network (aka “multi-layer perceptron”) are not so much “derived” as they are architectural choices with a long intellectual history. An appreciation of their development from the perspective of Course 1 would require a bit of supplemental reading. Be sure to read the optional material and even listen to the interviews with the various influential researchers.
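To make the mechanics concrete, here is a minimal numpy sketch (my own toy example, not code from the course; the shapes and random numbers are made up) of what Z = WX + b is doing when X holds flattened image pixels. Each row of W is one neuron's weights, so the matrix product is just a compact way of saying "each neuron takes a weighted sum over all the input pixels, then adds its bias":

```python
import numpy as np

# Toy single layer acting on a batch of flattened images.
# Convention (as in Course 1): each COLUMN of X is one example.

n_x = 12288   # a 64 x 64 x 3 image flattened into a column vector
n_h = 4       # number of neurons in this layer
m = 10        # number of examples in the batch

X = np.random.rand(n_x, m)             # each column: one image's pixel values
W = np.random.randn(n_h, n_x) * 0.01   # one row of weights per neuron
b = np.zeros((n_h, 1))                 # one bias per neuron

# Z[i, j] = (neuron i's weights) dot (example j's pixels) + neuron i's bias;
# b broadcasts across the m columns.
Z = W @ X + b                          # shape (n_h, m)

# A nonlinearity (here sigmoid) turns weighted sums into activations.
A = 1 / (1 + np.exp(-Z))
print(Z.shape, A.shape)                # (4, 10) (4, 10)
```

So the multiplication isn't derived from anything; it's simply the linear-algebra shorthand for taking many weighted sums at once, and the weights themselves are what training adjusts.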
I found the book Hands-On Machine Learning with Scikit-Learn, Keras & TensorFlow by Aurélien Géron (2nd edition, updated for TensorFlow 2) to be illuminating in this regard. The introductory material in Chapter 1 provides a useful overview. Part II (Neural Networks and Deep Learning) begins with some material on the historical development of the multi-layer perceptron.
That said, the remainder of the Specialization will provide lots of motivation for, and explanation of, the architectural choices of the various deep learning models that you will encounter. By the end of the Specialization, you will hopefully have a deep (pun intended) appreciation for these choices. Be patient – it takes time and effort for the concepts to gel into a meaningful whole.
Thanks! I was just unsure whether I was jumbling up the way I should be learning this stuff.
Interesting take on mathematics related to neural nets.
Very interesting! Thanks for sharing that link.
Thanks! I worked on a linear regression college enrollment study way back after I graduated. Since I retired, I am doing the lifelong learning thing, auditing classes that look interesting.
Kermit Paulos
Arlington, Virginia