That is not part of the course, and it is not required to understand it. I’ve skimmed some of that material, but it was several years ago, so I won’t be able to answer without spending an hour refreshing my memory, and I’m afraid that wouldn’t be a good use of my time at the moment.

Jonas goes through everything in quite a bit of detail. If you really want to understand it, I suggest you go through it all again and make sure you understand in detail the notation he is using and how he is representing everything. Note that matrix multiplication is not commutative in general, but the dot product of two 1D vectors is commutative. I’m not sure whether that is relevant in this context, since I have not studied his notation.
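To make the commutativity point concrete, here is a quick NumPy check (the specific matrices and vectors are just illustrative values I picked):

```python
import numpy as np

# Matrix multiplication is not commutative in general: A @ B != B @ A.
A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
B = np.array([[0.0, 1.0],
              [1.0, 0.0]])
print(np.allclose(A @ B, B @ A))   # False for these matrices

# The dot product of two 1D vectors, by contrast, is commutative.
v = np.array([1.0, 2.0, 3.0])
w = np.array([4.0, 5.0, 6.0])
print(np.dot(v, w) == np.dot(w, v))  # True
```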

A couple of high level points:

- Prof Ng simply gives you the formulas for back propagation, so you can use them as is.
- By the end of DLS C2, we switch to using TensorFlow and no longer have to implement back propagation ourselves: the platform takes care of that for us.
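As an illustration of the first point, here is a minimal NumPy sketch of the given back-propagation formulas for the logistic regression case from C1 Week 2. The variable names (`w`, `b`, `X`, `Y`, `m`) follow the course convention, but the example inputs are mine and this is not the assignment code:

```python
import numpy as np

def propagate(w, b, X, Y):
    """One forward/backward pass for logistic regression,
    using the formulas exactly as given in the lectures."""
    m = X.shape[1]
    A = 1.0 / (1.0 + np.exp(-(w.T @ X + b)))  # forward: sigmoid(w.T X + b)
    dZ = A - Y                                 # backward: given formula dZ = A - Y
    dw = (X @ dZ.T) / m                        # dw = (1/m) X dZ.T
    db = np.sum(dZ) / m                        # db = (1/m) sum(dZ)
    return dw, db

# Toy data: 2 features, 2 examples (illustrative values only).
w = np.zeros((2, 1))
b = 0.0
X = np.array([[1.0, -1.0],
              [2.0, 0.5]])
Y = np.array([[1.0, 0.0]])
dw, db = propagate(w, b, X, Y)
```

The point is just that you plug the formulas in mechanically; no calculus is needed to apply them.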

We are not required to understand calculus in order to succeed here. If you do want to go through the full derivations, then besides the material on Jonas’s website, the other approach is to check out the links on this thread.