Can anyone please help me with my queries:
1. In DNN backpropagation, and here as well, what has always confused me is how to decide which kind of multiplication to use: elementwise or dot product? (Does it have to do with derivatives of matrices using the Jacobian matrix? Can you please explain it in a detailed and intuitive way?)
2. Regarding the dz that is mentioned in the lab notebook: isn't dz the same as dtanh?
If $Z = W_{ax} x^{\langle t \rangle} + W_{aa} a^{\langle t-1 \rangle} + b_a$, then is $\frac{dJ}{dZ} = da_{next} * \left(1 - \tanh^2\!\left(W_{ax} x^{\langle t \rangle} + W_{aa} a^{\langle t-1 \rangle} + b_a\right)\right)$?
At the risk of stating the obvious, we're taking derivatives of functions, so it depends on what the functions are, right? Some of them are "elementwise" operations (e.g. the application of an activation function) and some of them involve dot products.
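To make that concrete, here is a minimal numpy sketch of a single RNN cell backward step, assuming the same variable names the lab uses (da_next, Wax, Waa, etc.). The tanh derivative is applied elementwise because tanh acts on each entry of Z independently, while the gradients of the weight matrices use dot products because Z itself is built from matrix products:

```python
import numpy as np

# Hypothetical shapes: n_a hidden units, n_x input features, m examples
n_a, n_x, m = 5, 3, 10
x_t     = np.random.randn(n_x, m)   # x^<t>
a_prev  = np.random.randn(n_a, m)   # a^<t-1>
Wax     = np.random.randn(n_a, n_x)
Waa     = np.random.randn(n_a, n_a)
ba      = np.random.randn(n_a, 1)
da_next = np.random.randn(n_a, m)   # dJ/da^<t>, arriving from later steps / the cost

# Forward: Z involves dot products, tanh is elementwise
Z      = Wax @ x_t + Waa @ a_prev + ba
a_next = np.tanh(Z)

# Backward:
# dJ/dZ = da_next * (1 - tanh(Z)^2)  -> elementwise, because tanh is elementwise
dtanh = da_next * (1 - a_next ** 2)  # the quantity the notebook calls dz / dtanh

# Gradients of the weights -> dot products, because Z = Wax @ x_t + Waa @ a_prev + ba
dWax    = dtanh @ x_t.T
dWaa    = dtanh @ a_prev.T
dba     = np.sum(dtanh, axis=1, keepdims=True)
dx_t    = Wax.T @ dtanh
da_prev = Waa.T @ dtanh
```

The intuition behind the elementwise case is that the Jacobian of an elementwise function is diagonal, so multiplying by it collapses to an elementwise product with the upstream gradient rather than a full matrix multiply.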
Matrix calculus is really beyond the scope of these courses, but here's a thread with links to background material if you want to go deeper.
Note that in all cases beyond Course 2, we are using TF so we don't actually have to deal with any of the derivatives ourselves: backprop is all just magically handled for us "under the covers" by TF (or PyTorch or whatever your ML platform of choice is).
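For illustration, here is a tiny TensorFlow sketch (the toy layer and shapes are made up) showing that you never write the backward pass yourself: GradientTape records the forward computation and tape.gradient gives you the gradients via the chain rule.

```python
import tensorflow as tf

# Toy example: one dense layer with a tanh activation; TF does backprop for us.
W = tf.Variable(tf.random.normal((4, 3)))
b = tf.Variable(tf.zeros((4, 1)))
x = tf.random.normal((3, 8))
y = tf.random.normal((4, 8))

with tf.GradientTape() as tape:
    z = tf.matmul(W, x) + b           # dot product
    a = tf.tanh(z)                    # elementwise activation
    loss = tf.reduce_mean((a - y) ** 2)

# No manual dtanh / dz anywhere: TF applies the chain rule under the covers.
dW, db = tape.gradient(loss, [W, b])
```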