At the risk of stating the obvious, we’re taking derivatives of functions, so the form of each derivative depends on what the function is, right? Some of them are “elementwise” operations (e.g. the application of an activation function) and some of them involve dot products, which is why some gradient formulas use elementwise products and others use matrix products with transposes.
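Here’s a minimal NumPy sketch of that distinction, using a hypothetical one-layer forward pass (the shapes and variable names are made up for illustration, and I’ve left out the usual `1/m` averaging over the batch):

```python
import numpy as np

# Hypothetical tiny layer: Z = W @ A_prev + b, then A = sigmoid(Z)
rng = np.random.default_rng(0)
A_prev = rng.standard_normal((4, 3))   # 4 input features, batch of 3
W = rng.standard_normal((2, 4))
b = rng.standard_normal((2, 1))

Z = W @ A_prev + b
A = 1.0 / (1.0 + np.exp(-Z))           # elementwise sigmoid

dA = rng.standard_normal(A.shape)      # gradient arriving from the next layer

# Elementwise op: its derivative is also elementwise (a Hadamard product),
# since sigmoid'(Z) = A * (1 - A)
dZ = dA * A * (1.0 - A)

# Dot product: its derivatives are matrix products involving transposes
dW = dZ @ A_prev.T                     # same shape as W
dA_prev = W.T @ dZ                     # same shape as A_prev, passed further back
```

Notice how the shapes work out automatically: `dW` matches `W` and `dA_prev` matches `A_prev`, which is a good sanity check when you write backprop by hand in Course 1.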
Matrix calculus is really beyond the scope of these courses, but here’s a thread with links to background material if you want to go deeper.
Note that in all the courses beyond Course 2, we’re using TF, so we don’t actually have to deal with any of the derivatives ourselves: backprop is all just magically handled for us “under the covers” by TF (or PyTorch or whatever your ML platform of choice is).
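Just to make that “magic” concrete, here’s a tiny sketch with TF’s `tf.GradientTape` (a made-up scalar function, purely for illustration): you write only the forward computation, and TF differentiates it for you.

```python
import tensorflow as tf

x = tf.Variable(3.0)
with tf.GradientTape() as tape:
    y = x ** 2 + 2.0 * x       # forward pass only -- no manual derivative

dy_dx = tape.gradient(y, x)    # TF computes dy/dx = 2x + 2 for us
print(float(dy_dx))            # 8.0 at x = 3
```

The same idea scales up to full networks: `model.fit()` wraps this tape-and-gradient machinery, which is why you never write a `backward()` function in the TF assignments.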