Backpropagation algorithm derivation

While deriving dW, db, and dZ for the different layers of an n-layer neural network, I sometimes get confused about:

a) which matrix in the resulting expression is transposed and which is not

b) where a matrix (dot) product is used and where an element-wise product is used

Please explain this thoroughly, as it may keep troubling me in the future, and please also share any references on this topic.
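For concreteness, these are the kinds of results I mean, writing the usual backward-pass equations for a layer l and assuming the column-per-example convention where A^[l-1] has shape (n_{l-1}, m):

$$
\begin{aligned}
dZ^{[l]}   &= dA^{[l]} \circ g^{[l]\prime}\!\left(Z^{[l]}\right) && \text{(element-wise product)} \\
dW^{[l]}   &= \tfrac{1}{m}\, dZ^{[l]} \left(A^{[l-1]}\right)^{T} && \text{(matrix product)} \\
db^{[l]}   &= \tfrac{1}{m} \textstyle\sum_{i=1}^{m} dZ^{[l](i)} && \text{(sum over the } m \text{ examples)} \\
dA^{[l-1]} &= \left(W^{[l]}\right)^{T} dZ^{[l]} && \text{(matrix product)}
\end{aligned}
$$

What I don't see is why the transpose ends up on A^[l-1] in dW^[l] but on W^[l] in dA^[l-1], and why g' enters element-wise rather than as a matrix.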

You need some background in matrix calculus, since that's what is being done here. Here's a thread with links to derivations of backpropagation and background material on matrix calculus.
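As a practical sanity check on both (a) and (b): the transposes are whatever makes the shape of dW^[l] match W^[l] and the shape of dA^[l-1] match A^[l-1], while g' always enters element-wise because the activation is applied entry by entry. Below is a minimal NumPy sketch of one forward/backward pass, assuming sigmoid activations and the column-per-example convention; the layer sizes and data are made up purely for illustration.

```python
import numpy as np

# Minimal sketch: one backward pass for a 2-layer network.
# Assumptions: sigmoid activations, cross-entropy loss, and the
# column-per-example convention (each column of X is one example).
rng = np.random.default_rng(0)
n_x, n_h, n_y, m = 4, 3, 1, 5                    # layer sizes and batch size (illustrative)

X = rng.standard_normal((n_x, m))
Y = rng.integers(0, 2, size=(n_y, m))

W1 = rng.standard_normal((n_h, n_x)); b1 = np.zeros((n_h, 1))
W2 = rng.standard_normal((n_y, n_h)); b2 = np.zeros((n_y, 1))

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Forward pass
Z1 = W1 @ X + b1;  A1 = sigmoid(Z1)              # (n_h, m)
Z2 = W2 @ A1 + b2; A2 = sigmoid(Z2)              # (n_y, m)

# Backward pass: each transpose is forced by the shape the result must have.
dZ2 = A2 - Y                                     # (n_y, m), sigmoid + cross-entropy case
dW2 = (dZ2 @ A1.T) / m                           # (n_y, m)(m, n_h) -> (n_y, n_h), same as W2
db2 = dZ2.sum(axis=1, keepdims=True) / m         # (n_y, 1), same as b2
dA1 = W2.T @ dZ2                                 # (n_h, n_y)(n_y, m) -> (n_h, m), same as A1
dZ1 = dA1 * A1 * (1 - A1)                        # element-wise: sigmoid'(Z1) = A1*(1-A1)
dW1 = (dZ1 @ X.T) / m                            # (n_h, m)(m, n_x) -> (n_h, n_x), same as W1
db1 = dZ1.sum(axis=1, keepdims=True) / m         # (n_h, 1), same as b1

assert dW1.shape == W1.shape and dW2.shape == W2.shape
```

If you write down the shapes of W^[l], A^[l-1], and dZ^[l] before multiplying, there is only one way to place the transpose so the dimensions line up, which is a quick way to double-check a derivation even before working through the matrix calculus.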