You can examine the source for relu_backward by using “inspect” or you can click “File → Open” and have a look around (there’s a topic about that on the FAQ Thread). But the top-level point is that just because the error gets thrown in relu_backward, that does not mean that’s where the bug is. What it means is that your code passed parameters down to relu_backward whose dimensions do not match. Now you need to figure out why. Print the shapes of the objects at the call site. Are they what you would expect from the “dimensional analysis”? At what layer does the error get thrown? What are the shapes of Z and dA at that layer? What are your shapes?
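As a concrete sketch of that kind of call-site check (the `relu_backward` below is a minimal assumed version of the course-style function, not the actual course code): the key invariant is that `dA` and the cached `Z` for a layer must have identical shapes, so print and assert them right before the call.

```python
import numpy as np

def relu_backward(dA, Z):
    # Minimal sketch of a ReLU backward step (assumed signature:
    # dA is the gradient flowing in, Z is the cached pre-activation).
    dZ = np.array(dA, copy=True)
    dZ[Z <= 0] = 0.0   # ReLU passes gradient only where Z > 0
    return dZ

# Hypothetical layer: 4 units, 5 examples.
Z = np.random.randn(4, 5)    # cached pre-activation for this layer
dA = np.random.randn(4, 5)   # gradient coming back into this layer

# Call-site debugging: the two shapes must match, or the
# elementwise indexing inside relu_backward will throw.
print("dA shape:", dA.shape, "Z shape:", Z.shape)
assert dA.shape == Z.shape, "wrong dA was passed down to this layer"

dZ = relu_backward(dA, Z)
print("dZ shape:", dZ.shape)
```

If the assert fires, the bug is upstream: whatever computed that `dA` handed this layer a gradient with the wrong dimensions.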
Actually, we can see the bug in the code shown in the exception trace. You are in the for loop over the hidden layers, so what value of dA should you be passing in that case?
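To illustrate the point with a toy sketch (hypothetical layer sizes and a simplified all-ReLU backward step, not the actual course code): inside the hidden-layer loop, the dA you pass must be the one produced by the layer above on the previous iteration, not the output-layer gradient dAL. Reusing dAL in every iteration is exactly the kind of bug that makes shapes mismatch a few layers down.

```python
import numpy as np

np.random.seed(0)

# Hypothetical toy network: layer sizes 3 -> 4 -> 2 -> 1, with 5 examples.
layer_dims = [3, 4, 2, 1]
m = 5
L = len(layer_dims) - 1   # number of layers

# Fabricated forward caches, just to exercise the shapes.
A = {0: np.random.randn(layer_dims[0], m)}
W, Z = {}, {}
for l in range(1, L + 1):
    W[l] = np.random.randn(layer_dims[l], layer_dims[l - 1]) * 0.1
    Z[l] = W[l] @ A[l - 1]
    A[l] = np.maximum(0, Z[l])

def relu_linear_backward(dA, W_l, Z_l, A_prev):
    # One backward step (sketch): ReLU backward, then linear backward.
    assert dA.shape == Z_l.shape, f"dA {dA.shape} vs Z {Z_l.shape}"
    dZ = np.where(Z_l > 0, dA, 0.0)
    dW = dZ @ A_prev.T / m
    db = dZ.sum(axis=1, keepdims=True) / m
    dA_prev = W_l.T @ dZ      # gradient to hand to the layer below
    return dA_prev, dW, db

grads = {}
dAL = np.random.randn(layer_dims[L], m)   # gradient at the output layer

# Output layer first: this is the only place dAL itself belongs.
grads[f"dA{L-1}"], grads[f"dW{L}"], grads[f"db{L}"] = \
    relu_linear_backward(dAL, W[L], Z[L], A[L - 1])

# Hidden-layer loop: pass the dA stored by the PREVIOUS iteration.
# Passing dAL here instead would trip the shape assert, because dAL
# only matches the output layer's dimensions.
for l in reversed(range(1, L)):
    dA_in = grads[f"dA{l}"]   # correct: gradient from the layer above
    grads[f"dA{l-1}"], grads[f"dW{l}"], grads[f"db{l}"] = \
        relu_linear_backward(dA_in, W[l], Z[l], A[l - 1])

print({k: v.shape for k, v in grads.items()})
```

Swapping `dA_in` for `dAL` in the loop reproduces the symptom described above: the exception is raised inside the backward step, but the real bug is the wrong argument at the call site.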