I see, so you are asking about “backward propagation” (backpropagation).
For the maths part, the post linked above has actually worked through it once. The 1st line in the 2nd photo shows the gradient of a particular weight w_1 in the hidden layer, calculated with the chain rule, which essentially multiplies several terms together. The 1st photo gives the meaning of the symbols used in the 2nd photo. However, in the 1st line of the 2nd photo I only showed two of the five partial derivatives; if you like maths and know differentiation, it should be pretty easy for you to derive the other three.
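To make the chain rule concrete, here is a minimal sketch in plain Python. It assumes a tiny 1-input, 1-hidden-unit, 1-output network with sigmoid activations and a squared-error loss; these choices (and all the variable names and values) are my own illustration, not necessarily the exact setup in the photos. The gradient of the hidden-layer weight w1 is the product of five partial derivatives, and a finite-difference check confirms the result:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Hypothetical tiny network, chosen only to illustrate the
# five-factor chain rule; the course's network may differ.
x, y = 0.5, 1.0       # input and target
b1 = 0.1              # hidden-layer bias
w2, b2 = -0.2, 0.05   # output-layer weight and bias
w1 = 0.3              # the hidden-layer weight we differentiate

def forward(w1):
    """Forward pass; returns intermediates and the loss."""
    z1 = w1 * x + b1
    a1 = sigmoid(z1)
    z2 = w2 * a1 + b2
    a2 = sigmoid(z2)
    loss = 0.5 * (a2 - y) ** 2   # squared-error loss (an assumption)
    return z1, a1, z2, a2, loss

z1, a1, z2, a2, loss = forward(w1)

# The five partial derivatives in the chain:
# dL/dw1 = dL/da2 * da2/dz2 * dz2/da1 * da1/dz1 * dz1/dw1
dL_da2  = a2 - y
da2_dz2 = a2 * (1 - a2)   # derivative of sigmoid at z2
dz2_da1 = w2
da1_dz1 = a1 * (1 - a1)   # derivative of sigmoid at z1
dz1_dw1 = x
dL_dw1 = dL_da2 * da2_dz2 * dz2_da1 * da1_dz1 * dz1_dw1

# Sanity check: central finite difference on the loss
eps = 1e-6
numeric = (forward(w1 + eps)[-1] - forward(w1 - eps)[-1]) / (2 * eps)
print(dL_dw1, numeric)   # the two values should agree closely
```

Each factor in the product corresponds to one step of the forward pass, which is why the chain rule "walks backward" through the network.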
So, if you like maths and know differentiation, I would suggest you read that post; otherwise, you may read this post by one of the course’s teaching staff, which delivers the idea of backpropagation without too much maths.
Raymond