Backpropagation not working accurately

(Apologies in advance for the symbols. I don't have any background in mathematics, and all the dLs and dZs confuse me a bit.)

I am trying to work through the computation manually (in Excel) so that the concepts become clear.

My neural network is:

Z1 = (W * X) + b

A1 = tanh(Z1)
(I am skipping the second linear step, Z2 = W2 * A1 + b2, to keep it simple)

A2 = sigmoid(A1), i.e. A2 = 1 / (1 + exp(-A1))

Cost = -((Y * log(A2)) + ((1 - Y) * log(1 - A2)))

Loss = sum(Cost) / m (the average over my m examples)
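To make the formulas concrete, here is the same forward pass as a small NumPy sketch (the shapes and numbers are made up for illustration; my real values live in the Excel sheet):

```python
import numpy as np

# Made-up data: 2 features, m = 3 examples (my real numbers are in Excel)
X = np.array([[0.5, -1.2, 0.3],
              [1.0,  0.4, -0.7]])   # shape (2, 3)
Y = np.array([[1.0, 0.0, 1.0]])     # shape (1, 3)
W = np.array([[0.1, -0.2]])         # shape (1, 2)
b = np.zeros((1, 1))
m = X.shape[1]

Z1 = W @ X + b                       # linear step, shape (1, 3)
A1 = np.tanh(Z1)                     # tanh activation
A2 = 1.0 / (1.0 + np.exp(-A1))       # sigmoid applied directly to A1
cost = -(Y * np.log(A2) + (1 - Y) * np.log(1 - A2))  # per-example cost
loss = np.sum(cost) / m              # Loss = sum(Cost) / m
```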

I am doing backpropagation using the following steps:

dL/dA2 = (-Y / A2) + ((1 - Y) / (1 - A2))

dL/dA1 = dL/dA2 * dA2/dA1, i.e. dL/dA2 * A2 * (1 - A2)

dL/dZ1 = dL/dA1 * dA1/dZ1, i.e. dL/dA1 * sech(A1)

dW = dL/dZ1 * X.T

db = sum(dL/dZ1) / 3
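In code form, the backward steps above would look like this, continuing from the forward-pass sketch. The lines mirror my steps one-to-one, including the sech(A1) term, which is the part I am least sure about:

```python
# Backward pass, mirroring my steps above (continues the forward sketch)
dA2 = (-Y / A2) + ((1 - Y) / (1 - A2))   # dL/dA2
dA1 = dA2 * A2 * (1 - A2)                # dL/dA1, using sigmoid'(A1) = A2 * (1 - A2)
dZ1 = dA1 * (1.0 / np.cosh(A1))          # my step: dA1/dZ1 = sech(A1), unsure about this
dW = dZ1 @ X.T                           # gradient w.r.t. W
db = np.sum(dZ1) / 3                     # summed over my 3 examples
```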

Are these backpropagation steps correct? I ask because when I run the computation in Excel, the results are off.

Hi @Harshawardhan_Deshpa

In your backpropagation steps, the mistake is in the calculation of dL/dZ1. The derivative of tanh(Z1) with respect to Z1 is sech^2(Z1), which equals 1 - tanh^2(Z1); your formula applies sech to A1 rather than Z1, and drops the square.

Since A1 = tanh(Z1), the correct formula is dL/dZ1 = dL/dA1 * (1 - A1^2).
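If you want to double-check your Excel numbers, here is a minimal self-contained NumPy sketch with the corrected derivative, plus a finite-difference check on one weight. The values are illustrative; note that I fold the 1/m from the averaged loss into dA2 at the start, so that dW and db are both scaled consistently:

```python
import numpy as np

# Illustrative data: 2 features, m = 3 examples
X = np.array([[0.5, -1.2, 0.3],
              [1.0,  0.4, -0.7]])
Y = np.array([[1.0, 0.0, 1.0]])
W = np.array([[0.1, -0.2]])
b = np.zeros((1, 1))
m = X.shape[1]

def loss_fn(W):
    """Full forward pass to the averaged loss, for numerical checking."""
    A2 = 1.0 / (1.0 + np.exp(-np.tanh(W @ X + b)))
    return np.sum(-(Y * np.log(A2) + (1 - Y) * np.log(1 - A2))) / m

# Forward pass
Z1 = W @ X + b
A1 = np.tanh(Z1)
A2 = 1.0 / (1.0 + np.exp(-A1))

# Backward pass, with the 1/m from the averaged loss folded in at the start
dA2 = (-Y / A2 + (1 - Y) / (1 - A2)) / m
dA1 = dA2 * A2 * (1 - A2)        # sigmoid'(A1) = A2 * (1 - A2)
dZ1 = dA1 * (1 - A1 ** 2)        # tanh'(Z1) = 1 - tanh(Z1)^2 = 1 - A1^2
dW = dZ1 @ X.T
db = np.sum(dZ1, axis=1, keepdims=True)

# Finite-difference check on W[0, 0]: the two numbers should agree closely
eps = 1e-6
W_eps = W.copy()
W_eps[0, 0] += eps
print("analytic:", dW[0, 0], " numeric:", (loss_fn(W_eps) - loss_fn(W)) / eps)
```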

Hope this helps!

Thanks
