Building_a_Recurrent_Neural_Network_Step_by_Step: rnn_cell_backward giving wrong output for dWax

I am getting all the correct answers for rnn_cell_backward, except for the dWax line. I am getting:
gradients["dWax"][3][1] = 1.4135354229573462
But the Expected Output lists:
gradients["dWax"][3][1] = 0.410772824935
My code for dWax computation is:
# compute the gradient of the loss with respect to Wax (≈2 lines)
dxt = np.dot(Wax.T, dtanh)
dWax = np.dot(dtanh, dxt.T)

What am I doing wrong here? Thanks.

Hi @neilsikka

dWax is computed as the dot product of dtanh and the transpose of xt:

dWax = np.dot(dtanh, xt.T)

In your code, the second argument should be xt.T, not dxt.T.

Thanks!
Abdelrahman
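To make the shapes concrete, here is a minimal sketch of the corrected computation. The dimension names (n_x, n_a, m) and variable names (xt, Wax, dtanh) follow the assignment's conventions; the random values are placeholders, so only the shapes are meaningful.

```python
import numpy as np

# Assumed shapes from the assignment:
# xt is (n_x, m), dtanh is (n_a, m), Wax is (n_a, n_x).
np.random.seed(1)
n_x, n_a, m = 3, 5, 10
xt = np.random.randn(n_x, m)
Wax = np.random.randn(n_a, n_x)
dtanh = np.random.randn(n_a, m)   # stands in for da_next * (1 - a_next**2)

dxt = np.dot(Wax.T, dtanh)        # (n_x, m): gradient w.r.t. the input
dWax = np.dot(dtanh, xt.T)        # (n_a, n_x): dtanh times xt transposed

# A gradient must always have the same shape as the parameter it updates,
# which is a quick sanity check for this kind of bug:
assert dWax.shape == Wax.shape
assert dxt.shape == xt.shape
```

Note that using dxt.T as the second factor gives a (n_a, n_a) result in general, which only coincidentally matches shapes when n_x == n_a, so a shape assertion like the one above catches the mistake early.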