Trouble with lstm_cell_backward function


I’m unable to find my mistake.
Here is the error message:

in lstm_cell_backward(da_next, dc_next, cache)
41 # Compute parameters related derivatives. Use equations (11)-(18) (≈8 lines)
---> 42 dWf =, np.transpose([[a_prev], [xt]]))
43 dWi =, np.transpose([[a_prev], [xt]]))
44 dWc =, np.transpose([[a_prev], [xt]]))

<array_function internals> in dot(*args, **kwargs)
ValueError: shapes (5,10) and (2,) not aligned: 10 (dim 1) != 2 (dim 0)

dWf is wrong but I don’t know why!

Can you give me a way to understand this, please?
Thank you
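For what it’s worth, the “shapes not aligned” ValueError can be reproduced in isolation. `np.dot(A, B)` requires `A.shape[1] == B.shape[0]`, and the traceback shows a `(5, 10)` array multiplied by a `(2,)` operand. The `(2,)` shape most likely appears because `[[a_prev], [xt]]` is just a Python list of two arrays, not one stacked matrix. A minimal sketch (shapes taken from the traceback; nothing here is the hidden assignment code):

```python
import numpy as np

# The traceback reports a (5, 10) array dotted with a (2,) operand.
A = np.random.randn(5, 10)
B = np.random.randn(2)

try:
    np.dot(A, B)  # inner dimensions 10 and 2 do not match
except ValueError as e:
    print(e)  # shapes (5,10) and (2,) not aligned: 10 (dim 1) != 2 (dim 0)
```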

Hi @CyrilleB

You need to ensure that the shapes of the arrays being multiplied are compatible. Check the dimensions of dft , [a_prev] , and [xt] , and ensure they align properly for matrix multiplication. You can do this by using .shape to print and see if they can be multiplied or not. It’s possible that the shapes of a_prev and xt need to be adjusted or that the transposition is not producing the desired shape.


I did what you suggested. Here are the shapes:
dft : (5, 10)
a_prev : (5, 10)
xt : (3, 10)

I think I don’t fully understand what equation (11) in the course means for this lstm_cell_backward function.
Can you give me more information, please?
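For reference, the weight gradient in this kind of derivation is usually written as the gate’s gradient times the transpose of the stacked input, something like the following (this is a common textbook form and may correspond to equation (11), but please check it against the course notes):

```latex
dW_f = d\gamma_f^{\langle t \rangle}
       \begin{bmatrix} a^{\langle t-1 \rangle} \\ x^{\langle t \rangle} \end{bmatrix}^{T}
```

Shape-wise, $d\gamma_f^{\langle t \rangle}$ is $(n_a, m)$ and the stacked vector is $(n_a + n_x, m)$, so the product is $(n_a, n_a + n_x)$, which matches the shape of $W_f$.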

I found part of the answer to my question…

I should use np.concatenate!

I’m continuing my quest to find what is wrong, because now I get this error message after using np.concatenate((a_prev, xt), axis=1).T:

ValueError: all the input array dimensions for the concatenation axis must match exactly, but along dimension 0, the array at index 0 has size 5 and the array at index 1 has size 3

I understand what it means, but how and why should the sizes change? n_a and n_x can be different…
Can you help me, please?
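One way to see it: since a_prev is (n_a, m) and xt is (n_x, m), the two arrays agree along the batch axis (axis 1), not along axis 0, so stacking must happen along axis 0. A minimal sketch with the shapes reported in this thread (random placeholder arrays, not the assignment’s code):

```python
import numpy as np

n_a, n_x, m = 5, 3, 10               # sizes reported earlier in this thread
a_prev = np.random.randn(n_a, m)
xt = np.random.randn(n_x, m)

# axis=1 would require the first dimensions (5 vs 3) to match -> ValueError.
# axis=0 stacks the rows, which is what the vertical stack [a_prev; xt] means:
concat = np.concatenate((a_prev, xt), axis=0)
print(concat.shape)                  # (8, 10) == (n_a + n_x, m)

# A (n_a, m) gate gradient dotted with concat.T then has shape
# (n_a, n_a + n_x), i.e. the shape of the weight matrix.
dgate = np.random.randn(n_a, m)      # placeholder gate gradient
print(np.dot(dgate, concat.T).shape) # (5, 8)
```

So n_a and n_x never have to be equal; they only add up along the stacking axis.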

Hi @CyrilleB ,

The issue is still with the dimensions of a_prev and xt when you try to concatenate them. Make sure a_prev and xt have compatible shapes for concatenation along the correct axis (feel free to print their shapes with .shape, and think about what each dimension represents, to ensure proper alignment).

If the issue persists, feel free to share your code with me in a private message (not here!) so we can work it out together!