"Building_a_Recurrent_Neural_Network_Step_by_Step" in "2.2 - Forward Pass for LSTM" / "Wy" dims vs "a" dims

Hello,

I am struggling a bit to match the dimensions of “Wy” with the dimensions of “a”.
See the picture attached.

In the assignment “Building_a_Recurrent_Neural_Network_Step_by_Step”,
in “2.2 - Forward Pass for LSTM”:

In my understanding, the dims of “Wy” are found in Python using:
n_y, n_a = parameters["Wy"].shape

but “Wy” is matrix-multiplied (np.dot) with “a” to obtain “y_pred”,

so shouldn’t the dimensions of “a” be (n_a, n_y)?

See this for the shapes and how we name their components:

The shapes of Wy, a, and y are all there.

A matrix of shape (m, n) multiplied by another matrix of shape (n, k) yields a matrix of shape (m, k):

(m, n) x (n, k) → (m, k)
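
For example, here is a minimal numpy sketch of that rule (the sizes below are made up just for illustration; in the assignment n_y, n_a and m come from the data):

import numpy as np

# Hypothetical sizes, only to illustrate the shape rule
n_y, n_a, m = 2, 5, 10

Wy = np.random.randn(n_y, n_a)   # shape (n_y, n_a)
a = np.random.randn(n_a, m)      # shape (n_a, m): one column of hidden state per example

y_pred = np.dot(Wy, a)           # (n_y, n_a) x (n_a, m) -> (n_y, m)
print(y_pred.shape)              # (2, 10)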

Cheers,
Raymond

Hi Raymond, thanks a lot for your very quick response!
Sorry, yes indeed!

  • the number of rows of the 2nd matrix must equal the number of columns of the 1st matrix,
    but
  • the number of columns of the 2nd matrix can be any size, since it is a matrix product; in our case it is m (the number of example sentences we used), as in the sketch below.
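
A minimal sketch of that last point (made-up sizes; only the inner dimension n_a has to match, while m can be anything):

import numpy as np

# Hypothetical sizes; only the inner dimension (n_a) must match
n_y, n_a = 2, 5
Wy = np.random.randn(n_y, n_a)

for m in (1, 4, 64):             # the number of examples m can be any size
    a = np.random.randn(n_a, m)  # one column per example
    print(np.dot(Wy, a).shape)   # (n_y, m) -> (2, 1), (2, 4), (2, 64)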