Week 1 - What is the weight W_{aa}?

Waa – Weight matrix multiplying the hidden state,
numpy array of shape (n_a, n_a)

What does this mean? Where would this matrix sit if I try to visualize an RNN the way I visualize a feedforward neural network?

Hey there @deepakjangra

I know you are still a bit confused, but the concepts of feedforward NNs and RNNs are quite different!

In the RNNs mentioned, W_{aa} is the weight matrix that maps the previous hidden state to the next one. It has shape (n_a, n_a), where n_a is the number of units in the hidden layer. To see where it fits, walk through the forward pass of a single cell:

  1. Input x^{(t)} and previous hidden state a^{(t-1)} are fed into the RNN cell.

  2. The hidden state a^{(t)} is calculated using the weight matrices W_{aa}, W_{ax}, and the bias b_a followed by the tanh activation function.

  3. The output \hat{y}^{(t)} is computed by applying the weight matrix W_{ya} and the bias b_y to the hidden state a^{(t)}, followed by the softmax function.
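The three steps above can be sketched in numpy. This is a minimal illustration, not the course's assignment code; the function name `rnn_cell_step` and the toy dimensions are my own, but the shapes and roles of W_{aa}, W_{ax}, W_{ya}, b_a, and b_y follow the description:

```python
import numpy as np

def rnn_cell_step(x_t, a_prev, W_aa, W_ax, W_ya, b_a, b_y):
    # Step 2: new hidden state from previous state (via W_aa)
    # and current input (via W_ax), passed through tanh.
    a_t = np.tanh(W_aa @ a_prev + W_ax @ x_t + b_a)
    # Step 3: output logits from the hidden state, then softmax.
    z = W_ya @ a_t + b_y
    y_hat = np.exp(z - np.max(z)) / np.sum(np.exp(z - np.max(z)))
    return a_t, y_hat

# Toy sizes (chosen for illustration): n_a hidden units, n_x inputs, n_y outputs.
n_a, n_x, n_y = 5, 3, 2
rng = np.random.default_rng(0)
W_aa = rng.standard_normal((n_a, n_a))  # (n_a, n_a): hidden -> hidden
W_ax = rng.standard_normal((n_a, n_x))  # (n_a, n_x): input  -> hidden
W_ya = rng.standard_normal((n_y, n_a))  # (n_y, n_a): hidden -> output
b_a = np.zeros((n_a, 1))
b_y = np.zeros((n_y, 1))

x_t = rng.standard_normal((n_x, 1))     # input at time t
a_prev = np.zeros((n_a, 1))             # initial hidden state
a_t, y_hat = rnn_cell_step(x_t, a_prev, W_aa, W_ax, W_ya, b_a, b_y)
print(a_t.shape, y_hat.shape, y_hat.sum())
```

Note that W_{aa} is square precisely because it maps a vector of size n_a (the old hidden state) to another vector of size n_a (the new one); that is the recurrence that a plain feedforward network does not have.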

Feel free to ask if you need further help!