Rnn_forward W1 UNQ_C2

Here is my loop over all the time-steps:

for t in range(T_x):
    # Update next hidden state, compute the prediction, get the cache (≈1 line)
    a_next, yt_pred, cache = rnn_cell_forward(x[:,:,t], a_next, parameters_tmp)
    # Save the value of the new "next" hidden state in a (≈1 line)
    a[:,:,t] = a_next
    # Save the value of the prediction in y (≈1 line)
    y_pred[:,:,t] = yt_pred
    # Append "cache" to "caches" (≈1 line)
    caches.append(cache)
### END CODE HERE ###

Please advise on this part, as it produces the following error:
~/work/W1A1/public_tests.py in rnn_forward_test(target)
     69 parameters_tmp['by'] = np.random.randn(n_y, 1)
     70
---> 71 a, y_pred, caches = target(x_tmp, a0_tmp, parameters_tmp)
     72
     73 assert a.shape == (n_a, m, T_x), f"Wrong shape for a. Expected: ({n_a, m, T_x}) != {a.shape}"

in rnn_forward(x, a0, parameters)
     41     for t in range(T_x):
     42         # Update next hidden state, compute the prediction, get the cache (≈1 line)
---> 43         a_next, yt_pred, cache = rnn_cell_forward(x[:,:,t], a_next, parameters_tmp)
     44         # Save the value of the new "next" hidden state in a (≈1 line)
     45         a[:,:,t] = a_next

in rnn_cell_forward(xt, a_prev, parameters)
     30     ### START CODE HERE ### (≈2 lines)
     31     # compute next activation state using the formula given above
---> 32     a_next = np.tanh(np.dot(Wax,xt) + np.dot(Waa,a_prev) + ba)
     33     # compute output of the current cell using the formula given above
     34     yt_pred = softmax(np.dot(Wya,a_next) + by)

<__array_function__ internals> in dot(*args, **kwargs)

ValueError: shapes (5,3) and (4,8) not aligned: 3 (dim 1) != 4 (dim 0)

Well, either your Wax or your xt is the wrong shape. So what shapes should they be? If your rnn_cell_forward code passed the tests, you can assume the bug is in the new code in rnn_forward. One question: what is the parameters_tmp variable that you are passing as the 3rd argument there? The parameters variable is given to you as a parameter to rnn_forward. Why not just use that?
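To make the shape reasoning concrete, here is a minimal sketch of how to read that ValueError. The (5,3) and (4,8) shapes are copied from the traceback; the variable names are just illustrative. `np.dot(A, B)` on 2-D arrays needs `A.shape[1] == B.shape[0]`, so a Wax of shape (5, 3) expects an xt with 3 rows, but the xt actually reaching `rnn_cell_forward` has 4 rows, a sign that Wax and x came from different sets of test variables:

```python
import numpy as np

# Hypothetical arrays with the shapes shown in the traceback
Wax = np.zeros((5, 3))  # expects inputs with n_x = 3 features
xt = np.zeros((4, 8))   # but this xt has 4 features (and m = 8 examples)

try:
    np.dot(Wax, xt)
except ValueError as e:
    # shapes (5,3) and (4,8) not aligned: 3 (dim 1) != 4 (dim 0)
    print(e)
```

When you see this error, compare the second dimension of the weight matrix against the first dimension of the input you actually passed in; if they disagree, one of the two came from somewhere it shouldn't have.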

In fact that's probably the bug: you are referencing a global variable down in rnn_forward instead of the actual arguments that were passed in. That happens to work as long as the variable passed in is that same global, but as soon as something different gets passed: Kaboom!
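To illustrate the point, here is a minimal sketch of the loop using only the `parameters` argument, never a notebook-level global. The helper `rnn_cell_forward`, the `softmax`, and the dimensions (n_x=3, n_a=5, n_y=2, m=10, T_x=4) are assumptions for the demo, not the assignment's exact code:

```python
import numpy as np

def softmax(z):
    # Column-wise softmax, shifted by the max for numerical stability
    e = np.exp(z - np.max(z, axis=0, keepdims=True))
    return e / e.sum(axis=0, keepdims=True)

def rnn_cell_forward(xt, a_prev, parameters):
    Wax, Waa, Wya = parameters["Wax"], parameters["Waa"], parameters["Wya"]
    ba, by = parameters["ba"], parameters["by"]
    a_next = np.tanh(Wax @ xt + Waa @ a_prev + ba)
    yt_pred = softmax(Wya @ a_next + by)  # note: a_next, not a_prev
    cache = (a_next, a_prev, xt, parameters)
    return a_next, yt_pred, cache

def rnn_forward(x, a0, parameters):
    n_x, m, T_x = x.shape
    n_y, n_a = parameters["Wya"].shape
    a = np.zeros((n_a, m, T_x))
    y_pred = np.zeros((n_y, m, T_x))
    caches = []
    a_next = a0
    for t in range(T_x):
        # Use the local `parameters` argument -- not a global like parameters_tmp
        a_next, yt_pred, cache = rnn_cell_forward(x[:, :, t], a_next, parameters)
        a[:, :, t] = a_next
        y_pred[:, :, t] = yt_pred
        caches.append(cache)
    return a, y_pred, (caches, x)

# Demo with made-up dimensions
np.random.seed(1)
n_x, n_a, n_y, m, T_x = 3, 5, 2, 10, 4
params = {"Wax": np.random.randn(n_a, n_x),
          "Waa": np.random.randn(n_a, n_a),
          "Wya": np.random.randn(n_y, n_a),
          "ba": np.random.randn(n_a, 1),
          "by": np.random.randn(n_y, 1)}
a, y_pred, caches = rnn_forward(np.random.randn(n_x, m, T_x),
                                np.random.randn(n_a, m), params)
print(a.shape, y_pred.shape)  # (5, 10, 4) (2, 10, 4)
```

Because the function only touches its own arguments, it works no matter what the caller names its variables, which is exactly what the grader's tests exercise.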

Thanks to your hints, I was able to resolve the issue!

That’s great! Thanks for confirming.

Dear Paul, I had a case where all tests passed in rnn_cell_forward, but I got the wrong result in yt_pred[1]. Misled by the statement "All tests passed" in the first programming exercise, I only recognized my mistake when I came to the second one and got the wrong results there. FYI, I had chosen a_prev instead of a_next in the yt_pred calculation.

Wow, I made the same mistake and could not find the bug for a long time… The tests should be a bit more thorough.
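For anyone hitting the same thing, here is a minimal sketch of why the a_prev/a_next mix-up slips past shape-based tests (all dimensions are made up for the demo). Since a_prev and a_next both have shape (n_a, m), `Wya @ a_prev` is perfectly legal; only the values of yt_pred come out wrong:

```python
import numpy as np

def softmax(z):
    e = np.exp(z - np.max(z, axis=0, keepdims=True))
    return e / e.sum(axis=0, keepdims=True)

# Hypothetical dimensions and random parameters for the demo
np.random.seed(0)
n_x, n_a, n_y, m = 3, 5, 2, 10
Wax, Waa = np.random.randn(n_a, n_x), np.random.randn(n_a, n_a)
Wya = np.random.randn(n_y, n_a)
ba, by = np.random.randn(n_a, 1), np.random.randn(n_y, 1)
xt, a_prev = np.random.randn(n_x, m), np.random.randn(n_a, m)

a_next = np.tanh(Wax @ xt + Waa @ a_prev + ba)
yt_correct = softmax(Wya @ a_next + by)  # correct: prediction from a_next
yt_buggy = softmax(Wya @ a_prev + by)    # buggy: same shape, wrong values

print(yt_correct.shape == yt_buggy.shape)  # True  -- a shape check can't catch it
print(np.allclose(yt_correct, yt_buggy))   # False -- the values differ
```

This is why a test that only asserts on shapes (or on a few selected entries) can pass while the computation is still wrong; comparing actual values against a reference is what finally catches it.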