The comment at the beginning of one_step_attention
of Assignment 1 of Week 3 states:
Arguments:
a -- hidden state output of the Bi-LSTM, numpy-array of shape (m, Tx, 2*n_a)
Why is n_a doubled?
Because a Bidirectional RNN is used in the assignment: it runs two LSTMs over the input sequence, one in the forward direction and one in the backward direction, and concatenates their hidden states at each timestep. Each LSTM has n_a hidden units, so the concatenated output has 2*n_a features per timestep. See the exercise description below.
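A minimal NumPy sketch of the shape arithmetic (the sizes m, Tx, and n_a here are made up for illustration, not taken from the assignment):

```python
import numpy as np

# Hypothetical sizes: m examples, Tx timesteps, n_a hidden units per direction.
m, Tx, n_a = 10, 30, 32

# Each direction of the Bi-LSTM produces its own hidden state at every timestep.
a_forward = np.zeros((m, Tx, n_a))   # forward-direction hidden states
a_backward = np.zeros((m, Tx, n_a))  # backward-direction hidden states

# The bidirectional wrapper concatenates the two along the feature axis,
# so the combined output has 2*n_a features per timestep.
a = np.concatenate([a_forward, a_backward], axis=-1)
print(a.shape)  # (10, 30, 64), i.e. (m, Tx, 2*n_a)
```

This is why the docstring declares `a` with shape (m, Tx, 2*n_a) rather than (m, Tx, n_a).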