Week 3, Neural_machine_translation_with_attention

In the following section of the assignment, the instructions say:
The last step is to define all your inputs and outputs to fit the model:

  • You have input X of shape (m = 10000, T_x = 30) containing the training examples.
  • You need to create s0 and c0 to initialize your post_attention_LSTM_cell with zeros.
  • Given the model() you coded, you need the “outputs” to be a list of 10 elements of shape (m, T_y).

Shouldn’t the “outputs” be a list of 10 elements of shape (m, len(machine_vocab)), which is (m, 11)? Something like the sketch below:
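A minimal sketch of what I mean (the names Yoh for the one-hot targets and n_s for the post-attention LSTM state size are my assumptions for illustration, not necessarily the exact names used in the notebook):

```python
import numpy as np

# Dimensions quoted from the assignment text above
m, Tx, Ty = 10000, 30, 10
machine_vocab_size = 11          # digits 0-9 plus '-'
n_s = 64                         # assumed hidden size of post_attention_LSTM_cell

# Zero-initialized states for the post-attention LSTM
s0 = np.zeros((m, n_s))
c0 = np.zeros((m, n_s))

# Yoh: one-hot encoded targets, shape (m, Ty, machine_vocab_size) (assumed name)
Yoh = np.zeros((m, Ty, machine_vocab_size))

# "outputs" as a list of Ty arrays, one per output time step,
# each of shape (m, machine_vocab_size) = (m, 11)
outputs = list(Yoh.swapaxes(0, 1))
```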

Hi @Neruoy

Yes, you are right: the output sequence is 10 characters long, so “outputs” has 10 elements, but at each position there are 11 characters to choose from, the digits 0 to 9 and the sign ‘-’. So each element has shape (m, len(machine_vocab)) = (m, 11).
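To make the two counts concrete, here is a small sketch. The exact machine_vocab dictionary is built by the assignment's data-loading code, so its contents below are an assumption for illustration only:

```python
# Assumed contents of machine_vocab: the 10 digits plus the dash
machine_vocab = {'-': 0, '0': 1, '1': 2, '2': 3, '3': 4, '4': 5,
                 '5': 6, '6': 7, '7': 8, '8': 9, '9': 10}
print(len(machine_vocab))   # 11 -> the one-hot depth of each output element

target = "1979-03-15"       # a standardized YYYY-MM-DD date
print(len(target))          # 10 -> T_y, the number of elements in "outputs"
```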


I see. Thanks very much! 🙂