Neural Machine Translation with Attention Error

When building the modelf() function, the dimensions of my first two input layers seem to be backwards. I followed the instructions and cannot see where I'm going wrong.

Did you modify the code in the cell after the modelf() function, where it calls model = target(…)?

It should look like this:

# UNIT TEST
from test_utils import *

def modelf_test(target):
    m = 10
    Tx = 30
    n_a = 32
    n_s = 64
    len_human_vocab = 37
    len_machine_vocab = 11

    # Ty comes from an earlier cell in the notebook; it is not redefined here
    model = target(Tx, Ty, n_a, n_s, len_human_vocab, len_machine_vocab)
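As for the dimension question: inside modelf(), the first Input layer should put the sequence axis first. Here is a minimal sketch using the test's dimensions (the variable names are assumptions that mirror the notebook):

from tensorflow.keras.layers import Input

Tx = 30                  # input sequence length
n_s = 64                 # post-attention LSTM state size
human_vocab_size = 37

X = Input(shape=(Tx, human_vocab_size))   # one-hot source: (Tx, vocab), not (vocab, Tx)
s0 = Input(shape=(n_s,), name="s0")       # initial decoder hidden state
c0 = Input(shape=(n_s,), name="c0")       # initial decoder cell state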

I have not. The code matches what you posted.

The output is slightly odd: "RepeatVector" appears in front of "Bidirectional" in the model summary.
The most likely cause is that you swapped the first and second arguments to concatenate() in one_step_attention(). The order is important, so please revisit your implementation of one_step_attention().
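To see why the order matters, here is a sketch of the general one_step_attention() pattern (not the graded solution; the shared layer objects below are assumptions mirroring the ones defined earlier in the notebook):

from tensorflow.keras.layers import RepeatVector, Concatenate, Dense, Softmax, Dot

Tx, n_a, n_s = 30, 32, 64
repeator = RepeatVector(Tx)
concatenator = Concatenate(axis=-1)
densor1 = Dense(10, activation="tanh")
densor2 = Dense(1, activation="relu")
activator = Softmax(axis=1)   # attention weights over the Tx axis
dotor = Dot(axes=1)

def one_step_attention(a, s_prev):
    # a: (m, Tx, 2*n_a) Bidirectional LSTM outputs; s_prev: (m, n_s)
    s_prev = repeator(s_prev)            # (m, Tx, n_s)
    concat = concatenator([a, s_prev])   # a first, then s_prev
    e = densor1(concat)                  # (m, Tx, 10)
    energies = densor2(e)                # (m, Tx, 1)
    alphas = activator(energies)         # weights sum to 1 over Tx
    context = dotor([alphas, a])         # weighted sum: (m, 1, 2*n_a)
    return context

Passing [s_prev, a] instead builds the graph with the RepeatVector branch ahead of the Bidirectional one, which is exactly the reordering you saw in the summary.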

By the way, pasting your code here is not recommended. Please remove it, thank you.