Comparator Error for Assignment Neural_machine_translation_with_attention Week 3

I am getting a comparator error for the first Input X in the function modelf (UNQ_C2).

I printed the shape of X, which matches (None, 30, 37), but the error is as follows:

Test failed at layer: 0
Expected value

['InputLayer', [(None, 30, 37)], 0]

does not match the input value:

['InputLayer', [(None, 64)], 0]

PS: I have not changed the order of the parameters below.

X = Input(shape=(Tx, human_vocab_size))
#print(X.shape)
s0 = Input(shape=(n_s,), name='s0')
c0 = Input(shape=(n_s,), name='c0')

I figured it out; it was an error in my previous function.


Can you please explain how you solved it?

Hi Aditya,

I had an issue with the previous function I implemented, one_step_attention().

I had the same exact issue. The hint is in the comments in the one_step_attention():

# Use concatenator to concatenate a and s_prev on the last axis (≈ 1 line)
# For grading purposes, please list 'a' first and 's_prev' second, in this order.
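The ordering matters because concatenation on the last axis places the features of the first-listed tensor in the leading columns. A minimal NumPy sketch (the shapes are illustrative stand-ins for the assignment's Bi-LSTM output and repeated decoder state):

```python
import numpy as np

m, Tx, n_a, n_s = 2, 30, 32, 64
a = np.ones((m, Tx, 2 * n_a))    # stand-in for the Bi-LSTM activations
s_prev = np.zeros((m, Tx, n_s))  # stand-in for the repeated decoder state

# Listing a first puts its features in the leading columns of the result,
# which is the layout the grader expects.
concat = np.concatenate([a, s_prev], axis=-1)
print(concat.shape)  # (2, 30, 128)
```

Swapping the list to [s_prev, a] produces the same shape but a different feature layout, so the downstream dense layers learn on permuted inputs and the grader's comparison fails.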

@kecai @salman0149 could you please elaborate? The shared layer was already defined to concatenate on the last axis:

# Defined shared layers as global variables
...
concatenator = Concatenate(axis=-1)

And in the one_step_attention:

...
s_prev = repeator(...)
# Use concatenator to concatenate a and s_prev on the last axis (≈ 1 line)
# For grading purposes, please list 'a' first and 's_prev' second, in this order.
concat = concatenator([..., ...])

But I still get a similar dimension-mismatch error in the modelf() function.

You’ve replied on a thread that hasn’t been used in three years. So the information in this thread may no longer apply - the assignments tend to be updated occasionally.

In the concat = ... line, you only need to call concatenator() with a list of the correct arguments. The axis=-1 was already fixed when the layer was defined.
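To sketch why no axis argument is needed at the call site: in Keras the axis is bound when the Concatenate layer is constructed, and the call only takes the list of tensors. A minimal NumPy stand-in (hypothetical, just to illustrate the calling convention, not the real Keras class):

```python
import numpy as np

class Concatenate:
    """Minimal stand-in for keras.layers.Concatenate:
    the axis is fixed at construction time, not at call time."""
    def __init__(self, axis=-1):
        self.axis = axis

    def __call__(self, tensors):
        return np.concatenate(tensors, axis=self.axis)

concatenator = Concatenate(axis=-1)  # shared layer, defined once globally

a = np.ones((2, 30, 64))
s_prev = np.zeros((2, 30, 64))
concat = concatenator([a, s_prev])   # no axis here - just the list
print(concat.shape)  # (2, 30, 128)
```

This is why the notebook defines concatenator = Concatenate(axis=-1) once as a shared global and one_step_attention() only calls concatenator([a, s_prev]).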

Sorry, my code was OK. It was just a transient disconnection from the Google Colab kernel that left some previously defined variables empty.

After restarting and re-running the whole notebook, everything was OK!
