NLP C4_W1: Embedding mask_zero=True but the input_dim in unittest is wrong?

Exercise 1's text mentions using '0' as padding, so mask_zero should be set to True, and according to https://www.tensorflow.org/api_docs/python/tf/keras/layers/Embedding the input_dim should then be vocab_size + 1. However, setting this value fails w1_unittest.test_encoder(Encoder), which apparently doesn't expect the + 1 in input_dim.
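For context, here is a minimal sketch of the behaviour the TF docs describe (vocab_size, the output_dim of 4, and the toy batch are illustrative values, not from the assignment): with mask_zero=True, id 0 is reserved for padding, so the layer must cover ids 0..vocab_size, i.e. input_dim = vocab_size + 1.

```python
import numpy as np
import tensorflow as tf

vocab_size = 10  # illustrative: real token ids are 1..10, 0 is reserved for padding

# Per the TF docs: if mask_zero=True, index 0 cannot be used as a real token,
# so input_dim must equal vocab_size + 1 to cover ids 0..vocab_size.
emb = tf.keras.layers.Embedding(
    input_dim=vocab_size + 1,
    output_dim=4,
    mask_zero=True,
)

batch = np.array([[1, 2, 0, 0],
                  [3, 4, 5, 0]])  # trailing 0s are padding

out = emb(batch)              # shape (2, 4, 4)
mask = emb.compute_mask(batch)  # False exactly where the input id is 0
print(out.shape)
print(mask.numpy())
```

The mask is what downstream layers (e.g. an RNN or attention) use to ignore padded positions; the grader's expected input_dim depends on whether the assignment's vocab_size already includes the padding id.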

hi @khteh

Can you share a screenshot of your failed unit test result for Exercise 1 here? Make sure not to post any code.

hi @khteh

Read the highlighted statement and correct your input_dim accordingly.

Isn't vocab_size the input parameter of the constructor, rather than a fixed global constant?


Yes, correct.

Then your comments are redundant.


You need to post a screenshot of the failed unit test results mentioned in your topic description. My earlier reply was a response to your point about using vocab_size + 1.

The error happens only when input_dim=vocab_size+1 together with mask_zero=True, not when the + 1 is omitted.

Then there is probably an error in how the parameters are passed to the layers.

Please send a screenshot of the code by personal DM.