LSTM layer size confusion?

Can someone please explain why the LSTM layer size in the ungraded lab is set equal to the size of the embedding? Shouldn’t it be equal to the number of tokens in a sentence?

Thanks

Hey @Samarth_Gupta,
Welcome to the community. Please follow the discussion here; it covers your point, along with a few other questions about the LSTM layer size. Let me know if this helps.
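
In short, the LSTM layer size is just the dimension of its hidden state, a hyperparameter you choose. It doesn’t have to match the embedding size (it often does only for convenience), and it is independent of the number of tokens in a sentence, which shows up as the time dimension of the input. Here is a minimal Keras sketch with made-up sizes (not the lab’s actual values) that illustrates the shapes:

```python
import tensorflow as tf

# Hypothetical sizes, not the values used in the lab
vocab_size = 10000   # number of distinct tokens in the vocabulary
embedding_dim = 64   # size of each word vector
lstm_units = 64      # dimension of the LSTM hidden state (could just as well be 32, 128, ...)
max_len = 50         # number of tokens per (padded) sentence

model = tf.keras.Sequential([
    tf.keras.Input(shape=(max_len,)),                       # sequence of token ids
    tf.keras.layers.Embedding(vocab_size, embedding_dim),   # (batch, max_len, embedding_dim)
    tf.keras.layers.LSTM(lstm_units),                       # (batch, lstm_units) -- independent of max_len
    tf.keras.layers.Dense(1, activation="sigmoid"),
])

model.summary()
```

If you change `max_len`, the LSTM layer’s parameter count doesn’t change at all; changing `lstm_units` or `embedding_dim` does.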

Cheers,
Elemento