Although we prepare self.pos_encoding in __init__, that variable cannot be used as-is. The test creates the Encoder with maximum_position_encoding == 5 but passes x with sequence length 3, so to pass the test I had to re-calculate the positional encoding.
maximum_position_encoding defines the maximum allowed input length; the input does not have to be exactly that long, it only needs to satisfy (input length) <= maximum_position_encoding.
OK, I took a slice before adding the positional encoding, but I find that it needlessly overcomplicates this assignment.
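For reference, this is roughly what I mean, just a minimal sketch and not the assignment's actual code (the helper name and shapes are only illustrative):

```python
import numpy as np
import tensorflow as tf

def positional_encoding(positions, d_model):
    # Standard sinusoidal positional encoding, returned with shape (1, positions, d_model)
    pos = np.arange(positions)[:, np.newaxis]                  # (positions, 1)
    i = np.arange(d_model)[np.newaxis, :]                      # (1, d_model)
    angle_rads = pos / np.power(10000.0, (2 * (i // 2)) / np.float32(d_model))
    angle_rads[:, 0::2] = np.sin(angle_rads[:, 0::2])          # even dimensions: sin
    angle_rads[:, 1::2] = np.cos(angle_rads[:, 1::2])          # odd dimensions: cos
    return tf.cast(angle_rads[np.newaxis, ...], tf.float32)

d_model = 4
pos_encoding = positional_encoding(5, d_model)  # built once for maximum_position_encoding == 5

x = tf.random.uniform((2, 3, d_model))          # batch of 2 inputs with sequence length 3
seq_len = tf.shape(x)[1]
x = x + pos_encoding[:, :seq_len, :]            # slice to the actual input length before adding
print(x.shape)                                  # (2, 3, 4)
```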
Hi @ruzakirov
Could you share the error messages?
Nope. It was a size mismatch. I probably don't have access to the code anymore. My point is that it's possible to modify this assignment in such a way that the assignee doesn't have to take a slice of self.pos_encoding when using it in their code, roughly along the lines of the sketch below.
Maybe it's elementary and I was just too tired when I tried to complete the assignment the first time.
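For example (only a rough sketch of the idea, reusing the hypothetical positional_encoding helper from my earlier snippet, not the assignment's actual Encoder):

```python
import tensorflow as tf

class Encoder(tf.keras.layers.Layer):
    # Hypothetical variant: the encoding is computed in call() for the actual
    # input length, so no slicing of a precomputed self.pos_encoding is needed.
    def __init__(self, d_model, maximum_position_encoding):
        super().__init__()
        self.d_model = d_model
        self.maximum_position_encoding = maximum_position_encoding  # upper bound only

    def call(self, x):
        seq_len = x.shape[1]                                  # actual input length
        pos_enc = positional_encoding(seq_len, self.d_model)  # helper from the snippet above
        return x + pos_enc                                    # shapes already match, no slice

# e.g. Encoder(d_model=4, maximum_position_encoding=5)(tf.random.uniform((2, 3, 4)))
```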