Built-in attention layer in Keras, Week 3

In Week 4 we build an attention layer ourselves, and then use that implementation to build the model.

It would be interesting to recreate the same model using the built-in Attention layer in tf.keras. Has anyone attempted this?
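For anyone curious, here is a minimal sketch of calling the built-in `tf.keras.layers.Attention` layer on its own, outside a full model. The shapes (`batch`, `Tq`, `Tv`, `dim`) are arbitrary placeholders, not taken from the assignment; the layer computes dot-product attention between a query and a value sequence, with the key defaulting to the value.

```python
import tensorflow as tf

# Placeholder shapes for illustration -- not the assignment's actual dimensions.
batch, Tq, Tv, dim = 2, 3, 4, 8

query = tf.random.normal((batch, Tq, dim))   # query sequence
value = tf.random.normal((batch, Tv, dim))   # value sequence (also used as key)

# Built-in dot-product attention; pass use_scale=True for a learned scale factor.
attn = tf.keras.layers.Attention()
context = attn([query, value])

print(context.shape)  # one context vector per query position: (batch, Tq, dim)
```

In a full model you would feed the decoder states as the query and the encoder outputs as the value, then concatenate or combine `context` with the decoder states as the assignment's hand-built layer does.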

Hi Mazatov,

Good suggestion. Did you give it a try yourself? Any thoughts to share here?



Not successfully : )