Link to assignment: https://www.coursera.org/learn/attention-models-in-nlp/quiz/5E7sq/neural-machine-translation
- The option about Activation Functions should be updated. As of writing this thread, the option reads:

  "Activation Functions: While activation functions like ReLU are crucial in neural network layers for introducing non-linearity, they are not considered an essential component of the attention mechanism itself. (Incorrect)"

  This replaces the previously correct option.
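For context, here is a minimal NumPy sketch of scaled dot-product attention (my own illustration, not the course's implementation). It shows what the mechanism itself consists of: query/key/value matrices, a scaled dot product, and a softmax, with no ReLU-style activation inside the mechanism:

```python
# Minimal sketch of scaled dot-product attention, for reference only.
# The core computation is: scores = Q K^T / sqrt(d_k), weights = softmax(scores),
# output = weights V. No ReLU or similar activation appears in the mechanism itself.
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Q: (n_q, d_k), K: (n_k, d_k), V: (n_k, d_v) -> (n_q, d_v)."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                     # query-key similarities
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)      # softmax over keys
    return weights @ V                                  # weighted sum of values

# Toy example with random data
rng = np.random.default_rng(0)
Q = rng.normal(size=(2, 4))
K = rng.normal(size=(3, 4))
V = rng.normal(size=(3, 5))
print(scaled_dot_product_attention(Q, K, V).shape)      # (2, 5)
```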