C4 Week 2: the video titled "Transformer Decoder"

C4 week 2 Text summarization
link: https://www.coursera.org/learn/attention-models-in-nlp/lecture/rDLol/transformer-decoder
The architecture discussed in the video is that of the encoder, not the decoder, and the terms were not clearly explained either.
They also did not mention that masked multi-head attention (MHA) is a part of the decoder.
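For reference, here is a minimal numpy sketch (not the course's implementation; the function names are illustrative only) of the masked self-attention that sits inside a decoder block. The causal mask is what stops each position from attending to later positions, which is the part the video reportedly does not call out.

```python
import numpy as np

def causal_mask(seq_len):
    # Lower-triangular boolean mask: position i may only attend to positions <= i.
    return np.tril(np.ones((seq_len, seq_len), dtype=bool))

def masked_self_attention(x, mask):
    # x: (seq_len, d_model). For simplicity Q = K = V = x (no learned projections).
    d_k = x.shape[-1]
    scores = x @ x.T / np.sqrt(d_k)             # (seq_len, seq_len) attention scores
    scores = np.where(mask, scores, -1e9)       # block attention to future positions
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ x                          # each output mixes only past/current tokens

x = np.random.randn(5, 8)                       # 5 tokens, d_model = 8
out = masked_self_attention(x, causal_mask(5))
print(out.shape)                                # (5, 8)
```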

hi @Hitansh_Ramtani

If this is feedback, then please choose the learning platform feedback category.

If you have a doubt, then please provide a brief description of your issue or doubt.