What is "causal attention"?

I found this term in the lab and can't understand its meaning.

Hi @someone555777

The attention mechanism aims to extract the most relevant information from your sequence.

Causal attention means that the model cannot look at future tokens, but it can look back at past ones, as you can see in the screenshot! The name comes from the concept of causal filters in signal and control theory:

The word causal indicates that the filter output depends only on past and present inputs
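If it helps, here is a minimal NumPy sketch (not the code from your lab) of how that idea shows up in attention: a lower-triangular mask blocks the "future" positions before the softmax, so each token only attends to itself and earlier tokens.

```python
# Minimal sketch of causal (masked) scaled dot-product attention in NumPy.
# Illustrative only -- shapes and names are assumptions, not the lab's implementation.
import numpy as np

def causal_attention(Q, K, V):
    """Position i may only attend to positions <= i (past and present)."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)               # (seq_len, seq_len) similarity scores
    seq_len = scores.shape[0]
    # Lower-triangular mask: True where attention is allowed
    mask = np.tril(np.ones((seq_len, seq_len), dtype=bool))
    scores = np.where(mask, scores, -1e9)         # block future positions before softmax
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V                            # weighted sum of value vectors

# Toy example: 4 tokens, 8-dimensional embeddings
rng = np.random.default_rng(0)
x = rng.normal(size=(4, 8))
print(causal_attention(x, x, x).shape)  # (4, 8)
```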

Hope that helps!

Best regards
Christian


So does this mean the model is not BERT (i.e., not bidirectional)?

I am not 100% sure I understand what you are getting at. Quoting from Hugging Face :hugs:

According to the abstract,
Bart uses a standard seq2seq/machine translation architecture with a bidirectional encoder (like BERT) and a left-to-right decoder (like GPT).
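In other words, only the decoder half is causal. A tiny illustrative sketch of the two mask patterns (values are just for illustration, not taken from any model):

```python
# Hypothetical illustration of the two attention patterns mentioned above:
# a BERT-style bidirectional mask vs. a GPT-style causal (left-to-right) mask.
import numpy as np

seq_len = 4
bidirectional_mask = np.ones((seq_len, seq_len), dtype=int)    # every token sees every token
causal_mask = np.tril(np.ones((seq_len, seq_len), dtype=int))  # token i sees only tokens <= i

print("bidirectional (encoder):\n", bidirectional_mask)
print("causal (decoder):\n", causal_mask)
```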

Best regards
Christian

So, have I said something wrong?