@gent.spah I know this question has been answered, and I did look for other posts that touch on it, but rather than starting a new thread I figured it would be better to add on here.
So… I was wondering if you had any ‘intuition’:
I mean, in traditional ML we calculate precision and recall from the confusion matrix: precision = TP / (TP + FP) and recall = TP / (TP + FN).
But in both the lecture and your linked article, precision and recall are calculated from n-gram overlaps: the number of matching n-grams divided by the total n-grams in the candidate (for precision) or in the reference (for recall).
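To make the comparison concrete, here is a quick sketch of both calculations as I understand them (the toy sentences, the unigram / ROUGE-1 framing, and the confusion-matrix counts are just my own made-up example, not taken from the lecture):

```python
from collections import Counter

# --- Traditional (classification) precision / recall ---
# Made-up confusion-matrix counts, purely for illustration.
tp, fp, fn = 8, 2, 4
precision_cls = tp / (tp + fp)   # 0.80
recall_cls    = tp / (tp + fn)   # ~0.67

# --- ROUGE-1 style precision / recall (unigram overlap) ---
# Toy model output (candidate) and human reference.
reference = "the cat sat on the mat".split()
candidate = "the cat is on the mat".split()

# Clipped overlap: each reference unigram can only be matched once.
overlap = sum((Counter(candidate) & Counter(reference)).values())

precision_rouge = overlap / len(candidate)   # matches / unigrams in candidate
recall_rouge    = overlap / len(reference)   # matches / unigrams in reference

print(precision_cls, recall_cls)       # 0.8  0.666...
print(precision_rouge, recall_rouge)   # 0.833...  0.833...
```

(BLEU's modified precision counts matches in the same clipped way, just without the recall side.)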
While BLEU is not too bad, I'm going to have to sit with ROUGE and go over it a few more times to make sure I really get it.
However, my question actually relates to the following: I know the context is different, but how are these versions of recall and precision equivalent to the traditional ones?
Or are they being used 'in a different way'?
… The equivalence is at least not 'obviously' jumping out at me…