Week 2 - Fine-tuning - clarification on catastrophic forgetting

In the quiz at the end of the fine-tuning video, selecting the option

“Catastrophic forgetting only occurs in supervised learning tasks and is not a problem in unsupervised learning.”

leads to an incorrect answer.

This implies that catastrophic forgetting can also be a problem during pre-training, which is an unsupervised learning step.

Or am I misunderstanding something?

Thanks in advance

I am not understanding this:

If this is true and pre-training is unsupervised, why should it be a problem?

The problem is that the quiz indicates this statement should be “FALSE”.

In other words, the statement “Catastrophic forgetting can occur in both supervised and unsupervised learning tasks” is TRUE.

I can’t come up with an example of an unsupervised learning task where an LLM would suffer from catastrophic forgetting.

Right, I see. Catastrophic forgetting means, in principle, that the weights learned from earlier training get overwritten by later training. This can happen with any kind of training, supervised or unsupervised, and you wouldn’t want it in either case: it basically means that what the model had previously learned has been compromised. For example, continued pre-training on a narrow domain corpus (an unsupervised objective) can degrade what the model learned during its general pre-training.
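
If it helps to see this concretely, here is a minimal sketch of the mechanism. It uses a toy regression model in PyTorch rather than an actual LLM, and the two tasks are made-up stand-ins (task A for the original training objective, task B for later training on something else); the point is only that further training overwrites the weights and degrades performance on the earlier task, regardless of whether the objectives are supervised or unsupervised:

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

# Tiny network standing in for the model's weights.
model = nn.Sequential(nn.Linear(1, 16), nn.Tanh(), nn.Linear(16, 1))
loss_fn = nn.MSELoss()

# Task A: fit y = sin(x)  (stand-in for the original training objective).
x_a = torch.linspace(-3, 3, 64).unsqueeze(1)
y_a = torch.sin(x_a)

# Task B: fit y = -x  (stand-in for a later, narrower training objective).
x_b = torch.linspace(-3, 3, 64).unsqueeze(1)
y_b = -x_b

def train(x, y, steps=2000, lr=1e-2):
    # Plain full-parameter training on a single task, no replay of old data.
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    for _ in range(steps):
        opt.zero_grad()
        loss = loss_fn(model(x), y)
        loss.backward()
        opt.step()

train(x_a, y_a)
print("Task A loss after training on A:", loss_fn(model(x_a), y_a).item())

train(x_b, y_b)  # keep training the same weights on task B only
print("Task A loss after training on B:", loss_fn(model(x_a), y_a).item())
print("Task B loss after training on B:", loss_fn(model(x_b), y_b).item())
```

When I reason through this, the task A loss printed after the second training run should jump by a large margin, which is exactly the “forgetting” the quiz statement is about, and nothing in the process depends on whether the targets came from human labels (supervised) or from the data itself (unsupervised / self-supervised).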