Does C2W2 not freeze the BERT layers?

Hi,

The hyperparameter has this: `freeze_bert_layer=False`

So the RoBERTa model in train.py is not frozen; therefore it's not transfer learning. Correct?

Hi @PDS_Mentors,

Can one of you help here ?

Thanks,
Mubsi

Hi @seanjiang, in this case we are fine-tuning the pre-trained weights on new data. Therefore, it is a case of transfer learning: training starts from the pre-trained RoBERTa weights rather than from a random initialization.

If we froze them, those pre-trained layers wouldn't be fine-tuned at all. :wink:
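For context, here is a minimal sketch of what a `freeze_bert_layer` flag typically controls in a Hugging Face setup. This is illustrative, not the course's exact train.py; the model name and `num_labels` here are assumptions:

```python
import torch
from transformers import RobertaForSequenceClassification

# Hypothetical stand-in for the C2W2 hyperparameter.
freeze_bert_layer = False

# Illustrative model; the actual checkpoint and label count may differ.
model = RobertaForSequenceClassification.from_pretrained(
    "roberta-base", num_labels=3
)

if freeze_bert_layer:
    # Freeze the pre-trained encoder so only the classification head trains.
    for param in model.roberta.parameters():
        param.requires_grad = False

# With freeze_bert_layer=False, every parameter stays trainable, so the
# pre-trained weights themselves are fine-tuned on the new data.
trainable = sum(p.numel() for p in model.parameters() if p.requires_grad)
print(f"Trainable parameters: {trainable:,}")
```

Either way you are reusing pre-trained weights, which is what makes it transfer learning; freezing just restricts fine-tuning to the new head.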
