C2W1 : tokenizers version error

It worked well in the course and the assignment, but when I tried to replicate the process on my own AWS machine in SageMaker Studio, it gave me the following error:

ImportError: tokenizers>=0.10.1,<0.11 is required for a normal functioning of this module, but found tokenizers==0.9.3.
Try: pip install transformers -U or pip install -e '.[dev]' if you're working with git master

The error comes from this line:
from transformers import RobertaTokenizer

I have also tried to force-install the tokenizers library, but I am still getting the same error:
subprocess.check_call([sys.executable, '-m', 'pip', 'install', "--upgrade", 'tokenizers==0.10.3'])
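As a sanity check on the version pin itself: the error says transformers requires tokenizers>=0.10.1,<0.11, and 0.10.3 does fall inside that range. A stdlib-only sketch of the range check (the real check inside transformers uses the `packaging` library, so this is just illustrative) confirms that — which suggests the install may be succeeding but the running kernel is still holding the old 0.9.3 import:

```python
# Illustrative stdlib-only version-range check; transformers itself
# uses the `packaging` library for this.
def version_tuple(v: str) -> tuple:
    return tuple(int(part) for part in v.split("."))

def in_range(installed: str, lower: str, upper: str) -> bool:
    # Mirrors ">=lower,<upper": lower bound inclusive, upper exclusive.
    return version_tuple(lower) <= version_tuple(installed) < version_tuple(upper)

print(in_range("0.9.3", "0.10.1", "0.11"))   # -> False (the version the error reports)
print(in_range("0.10.3", "0.10.1", "0.11"))  # -> True  (the version being force-installed)
```

If the upgraded version satisfies the pin but the ImportError persists, restarting the notebook kernel after the pip install (so the interpreter re-imports the new tokenizers) is usually the missing step.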

Any help would be appreciated.

Hello siddharth,
Welcome to the deeplearning.ai community.
You can upgrade transformers or install a different version of it; then it will work fine.

Hi @bisht, as you can see in my post, I have already tried installing a newer version, but the library isn’t upgrading:
subprocess.check_call([sys.executable, '-m', 'pip', 'install', "--upgrade", 'tokenizers==0.10.3'])