Hello, there seem to be some compatibility issues with Lab 2. When I run the code, I get an error on the following imports:
from datasets import load_dataset
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer, GenerationConfig, TrainingArguments, Trainer
import torch
import time
import evaluate
import pandas as pd
import numpy as np
I did not make any modifications to the code. Here is the error:
ModuleNotFoundError                       Traceback (most recent call last)
/opt/conda/lib/python3.7/site-packages/transformers/utils/import_utils.py in _get_module(self, module_name)
   1125 try:
-> 1126     return importlib.import_module("." + module_name, self.__name__)
   1127 except Exception as e:

/opt/conda/lib/python3.7/importlib/__init__.py in import_module(name, package)
    126     level += 1
--> 127 return _bootstrap._gcd_import(name[level:], package, level)
    128

/opt/conda/lib/python3.7/importlib/_bootstrap.py in _gcd_import(name, package, level)

/opt/conda/lib/python3.7/importlib/_bootstrap.py in _find_and_load(name, import_)

/opt/conda/lib/python3.7/importlib/_bootstrap.py in _find_and_load_unlocked(name, import_)

/opt/conda/lib/python3.7/importlib/_bootstrap.py in _load_unlocked(spec)

/opt/conda/lib/python3.7/importlib/_bootstrap_external.py in exec_module(self, module)

/opt/conda/lib/python3.7/importlib/_bootstrap.py in _call_with_frames_removed(f, *args, **kwds)

/opt/conda/lib/python3.7/site-packages/transformers/training_args.py in <module>
     59 import torch
---> 60 import torch.distributed as dist
     61

ModuleNotFoundError: No module named 'torch.distributed'

The above exception was the direct cause of the following exception:

RuntimeError                              Traceback (most recent call last)
in <module>
      1 from datasets import load_dataset
----> 2 from transformers import AutoModelForSeq2SeqLM, AutoTokenizer, GenerationConfig, TrainingArguments, Trainer
      3 import torch
      4 import time
      5 import evaluate

/opt/conda/lib/python3.7/importlib/_bootstrap.py in _handle_fromlist(module, fromlist, import_, recursive)

/opt/conda/lib/python3.7/site-packages/transformers/utils/import_utils.py in __getattr__(self, name)
   1114     value = self._get_module(name)
   1115 elif name in self._class_to_module.keys():
-> 1116     module = self._get_module(self._class_to_module[name])
   1117     value = getattr(module, name)
   1118 else:

/opt/conda/lib/python3.7/site-packages/transformers/utils/import_utils.py in _get_module(self, module_name)
   1129     f"Failed to import {self.__name__}.{module_name} because of the following error (look up to see its"
   1130     f" traceback):\n{e}"
-> 1131 ) from e
   1132
   1133 def __reduce__(self):

RuntimeError: Failed to import transformers.training_args because of the following error (look up to see its traceback):
No module named 'torch.distributed'
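In case it helps with diagnosing this, I assume a quick way to check whether the torch.distributed submodule exists at all in this environment would be something like the snippet below (my own diagnostic sketch, not part of the lab notebook):

# Quick diagnostic sketch (my own, not from the lab): print the installed torch
# version and whether the torch.distributed submodule can be located.
import importlib.util
import torch

print("torch version:", torch.__version__)
print("torch.distributed found:", importlib.util.find_spec("torch.distributed") is not None)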
Any suggestions on how I can fix this?
Thank you.