Hello everyone. I've encountered an error with the C1_M4 3.2 assignment. The task itself is clear: a pretty standard training loop of the kind I've written many times before. But here, specifically, the loss function throws an error:
```
File /usr/local/lib/python3.12/site-packages/torch/nn/functional.py:3494, in cross_entropy(input, target, weight, size_average, ignore_index, reduce, reduction, label_smoothing)
3492 if size_average is not None or reduce is not None:
3493 reduction = _Reduction.legacy_get_string(size_average, reduce)
→ 3494 return torch._C._nn.cross_entropy_loss(
3495 input,
3496 target,
3497 weight,
3498 _Reduction.get_enum(reduction),
3499 ignore_index,
3500 label_smoothing,
3501 )
RuntimeError: Expected all tensors to be on the same device, but found at least two devices, cpu and cuda:0! (when checking argument for argument target in method wrapper_CUDA_nll_loss_forward)
```
I have double-checked the devices of the model, the output, and the labels; all are on cuda, yet I still get this error.
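To show what I mean, here is roughly how I'm checking things, with a placeholder model and placeholder tensor names (`images`, `labels`), since my actual assignment code can't be posted. The traceback blames the `target` argument, so I print devices right before the loss call:

```python
import torch
import torch.nn as nn

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

# Placeholder model and data standing in for the assignment's (assumption)
model = nn.Linear(10, 3).to(device)
images = torch.randn(4, 10)
labels = torch.randint(0, 3, (4,))

outputs = model(images.to(device))
# On a GPU machine this would reveal a mismatch if labels were never moved
print(outputs.device, labels.device)

# Moving the target tensor explicitly is what I understand the error asks for
loss = nn.functional.cross_entropy(outputs, labels.to(device))
```

As far as I can tell I'm doing the equivalent of the `.to(device)` call on the labels in my real code, yet the error persists.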
I tried overriding the device to CPU to avoid this error, and then started getting the following one:
```
File /usr/local/lib/python3.12/site-packages/torch/autograd/graph.py:823, in _engine_run_backward(t_outputs, *args, **kwargs)
821 unregister_hooks = _register_logging_hooks_on_whole_graph(t_outputs)
822 try:
→ 823 return Variable._execution_engine.run_backward( # Calls into the C++ engine to run the backward pass
824 t_outputs, *args, **kwargs
825 ) # Calls into the C++ engine to run the backward pass
826 finally:
827 if attach_logging_hooks:
RuntimeError: Trying to backward through the graph a second time (or directly access saved tensors after they have already been freed). Saved intermediate values of the graph are freed when you call .backward() or autograd.grad(). Specify retain_graph=True if you need to backward through the graph a second time or if you need to access saved tensors after calling backward.
```
even though I call optimizer.zero_grad() at the start of each loop iteration.
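For context, the loop structure I'm using is essentially this (placeholder model and data again; the real code is below):

```python
import torch
import torch.nn as nn

# Placeholder model/optimizer/data standing in for the assignment's (assumption)
model = nn.Linear(10, 3)
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
data = [(torch.randn(4, 10), torch.randint(0, 3, (4,))) for _ in range(3)]

for inputs, targets in data:
    optimizer.zero_grad()  # clears accumulated gradients (not the autograd graph)
    outputs = model(inputs)  # forward pass rebuilds the graph each iteration
    loss = nn.functional.cross_entropy(outputs, targets)
    loss.backward()  # frees the graph's saved tensors after this call
    optimizer.step()
```

My understanding is that zero_grad() only resets accumulated gradients, and the "backward through the graph a second time" error means backward() ran twice on the same graph, e.g. when part of the forward pass happens outside the loop. But as far as I can tell, my whole forward pass is inside the loop, just like above.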
Am I doing something wrong?
Basically, the code I have is really simple (which makes it even more frustrating):
```
{moderator edit - solution code removed}
```
Like it doesn't get simpler than this.