Hi all,
I am getting this error when I try to run my homework assignment.
ResourceExhaustedError: OOM when allocating tensor with shape[2000,32,65,65] and type float on /job:localhost/replica:0/task:0/device:GPU:0 by allocator GPU_0_bfc [Op:Conv2DBackpropFilter]
I reduced the number of filters to 16, 32, and 64, which solved the issue, and I was able to complete the assignment.
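For anyone hitting the same error, here is a minimal sketch of what that change looks like in Keras. The assignment's actual architecture isn't shown in this thread, and the 64x64x3 input shape is an assumption; the point is just that smaller filter counts mean smaller activation maps and gradients, which is where the `Conv2DBackpropFilter` allocation in the error comes from.

```python
import tensorflow as tf

# Sketch only: layer layout and input shape are assumptions, not the
# assignment's real model. Reducing the filter counts per Conv2D layer
# (e.g. to 16/32/64) shrinks every activation tensor and its gradient,
# which is what the failed GPU allocation was for.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(64, 64, 3)),
    tf.keras.layers.Conv2D(16, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Conv2D(32, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Conv2D(64, 3, activation="relu"),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
```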
I reduced the batch size instead. Going down to 1000 didn't help, but lowering it to 500 did the job. I didn't try values in between, like 800.
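A rough sketch of that fix, with a placeholder model and random dummy data standing in for the assignment's real ones. The only relevant part is the `batch_size` argument to `fit()`: each training step then only has to hold activations for 500 examples instead of 2000, which cuts peak GPU memory roughly fourfold.

```python
import numpy as np
import tensorflow as tf

# Placeholder model and data, just to make the example self-contained.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(64, 64, 3)),
    tf.keras.layers.Conv2D(32, 3, activation="relu"),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy")

x = np.random.rand(2000, 64, 64, 3).astype("float32")  # dummy images
y = np.random.randint(0, 2, size=(2000, 1))             # dummy labels

# batch_size=2000 (the whole set at once) and 1000 both ran out of memory
# in my case; 500 kept each step's tensors small enough for the GPU.
model.fit(x, y, batch_size=500, epochs=10)
```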
I lowered the number of filters per layer and can now train with the full batch size of 2000.
It worked, thank you for sharing the knowledge!