High CPU usage and low GPU usage during training

I have recently tried to implement a deepfake-detection approach I found in a paper, but the model is training very slowly. The dataset contains 7,000 training images and 700 validation images, and training is supposed to run for 550 epochs. While training, CPU usage is maximal and GPU usage is minimal. I am using TensorFlow. Can anybody help me solve this problem?

You need to get your training actually running on the GPU.


Is there code to increase GPU usage in TensorFlow?

https://www.tensorflow.org/guide/gpu
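As a quick first check (a minimal sketch using standard TensorFlow calls, not taken from that guide), you can confirm TensorFlow actually sees a GPU and log where ops are placed:

```python
import tensorflow as tf

# List the GPUs TensorFlow can see; an empty list means every op
# will silently fall back to the CPU.
print("GPUs visible:", tf.config.list_physical_devices('GPU'))

# Optionally log the device each op runs on, to confirm the model
# really executes on the GPU during training.
tf.debugging.set_log_device_placement(True)
```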


And you may also have to configure your environment to use the GPU.

For example, on Colab you have to change the runtime type in the user interface (Runtime → Change runtime type → select a GPU hardware accelerator) before you can access a GPU.
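Once the runtime is switched, you can confirm a GPU is actually attached (a quick check to run in a Colab notebook cell):

```python
# Run in a Colab cell: `!` executes a shell command. nvidia-smi shows
# the attached NVIDIA GPU, its memory and driver version; if it errors,
# the runtime has no GPU.
!nvidia-smi
```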


Hi Diwarkar. If you are doing image pre-processing in your training loop (for example, resizing the images), this can lead to low GPU use and high CPU use. The pre-processing runs on the CPU, so the GPU sits idle waiting for the CPU to finish preparing each batch.

To speed things up, resize the images and do any other pre-processing once, before you start training. To further speed up training, make sure the batch size is large enough to keep the GPU busy and its memory well utilised; see the sketch below. 7,000 images is quite a small dataset, so training should be quite fast unless the images are extremely large. Good luck.
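For instance, here is a minimal tf.data sketch (the path, batch size, and image size are placeholders, not from your post) that resizes once at load time and overlaps the CPU input work with GPU training:

```python
import tensorflow as tf

IMG_SIZE = (160, 160)   # assumed target size
BATCH = 64              # raise until GPU memory is nearly full

# Hypothetical directory layout: data/train/<class>/*.jpg
train_ds = tf.keras.utils.image_dataset_from_directory(
    "data/train",
    image_size=IMG_SIZE,   # images are resized once, at decode time
    batch_size=BATCH,
)

# cache() keeps the decoded, resized batches after the first epoch;
# prefetch() prepares the next batch on the CPU while the GPU is
# still training on the current one, so the GPU is not left waiting.
train_ds = train_ds.cache().prefetch(tf.data.AUTOTUNE)
```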


I have resized the images to 160x160 beforehand. It is taking 1 minute per epoch. Is this a good speed or a bad one?

It depends on how complex your model is.
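As a rough yardstick (a hedged sketch; the layer stack below is hypothetical, not the model from your paper), printing the parameter count gives a feel for how heavy each training step is:

```python
import tensorflow as tf

# A hypothetical small CNN for 160x160 RGB inputs, only to illustrate
# inspecting model complexity; substitute the model from your paper.
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(160, 160, 3)),
    tf.keras.layers.Conv2D(32, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Conv2D(64, 3, activation="relu"),
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.summary()  # prints layer shapes and the total parameter count
```

A small model like this should train much faster than a minute per epoch on 7,000 images at 160x160; a model with tens of millions of parameters could reasonably take that long.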