Is the "Model size x Compute" table wrong?

Thank you for this amazing course!

I’m struggling to understand the table in the image below (the image is from "Consideration on getting started now"). For instance, I’d expect a 13B model in FP32 to require around 52 GB (= 13B × 4 bytes) of VRAM for inference, but the table says 640 GB, which is far beyond even a p3.8xlarge (64 GB).
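For reference, here is the back-of-the-envelope calculation I’m using. It's a minimal sketch that only counts the weights themselves (parameters × bytes per parameter) and ignores activations, KV cache, and framework overhead, which add to the real requirement:

```python
# Rough VRAM estimate for holding the model weights alone.
# Assumption: bytes per parameter is determined by the dtype;
# activations and other runtime overhead are not counted.

BYTES_PER_PARAM = {"fp32": 4, "fp16": 2, "int8": 1}

def weight_memory_gb(num_params: float, dtype: str = "fp32") -> float:
    """Memory needed just to store the weights, in GB."""
    return num_params * BYTES_PER_PARAM[dtype] / 1e9

print(weight_memory_gb(13e9, "fp32"))  # ~52.0 GB for a 13B model in FP32
```

By this estimate, FP32 inference for a 13B model should need on the order of 52 GB, roughly an order of magnitude below the 640 GB in the table.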

Am I missing something? Please correct me if I’m wrong.
Thank you in advance!