# Petaflops per second-days

I am confused by the definition of petaflop/s-days. The definition states:
“A petaFLOP per second day is a measurement of the number of floating point operations performed at a rate of one petaFLOP per second, running for an entire day.”
If floating point operations are performed at a rate of one petaFLOP per second, then for a day it is just the number of seconds in a day times one petaFLOP, isn't it? What am I missing?

The definition of petaFLOP/s-day can be confusing because it reads like a rate, but it is actually a unit of total compute: rate multiplied by time. One petaFLOP/s-day is what you get by sustaining 10^15 floating point operations per second for one full day, i.e. 10^15 × 86,400 ≈ 8.64 × 10^19 operations in total. So your intuition is correct: it is the rate multiplied by the number of seconds in a day. I hope this clears your doubt.
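To make the unit conversion concrete, here is a minimal sketch of the arithmetic (the constant names are just illustrative):

```python
FLOP_PER_PETAFLOP = 10**15
SECONDS_PER_DAY = 24 * 60 * 60  # 86,400 seconds

# One petaFLOP/s-day = (rate) x (time):
# 10^15 FLOP/s sustained for one full day.
one_pfs_day = FLOP_PER_PETAFLOP * SECONDS_PER_DAY
print(one_pfs_day)  # 86400000000000000000 = 8.64e19 operations
```

So a petaFLOP/s-day is not itself a speed; it is a fixed quantity of floating point work, about 8.64 × 10^19 operations.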


Thanks for the response @Atharva_Divekar, but I am still not clear. Could you offer a concrete example where the number of petaFLOP/s-days is not equal to the rate multiplied by the number of seconds in a day? That would help clarify matters for me. Thanks again.

@weareus Let us assume you have a machine with a compute capacity of one petaFLOP/s. Suppose that you want to train GPT-3, which has 175 billion parameters. Then you would need to run your machine continuously for 24 \times 60 \times 60 \times 4000 = 345.6 \times 10^6 seconds, i.e. about 345.6 million seconds. In that time you would have performed 345.6 \times 10^6 petaFLOPs of operations. Reducing this to days gives 4000 petaFLOP/s-days (roughly 11 petaFLOP/s-years). That's why training such a huge model is computationally prohibitive (except for a few tech giants)!
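The arithmetic above can be sketched in a few lines of Python (the 4000 petaFLOP/s-days figure is the one quoted in this thread for GPT-3 pre-training):

```python
SECONDS_PER_DAY = 24 * 60 * 60  # 86,400 seconds
pf_s_days = 4000  # total compute for GPT-3 pre-training, in petaFLOP/s-days

# On a 1 petaFLOP/s machine, compute-days equal wall-clock days,
# so the required wall-clock time is simply pf_s_days worth of days.
seconds_needed = pf_s_days * SECONDS_PER_DAY
years_needed = pf_s_days / 365.25

print(seconds_needed)  # 345600000 seconds (~345.6 million)
print(years_needed)    # ~10.95 years, i.e. roughly 11 petaFLOP/s-years
```

A faster machine shortens the wall-clock time proportionally, but the total compute stays fixed at 4000 petaFLOP/s-days; that is exactly why the unit measures work rather than speed.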

With the same machine, we could train the BERT-base model in about 3 days (3 petaFLOP/s-days). Hope this helps!


Thanks @Arun_Prakash_A. Could you please explain how you got the number 4000 in your calculation?

@weareus From this paper: https://arxiv.org/pdf/2005.14165.pdf

4000 is the number of petaFLOP/s-days it took to pre-train GPT-3.

PetaFLOP/s-days is a good unit for High Throughput Computing (HTC) rather than High Performance Computing (HPC), where only the (prefix)FLOP/s rate matters.
In HTC, what matters is how much compute you get per day, week, month, and so on.