TheLastStarFighter said:

Please don't use flops in that way.

It's not an accurate denominator for gauging performance, certainly not real-world performance.

You really need to stop saying that tired line over and over in an attempt to sound smart.  You absolutely can compare flops with flops to compare computing capacity.  At no point did I say something like twice the flops = twice the performance, but twice the flops is twice the flops.

No, you cannot. If you have information that points to the contrary, link it; I am sure we would all like to read it.

1) FLOP numbers are typically a single-precision, theoretical compute ceiling that GPUs often don't reach.
2) There is more to rendering games than single-precision floating point.
3) Don't forget about integers.
4) What about double precision and half precision? For mobile, half precision is typically more important than your single-precision GFLOP number.
5) What about bandwidth?
6) What about caches? Memory capacity?
7) What about the memory controller, pipelining?
8) What about texturing performance?
9) What about geometry performance? And more? (Get the picture yet?)
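To make point 1 concrete, here's a minimal sketch of how that theoretical number is usually derived: shader cores times clock speed times two ops per cycle (a fused multiply-add counts as two FLOPs). The figures below are made up for illustration, not any real GPU:

```python
def peak_gflops(cores: int, clock_ghz: float, ops_per_cycle: int = 2) -> float:
    """Theoretical peak single-precision GFLOPS.

    This is a ceiling on paper; real workloads rarely sustain it.
    """
    return cores * clock_ghz * ops_per_cycle

# Hypothetical part: 2048 shader cores at 1.5 GHz.
print(peak_gflops(2048, 1.5))  # 6144.0 GFLOPS
```

Notice that nothing in that formula says anything about bandwidth, caches, texturing, or geometry, which is exactly the problem.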

Not only that, but in the GPU world it's fairly common for a GPU with a lower GFLOP number to outperform a GPU with a higher GFLOP number. (If you wish for me to school you on this and provide examples, I would be more than happy to do so.)
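One reason a lower-GFLOP part can win is that throughput is also capped by how fast memory can feed the shader cores. The roofline model captures this: attainable throughput is the lesser of the compute ceiling and bandwidth times the workload's arithmetic intensity. A minimal sketch, with assumed numbers rather than any real GPU:

```python
def attainable_gflops(peak_gflops: float, bandwidth_gbs: float,
                      flops_per_byte: float) -> float:
    """Roofline model: min of the compute roof and the memory roof."""
    return min(peak_gflops, bandwidth_gbs * flops_per_byte)

# Bandwidth-starved workload (0.5 FLOPs per byte moved):
print(attainable_gflops(6000.0, 200.0, 0.5))    # 100.0 -> bandwidth-bound
# Compute-heavy workload (100 FLOPs per byte):
print(attainable_gflops(6000.0, 200.0, 100.0))  # 6000.0 -> compute-bound
```

In the bandwidth-bound case, a GPU with a lower FLOP ceiling but more bandwidth would come out ahead, which the GFLOP number alone can never tell you.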

So I ask you this: if you are unable to reliably use it to compare GPUs as an overall performance metric, THEN WHY USE IT AT ALL?
Rather, how about you compare GPUs on all their merits instead of just single-precision floating point, so you have a higher degree of relevancy and accuracy?

www.youtube.com/@Pemalite