| taus90 said: yeah sure Flops doesnt mean anything to general purpose hardware and computing, so a game developer who does low level codes and profiling graphics api, will not even entertain you if you say clock speed and flops arent important... flops, clockspeed, ram bus all matters when chasing that elusive frame render budget. |
I never said that flops isn't important. But using it alone as a measure of absolute performance is meaningless and, in practice, highly misleading.
It's a theoretical peak that real-world hardware rarely achieves.
For instance, a GPU with fewer flops can outperform a GPU with more flops. Want me to prove it?
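To make that concrete, here's a minimal back-of-the-envelope sketch. The two GPUs and every number in it are made up; the point is only the shape of the calculation. On a bandwidth-bound shader pass, attainable throughput is capped by memory traffic, not by the ALUs (a basic roofline estimate), so the card with the smaller peak flops number can come out ahead.

```python
# Rough sketch (hypothetical numbers) of why the GPU with more peak flops
# can still lose: a simple roofline estimate for a bandwidth-bound workload.

def peak_tflops(shader_alus: int, clock_ghz: float) -> float:
    """Theoretical FP32 peak: ALUs * clock (GHz) * 2 ops per FMA, in TFLOPS."""
    return shader_alus * clock_ghz * 2 / 1000.0

def attainable_tflops(peak: float, bandwidth_gbs: float, flops_per_byte: float) -> float:
    """Roofline model: throughput is capped by either compute or memory traffic."""
    return min(peak, bandwidth_gbs * flops_per_byte / 1000.0)

# Two made-up GPUs: A has the bigger flops number, B has the wider memory bus.
gpu_a = {"alus": 2560, "clock": 2.2, "bandwidth": 288}   # ~11.3 TFLOPS, 288 GB/s
gpu_b = {"alus": 2304, "clock": 1.9, "bandwidth": 448}   # ~8.8 TFLOPS,  448 GB/s

# Assume a shader pass that only does ~4 FLOPs per byte fetched (bandwidth-bound).
intensity = 4.0
for name, gpu in (("A", gpu_a), ("B", gpu_b)):
    peak = peak_tflops(gpu["alus"], gpu["clock"])
    real = attainable_tflops(peak, gpu["bandwidth"], intensity)
    print(f"GPU {name}: peak {peak:.1f} TFLOPS, attainable ~{real:.2f} TFLOPS")

# GPU A: peak 11.3 TFLOPS, attainable ~1.15 TFLOPS
# GPU B: peak 8.8 TFLOPS, attainable ~1.79 TFLOPS
```

Again, the specs and the 4-flops-per-byte figure are purely illustrative; swap in a compute-bound workload and the ranking can flip right back, which is exactly why you benchmark instead of reading spec sheets.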
| taus90 said: so flops tells a developer what they can achieve on that specific hardware designed for a specific task which is gaming. |
No, it doesn't. You know what does, though? Benchmarks, profiling and testing, all done on dev kits. A flops figure doesn't tell you a game can run at such-and-such a resolution.
Flops alone also completely ignores the precision the floating-point operations run at. It says nothing about integer performance, gives you no idea of geometry throughput, tells you nothing about fillrate... and I could go on.
Megahertz and flops alone do not tell us what the hardware is capable of. They never have and never will. People should stop abusing those metrics.
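For illustration only, here's a small sketch of how many separate metrics hide behind one headline TFLOPS figure. The chip is hypothetical and every number is invented: change the precision assumption and the flops figure doubles, while pixel and texel fillrate come from entirely different units on the die.

```python
# Hypothetical spec sheet for a made-up GPU, showing how many separate
# numbers hide behind a single "TFLOPS" headline figure.

clock_ghz = 1.8
fp32_alus = 2304
rops      = 64     # render output units: drive pixel fillrate, not flops
tmus      = 144    # texture units: drive texel fillrate, not flops

fp32_tflops = fp32_alus * clock_ghz * 2 / 1000   # the headline number
fp16_tflops = fp32_tflops * 2                    # only if the chip runs double-rate FP16
pixel_fill  = rops * clock_ghz                   # gigapixels/s, independent of flops
texel_fill  = tmus * clock_ghz                   # gigatexels/s, also independent

print(f"FP32: {fp32_tflops:.1f} TFLOPS, FP16: {fp16_tflops:.1f} TFLOPS")
print(f"Pixel fill: {pixel_fill:.0f} GP/s, texel fill: {texel_fill:.0f} GT/s")
```

One marketing slide can quote the FP16 number, another the FP32 number, and neither tells you anything about geometry, integer throughput or how quickly the ROPs can actually fill a frame.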

www.youtube.com/@Pemalite