Pemalite said:
GOWTLOZ said:

Flops are relevant; they give a good ballpark estimate of what the hardware can do.

No, they don't.
Here are two GeForce GT 730s. Roughly the same flops; a 50-65% performance difference.
http://www.jagatreview.com/2015/04/gt-730-ddr3-vs-gt-730-gddr5/2/

Here are two GeForce GT 1030s... Roughly the same flops; a 100-125% performance difference.
https://www.gamersnexus.net/hwreviews/3330-gt-1030-ddr4-vs-gt-1030-gddr5-benchmark-worst-graphics-card-2018

Here are two Radeon HD 7750s. Roughly the same flops... a 40-50% performance difference.
https://www.goldfries.com/computing/gddr3-vs-gddr5-graphic-card-comparison-see-the-difference-with-the-amd-radeon-hd-7750/

How about the Radeon HD 5870 and Radeon HD 7850? The 5870 is 2.72 teraflops versus the 7850's 1.76 teraflops. - The 5870 should win, right? It has almost a full teraflop more?
Nope. It loses by about 8-55%.

https://www.anandtech.com/bench/product/1062?vs=1076
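For anyone wondering where those teraflop figures come from, they fall straight out of the usual shaders × clock × 2 calculation, counting one fused multiply-add per shader per clock. A minimal sketch in Python, assuming the reference specs of both cards (1600 shaders at 850 MHz for the 5870, 1024 shaders at 860 MHz for the 7850):

```python
# Theoretical FP32 throughput: shaders * clock * 2 ops (one fused multiply-add per shader per clock).
def theoretical_tflops(shaders, clock_mhz):
    return shaders * clock_mhz * 1e6 * 2 / 1e12

# Approximate reference specs: HD 5870 = 1600 shaders @ 850 MHz, HD 7850 = 1024 shaders @ 860 MHz.
print(round(theoretical_tflops(1600, 850), 2))  # 2.72 TFLOPS for the HD 5870
print(round(theoretical_tflops(1024, 860), 2))  # 1.76 TFLOPS for the HD 7850
```

So the 5870 has nearly a full teraflop more on paper, yet the newer GCN-based 7850 still wins the linked benchmarks. The formula counts nothing but shader math; the rest of the design is invisible to it.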

Tell me again how accurately flops determine a GPU's capabilities. - The evidence overwhelmingly says you are wrong on that claim. Like... Really wrong.

Flops tell us only one thing...
The theoretical single-precision floating-point performance of a part. They tell us absolutely nothing about the half-precision, double-precision, quarter-precision, integer, geometry, bandwidth, texturing, compression or culling capabilities of a part... In short, just like bits... it's a useless metric on its own.

The performance differences are due to the different memory used in each comparison; see the rough bandwidth sketch after the list.

1. DDR3 vs GDDR5 (the GT 730s).

2. DDR4 vs GDDR5 (the GT 1030s).

3. GDDR3 vs GDDR5 (the HD 7750s).
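
To put rough numbers on how large those memory gaps are, here is a minimal bandwidth sketch (peak bandwidth = bus width / 8 × effective data rate), using the GT 1030 pairing as the example and assuming the published 64-bit bus with roughly 2.1 GT/s (DDR4) versus 6.0 GT/s (GDDR5) effective rates:

```python
# Peak memory bandwidth (GB/s) = (bus width in bits / 8) * effective data rate (GT/s).
def peak_bandwidth_gbps(bus_width_bits, data_rate_gtps):
    return bus_width_bits / 8 * data_rate_gtps

# GT 1030, 64-bit bus, approximate published memory speeds.
print(peak_bandwidth_gbps(64, 2.1))  # ~16.8 GB/s for the DDR4 card
print(peak_bandwidth_gbps(64, 6.0))  # ~48.0 GB/s for the GDDR5 card
```

That is roughly a 3x bandwidth gap between two cards with essentially identical flops, which is what the benchmarks above are really showing.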

There are other reasons as well, such as the difference in VRAM capacity between the two GT 730s. Flops are a good indicator of performance, but drivers, APIs and other factors matter too. Next-gen consoles will most likely use GDDR5 or GDDR6, and that, together with the flop figures, should give a decent indication of their performance.