Pemalite said:
HollyGamer said:

You brought up Bethesda as an example, which means you proved my point. Bethesda has never built a new engine; they have been using the same engine since the 2001 era. Their engine is limited, so it performs badly on hardware that came out after 2001; many effects, graphics, gameplay systems, AI, NPCs, etc. look and play very outdated.

Did you even bother to read, or did you only pick and choose what you wanted?

It pretty much happens with every major game engine.

HollyGamer said:

Yes, a flop is a flop, but how flops perform differs on every uarch; the equation of effectiveness from one uarch to another is very different. The effectiveness of TFLOPS can be measured from one uarch to another. Navi is indeed 1.4 times GCN.

No. A flop is exactly the same regardless of the architecture in question.

A flop is the exact same mathematical operation regardless of whether it's GCN or RDNA; RDNA isn't taking that mathematical operation and doing it differently. The flop is the same.

The issue is... the flops you stand by are a theoretical figure, not a real-world one.

And the reason RDNA gets more performance than GCN isn't because of FLOPS at all. It's everything else that feeds that hardware, as RDNA has the EXACT same instruction set as GCN. - Meaning how it handles mathematical operations is identical to GCN. So you are wrong on all counts.

DonFerrari said:

On your comparison of GPUs, you used one with DDR4 and the other with GDDR5, which would already impact the comparison. We know the core of your argument is that TFLOPS have almost no relevance (and after all your explanations, I think very few people here put much stock in TFLOPS alone), but what I said is ceteris paribus. If everything else on both GPUs is perfectly equal and just the flops are different (let's say because one has a 20% higher clock rate), then the one with the 20% higher clock rate is the stronger GPU (though the rest of the system would have to be made to use this advantage). Now, if you mix in the memory quantity, speed, bandwidth, the design of the APU itself, and everything else, then of course you will only be able to gauge real-life performance after they release. And even then you won't really have a very good measurement, because when the same game runs on two systems, the difference in performance may not be because one is worse than the other but simply how proficient the devs are with that hardware.

You do actually get diminishing returns though.

If we take for example:
* 1024 stream processors × 2 instructions per clock × clock rate
And had...
* 1024 × 2 × 1000MHz = 2 Teraflops.
And
* 1024 × 2 × 1500MHz = 3 Teraflops.
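For clarity, here is that theoretical-peak arithmetic as a minimal Python sketch (the function name is mine; the 2 ops per clock assumes one fused multiply-add counted as two floating-point operations, which is the usual convention behind these figures):

```python
def peak_tflops(stream_processors: int, clock_mhz: float, ops_per_clock: int = 2) -> float:
    """Theoretical peak throughput in teraflops.

    ops_per_clock=2 assumes one fused multiply-add (FMA) per stream
    processor per cycle, counted as two floating-point operations.
    """
    return stream_processors * ops_per_clock * clock_mhz * 1e6 / 1e12

print(peak_tflops(1024, 1000))  # 2.048 -> the "2 Teraflops" above
print(peak_tflops(1024, 1500))  # 3.072 -> the "3 Teraflops" above
```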

The 3 Teraflop part isn't necessarily going to be 50% faster in floating point calculations... The GPU may run its caches at an offset of the core clock speed, and so may not see the same 50% increase in performance. - Thus bottlenecks in the design come into play, which limit your total throughput.
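One common way to illustrate that effect is a roofline-style model (not anything from this thread; the bandwidth and FLOPs-per-byte numbers below are purely hypothetical), where compute throughput is capped by what the memory system can feed:

```python
def attainable_gflops(peak_gflops: float, bandwidth_gbs: float, flops_per_byte: float) -> float:
    # Roofline model: throughput is limited either by raw compute or by
    # memory bandwidth times arithmetic intensity (FLOPs per byte moved).
    return min(peak_gflops, bandwidth_gbs * flops_per_byte)

# Hypothetical: compute jumps 50% (2 TF -> 3 TF) but bandwidth stays fixed.
for peak_gflops in (2048, 3072):
    print(peak_gflops, attainable_gflops(peak_gflops, 256.0, 10.0))
# 2048 is untouched, but 3072 is capped at 2560: only +25%, not +50%.
```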

It's similar to when nVidia had the shader clock independent of the core clock back in the Fermi days.

Plus, FLOPS doesn't account for the entire capabilities of a chip... It doesn't take into account integer, quarter/half/double precision floating point, geometry, texturing, Ray Tracing, and more; it's only one aspect of a GPU, not the complete picture.

It's like using "bits" to determine a console's capabilities.

I do agree that it isn't a linear comparison and that 50% more flops (when all other things are equal) isn't equal to 50% more power.

Also, I would say the difference will also matter depending on whether the base model is the slower one and the other was "boosted", or the base is the faster one and the other was "capped", let's say because of thermal concerns.

But yes, the core of your point remains: flops account for almost nothing when comparing two systems. Still, marketing won't care about that =p

kirby007 said:
CGI-Quality said:

That was down to development, not power. The PS3 had more computing power than the 360, despite the fact that the 360 had the better GPU and unified RAM.

No, but it's the prime example of what pema tried to say.

Not really. He isn't talking about good or bad use of the computational power. He is saying that comparing flops doesn't tell you anything. When the PS3 was fully utilized, it stood above the X360. And don't forget that we are talking about GPUs, and in this area the X360 had a lead over the PS3.


