
Rumor: PS5 & Anaconda (Scarlett) GPU on par with RTX 2080, Xbox exclusives focus on cross-gen, developers complain about Lockhart. UPDATE: Windows Central says Xbox Anaconda targets 12 teraflops

 

What do you think?

I am excited for next gen: 22 (61.11%)
I cannot wait to play next gen consoles: 4 (11.11%)
I need to find another th...: 2 (5.56%)
I'm worried about next gen: 8 (22.22%)

Total: 36
DonFerrari said:

It makes almost zero sense that it would need 18 CUs to match the PS4 for compatibility on a 40 CU part; that would mean the new console would have power around PS4 Pro level (the Pro being 2.25x stronger than the base PS4). With the XSX being 4x more powerful than the X1X (and its GPU seemingly about 2x as powerful), a PS5 at PS4 Pro level would be so weak it would need to sell at $199 to have a chance.

Don't forget the 800 MHz vs 2000 MHz GPU clocks...



Trumpstyle said:
Pemalite said:

My mistake. I was reading my database incorrectly.
My point still stands however.

Yeah, I also made a mistake; it turns out Proelite over at Beyond3D was a fake insider. It was he who said the PS5 devkit had 40 CUs, but that looks false.

Oberon remains a mystery; maybe it just has to do with backwards compatibility and nothing else.

Oberon tells us three things: the GPU is clocked at 2 GHz, and it has two backwards compatibility modes. In one mode 18 CUs are active, which matches the PS4; in the other, 40 CUs are active, which doesn't match the PS4 Pro, so you would assume that was a boost mode with all CUs enabled. But the insider at Beyond3D is fake.

Another leak, from Klee: he said the Xbox Series X is using a 64 CU Navi GPU clocked at 1500 MHz.
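For rough context, here is the standard theoretical-FLOPS arithmetic applied to the configurations mentioned above. This is a sketch only: it assumes AMD's usual 64 shaders per CU and one FMA counting as 2 ops per clock, and the CU counts and clocks are the rumored figures from this thread, not confirmed specs.

```python
# Theoretical peak = CUs x 64 shaders/CU x 2 ops/clock (FMA) x clock.
# Sketch only: CU counts and clocks below are the rumored figures.
def tflops(cus: int, clock_ghz: float) -> float:
    return cus * 64 * 2 * clock_ghz / 1000

print(tflops(18, 0.8))  # ~1.84 TF: 18 CUs at the PS4's 800 MHz (matches the base PS4)
print(tflops(40, 2.0))  # ~10.24 TF: all 40 CUs at Oberon's rumored 2 GHz
print(tflops(64, 1.5))  # ~12.29 TF: the rumored 64 CU Navi at 1500 MHz
```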



HollyGamer said:

You brought up Bethesda as an example, which means you proved my point. Bethesda never builds a new engine; they have been using the same engine since the 2001 era. Their engine is limited, so it performs badly on hardware that came out after 2001; many effects, graphics, gameplay elements, AI, NPCs, etc. look and play very outdated.

Did you even bother to read? Or only pick and choose what you want?

It pretty much happens with every major game engine.

HollyGamer said:

Yes, a flop is a flop, but how flops perform differs on every uarch; the effectiveness from one uarch to another is very different, and the effectiveness of TFLOPS can be compared across uarchs. Navi is indeed about 1.4 times as effective as GCN.

No. A flop is exactly the same regardless of the Architecture in question.

A flop is the exact same mathematical operation regardless of whether it's GCN or RDNA; RDNA isn't taking that mathematical operation and doing it differently. The flop is the same.

The issue is... the flops you stand by are a theoretical figure, not a real-world one.

And the reason why RDNA gets more performance than GCN isn't because of FLOPS at all. It's everything else that feeds the hardware, as RDNA has the EXACT same instruction set as GCN, meaning how it handles mathematical operations is identical to GCN. So you are wrong on all counts.

DonFerrari said:

In your comparison of GPUs you used one with DDR4 and the other with GDDR5, which would already skew the comparison. We know the core of your argument is that TFLOPS have almost no relevance (and after all your explanations I think very few people here put much stock in TFLOPS alone), but what I said was ceteris paribus: if everything else on both GPUs is perfectly equal and only the flops differ (say, because one has a 20% higher clock rate), then the one with the 20% higher clock rate is the stronger GPU (though of course the rest of the system would have to be built to exploit that advantage). Now, if you mix in memory quantity, speed, bandwidth, the design of the APU itself, and everything else, then of course you will only be able to judge real-life performance after release. And even then you won't get a very good measurement, because when the same game runs on two systems, a difference in performance may not mean one is worse than the other, but simply reflect how proficient the devs are with that hardware.

You do actually get diminishing returns though.

If we take for example:
* 1024 stream processors * 2 instructions per clock * clock rate
And had...
* 1024 * 2 * 1000 MHz ≈ 2 teraflops
And
* 1024 * 2 * 1500 MHz ≈ 3 teraflops

The 3 teraflop part isn't necessarily going to be 50% faster in floating-point calculations... The GPU may run its caches at an offset of the core clock speed and so may not see the same 50% increase in performance. - Thus bottlenecks in the design come into play, which limit your total throughput.

It's similar to when nVidia had the shader clock independent of the core clock back in the Fermi days.

Plus, FLOPS don't account for the entire capabilities of a chip... They don't take into account integer, quarter/half/double precision floating point, geometry, texturing, ray tracing and more. FLOPS are only one aspect of a GPU, not the complete picture.

It's like using "bits" to determine a console's capabilities.
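As a minimal sketch of the formula above (assuming the usual convention that one FMA counts as 2 floating-point ops per stream processor per clock):

```python
# Theoretical peak = stream processors x ops/clock x clock rate.
def theoretical_tflops(stream_processors: int, clock_mhz: float,
                       ops_per_clock: int = 2) -> float:
    return stream_processors * ops_per_clock * clock_mhz * 1e6 / 1e12

print(theoretical_tflops(1024, 1000))  # ~2.05 TFLOPS
print(theoretical_tflops(1024, 1500))  # ~3.07 TFLOPS (+50% on paper only)
```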



Last edited by Pemalite - on 16 December 2019

--::{PC Gaming Master Race}::--

Pemalite said:
HollyGamer said:

You brought up Bethesda as an example, which means you proved my point. Bethesda never builds a new engine; they have been using the same engine since the 2001 era. Their engine is limited, so it performs badly on hardware that came out after 2001; many effects, graphics, gameplay elements, AI, NPCs, etc. look and play very outdated.

Did you even bother to read? Or only pick and choose what you want?

It pretty much happens with every major game engine.

HollyGamer said:

Yes, a flop is a flop, but how flops perform differs on every uarch; the effectiveness from one uarch to another is very different, and the effectiveness of TFLOPS can be compared across uarchs. Navi is indeed about 1.4 times as effective as GCN.

No. A flop is exactly the same regardless of the Architecture in question.

A flop is the exact same mathematical operation regardless of whether it's GCN or RDNA; RDNA isn't taking that mathematical operation and doing it differently. The flop is the same.

The issue is... the flops you stand by are a theoretical figure, not a real-world one.

And the reason why RDNA gets more performance than GCN isn't because of FLOPS at all. It's everything else that feeds the hardware, as RDNA has the EXACT same instruction set as GCN, meaning how it handles mathematical operations is identical to GCN. So you are wrong on all counts.



Well, my reasoning is based on Digital Foundry, LOL. You just don't seem to want to lose a debate. Typical Australian.

On topic: we actually agree on the same thing. So yeah.



The PS3 had more FLOPS than the Xbox 360, yet most games still looked better on the 360.



 "I think people should define the word crap" - Kirby007

Join the Prediction League http://www.vgchartz.com/predictions

Instead of seeking to convince others, we can be open to changing our own minds, and seek out information that contradicts our own steadfast point of view. Maybe it’ll turn out that those who disagree with you actually have a solid grasp of the facts. There’s a slight possibility that, after all, you’re the one who’s wrong.

CGI-Quality said:
kirby007 said:
The PS3 had more FLOPS than the Xbox 360, yet most games still looked better on the 360.

That was down to development, not power. The PS3 had more computing power than the 360, despite the fact that the 360 had the better GPU and unified RAM.

No, but it's the prime example of what Pema tried to say.



 "I think people should define the word crap" - Kirby007

Join the Prediction League http://www.vgchartz.com/predictions

Instead of seeking to convince others, we can be open to changing our own minds, and seek out information that contradicts our own steadfast point of view. Maybe it’ll turn out that those who disagree with you actually have a solid grasp of the facts. There’s a slight possibility that, after all, you’re the one who’s wrong.

CGI-Quality said:
kirby007 said:

No, but it's the prime example of what Pema tried to say.

His argument is that FLOPS don't tell the whole story. You were actually supporting that by pointing out that the weaker device (based on FLOPS) had better looking games under certain circumstances.

No shit, Sherlock.

WARNED: Flaming ~ CGI

Last edited by CGI-Quality - on 17 December 2019

 "I think people should define the word crap" - Kirby007

Join the Prediction League http://www.vgchartz.com/predictions

Instead of seeking to convince others, we can be open to changing our own minds, and seek out information that contradicts our own steadfast point of view. Maybe it’ll turn out that those who disagree with you actually have a solid grasp of the facts. There’s a slight possibility that, after all, you’re the one who’s wrong.

HollyGamer said:
Trumpstyle said:

Yeah, I also made a mistake; it turns out Proelite over at Beyond3D was a fake insider. It was he who said the PS5 devkit had 40 CUs, but that looks false.

Oberon remains a mystery; maybe it just has to do with backwards compatibility and nothing else.

Oberon tells us three things: the GPU is clocked at 2 GHz, and it has two backwards compatibility modes. In one mode 18 CUs are active, which matches the PS4; in the other, 40 CUs are active, which doesn't match the PS4 Pro, so you would assume that was a boost mode with all CUs enabled. But the insider at Beyond3D is fake.

Another leak, from Klee: he said the Xbox Series X is using a 64 CU Navi GPU clocked at 1500 MHz.

Nope :) He said "Bingo" to 12.083 TF for the Xbox Series X.

64 CUs at 1475 MHz = 12.083 TF
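That figure does check out arithmetically, assuming the usual 64 shaders per CU and 2 ops per clock:

```python
# 64 CUs x 64 shaders/CU x 2 ops/clock (FMA) x 1.475 GHz
print(64 * 64 * 2 * 1.475 / 1000)  # 12.0832 -> ~12.083 TFLOPS
```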



6x master league achiever in starcraft2

Beaten Sigrun on God of war mode

Beaten DOOM ultra-nightmare with NO endless ammo-rune, 2x super shotgun and no decoys on ps4 pro.

1-0 against Grubby in Wc3 frozen throne ladder!!

Pemalite said:
HollyGamer said:

You brought up Bethesda as an example, which means you proved my point. Bethesda never builds a new engine; they have been using the same engine since the 2001 era. Their engine is limited, so it performs badly on hardware that came out after 2001; many effects, graphics, gameplay elements, AI, NPCs, etc. look and play very outdated.

Did you even bother to read? Or only pick and choose what you want?

It pretty much happens with every major game engine.

HollyGamer said:

Yes, a flop is a flop, but how flops perform differs on every uarch; the effectiveness from one uarch to another is very different, and the effectiveness of TFLOPS can be compared across uarchs. Navi is indeed about 1.4 times as effective as GCN.

No. A flop is exactly the same regardless of the Architecture in question.

A flop is the exact same mathematical operation regardless of whether it's GCN or RDNA; RDNA isn't taking that mathematical operation and doing it differently. The flop is the same.

The issue is... the flops you stand by are a theoretical figure, not a real-world one.

And the reason why RDNA gets more performance than GCN isn't because of FLOPS at all. It's everything else that feeds the hardware, as RDNA has the EXACT same instruction set as GCN, meaning how it handles mathematical operations is identical to GCN. So you are wrong on all counts.

DonFerrari said:

In your comparison of GPUs you used one with DDR4 and the other with GDDR5, which would already skew the comparison. We know the core of your argument is that TFLOPS have almost no relevance (and after all your explanations I think very few people here put much stock in TFLOPS alone), but what I said was ceteris paribus: if everything else on both GPUs is perfectly equal and only the flops differ (say, because one has a 20% higher clock rate), then the one with the 20% higher clock rate is the stronger GPU (though of course the rest of the system would have to be built to exploit that advantage). Now, if you mix in memory quantity, speed, bandwidth, the design of the APU itself, and everything else, then of course you will only be able to judge real-life performance after release. And even then you won't get a very good measurement, because when the same game runs on two systems, a difference in performance may not mean one is worse than the other, but simply reflect how proficient the devs are with that hardware.

You do actually get diminishing returns though.

If we take for example:
* 1024 stream processors * 2 instructions per clock * clock rate
And had...
* 1024 * 2 * 1000 MHz ≈ 2 teraflops
And
* 1024 * 2 * 1500 MHz ≈ 3 teraflops

The 3 teraflop part isn't necessarily going to be 50% faster in floating-point calculations... The GPU may run its caches at an offset of the core clock speed and so may not see the same 50% increase in performance. - Thus bottlenecks in the design come into play, which limit your total throughput.

It's similar to when nVidia had the shader clock independent of the core clock back in the Fermi days.

Plus, FLOPS don't account for the entire capabilities of a chip... They don't take into account integer, quarter/half/double precision floating point, geometry, texturing, ray tracing and more. FLOPS are only one aspect of a GPU, not the complete picture.

It's like using "bits" to determine a console's capabilities.

I do agree that it isn't a linear comparison and that 50% more flops (when all other things are equal) doesn't equal 50% more power.

I would also say the difference matters depending on whether the base design is the slower one and the other was "boosted", or the base is the faster one and the other was "capped", say because of thermal concerns.

But yes, the core of your point stands: flops alone account for almost nothing when comparing two systems. Still, marketing won't care about it =p
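A toy model makes the diminishing-returns point concrete. The numbers below are purely illustrative (no real GPU is being modeled): if memory bandwidth stays fixed while the compute peak grows, achievable throughput is capped by whichever limit is hit first.

```python
# Toy roofline-style model: achievable throughput is the lesser of the
# compute peak and what the memory subsystem can feed (illustrative only;
# the 8 flops/byte arithmetic intensity is an arbitrary assumption).
def effective_tflops(peak_tflops: float, bandwidth_gbs: float,
                     flops_per_byte: float = 8.0) -> float:
    bandwidth_cap = bandwidth_gbs * flops_per_byte / 1000  # in TFLOPS
    return min(peak_tflops, bandwidth_cap)

print(effective_tflops(2.0, 256))  # 2.0 TF usable
print(effective_tflops(3.0, 256))  # capped at ~2.05 TF
# 50% more paper FLOPS, but only ~2% more achievable throughput here.
```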

kirby007 said:
CGI-Quality said:

That was down to development, not power. The PS3 had more computing power than the 360, despite the fact that the 360 had the better GPU and unified RAM.

No, but it's the prime example of what Pema tried to say.

Not really. He isn't talking about good or bad use of the computational power. He is saying that comparing flops alone doesn't tell you anything. When the PS3 was fully utilized, it stood above the X360. And don't forget we are talking about GPUs, and in that area the X360 had a lead over the PS3.



duduspace11 "Well, since we are estimating costs, Pokemon Red/Blue did cost Nintendo about $50m to make back in 1996"

http://gamrconnect.vgchartz.com/post.php?id=8808363

Mr Puggsly: "Hehe, I said good profit. You said big profit. Frankly, not losing money is what I meant by good. Don't get hung up on semantics"

http://gamrconnect.vgchartz.com/post.php?id=9008994

Azzanation: "PS5 wouldn't sold out at launch without scalpers."

CGI-Quality said:
kirby007 said:

No shit, Sherlock.

⚠️ WARNED: Flaming ~ CGI

Glad you agree then that your statement was pointless. ^_^

So I got warned because of the pointless statement you made in reply to my post; where is your own warning?

SITUATION:

PEMA: Flops aren't the best way to compare.
KIRBS: Yeah, since the PS3 had more flops but the Xbox 360 had parity in most games, if not better-looking ones in some.
CGI: Reiterates my statement.

How am I the one without a basis here?

Last edited by kirby007 - on 17 December 2019

 "I think people should define the word crap" - Kirby007

Join the Prediction League http://www.vgchartz.com/predictions

Instead of seeking to convince others, we can be open to changing our own minds, and seek out information that contradicts our own steadfast point of view. Maybe it’ll turn out that those who disagree with you actually have a solid grasp of the facts. There’s a slight possibility that, after all, you’re the one who’s wrong.