
Forums - Gaming Discussion - Rumor: PS5 & Anaconda (Scarlett) GPUs on par with RTX 2080, Xbox exclusives focus on cross-gen, developers complain about Lockhart. UPDATE: Windows Central says Xbox Anaconda targets 12 teraflops

 

What do you think?

I am excited for next gen: 22 (61.11%)
I cannot wait to play next gen consoles: 4 (11.11%)
I need to find another th...: 2 (5.56%)
I am worried about next gen: 8 (22.22%)

Total: 36
HollyGamer said:

I never said that. I said that focusing on old tech and old hardware does not help future game design, because it obviously and logically holds back any idea that could be implemented in games. Evolution happens when we move from old to new hardware. Designing games around old hardware hampers the ambition and imagination of game creators, developers, designers, graphic designers, level artists, level designers, AI engineers, and even programmers like yourself, who could be building new games in a better environment.

"You do have diminishing returns. There is absolutely zero point building games to the metal anymore with how good compilers are these days; when was the last time a game was written entirely in Assembly? It didn't happen even last console generation... The same is happening to graphics APIs."

I just proved it in the other thread.

PC though. I can run the latest games on a CPU from 2007. Game design isn't affected.

DonFerrari said:

Just to reinforce: I gave you sources showing that a lot of the coding on TLOU was done in Assembly.

We had this discussion once before, I think, and the conclusion was that Assembly was certainly used, but only for the scripting via GOAL (Game Oriented Assembly Lisp), and it was far from the norm. Obviously my statement was inaccurate to that extent; thanks for pointing it out.

https://en.wikipedia.org/wiki/Game_Oriented_Assembly_Lisp

One thing to remember is that Assembly isn't machine code either. It's a low-level programming language.
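The distinction is easy to demonstrate: an assembler translates human-readable mnemonics into the machine-code bytes the CPU actually executes. A minimal sketch (the encoding shown is the standard x86-64 one for `mov eax, 1`):

```python
# Assembly is a human-readable notation; machine code is the raw bytes.
# The assembler is the translator between the two.
# x86-64 example: "mov eax, 1" assembles to opcode B8 (mov r32, imm32,
# with eax as register 0) followed by the 32-bit immediate, little-endian.

asm_line = "mov eax, 1"
machine_code = bytes([0xB8, 0x01, 0x00, 0x00, 0x00])

print(asm_line, "->", machine_code.hex(" "))
# mov eax, 1 -> b8 01 00 00 00
```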

drkohler said:
And yes, flops are perfectly relevant for gauging the new consoles. They both use the same technology, so a direct comparison can be made. If we take the best rumours for the Xbox camp, a 12TF/24GByte new Xbox will be significantly "better" than a 10TF/16GByte PS5, no question about that.

No they aren't relevant.

FLOPS doesn't account for things like ray tracing cores and their performance; one console might have twice the ray tracing performance of the other, and people's use of FLOPS does NOT account for that.

There might be other silicon differences as well, Microsoft may continue to opt for a semi-custom command processor in order to hardware accelerate API routines.

And of course, you have units like geometry and texturing that FLOPS doesn't account for either.

Flops is a useless metric: it's a theoretical number, not a real-world one. It's always been that way.

Barkley said:

Firstly, I didn't mention anything about performance at all in that post; it's literally just a post saying what the difference is in flops, nothing else.

Secondly, if two GPUs have the same architecture, then surely greater flops = greater performance.

Nope.

More flops doesn't equate to greater performance.

Take the GeForce GT 1030 DDR4 and GDDR5 variants: even if you overclocked the DDR4 variant beyond the GDDR5 variant, so that the DDR4 card has more flops, it would still be slower by a significant amount.
Normally, at the same clocks/flops, the DDR4 variant has half the performance.
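The reason is memory bandwidth, which FLOPS ignores entirely. A quick sketch using the two cards' published memory specs (64-bit bus on both; roughly 6 GT/s effective for the GDDR5 card versus about 2.1 GT/s for the DDR4 card):

```python
# Why identical FLOPS can hide a 2-3x performance gap: peak memory
# bandwidth = bus width in bytes * effective data rate.

def bandwidth_gb_s(bus_width_bits: int, data_rate_gt_s: float) -> float:
    """Peak memory bandwidth in GB/s."""
    return (bus_width_bits / 8) * data_rate_gt_s

gddr5 = bandwidth_gb_s(64, 6.0)  # GT 1030 GDDR5 variant
ddr4 = bandwidth_gb_s(64, 2.1)   # GT 1030 DDR4 variant

print(f"GDDR5: {gddr5:.1f} GB/s, DDR4: {ddr4:.1f} GB/s")
# GDDR5: 48.0 GB/s, DDR4: 16.8 GB/s
```

Roughly 2.9x the bandwidth on the GDDR5 card, with the shader core, and therefore the FLOPS figure, unchanged.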

DonFerrari said:

After your 10th "it's final" and equivalent, nope not going to fall for it.

He loves to pass rumors off as fact... Lol.

--::{PC Gaming Master Race}::--

DonFerrari said:
EricHiggin said:

I agree PS4 would still win, but not by nearly as much as you've mentioned, barring any unforeseen problems for either brand.

PS3 vs 360 didn't have a higher-end XB SKU above PS3 though. While I don't think Anaconda would sell a lot if it were more expensive than PS5, I do think just having it as the most powerful console, for the next few years anyway, would change the image of Lockhart to a degree. That wouldn't make Lockhart any better than it actually was, but its perception wouldn't be as poor as if it were the only XB console next gen. It's kind of like a little brother who's a pain and pushes the limits to the point where most people would definitely kick his butt, yet as long as big brother exists, the perception of little brother is somewhat different than it would be otherwise. You can still try to kick his ass, but you're definitely less likely to if you know you might end up having to deal with big brother eventually. With Anaconda around, Lockhart won't just be the weak little console that's going to get 'beat up' by PS5, even if Anaconda sales aren't that strong. Anaconda will partially legitimize Lockhart in some consumers' eyes.

X360 had the best versions of multiplats for most of its life, so to customers it was the higher-end hardware for a lot less. The change to Lockhart's image would be detrimental, because it would look even worse by comparison to Anaconda. X1X didn't manage to outsell PS4 Pro while being a lot stronger and 100 more expensive, so I don't know why you would expect Anaconda to outsell PS5, even more so with Lockhart taking the cost-sensitive MS fans. Your analogy of the brothers actually works against you: any time you have a brilliant big brother, even a regular little brother is seen as bad in comparison. It doesn't make any sense to say "I will buy Lockhart, which is a lot weaker but a bit cheaper, because Anaconda really looks good"; that would be the mentality of someone who buys a Ferrari cap because he can't afford a Ferrari, except here the Lamborghini costs as much as the cap and keychain. Only people who are fans of MS or want their exclusives would think like that.

You also have to put a spotlight on the RROD. MS didn't have that problem this gen, and likely won't next gen, so that significant problem won't scare consumers away. While more than a few victims went out and bought a second or third 360 after the fact, just as many, if not more, didn't buy one because of the reliability issues. If the RROD had never happened, the 360 very likely would have stayed ahead of PS3 to the end.

With over 85M X360s sold, and several accounts of double or more dips, we can't really say there were that many scared consumers. If there were, how would we justify them being scared off the X360 yet then not buying the X1, which doesn't have RROD? You are speculating that more people decided not to buy an X360 due to RROD than people who bought more than one. X360 won with a big lead (similar numbers to PS2) in the USA and UK, and wasn't selling that fast in Europe or Japan even before RROD was a thing. So I don't think X360 would have sold much more without RROD.

PS5 would certainly be the value option in these scenarios, but will the people who want to jump into next gen asap, who typically can't afford to, just pick up a Lockhart because they can? That doesn't necessarily mean they're locked into the XB camp for good. They might stay with Lockhart, or wait until PS5 drops in price and trade Lockhart in for it plus savings, or eventually just upgrade to Anaconda instead.

Sure, some cost-sensitive people will do that. Others will stay on this gen and buy games cheap for a little longer (just look at how pre-price-cut, first-year sales aren't much more than 15% of lifetime sales); others will try the subscriptions, etc. But that would be more relevant if Lockhart were 199, because at 299, a person who wants to save money can just buy an X1 SAD for 99, and that 100 saving against the PS5 probably won't look like much of a saving once the person sees what they win and lose in the trade-off.

Neither Switch nor Wii had next-gen performance. Power isn't everything though; there does have to be something else along with it to make it worth it, like motion controls or a hybrid design. XB will have Game Pass/xCloud as their something else. At $299, Lockhart would have little competition. XB1X almost certainly would have to be dropped, and PS4 Pro will only be temporary until PS5 Pro, 2 or 3 years down the road, whereas Lockhart would be around for the entire gen. Also, assuming it has a similar price to PS4 Pro, and PS exclusives are now only on PS5, are you going to buy that outdated third-party-only hardware, or the up-to-date XB hardware with its exclusives?

xCloud will run even outside of Lockhart, so they don't need to buy it; such a person would be more likely to buy the SAD. MS is talking about keeping their games cross-gen, so even an X1 would serve the purpose; you don't need to migrate to Lockhart for that. Yes, PS4 Pro and PS4 will probably lose support 2 or 3 years after PS5 releases, but considering first party, it seems like Sony will make a clean cut at PS5's release instead of some years after.

I'm not saying PS needs to fear a sales reversal with MS. That would make this gen's mistakes seem like a joke. Just that if MS shows up and puts forward a worthy effort, we're likely to see a competitive race and not another blowout.

That should be the case.

How sure are you that XB1S sales aren't being propped up by XB1X, and would be even worse without it?

XB1 had its own issues. Just because RROD was fixed yet replaced with other things that were seen as issues, doesn't automatically make it a purchase this time around.

People may stick with or buy into last gen, but the main point is about next gen sales. If someone isn't going to bother with next gen, why take them into consideration?

How many people bought Wii just because of motion controls? Most of those people bought in because it was also cheap. One main feature to influence a customer to buy is great if you can provide it, but quite often there's more than just one thing that makes you want to buy a certain console.



Pemalite said:

*quoted post snipped, see above*

He would love being a lawyer in Brazil, with its "full-fledged defense and contradictory" principle, because he passes off 4 rumors as true at the same time even though they disagree with one another.

And about TLOU, yep. That was as low-level as I could find, and I don't think ND have done anything similar on PS4. It also showed that the PS3 either didn't have a mature enough programming library, or that its architecture was so complicated that you really had to put in the work to make good use of it (and that is one of the reasons I don't think any game looked better on PS3/360: no other company put in as much work to make a pretty game on it).

Also, not being an expert, this is just a rough guess, but the difference you get from going low-level, down to the metal, versus using Unreal or the like is perhaps 10%. The rest of the difference is just talent in making the assets, animations, etc. (just look at the difference in quality between games using the same engine).

EricHiggin said:
DonFerrari said:

*quoted post snipped, see above*

We can't be sure, but considering sales only dropped after X1X, there isn't any strong evidence that either PS4 Pro or X1X improved sales of the base model (the bump we saw with X1X was mostly recovery from a dip it caused itself, since it was announced so much earlier, and near launch they also announced a price reduction for the X1S too early).

Sure, X1 has its issues; PS4 does as well. But the point was that PS3 had a lot more hurdles to overcome and was still able to do it.

Sure, the point is next gen, but we were talking about affordability as well, so we would need someone who wants a new machine that is next-gen (and won't be put off by MS saying their games will keep being cross-gen, meaning an X1 would still suffice), and who also wants the cheapest one, without caring that the performance is much lower, say 1080p instead of 4K, for a mere 100 USD difference. I don't really think that is such an expressive number.

Most people I know, and the news we have, say people bought it because of motion controls. The evidence is there: for the first 2 years or so people were paying well above MSRP to buy one.

I'm a very conservative person, so for me to go against what we have historically seen in console sales I would need hard evidence rather than speculation about a future state that would be very different from what already happened, without much difference in the present situation. So don't feel bad if I don't agree with you =p



duduspace11 "Well, since we are estimating costs, Pokemon Red/Blue did cost Nintendo about $50m to make back in 1996"

http://gamrconnect.vgchartz.com/post.php?id=8808363

Mr Puggsly: "Hehe, I said good profit. You said big profit. Frankly, not losing money is what I meant by good. Don't get hung up on semantics"

http://gamrconnect.vgchartz.com/post.php?id=9008994

Azzanation: "PS5 wouldn't sold out at launch without scalpers."

Pemalite said:

*quoted post snipped, see above*

Yes, because the games you play use engines built with new CPUs, from 2008 and later, as the baseline. Imagine if game developers still used the SNES as the baseline for game design today; we might still be stuck in 2D even with ray tracing sitting underutilized.

Having Xbox One as the baseline means you are stuck on the old Jaguar while underutilizing the tech available on Scarlett: the SSD, AVX-256 on Ryzen 3000, faster RAM, ray tracing, the geometry improvements only available on RDNA, and so on, not to mention the machine learning tech that could be used to enhance gameplay, and a lot more possibilities if Scarlett were the baseline.

As a game designer you are limited by the canvas; you need a bigger canvas and better ink.



Pemalite said:

No they aren't relevant.
...........

blablabla removed, particularly completely irrelevant "command processor special sauce" and other silly stuff.
Ray tracing doesn't use floating point operations? I thought integer ray tracing was a more or less failed attempt in the early 2000s, so colour me surprised.

Look, as many times as you falsely yell "Flops are irrelevant", you are still wrong.

The technical baseplate for the new console SoCs is identical. AMD has not gone the extra mile to invent different paths for the identical goals of both consoles. Both MS and Sony have likely added "stuff" to the baseplate, but at the end of the day, it is still the same baseplate both companies relied on when they started designing the new SoCs MANY YEARS AGO.

And for ray tracing, which seems to be your pet argument, do NOT expect to see anything spectacular. You can easily drive a $1200 NVidia 2080Ti into the ground using ray tracing, what do you think entire consoles priced around $450-500 are going to deliver on that war front?



DonFerrari said:
Trumpstyle said:
-

After your 10th "it's final" and equivalent, nope not going to fall for it.

I'm glad you read my post. I have been saying Lockhart 4TF, PS5 8TF, Anaconda 12TF for a long time, but gave up on it after the Navi reveal; it turns out it was very close after all. My post was very long, so here's the short version.

Oberon for PS5 is real (10.2TF, 40 CUs, 2GHz), but it's the devkit; Sony/Microsoft are using RDNA2, which has big improvements over RDNA1/Navi.

Xbox Lockhart: RDNA2 GPU, 4TF+ (18 CUs, 1.8GHz+), Radeon RX 580 gaming performance

PS5: RDNA2 GPU, 9.2TF (36 CUs, 2GHz), GeForce 2080 gaming performance

Xbox Anaconda: RDNA2 GPU, 12TF (48 CUs @ 2GHz or 52 CUs @ 1.8GHz), GeForce 2080 Ti gaming performance

Next year we will have our vengeance on the PC gamers. Do not let them twist the facts: the GeForce 1650S/Radeon 5500XT are the mid-range cards, not the GeForce 2070.



6x master league achiever in starcraft2

Beaten Sigrun on God of war mode

Beaten DOOM ultra-nightmare with NO endless ammo-rune, 2x super shotgun and no decoys on ps4 pro.

1-0 against Grubby in Wc3 frozen throne ladder!!

CGI-Quality said:
drkohler said:

*quoted post snipped, see above*

To start, please tone down the condescending approach. It is unnecessary, and Pem is simply trying to educate people.

Next, it is easy to see his point (if you know this material). For the average gamer, the folks who frequent these discussions, floating point operations per second are largely irrelevant; certainly not something they can use like RAM amount/speed, bus speed, etc. That's what he's saying. FLOPS are theoretical, meaning a higher number indicates a theoretically higher peak performance. However, ask the average gamer what that means and they'll look at you like you're from another dimension. That's all he's saying. Yes, they have a meaning, but they don't dictate the bottom line.

CGI, I guess the problem is that the way Pema put it was that flops are totally irrelevant.

But if we are looking at basically the same architecture with most of the other parts the same, then looking at the GPU alone, with one being 10TF and the other 12TF, the 10TF one would hardly be the better one.

Now, sure, in real-world application, if one has better memory (be it speed, quantity, etc.) or CPU, that advantage may be reversed.

So basically, yes, when Pema says it, what he means is that teraflops aren't the end-all "simple number that shows it is better", not that they really don't matter at all.

Trumpstyle said:

*quoted post snipped, see above*

I read all the posts in this thread. And you can't claim Oberon is real; no rumor can be claimed real until official information is given.

Even for consoles already released to the market, the real processing power was never confirmed, because measurements made by people outside the company aren't reliable. For the Switch and Wii U we never discovered the exact performance of their GPUs; we only had good guesses.

So please stop trying to pass rumor off as official information. And you also can't claim that 4 rumors that differ are all true.




HollyGamer said:

*quoted post snipped, see above*


Engines are simply scalable; that is all there is to it. That doesn't change when new console hardware comes out with new hardware features that get baked into new game engines.

You can turn effects down or off, you can substitute less demanding effects for more demanding ones, and more, which is why we can take games like Doom, The Witcher 3, Overwatch, and Wolfenstein 2, which scale from high-end PC CPUs right down to the Switch... A game like The Witcher 3 still fundamentally plays the same as the PC version despite the catastrophic divide in CPU capabilities.

Scaling a game from 3 CPU cores @ 1GHz on the Switch, to 6 CPU cores @ 1.6GHz on the PlayStation 4, to 8+ CPU cores @ 3.4GHz on PC proves just that.

The Switch was certainly not the baseline for those titles; the Switch didn't even exist when those games were being developed, yet a big open-world game like The Witcher 3 plays great, and game design didn't suffer.

I mean, I get what you are saying: developers do try to build a game to a specific hardware set, but that doesn't mean you cannot scale a game downwards or upwards after the fact.

At the end of the day, things like ray tracing can simply be turned off, and you can reduce geometric complexity in scenes by playing around with tessellation factors and more, and thus scale across different hardware.
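In engine terms, that kind of scaling is usually nothing more exotic than per-platform quality presets over the same game code. A toy sketch (the tier names and settings are illustrative, not taken from any real engine):

```python
# Toy illustration of engine scalability: one code path, per-tier presets.
# Turning a feature off (ray_tracing) or dialing a factor down
# (tessellation_factor) changes the rendering load, not the game design.

PRESETS = {
    "switch": {"resolution": (1280, 720), "shadows": "low",
               "tessellation_factor": 0, "ray_tracing": False},
    "console": {"resolution": (1920, 1080), "shadows": "medium",
                "tessellation_factor": 4, "ray_tracing": False},
    "pc_high": {"resolution": (3840, 2160), "shadows": "ultra",
                "tessellation_factor": 16, "ray_tracing": True},
}

def configure_renderer(tier: str) -> dict:
    """Select the quality preset for a hardware tier."""
    return PRESETS[tier]

print(configure_renderer("switch")["ray_tracing"])   # False
print(configure_renderer("pc_high")["ray_tracing"])  # True
```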

drkohler said:

blablabla removed, particularly completely irrelevant "command processor special sauce" and other silly stuff.
Ray tracing doesn't use floating point operations? I thought integer ray tracing was a more or less failed attempt in the early 2000s, so colour me surprised.

You have misconstrued my statements.

The single-precision floating point numbers being propagated around do NOT include the ray tracing capabilities of the part, because FLOPS is a function of clock rate, multiplied by the number of functional CUDA/RDNA/GCN shader units, multiplied by the number of instructions per clock. It excludes absolutely everything else, including ray tracing capabilities.
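That formula is easy to write down. For AMD's GCN/RDNA parts, each CU carries 64 shader ALUs and a fused multiply-add counts as 2 floating-point operations per clock, which is exactly how the thread's rumored figures are derived (the CU counts and clocks below are the thread's rumors, not confirmed specs):

```python
# Theoretical peak FP32 throughput, per the formula above:
# shader units * ops per clock * clock rate. Nothing else on the chip
# (ray tracing units, geometry, texturing, bandwidth) appears in it.

def teraflops(cus: int, clock_ghz: float,
              alus_per_cu: int = 64, ops_per_clock: int = 2) -> float:
    """Peak single-precision TFLOPS for a GCN/RDNA-style GPU."""
    return cus * alus_per_cu * ops_per_clock * clock_ghz / 1000

print(teraflops(36, 2.0))  # 9.216  -> the rumored "9.2TF" PS5
print(teraflops(40, 2.0))  # 10.24  -> the rumored "10.2TF" Oberon devkit
print(teraflops(48, 2.0))  # 12.288 -> the rumored "12TF" Anaconda
```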

drkohler said:

Look, as many times as you falsely yell "Flops are irrelevant", you are still wrong.

The technical baseplate for the new console SoCs is identical. AMD has not gone the extra mile to invent different paths for the identical goals of both consoles. Both MS and Sony have likely added "stuff" to the baseplate, but at the end of the day, it is still the same baseplate both companies relied on when they started designing the new SoCs MANY YEARS AGO.

And for ray tracing, which seems to be your pet argument, do NOT expect to see anything spectacular. You can easily drive a $1200 NVidia 2080Ti into the ground using ray tracing, what do you think entire consoles priced around $450-500 are going to deliver on that war front?

You can have identical flops with identical chips and still have half the gaming performance.

Thus flops are certainly irrelevant, as they don't account for the capabilities of the entire chip.

Even overclocked, the GeForce GT 1030 DDR4 cannot beat the GDDR5 variant; they are the EXACT same chip with roughly the same flops.
https://www.gamersnexus.net/hwreviews/3330-gt-1030-ddr4-vs-gt-1030-gddr5-benchmark-worst-graphics-card-2018

nVidia's ray tracing on the 2080 Ti is not the same as RDNA2's ray tracing coming out next year, the same technology that next-gen consoles are going to leverage, so it's best not to compare.

Plus, developers are still coming to terms with how to implement ray tracing more effectively; it is certainly a technology that is a big deal.

DonFerrari said:

*quoted post snipped, see above*

Well, they are irrelevant; it's a theoretical number, not a real-world one. The relevant "flop number" would be one based on the actual, real-world throughput the chips can achieve.

And as the GeForce GT 1030 example above shows, you can have identical or even more flops, but because of other compromises you end up with significantly less performance.

DonFerrari said:

*quoted post snipped, see above*

With the Switch, we know exactly what its capabilities are, because Nintendo is using off-the-shelf Tegra components; we know the clock speeds and how many functional units it has thanks to homebrew efforts that cracked the console open.

The Wii U is still a big unknown because it was a semi-custom chip, though we do know it's an AMD-based VLIW GPU with an IBM PowerPC CPU.

And exactly: you can't claim 4 different rumors are all true.

Pemalite said:

*quoted post snipped, see above*

You can scale a game engine down.

But you will possibly lose the benefits of what future, more powerful hardware can do. The tech will stagnate, like how COD kept using the same engine from the PS3 era, or Bethesda on every Fallout game.



12 teraflops confirmed, LOL. In GCN numbers, a Navi teraflop is worth roughly 1.4 GCN teraflops, so 12 × 1.4 = 16.8 GCN-equivalent teraflops; with the Xbox One at about 1.3 teraflops, that is 16.8 / 1.3 ≈ 12.9 times more powerful than the Xbox One.

Flops are not everything when it comes to the final look of games, but they are there as a theoretical measure. So it's a humongous improvement.
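The back-of-envelope conversion in the post above, written out. Both inputs are the poster's assumptions: the 1.4x Navi-per-flop factor is a rough rumor-derived figure, and the Xbox One GPU is taken as roughly 1.3 TF:

```python
# Rough GCN-equivalence math for a 12TF Navi/RDNA2 GPU, using the
# poster's assumed 1.4x perf-per-flop factor for Navi over GCN.

NAVI_TO_GCN = 1.4    # assumed perf-per-flop advantage (not a measured figure)
XBOX_ONE_TF = 1.3    # Xbox One GPU, approx. GCN teraflops

anaconda_tf = 12.0
gcn_equivalent = anaconda_tf * NAVI_TO_GCN
speedup = gcn_equivalent / XBOX_ONE_TF

print(f"{gcn_equivalent:.1f} GCN-equivalent TF, ~{speedup:.1f}x Xbox One")
# 16.8 GCN-equivalent TF, ~12.9x Xbox One
```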

So we still need PS5's performance confirmed before we close this thread.