
Forums - Gaming Discussion - Goodbye Teraflops (PS5 and Xbox Scarlet probably won't compete on teraflop numbers anymore); expect 8 to 9 teraflops for PS5 and Scarlet


 

Poll: What do you think of these teraflop numbers?

Yes: 1 (2.94%)
No: 0 (0%)
"I don't care teraflop, i...": 19 (55.88%)
"I am expecting more": 6 (17.65%)
"These are within my expectation": 4 (11.76%)
"I am impressed we get mor...": 2 (5.88%)
"I still believe even with...": 2 (5.88%)

Total: 34
Pemalite said:
Lafiel said:

Dev kits being more capable than the actual console is normal, as devs need a healthy overhead for analytics programs, debugging software, etc.

At E3 there were contradictory statements about what "4x more powerful" means; Phil, afaik, said it was based on CPU performance, while Matt Booty said it was a combination of several factors like CPU, GPU, SSD and RAM.

The SSD is like a 60-70x increase in bandwidth... And just as impressive on the latency reductions too.

CPU performance should be a no-brainer. 8x-10x depending on the clockrates of the Zen 2 8-core chip and the instructions/workload.

GPU is probably around double the performance of the Xbox One X at most, significantly more once we start talking Ray Tracing.

Memory bandwidth we are probably looking at 50% to double the bandwidth of the Xbox One X, depending how wide they take the bus... But it is likely to be closer to 50%.
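The bus-width speculation above can be sketched with the standard bandwidth formula. The GDDR6 data rate and bus widths below are illustrative guesses, not confirmed specs; the Xbox One X figure is its actual 384-bit GDDR5 at 6.8 Gbps configuration.

```python
# bandwidth (GB/s) = bus width in bits / 8 * per-pin data rate (Gbps)

def bandwidth_gbs(bus_bits, gbps_per_pin):
    """Theoretical peak memory bandwidth in GB/s."""
    return bus_bits / 8 * gbps_per_pin

xbox_one_x = bandwidth_gbs(384, 6.8)   # ~326 GB/s (real X1X config)
print(bandwidth_gbs(256, 14))          # 448 GB/s, ~1.4x the X1X
print(bandwidth_gbs(320, 14))          # 560 GB/s, ~1.7x the X1X
```

A 256-bit GDDR6 bus at common 14 Gbps parts lands near the "closer to 50%" end of the range quoted above; a wider 320-bit bus gets closer to the doubling scenario.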

It's the storage and CPU that we will see the largest leap with, no doubt about it.

HollyGamer said:

10 teraflops is achievable at the current die size and power/TDP for an APU, because it would only take slightly more, 44 or 48 CUs, to reach 10 teraflops. The problem is targeting above 48 CUs (52 to 56), because the die size would then exceed 420 mm².

The other problem is we still don't know what manufacturing process Sony and Microsoft will use for their APUs, 7nm or 7nm+, or the final APU die size for either. I bet if they are willing to lose money in the early release years, they can easily reach 10 to 11.

At the current rate, the RX 5700 XT is already slightly more powerful than, or on par with, the RTX 2070 in benchmarks, and the final PS5 chip, according to leaks, is close to the RTX 2080 in the Fire Strike benchmark. I can see the 2070 mark being hit; now I wish they could push at least a bit closer to the RTX 2080. That way, by the end of 2020, the PS5 would compete with the RTX 3070 and not with a mainstream GPU like the RTX 3060, unlike the PS4, which competed with the GTX 660 (Nvidia's mainstream GPU) when it was released in 2013.

Flops are unimportant, always have been. In saying that... AMD's fastest-clocked GPU is the Navi-based 5700 XT with its 1680MHz base clock. You would need 48 CUs at 1680MHz to break the 10 teraflop barrier. (Number of CUs * shaders per CU * 2 instructions * clockspeed.)
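The formula quoted above can be sketched as a quick calculation. The 64 shaders per CU is the standard GCN/RDNA figure; the 48 CU and 1680MHz numbers are the post's speculation, not confirmed console specs.

```python
# TFLOPs = CUs * shaders-per-CU * 2 ops per clock (fused multiply-add) * clock

def peak_tflops(cus, clock_mhz, shaders_per_cu=64, ops_per_clock=2):
    """Theoretical peak single-precision throughput in TFLOPs."""
    return cus * shaders_per_cu * ops_per_clock * clock_mhz * 1e6 / 1e12

print(peak_tflops(48, 1680))  # ~10.3 TFLOPs, just past the 10 TF mark
print(peak_tflops(40, 1680))  # ~8.6 TFLOPs at the 5700 XT's 40 CUs
```

As the second line shows, the 5700 XT's own 40 CUs at that clock land in the 8-9 TF range the thread title expects.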

7nm is what they will be using; it's what is ramping up now... They will spend months building up chip inventory before the consoles launch, so they need to use what is feasible right now... and that is 7nm.

The GPU side of the equation, the faster the better. Hopefully the next-gen consoles can beat the 4-year old (by that point) Geforce 1080 upon release... But one thing we need to keep in mind is that... Even though the next-gen consoles are based on the PC's Navi GPU, they have a plethora of enhancements which means they are not directly comparable in benchmarks.


vivster said:

But an RX 5700 wouldn't put the PS5 in the range of a 2070, which was my point. Add to that the additional possible chips and their super duper SSD, and it's easy to see that we won't get 2070 levels of performance at a $400 price point.

I wish Sony would just put out an expensive version that has the hardware cranked up and costs $500 or more but I'm not holding my breath. The thing is, even with a 2070 doing 4k60fps is a very tough ask. I believe lots of people are expecting to see 4k60 across the board and they're gonna be very disappointed.

The Geforce 2070 is pretty much upper-mid range in performance now... and by 2020 will definitely be a mid-range part, with a Geforce 3060 likely being its equal.
Shows how far back AMD currently is in the GPU space... But at least they are smashing it out in the CPU space.

I don't expect a $400 USD price point though, I think $500 is the likely target.

DonFerrari said:

Well, I do remember The Cloud Power. And I wouldn't put it past MS to spin the truth. But until we can be certain, I'll give the benefit of the doubt and expect a system that is roughly 4x better than the X1X, or about 16x better than the X1, which would be a reasonable gen-to-gen jump (a little above the usual 8-10x).

I think every console manufacturer is kinda' guilty of "spinning" the truth of their platform's capabilities to some degree or another.

Even doubling the Xbox One X's GPU capabilities is still an impressive task though... One thing we need to keep in mind is that games released for the Playstation 4 Pro and Xbox One X are not utilizing the hardware from the bottom up; games are still very much designed with the base Xbox One and Playstation 4 consoles in mind. The next-gen hardware won't have that problem once we are fully transitioned into the 9th gen, so games should look significantly better than the hardware leap relative to the mid-gen refresh consoles would otherwise imply.

HollyGamer said:

Well, Cerny already talked about no loading times and ray tracing, including audio ray tracing, 4K 60fps, 8K, and 120fps. "Secret sauce" will again be important jargon this time, but I believe console vendors will just compare directly against their previous consoles. For us hardcore enthusiasts, the hype will come from comparisons by gaming news sites like Digital Foundry and gaming forums. Actually, it's a lot easier to hype a console in this day and age using social media and the Internet, unlike past generations.

Ray traced sound is not a new thing. I hope that isn't a buzzword that people cling to entering next-gen without an actual understanding of its ramifications. Haha

Sony is likely piggybacking off AMD's True-Audio though, so the Xbox should have the same capability.

Yes certainly we would have better graphics if games were focused on X1X and Pro instead of the base model.

And my fear for MS is that they keep the X1X "current" as the new base model (which would be a double issue, as it has more expensive components, so they wouldn't win much on price) and limit what they can achieve with Scarlet. That could also be a problem given the rumor that Anaconda (or whatever each version's name is) would be a 1080p version of Scarlet for the same games, with the rest of the performance about equal.

Also, that reply applies to HollyGamer and ManUtdFan (you buying a 1080p version won't make devs decide you should be the focus =p). The Pro and X1X had options for performance or graphics because they were just "scaled" adjustments of the baseline, so devs made the baseline version and took the simple approach of offering just higher FPS or more pixels. But when they milk everything out of the PS5, you won't have those choices.



duduspace11 "Well, since we are estimating costs, Pokemon Red/Blue did cost Nintendo about $50m to make back in 1996"

http://gamrconnect.vgchartz.com/post.php?id=8808363

Mr Puggsly: "Hehe, I said good profit. You said big profit. Frankly, not losing money is what I meant by good. Don't get hung up on semantics"

http://gamrconnect.vgchartz.com/post.php?id=9008994

Azzanation: "PS5 wouldn't sold out at launch without scalpers."

Chrkeller said:

Well I have 3 thoughts on performance

1) I am fine with the power of the Switch. Perhaps it is because I grew up playing ps1/N64, but any game that is HD with a stable frame rate is good enough for me. The ps4 Pro looks beyond amazing.  Pretty much all games are leaps and bounds better than the stuff I played growing up. 

2) Graphics mean nothing without art style. The "prettiest" game I have ever played is Wind Waker HD. I'll take style over graphics all day long.

3) Games matter. I don't care how good something looks, what matters is how good the game is.  And the quality of a game is 99% gameplay, controls and level design.  Dark Souls 3 isn't the prettiest game on the ps4, but it is the best.

I would say that is closer to about 70% in the general sense.  Storytelling has been a large part of gaming for a while.  In some cases it could be an even 50/50 split.  For games that hardly have much control or skill needed like Life is Strange, the story becomes even more important than the gameplay.  Lots of different ways to enjoy a game since the mid to late 90s.



Lube Me Up

Pemalite said:
EricHiggin said:

Not sure you got the main point: in terms of the future marketing of the PS5, pointing out that the XB1X at 6TF wasn't going to be "true 4K" like MS was saying, and that the next PS console would be at least double the GPU performance of the Pro (on paper) and would basically be guaranteed fully native 4K capable. Maybe that's what he was hinting at, maybe not. That wouldn't mean all games would be 4K/60 or whatever, but that they all would be capable of running 4K/30 if a dev wanted, without having to downgrade the eye candy. It would also depend on what PS requires from devs who make games for the PS5.

Cerny may have had an idea about where Navi was planned to land in terms of its capabilities, and would know approximately where the PS5 was planned to land in terms of performance, so he could say something like he did knowing he wouldn't be all that far off; worst case, hopefully, it's actually 8TF and no lower, for his sake. I was looking at the statement less from a tech perspective and more from an overall marketing perspective. If Cerny is going to be the lead architect but also one of the faces and voices of the company, he has to play the PR game too.

And he does play the PR game. That was made abundantly clear when the Playstation 4 and Playstation 4 Pro dropped.
That doesn't mean his statements don't carry some credible tidbits of information... But his claims need to be weighed appropriately and within a non-hyped context.

The point I am trying to convey is that all the console manufacturers play these "games" in order to assert their platform as the best gaming platform, it's been going on for decades.

While it's possible that Cerny was playing it super safe, based on where GCN was headed in terms of TF calculated performance, 8TF seems unbelievably low for the PS5, without knowing that the Navi GPU you're going to be using is likely to land around that calculated performance with RDNA. This was probably a hint way back, but also PR to smear XB1X even though it wouldn't be lying technically. Smart PR though because he could have said 12TF basing it off of old GCN, while potentially causing PS a headache later on if the PS5 launched with less than that, which is very well possible if not likely at this point.

From more of a tech perspective, the Flops in general mean very little yes. It's just a ballpark figure, which is typically used to compare models within a series, or gaming performance for most casuals. It really only matters if it's an extremely direct comparison, which almost never is the case, be it from one iteration to the next or between brands. Even worse when considerable changes are finally made to the arch. While this message is being pushed more, to your typical casual gamer, it's meaningless for the most part. The best seller and the price matter way more, which should come from the best balance of tech and games.

It's getting much tougher to sell someone on your hardware based on the games themselves visually. Trying to prove it through video is extremely tough today for so many reasons. Like for one, how do you prove your 4k box is better than their 4k box, when your 4k video can only be viewed by many at 1080p online? A bigger TF number is a much easier and simpler way of 'proving' that, even though it doesn't mean all that much. For a consumer who doesn't have the time or knowledge or ability to know the difference, specs matter more and more, especially if you can't actually outsell your cheaper 'inferior' competition.



But I thought the VGChartz expert users were saying monster box; seems they scattered fast. You're not even getting a 5700 in that box at 180W, no chance. This will be a gen all about the CPU, and probably the smallest visual upgrade ever seen between console generations.



DonFerrari said:
And my fear for MS is that they keep the X1X "current" as the new base model (which would be a double issue, as it has more expensive components, so they wouldn't win much on price) and limit what they can achieve with Scarlet. That could also be a problem given the rumor that Anaconda (or whatever each version's name is) would be a 1080p version of Scarlet for the same games, with the rest of the performance about equal.

Also, that reply applies to HollyGamer and ManUtdFan (you buying a 1080p version won't make devs decide you should be the focus =p). The Pro and X1X had options for performance or graphics because they were just "scaled" adjustments of the baseline, so devs made the baseline version and took the simple approach of offering just higher FPS or more pixels. But when they milk everything out of the PS5, you won't have those choices.

That's exactly MS's plan: all Xboxes, as low as the S, will play everything.



Random_Matt said:

That's exactly MS's plan: all Xboxes, as low as the S, will play everything.

That will be a very bad thing imho.

PS5 exclusives will be much better visually than Xb4




Barkley said:

The main thing the extra power's going to be wasted on is 4K.

Agreed; the power spent on native 4K would be better spent on any number of other things IMO, not just 60fps but lighting, materials, effects. Unfortunately, 4K is a sexy buzzword that the companies will use to sell their shiny new boxes.

Given the huge jump in CPU power, what I'm most hoping to see is exactly what I wanted from the 8th gen but which it failed to deliver: a big jump in simulation and interactivity, like BOTW's physics/chemistry stuff but x10.



ManUtdFan said:
All this extra power will be wasted on raytracing, which won't ever deliver photorealistic graphics, unlike path tracing. Don't want no shiny bricks or cartoonish, sterile environments...

And so we will have to wait another 10 years for discernible upgrade to graphics quality.

Path Tracing is Ray Tracing. Not all Ray Tracing is created equal... Games have been dipping their toes into the Ray Tracing waters for over a decade now to various degrees... It's being popularized now because of nVidia and what next-gen hardware might potentially bring to the table.

The biggest limitation to games currently is certainly lighting though, developers over the years have tried their best to 'fake' Ray tracing with baked lighting and so on which occurred heavily during the 7th gen.

I say reserve judgement until we see the games and hardware in action.

HollyGamer said:

You'd better buy a 1080p PC monitor with a 120Hz refresh rate for next gen if you plan to buy either of the two, because I bet they will let us choose performance (frame rates over resolution) like with the PS4 Pro and Xbox One X. And the PS5/Scarlet will have 120fps capability with HDMI 2.1 and FreeSync. It's also cheaper, especially a FreeSync monitor.

I don't think that really happened on the Xbox One X often though? I know a few titles did... But it was far from the norm really.

HollyGamer said:

It can be a tool to compare compute capability between similar architectures, say the RTX 2080 and 2070.

Well, no it can't... I have been over this a million times on this forum... But I'll go again.

There are a ton of different types of compute workloads... We have 8-bit, 16-bit, 32-bit and 64-bit floating point, and then 8-bit, 16-bit, 24-bit, 32-bit and 64-bit integers as well. For example, most GPUs don't have native 64-bit integer support, so they emulate that functionality on the 32-bit units... which comes with a corresponding hit to performance.
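A minimal sketch of what that emulation looks like for addition: split each operand into 32-bit halves, add the low halves, and propagate the carry into the high halves. This is illustrative only; real GPU ISAs do it with add-with-carry instructions, but the point stands that one 64-bit op costs multiple 32-bit ops.

```python
MASK32 = 0xFFFFFFFF

def add64_via_32bit(a, b):
    """64-bit wrapping add built entirely from 32-bit operations."""
    a_lo, a_hi = a & MASK32, (a >> 32) & MASK32
    b_lo, b_hi = b & MASK32, (b >> 32) & MASK32
    lo = a_lo + b_lo                     # may overflow 32 bits
    carry = lo >> 32                     # 0 or 1
    hi = (a_hi + b_hi + carry) & MASK32  # wrap at 64 bits overall
    return (hi << 32) | (lo & MASK32)

print(hex(add64_via_32bit(0xFFFFFFFF, 1)))  # 0x100000000, carry propagated
```

Two adds plus carry handling where native hardware would use one instruction, which is exactly the performance hit described above.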

And FLOPS only refers to one of those: 32-bit floating point, aka single precision.

Not to mention that other parts of the chip come into play during pure compute workloads as well... For example, the Geforce 1030 has DDR4 and GDDR5 variants... In terms of compute there is only a 6.5% difference (1152MHz vs 1227MHz @ 384 CUDA cores), but real-world performance is often 50% or more slower.
https://www.gamersnexus.net/hwreviews/3330-gt-1030-ddr4-vs-gt-1030-gddr5-benchmark-worst-graphics-card-2018

Ethereum mining, for example, which tends to be a pure compute workload, also sees significant gains from more memory bandwidth on the graphics card... which just further emphasizes that flops is not a relevant capability measurement tool even between GPUs of the same architecture.

HollyGamer said:

The possibility of using 7nm+ is still there, especially since Sony/Microsoft can have other fabrication manufacturers produce the PS5/Scarlet chips; AMD doesn't produce chips, it just designs them. Also, the problem isn't Sony's or Microsoft's money, because they have the money. They can choose TSMC or Samsung, differing from AMD's own PC GPU lineup, just like the PS4 Pro used 16nm fabrication while the Polaris RX 480 used 14nm.

Also, Samsung has already stated they can mass-produce 7nm+, and some smartphones already use it for their parts. So 7nm+ is still in the realm of possibility.

GlobalFoundries has stepped away from the leading-edge fabrication race... so they are out.
TSMC's 7nm+ is going to be leveraging EUV, so the designs won't just automatically translate over.

TSMC's 7nm+ is likely to only be a marginal improvement over 7nm anyway with 20% density improvements, 10% performance increase... Probably not worth making the gamble on that process for the next gen consoles monolithic chips. - Yields are stupidly important.

TSMC's capacity is also only going to be 1,000~ wafers per day... That can't all be reserved for the next gen consoles, other partners building ARM processors, GPU's and other pieces of logic will be jumping at that as well.

Samsung's 7nm process, however, will be employing EUV... But let's keep in mind that it's not the same as TSMC's 7nm+ process; don't fall into the trap of their marketing shenanigans... You can't compare Samsung's 7nm to TSMC's 7nm; marketing has made number-to-number comparisons useless.
Both TSMC and Samsung will have up to quad patterning in their fabs.

Is it possible that the next-gen consoles could use 7nm+? Yes. It's possible, it's just highly unlikely at this stage.

And yes, money is important for Microsoft and Sony; the more you spend on chips and their production, the higher your costs for designing and building a console become, which flows on to the consumer.
Long gone are the days when it's financially feasible for console manufacturers to dump billions on designing chips for their devices... Sony and Microsoft have limits, you know, generally governed by shareholder expectations.

HollyGamer said:

I know it's slower than the 1080 Ti, but the Gonzalo benchmark is roughly the same performance as an RTX 2070 or RX 5700 XT; that has always been the target spec for PS5/Scarlet, and it's enough for a console.

I think myself and many others had hoped that the next-gen consoles would be targeting high-end GPU performance rather than the mid-range that the Xbox One and Playstation 4 eventually settled on to see a bigger leap in general fidelity.
Don't get me wrong, we will see a big leap, it's just not going to be as impressive as it could be... Partly that is down to AMD not being able to keep pace with nVidia's performance cadence.

Is it enough for a console? I would argue more is better.

HollyGamer said:

Even 1080 Ti performance can be achieved by a GTX 1080 through optimization, if a game is made specifically for the GTX 1080.

But if you optimize for a Geforce 1080Ti, then the same performance gap between the GTX 1080 and 1080Ti will continue to exist.

Optimization isn't some magical construct that makes hardware more capable and excludes all other pieces of hardware.

HollyGamer said:

The problem is the PC always needs more raw power to run games, because the PC struggles with optimization (most PC game demos and trailers use a 1080 Ti to avoid bugs and trouble). Also, the PS4 uses a slower, modified 7870 (2 fewer CUs and a lower GPU clock speed) that is better than the 7850.

The PC gets optimizations... I think console gamers forget this.
For example... Whenever AMD and nVidia roll out a driver update it's not just to make things look prettier or fix bugs... But to introduce optimizations that improve performance...
For example here we can see where optimizations improved performance by 15%.
https://www.tomshardware.com/reviews/amd-nvidia-driver-updates-performance-tested,5707.html

Microsoft does a similar thing with Windows, which will often increase performance. For example:
https://www.techpowerup.com/255843/windows-10-may-2019-update-1903-gaming-performance-tested-in-21-titles-with-rtx-2080-ti-and-radeon-vii

And of course we have improvements at the API level:
https://www.redgamingtech.com/how-much-better-is-performance-with-modern-apis-directx-12-vs-directx-11-opengl-vs-vulkan/

And often game developers will roll out updates that also improve the performance of their title.
https://www.techspot.com/review/1759-ray-tracing-benchmarks-vol-2/

So obviously "optimizations" isn't just a console-only thing. - The evidence is simply undeniable at this point.

In short, there is absolutely no game that runs on a Playstation 4 that can't run on a Radeon 7870... Often, games will run better on a Radeon 7870 at the same visual settings as the Playstation 4 too... Like Overwatch, Battlefield 1, Grand Theft Auto 5 and so on.

Plus you get to choose your settings on the PC... Game doesn't run at full 1080P on the Playstation 4? Well, on a Radeon 7870 it can, just lower a couple of settings.

HollyGamer said:
Please read again; I said "the bottom line". I agree with you that the PS5 will just be comparable to an RTX 3060/RTX 2070, or close enough. I am a realist, but at the same time I'm also a dreamer; everybody can have a dream, right? We also don't know the final price of the PS5/Scarlet; if they want, they can just increase the price to get a better GPU and a larger die size, even though it would sacrifice TDP/TBP, price and size.

They could, but... People will whinge.
Price is a stupidly important factor for a lot of people, especially for those who sit lower on the socio-economic ladder.
Remember the Xbox One at $500, remember the Playstation 3 at $600... They were all contentious price points.

HollyGamer said:
Ray tracing is the holy grail for every 3D game developer; it's a very expensive technique that requires expensive hardware, which is why it could be a selling point, at least for gamers. I know it will cripple performance, especially on console, but developers can still use it as a marketing tool for hyping the console. Probably even on PS5/Scarlet we won't see many games using RT; probably some first-party IPs, or light RT in some triple-A games. It will just be a combination of RT and rasterization.

We are still a long way away from a full ray traced gaming world, it might take a few more console generations for rasterization to fall away to the side.

Ray Tracing will be used on the 9th gen hardware, just like it is being used in some 8th gen games; it's the extent of its use that is up for debate.

HollyGamer said:
I am not saying backward compatibility "alone can achieve that", but it's enough to sell the PS5, because there are so many factors that made all the consoles you mentioned fail in the market, or at least miss their targets, even though they had backward compatibility. The PS brand after the PS4's success, on the other hand, already has name recognition, credibility, backward compatibility, and a brand bigger than Xbox even in the US alone (not even counting the rest of the globe). If the PS5 doesn't repeat the same problems the PS3 had (expensive with no power advantage over its competitor visible to normal people, late to market, bad controller), then the PS5 will sell like hot cakes to every PS4 owner.

It's a value-added incentive. Not the be-all, end-all selling point.

Scarlett will be rolling out Xbox 360 and original Xbox games in its backwards compatibility efforts as well, hence why Microsoft pulled that team away from the Xbox One. Will Sony do the same with Playstation 1, 2 and 3 backwards compatibility on the Playstation 5? Or would you deem it unimportant?

I am not willing to place bets on how well any console is going to sell, my tastes don't align with the average consumer.

HollyGamer said:
Indeed it's a minority, but in the first year or early years, all PS4 owners will easily adopt the PS5 because they have a big backlog of games that can be replayed with enhanced options on the PS5. The PS3/Xbox 360 transition's early years were different, because the PS4 and early Xbox One hardware were not able to run old games. If the Xbox One had had this feature in its early days, it would have been a different story.

Good thing I provided a fairly comprehensive list of successive consoles that launched with full backwards compatibility (in hardware!) yet didn't sell as well as their predecessors.

And as the evidence I provided earlier shows, backwards compatibility isn't actually a big selling point for most gamers. It's a value-added incentive, sure, but it's far from the most important aspect... Otherwise everyone would be a PC gamer, since you can run your PC games from 30~ years ago.

DonFerrari said:

Yes certainly we would have better graphics if games were focused on X1X and Pro instead of the base model.

And my fear for MS is that they keep the X1X "current" as the new base model (which would be a double issue, as it has more expensive components, so they wouldn't win much on price) and limit what they can achieve with Scarlet. That could also be a problem given the rumor that Anaconda (or whatever each version's name is) would be a 1080p version of Scarlet for the same games, with the rest of the performance about equal.

Yeah. I would dislike for the Xbox One X to be retained as any kind of model for the next-gen.
I don't see it happening though, the cooling set-up, power delivery and so on means it's not the most cost-effective device to build, it would need a significant cost-reduction revision if Microsoft intended to have it as a low-end alternative to next-gen.

EricHiggin said:

While it's possible that Cerny was playing it super safe, based on where GCN was headed in terms of TF calculated performance, 8TF seems unbelievably low for the PS5, without knowing that the Navi GPU you're going to be using is likely to land around that calculated performance with RDNA. This was probably a hint way back, but also PR to smear XB1X even though it wouldn't be lying technically. Smart PR though because he could have said 12TF basing it off of old GCN, while potentially causing PS a headache later on if the PS5 launched with less than that, which is very well possible if not likely at this point.

I don't think 8 Teraflops is low at all.
I think why Teraflops wasn't mentioned earlier before RDNA became a "thing" was for this very reason, flops is irrelevant and they can't use it for marketing (like bits!) forever.

EricHiggin said:

From more of a tech perspective, the Flops in general mean very little yes. It's just a ballpark figure, which is typically used to compare models within a series, or gaming performance for most casuals. It really only matters if it's an extremely direct comparison, which almost never is the case, be it from one iteration to the next or between brands. Even worse when considerable changes are finally made to the arch. While this message is being pushed more, to your typical casual gamer, it's meaningless for the most part. The best seller and the price matter way more, which should come from the best balance of tech and games.

It's not even a ballpark figure; it's a theoretical maximum that is simply unachievable in the real world... Otherwise there wouldn't be a constant race to make chip designs more efficient every year...
I mean, GPUs with more flops can end up slower than GPUs with fewer flops.
For example, the Radeon 5870 @ 2.72 Teraflops is slower than the Radeon 7850 @ 1.76 Teraflops. Almost 1 Teraflop less, yet the 7850 is sometimes faster by almost 50%... And they have the same amount of memory bandwidth too. (153GB/s)
https://www.anandtech.com/bench/product/1062?vs=1076

Not to mention, the majority of people have absolutely no idea how FLOPS even pertains to the rendering of a game's world anyway.
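That theoretical number falls straight out of a one-line formula, which is exactly why it tells you so little: it counts every shader issuing a fused multiply-add every single cycle, something no real workload sustains. A quick sketch (shader counts and clocks taken from the two cards' public spec sheets):

```python
# Theoretical single-precision FLOPS: shader cores x clock x 2 ops per cycle
# (one fused multiply-add counts as two floating-point operations).
def theoretical_tflops(shader_cores: int, clock_mhz: float) -> float:
    return shader_cores * clock_mhz * 1e6 * 2 / 1e12

# Specs for the two cards compared above.
hd5870 = theoretical_tflops(1600, 850)   # Radeon HD 5870
hd7850 = theoretical_tflops(1024, 860)   # Radeon HD 7850

print(f"HD 5870: {hd5870:.2f} TFLOPS")   # ~2.72 TFLOPS
print(f"HD 7850: {hd7850:.2f} TFLOPS")   # ~1.76 TFLOPS
```

Nothing in that formula knows about caches, schedulers, geometry throughput or memory latency, which is where the 7850's newer GCN architecture wins despite the smaller number.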

EricHiggin said:

It's getting much tougher to sell someone on your hardware based on the games themselves visually. Trying to prove it through video is extremely tough today for so many reasons. Like for one, how do you prove your 4k box is better than their 4k box, when your 4k video can only be viewed by many at 1080p online? A bigger TF number is a much easier and simpler way of 'proving' that, even though it doesn't mean all that much. For a consumer who doesn't have the time or knowledge or ability to know the difference, specs matter more and more, especially if you can't actually outsell your cheaper 'inferior' competition.

Downsampling/Supersampling means that 4k can look better on a 1080P display than native 1080P content on a 1080P display.
We are far from the point of photorealism in gaming even at 720P... Which means it's still possible to showcase differences.

The real crux is that the Xbox and Playstation consoles are getting so close in terms of capability that the difference is really unimportant unless you are an enthusiast... And let's face it, if you gave that much of a shit about hardware, chances are you are part of the PC Gaming Master Race anyway.



--::{PC Gaming Master Race}::--

Pemalite said:

I don't think that really happened on the Xbox One X often though? I know a few titles did... But it was far from the norm really.

That's why next gen will be even better. Freesync monitors are cheap, so they're worth investing in.

Well. No it can't... I have been over this a million times on this forum... But I'll go again.

Well, it can... I have been over this a million times on this forum... But I'll go again.


There are a ton of different types of compute workloads... Again we have 8bit, 16bit, 32bit, 64bit floating point and so on.
Then we have 8bit, 16bit, 24bit, 32bit and 64bit integers as well. - For example, most GPUs don't have native 64bit integer support, so they emulate that functionality on the 32bit units... Which comes with a corresponding hit to performance.

And FLOPS only refers to one of those. 32bit floating point. Aka. Single Precision.
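As a rough illustration of why emulated 64bit integer work costs extra: a single 64bit add on 32bit hardware becomes two adds plus carry handling. A Python sketch of the idea (illustrative only, obviously not actual GPU microcode):

```python
MASK32 = 0xFFFFFFFF

def add64_via_32bit(a_lo, a_hi, b_lo, b_hi):
    """Add two 64-bit values given as (lo, hi) 32-bit halves,
    the way hardware without native int64 support has to do it."""
    lo = (a_lo + b_lo) & MASK32
    carry = 1 if (a_lo + b_lo) > MASK32 else 0   # overflow out of the low word
    hi = (a_hi + b_hi + carry) & MASK32
    return lo, hi  # two adds plus carry logic instead of one instruction

# Split, add, recombine: (hi << 32) | lo matches native 64-bit addition.
a, b = 0xFFFFFFFF, 1   # chosen to force a carry out of the low word
lo, hi = add64_via_32bit(a & MASK32, a >> 32, b & MASK32, b >> 32)
print(hex((hi << 32) | lo))  # 0x100000000
```

Multiply is worse still (several 32bit multiplies plus adds), which is why emulated int64 throughput drops to a fraction of int32.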

Well, I know about everything you mention above, no need to write it all out. FLOPS can be used to compare the same bit-width workload with the same integer support and functionality. Say you compare 32bit FP to 32bit FP on the same architecture, CPU or GPU, like comparing GPUs within the same line-up, an RTX 2070 with an RTX 2080. FLOPS can be used for marketing and to tell the difference between those cards, so the seller or vendor doesn't have to write out every detail of the advantage each time they want to sell the better product. Unless you really want to explain all those crazy details to consumers LOL

Not to mention that other parts of the chip come into play during pure compute workloads as well... For example the Geforce 1030 has a DDR4 and GDDR5 variant... In terms of compute there is only a difference of 6.5%. (1152Mhz vs 1227Mhz @384 Cuda cores.)
But the real world performance is often 50% or more slower.
https://www.gamersnexus.net/hwreviews/3330-gt-1030-ddr4-vs-gt-1030-gddr5-benchmark-worst-graphics-card-2018
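The numbers make the point: the compute gap between the two variants is tiny next to the memory gap, so bandwidth, not flops, explains the benchmark results. A quick check (clocks from the post above; the bandwidth figures are my assumption from the cards' published memory specs, 64-bit GDDR5 at 6 Gbps vs 64-bit DDR4-2100):

```python
# Both GT 1030 variants have 384 CUDA cores; only clocks and memory differ.
def relative_gap(a: float, b: float) -> float:
    """Fractional advantage of a over b."""
    return a / b - 1

gddr5_clock, ddr4_clock = 1227, 1152   # MHz, from the post above
print(f"compute gap:   {relative_gap(gddr5_clock, ddr4_clock):.1%}")  # ~6.5%

# Assumed memory bandwidths in GB/s from the published memory configurations.
gddr5_bw, ddr4_bw = 48.0, 16.8
print(f"bandwidth gap: {relative_gap(gddr5_bw, ddr4_bw):.0%}")        # ~186%
```

A ~6.5% compute difference cannot produce a 50%+ real-world difference; the nearly threefold bandwidth deficit can.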

Like I said, I agree. The problem is: who the hell wants to explain all of that to consumers? LMAO

Ethereum mining, for example, which tends to be a pure compute workload, also sees significant gains from more memory bandwidth on the graphics card... Which further emphasizes that flops is not a relevant capability measurement tool even between GPUs of the same architecture.

Now you're getting off topic LOL.

Global Foundries have stepped away from being in the fabrication process race... So they are out.
TSMC's 7nm+ is going to be leveraging EUV, so the designs won't just automatically translate over.

We still have more than a year and a half, and we don't know what Sony/Microsoft have planned behind closed doors; there is still room for theory and prediction.

TSMC's 7nm+ is likely to only be a marginal improvement over 7nm anyway, with 20% density improvements and a 10% performance increase... Probably not worth gambling on that process for the next-gen consoles' monolithic chips. - Yields are stupidly important.

A 20% density improvement is big, especially combined with 10% more performance.
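Where that density figure really pays off is die size, and therefore yield, which is exactly why yields matter so much for a big monolithic console APU. A rough sketch using the simple Poisson yield model Y = exp(-A x D0); the ~360 mm2 die and the 0.3 defects/cm2 defect density are hypothetical illustration numbers, not leaked figures:

```python
import math

def poisson_yield(die_area_mm2: float, defects_per_cm2: float) -> float:
    """Simple Poisson yield model: Y = exp(-A * D0), with A in cm2."""
    return math.exp(-(die_area_mm2 / 100.0) * defects_per_cm2)

# Hypothetical ~360 mm2 console APU; a 20% density gain shrinks the same
# design to roughly 360 / 1.2 = 300 mm2.
old_area = 360.0
new_area = old_area / 1.2

d0 = 0.3  # assumed defects/cm2 for an immature node
print(f"yield at {old_area:.0f} mm2: {poisson_yield(old_area, d0):.1%}")  # ~34%
print(f"yield at {new_area:.0f} mm2: {poisson_yield(new_area, d0):.1%}")  # ~41%
```

The catch is that D0 on a brand-new node like 7nm+ starts much higher than on a mature one, which can more than cancel out the area saving, hence the gamble.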

TSMC's capacity is also only going to be 1,000~ wafers per day... That can't all be reserved for the next gen consoles, other partners building ARM processors, GPU's and other pieces of logic will be jumping at that as well.


Samsung's 7nm process will also be employing EUV... But let's keep in mind it's not the same as TSMC's 7nm+ process; don't fall into the trap of their marketing shenanigans... You can't compare Samsung's 7nm to TSMC's 7nm; marketing has made number-to-number comparisons useless.
Both TSMC and Samsung will have up to quad patterning in their fabs.

Is it possible that the next-gen consoles could use 7nm+? Yes. It's possible, it's just highly unlikely at this stage.

We're just predicting for fun; we still don't know Sony and Microsoft's real chip designs or the timeline/stage of their testing. They might have planned to use 7nm+.

And yes Money is important for Microsoft and Sony, the more you spend on Chips and the production-of... The higher your costs become for designing and building a console which flows on to the consumer.
Long gone are the days where it's even financially feasible for console manufacturers to dump billions on designing chips for their devices... Sony and Microsoft have limits you know, which are generally governed by shareholder expectations.

Money is important, which is exactly why they are looking at future investment rather than short-term sales gains. Spending a lot of money on expensive chips that will get cheaper down the line is a very cheap proposal in the long run, especially when you want to compete in a saturated market. Both companies are ready to lose money to gain consumers and to give up some profit to gain market share.

I think myself and many others had hoped that the next-gen consoles would be targeting high-end GPU performance rather than the mid-range that the Xbox One and Playstation 4 eventually settled on to see a bigger leap in general fidelity.
Don't get me wrong, we will see a big leap, it's just not going to be as impressive as it could be... Partly that is down to AMD not being able to keep pace with nVidia's performance cadence.

Is it enough for a console? I would argue more is better.

A bigger leap in fidelity is not determined by how powerful the GPU is alone; it needs a mass market of people using the platform so game developers can utilize it, optimize for it, and develop for the mainstream. Just look at Ray Tracing: it will be a fad if only a small number of people buy the GPUs and no games utilize it because nobody buys the games.

Like I said, you're contradicting yourself: you say you want a more powerful console, but at the same time you're a pessimist about the progress. Unlike you, I am a realist. With all the current leaks and progress I am already happy enough, but I still have hopes and dreams based on the unknown info, which can be theorized about and used for debate and speculation.

But if you optimize for a Geforce 1080Ti, then the same performance gap between the GTX 1080 and 1080Ti will continue to exist.

But then nobody optimizes games for a 1080Ti, except modders or people making their own games. Games are made and optimized for low-spec or mainstream PCs.

Optimization isn't some magical construct that makes hardware more capable and excludes all other pieces of hardware.

Agreed, but in reality, without optimization, hardware is just a lifeless pile of metal with no function.

The PC gets optimizations... I think console gamers forget this.
For example... Whenever AMD and nVidia roll out a driver update it's not just to make things look prettier or fix bugs... But to introduce optimizations that improve performance...
For example here we can see where optimizations improved performance by 15%.
https://www.tomshardware.com/reviews/amd-nvidia-driver-updates-performance-tested,5707.html

Drivers aren't the only factor; there are also the API, the OS, and game design. PC GPUs are stuck with a bloated API and OS, and in fact most PC games are made with low-spec PCs in mind; they're held back by the lowest-spec PC. PCs are also used for more than gaming, so all their power gets divided. A console only needs to play games. On a PC, those 8 cores are shared between the driver, the OS, the API, and multitasking, while a console uses them only for games. That alone explains why a PC needs double the performance of a console to run the same games.

Microsoft does a similar thing with Windows, which will often increase performance. For example:
https://www.techpowerup.com/255843/windows-10-may-2019-update-1903-gaming-performance-tested-in-21-titles-with-rtx-2080-ti-and-radeon-vii

And of course we have improvements at the API level:
https://www.redgamingtech.com/how-much-better-is-performance-with-modern-apis-directx-12-vs-directx-11-opengl-vs-vulkan/

Again, Vulkan is still a PC low-level API; it cannot be compared directly to to-the-metal optimization on console. And on top of that, DirectX 12 still sucks; it cannot beat Vulkan on optimization.

And often game developers will roll out updates that also improve the performance of their title.
https://www.techspot.com/review/1759-ray-tracing-benchmarks-vol-2/

So obviously "optimizations" isn't just a console-only thing. - The evidence is simply undeniable at this point.

Of course it isn't a console-only thing, but my point still stands: PC needs raw performance due to compatibility across every spec combination on the market, so it sacrifices the maximal optimization that can only happen on a single fixed device like a console.

In short, there is absolutely no game that runs on a Playstation 4 that can't run on a Radeon 7870... Often, games will run better on a Radeon 7870 at the same visual settings as the Playstation 4 too... Like Overwatch, Battlefield 1, Grand Theft Auto 5 and so on.
Plus you get to choose your settings on the PC... Game doesn't run at full 1080P on the Playstation 4? Well, on a Radeon 7870 it can, just lower a couple of settings.

Like I said, it depends on the game developer; not all developers want to restructure their games around the console API. Most of the time they're just lazy and port directly using DirectX or OpenGL.

They could, but... People will whinge.
Price is a stupidly important factor for a lot of people, especially those who sit lower on the socio-economic ladder.
Remember the Xbox One at $500, remember the Playstation 3 at $600... They were all contentious price points.

If you follow the progress of prices and consumers' purchasing power, 599 USD is very cheap if we compare 2019 to 2007. There is a thing called inflation: 600 USD in 2007 is equal to roughly 750 USD in 2020. Consoles are not for lower socio-economic people. In fact, poor gamers will just buy a budget PC, because it can be used for work and to play old games; hell, they'll mostly buy discounted games, often pirate, and play free games. A console is for people who will spend more money on a simple device and don't want to buy and maintain a PC. Consoles are a niche product; that's why sales always stay below 200 million a generation. 499 USD is super cheap for a console releasing in 2020, as long as the price-to-power ratio is good. The problem with the PS3 at 599 was exactly that ratio; the Xbox 360 was even better in most games and came out earlier.
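The inflation point checks out with simple compounding. A sketch assuming a flat ~2% annual rate (US CPI averaged close to that over the period; the exact figure varies year to year, so treat the result as a ballpark):

```python
def adjust_for_inflation(amount: float, start_year: int, end_year: int,
                         annual_rate: float = 0.02) -> float:
    """Compound a price forward at a flat assumed inflation rate."""
    return amount * (1 + annual_rate) ** (end_year - start_year)

# The PS3's 599 USD price carried from 2007 to 2020 at ~2%/year.
print(round(adjust_for_inflation(599, 2007, 2020)))  # ~775, same ballpark
                                                     # as the ~750 above
```

So a 499 USD console in 2020 really is cheaper in real terms than the PS3 ever was.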

We are still a long way away from a full ray traced gaming world, it might take a few more console generations for rasterization to fall away to the side.

Ray Tracing will be used on 9th gen hardware, just like it is being used in some 8th gen games; it's the extent of its use that is up for debate.

That's why a hybrid design will be used this gen. But we're talking about marketing jargon, not what developers will actually use, so ray tracing is still a selling point for a console.

It's a value-added incentive. Not the be-all, end-all selling point.

Scarlett will be rolling out Xbox 360 and Original Xbox games in it's backwards compatibility efforts as well, hence why Microsoft pulled that team away from the Xbox One, will Sony do the same with Playstation 1, 2 and 3 backwards compatibility on the Playstation 5? Or would you deem it as unimportant?

I am not willing to place bets on how well any console is going to sell, my tastes don't align with the average consumer.

In a console's early years it's a big advantage, especially with fewer launch games. But remember, it only works if the previous console was the champion of the market and its consumers. Xbox will struggle because this gen they only got around 40 million consumers, so their fans migrating to next gen come from within that number. The PS5, on the other hand, has a near-guaranteed base of 100 million PS4 players who like PlayStation IPs and own its games. So as long as the PS5 doesn't repeat the Xbox One X disaster or the PS3 disaster, they are in a great market position. I bet they will be fine; Sony are not stupid. I do believe Xbox will also do a great job, but they need to do more than Sony to gain consumers.

Good thing I provided a fairly comprehensive list where successive consoles launched with full backwards compatibility (In hardware!) yet didn't sell as well as their predecessor.

That's because the consoles you mention had other factors that made them fail in the market.

And as the evidence I provided earlier, backwards compatibility isn't actually a big selling point for most gamers, it's a value-added incentive, sure. But it's far from being the most important aspect... Otherwise everyone would be a PC gamer as you can run your PC games from 30~ years ago.

Like I said, nobody buys a PC just to play games, and nobody wants to spend time setting up and updating emulators. There is a reason consoles exist: people want something simple.

Backward compatibility is not a big selling point, but it's still a selling point, and a crucial one, especially if you want all your loyal fans to buy your product. It will be a great factor in the buying decision, if everything goes as planned (no more disasters at launch or announcement, no mistakes in pricing or system design).



curl-6 said:
Barkley said:

The main thing the extra power is going to be wasted on is 4K.

Agreed, the power spent on native 4K would IMO be better spent on any number of other things: 60fps, better lighting, materials, effects. Unfortunately, 4K is a sexy buzzword that the companies will use to sell their shiny new boxes.

Given the huge jump in CPU power, what I'm most hoping to see is exactly what I wanted from the 8th gen but which they failed to deliver: a big jump in simulation and interactivity, like BOTW's physics/chemistry stuff but x10.

BotW has very rudimentary and case by case physics.

What I'm hoping for most is first attempts at fully voxel-based worlds with fine detail. Take a look at this tank (at around 3-4 seconds in):

Fully made of voxels, yet at first glance you wouldn't notice. This is just a simple example, but imagine the whole game world built like that: whatever happens in that world stays that way, because voxel octrees preserve all the data. The trouble is they need an astonishing amount of data to be finely detailed, and need ways to traverse the octree very fast.

Now, I'm not sure about AMD's implementation of RT, but nVidia's RTX cards have additional hardware for traversing certain hierarchical structures, and my guess is this could be used to speed up octree traversal. In addition, voxel octrees make RT much easier to do from the get-go. This, combined with SSDs, gives me hope that sometime next gen there will be at least some games that utilize this new potential and are a huge step up from the standard "shiny, but non-interactive" polygon approach we've been stuck with for decades.
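A minimal sparse voxel octree sketch (illustrative Python, nothing like a real engine's memory layout) shows why "whatever happens stays that way" is affordable: empty space allocates no nodes at all, and any stored voxel is reachable in log2(size) steps, the kind of hierarchy traversal that RT hardware could in principle accelerate:

```python
class OctreeNode:
    """Sparse octree node: children exist only where voxels were stored."""
    __slots__ = ("children", "value")
    def __init__(self):
        self.children = [None] * 8  # one slot per octant
        self.value = None           # leaf payload, e.g. a material id

def insert(node, x, y, z, size, value):
    """Store a voxel inside a cube of side `size` (a power of two)."""
    if size == 1:
        node.value = value
        return
    half = size // 2
    # Octant index from one bit per axis.
    i = (x >= half) | ((y >= half) << 1) | ((z >= half) << 2)
    if node.children[i] is None:
        node.children[i] = OctreeNode()   # allocate lazily
    insert(node.children[i],
           x - half if x >= half else x,
           y - half if y >= half else y,
           z - half if z >= half else z,
           half, value)

def lookup(node, x, y, z, size):
    """Return the stored value, or None for empty (never-allocated) space."""
    if node is None:
        return None
    if size == 1:
        return node.value
    half = size // 2
    i = (x >= half) | ((y >= half) << 1) | ((z >= half) << 2)
    return lookup(node.children[i],
                  x - half if x >= half else x,
                  y - half if y >= half else y,
                  z - half if z >= half else z,
                  half)

root = OctreeNode()
insert(root, 5, 0, 7, 8, "steel")   # one voxel in an 8^3 world: 3 levels deep
print(lookup(root, 5, 0, 7, 8))     # steel
print(lookup(root, 0, 0, 0, 8))     # None (empty space costs nothing)
```

Real engines pack nodes into flat GPU-friendly arrays and compress identical subtrees, but the core trade-off is the same one the post describes: enormous nominal resolution paid for only where detail actually exists.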