PS5 Coming at the End of 2020 According to Analyst: High-Spec Hardware for Under $500


Poll: Price, SKUs, specs?

Only Base Model, $399, 9-10 TF GPU, 16 GB RAM | 18 votes | 26.87%
Only Base Model, $449, 10-12 TF GPU, 16 GB RAM | 10 votes | 14.93%
Only Base Model, $499, 12-14 TF GPU, 24 GB RAM | 18 votes | 26.87%
Base Model $399 and Premium $499 (specs as in option 3) | 10 votes | 14.93%
Base Model $399 / Premium $549, >14 TF, 24 GB RAM | 5 votes | 7.46%
Base Model $449 / Premium $599, the absolute elite | 6 votes | 8.96%

Total: 67 votes
CrazyGPU said:

I meant 0.24 TF, but I should have put 230 gigaflops, so the correct number would have been 0.23 TF. Source: https://www.gamespot.com/gallery/console-gpu-power-compared-ranking-systems-by-flop/2900-1334/7/

Wikipedia, on the other hand, gives your number: 0.192 teraflops.

Wikipedia is basing that on information direct from NVIDIA, and my own math on the GPU's floating-point capabilities aligns with it too.

It's 192 GFLOPS.

CrazyGPU said:
Let's take your number. 1.84 / 0.192 is 9.6 times. The next jump would need to reach 17.6 TF to be an equal leap. Not going to happen.

That is just single-precision floating-point math; there is far more to a GPU than that. FLOPS isn't accurate when comparing GPUs of different architectures; it's an entirely theoretical figure, not a real-world one.
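
For anyone wondering where those headline figures come from: the paper number is just ALU count x clock x 2 (one fused multiply-add per ALU per cycle). A minimal sketch using the publicly known PS4 and Xbox One figures; it says nothing about real-world throughput:

    # Theoretical peak single-precision FLOPS = shader ALUs x clock x 2 (one
    # fused multiply-add per ALU per cycle). This is the "paper" number being
    # debated above; real throughput depends on architecture, bandwidth, occupancy.

    def peak_tflops(shader_alus: int, clock_ghz: float, ops_per_cycle: int = 2) -> float:
        return shader_alus * clock_ghz * ops_per_cycle / 1000.0

    # Publicly known figures:
    print(f"PS4:      {peak_tflops(1152, 0.800):.2f} TF")  # ~1.84 TF
    print(f"Xbox One: {peak_tflops(768, 0.853):.2f} TF")   # ~1.31 TF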

CrazyGPU said:
You said around 417 MB, or 0.42 GB (approximating here), vs 4.5 GB for games. That's roughly 11 times more memory; 88 GB of RAM would be the same jump. We don't need that amount of RAM now, of course, but it's clear the jump will be much lower next gen. Around double, like you say.

Streaming assets into DRAM on an as-needed basis is going to be significantly better next gen, and caching is becoming far more important, so developers can do more with less memory because of that.

But you are right, we don't need 88GB of DRAM, not for a substantial increase in fidelity anyway.
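
As a toy illustration of doing more with less memory (not any engine's real code; the asset names and sizes below are made up), streaming into a fixed budget with least-recently-used eviction looks roughly like this:

    from collections import OrderedDict

    class AssetCache:
        """Toy LRU asset cache: stream assets in on demand, evict the least
        recently used ones when the memory budget is exceeded."""

        def __init__(self, budget_mb: int):
            self.budget_mb = budget_mb
            self.used_mb = 0
            self.resident = OrderedDict()  # asset name -> size in MB

        def request(self, name: str, size_mb: int) -> None:
            if name in self.resident:            # already in DRAM: mark hot
                self.resident.move_to_end(name)
                return
            while self.resident and self.used_mb + size_mb > self.budget_mb:
                _, freed = self.resident.popitem(last=False)  # evict coldest
                self.used_mb -= freed
            self.resident[name] = size_mb        # "stream in" from storage
            self.used_mb += size_mb

    cache = AssetCache(budget_mb=512)
    for asset, size in [("rock_4k", 64), ("tree_4k", 128), ("npc_head", 256),
                        ("rock_4k", 64), ("terrain", 256)]:
        cache.request(asset, size)
    print(list(cache.resident))  # ['rock_4k', 'terrain'] - only hot assets stay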

CrazyGPU said:
Bandwidth... 22.4 GB/s to 176 GB/s is close to 8 times more bandwidth. We are not going to get to 1400 GB/s of bandwidth, so the jump will be much lower too.

Comparing raw numbers is a little disingenuous.
Modern Delta Colour Compression techniques can add another 50% or more to the effective bandwidth...
Draw Stream Binning Rasterization severely cuts down the amount of work that needs to be performed in the first place, making better use of limited bandwidth...
And I could go on.
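
To see why colour data compresses so well, here's a toy delta-encoding sketch. Real Delta Colour Compression is a fixed-function hardware block working on tiles; this only shows the underlying idea that neighbouring pixels usually differ by tiny values:

    def delta_encode(row):
        # Store the first pixel, then only the difference to the previous pixel.
        # Smooth gradients yield tiny deltas that pack into very few bits.
        return [row[0]] + [b - a for a, b in zip(row, row[1:])]

    # A smooth sky gradient: 8-bit values creep up slowly...
    row = [200, 201, 201, 202, 203, 203, 204, 204]
    print(delta_encode(row))  # [200, 1, 0, 1, 1, 0, 1, 0] -> deltas need ~2 bits, not 8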

CrazyGPU said:
In 2007 I was playing Crysis on my high-end DX10 PC, and consoles were a joke in texture quality: all washed out and blurry. A PS4 or Xbox One is still worse than my PC at ultra settings (GeForce 1070, 2K monitor), but consoles are much closer than they used to be.

I disagree; Metro on PC is a night-and-day difference from the consoles.
And the same goes for most Frostbite-powered games.

...And the difference is starting to look generational when comparing PC against the base Xbox One and PlayStation 4.

Crysis, however, was a unique case: it was PC exclusive and it pushed PCs to their limits... And it wasn't the best-optimized title, as Crytek made forward projections about "possible" PC hardware (i.e. dramatic increases in CPU clock rates), so it has taken a very long time for it to stop being a hardware killer.

Consoles today (Xbox One X and PlayStation 4 Pro) generally sit around the PC's medium quality preset, which is where the bulk of the 7th gen sat compared against the PC. Resolution and framerates are a bit better this time around, of course, but the gap still exists and always will.

CrazyGPU said:
So clearly we are going to have a graphical improvement, but it will be much smaller than the old jumps, and I'm not even considering the jump from PS1 to PS2 or PS2 to PS3.

Well, sure. Mostly because, as far as general rasterization is concerned, the bulk of the low-hanging performance/graphics fruit has been picked.
But that will at some point end.
I think the 10th console generation will be a significant leap in graphics, as graphics is undergoing a paradigm shift we haven't seen since programmable pixel shaders burst onto the scene with the original Xbox/GeForce 3... The 9th generation will be a bridge to that.

CrazyGPU said:
Graphics will be better, but nothing to write home about compared to the Xbox One X or PS4 Pro especially. Not a dream ray-tracing machine with 20 TF, 32 GB of RAM and 1 TB/s of bandwidth, like Ken Kutaragi would want, at 600 USD that no one would buy.

The Xbox One X and PlayStation 4 Pro have yet to impress me... And the main reason is that games are designed with the base Xbox One and PlayStation 4 in mind... If games stuck to a lower resolution and framerate on the Xbox One X and pushed fidelity much harder, I would probably be more impressed.

As for ray tracing, we might not need 20 teraflops, 32 GB of RAM and 1 TB/s of bandwidth; there is a ton of research going on to make it more efficient. Rasterization took years to become efficient too, with compression and culling being the first big techniques, and the same thing will happen with ray tracing.
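
For a flavour of what culling means for ray tracing, here's a minimal sketch (illustrative only, not any engine's real traversal code) of the slab test used to reject a whole bounding box of geometry before testing any triangle inside it:

    def ray_hits_box(origin, inv_dir, box_min, box_max):
        # Slab test: cheap ray-vs-axis-aligned-box check. If the ray misses the
        # box, every triangle inside is skipped without being tested at all -
        # the ray tracing analogue of culling. inv_dir holds 1/direction
        # (a huge value stands in for 1/0 on near-zero components).
        tmin, tmax = 0.0, float("inf")
        for o, inv, lo, hi in zip(origin, inv_dir, box_min, box_max):
            t1, t2 = (lo - o) * inv, (hi - o) * inv
            tmin, tmax = max(tmin, min(t1, t2)), min(tmax, max(t1, t2))
        return tmin <= tmax

    # Ray along +x from the origin; the box sits off to the side, so its
    # contents never need individual intersection tests.
    print(ray_hits_box((0, 0, 0), (1.0, 1e9, 1e9), (2, 5, 5), (3, 6, 6)))  # False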

CrazyGPU said:
On the other hand, CPU specs and capability will be the force that makes next gen something good: more games at 60 FPS, better AI, simulation, more objects in maps, etc. For many people that would be a game changer.

A better CPU doesn't guarantee 60 fps if you are GPU- or DRAM-limited. Simulation quality will be amazing, though; it should result in more immersive worlds overall.
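
To put rough numbers on that (the millisecond figures are invented for illustration): CPU and GPU largely work in parallel, so the frame rate is set by whichever is slower, and a faster CPU alone can't lift a GPU-bound cap:

    def fps(cpu_ms: float, gpu_ms: float) -> float:
        # CPU and GPU largely overlap; the frame rate is capped by the slower one.
        return 1000.0 / max(cpu_ms, gpu_ms)

    print(fps(cpu_ms=33.0, gpu_ms=20.0))  # ~30 fps: CPU-bound
    print(fps(cpu_ms=8.2, gpu_ms=20.0))   # 50 fps: a 4x faster CPU, now GPU-bound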

CrazyGPU said:

That's my opinion. I hope I'm wrong and the PS5 becomes an incredible graphical beast that shows graphics on a whole new level.

The way I see it, there is more difference between the cars in Gran Turismo 1 and Gran Turismo Sport than between Gran Turismo Sport and real-world cars; the "wow" feeling is getting smaller and smaller.

With AMD's graphics processors being the weapon of choice for next gen, I think we need to keep expectations in check either way... AMD generally isn't making industry-leading high-end hardware.



CGI-Quality said:

There's a problem with using Crysis. Of course it "shat all over the consoles": it was a PC exclusive, taking advantage of the latest and greatest technology. Modern consoles aren't really getting closer; their hardware is simply being exploited more fully, and games made only for PC, particularly of that flavor, are virtually non-existent.

Although the PC version of Metro: Exodus stomps the console versions into the ground, if it were a PC exclusive there would truly be no discussion to have. But go look at the beginning of this gen and then look at the Exodus shots I've posted. It isn't even close. And that's one example. Video game graphics are NOWHERE NEAR their pinnacle.

Finally we go straight to the heart of the matter! And thanks ;)

Scalability is not a miraculous solution if we want to squeeze in several SKUs, or an endless plethora of hardware components with different architectures and specs like on PC; it is just a way to make everybody "somewhat happy" without ever using the available resources to the fullest.

Just imagine if the best developers could develop exclusively for the best PC hardware; the result would blow away anything you have seen on PC, by a very long margin.

Crysis was the perfect example, and developers could still do even more if they could choose the very best components, put them together, and build a super-powerful box: a single SKU to target.

 



”Every great dream begins with a dreamer. Always remember, you have within you the strength, the patience, and the passion to reach for the stars to change the world.”

Harriet Tubman.

Pemalite said:
*snip*

Thanks, Pemalite, for all the explanations and clarifications; a common mistake is to compare raw numbers and draw conclusions without understanding the whole architecture.

I hope you will keep posting here.




CGI-Quality said:
Trumpstyle said:

About the graphics improvement, I think most next-gen games will look like BF2 at ultra settings; some games will probably even look worse, and for those that beat it, it will just be hard to see the difference. So if you're playing BF2 on Xbox One X, it will be like going from medium settings to ultra+ settings: a slightly more polished-looking game.

There is already a game out that looks better than BFII on Ultra.

Hehe, it was just an example. I could've used Far Cry 5 or Wolfenstein II, but BF2 is probably the game most gamers believe is the best-looking game. I have seen your screenshots of Metro Exodus and looked at YouTube videos of that game; it probably beats BF2.

Metro Exodus at extreme settings is where I believe next-gen games will land. But the real question is: can Metro Exodus beat the CURRENT king of graphics?

https://www.youtube.com/watch?v=wliXa2FQ6xU

or

https://www.youtube.com/watch?v=7S5SCMNUY-k&t=

It's very, very close; it looks like Metro has a slight edge, but I don't like to judge games until I've played them. And I don't play on PC anymore.



"Donald Trump is the greatest president that god has ever created" - Trumpstyle

6x master league achiever in starcraft2

Beaten Sigrun on God of war mode

Beaten DOOM ultra-nightmare with NO endless ammo-rune, 2x super shotgun and no decoys on ps4 pro.

1-0 against Grubby in Wc3 frozen throne ladder!!

Trumpstyle said:
*snip*

Crysis 3 isn’t even in the same Universe. Last Light beat it in 2013. Exodus doesn’t even look in its outdated direction.




CGI-Quality said:
*snip*

Crysis 3 isn’t even in the same Universe. Last Light beat it in 2013. Exodus doesn’t even look in its outdated direction.

CGI, let me ask you a question.

Undoubtedly next-gen consoles are coming with HDMI 2.1 ports, but good 4K TVs are still mostly sticking with HDMI 2.0, very likely moving to 2.1 next year as the baseline for any high-end model.

Do you think getting a TV with only 2.0 support this year (say, the Sony Bravia XBR-X900F I've been wanting since my old Samsung broke last year) would make me lose any kind of benefit from the PS5?

As far as I understand, the major benefit is the vastly improved bandwidth, which primarily allows the connection to carry up to 4K at 120 FPS or 8K at 60 FPS.

The PS5 won't be hitting either of those; they look like extreme-enthusiast, PC-gaming-focused needs.

But there is other stuff coming with 2.1 aside from just bandwidth, and that is where I'm getting a bit reluctant, especially around VRR and 10-bit 4K60 with 4:4:4 chroma in HDR (since that is too much for 2.0 to carry at the same time). Even if the mentioned TV supports Dolby Vision (which is basically equivalent to HDR10+), it is still fed from a 2.0 port.

Do you think I would be foolish not to go for a TV with 2.1 support, given it's going to be my TV for all of next gen? Or is everything beyond the improved bandwidth either not that relevant or still doable at 2.0 speeds, so that outside of PC gaming there is no real loss in sticking with 2.0, even if the consoles do come with 2.1?

I dunno if you can help me with that, but knowing you are vastly more well versed than me in current technology and techniques, I think it's worth a shot to ask xD

I can't really afford to make such an investment and then switch in 1 or 2 years; I'll have to stick with this TV for at least 5 years or more.
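
For what it's worth, the arithmetic behind that 4:4:4 worry is easy to sketch (active pixels only, ignoring blanking intervals and encoding overhead, so real links need somewhat more):

    def video_gbps(width, height, fps, bits_per_channel, samples_per_pixel=3):
        # Approximate uncompressed video data rate in Gbit/s (active pixels only).
        return width * height * fps * bits_per_channel * samples_per_pixel / 1e9

    # 4K60, 10-bit HDR, full 4:4:4 chroma: ~14.9 Gbit/s, over HDMI 2.0's
    # ~14.4 Gbit/s of usable data bandwidth (18 Gbit/s raw, 8b/10b encoded).
    print(f"{video_gbps(3840, 2160, 60, 10):.1f} Gbit/s")
    # 4:2:2 keeps one luma + one (alternating) chroma sample per pixel:
    print(f"{video_gbps(3840, 2160, 60, 10, samples_per_pixel=2):.1f} Gbit/s")  # fits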



Despite texture compression, geometry culling, new rasterization techniques, voxels and so on, I don't see next gen being game-changing. Raw numbers are an indicator, not an exact comparator, but even accounting for all these techniques we are still far from the old-days jumps, and the feeling people get from the same amount of improvement will be weaker because we start from a much better image quality level than in the old days.

Talking about Metro: it has great atmosphere, particles, lighting and so on, but the faces and animation are not at the same level. Games like RDR2 and God of War look more real because of that. When I see RDR2's woods, they feel more real than Metro's to me.



BraLoD said:

*snip*

I think you'll be fine with 2.0 and above. You won't see much benefit from the higher bandwidth, with 4K being the main standard next gen (I highly doubt it'll be any higher, even for premium machines).




CrazyGPU said:

Talking about Metro: it has great atmosphere, particles, lighting and so on, but the faces and animation are not at the same level. Games like RDR2 and God of War look more real because of that. When I see RDR2's woods, they feel more real than Metro's to me.

God of War and RDR2 only beat Metro in characters. Everything else goes to the latter. Neither of those games has Metro's significant lighting advances either, so by default neither of their forests looks more real (Metro's environmental textures also have a considerable advantage). The atmosphere just isn't there. Physics and other touches (like flies being attracted to freshly killed bodies) are extras that neither of those games competes with. When talking "next gen", those are the kinds of things I look for.



CGI-Quality said:
*snip*

I think you'll be fine with 2.0 and above. You won't see much benefit from the higher bandwidth, with 4K being the main standard next gen (I highly doubt it'll be any higher, even for premium machines).

Thanks.

I've been looking into it, and it seems dropping from 4:4:4 to 4:2:2 chroma subsampling (which should allow 4K60 with HDR on within 2.0's full bandwidth) won't cost me any visible quality except for some very minor text edges.
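
For anyone curious what that trade-off actually is, here's a toy sketch (illustrative only): luma stays per-pixel while each horizontal pair of pixels shares one chroma sample, which is exactly why sharp coloured text edges are the main casualty:

    def subsample_422(luma, chroma):
        # Keep full-resolution luma; average each horizontal pair of chroma
        # samples. Halves the chroma data, and is mostly invisible except on
        # sharp colour transitions such as coloured text edges.
        averaged = [(a + b) // 2 for a, b in zip(chroma[0::2], chroma[1::2])]
        return luma, averaged

    luma   = [120, 121, 119, 122]      # brightness: untouched
    chroma = [30, 200, 35, 37]         # sharp colour edge between pixels 0 and 1
    print(subsample_422(luma, chroma)) # ([120, 121, 119, 122], [115, 36]) - edge smeared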