
Forums - Sony Discussion - PS5 Coming at the End of 2020 According to Analyst: High-Spec Hardware for Under $500

 

Poll: Price, SKUs, specs?

- Only Base Model, $399, 9-10 TF GPU, 16 GB RAM: 24 votes (30.00%)
- Only Base Model, $449, 10-12 TF GPU, 16 GB RAM: 13 votes (16.25%)
- Only Base Model, $499, 12-14 TF GPU, 24 GB RAM: 21 votes (26.25%)
- Base Model $399 and Premium $499 specs: 10 votes (12.50%)
- Base Model $399 / Premium $549, >14 TF, 24 GB RAM: 5 votes (6.25%)
- Base Model $449 / Premium $599, the absolute elite: 7 votes (8.75%)

Total: 80 votes
Trumpstyle said:
Pemalite said:

 

I just heard CGI groan from the other side of the planet.


I very much disagree... Ray Tracing is the future... We have seen a small inkling of what that path entails with Battlefield 5 and Metro.

There is simply no way next gen will see a dramatic increase in graphics. We can expect next-gen GPUs to be about 50-100% faster than the Xbox One X. Compare that to the Radeon 7850 (PS4 GPU) versus the X800 XL (Xbox 360 GPU): the PS4 GPU is 9.9x faster according to gaming benchmarks I checked a few months ago.

People need to keep expectations in check: if you played Far Cry 5, Red Dead Redemption 2, or Battlefront 2 on Xbox One X, you will see a slightly more polished version of those games.

I don't agree on ray tracing either; for most people, I think ray tracing will just look different from normal lighting effects, not better.

I think your expectations for next gen are so low they might sound unrealistic even to the most pessimistic person on the entire Net.

Maybe you have missed some of the interesting tidbits Pemalite said about efficiency and the CPU/GPU combo. Don't be so quick to talk about 50% or 100% faster than the Xbox One X; I'm quite confident you will be highly surprised by the potential of next gen, and I can't wait to quote you again on this after the official announcement :D

Last edited by Nate4Drake - on 01 March 2019

”Every great dream begins with a dreamer. Always remember, you have within you the strength, the patience, and the passion to reach for the stars to change the world.”

Harriet Tubman.

Trumpstyle said:

There is simply no way next gen will see a dramatic increase in graphics. We can expect next-gen GPUs to be about 50-100% faster than the Xbox One X. Compare that to the Radeon 7850 (PS4 GPU) versus the X800 XL (Xbox 360 GPU): the PS4 GPU is 9.9x faster according to gaming benchmarks I checked a few months ago.

The Xbox 360 isn't using an R430-derived GPU... So if you made a comparison between a Radeon 7850 and a Radeon X800 XL, your comparison is inaccurate from the get-go.

Plus, I wouldn't mind knowing how you arrived at that 9.9x figure.

Trumpstyle said:

People need to keep expectations in check: if you played Far Cry 5, Red Dead Redemption 2, or Battlefront 2 on Xbox One X, you will see a slightly more polished version of those games.

Initially, sure; that happens almost every console generation... Xbox 360 ports of PlayStation 2/original Xbox games were just slightly more polished.

The strong point of the Xbox One X is its GPU, but it's sinking most of that into driving up resolutions/framerates, something Lockhart shouldn't have an obligation to do.

Trumpstyle said:

I don't agree on ray tracing either; for most people, I think ray tracing will just look different from normal lighting effects, not better.

We haven't even seen what ray tracing is fully capable of in gaming yet.
Lighting is also one of the biggest aspects that can still be improved in games.

Nate4Drake said:

I think your expectations for next gen are so low they might sound unrealistic even to the most pessimistic person on the entire Net.

Maybe you have missed some of the interesting tidbits Pemalite said about efficiency and the CPU/GPU combo. Don't be so quick to talk about 50% or 100% faster than the Xbox One X; I'm quite confident you will be highly surprised by the potential of next gen, and I can't wait to quote you again on this after the official announcement :D

That's just it: efficiency is the big takeaway. PC technology hasn't sat still in the half decade since the Xbox One and PlayStation 4 launched.



--::{PC Gaming Master Race}::--

Nate4Drake said:
Trumpstyle said:

There is simply no way next gen will see a dramatic increase in graphics. We can expect next-gen GPUs to be about 50-100% faster than the Xbox One X. Compare that to the Radeon 7850 (PS4 GPU) versus the X800 XL (Xbox 360 GPU): the PS4 GPU is 9.9x faster according to gaming benchmarks I checked a few months ago.

People need to keep expectations in check: if you played Far Cry 5, Red Dead Redemption 2, or Battlefront 2 on Xbox One X, you will see a slightly more polished version of those games.

I don't agree on ray tracing either; for most people, I think ray tracing will just look different from normal lighting effects, not better.

I think your expectations for next gen are so low they might sound unrealistic even to the most pessimistic person on the entire Net.

Maybe you have missed some of the interesting tidbits Pemalite said about efficiency and the CPU/GPU combo. Don't be so quick to talk about 50% or 100% faster than the Xbox One X; I'm quite confident you will be highly surprised by the potential of next gen, and I can't wait to quote you again on this after the official announcement :D

Maybe :) We'll see. I remember last gen, when most console games ran at the lowest PC settings with half the pixel count. Right now most console games still run at medium/high settings compared to PC, and the Xbox One X is a sub-4K machine. So I just don't see how we will get a big bump in graphics.

Pemalite said:
Trumpstyle said:

There is simply no way next gen will see a dramatic increase in graphics. We can expect next-gen GPUs to be about 50-100% faster than the Xbox One X. Compare that to the Radeon 7850 (PS4 GPU) versus the X800 XL (Xbox 360 GPU): the PS4 GPU is 9.9x faster according to gaming benchmarks I checked a few months ago.

The Xbox 360 isn't using an R430-derived GPU... So if you made a comparison between a Radeon 7850 and a Radeon X800 XL, your comparison is inaccurate from the get-go.

Plus, I wouldn't mind knowing how you arrived at that 9.9x figure.

Trumpstyle said:

People need to keep expectations in check: if you played Far Cry 5, Red Dead Redemption 2, or Battlefront 2 on Xbox One X, you will see a slightly more polished version of those games.

Initially, sure; that happens almost every console generation... Xbox 360 ports of PlayStation 2/original Xbox games were just slightly more polished.

The strong point of the Xbox One X is its GPU, but it's sinking most of that into driving up resolutions/framerates, something Lockhart shouldn't have an obligation to do.

You're correct; according to Wikipedia the Xbox 360 uses an X1800 XL GPU. Not sure how I screwed that up, as I looked at exactly the same place as last time.

I just looked at PC gaming benchmarks and 9.9x was the number I got. I did it again with the X1800 XL versus the Radeon 7850, and the PS4 GPU comes out 6.75x faster.

X1800 XL (1x) * 2.5 (GeForce 8800 GTX) * 2.7 (Radeon 7850) = 6.75. That's exactly how I got it.
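Written out as a quick sketch (the per-step factors are the rough benchmark estimates from this post, not measured values):

```python
# Estimate relative GPU performance by chaining pairwise benchmark ratios.
# The step factors are rough estimates quoted above, not measured values.
steps = [
    ("Radeon X1800 XL", 1.0),    # baseline (Xbox 360-era GPU)
    ("GeForce 8800 GTX", 2.5),   # ~2.5x the X1800 XL
    ("Radeon HD 7850", 2.7),     # ~2.7x the 8800 GTX (PS4-class GPU)
]

total = 1.0
for name, factor in steps:
    total *= factor
    print(f"{name}: {total:.2f}x baseline")
```

The last line printed reproduces the chained 6.75x figure.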

 

About the graphics improvement: I think most next-gen games will look like Battlefront 2 at ultra settings; some games will probably even look worse, and for those that beat it, it will just be hard to see the difference. So if you're playing Battlefront 2 on Xbox One X, it will be like going from medium settings to ultra+, a slightly more polished-looking game.



6x master league achiever in starcraft2

Beaten Sigrun on God of war mode

Beaten DOOM ultra-nightmare with NO endless ammo-rune, 2x super shotgun and no decoys on ps4 pro.

1-0 against Grubby in Wc3 frozen throne ladder!!

Trumpstyle said:
Pemalite said:

The Xbox 360 isn't using an R430-derived GPU... So if you made a comparison between a Radeon 7850 and a Radeon X800 XL, your comparison is inaccurate from the get-go.

Plus, I wouldn't mind knowing how you arrived at that 9.9x figure.

Initially, sure; that happens almost every console generation... Xbox 360 ports of PlayStation 2/original Xbox games were just slightly more polished.

The strong point of the Xbox One X is its GPU, but it's sinking most of that into driving up resolutions/framerates, something Lockhart shouldn't have an obligation to do.

You're correct; according to Wikipedia the Xbox 360 uses an X1800 XL GPU. Not sure how I screwed that up, as I looked at exactly the same place as last time.

I just looked at PC gaming benchmarks and 9.9x was the number I got. I did it again with the X1800 XL versus the Radeon 7850, and the PS4 GPU comes out 6.75x faster.

X1800 XL (1x) * 2.5 (GeForce 8800 GTX) * 2.7 (Radeon 7850) = 6.75. That's exactly how I got it.

It's not using an X1800 XL chip either.
It's actually using a semi-custom hybrid GPU which is based upon the X1800 XL, but also takes some cues from the HD 2900 series, like unified shaders.

It probably sits around the level of a Radeon HD 2600 XT in terms of overall capability. It has some advantages and some disadvantages, but certainly a few advantages over the PlayStation 3's graphics processor.

Trumpstyle said:

About the graphics improvement: I think most next-gen games will look like Battlefront 2 at ultra settings; some games will probably even look worse, and for those that beat it, it will just be hard to see the difference. So if you're playing Battlefront 2 on Xbox One X, it will be like going from medium settings to ultra+, a slightly more polished-looking game.

Well, once you go ultra + 4K + 60fps... it's difficult to go back. Haha

But Metro has stepped things up massively in the PC space; that is the baseline I want for next-gen graphics... In short, simulation quality will likely receive a massive boost next gen thanks to the abundance of CPU resources becoming available.




PS3 had a graphics card with around 0.24 Mflops compared to the 1.84 TF of the PS4's. Many people said it wasn't much of a jump in graphics. I think it was; probably the most important factor was going from 512 MB (256 MB VRAM) to 8 GB for textures. Resolution jumped from HD to Full HD, more than doubling the pixel count. But most people say the jump was nothing compared to PS1-to-PS2 or PS2-to-PS3.
The next jump will feel even less impressive, especially if you own a PS4 Pro. Of course, I'm talking about graphics. Resolution will be better, and RAM will double or triple, but it won't increase 16 times like it did from PS3 to PS4. The thing is, we've reached a level of good textures and good polygon counts, and even if things get better, it won't feel like the older generational jumps in graphics.

The CPU will be a different animal.
Going from Jaguar cores to Ryzen cores will be massive and will allow huge worlds, better AI, better physics, and better FPS. That will be the jump worth waiting for. No doubt about it. The experience will improve a lot.
Going back to graphics, it will be like "oh, meh, it's the same, but it looks better because it's 4K."
Ray tracing may be implemented here or there, but we still need another 10 years to get fully ray-traced graphics the way they should be.
It would be fun though; I'm eager for the new gen to arrive. A good CPU with 4K is something I've been waiting for, even if RT is not ready for prime time. But it's not a matter of hardware alone; maybe developers will find ways to get better graphics with the same hardware next gen too.
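For reference, the raw pixel counts behind those resolution jumps (simple arithmetic, nothing more):

```python
# Pixel counts for the resolution steps mentioned above.
resolutions = {
    "720p (HD)":       1280 * 720,    # 921,600 pixels
    "1080p (Full HD)": 1920 * 1080,   # 2,073,600 pixels
    "4K (UHD)":        3840 * 2160,   # 8,294,400 pixels
}

print(resolutions["1080p (Full HD)"] / resolutions["720p (HD)"])   # 2.25
print(resolutions["4K (UHD)"] / resolutions["1080p (Full HD)"])    # 4.0
```

So HD to Full HD was a 2.25x pixel jump, and Full HD to 4K is another 4x on top of that.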



CrazyGPU said:

PS3 had a graphics card with around 0.24 Mflops compared to the 1.84 TF of the PS4's.

The PlayStation 3 didn't have 0.24 Mflops. (That's megaflops, a step down from gigaflops.)
It was 192 Gflops, or 192,000 Mflops.
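For context, peak single-precision throughput for modern GPUs is conventionally quoted as shader ALUs x 2 ops per cycle (a fused multiply-add) x clock speed; the sketch below applies that to the PS4's known configuration. (The RSX's 192 Gflops figure comes from Nvidia rather than this formula, since its older, non-unified architecture doesn't map onto it cleanly.)

```python
# Peak single-precision FLOPs = shader ALUs x 2 ops/cycle (FMA) x clock.
def peak_tflops(shader_alus: int, clock_ghz: float) -> float:
    return shader_alus * 2 * clock_ghz / 1000.0

ps4 = peak_tflops(1152, 0.8)   # PS4: 1152 GCN shaders at 800 MHz
print(f"PS4: {ps4:.2f} TF")    # ~1.84 TF, matching the figure in this thread

# Unit ladder for the numbers being thrown around here:
# 192,000 Mflops = 192 Gflops = 0.192 Tflops
```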

CrazyGPU said:

I think it was; probably the most important factor was going from 512 MB (256 MB VRAM) to 8 GB for textures.

The PlayStation 3's split memory pools meant using the full 512MB/256MB for VRAM wasn't going to happen; memory transactions sadly cost you when shuffling data between pools.
Not only that, but the PlayStation 3's OS was consuming 120MB-95MB-50MB across both pools... meaning you had 392MB-417MB-462MB for the actual games themselves. (Sony reduced the footprint as time went on.)

The Playstation 4 by comparison might have an 8GB GDDR5 memory pool, but not all of that is available for games either.
3.5GB-3GB was reserved by the OS meaning you had 4.5GB-5GB available for the games themselves.
You also had a 256MB DDR3 pool for background duties on top of that.

If we get next-gen consoles with a 16GB pool and similar OS memory usage, then we would have 13GB available for games, which is more than a doubling in available memory for the games themselves.
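That arithmetic, sketched out (the OS reservation figures are the rough numbers quoted above, not official specs):

```python
# Game-available memory = total pool minus the OS reservation.
# Reservation figures are the rough numbers quoted above, not official specs.
def game_ram_gb(total_gb: float, os_reserved_gb: float) -> float:
    return total_gb - os_reserved_gb

ps4_games = game_ram_gb(8, 3)        # ~5 GB for games (later-firmware footprint)
nextgen_games = game_ram_gb(16, 3)   # ~13 GB if the OS footprint stays similar

print(f"{nextgen_games / ps4_games:.1f}x the game-available memory")  # 2.6x
```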

CrazyGPU said:

The thing is, we've reached a level of good textures and good polygon counts, and even if things get better, it won't feel like the older generational jumps in graphics.

People say the same thing every console generation... I remember users on this very site stating that they were "fine" with the textures current-day games run with. I scoffed at the idea, of course.
Going from a high-end PC game to a console game, the differences in texture resolution and polygon count can actually be rather startling... There is still a ton of room for improvement, I'm afraid.

CrazyGPU said:

The CPU will be a different animal.
Going from Jaguar cores to Ryzen cores will be massive and will allow huge worlds, better AI, better physics, and better FPS. That will be the jump worth waiting for. No doubt about it. The experience will improve a lot.


We already have massive worlds; the limiter on that front tends to be memory, but there are ways to work around that. Developers got really clever last console generation to that end by implementing techniques like impostors.
In short, open world has been a massive buzzword this generation; I don't see that trend stopping next console generation.

But AI, physics, and general simulation quality should see a marked improvement next gen; the worlds should seem more "alive," with the ability to support heavier levels of scripting.

The upgrade to Ryzen should provide a 6-10x increase in performance... maybe more, depending on the instructions developers end up leveraging and the CPU in question. It's probably one of the largest jumps in CPU capability in a very long time.

CrazyGPU said:


Ray tracing may be implemented here or there, but we still need another 10 years to get fully ray-traced graphics the way they should be.
It would be fun though; I'm eager for the new gen to arrive. A good CPU with 4K is something I've been waiting for, even if RT is not ready for prime time. But it's not a matter of hardware alone; maybe developers will find ways to get better graphics with the same hardware next gen too.

7th-gen games were starting to dabble in ray tracing, especially in engines that used deferred rendering.

Ray Tracing is one of those technologies that is going to ramp up and become more significant over time, not just be thrown in your face all at once.




Pemalite said:

*snip*

I meant 0.24 TF, but I should have put 230 gigaflops, so the correct number would have been 0.23 TF. Source: https://www.gamespot.com/gallery/console-gpu-power-compared-ranking-systems-by-flop/2900-1334/7/

Wikipedia, on the other hand, gives your number: 0.192 teraflops.

Let's take your number: 1.84 / 0.192 is 9.6 times. The next jump would have to reach 17.6 TF to be an equal leap. Not going to happen.

Your numbers favor my conclusion even more here.

The exact numbers don't matter much, though.

Let's take memory.

You said around 417 MB, call it 0.42 GB, versus 4.5 GB for games: about 11 times more memory. 88 GB of RAM would be the same jump. We don't need that amount of RAM now, of course, but it's clear the jump will be much smaller next gen. Around double, like you say.

Bandwidth: 22.4 GB/s to 176 GB/s, close to 8 times more. We are not going to get to 1,400 GB/s of bandwidth, so that jump will be much smaller too.
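Putting the thread's own PS3-to-PS4 figures side by side:

```python
# Generation-over-generation ratios using the figures quoted in this thread
# (PS3 -> PS4): theoretical flops, game-usable RAM, and memory bandwidth.
jumps = {
    "GPU flops (TF)":   (0.192, 1.84),
    "Game RAM (GB)":    (0.42, 4.5),
    "Bandwidth (GB/s)": (22.4, 176.0),
}

ratios = {metric: new / old for metric, (old, new) in jumps.items()}
for metric, ratio in ratios.items():
    print(f"{metric}: {ratio:.1f}x")
```

Roughly 9.6x, 10.7x, and 7.9x; a 16 GB, ~450 GB/s next-gen box would land well short of those ratios across the board.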

In 2007 I was playing Crysis on my DX10 high-end PC, and consoles were a joke in texture quality: everything washed out and blurry. Now a PS4 or Xbox is still worse than my PC at ultra settings (GTX 1070, 2K monitor), but consoles are much closer than they used to be.

So clearly we are going to have a graphical improvement, but it will be much smaller than the old jumps, and I'm not even considering the jump from PS1 to PS2 or PS2 to PS3.

Graphics will be better, but nothing to write home about compared to the Xbox One X or PS4 Pro especially. Not a dream ray-tracing machine with 20 TF, 32 GB of RAM, and 1 TB/s of bandwidth, like Ken Kutaragi would want, at 600 US dollars that no one would buy.

I'm not saying graphics are fine; I'm just saying that the jump in graphics won't make people go "woooow".

On the other hand, CPU specs and capability will be the force that makes next gen something good: more games at 60 FPS, better AI, simulation, more objects in maps, etc. For many people, that would be a game changer.

That's my opinion. I hope I'm wrong and the PS5 turns out to be an incredible graphics beast that shows graphics on a whole new level.

The way I see it, there is more difference between the cars in Gran Turismo 1 and Gran Turismo Sport than between Gran Turismo Sport and real-world cars; the "wow" feeling is getting smaller and smaller.



CrazyGPU said:

I meant 0.24 TF, but I should have put 230 gigaflops, so the correct number would have been 0.23 TF. Source: https://www.gamespot.com/gallery/console-gpu-power-compared-ranking-systems-by-flop/2900-1334/7/

Wikipedia, on the other hand, gives your number: 0.192 teraflops.

Wikipedia is basing that on information direct from Nvidia. My own math on the GPU's floating-point capabilities aligns with that too.

It's 192 Gflops.

CrazyGPU said:
Let's take your number: 1.84 / 0.192 is 9.6 times. The next jump would have to reach 17.6 TF to be an equal leap. Not going to happen.

That is just single-precision floating-point math; there is far more to a GPU than that. Flops aren't accurate when comparing GPUs of different architectures; it's entirely a theoretical number, not a real-world one.

CrazyGPU said:
You said around 417 MB, call it 0.42 GB, versus 4.5 GB for games: about 11 times more memory. 88 GB of RAM would be the same jump. We don't need that amount of RAM now, of course, but it's clear the jump will be much smaller next gen. Around double, like you say.

Streaming assets into DRAM on a per-need basis is going to be significantly better next gen, and caching is becoming significantly more important, so developers can do more with less memory because of that.

But you are right, we don't need 88GB of DRAM, not for a substantial increase in fidelity anyway.

CrazyGPU said:
Bandwidth: 22.4 GB/s to 176 GB/s, close to 8 times more. We are not going to get to 1,400 GB/s of bandwidth, so that jump will be much smaller too.

Comparing raw numbers is a little disingenuous.
Modern Delta Colour Compression techniques can add another 50% or more to the bandwidth numbers...
Draw Stream Binning Rasterization severely cuts down the amount of work that needs to be performed to start with, making better use of limited bandwidth...
And I could go on.
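As a rough sketch of why the raw figures understate effective bandwidth (the 50% uplift is the claim above, not a measured value):

```python
# Effective bandwidth when lossless compression cuts the bytes actually moved.
# The 50% uplift figure is the rough claim above, not a measurement.
def effective_bandwidth_gbs(raw_gbs: float, compression_uplift: float) -> float:
    return raw_gbs * (1.0 + compression_uplift)

# PS4's 176 GB/s behaving like ~264 GB/s with a 50% compression uplift.
print(effective_bandwidth_gbs(176.0, 0.5))
```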

CrazyGPU said:
In 2007 I was playing Crysis on my DX10 high-end PC, and consoles were a joke in texture quality: everything washed out and blurry. Now a PS4 or Xbox is still worse than my PC at ultra settings (GTX 1070, 2K monitor), but consoles are much closer than they used to be.

I disagree; Metro on PC is a night-and-day difference from the consoles.
And the same goes for most Frostbite-powered games.

...And the difference is starting to look generational when comparing PC against the base Xbox One and Playstation 4.

Crysis, however, was a unique one: it was PC-exclusive and pushed PCs to the limit... And it wasn't the best-optimized title, as Crytek made forward projections about "possible" PC hardware (i.e. dramatic increases in CPU clock rates), and thus it took a very long time for it to stop being a hardware killer.

Consoles today (Xbox One X and PlayStation 4 Pro) are generally sitting around the PC's medium-quality preset. It's where the bulk of the 7th gen sat when compared against the PC; obviously resolution and framerates are a bit better this time around, but the gap still exists and always will.

CrazyGPU said:
So clearly we are going to have a graphical improvement, but it will be much smaller than the old jumps, and I'm not even considering the jump from PS1 to PS2 or PS2 to PS3.

Well, sure. Mostly because, as far as general rasterization is concerned... the bulk of the low-hanging performance/graphics fruit has been picked.
But that will at some point end.
I think the 10th console generation will be a significant leap, as graphics is undergoing a paradigm shift we haven't seen since programmable pixel shaders burst onto the scene with the original Xbox/GeForce 3... The 9th generation will be a bridge to that.

CrazyGPU said:
Graphics will be better, but nothing to write home about compared to the Xbox One X or PS4 Pro especially. Not a dream ray-tracing machine with 20 TF, 32 GB of RAM, and 1 TB/s of bandwidth, like Ken Kutaragi would want, at 600 US dollars that no one would buy.

The Xbox One X and PlayStation 4 Pro have yet to impress me... and the main reason is mostly that games are designed with the base Xbox One and PlayStation 4 in mind... If games stuck to a lower resolution and framerate on the Xbox One X and pushed fidelity far more strongly, I would probably be more impressed.

As for ray tracing, we might not need 20 teraflops, 32GB of RAM, and 1TB/s of bandwidth; there is a ton of research going on to make it more efficient. I mean, rasterization took years to become efficient too; compression and culling were the first big techniques there, and the same thing will happen with ray tracing.

CrazyGPU said:
On the other hand, CPU specs and capability will be the force that makes next gen something good: more games at 60 FPS, better AI, simulation, more objects in maps, etc. For many people, that would be a game changer.

A better CPU doesn't guarantee 60fps if you are GPU- or DRAM-limited. Simulation quality will be amazing, though; it should result in more immersive worlds overall.

CrazyGPU said:

That's my opinion. I hope I'm wrong and the PS5 turns out to be an incredible graphics beast that shows graphics on a whole new level.

The way I see it, there is more difference between the cars in Gran Turismo 1 and Gran Turismo Sport than between Gran Turismo Sport and real-world cars; the "wow" feeling is getting smaller and smaller.

With AMD's graphics processors being the weapon of choice for next gen, I think we need to keep expectations in check either way... AMD isn't generally making industry-leading high-end hardware.




CGI-Quality said:

There's a problem with using Crysis. Of course it 'shit all over the consoles'; it was a PC exclusive, taking advantage of the latest and greatest technology. Modern machines aren't getting closer; they're simply having their goods taken advantage of, and games made only for PC, particularly of that flavor, are virtually non-existent.

Although the PC version of Metro: Exodus stomps the console versions into the ground, if it were a PC exclusive, there would truly be no discussion to have. But, go look at the beginning of this gen and then look at the Exodus shots I've posted. It isn't even close. And that's one example. Video game graphics are NOWHERE NEAR their pinnacle.

Finally we go straight to the heart of the matter! And thanks ;)

Scalability is not a miraculous solution when we want to squeeze in several SKUs, or the infinite plethora of hardware components with different architectures and specs like on PCs; it is just a way to make everybody "somewhat happy", without ever using the available resources to their fullest.

Just imagine if the best developers could develop exclusively for the best PC hardware; the result would blow away anything you have seen on PC, by a very long margin.

Crysis was the perfect example, and developers could still do even more if they could choose the very best components, put them together, and build a super-powerful box: a single SKU to target.

 




Pemalite said:

*snip*

Thanks, Pemalite, for all the explanations and clarification; a common mistake is comparing raw numbers and drawing conclusions without understanding the whole architecture.

I hope you will keep posting here.


