CrazyGPU said:

I meant 0.24 TF, but I should have put 230 gigaflops, so the correct number would have been 0.23 TF. Source: https://www.gamespot.com/gallery/console-gpu-power-compared-ranking-systems-by-flop/2900-1334/7/

Wikipedia, on the other hand, gives your number: 0.192 teraflops.

Wikipedia is basing that on information directly from nVidia. My own math on the GPU's floating-point capabilities aligns with that too.

It's 192 GFLOPS.

CrazyGPU said:
Let's take your number. 1.84 / 0.192 is 9.6 times. The next jump would need to be to 17.6 TF to be equal. Not going to happen.

That is just single-precision floating-point math; there is far more to a GPU than that. FLOPS isn't accurate when comparing GPUs of different architectures: it's a theoretical peak, not a real-world measure.
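For what it's worth, the peak figure both of us are quoting falls out of a simple formula: shader ALUs × FLOPs per ALU per clock × clock speed. A quick sketch using the commonly published PS4 GPU figures (1152 GCN shader ALUs at 800 MHz, 2 FLOPs per clock for a fused multiply-add); the layout is illustrative, not a benchmark:

```python
# Theoretical peak single-precision throughput: ALUs x FLOPs/ALU/clock x clock.
# These are the publicly quoted PS4 GPU figures; real-world throughput is lower.

def peak_tflops(alus: int, clock_ghz: float, flops_per_clock: int = 2) -> float:
    """Theoretical peak in teraflops (FMA counts as 2 FLOPs per clock)."""
    return alus * clock_ghz * flops_per_clock / 1000.0

ps4 = peak_tflops(1152, 0.8)   # ~1.84 TF
ps3_rsx = 0.192                # the 192 GFLOPS figure under discussion

print(f"PS4 peak: {ps4:.2f} TF")
print(f"Generational jump: {ps4 / ps3_rsx:.1f}x")
```

Which is exactly where the "9.6 times" in the quote comes from, and also why it's only a paper number: nothing about the formula captures architectural efficiency.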

CrazyGPU said:
you said around 417 MB, or 0.42 GB (approximating here), vs 4.5 GB for games. 11 times more memory. 88 GB of RAM would be the same jump. We don't need that amount of RAM now, of course, but it's clear that the jump will be much lower next gen. Around double, like you say.

Streaming assets into DRAM as they are needed is going to be significantly better next gen, and caching is becoming significantly more important, so developers can do more with less memory because of that.
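The "do more with less memory" point is essentially a caching argument: keep only the assets the player currently needs in DRAM and evict the least recently used ones when space runs out. A toy sketch of that idea; the asset names, sizes, and cache budget are made up for illustration:

```python
from collections import OrderedDict

class AssetCache:
    """Toy LRU cache for streamed assets, capped by total size in MB."""

    def __init__(self, capacity_mb: int):
        self.capacity = capacity_mb
        self.used = 0
        self.assets = OrderedDict()  # name -> size_mb, least recently used first

    def request(self, name: str, size_mb: int) -> str:
        if name in self.assets:
            self.assets.move_to_end(name)      # mark as recently used
            return "hit"
        while self.used + size_mb > self.capacity and self.assets:
            _, evicted_size = self.assets.popitem(last=False)
            self.used -= evicted_size          # evict least recently used
        self.assets[name] = size_mb
        self.used += size_mb
        return "miss"

cache = AssetCache(capacity_mb=100)
cache.request("rock_texture", 40)   # miss: streamed in from disk
cache.request("tree_mesh", 40)      # miss
cache.request("rock_texture", 40)   # hit: no streaming needed
cache.request("boss_model", 40)     # miss: evicts tree_mesh to make room
```

A 100 MB budget services 160 MB of requests here; that's the whole trick, scaled down.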

But you are right, we don't need 88GB of DRAM, not for a substantial increase in fidelity anyway.

CrazyGPU said:
Bandwidth... 22.4 GB/s to 176 GB/s, close to 8 times more bandwidth. We are not going to get to 1400 GB/s of bandwidth, so the jump will be much lower too.

Comparing raw numbers is a little disingenuous.
Modern Delta Colour Compression techniques can add another 50% or more to the effective bandwidth...
The Draw Stream Binning Rasterizer severely cuts down the amount of work that needs to be performed in the first place, making better use of limited bandwidth...
And I could go on.
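The raw ratio in the quote, and what compression does to it, is simple arithmetic. A sketch below; the 50% delta colour compression gain is the rough figure mentioned above, not a measured one:

```python
# Raw bandwidth jump from the quoted figures.
ps3_gb_s = 22.4    # RSX-side GDDR3 bandwidth
ps4_gb_s = 176.0   # PS4 GDDR5 bandwidth

raw_jump = ps4_gb_s / ps3_gb_s
print(f"Raw jump: {raw_jump:.1f}x")             # ~7.9x

# Compression doesn't add physical bandwidth; it makes the same bus carry
# more useful data. Assuming the rough ~50% figure from above:
effective = ps4_gb_s * 1.5
print(f"Effective with DCC: {effective:.0f} GB/s")
```

Which is why "8x raw" understates the real jump: the newer GPU also wastes far less of the bandwidth it has.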

CrazyGPU said:
In 2007 I was playing Crysis on my DX10 high-end PC and consoles were a joke in texture quality. All washy and blurry. Now a PS4 or Xbox is still worse than my PC at ultra settings (GeForce 1070, 2K monitor), but consoles are much closer than they used to be.

I disagree, Metro on PC is a night and day difference from consoles.
And the same goes for most Frostbite powered games.

...And the difference is starting to look generational when comparing PC against the base Xbox One and Playstation 4.

Crysis, however, was a unique one: it was PC exclusive and it pushed PCs to their limits... And it wasn't the best-optimized title, as Crytek made forward projections about "possible" PC hardware (i.e. dramatic increases in CPU clock rates), so it took a very long time for it to stop being a hardware killer.

Consoles today (Xbox One X and PlayStation 4 Pro) generally sit around the PC's medium quality preset, which is where the bulk of the 7th gen sat when compared against the PC. Resolution and framerates are a bit better this time around, of course, but the gap still exists and always will.

CrazyGPU said:
So clearly, graphically we are going to have an improvement, but it will be much lower than the old jumps, and I'm not even considering the jump from PS1 to PS2 or PS2 to PS3.

Well, sure. Mostly because, as far as general rasterization is concerned... the bulk of the low-hanging performance/graphics fruit has been picked.
But that will end at some point.
I think the 10th console generation will be a significant leap, as graphics is undergoing a paradigm shift we haven't seen since programmable pixel shaders burst onto the scene with the original Xbox/GeForce 3... The 9th generation will be a bridge to that.

CrazyGPU said:
Graphics will be better but nothing to write home about, especially compared to the Xbox One X or PS4 Pro. Not a dream ray-tracing machine with 20 TF, 32 GB of RAM and 1 TB/s of bandwidth like Ken Kutaragi would want, at 600 USD, that no one would buy.

The Xbox One X and PlayStation 4 Pro have yet to impress me... And the main reason is that games are designed with the base Xbox One and PlayStation 4 in mind... If games stuck to a lower resolution and framerate on the Xbox One X and pushed fidelity far more strongly, I would probably be more impressed.

As for ray tracing, we might not need 20 teraflops, 32 GB of RAM and 1 TB/s of bandwidth; there is a ton of research going on to make it more efficient. I mean... rasterization took years to become efficient too, with compression and culling being the first big techniques, and the same thing will happen with ray tracing.

CrazyGPU said:
On the other hand, CPU specs and capability will be the force that makes next gen something good. More games at 60 FPS, better AI, simulation, objects in maps, etc. For many people that would be a game changer.

A better CPU doesn't guarantee 60 fps if you are GPU- or DRAM-limited. Simulation quality will be amazing, though; it should result in more immersive worlds overall.
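The bottleneck point is easy to see in frame-time terms: whichever of the CPU and GPU takes longer per frame sets the frame rate. A minimal sketch with made-up timings (and ignoring real-world pipelining, which doesn't change the conclusion):

```python
def fps(cpu_ms: float, gpu_ms: float) -> float:
    """Frame rate when the slower of CPU and GPU work sets the pace.

    A simplification: real engines pipeline CPU and GPU frames, but the
    longest stage still dictates sustained throughput.
    """
    return 1000.0 / max(cpu_ms, gpu_ms)

# A CPU capable of 125 fps on its own still yields 50 fps if the GPU
# needs 20 ms per frame.
print(fps(cpu_ms=8.0, gpu_ms=20.0))   # 50.0
print(fps(cpu_ms=8.0, gpu_ms=12.0))   # ~83.3, once GPU work shrinks
```

So a big CPU upgrade mostly buys headroom for simulation, AI and object counts; hitting 60 fps still depends on the GPU side of the budget.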

CrazyGPU said:

That's my opinion. I hope I'm wrong and the PS5 becomes an incredible graphics beast that shows graphics on a whole new level.

The way I see it, there is more difference between the cars in Gran Turismo 1 and Gran Turismo Sport than between Gran Turismo Sport and real-world cars; the "wow" feeling is getting smaller and smaller.

With AMD's graphics processors being the weapon of choice for next gen, I think we need to keep expectations in check either way... AMD isn't generally making industry-leading high-end hardware.



--::{PC Gaming Master Race}::--