There's a problem with using Crysis as the example. Of course it 'shit all over the consoles'; it was a PC exclusive, taking advantage of the latest and greatest technology. Modern machines aren't falling behind, their potential simply isn't being fully exploited, and games made only for PC, particularly of that ambitious flavour, are virtually non-existent.
Although the PC version of Metro: Exodus stomps the console versions into the ground, if it were a PC exclusive, there would truly be no discussion to have. But, go look at the beginning of this gen and then look at the Exodus shots I've posted. It isn't even close. And that's one example. Video game graphics are NOWHERE NEAR their pinnacle.
Finally we get straight to the heart of the matter! And thanks ;)
Scalability is not a miracle solution when you want to support several SKUs, or the infinite plethora of hardware components with different architectures and specs you find on PC; it is just a way to make everybody "somehow happy", without ever using the available resources to their fullest.
Just imagine if the best developers could develop exclusively for the best PC hardware; the result would blow away anything you have seen on PC, by a very long margin.
Crysis was the perfect example, and developers could do even more still if they could choose the very best components, put them together, and build a single super-powerful box to target as one SKU.
I cringe when people start talking about scalability and how it will save ports.
Yes, engines and games are scalable, but developers rarely build the best possible game for the highest-end hardware and then cut back until they reach the base. They pick a floor, build the whole game around it, and then layer extras on top for the higher-end hardware, usually in a very lazy way.
So the base hardware will always hold back what the game can be. And once a base is decided, going for even lower hardware forces severe cuts that hardly pay off.
Wikipedia is basing that on information direct from nVidia. My own math on the GPU's floating-point capabilities aligns with that too.
That is just single-precision floating-point math; there is far more to a GPU than that. FLOPS isn't accurate when comparing GPUs of different architectures; it's an entirely theoretical figure, not a real-world one.
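To show why it's only a theoretical figure: the headline TFLOPS number is just shader count times clock times two (a fused multiply-add counts as two operations per cycle). A quick sketch, with illustrative numbers rather than any specific GPU's real specs:

```python
def peak_tflops(shader_cores: int, clock_ghz: float, ops_per_cycle: int = 2) -> float:
    """Theoretical single-precision peak: cores x clock x 2 (FMA = 2 FLOPs/cycle)."""
    return shader_cores * clock_ghz * ops_per_cycle / 1000.0

# Illustrative only: a hypothetical 2304-core GPU at 1.3 GHz.
print(peak_tflops(2304, 1.3))  # 5.9904, i.e. ~6 TFLOPS
```

Nothing in that formula captures cache sizes, bandwidth, scheduling, or architectural efficiency, which is exactly why the number doesn't transfer across architectures.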
Streaming assets into DRAM on an as-needed basis is going to be significantly better next gen, and caching is becoming significantly more important, so developers can do more with less memory because of that.
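The idea of keeping only currently needed assets resident can be sketched as a small least-recently-used cache. All names here are hypothetical; this only illustrates the caching principle, not any real engine's streaming system:

```python
from collections import OrderedDict

class AssetCache:
    """Keep only the most recently used assets in memory; evict the rest."""
    def __init__(self, capacity: int):
        self.capacity = capacity
        self.resident = OrderedDict()  # asset name -> asset bytes

    def fetch(self, name, load_from_disk):
        if name in self.resident:
            self.resident.move_to_end(name)       # cache hit: mark recently used
            return self.resident[name]
        data = load_from_disk(name)               # miss: stream in on demand
        self.resident[name] = data
        if len(self.resident) > self.capacity:
            self.resident.popitem(last=False)     # evict least recently used
        return data

cache = AssetCache(capacity=2)
cache.fetch("rock", lambda n: b"rock-data")
cache.fetch("tree", lambda n: b"tree-data")
cache.fetch("rock", lambda n: b"rock-data")    # hit, refreshes "rock"
cache.fetch("grass", lambda n: b"grass-data")  # over capacity, evicts "tree"
print(list(cache.resident))  # ['rock', 'grass']
```

The point is the same as in the post: a small resident set plus fast on-demand streaming can stand in for a much larger pool of DRAM.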
But you are right, we don't need 88GB of DRAM, not for a substantial increase in fidelity anyway.
Comparing raw numbers is a little disingenuous.
Modern Delta Colour Compression techniques can add another 50% or more to the bandwidth numbers...
Draw Stream Binning Rasterization severely cuts down the amount of work that needs to be performed to start with, making better use of limited bandwidth...
And I could go on.
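The bandwidth point above is simple arithmetic: if compression raises effective throughput by some average factor, the raw number understates what the memory subsystem actually delivers. A sketch with invented numbers (the ~50% gain is the figure quoted above, not a measured value):

```python
def effective_bandwidth(raw_gb_s: float, compression_gain: float) -> float:
    """Raw bandwidth scaled by the average compression gain (0.5 means +50%)."""
    return raw_gb_s * (1.0 + compression_gain)

# Illustrative: 256 GB/s raw with a ~50% gain from delta colour compression
# behaves more like 384 GB/s, so raw-number comparisons mislead.
print(effective_bandwidth(256.0, 0.5))  # 384.0
```

This is why two GPUs with identical raw bandwidth can behave very differently once their compression and culling hardware differs.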
I disagree, Metro on PC is a night and day difference from consoles.
And the same goes for most Frostbite powered games.
...And the difference is starting to look generational when comparing PC against the base Xbox One and Playstation 4.
Crysis, however, was a unique one: it was PC exclusive and it pushed PCs to their limits... And it wasn't the best-optimized title, as Crytek made forward projections on "possible" PC hardware (i.e. dramatic increases in CPU clock rates), so it has taken a very long time for it to stop being a hardware killer.
Consoles today (Xbox One X and Playstation 4 Pro) generally sit around the PC's medium quality preset. That's where the bulk of the 7th gen sat when compared against the PC; resolution and framerates are a bit better this time around, of course, but the gap still exists and always will.
Well sure. Mostly because as far as general rasterization is concerned... The bulk of the low-hanging performance/graphics fruit has been picked.
But that will at some point end.
I think the 10th console generation will be a significant leap in graphics, as graphics is undergoing a paradigm shift we haven't seen since programmable pixel shaders burst onto the scene with the original Xbox/Geforce 3... The 9th generation will be a bridge to that.
The Xbox One X and Playstation 4 Pro have yet to impress me... The main reason is that games are designed with the base Xbox One and Playstation 4 in mind... If games stuck to a lower resolution and framerate on the Xbox One X and pushed fidelity far harder... I would probably be more impressed.
As for Ray Tracing, we might not need 20 Teraflops, 32GB of RAM and 1TB/s of bandwidth; there is a ton of research going on to make it more efficient. I mean... Rasterization took years to become efficient too, with compression and culling being the first big techniques, and the same thing will happen with Ray Tracing.
A better CPU doesn't guarantee 60fps if you are GPU or DRAM limited. Simulation quality will be amazing, though; it should result in more immersive worlds overall.
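The bottleneck point can be put in frame-time terms: the frame rate is capped by whichever stage takes longer per frame, so speeding up the CPU changes nothing while the GPU is the limit. A toy model with invented timings:

```python
def fps(cpu_ms: float, gpu_ms: float) -> float:
    """Frame rate is limited by the slowest stage's per-frame time."""
    return 1000.0 / max(cpu_ms, gpu_ms)

# Illustrative: GPU-bound at 25 ms per frame, so halving CPU time doesn't help.
print(fps(cpu_ms=12.0, gpu_ms=25.0))  # 40.0
print(fps(cpu_ms=6.0, gpu_ms=25.0))   # still 40.0
```

Real pipelines overlap CPU and GPU work, but the max() behaviour is the essence of being "GPU limited".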
With AMD's graphics processors being the weapon of choice for next gen... I think we need to keep expectations in check either way... AMD isn't generally making industry-leading high-end hardware.
Thanks Pemalite for all the explanations and clarifications; a common mistake is to compare raw numbers and draw conclusions without understanding the whole architecture, etc.
I hope you will keep posting here.
Yep, after our dear Pema's posts, the most I would use TFLOPs or other raw numbers for is a ballpark expectation, and mostly when comparing very similar architectures. Otherwise it's much better to evaluate real-world implementations.
Does anyone else think that pushing further graphical fidelity is a waste?
This console generation has taught me more than any other that, while nice, pixel counting doesn't exactly equate to better games. It doesn't push sales like it once did either. So many of today's top sellers are not graphical powerhouses.
I say this as someone who greatly enjoys cinematic story-driven games. The likes of Horizon, Uncharted, The Last of Us, God of War: my top games of the gen. Yet..... I think we have reached a point where further fidelity only puts your game farther from being able to make a profit. Minecraft, Fortnite, everything from Nintendo..... these games are huge and didn't need it. This isn't to say every game needs to be the same. There are just certain things that probably need to be pushed more with whatever added power the PS5 will bring.
-Performance: this finally needs to be more of a focus. 60fps and steady. If we can achieve that at 4K, so be it, but I no longer care about pixel counts.
-Features: this to the extreme! One of the best advances of this gen is the now-standard video/pic streaming capability. Far better use of all that extra RAM. I would like to see more quality-of-life additions like this, such as the ability to run programs side by side, e.g. a PnP game and an internet browser at the same time. Think the new Samsung phone method. Maybe a Playstation Store that is speedy and runs like a dream.
-Return of Game Features: this is the most important. What if we used that extra power not to go full 4K but to bring back quality features such as split-screen co-op?! Before the pixel wars, it was things like this that really stood out. Nintendo in particular is finding much success with this. Maybe the return of NPC AI in traditional multiplayer games, as the blind focus on courting people for multiplayer has developers missing those who still want to play their games alone.
-VR: further progress here is a given. I am very interested to see what the PS5 will do to make VR even better. With stronger tech, maybe this will mean fewer demo-like games and we can get full experiences. No more floating hands. Give me the full Mirror's Edge experience in VR. Imagining games like Cyberpunk in VR.....
I just really feel we need a good gen where we don't try to push any more past 4K and just concentrate on making games that can more easily take advantage of the hardware. Akin to the era of PS2 before devs started going broke over graphics.
Some think so; I don't.
You don't have to use all the power just for photo-realism; you can make fantastic and unreal games as well, it's just how you use that power. We are very far from the apex and I want to keep seeing it improve. Back when I was 15 and played FF IX for the first time (coming from the Genesis' colourful cartoon games), the CGI blew me away and I thought it couldn't get better. Then Tekken 5 on PS2, with pores and fur in the demo, was unthinkable. Then the Gran Turismo reveal on PS3 (something like "welcome to real life") showed it could still get better. Detroit on PS4 shows something almost like a real human in the game... But after thinking it couldn't get better so many times, now I just want to wait and see how much more it can improve.
Yeah, I just don't want to miss any major benefits from next gen, but it looks like I won't.
I saw that advice on some sites when people questioned similar questions too, btw, lol.
Thank you, I appreciate it.
As it stands, I am currently using a 4K HDR TV with HDMI 2.0. I switched from using projectors back in 2017. I will be upgrading to a 75" or bigger TV around 2020. I would have done it this year, but my current setup is good enough for at least another 18 months.
So I'll be buying my new TV right along with my PS5 if it arrives around November 2020. On the plus side, I will be able to get a 75" - 85" TV for less than what I would pay this year. Oh, and electronics are generally cheaper around that time of the year.
If you ask me... I would say just wait.... but if you don't have a 4K TV now and want to get in on the 4K bandwagon.... then find the cheapest good 4K TV out there.... preferably from TCL, and call it a day. You could get a good set for under $400.
Don't worry too much. There is a possibility the TV is already 2.1 capable, but with the standard not yet defined or in much use, they haven't advertised it, and a firmware upgrade could enable it later, just like on PS4. There's also a great chance it will take several years before 2.1 is used enough to be an important factor, and by that time you'll be looking at another TV anyway.