DonFerrari said:
Mummelmann said:
My point was never what was required for 4K gaming; it was about mainstream-priced hardware being able to pull it off. Proper 4K gaming on mainstream devices, with decent frame rates and effects, is still a long way off. 20GB of GDDR6 will be expensive as hell in and of itself, especially given that the final build will most likely need to be ready within 9-12 months, as it's unlikely that Sony will wait all that long before releasing a new PS.
And even if, by some miracle, one managed to pack a mainstream $500 device with a heap of GDDR6 memory, shared or otherwise, the true bottleneck would be the rest of the build, where they'd inevitably need to save a ton with cheaper solutions, especially if the rumors of BC are to be believed. An RTX 2080 with 8 GB of memory currently sits at around $750-850 on its own, and it can do 4K at good frame rates in most games (65-80 and above is good in my opinion). Getting performance at nearly that level in a mainstream box for $500 within a year or so is quite simply not happening.
Consoles aren't future-proof; that's the whole point of all of this. Any console released today will be hopelessly outdated long before it's replaced. Seeing the state of streamed 4K content on TV right now, one can imagine how long it will take before games with rendered assets reach an acceptable point on any mainstream device. Proper, stable 4K gaming has only been possible on high-end PCs for a couple of years as it is; the 1080 Ti struggled to creep past the 60 fps mark in most titles.
Again, my point was never what will be required in the future, rather that what will be required in the future will not be met by upcoming consoles, not even close, if they want any sort of approachable price point. As far as limits on video memory go, it's hard to find ways to strain a modern high-end GPU with 11GB of memory or more; even my aging 980 Ti with only 6GB is still doing okay, albeit at 1440p rather than 4K. And, one last time: if high-end GPUs right now are getting on nicely with their allotted memory, there's no way that any affordable machine in 2020 or so will release with similar or better specs.
|
Except 60 fps is hardly something console gaming requires or cares about for most genres.
|
Perhaps, but developers do, and aiming for higher fps means better margins. As it stands, a lot of games that aim for 30 fps dip into slideshow territory; AC: Odyssey is a good example, and I don't think I've ever seen worse frame rates in a modern AAA title. With more effects and higher resolutions, severe drops in frame rate become even more jarring; a more advanced and crisp render stands out all the more when it slows down.
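To put rough numbers on it (back-of-the-envelope only, Python just for the arithmetic; the fps values are illustrative, not measurements from any specific game): the per-frame time budget balloons fast once you slip below a 30 fps target, which is why those dips read as a slideshow.

```python
# Rough frame-time budgets; illustrative arithmetic only.
for fps in (60, 30, 20, 15):
    print(f"{fps:>2} fps -> {1000 / fps:5.1f} ms per frame")
# 60 fps ->  16.7 ms per frame
# 30 fps ->  33.3 ms per frame  (already double the latency of 60)
# 20 fps ->  50.0 ms per frame  (a "small" 10 fps dip adds ~17 ms on top of that)
# 15 fps ->  66.7 ms per frame
```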
The Xbox One X doesn't provide proper 4K: it uses checkerboard rendering and rather few effects, and frame rates in many titles are very low; Destiny 2 runs at a somewhat unstable 30 fps.
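For reference, the shading cost of checkerboard "4K" sits a lot closer to 1440p than to native 4K. Rough pixel math below (the exact split varies per engine, so treat the halving as an approximation):

```python
# Rough per-frame pixel counts; real checkerboard implementations vary by engine.
native_4k    = 3840 * 2160       # ~8.3M pixels shaded at native 4K
checkerboard = native_4k // 2    # ~4.1M shaded, the rest reconstructed from the previous frame
native_1440p = 2560 * 1440       # ~3.7M pixels at 1440p
print(native_4k, checkerboard, native_1440p)  # 8294400 4147200 3686400
```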
Half-assed resolutions with poor performance are not what developers want to work with; it's better to find an entry point where they can get both visual fidelity and performance and not be forced to choose. And if the hardware ends up too costly, developers and publishers have a smaller market to sell their software to. I'd much rather the standard be a stable 1440p with full effects and shading than stripped-down or faked 4K, and then perhaps release a more expensive version of the console that does more or less what the Pro and One X do for the line-up right now.
The fact that 30 fps is still more or less the industry standard in console gaming in 2019 is downright shameful, especially since games often dip well below it.