Mr Puggsly said:
Again, a game aiming for 1440p and 30 fps would likely have exceptional visuals. These are scenarios where graphics settings should probably be lowered. Hence, I can respond saying nothing new. Even if Series S doesn't exist, I don't think 1440p/30 fps is going to be a common goal for MS. Nor will it be for Sony. You keep trying to create extreme scenarios to make Series S seem like a bad idea. The PS5 can turn God of War into a 4K/60 fps game with maybe half of its GPU power. So God of War 2 doesn't need to be 1440p/30 fps for a massive visual boost. Ultimately, if the Series S can cost about $150-200 less and play the same games, then its existence would be justified. A visual downgrade would be expected for the price disparity. You seem to feel "physics, AI, world simulation" are important. The Series S could keep all that; it's just visual fidelity taking a hit. I have an X1X, so I already play 4K content. A 1080p game can objectively look more impressive than 4K content. Hence, the quality of the pixels can matter more than the number. So even if Series S is generally doing 720p-1080p in AAA games, that's fine. Because it will still come with the advancements of 9th gen game design. There will also be improved image reconstruction techniques, there can still be 4K UIs, and the games will still be a noticeable upgrade over 8th gen.
I don't think it's an extreme scenario that next gen consoles will take a giant leap in fidelity; isn't that pretty much what we should be expecting from next gen games lol. But I don't see that happening if they're all running at native 4K/60 fps, especially when there are already some current gen games that can't hit 60 fps on an RTX 2080 Ti at native 4K. And no, GOW isn't native 4K/60 fps, it's checkerboard 4K at 30 fps or 1080p/60 fps.
Ultimately, while the debate will continue to rage about the effectiveness of the technique, checkerboarding makes a lot of sense here - it would be impossible to render a game like God of War at native 4K on a PS4 Pro while maintaining a smooth frame-rate, and the visual payback compared to some of the 1800p and 1620p games we've seen is self-evident.
https://www.eurogamer.net/articles/digitalfoundry-2018-god-of-war-tech-analysis
Series X's specs are great, but MS is really creating a problem for themselves by focusing so much on compatibility with PC and their current gen consoles. I mean, think about Halo Infinite. We will have the base Xbox One running the game at 1080p/30 fps, an X1X version probably running at 4K/30 fps, a Lockhart version running at 4K/60 fps, and a Series X version running at 4K/120 fps with a bump in (inefficient) graphics settings. In the end, no matter which console you're playing on, you will still be getting the exact same game that was designed for the lowest common denominator. They'll probably drop X1 support after a year or so, but what about X1X, and the fact that all their exclusives will still have to run on a 4 Tflops Lockhart for the remainder of the console generation?
It's going to be weird as hell, and Sony will be laughing their asses off when their PS5 exclusives start pouring out, completely unhindered by compatibility with weaker hardware, with those extra resources going into a ton of stuff that's more exciting than just native 4K and extra fps. Especially if their SSD tech is the game changer Sony believes it is and isn't even available on PC.
Last edited by goopy20 - on 21 March 2020