Mr Puggsly said:
goopy20 said:

That's almost accurate, except that going from 1440p to 900p isn't going to double the fps. If a game runs at the exact same graphics settings on PS5 at 1440p/30fps, they would have to scale it down all the way to 720p to hit 30fps on Series S. And if a game ran at 1080p on PS5, they would need to drop it to 540p on Series S.
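For a quick sanity check of that math, here's a rough Python sketch. It assumes the rumored figures of roughly 10 TFLOPs for PS5 and 4 TFLOPs for Lockhart/Series S (neither confirmed at the time of writing), and first pretends fps scales perfectly linearly with pixel count:

```python
# Rough resolution-scaling sanity check.
# Assumed (rumored, unconfirmed) GPU throughput in TFLOPs:
PS5_TFLOPS = 10.0      # assumption: rumored PS5 figure
SERIES_S_TFLOPS = 4.0  # assumption: rumored Lockhart figure

RESOLUTIONS = {
    "1440p": 2560 * 1440,  # 3,686,400 pixels
    "1080p": 1920 * 1080,  # 2,073,600 pixels
    "900p":  1600 * 900,   # 1,440,000 pixels
    "720p":  1280 * 720,   #   921,600 pixels
    "540p":   960 * 540,   #   518,400 pixels
}

ratio = PS5_TFLOPS / SERIES_S_TFLOPS  # ~2.5x raw compute gap

# If performance scaled perfectly linearly with pixel count,
# Series S could render 1/ratio as many pixels per frame:
target_pixels = RESOLUTIONS["1440p"] / ratio
print(f"Linear-scaling target: {target_pixels:,.0f} pixels "
      f"(900p is {RESOLUTIONS['900p']:,})")

# In practice, frame time has resolution-independent costs
# (CPU, geometry, fixed per-frame work), so fps scales
# sub-linearly with resolution -- which is the argument for
# the real-world target landing at 720p rather than 900p.
```

Linear scaling alone lands you right around 900p; the sub-linear reality is what pushes the target down toward 720p.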

The Switch might be able to get away with 540p, but that's because it's a handheld. On a 55-inch TV it would look like a giant smudge, and there's just no way MS would allow developers to release 720p or even 540p games on their "next gen" console. They will likely target 1080p on Series S, cut visual fidelity in half across the board so it can run on 4 TFLOPs, and we'll be playing the same games on Series X at native 4K and checkerboard 4K on PS5. This is why I pray to the gaming gods that this whole Lockhart thing is not real.

Now of course you could lower some graphics settings on PC. Console games, however, don't expose graphics settings because they are optimized for the specific hardware, so they tend to avoid rendering effects that are too expensive for a relatively small gain in visuals.

Developers can adjust resolution and graphics settings if necessary. Modern engines are generally designed to scale well; your last paragraph ignores that.

For example, some X1 games weren't just a resolution drop versus PS4; they also adjusted graphics settings where necessary. Meanwhile, PS4 Pro and X1X games sometimes have higher graphics settings along with higher resolution. You're suggesting console games don't benefit much from tweaking graphics settings, and that simply is not reality.

Series X and PS5 having 1080p content seems highly unlikely, or rare if it ever happens. So we don't need to dwell on unlikely scenarios; most games next gen will likely be 4K or close to it, even on PS5.

Hence, you're creating unlikely scenarios to make Series S seem like a bad idea.

PS4 had 1.8 TF and generally hit 1080p. PS5 probably has 5x the GPU power in practice, so Sony could hit 4K and still make significantly better-looking games, especially when you consider X1X hit 4K in something like RDR2.
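A quick back-of-the-envelope check of that reasoning in Python (the 5x figure is the post's own estimate, not a confirmed spec):

```python
# Back-of-the-envelope: does ~5x PS4 GPU power cover native 4K?
PS4_TFLOPS = 1.84            # launch PS4 GPU
PS5_TFLOPS = PS4_TFLOPS * 5  # assumption: the "5x in practice" estimate

PIXELS_1080P = 1920 * 1080   # 2,073,600 -- the typical PS4 target
PIXELS_4K = 3840 * 2160      # 8,294,400 -- exactly 4x 1080p

pixel_ratio = PIXELS_4K / PIXELS_1080P  # 4.0
power_ratio = PS5_TFLOPS / PS4_TFLOPS   # 5.0

# If ~4x of the ~5x uplift pays for the resolution jump,
# roughly power_ratio / pixel_ratio is left over per pixel:
print(f"Headroom per pixel at 4K: {power_ratio / pixel_ratio:.2f}x")
# ~1.25x -- enough for 4K plus modestly better visuals, before
# counting any per-flop efficiency gains from a newer architecture.
```

Under those assumptions, the resolution jump eats most but not all of the uplift, which is consistent with the claim that 4K plus better-looking games is feasible.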

The PS4 Pro and X1X didn't have totally different graphics settings. They were designed to be 4K consoles, not next-gen consoles, and anyone who doesn't have a 4K TV would hardly be able to tell the difference. Games on X1X could potentially have looked a ton better, but developers didn't make any game that really made good use of the hardware. Instead, the games were exactly the same as on the base consoles, and all the extra processing power went to 4K and/or 60fps.

4K is nice, but it's a tremendous waste of resources on consoles, as most people can hardly tell the difference unless they have a 65-inch TV. That's why so few people upgraded to a mid-gen console. So if we're talking about things like ray tracing, there's no way developers would compromise on that in favor of native 4K. Keep in mind that even a 2080 Ti can barely hit 30fps in most current-gen games at native 4K with ray tracing enabled.

The fact is that developers will always have to make compromises if they're building an ambitious game, and resolution is usually pretty low on the priority list. Just look at how many games are 720p on Xone; is that really such a big deal? Not at all, because while Red Dead Redemption on the 360 runs at almost the same resolution as RDR2 on Xone, RDR2 looks a helluva lot better.

Last edited by goopy20 - on 11 March 2020