Mr Puggsly said:
goopy20 said:

Look, we can argue about resolutions all day, but the fact remains that resolution isn't that big a deal for the average console gamer. Most people watch 1080p content all day and can hardly tell the difference between a 1080p and a 4K movie on a 55-inch TV. If a game looks great, nobody is going to sit right in front of their TV and say "wait a minute, this isn't even 4K!" and feel like they're getting a lackluster experience. 

The real question is how ambitious next-gen games will still be if they're designed to run on a 4 TFLOPS Series S and the Xbox One. Also, if we're talking about parity, what will developers be able to use those extra 8 TFLOPS for on Series X? My point is that they wouldn't be able to use them for anything except resolution, fps and a bump in graphics settings. Not the things that actually matter, like larger/richer levels, AI, physics, world simulations etc.

Having cheap options might sound great, but it does come with a major trade-off. The worst thing that could happen is third-party developers starting to use Series S as the lowest common denominator, limiting their ambitions across all platforms. I'm sure games would still look better than what we're seeing today, but the leap would simply be much smaller than it could have been without Series S.  

Agree with your first paragraph. But your previous comments about X1 resolutions were mostly wrong.

The ambition of future MS games isn't inherently limited by Series S or even the X1. Are Doom Eternal and The Witcher 3 limited by the Switch? It also depends on the individual project, as games can have features removed.

"larger/richer levels, AI, physics, world simulations etc." For Series S, all of that can be maintained, as it often has little to do with GPU TF. The X1, which will be supported for a period, can scale aspects back, like we see on Switch ports for example.

In theory the only significant compromise Series S would make is 4 TF vs 12 TF. That means the S and X should support the exact same content, with GPU-heavy aspects being compromised on S, whether that be performance, graphics settings and/or resolution.
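A quick back-of-envelope sketch of what that 4 TF vs 12 TF gap could mean for resolution alone. The assumption that GPU cost scales purely with pixel count is mine and is an oversimplification (real scaling depends on the workload), but it shows the rough order of magnitude:

```python
# Hypothetical: if GPU cost scaled purely with pixel count, a 4 TF GPU
# could drive about 1/3 the pixels of a 12 TF GPU at identical settings.

series_x_tf = 12.0
series_s_tf = 4.0

pixels_4k = 3840 * 2160                               # Series X's 4K target
pixels_s = pixels_4k * series_s_tf / series_x_tf      # same settings, fewer pixels

print(int(pixels_s))                  # 2764800 pixels
print(pixels_s / (2560 * 1440))       # 0.75 -> a bit under 1440p
```

Under that (crude) assumption, the S lands somewhere below 1440p while running the same content at the same settings, which is exactly the "resolution is the compromise" scenario described above.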

Anyway, you aren't saying anything new; for decades we've seen PC games scale significantly across GPUs.

A powerful graphics card can play RDR2 at 4K/60/Ultra. A lesser card can play it well at 720p/30/Low. Either way you would be playing the exact same game. So if GPU TF is the only disparity, they can likely handle the same large-scale and ambitious games.
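The pixel-throughput arithmetic behind that RDR2 example (illustrative only; it ignores the added shading cost of Ultra vs Low settings, so the real gap between those two targets is even larger):

```python
# Pixel throughput for the two RDR2 configurations quoted above.

def pixels_per_second(width, height, fps):
    """Pixels the GPU must output each second at a given resolution/framerate."""
    return width * height * fps

high = pixels_per_second(3840, 2160, 60)   # 4K / 60 fps
low = pixels_per_second(1280, 720, 30)     # 720p / 30 fps

print(high // low)   # 18 -> 4K/60 pushes 18x the pixels of 720p/30
```

That 18x spread in raw pixel output, all from one codebase, is the kind of scaling range the post is pointing at.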

If all PS4/Xbox One games were designed to release alongside a Switch version with parity in mind, those games would also be seriously compromised and we wouldn't even have games like RDR2 right now. Luckily that's not happening, because ports like The Witcher 3 were developed separately and released much later. It does run, but it can drop to 810x456 in a town, for example. So if that game had been designed with parity in mind, they probably would have had to cut that content across all platforms. The Switch can somewhat get away with 456p because it's a handheld, but good luck playing at that resolution on a 55-inch TV. 

All I'm saying is that any game will have to make concessions if it has to run on a range of different specs, and the high end will never be used to its full potential. Just look at the mid-gen consoles. They were 4 or 5 times more powerful than the base consoles, but they never felt like that big of an improvement. Why? Because developers could never really take advantage of the extra hardware, since they had to aim for parity with the base consoles. In the end we got the exact same games with only a boost in res/framerate. But here's a tech demo of what's possible on a PS4 Pro at 1080p/30fps when it's not held back by the base PS4: https://www.youtube.com/watch?v=Lhpn96bbzkk

Last edited by goopy20 - on 23 March 2020