goopy20 said:
Mr Puggsly said:

How do you keep ignoring the obvious? Again, resolution isn't the only visual setting that can be adjusted.

You're suggesting a 1440p/30 fps game could look amazing on 9th gen specs. So developers have a couple options when supporting a 4TF Series S.

Option 1: They can drop the resolution down to whatever works (maybe dynamic 720p-1080p/30 fps), while maintaining the same graphics settings as Series X/PS5.

Option 2: They can target 1080p/30 fps, but with various graphics settings turned down.

If dropping the resolution alone isn't enough OR developers want to target 1080p, then other visual settings can be reduced.

Hence, if you play games on Series S you may get lower resolution and visual settings. But overall the games will still be a big upgrade over 8th gen visuals, with the exact same gameplay as 9th gen. So you opt for Series S for access to 9th gen games at a lower price, while the fidelity of the presentation could vary depending on developers' choices.

If you disagree with this, fine. But you keep arguing resolution when developers have more options than that, especially when most games are already designed for varying specs.

We've already gone over scaling a million times. Well-optimized console games don't use ultra graphics settings that take too big a hit on performance for relatively little gain in visuals.

You understand that "ultra graphics settings" is just an arbitrary label? Settings that are currently called "ultra" will be called "medium" or "high" in future games. Settings that are currently called "medium" or "high" will be called "low" in future games.

The diminishing returns of current "ultra" settings compared with current "medium" or "high" settings will probably be very similar to the diminishing returns of future "medium" or "high" settings compared with future "low" settings.

goopy20 said:

Fact is, if you have 2 different SKUs with such a gap in specs, one console will always be held back by the other. Like Otto said, Minecraft with path tracing runs at 1080p/30fps on Series X. Now you tell me what looks better: Minecraft running at native 4K or at 1080p with path tracing?

Minecraft and Quake 2 RTX are extreme examples where the ray-traced version needs one to two orders of magnitude more performance than the non-ray-traced version.

So let's have a look at other games which aren't 10-year-old indie games or 20-year-old classics.

  • Battlefield V at native 4K without RT or at 1080p with RT. Now you tell me what looks better.
  • Metro Exodus at native 4K without RT or at 1080p with RT. Now you tell me what looks better.
  • Control at native 4K without RT or at 1080p with RT. Now you tell me what looks better.

That's not so easy to answer, and opinions will differ widely, especially with different TV sizes, viewing distances, and preferences.
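
To put some rough numbers on it (purely illustrative, not benchmarks): dropping from native 4K to 1080p frees about 4x the per-pixel raster work, and that freed budget is roughly what those hybrid RT modes spend. A minimal Python sketch, where the RT cost factor is a pure assumption:

# Rough pixel-budget comparison for "native 4K" vs "1080p + hybrid RT".
# All numbers are illustrative assumptions, not measurements.
RES_4K = 3840 * 2160        # pixels shaded per frame at native 4K
RES_1080P = 1920 * 1080     # pixels shaded per frame at 1080p

# Assume the hybrid RT passes (reflections/GI/shadows) cost about as much
# as shading each 1080p pixel three more times -- an assumed factor.
RT_COST_FACTOR = 3.0

cost_native_4k = RES_4K
cost_1080p_rt = RES_1080P * (1.0 + RT_COST_FACTOR)

print(f"native 4K:  {cost_native_4k:,.0f} pixel-equivalents per frame")
print(f"1080p + RT: {cost_1080p_rt:,.0f} pixel-equivalents per frame")
print(f"ratio:      {cost_1080p_rt / cost_native_4k:.2f}x")

With that assumed factor the two options cost roughly the same per frame, which is exactly why "which looks better" is a judgement call rather than an obvious win for either side.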

goopy20 said:

Now sure, you could say path tracing also takes too big a hit on resources and it's not going to be viable in big AAA games. However, what about indie games? I'm sure they could come up with some pretty cool games that use path tracing as a gameplay mechanic, but throw Series S in the mix and that's already not possible.

And why shouldn't it be possible on Lockhart with reduced resolution and/or reduced number of rays?
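
A quick back-of-envelope sketch of that kind of scaling (the resolutions, sample counts, and bounce counts below are made-up illustrative settings, not anything a real game ships with):

# Sketch: scaling a path-traced workload down to a weaker GPU by
# reducing resolution and/or samples (rays) per pixel.
def rays_per_frame(width, height, samples_per_pixel, bounces):
    # one primary ray per sample, plus one ray per bounce
    return width * height * samples_per_pixel * (1 + bounces)

# Hypothetical big-console config: 1080p, 2 samples per pixel, 2 bounces
big = rays_per_frame(1920, 1080, samples_per_pixel=2, bounces=2)

# Hypothetical Lockhart config: 720p, 1 sample per pixel, 2 bounces
small = rays_per_frame(1280, 720, samples_per_pixel=1, bounces=2)

print(f"big console:   {big:,} rays per frame")
print(f"small console: {small:,} rays per frame ({small / big:.2f}x)")

Dropping to 720p and halving the samples per pixel lands at roughly 0.22x the ray count, comfortably below the ~1/3 compute ratio between the two consoles, and a gameplay mechanic built on those rays works exactly the same either way.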