Mr Puggsly said:
goopy20 said:

That doesn't make sense. Why could, or should, these next gen consoles aim for native 4K when even a 2080 Ti can't hit 60fps in current gen games at that resolution? I'm not even talking about RT, as that would cut the framerate by another 50%. We would practically be playing the same games we're playing now lol.

An obvious problem with your argument and graphs is that you're looking at 4K resolution with the highest graphics settings.

Current gen consoles don't necessarily run games at the highest graphics settings; it's often more like a mix of low, medium and high. They lower the graphics settings primarily to maintain a high resolution.

Look at this video of RDR2 for example. The low and medium settings are probably more reflective of the console settings. The ultra settings are well beyond what we get on consoles and get about half the frame rate of low and medium. Meanwhile, high settings look good and sit comfortably at 60 fps.

Like you, I agree aiming for 4K can be a waste of resources, but aiming for ultra settings can also be a waste when high already looks comparable and runs much better. Another compromise is ultra settings with a dynamic resolution. But if developers do aim for 4K/60 fps, it's evident that 4K with high settings is fine in this scenario.
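For anyone wondering what dynamic resolution actually does under the hood, here's a rough sketch of the usual idea (hypothetical numbers and function names, not any particular engine's implementation): the game watches how long the GPU took on the last frame and nudges the internal render resolution up or down to stay inside the frame budget, then upscales the result to the output resolution.

```python
# Rough sketch of a dynamic resolution controller (illustrative only).
# Assumes a hypothetical engine loop that reports GPU frame time in ms;
# real engines use more sophisticated heuristics than this.

TARGET_MS = 1000.0 / 60.0          # 60 fps frame budget (~16.7 ms)
MIN_SCALE, MAX_SCALE = 0.5, 1.0    # e.g. 1920x1080 .. 3840x2160 internally

def update_render_scale(scale, gpu_frame_ms):
    """Nudge the internal resolution scale toward the frame budget."""
    if gpu_frame_ms > TARGET_MS * 0.95:      # over budget: drop resolution
        scale -= 0.05
    elif gpu_frame_ms < TARGET_MS * 0.80:    # lots of headroom: raise it
        scale += 0.05
    return max(MIN_SCALE, min(MAX_SCALE, scale))

# Example: a heavy scene pushes the GPU to 20 ms, the scale drops, frames
# get cheaper, and the output is upscaled back to 4K. Frame times are made up.
scale = 1.0
for gpu_ms in [20.0, 19.0, 17.5, 15.0, 13.0]:
    scale = update_render_scale(scale, gpu_ms)
    internal = (int(3840 * scale), int(2160 * scale))
    print(f"gpu {gpu_ms:4.1f} ms -> scale {scale:.2f} -> render at {internal}")
```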

That's true, but I already said ultra settings are also an enormous waste of resources. You can take almost any current gen game and bring a 2080 Ti to its knees by using ultra or insane settings. That's why developers usually steer clear of settings like that on consoles. They want to be as efficient as possible and won't use settings that eat up too many resources for a relatively small gain in visuals. It's why things like FXAA and checkerboard rendering exist.
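To put rough numbers on that efficiency point (my own back-of-the-envelope math, assuming shading cost scales roughly with the number of pixels actually rendered per frame, which is a simplification): native 4K is about four times the pixel count of 1080p, while checkerboard rendering at 4K only shades about half of those pixels each frame and reconstructs the rest from the previous frame.

```python
# Back-of-the-envelope pixel counts. Assumes GPU shading cost scales
# roughly with pixels rendered per frame (a simplification).

resolutions = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K":    (3840, 2160),
}

base = 1920 * 1080
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name:>5}: {pixels / 1e6:.2f} MPix  (~{pixels / base:.1f}x the 1080p load)")

# Checkerboard rendering at 4K shades roughly half the pixels per frame
# and reconstructs the rest from the previous frame.
cb_pixels = (3840 * 2160) / 2
print(f"4K checkerboard: ~{cb_pixels / 1e6:.2f} MPix shaded per frame "
      f"(~{cb_pixels / base:.1f}x 1080p)")
```

Which is the whole point: techniques like that get most of the image quality of native 4K for roughly half the pixel cost.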