goopy20 said:
DonFerrari said:

Actually, since the CPU would be the same, we could even see something strange like 720p/60fps on Series S for a 1440p/30fps Series X title, right?

Going from 720p to 1440p will use around twice the resources, whereas Series X is 3 times more powerful. So if you have a game running at the same fps and graphics settings on Series S and X, you would have to drop below 720p (probably 540p) if it's running at 1440p on Series X.

Going from 1280x720 to 2560x1440 doubles both width and height, so it will use around 4x the resources, not 2x!

1280x720 x 30 fps = 27.6 million pixels per second

1920x1080 x 30 fps = 62.2 million pixels per second

2560x1440 x 30 fps = 110.6 million pixels per second
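Anyone who wants to double-check the arithmetic above can run this minimal Python sketch (the pixel_rate helper is just my illustration, not anything from a spec):

```python
def pixel_rate(width, height, fps):
    # Raw throughput in pixels per second: width * height * frames
    return width * height * fps

for w, h in [(1280, 720), (1920, 1080), (2560, 1440)]:
    print(f"{w}x{h} @ 30 fps: {pixel_rate(w, h, 30) / 1e6:.1f} million pixels/s")
# Prints 27.6, 62.2 and 110.6, matching the numbers above.
```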

So if you have a game running at the same fps and graphics settings on Series S and X, you would have to drop below 1080p but could stay above 720p if it's running at 1440p on Series X.

2560x1440 x 30 fps = 110.6 million pixels per second; one third of that = 36.9 million pixels per second

1600x900 x 30 fps = 43.2 million pixels per second

1422x800 x 30 fps = 34.1 million pixels per second

1280x720 x 40 fps = 36.9 million pixels per second
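The same sketch can check which of these fit into one third of the Series X pixel rate. This is only a rough budget model under the "3x more powerful, same settings" assumption from the posts above; the pixel_rate helper is again mine:

```python
def pixel_rate(width, height, fps):
    # Raw throughput in pixels per second (integer math, so comparisons are exact).
    return width * height * fps

# One third of the Series X rate (2560x1440 @ 30 fps) as a rough Series S budget:
budget = pixel_rate(2560, 1440, 30) // 3   # 36,864,000 pixels/s

for w, h, fps in [(1600, 900, 30), (1422, 800, 30), (1280, 720, 40)]:
    rate = pixel_rate(w, h, fps)
    verdict = "fits" if rate <= budget else "over budget"
    print(f"{w}x{h} @ {fps} fps: {rate / 1e6:.1f}M pixels/s -> {verdict}")
# 1600x900 @ 30 comes out over budget; 1422x800 @ 30 and 1280x720 @ 40 fit.
```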

So a dynamic 800p - 900p range could theoretically be possible in this scenario, or a constant 900p with a bit of tweaking.

Or 720p at 40 fps on a 120 Hz display (40 fps paces evenly at three refreshes per frame). Or 800p with 30 - 50 fps VRR...

Last edited by Conina - on 21 March 2020