Conina said:
Going from 1280x720 to 2560x1440 will use around 4x the resources, not 2x the resources!

1280x720 x 30 fps = 27.6 million pixels per second
1920x1080 x 30 fps = 62.2 million pixels per second
2560x1440 x 30 fps = 110.6 million pixels per second

So if you have a game running at the same fps and graphics settings on Series S and X, you would have to drop below 1080p but stay above 720p if it's running at 1440p on Series X.

2560x1440 x 30 fps = 110.6 million pixels per second; one third of that = 36.9 million pixels per second
1600x900 x 30 fps = 43.2 million pixels per second
1422x800 x 30 fps = 34.1 million pixels per second
1280x720 x 40 fps = 36.9 million pixels per second

So dynamic 800p - 900p could be theoretically possible in this scenario. Or constant 900p with a bit of tweaking. Or 720p at 40 fps on a 120 Hz display. Or 800p with 30 - 50 fps VRR...
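The figures in the quote are just width x height x frame rate. A minimal Python sketch (not from the original post) that reproduces them, in millions of pixels per second:

```python
# Pixel throughput = width * height * fps, in millions of pixels/second (Mpx/s).
scenarios = [
    (1280, 720, 30),   # 27.6 Mpx/s
    (1920, 1080, 30),  # 62.2 Mpx/s
    (2560, 1440, 30),  # 110.6 Mpx/s
    (1600, 900, 30),   # 43.2 Mpx/s
    (1422, 800, 30),   # 34.1 Mpx/s
    (1280, 720, 40),   # 36.9 Mpx/s
]
for w, h, fps in scenarios:
    print(f"{w}x{h} @ {fps} fps = {w * h * fps / 1e6:.1f} Mpx/s")

# One third of the 1440p figure, i.e. the quote's rough Series S budget:
print(f"Series S budget: {2560 * 1440 * 30 / 3 / 1e6:.1f} Mpx/s")  # ~36.9
```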
If it were 4x the resources, that would only make my point more obvious. In practice, though, most games run at roughly half the fps at 1440p compared to 720p, and at roughly half the fps at native 4K compared to 1440p: https://www.youtube.com/watch?v=AKUqQhSz210
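To make that sub-linear scaling concrete: if fps only halves while the pixel count quadruples (720p to 1440p), rendering cost grows much more slowly than pixel count. A small illustrative Python sketch, using the fps ratios claimed above rather than measured data, computes the implied exponent k in cost ~ pixels^k:

```python
import math

def cost_exponent(px_lo, px_hi, fps_ratio):
    """Implied k in cost ~ pixels**k, given how much fps drops."""
    return math.log(fps_ratio) / math.log(px_hi / px_lo)

px_720p = 1280 * 720
px_1440p = 2560 * 1440
px_4k = 3840 * 2160

# 720p -> 1440p: 4x the pixels, ~2x the cost  =>  k ~ 0.50
print(cost_exponent(px_720p, px_1440p, 2.0))
# 1440p -> 4K: 2.25x the pixels, ~2x the cost  =>  k ~ 0.85
print(cost_exponent(px_1440p, px_4k, 2.0))
```

An exponent well below 1 at the low end is consistent with a chunk of frame time being resolution-independent (geometry, simulation, CPU work), which is why dropping resolution alone buys less fps than a pure pixel-count ratio would suggest.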







