Too_Talls said:
Their guess is that it still runs at about 1200p, which is more than enough. 1080p has been the standard for a while; the option to output a higher resolution has existed just as long, yet most people stuck to 1080p with more frames rather than bumping up the res. And trust me, if the option existed to lower that res even further to push more frames, many people would take it, but unfortunately these boxes cap out at 120Hz. Ask yourself why 1080p 240Hz monitors even exist.
I was thinking about the Series S, which drops as low as 576p in one game that offers 120fps.
They exist for motion clarity. I don't buy that a 4 or 8ms latency advantage matters much next to ~200ms of human reaction time. Higher fps mainly creates a more stable picture with better motion clarity, making it much easier to identify and track threats. For the same reason, turn off motion blur, dof, bloom, lens flare, film grain, vignette, dirty lens, screen shake, head bob, particle effects, anything that distracts from seeing the enemy. Even lower-res textures can help.
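For reference, those 4 and 8ms figures are just the frame-time deltas between the usual refresh tiers. A quick back-of-the-envelope sketch in Python (the ~200ms reaction time is the common ballpark figure, not a measured value):

    # Frame time in ms at a given refresh rate
    def frame_time_ms(hz):
        return 1000.0 / hz

    for low, high in [(60, 120), (120, 240)]:
        delta = frame_time_ms(low) - frame_time_ms(high)
        print(f"{low}Hz -> {high}Hz: frame time drops by {delta:.1f} ms")

    # Output:
    # 60Hz -> 120Hz: frame time drops by 8.3 ms
    # 120Hz -> 240Hz: frame time drops by 4.2 ms

Both deltas are tiny next to ~200ms of reaction time, which is why motion clarity is the stronger argument for high refresh rates.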
Dropping resolution should be a last resort to stay competitive: fps > resolution >>>>> eye candy when it comes to competitive multiplayer. Screen size helps as well. I'm faster in racing games on bigger screens (or when I sit closer), and fastest in VR despite the crap resolution. Peripheral vision helps a lot for knowing exactly how far you are from the side and how fast you are moving laterally. However, the resolution of VR is too low for shooters.
Anyway, 1200p is indeed plenty. I wonder if CoD separates the modes in matchmaking or otherwise lists who is playing on/with what. It would be interesting to see stats on whether 120fps players really have an advantage over 60fps players, and by how much.