CGI-Quality said:
First, it’s false that ‘almost nobody plays in 4K on PC’. Plenty of PC users do. Next, the reason a good few more monitors are 1440p is pricing, and the fact that most available hardware won’t do 4K/144Hz justice (unless you build like I do). That still doesn’t change the fact that there are a lot of 4K monitors out there. Besides, you can’t compare next gen (or any console, for that matter) to PC 1:1. For starters, consoles skip 1440p because of content (movies, Netflix, etc.). TV manufacturers are pushing 4K, and since most console users are on TVs, that’s what they’re aiming for.
Well, obviously there are exceptions, but according to the Steam hardware survey, only 1.9% of its users are gaming at native 4K, while over 80% are using 1080p or 1440p. But my question was more about game development: wouldn't native 4K eat up far too many resources and limit developers' ambitions for what they can do on these next-gen consoles?







