goopy20 said:
I'm just curious what your thoughts are on using native 4K for next-gen console games. Wouldn't that be a massive waste of resources compared to 1440p? I mean, there has to be a reason why almost nobody plays PC games in native 4K and why high-end PC monitors are mostly 1440p.
4K is going to be far more achievable next gen and won't come at such a big cost to visual fidelity.
The consoles will have the fillrate to handle it fine for the most part.
The tradeoff between 1440P and 2160P varies from person to person and game to game.
1440P is certainly a step up over 1080P: it's sharper, cleaner, and not as resource-heavy as 4K, which gives headroom to dial up the visuals, but aliasing/stair-stepping/shimmering will be more prevalent than at 4K.
Micro-details in texture work aren't likely to pop as much either.
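To put rough numbers on the resource question, here's a back-of-the-envelope sketch (pixel counts only; real per-frame cost depends on far more than raw pixel count):

```python
# Raw pixel counts per frame; fill/shading cost scales roughly with these.
resolutions = {"1080P": (1920, 1080), "1440P": (2560, 1440), "2160P": (3840, 2160)}

pixels = {name: w * h for name, (w, h) in resolutions.items()}
for name, count in pixels.items():
    print(f"{name}: {count:>9,} pixels ({count / pixels['1080P']:.2f}x 1080P)")

# 1080P: 2,073,600 pixels (1.00x 1080P)
# 1440P: 3,686,400 pixels (1.78x 1080P)
# 2160P: 8,294,400 pixels (4.00x 1080P)
```

Native 2160P pushes 2.25x the pixels of 1440P, and that 2.25x is exactly the headroom a developer could spend on ray tracing or other effects instead.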
Resolution and framerates will be entirely up to developers; those chasing the graphics dream will likely opt for 1440P + ray tracing @ 30fps, I would imagine.
The reasons why a lot of PC displays are only 1440P are many... It's a good price/performance level, which lets manufacturers focus on refresh rates and contrast ratios rather than pure pixel counts... Plus 2160P on a 24" panel is an insane pixel density, so keeping that resolution for larger panels makes more sense. Display processing today is also well equipped to handle 1440P, which keeps input latency down, and you notice input latency more readily on PC because the mouse is such a precise form of control.
Also, plenty of gamers use 4K. (Steam isn't the entire PC community; it's just a large sample.)
Many PC gamers also render games at 4K and downsample them to lower resolutions like 1080P and 1440P.
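If you've never seen what that downsampling step actually does, here's a minimal sketch, assuming a 4K frame rendered off-screen and a plain 2x2 box filter (driver features like DSR/VSR use fancier resampling):

```python
import numpy as np

# Stand-in for a frame rendered off-screen at 4K (height x width x RGB).
frame_4k = np.random.rand(2160, 3840, 3).astype(np.float32)

# 3840x2160 -> 1920x1080 is an exact 2:1 ratio per axis, so averaging
# each 2x2 block of pixels is enough. (4K -> 1440P is a non-integer
# 1.5x ratio and needs a proper resampling filter instead.)
frame_1080p = frame_4k.reshape(1080, 2, 1920, 2, 3).mean(axis=(1, 3))

print(frame_1080p.shape)  # (1080, 1920, 3)
```

The averaging is what buys the anti-aliasing: each output pixel blends four rendered samples, which is why supersampled 1080P looks so much cleaner than native 1080P.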
Radek said:
1. Because 90% of PC gamers play on monitors, and a monitor would have to be at least 40 inches to fully utilize the extra pixels; that is hard to fit on most desks and still be able to use keyboard and mouse.
2. Because 90% of PC gamers want to play at 60 fps or more, so it's better to stick with 1440p, while 90% of console games run at 30 fps regardless of the resolution.
Apples and oranges: just because most PC players prefer 1440p at 60-144 fps doesn't mean console players don't want 4K 30 fps on their 55" TVs.
Personally, I can tell the difference between 1440P and 2160P on a 27-31.5" panel, granted it's not going to be as big of a perceivable difference as the move from 1080P to 1440P at the same panel size... For instance, I wanted to stab my eye with a pitchfork once when I had to use a 1080P 27" panel.
At around 21-24" you might as well stick with 1080P.
25-27" panels, 1440P is a really good fit.
At 32" and larger, 2160P is fantastic.
You can get 32" 1440P panels... And I have one; it has roughly the same pixel density as a 24" 1080P panel (see the quick check below), so it's doable, but I would hope anyone purchasing such a panel is doing so not because of the resolution but because of the refresh rates instead...
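Those density figures are easy to verify with the standard pixels-per-inch formula; a quick sketch:

```python
import math

def ppi(width_px, height_px, diagonal_inches):
    """Pixels per inch: diagonal pixel count divided by diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_inches

print(round(ppi(3840, 2160, 24)))  # 184 -> 2160P on 24" is insanely dense
print(round(ppi(2560, 1440, 27)))  # 109 -> 1440P is a good fit at 27"
print(round(ppi(2560, 1440, 32)))  # 92  -> 32" 1440P...
print(round(ppi(1920, 1080, 24)))  # 92  -> ...matches a 24" 1080P panel
```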
PC displays are also a little different from televisions: there is less emphasis on post-processing and a larger emphasis on keeping input latencies as low as possible... As the venerable mouse is a highly accurate input method, manufacturers will often prioritize things differently.

www.youtube.com/@Pemalite