goopy20 said:
It wasn't a trick question here. I just asked, from a game development point of view, wouldn't native 4K suck up too many resources and limit the ambitions of developers on next-gen console games? Maybe for some people halving the overall visual fidelity in favor of 4K or a higher fps isn't a waste of resources, for example if you're gaming on a 144Hz monitor. But we've already seen with the X1X that saying "this plays the same games, but better" just doesn't fire up consumers the way the promise of completely new games that take a dramatic leap from current-gen titles does. Obviously there are games that do run at 4K/60fps on a $1300 2080 Ti, but there are already plenty that don't. So let's take RDR as an example: how is Rockstar supposed to push things even further with RDR3 if the game is designed from the ground up to run at native 4K with RT on PS5? Roughly speaking, the boost in resolution and RT (depending on how AMD's RT cores perform) would leave Rockstar with about the same resources left for making the actual game as they had when they were making RDR2 for the PS4. So let me just ask you another question: what kind of leap in overall visual fidelity are you expecting from these next-gen games?
I completely agree with you: 1080p is more than enough for anything up to 32 inches, and 1440p is more than enough for anything up to 120 inches.
We will not get much benefit from 4K; they should use the resources to improve graphics and performance in games. For example, would people prefer Red Dead 2's graphics at 1440p or Red Dead 1 at 4K? I'd prefer the graphics, because Red Dead 1 at 4K on my Xbox One X doesn't look better than Red Dead 2 on my PS4.
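Just to put rough numbers on that trade-off, here's a back-of-the-envelope sketch of my own (plain Python, nothing from a real engine), assuming GPU cost scales roughly with pixels shaded per frame, which is only a first approximation:

```python
# Pixel counts per frame at common resolutions, relative to 1080p.
# Assumption (mine): shading cost grows roughly linearly with pixel count.
RESOLUTIONS = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K":    (3840, 2160),
}

base = 1920 * 1080
for name, (w, h) in RESOLUTIONS.items():
    pixels = w * h
    print(f"{name}: {pixels:,} pixels ({pixels / base:.2f}x the work of 1080p)")

# 1080p: 2,073,600 pixels (1.00x the work of 1080p)
# 1440p: 3,686,400 pixels (1.78x the work of 1080p)
# 4K: 8,294,400 pixels (4.00x the work of 1080p)
```

Jumping from 1080p to native 4K alone soaks up roughly 4x the pixel work before a single new graphical feature is added, which is exactly the budget problem described above.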
Resolution is the new megapixels: it's all about big numbers to upsell.
For me, having experience with PCs, I know for a fact that I prefer 16x AA over a higher resolution; 16x AA gives a cleaner picture than 4K with no AA.
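And on the AA point, here's a rough sketch of my own (it assumes brute-force supersampling, the most expensive form of AA; MSAA and temporal AA shade far fewer samples in practice):

```python
# Shaded samples per frame. SSAA shades every sample, so this is the
# worst case for the AA side of the comparison.
def shaded_samples(width: int, height: int, ssaa_factor: int = 1) -> int:
    return width * height * ssaa_factor

native_4k   = shaded_samples(3840, 2160)      # 8,294,400 samples, no AA
ssaa4_1080p = shaded_samples(1920, 1080, 4)   # 8,294,400 samples, 4x SSAA
print(native_4k == ssaa4_1080p)  # True: identical shading work
```

4x SSAA at 1080p shades exactly as many samples as native 4K with no AA, and it's the anti-aliased image that gets the clean edges; MSAA modes get most of that edge quality for far less work.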