CGI-Quality said:
goopy20 said:
It wasn't a trick question. I just asked, from a game development point of view, wouldn't native 4k eat up too many resources and limit the ambitions of developers on next gen console games? Maybe for some people trading half the overall visual fidelity for 4k or a higher fps isn't a waste of resources, for example if you're gaming on a 144hz monitor. But we've already seen with the X1X that saying "this plays the same games but better" just doesn't fire up consumers the way the promise of completely new games that take a dramatic leap from current gen titles does.
Obviously there are games that do run at 4k/60fps on a $1300 2080Ti, but there are already plenty that don't. So let's take RDR as an example: how is Rockstar supposed to push things even further with RDR3 if the game has to be designed from the ground up to run in native 4k with RT on ps5? Roughly speaking, the cost of the resolution boost and RT (depending on how AMD's RT cores perform) would leave Rockstar with about the same resources for making the actual game as they had when they were making RDR2 for the ps4.
So let me just ask you another question: what kind of leap in overall visual fidelity are you expecting from these next gen games?
You didn’t get a trick answer. It’s not a waste of resources, no matter how you try to paint it. It was also a ridiculous notion to suggest that a 2080Ti (don’t try to throw in the price, since you brought it up in the first place) can’t run any games in 4K/60. Quite clearly it can, and you’ve now backtracked. The need to compare PC to console 1:1 will continue to be your undoing in these debates.
As for ‘leap in fidelity’ - a notable one. I have a thread dedicated to this.
I'm using the 2080Ti as an example because it's the most powerful, most expensive gpu you can buy right now, and performance-wise they are saying Series X is a pretty close match. For me that's something to get excited about, and I'm also expecting a big leap. However, even a 2080Ti can hit its limit pretty quickly when we're talking about native 4k, RT and ultra settings.
I never said a 2080Ti can't run any current gen game in native 4k/60fps, I said there are already some games that don't. So assuming Rockstar wants to push visual fidelity even further than RDR2, for example, wouldn't that be pretty hard when half the resources are "wasted" on native 4k? It's a design choice: either they pick the biggest leap in fidelity possible and figure out the resolution later, or they aim for native 4k from the start and use whatever resources are left to build their game. I'm not saying native 4k is terrible by definition: if a developer can get everything they envisioned into their game and still have the headroom to run it in 4k, then great. However, I think it would suck if native 4k were a mandatory design choice from the start and got in the way of ambitions for next gen titles in general.
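Just to show the back-of-the-envelope arithmetic behind the "about the same resources" point (the GPU uplift number here is an assumed placeholder for illustration, not a benchmark of any real console): native 4k pushes roughly four times the pixels of 1080p, so most of a generational GPU jump can get eaten by the resolution bump alone before per-pixel fidelity improves at all.

```python
# Rough, illustrative arithmetic only -- the GPU uplift factor below is an
# assumed placeholder, not a measured figure for any real console.

PIXELS_1080P = 1920 * 1080   # typical base-console target this gen
PIXELS_4K    = 3840 * 2160   # native 4k target

ASSUMED_GPU_UPLIFT = 5.0     # hypothetical next-gen vs current-gen throughput

pixel_ratio = PIXELS_4K / PIXELS_1080P             # = 4.0
per_pixel_budget_left = ASSUMED_GPU_UPLIFT / pixel_ratio

print(f"Native 4k renders {pixel_ratio:.1f}x the pixels of 1080p")
print(f"Per-pixel budget left after the resolution jump: "
      f"{per_pixel_budget_left:.2f}x current gen")
```

On those assumed numbers, a 5x faster GPU targeting native 4k is left with only about 1.25x the per-pixel budget of current gen, which is the point being argued above; RT cost, dynamic resolution or reconstruction techniques would obviously shift that figure either way.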