CGI-Quality said:
I'm using the 2080 Ti as an example because it's the most powerful, most expensive GPU you can buy right now, and performance-wise they're saying the Series X is a pretty close match. For me that's something to get excited about, and I'm also expecting a big leap. However, even a 2080 Ti can hit its limit pretty quickly when we're talking about native 4K, ray tracing, and ultra settings.
I never said a 2080 Ti can't run any current-gen game at native 4K/60fps; I said there are already some games that don't. So assuming Rockstar will want to push visual fidelity even further than RDR2, for example, wouldn't that be pretty hard when half the resources are "wasted" on native 4K? It's a design choice: either they pick the biggest leap in fidelity possible and figure out the resolution later, or they aim for native 4K from the start and use whatever resources are left to build their game. I'm not saying native 4K is terrible by definition; if a developer can get everything they envisioned into their game and still have the headroom to run it at 4K, then great. However, I think it would suck if native 4K were a mandatory design choice from the start and it got in the way of ambitions for next-gen titles in general.