goopy20 said:
If a game is designed with parity in mind, how would they then be able to really push and optimize for an RTX 2080 if it also has to run and look smooth on a GTX 1050? I can't believe we are even arguing this, but here's an interesting read, straight from the mouth of an AAA developer... "One of the first things that you have to address when developing a game is, what is your intended target platform? If the answer to that question is "multiple", you are effectively locking yourself in to compromising certain aspects of the game to ensure that it runs well on all of them. It's no good having a game that runs well on PS3 but chugs on Xbox 360, so you have to look at the overall balance of the hardware. As a developer, you cannot be driven by the most powerful console, but rather the high middle ground that allows your game to shine and perform across multiple machines.
When you are developing a game, getting to a solid frame-rate is the ultimate goal. It doesn't matter how pretty your game looks, or how many players you have on screen, if the frame-rate continually drops, it knocks the player out of the experience and back to the real world, ultimately driving them away from your game if it persists. Maintaining this solid frame-rate drives a lot of the design and technical decisions made during the early phases of a game project. Sometimes features are cut not because they cannot be done, but because they cannot be done within the desired frame-rate." https://www.eurogamer.net/articles/digitalfoundry-the-secret-developers-what-hardware-balance-actually-means-for-game-creators
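That frame-budget point is real, but it's usually handled with scalable systems at runtime, not by capping what high-end cards can do. Here's a toy C++ sketch of the idea; the class name, thresholds, and numbers are all made up for illustration, not taken from any real engine:

#include <algorithm>
#include <cstdio>
#include <initializer_list>

// Hypothetical controller: holds a per-frame time budget and trades
// optional quality against it, instead of cutting content for everyone.
class QualityController {
public:
    explicit QualityController(double budget_ms) : budget_ms_(budget_ms) {}

    // Called once per frame with the measured frame time. Over budget:
    // nudge a 0..1 quality scalar down; comfortable headroom: creep it
    // back up slowly.
    void Update(double frame_ms) {
        if (frame_ms > budget_ms_ * 1.05)
            quality_ = std::max(0.5, quality_ - 0.05);
        else if (frame_ms < budget_ms_ * 0.85)
            quality_ = std::min(1.0, quality_ + 0.01);
    }

    // The article's "features cut to hold frame-rate" case: optional
    // effects simply switch off when the scalar drops below a threshold.
    bool VolumetricsEnabled() const { return quality_ > 0.8; }
    double RenderScale() const { return quality_; }  // dynamic resolution

private:
    double budget_ms_;
    double quality_ = 1.0;
};

int main() {
    QualityController qc(16.6);  // 60 fps target
    for (double ms : {14.0, 18.0, 19.0, 15.5}) {
        qc.Update(ms);
        std::printf("frame %.1f ms -> render scale %.2f, volumetrics %s\n",
                    ms, qc.RenderScale(),
                    qc.VolumetricsEnabled() ? "on" : "off");
    }
}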
Maybe we should put that question to the RDR2 developers, who built their game to run on a GTX 770, or the FH4 developers, who targeted a GTX 650 Ti, or the MSFS developers, who also targeted a GTX 770. I guess none of those games take advantage of an RTX 2080 Ti's power, since they can also run on such low-end hardware.
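The way those games square a GTX 770 min spec with an RTX 2080 Ti is just tiered settings: same game either way, the faster card just gets the headroom spent on shadows, draw distance and effects. A made-up C++ sketch of the idea, with purely illustrative tiers and numbers:

#include <cstdio>

struct Preset {
    int shadow_resolution;
    float draw_distance;
    bool volumetric_clouds;
    bool high_detail_crowds;
};

// Pick a preset by GPU class (here crudely keyed on VRAM). Min spec
// still gets the full game; high end gets the extras turned on.
Preset PresetForVramMb(int vram_mb) {
    if (vram_mb >= 11000)                  // e.g. RTX 2080 Ti class
        return {4096, 2000.0f, true, true};
    if (vram_mb >= 4000)                   // mid-range class
        return {2048, 1200.0f, true, false};
    return {1024, 600.0f, false, false};   // e.g. GTX 770 class min spec
}

int main() {
    for (int vram : {2048, 6144, 11264}) {
        Preset p = PresetForVramMb(vram);
        std::printf("%5d MB VRAM -> shadows %d, draw %.0f, clouds %d, crowds %d\n",
                    vram, p.shadow_resolution, p.draw_distance,
                    p.volumetric_clouds, p.high_detail_crowds);
    }
}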