
Phil Spencer Says Xbox Series X Games Aren't Being Held Back By Xbox One

chakkra said:
SvennoJ said:

So they spend 12 months moving a slider down for the Switch version...

You know, I'm not a developer myself, but I'm gonna go out on a limb and say that developing a game to run on a range of GPUs from an RTX 2080 down to a GTX 1050 might be a little bit different than porting a game from PC to Switch.

If a game is designed with parity in mind, how would they then be able to really push and optimize for an RTX 2080 if it also has to run and look smooth on a GTX 1050? I can't believe we are even arguing this, but here's an interesting read, straight from the mouth of a AAA developer...

"One of the first things that you have to address when developing a game is, what is your intended target platform? If the answer to that question is "multiple", you are effectively locking yourself in to compromising certain aspects of the game to ensure that it runs well on all of them. It's no good having a game that runs well on PS3 but chugs on Xbox 360, so you have to look at the overall balance of the hardware. As a developer, you cannot be driven by the most powerful console, but rather the high middle ground that allows your game to shine and perform across multiple machines.

When you are developing a game, getting to a solid frame-rate is the ultimate goal. It doesn't matter how pretty your game looks, or how many players you have on screen, if the frame-rate continually drops, it knocks the player out of the experience and back to the real world, ultimately driving them away from your game if it persists. Maintaining this solid frame-rate drives a lot of the design and technical decisions made during the early phases of a game project. Sometimes features are cut not because they cannot be done, but because they cannot be done within the desired frame-rate."

https://www.eurogamer.net/articles/digitalfoundry-the-secret-developers-what-hardware-balance-actually-means-for-game-creators
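
The budget logic that quote describes is easy to sketch. A minimal illustration in C++ (the platform names, frame times and 5 ms feature cost below are made-up numbers for the sake of the example, not figures from the article):

#include <cstdio>
#include <string>
#include <vector>

// Hypothetical per-platform frame-budget check: a feature only ships if every
// target platform can still hit its frame-time budget with the feature enabled.
struct Platform {
    std::string name;
    double frameBudgetMs;   // e.g. 33.3 ms for 30 fps, 16.7 ms for 60 fps
    double baseFrameMs;     // estimated cost of the scene without the feature
};

int main() {
    std::vector<Platform> targets = {
        {"BaseConsole",    33.3, 30.0},   // 30 fps target, already near the budget
        {"HighEndConsole", 33.3, 14.0},   // same target, plenty of headroom
    };
    double featureCostMs = 5.0;           // hypothetical extra cost of a proposed feature

    for (const auto& p : targets) {
        bool fits = p.baseFrameMs + featureCostMs <= p.frameBudgetMs;
        std::printf("%-15s: %s\n", p.name.c_str(),
                    fits ? "feature fits the frame budget"
                         : "feature must be scaled down or cut");
    }
    return 0;
}

Run per platform, the same check is why a feature that fits comfortably on the strongest target can still be cut for the sake of the weakest one.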



goopy20 said:
chakkra said:

You know, I'm not a developer myself, but I'm gonna go out on a limb and say that developing a game to run on a range of GPUs from an RTX 2080 down to a GTX 1050 might be a little bit different than porting a game from PC to Switch.

If a game is designed with parity in mind, how would they then be able to really push and optimize for an RTX 2080 if it also has to run and look smooth on a GTX 1050? I can't believe we are even arguing this, but here's an interesting read, straight from the mouth of a AAA developer...

"One of the first things that you have to address when developing a game is, what is your intended target platform? If the answer to that question is "multiple", you are effectively locking yourself in to compromising certain aspects of the game to ensure that it runs well on all of them. It's no good having a game that runs well on PS3 but chugs on Xbox 360, so you have to look at the overall balance of the hardware. As a developer, you cannot be driven by the most powerful console, but rather the high middle ground that allows your game to shine and perform across multiple machines.

When you are developing a game, getting to a solid frame-rate is the ultimate goal. It doesn't matter how pretty your game looks, or how many players you have on screen, if the frame-rate continually drops, it knocks the player out of the experience and back to the real world, ultimately driving them away from your game if it persists. Maintaining this solid frame-rate drives a lot of the design and technical decisions made during the early phases of a game project. Sometimes features are cut not because they cannot be done, but because they cannot be done within the desired frame-rate."

https://www.eurogamer.net/articles/digitalfoundry-the-secret-developers-what-hardware-balance-actually-means-for-game-creators

Maybe we should ask that question of the RDR2 developers, who developed their game to run on a GTX 770, or the FH4 developers, who developed their game to run on a GTX 650 Ti, or the MSFS developers, who developed their game to run on a GTX 770. I guess none of those games take advantage of the RTX 2080 Ti's power, since they can also run on such low-end hardware.



goopy20 said:

Parity is a thing developers need to work around, as they need to hit certain performance targets on the lowest common denominator. Halo Infinite, for example, can't be running at 360p and look like dog shit on the Xone. It has to hit at least 1080p/30fps, with all the visuals and core game elements intact from the Series X version.

goopy20 said:

Like I said, scalability of the engines and hardware won't hold Series X back; parity with Xbox, Lockhart and low-end PCs will. With every design idea they will have to ask themselves "will this run at 30fps/1080p on a Jaguar CPU, a 1.3 Tflops GPU and an HDD too?" If the answer is no, then they just have to scale it down across the board or remove it altogether.

goopy20 said:

If a game is designed with parity in mind, how would they then be able to really push and optimize for an RTX 2080 if it also has to run and look smooth on a GTX 1050? I can't believe we are even arguing this, but here's an interesting read, straight from the mouth of a AAA developer...

So you are pushing your parity fairy tale again? We had this discussion before. Parity = the same / equal.

We have parity if a game looks and runs the same on system A and system B, with the game wasting the additional performance of the faster system.

But games on Xbox One S and Xbox One X don't look the same (unless the developer was too lazy for an X enhancement patch... and even then there are performance differences).

Games on PS4 and PS4 Pro don't look the same (unless the developer was too lazy for a Pro patch... and even then there are performance differences).

Games on Xbox One S and PS4 don't look the same (the PS4 version almost always looks and/or performs better).

Games on Xbox One X and PS4 Pro don't look the same (the XBO X version almost always looks and/or performs better).

And games definitely don't look the same on a GTX 1050 and an RTX 2080 Ti!

Oh, and more and more games have dynamic resolutions / effects... in these cases parity ain't even possible anymore.

Last edited by Conina - on 14 July 2020
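
On Conina's last point, dynamic resolution is typically a feedback loop on GPU frame time: if a frame comes in over budget, the internal render resolution drops for the next one. A minimal sketch, assuming GPU cost scales roughly with pixel count (the 50% floor, 33.3 ms target and per-frame timings are made-up values, not any particular engine's API):

#include <algorithm>
#include <cmath>
#include <cstdio>

// Nudge the internal render scale so the measured GPU frame time converges on the budget.
double updateRenderScale(double scale, double gpuMs, double targetMs) {
    // GPU cost is assumed roughly proportional to pixel count, i.e. to scale^2.
    scale *= std::sqrt(targetMs / gpuMs);
    return std::clamp(scale, 0.5, 1.0);   // e.g. never drop below 50% of output resolution
}

int main() {
    double scale = 1.0;
    double measuredGpuMs[] = {40.0, 36.0, 31.0, 28.0};   // made-up per-frame GPU times
    for (double gpuMs : measuredGpuMs) {
        scale = updateRenderScale(scale, gpuMs, 33.3);
        std::printf("render scale: %.2f\n", scale);
    }
    return 0;
}

Because the render resolution is chosen frame by frame on each machine, two systems running the same game can't produce pixel-identical output in the first place, which is the point about parity above.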

goopy20 said:
chakkra said:

You know, I'm not a developer myself, but I'm gonna go out on a limb and say that developing a game to run on a range of GPUs from an RTX 2080 down to a GTX 1050 might be a little bit different than porting a game from PC to Switch.

If a game is designed with parity in mind, how would they then be able to really push and optimize for an RTX 2080 if it also has to run and look smooth on a GTX 1050? I can't believe we are even arguing this, but here's an interesting read, straight from the mouth of a AAA developer...

"One of the first things that you have to address when developing a game is, what is your intended target platform? If the answer to that question is "multiple", you are effectively locking yourself in to compromising certain aspects of the game to ensure that it runs well on all of them. It's no good having a game that runs well on PS3 but chugs on Xbox 360, so you have to look at the overall balance of the hardware. As a developer, you cannot be driven by the most powerful console, but rather the high middle ground that allows your game to shine and perform across multiple machines.

When you are developing a game, getting to a solid frame-rate is the ultimate goal. It doesn't matter how pretty your game looks, or how many players you have on screen, if the frame-rate continually drops, it knocks the player out of the experience and back to the real world, ultimately driving them away from your game if it persists. Maintaining this solid frame-rate drives a lot of the design and technical decisions made during the early phases of a game project. Sometimes features are cut not because they cannot be done, but because they cannot be done within the desired frame-rate."

https://www.eurogamer.net/articles/digitalfoundry-the-secret-developers-what-hardware-balance-actually-means-for-game-creators

Oh man, I love when goopzilla posts DF articles, because he always accidentally debunks his own FUD when he does. Like when he was trying to tell us Infamous 3 counts as a "real next gen game" because it allowed gameplay and design possibilities not possible on PS3 and linked a DF article that praised the graphics. But in typical goopy fashion he ignored the part where they said that, design-wise, it was just a prettier PS3 Infamous. And so began the flip to the narrative he uses now, with graphics being the only thing that matters. Except for XSX games; then graphics don't matter.

In this article we see numerous goopisms debunked. Like 60fps not being important, 60fps not pushing hardware, and developers having to start at a base of an Xbone or "potato PC" when designing first-party XSX games. Also the idea that you can't massively increase performance simply by dropping resolution.

Please reference DF more, goop.



chakkra said:
goopy20 said:

If a game is designed with parity in mind, how would they then be able to really push and optimize for an RTX 2080 if it also has to run and look smooth on a GTX 1050? I can't believe we are even arguing this, but here's an interesting read, straight from the mouth of a AAA developer...

"One of the first things that you have to address when developing a game is, what is your intended target platform? If the answer to that question is "multiple", you are effectively locking yourself in to compromising certain aspects of the game to ensure that it runs well on all of them. It's no good having a game that runs well on PS3 but chugs on Xbox 360, so you have to look at the overall balance of the hardware. As a developer, you cannot be driven by the most powerful console, but rather the high middle ground that allows your game to shine and perform across multiple machines.

When you are developing a game, getting to a solid frame-rate is the ultimate goal. It doesn't matter how pretty your game looks, or how many players you have on screen, if the frame-rate continually drops, it knocks the player out of the experience and back to the real world, ultimately driving them away from your game if it persists. Maintaining this solid frame-rate drives a lot of the design and technical decisions made during the early phases of a game project. Sometimes features are cut not because they cannot be done, but because they cannot be done within the desired frame-rate."

https://www.eurogamer.net/articles/digitalfoundry-the-secret-developers-what-hardware-balance-actually-means-for-game-creators

Maybe we should ask that question of the RDR2 developers, who developed their game to run on a GTX 770, or the FH4 developers, who developed their game to run on a GTX 650 Ti, or the MSFS developers, who developed their game to run on a GTX 770. I guess none of those games take advantage of the RTX 2080 Ti's power, since they can also run on such low-end hardware.

All the games we're playing today are targeting the PS4 as the base console, which has a GPU that's the equivalent of a GTX 650 Ti. So yeah, of course a PC with that exact same GPU will run them as well. Truth is that an RTX 2080 Ti is complete overkill for multi-platform games. That is why you can play current-gen games on it at 4K and like 200fps in the first place. Next gen we should finally see what a high-end GPU like that can really do, when we get games that aren't designed around the ancient GTX 750. Which should be a lot more than just PS4 games at 200fps. At least I hope so, else next gen will be pretty damn boring.



Conina said:
goopy20 said:

Parity is a thing developers need to work around, as they need to hit certain performance targets on the lowest common denominator. Halo Infinite, for example, can't be running at 360p and look like dog shit on the Xone. It has to hit at least 1080p/30fps, with all the visuals and core game elements intact from the Series X version.

goopy20 said:

Like I said, scalability of the engines and hardware won't hold Series X back; parity with Xbox, Lockhart and low-end PCs will. With every design idea they will have to ask themselves "will this run at 30fps/1080p on a Jaguar CPU, a 1.3 Tflops GPU and an HDD too?" If the answer is no, then they just have to scale it down across the board or remove it altogether.

goopy20 said:

If a game is designed with parity in mind, how would they then be able to really push and optimize for an RTX 2080 if it also has to run and look smooth on a GTX 1050? I can't believe we are even arguing this, but here's an interesting read, straight from the mouth of a AAA developer...

So you are pushing your parity fairy tale again? We had this discussion before. Parity = the same / equal.

We have parity if a game looks and runs the same on system A and system B, with the game wasting the additional performance of the faster system.

But games on Xbox One S and Xbox One X don't look the same (unless the developer was too lazy for an X enhancement patch... and even then there are performance differences).

Games on PS4 and PS4 Pro don't look the same (unless the developer was too lazy for a Pro patch... and even then there are performance differences).

Games on Xbox One S and PS4 don't look the same (the PS4 version almost always looks and/or performs better).

Games on Xbox One X and PS4 Pro don't look the same (the XBO X version almost always looks and/or performs better).

And games definitely don't look the same on a GTX 1050 and an RTX 2080 Ti!

Oh, and more and more games have dynamic resolutions / effects... in these cases parity ain't even possible anymore.

That's not what parity means, dude. Of course Xone games will run and look better on Series X. But it's like you said: it will be the difference between a Series S and an Xbox One X game, except on steroids. Parity simply means that you're getting the same core game experience on all platforms, and everything is designed so that it can hit 30fps and 1080p on the lowest common denominator (Series S).

If you think this is a big enough upgrade to call it a generational leap, then fine. But to me you're still getting the same levels, NPCs, physics and overall game experience on a 12 Tflops RTX 2080 as on a 1.3 Tflops Series S.

Call me crazy, but to me a generational leap isn't just a bump in graphics settings, frame rate and resolution. It's about whole new experiences and immersion thanks to a leap in geometry, level design, AI, physics, etc. that wouldn't be possible on 7-year-old hardware. And the difference typically looks something like this.

Last edited by goopy20 - on 14 July 2020

DonFerrari said:
zero129 said:

Except the Series S is still a next-gen system. It's still using the new RDNA2 chipset, it's still using the same CPU as the Series X, the same SSD speeds, etc.

It will have no problems playing all next-gen games at a lower res, and I already showed you how Nvidia has tech that can take a 540p image and upscale it to 1080p while also looking better than a native 1080p image, and a 720p image up to 1440p. MS and AMD have also been working on such tech, and it makes perfect sense why, with the Series S targeting 1080-1440p and the Series X targeting 4K, as such tech will allow both consoles to hit those targets at a fraction of the cost.

But I understand why you're downplaying this a lot, goopy, as you have a lot to be worried about. Like I said in another thread, if parents, low-income gamers and casual gamers (the biggest majority of gamers, pretty much) see Series S multi-plat games such as GTA6, CoD, FIFA, etc. running at 1080p on Series S and looking just as good (to them) as the PS5 versions... I mean, you did say before too that people won't care about a few extra pixels or more shadow detail if the game looks pretty much the exact same to the casuals. And if the Series S costs $199-249 like the rumors say, vs a $449 PS5, I know what console they will be picking up, and I think so do you. MS doesn't care if you buy Series S or Series X, so it doesn't matter to them if Series X is mostly for the hardcore gamer who wants the best graphics.

But like I said, Series S is a next-gen console, just for 1080p TV owners.

I find it very unlikely that DLSS 2 or any other technique upscaling a 720p image can look better than the same image at a native resolution of 1080p or higher. Or do you mean having your system running at its limit to generate a 720p image instead of a 1080p one, and then, with the upscaling, your result being better?

Still, MS already has machine learning and similar techniques, plus Sony has been using temporal reconstruction (not that much different) for a lot of PS4 Pro titles.

The upscaling technique isn't superior to native; it is less hardware-expensive, though. And the power saved can be used to improve other aspects of image quality.

Actually, several tech journalists have been saying that DLSS 2.0 improves image quality compared to native 4K. Like here, for example (go to 19:17):
https://www.youtube.com/watch?v=ggnvhFSrPGE
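
For a rough sense of the trade-off being discussed here, the pixel-count arithmetic alone shows why rendering internally at a lower resolution and then upscaling is cheaper than native rendering. A back-of-the-envelope sketch (it treats shading cost as simply proportional to pixel count, which is a simplification, and the 1440p-to-4K pair is just an example, not a claim about how DLSS itself works):

#include <cstdio>

// Compare the pixel counts of native 4K versus a 1440p internal render target
// that would then be upscaled to 4K.
int main() {
    long long native4K      = 3840LL * 2160;   // 8,294,400 pixels
    long long internal1440p = 2560LL * 1440;   // 3,686,400 pixels
    std::printf("native 4K pixels:    %lld\n", native4K);
    std::printf("internal 1440p px:   %lld\n", internal1440p);
    std::printf("shading work ratio:  %.2fx\n",
                static_cast<double>(native4K) / static_cast<double>(internal1440p));
    return 0;
}

Roughly 2.25x fewer pixels to shade is the headroom that can then be spent on higher settings or frame rate, which is the trade-off both posts above are describing.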



zero129 said:
goopy20 said:

Pretty sure I always said the opposite. Unreal Engine 5 scales all the way down to mobiles; that doesn't mean they could get that PS5 tech demo running on an iPhone. I mean, technically they could, but then it would look like ass and defeat the whole purpose of showcasing the graphics capabilities of both the engine and the iPhone.

Like I said, scalability of the engines and hardware won't hold Series X back; parity with Xbox, Lockhart and low-end PCs will. With every design idea they will have to ask themselves "will this run at 30fps/1080p on a Jaguar CPU, a 1.3 Tflops GPU and an HDD too?" If the answer is no, then they just have to scale it down across the board or remove it altogether.

So you admit they could scale that UE5 demo to an iPhone, so how in any way would the iPhone version looking shit ruin the PS5 version looking good and next-gen?

And how many first-party games will they need to make that decision for? It's 2021 when they stop supporting the XBOne, right? So how many first-party games will release between now and their deadline?

Series S is a next-gen system; it will not hold back Series X any more than the PS5 will.

Because Epic wouldn't dare to show that demo running on an iPhone in the first place. People would say the engine looks horrible and/or the iPhone can't do graphics. So if Epic wanted to show off their engine and promote both the PS5 and the iPhone at the same time, they would have had to take a middle ground where the demo would look good on both platforms.

It basically would have been a different demo entirely, and on PS5 it would have looked a lot less impressive than what they've actually shown us, even if it ran at 8K and 2000fps.

Last edited by goopy20 - on 14 July 2020

goopy doesn’t understand what scaling is



LudicrousSpeed said:
goopy doesn’t understand what scaling is

I dunno, man. I honestly think you believe Infamous 2 on the PS3 would look like Infamous SS on the PS4 if they just dialed a graphics slider.