Forums - Gaming Discussion - Phil Spencer Says Xbox Series X Games Aren't Being Held Back By Xbox One

DonFerrari said:
sales2099 said:

Don’t forget people loyal to PS themselves. Better to compete with the weaker hardware than with one that’s considerably better. 

https://www.trustedreviews.com/opinion/ps4-pro-vs-xbox-one-s-2940191/amp

I’d settle for 4K/60 or even 4K/30 for any slow-paced third-person single-player action adventure game. 

Edit: It begins, the resolution wars of 2020 :)

https://twitter.com/strictly_nobs/status/1282411380235173889?s=21

So the poorly formed opinion of a random person carries the same weight as that of Phil Spencer, whose consoles weren't able to compete at all? Gotcha.

I don't think MS is crazy enough to attempt 8K gaming. That's absurdly pointless, to the point where image quality arguably starts to degrade from all the supersampling. 
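
Just to put raw numbers on it (quick back-of-the-envelope Python, my own figures, nothing official):

    # Rough pixel-count math for why 8K is such an extreme target
    resolutions = {
        "1080p": (1920, 1080),
        "1440p": (2560, 1440),
        "4K":    (3840, 2160),
        "8K":    (7680, 4320),
    }
    base = 1920 * 1080
    for name, (w, h) in resolutions.items():
        print(f"{name}: {w * h:,} pixels ({w * h / base:.1f}x 1080p)")

8K is 4x the pixels of 4K and 16x the pixels of 1080p, so every frame costs roughly that much more GPU work before anything else is added.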



It was Britain, it is America, tomorrow France and next year, the world... 

Warning: This poster has a very negative opinion of Sony, Nintendo, Idea Factory, Tecmo Koei, EA, BioWare, Blizzard, Treyarch, Infinity Ward, Kadokawa and Sega. If you have very positive views of these and a negative view of Microsoft or Bethesda Game Studios, AVOID ENGAGEMENT AT ALL COSTS! 

    zero129 said:
    goopy20 said:

    Maybe that's the only parameter that matters for fanboys. But Halo not being on PS5 is just a business decision, whereas Horizon Forbidden West being exclusive to the PS5 is a design decision. Big difference there in how far the developers can potentially take their ambitions.

    The truth is that MS is so busy being consumer friendly that they ARE forgetting the people who are excited about next gen. You know, the people who want to see exciting new games, conceived to be impossible on current gen. I get what you're saying and that you won't settle for anything less than the best version of Halo Infinite. But how many people bought an X1X or a high-end GPU to play the best versions of base console games? 

    There simply is a difference between the generational jump we typically see when new consoles come out and the scalable graphics we see on PC. This is the difference.

    Generational leap

    And it's not just the graphics in this screenshot, it's about the entire scope and visual package of the game.

    Scalable graphics on pc

    Scalable graphics on PC and the mid-gen consoles are a lot more subtle. Obviously the resolution and framerate jump is noticeable when you're playing the game, but you're still getting the exact same game (same levels, AI, physics, etc.) whether you're playing on a 1.3TF Xbox One or an X1X/high-end PC. 

    Scalable

    The Witcher 3 PC

    How do you explain this? Do the images of The Witcher 3 not look at least 2 to 3 gens apart?

    I wonder how many devs it took to make both versions....

    The lowest settings are not the same thing as the base console settings. The base platform is how the developers intended their game to be experienced: they optimize fully around what they think is the best balance between compromises and performance. That's how console optimization usually works; developers go for the best bang for the buck on console and optimize specifically for that platform. Like I said, RDR2 uses a combination of settings below PC's lowest alongside ultra settings to make it look the way it does on relatively weak hardware.

    60fps, 4K and ray tracing are the farthest thing from the best bang for the buck. An RTX 2080 or Series X is insanely powerful compared to the Xbox One, but you can still max them out pretty quickly by being inefficient. Loads of people think the RTX cards were a cash grab because ray tracing is such a performance hog that it becomes pretty much useless. Hell, on an RTX 2060 you get like 8 fps if you play a current gen game in 4K with RT set to high...

    We don't really know how the RT cores of the PS5 and Series X compare to the RTX 2060, but RT will probably take a similar hit on performance. It would be a bit disappointing, but maybe RT is a bridge too far for these next gen consoles, and it would be better if the big AAA games didn't make too much use of it. Maybe that's why the UE5 demo didn't have RT and why Sony hasn't said much about ray tracing. However, they are saying Halo Infinite will be 4K/60fps with RT, or 4K/120fps... It's not hard to understand that running an Xbox One game at settings like that will max out the Series X pretty easily. So yes, if you think 120fps or RT is what makes games next gen and that that's taking full advantage of the capabilities of Series X, then you won't be disappointed on the 23rd. However, if you're expecting an actual generational leap in overall immersion and visuals (geometry, level design, asset variation, scope, etc.), then I just don't see that happening.  
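
    To put some numbers on how tight those targets are, here's a quick frame-budget sketch (plain Python, my own rough math, not official specs):

        # Frame-time budgets: everything a frame does has to fit in this window
        for fps in (30, 60, 120):
            print(f"{fps} fps -> {1000 / fps:.2f} ms per frame")
        # 120 fps leaves ~8.3 ms per frame for geometry, shading, AI, physics
        # AND any ray tracing. If an RT pass alone eats several ms at 4K,
        # that budget is gone almost immediately.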



    sales2099 said:

    goopy20 said:

    We know, man, you've already mentioned the 20% GPU difference five times now. So you think 20% more GPU power will make a huge difference, but let me guess: a 4 TFLOPS Lockhart or the 1.3 TFLOPS Xbox One won't hold back Series X at all, because Uncle Phil says so...

    But I'm still curious if you've ever played a game at 120fps and why you think that's so important. I'm also curious if you can tell the difference between native 4K and something like 1440p or checkerboard rendering.

    You can’t lecture me about repeating the 20-30% gap when you’ve been peddling the “holding back” concern-trolling stance since the year started. Fact is, Xbox is in a better position to hit those benchmarks while also looking next gen. 

    If you make a game on the lower hardware and port up, then you’re 100% right: it holds the new hardware back. But since they announced Halo Infinite is built natively on Series X, we now know they just have to cut corners to scale down. You know this, but still for some reason hope (???) that MS is building games natively on old hardware and porting up.

    It’s simple: my TV is 4K, and the more frames the better. If you want to settle for less before the generation even begins, then I’m glad I’m not on your side. Series X has a better shot at doing both because they actually designed it to be next gen. I’ve never played a game at 120 FPS, but I imagine it’s a step up, and I can’t wait to see the difference. 

    I know it doesn’t suit your concern, but here’s the confirmation again.

    https://twitter.com/xcloudtimdog/status/1276173499028078592?s=21

    We don't even know what "Series X optimized games built natively for Series X" means. All I know is that Halo will be coming out on Xbox One, and they're talking about 4K/120fps, or 60fps with RT, on Series X. They did say some of the "optimized for Series X" games get more enhancements than others, and Halo is one of them. My guess is that it could set a new benchmark for ray tracing compared to the stuff we've already seen on PC, and it will probably look great. However, the core game will still be designed around the limitations of the Xbox One.

    It's like how TLOU Remastered maxes out the PS4 Pro at 4K/60fps, compared to TLOU2 at 1440p/30fps. You tell me which is the more ambitious and better looking game?
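
    For what it's worth, the raw pixel-throughput gap between those two targets is bigger than it sounds (rough Python math, my own numbers):

        # Pixels pushed per second at each target
        tlou_remastered = 3840 * 2160 * 60   # 4K/60 on the Pro
        tlou2           = 2560 * 1440 * 30   # 1440p/30 on the Pro
        print(f"4K/60:    {tlou_remastered:,} pixels/s")
        print(f"1440p/30: {tlou2:,} pixels/s")
        print(f"ratio:    {tlou_remastered / tlou2:.1f}x")  # ~4.5x

    TLOU2 spends roughly 4.5x less on raw pixels and puts that power into per-pixel and per-scene quality instead.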



    goopy20 said:
    zero129 said:

    Scalable

    The Witcher 3 PC

    How do you explain this? Do the images of The Witcher 3 not look at least 2 to 3 gens apart?

    I wonder how many devs it took to make both versions....

    The lowest settings are not the same thing as the base console settings. The base platform is how the developers intended their game to be experienced: they optimize fully around what they think is the best balance between compromises and performance. That's how console optimization usually works; developers go for the best bang for the buck on console and optimize specifically for that platform. Like I said, RDR2 uses a combination of settings below PC's lowest alongside ultra settings to make it look the way it does on relatively weak hardware.

    60fps, 4K and ray tracing are the farthest thing from the best bang for the buck. An RTX 2080 or Series X is insanely powerful compared to the Xbox One, but you can still max them out pretty quickly by being inefficient. Loads of people think the RTX cards were a cash grab because ray tracing is such a performance hog that it becomes pretty much useless. Hell, on an RTX 2060 you get like 8 fps if you play a current gen game in 4K with RT set to high...

    We don't really know how the RT cores of the PS5 and Series X compare to the RTX 2060, but RT will probably take a similar hit on performance. It would be a bit disappointing, but maybe RT is a bridge too far for these next gen consoles, and it would be better if the big AAA games didn't make too much use of it. Maybe that's why the UE5 demo didn't have RT and why Sony hasn't said much about ray tracing. However, they are saying Halo Infinite will be 4K/60fps with RT, or 4K/120fps... It's not hard to understand that running an Xbox One game at settings like that will max out the Series X pretty easily. So yes, if you think 120fps or RT is what makes games next gen and that that's taking full advantage of the capabilities of Series X, then you won't be disappointed on the 23rd. However, if you're expecting an actual generational leap in overall immersion and visuals (geometry, level design, asset variation, scope, etc.), then I just don't see that happening.  

    Answer the original question. I'm asking you how it's possible that the same game can look 3 gens apart. You keep trying to say it's impossible unless two dev teams work on two versions.

    Yet those two pics of The Witcher 3 look like completely different games from different gens, more so than any game jump you showed from PS4 to PS5.

    How is this possible? Why is it that MS's engines can't scale the same way?
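
    Because that's all "scaling" really is: the simulation stays identical and the presets get swapped out. A hypothetical sketch in Python (every name and number here is made up for illustration):

        # Hypothetical illustration: scaling = swapping preset values,
        # while levels, AI and physics run identically underneath.
        PRESETS = {
            "low":   {"resolution": (1280, 720),  "draw_distance": 400,  "shadows": "off"},
            "ultra": {"resolution": (3840, 2160), "draw_distance": 2000, "shadows": "high"},
        }

        def apply_preset(name):
            p = PRESETS[name]
            print(f"{name}: render at {p['resolution']}, "
                  f"draw distance {p['draw_distance']}m, shadows {p['shadows']}")
            # the game simulation itself never reads these values

        apply_preset("low")
        apply_preset("ultra")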



    zero129 said:
    goopy20 said:

    Maybe that's the only parameter that matters for fanboys. But Halo not being on PS5 is just a business decision, whereas Horizon Forbidden West being exclusive to the PS5 is a design decision. Big difference there in how far the developers can potentially take their ambitions.

    The truth is that MS is so busy being consumer friendly that they ARE forgetting the people who are excited about next gen. You know, the people who want to see exciting new games, conceived to be impossible on current gen. I get what you're saying and that you won't settle for anything less than the best version of Halo Infinite. But how many people bought an X1X or a high-end GPU to play the best versions of base console games? 

    There simply is a difference between the generational jump we typically see when new consoles come out and the scalable graphics we see on PC. This is the difference.

    Generational leap

    And it's not just the graphics in this screenshot, it's about the entire scope and visual package of the game.

    Scalable graphics on pc

    Scalable graphics on PC and the mid-gen consoles are a lot more subtle. Obviously the resolution and framerate jump is noticeable when you're playing the game, but you're still getting the exact same game (same levels, AI, physics, etc.) whether you're playing on a 1.3TF Xbox One or an X1X/high-end PC. 

    Scalable

    The Witcher 3 PC

    How do you explain this? Do the images of The Witcher 3 not look at least 2 to 3 gens apart?

    I wonder how many devs it took to make both versions....

    I heard it was just one generation apart because X1X is gen 8 while Switch is gen 9.

    goopy20 said:
    zero129 said:

    Scalable

    The Witcher 3 PC

    How do you explain this? Do the images of The Witcher 3 not look at least 2 to 3 gens apart?

    I wonder how many devs it took to make both versions....

    The lowest settings are not the same thing as the base console settings. The base platform is how the developers intended their game to be experienced: they optimize fully around what they think is the best balance between compromises and performance. That's how console optimization usually works; developers go for the best bang for the buck on console and optimize specifically for that platform. Like I said, RDR2 uses a combination of settings below PC's lowest alongside ultra settings to make it look the way it does on relatively weak hardware.

    60fps, 4K and ray tracing are the farthest thing from the best bang for the buck. An RTX 2080 or Series X is insanely powerful compared to the Xbox One, but you can still max them out pretty quickly by being inefficient. Loads of people think the RTX cards were a cash grab because ray tracing is such a performance hog that it becomes pretty much useless. Hell, on an RTX 2060 you get like 8 fps if you play a current gen game in 4K with RT set to high...

    We don't really know how the RT cores of the PS5 and Series X compare to the RTX 2060, but RT will probably take a similar hit on performance. It would be a bit disappointing, but maybe RT is a bridge too far for these next gen consoles, and it would be better if the big AAA games didn't make too much use of it. Maybe that's why the UE5 demo didn't have RT and why Sony hasn't said much about ray tracing. However, they are saying Halo Infinite will be 4K/60fps with RT, or 4K/120fps... It's not hard to understand that running an Xbox One game at settings like that will max out the Series X pretty easily. So yes, if you think 120fps or RT is what makes games next gen and that that's taking full advantage of the capabilities of Series X, then you won't be disappointed on the 23rd. However, if you're expecting an actual generational leap in overall immersion and visuals (geometry, level design, asset variation, scope, etc.), then I just don't see that happening.  

    You are factually wrong. The UE5 demo did have RT in it; what it didn't do was use the RT cores of the PS5, which would likely give an even better image if used.

    Also, Cerny said in the presentation that some of the builds they'd done at Sony had 3 of the 4 main elements of RT, and the impact on performance was small. That was the objective with the RT cores: to deliver RT without taxing the rest of the GPU (sure, some devs may go even heavier and then need to use the rest of the GPU for it).



    duduspace11 "Well, since we are estimating costs, Pokemon Red/Blue did cost Nintendo about $50m to make back in 1996"

    http://gamrconnect.vgchartz.com/post.php?id=8808363

    Mr Puggsly: "Hehe, I said good profit. You said big profit. Frankly, not losing money is what I meant by good. Don't get hung up on semantics"

    http://gamrconnect.vgchartz.com/post.php?id=9008994

    goopy20 said:
    sales2099 said:

    You can’t lecture me about repeating the 20-30% gap when you’ve been peddling the “holding back” concern-trolling stance since the year started. Fact is, Xbox is in a better position to hit those benchmarks while also looking next gen. 

    If you make a game on the lower hardware and port up, then you’re 100% right: it holds the new hardware back. But since they announced Halo Infinite is built natively on Series X, we now know they just have to cut corners to scale down. You know this, but still for some reason hope (???) that MS is building games natively on old hardware and porting up.

    It’s simple: my TV is 4K, and the more frames the better. If you want to settle for less before the generation even begins, then I’m glad I’m not on your side. Series X has a better shot at doing both because they actually designed it to be next gen. I’ve never played a game at 120 FPS, but I imagine it’s a step up, and I can’t wait to see the difference. 

    I know it doesn’t suit your concern, but here’s the confirmation again.

    https://twitter.com/xcloudtimdog/status/1276173499028078592?s=21

    We don't even know what "Series X optimized games built natively for Series X" means. All I know is that Halo will be coming out on Xbox One, and they're talking about 4K/120fps, or 60fps with RT, on Series X. They did say some of the "optimized for Series X" games get more enhancements than others, and Halo is one of them. My guess is that it could set a new benchmark for ray tracing compared to the stuff we've already seen on PC, and it will probably look great. However, the core game will still be designed around the limitations of the Xbox One.

    It's like how TLOU Remastered maxes out the PS4 Pro at 4K/60fps, compared to TLOU2 at 1440p/30fps. You tell me which is the more ambitious and better looking game?

    I’m shocked to see that even if Halo Infinite looks great on XSX, you’ve found some way to disqualify it as “real next gen.” Totally didn’t see that coming. So apparently what is good enough on PS5 is not good enough on XSX. 



    DonFerrari said:
    zero129 said:

    Scalable

    The Witcher 3 PC

    How do you explain this? Do the images of The Witcher 3 not look at least 2 to 3 gens apart?

    I wonder how many devs it took to make both versions....

    I heard it was just one generation apart because X1X is gen 8 while Switch is gen 9.

    goopy20 said:

    The lowest settings are not the same thing as the base console settings. The base platform is how the developers intended their game to be experienced: they optimize fully around what they think is the best balance between compromises and performance. That's how console optimization usually works; developers go for the best bang for the buck on console and optimize specifically for that platform. Like I said, RDR2 uses a combination of settings below PC's lowest alongside ultra settings to make it look the way it does on relatively weak hardware.

    60fps, 4K and ray tracing are the farthest thing from the best bang for the buck. An RTX 2080 or Series X is insanely powerful compared to the Xbox One, but you can still max them out pretty quickly by being inefficient. Loads of people think the RTX cards were a cash grab because ray tracing is such a performance hog that it becomes pretty much useless. Hell, on an RTX 2060 you get like 8 fps if you play a current gen game in 4K with RT set to high...

    We don't really know how the RT cores of the PS5 and Series X compare to the RTX 2060, but RT will probably take a similar hit on performance. It would be a bit disappointing, but maybe RT is a bridge too far for these next gen consoles, and it would be better if the big AAA games didn't make too much use of it. Maybe that's why the UE5 demo didn't have RT and why Sony hasn't said much about ray tracing. However, they are saying Halo Infinite will be 4K/60fps with RT, or 4K/120fps... It's not hard to understand that running an Xbox One game at settings like that will max out the Series X pretty easily. So yes, if you think 120fps or RT is what makes games next gen and that that's taking full advantage of the capabilities of Series X, then you won't be disappointed on the 23rd. However, if you're expecting an actual generational leap in overall immersion and visuals (geometry, level design, asset variation, scope, etc.), then I just don't see that happening.  

    You are factually wrong. The UE5 demo did have RT in it; what it didn't do was use the RT cores of the PS5, which would likely give an even better image if used.

    Also, Cerny said in the presentation that some of the builds they'd done at Sony had 3 of the 4 main elements of RT, and the impact on performance was small. That was the objective with the RT cores: to deliver RT without taxing the rest of the GPU (sure, some devs may go even heavier and then need to use the rest of the GPU for it).

    I didn't know that. I thought UE5's Lumen (dynamic global illumination) was kind of like the next best thing to RT at a fraction of the cost. We'll see, but I've got a feeling RT will be too expensive to be fully utilized in all AAA next gen games. I'm sure we'll see it being used here and there, though. 
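
    For anyone else wondering how a demo can have RT without touching the RT cores: "ray tracing" is just casting rays and testing what they hit, which you can do on ordinary compute. A toy sketch of the idea (my own Python illustration, nothing like production Lumen, which traces signed distance fields):

        # Toy "software ray tracing": march a ray through a 2D occupancy
        # grid on regular compute instead of dedicated RT hardware.
        GRID = [
            "........",
            "...##...",
            "...##...",
            "........",
        ]

        def trace(x, y, dx, dy, steps=32, step_len=0.25):
            # March from (x, y) along (dx, dy); True means the ray hit geometry.
            for _ in range(steps):
                x += dx * step_len
                y += dy * step_len
                gx, gy = int(x), int(y)
                if 0 <= gy < len(GRID) and 0 <= gx < len(GRID[0]) and GRID[gy][gx] == "#":
                    return True  # occluded: no light arrives from this direction
            return False

        print(trace(1.0, 1.0, 1.0, 0.0))  # True: the ray runs into the block
        print(trace(1.0, 3.5, 1.0, 0.0))  # False: clear path under the block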



    LudicrousSpeed said:
    goopy20 said:

    We don't even know what "Series X optimized games built natively for Series X" means. All I know is that Halo will be coming out on Xbox One, and they're talking about 4K/120fps, or 60fps with RT, on Series X. They did say some of the "optimized for Series X" games get more enhancements than others, and Halo is one of them. My guess is that it could set a new benchmark for ray tracing compared to the stuff we've already seen on PC, and it will probably look great. However, the core game will still be designed around the limitations of the Xbox One.

    It's like how TLOU Remastered maxes out the PS4 Pro at 4K/60fps, compared to TLOU2 at 1440p/30fps. You tell me which is the more ambitious and better looking game?

    I’m shocked to see that even if Halo Infinite looks great on XSX, you’ve found some way to disqualify it as “real next gen.” Totally didn’t see that coming. So apparently what is good enough on PS5 is not good enough on XSX. 

    It just seems you can't tell the difference between a next gen title (specifically made to take full advantage of the new features) and a current gen game running at higher settings. Like I said, which game do you think takes more advantage of the PS4 Pro hardware: a PS3 game like TLOU running at 4K/60fps, or TLOU2 at 1440p/30fps? 



    goopy20 said:
    DonFerrari said:

    I heard it was just one generation apart because X1X is gen 8 while Switch is gen 9.

    You are factually wrong. The UE5 demo did have RT in it; what it didn't do was use the RT cores of the PS5, which would likely give an even better image if used.

    Also, Cerny said in the presentation that some of the builds they'd done at Sony had 3 of the 4 main elements of RT, and the impact on performance was small. That was the objective with the RT cores: to deliver RT without taxing the rest of the GPU (sure, some devs may go even heavier and then need to use the rest of the GPU for it).

    I didn't know that. I thought UE5's Lumen (dynamic global illumination) was kind of like the next best thing to RT at a fraction of the cost. We'll see, but I've got a feeling RT will be too expensive to be fully utilized in all AAA next gen games. I'm sure we'll see it being used here and there, though. 

    Nope, all of that is a form of RT; "RT" is a generic name for the whole group of techniques.

    And if the UE5 demo had been optimized to use the RT cores, it would look even better, from what we heard from CGI and Pema.



    duduspace11 "Well, since we are estimating costs, Pokemon Red/Blue did cost Nintendo about $50m to make back in 1996"

    http://gamrconnect.vgchartz.com/post.php?id=8808363

    Mr Puggsly: "Hehe, I said good profit. You said big profit. Frankly, not losing money is what I meant by good. Don't get hung up on semantics"

    http://gamrconnect.vgchartz.com/post.php?id=9008994

    DonFerrari said:
    goopy20 said:

    I didn't know that. I thought UE5's Lumen (dynamic global illumination) was kind of like the next best thing to RT at a fraction of the cost. We'll see, but I've got a feeling RT will be too expensive to be fully utilized in all AAA next gen games. I'm sure we'll see it being used here and there, though. 

    Nope, all of that is a form of RT; "RT" is a generic name for the whole group of techniques.

    And if the UE5 demo had been optimized to use the RT cores, it would look even better, from what we heard from CGI and Pema.

    You're right, I was just reading about it: https://www.hardwaretimes.com/the-unreal-5-demo-on-the-ps5-used-software-ray-tracing-similar-to-reshades-ray-tracing-shader-ray-traced-gi/

    Interesting stuff, as Epic said the UE5 demo used about the same GPU resources as Fortnite. https://www.thegamer.com/unreal-engine-5-tech-demo-gpu-fortnite/