
Forums - Microsoft Discussion - Series S vs Series X (resolution, performance, etc.)

DonFerrari said:

Understood. My technical understanding here is also limited, but it sure is a very reasonable expectation. I'll just point out, as Pemalite brought up, that pixel count isn't much affected by the RAM amount (and perhaps speed), but textures and other elements are. Still, if your assets are being limited by RAM speed/amount, raising the render resolution would just make things unbalanced. Let's see how things roll for the rest of the gen.

On the 50% difference in some extreme cases, I don't remember which games those were, but there were very few; I think CoD was one of them at release. Did it get patched later? And from what I remember, even the titles with a 50% difference in pixel count didn't play that much worse.

Yeah, absolutely. Cut internal resolution to save on the GPU and cut texture resolution to save on bandwidth. Since the CPU won't be a bottleneck, I really hope this philosophy is embraced and that developers aren't under pressure to hit 1440p.

Off the top of my head, I remember The Golf Club (the one that was supposed to be PGA Tour 15) was 720p on the One and 1080p on PS4. Metal Gear Solid V: Ground Zeroes was also 720p/1080p. You also had compromises like Tomb Raider: Definitive Edition, which ran at 60fps on PS4 but 30fps on the One: in effect, 50% fewer pixels temporally.
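The "fewer pixels temporally" point is just arithmetic; a quick Python sketch (using the numbers from the post, nothing console-specific):

```python
# Pixels delivered per unit time lets you compare resolution cuts
# against framerate cuts on a single axis.
def pixels_per_second(width, height, fps):
    return width * height * fps

# Tomb Raider: Definitive Edition, 1080p on both consoles, 30fps vs 60fps:
ratio_temporal = pixels_per_second(1920, 1080, 30) / pixels_per_second(1920, 1080, 60)
print(ratio_temporal)  # 0.5 -> half the pixels per second

# The classic 720p vs 1080p gap at a matched framerate:
print((1280 * 720) / (1920 * 1080))  # ~0.444 -> an even bigger per-frame deficit
```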



AkimboCurly said:

So yes, empirically you're absolutely right: most were not the full 720p vs. 1080p. I didn't intend it to be read as if that were the norm, only that in extremis you get a 50% resolution hit.

The reason I say it has to do with the RAM (and by extension the ESRAM) is the way it was partitioned. I'm not a developer, obviously, but to my limited understanding, unless you can fit your render target into the 32MB buffer of extra-speedy RAM, you're forced to relegate it to the DDR3. This meant that, especially in multiplats, which seldom used the buffer, the Xbox One would construct its frame in DDR3 (which is supposed to be system RAM) while the PS4 was able to use its GDDR5. The bandwidth difference then becomes a factor of two or more. The new Series S has a similar tiered memory architecture, and people suspect the slower 2GB will be used for the OS. But even the faster memory (8GB GDDR6) has less bandwidth than the slowest tier of memory in the Series X. So to my mind, unless developers scale their render targets sensibly, both for the GPU but ALSO for the memory bandwidth, the Series S will get the short end of the stick. In practice that means cutting texture resolution AS WELL as internal resolution is basically non-negotiable. Watch Dogs got it right and AC Valhalla got it wrong.

The eSRAM was basically up to developer choice.
Some developers could have used it as an additional level of cache for the CPU to bolster CPU performance.

Basically, that 32MB was a massive limiter; it was necessary to leverage it to get the most out of the machine, and developers learned to work around it by taking a tiled approach over multiple passes to make the absolute most of it.
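To put rough numbers on why that 32MB was such a limiter, here's a back-of-the-envelope sketch in Python (the G-buffer layout is a hypothetical but typical one, not any specific engine's):

```python
ESRAM_BYTES = 32 * 1024 * 1024  # the Xbox One's fast on-die buffer

def render_target_bytes(width, height, bytes_per_pixel):
    return width * height * bytes_per_pixel

# Hypothetical 1080p deferred setup: colour (RGBA8) + normals (RGBA8)
# + material (RGBA8) + depth/stencil (D24S8), 4 bytes per pixel each.
gbuffer = sum(render_target_bytes(1920, 1080, bpp) for bpp in (4, 4, 4, 4))

print(round(gbuffer / 2**20, 1))  # 31.6 MB -- it barely fits, with no room to spare
print(gbuffer <= ESRAM_BYTES)     # True, but only just

# The tiled workaround: render in strips so each pass only needs a
# fraction of the buffer resident at once, at the cost of extra passes.
tiles = 4
print(round(gbuffer / tiles / 2**20, 1))  # 7.9 MB per strip
```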

DDR3 and GDDR5 are both "system memory" and "graphics memory" in the 8th-gen consoles. DDR3 definitely has the latency advantage (remember, DRAM latency is a function of clockrate), which meant CPU tasks had an edge on the Xbox One (plus its clockrate advantage), while GDDR5 had the bandwidth advantage, which meant graphics duties were simply superior on the PlayStation 4... Plus the PlayStation 4 just had the GPU compute to make up for the CPU deficiency. It all comes down to the developer and engine.

Sadly, the Xbox One was GPU-limited more often than not, but when it wasn't and the CPU was the limitation, it definitely held a slight edge in gaming... A certain Assassin's Creed title comes to mind, when lots of actors were on-screen.

Things like alpha effects will be scaled back on the Series S. Resolution will be the first cutback, which will save massively on bandwidth/fillrate. Around 256GB/s of bandwidth is a good number for 1080p, and the Series S fits into that ballpark fairly well, especially once you account for delta colour compression, primitive shaders, and draw-stream binning rasterization.
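For scale, a trivial bandwidth-budget sketch (Python; the ~224 and ~560 GB/s figures are the commonly quoted fast-pool numbers for Series S and Series X, taken here as assumptions):

```python
def gb_per_frame(bandwidth_gb_per_s, fps):
    """Memory traffic available to one frame at a given bandwidth and framerate."""
    return bandwidth_gb_per_s / fps

# Assumed fast-pool figures: ~224 GB/s on Series S, ~560 GB/s on Series X.
print(round(gb_per_frame(224, 60), 2))  # 3.73 GB of traffic per 60fps frame
print(round(gb_per_frame(560, 60), 2))  # 9.33 GB per frame on Series X

# Dropping the render target from 4K to 1080p cuts per-pixel work 4x,
# which is why resolution is the first lever pulled:
print((1920 * 1080) / (3840 * 2160))    # 0.25
```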

The Series S will probably always draw the short straw on game optimization; it's not likely to be a major developer priority unless it sells extremely well all generation long.

AkimboCurly said:

Yeah, absolutely. Cut internal resolution to save on the GPU and cut texture resolution to save on bandwidth. Since the CPU won't be a bottleneck, I really hope this philosophy is embraced and that developers aren't under pressure to hit 1440p.

Off the top of my head, I remember The Golf Club (the one that was supposed to be PGA Tour 15) was 720p on the One and 1080p on PS4. Metal Gear Solid V: Ground Zeroes was also 720p/1080p. You also had compromises like Tomb Raider: Definitive Edition, which ran at 60fps on PS4 but 30fps on the One: in effect, 50% fewer pixels temporally.

Honestly I would rather see developers aim for 900P-1080P on the Series S and just push for higher fidelity.



--::{PC Gaming Master Race}::--

Pemalite said:

Honestly I would rather see developers aim for 900P-1080P on the Series S and just push for higher fidelity.

Well, MS expects the Series S to sell much better than the Series X. That could actually mean that, in the end, the Series X version usually won't outperform the PS5 version, while the Series S version will be competent enough, because that is where devs/pubs will sell most of their software on Xbox.



duduspace11 "Well, since we are estimating costs, Pokemon Red/Blue did cost Nintendo about $50m to make back in 1996"

http://gamrconnect.vgchartz.com/post.php?id=8808363

Mr Puggsly: "Hehe, I said good profit. You said big profit. Frankly, not losing money is what I meant by good. Don't get hung up on semantics"

http://gamrconnect.vgchartz.com/post.php?id=9008994

Pemalite said:

Honestly I would rather see developers aim for 900P-1080P on the Series S and just push for higher fidelity.

Thus far it seems many developers are doing that. I wouldn't mind if AAA games keep targeting that as well; 1080p will always look sharp, and they can do other things to improve the image overall.

Although, I'm a tad concerned some games aren't aiming for a higher resolution. Bearing in mind it apparently has Xbox One X GPU power, that shouldn't be an issue. Maybe developers need more experience with the hardware?



Recently Completed
Rage 2
for X1X (4/5) - Rage for 360 (3/5) - Streets of Rage 4 for X1/PC (4/5) - Gears 5 for X1X (5/5) - Mortal Kombat 11 for X1X (5/5) - Doom 64 for N64 (emulator) (3/5) - Crackdown 3 for X1S/X1X (4/5) - Infinity Blade III - for iPad 4 (3/5) - Infinity Blade II - for iPad 4 (4/5) - Infinity Blade - for iPad 4 (4/5) - Wolfenstein: The Old Blood for X1 (3/5) - Assassin's Creed: Origins for X1 (3/5) - Uncharted: Lost Legacy for PS4 (4/5) - EA UFC 3 for X1 (4/5) - Doom for X1 (4/5) - Titanfall 2 for X1 (4/5) - Super Mario 3D World for Wii U (4/5) - South Park: The Stick of Truth for X1 BC (4/5) - Call of Duty: WWII for X1 (4/5) -Wolfenstein II for X1 - (4/5) - Dead or Alive: Dimensions for 3DS (4/5) - Marvel vs Capcom: Infinite for X1 (3/5) - Halo Wars 2 for X1/PC (4/5) - Halo Wars: DE for X1 (4/5) - Tekken 7 for X1 (4/5) - Injustice 2 for X1 (4/5) - Yakuza 5 for PS3 (3/5) - Battlefield 1 (Campaign) for X1 (3/5) - Assassin's Creed: Syndicate for X1 (4/5) - Call of Duty: Infinite Warfare for X1 (4/5) - Call of Duty: MW Remastered for X1 (4/5) - Donkey Kong Country Returns for 3DS (4/5) - Forza Horizon 3 for X1 (5/5)

Added Halo: MCC to the list. I assume videos and articles will keep coming, so I will update often.




shikamaru317 said:
EnricoPallazzo said:
I'm underwhelmed by the Series S. I thought the difference would be games running at 1080p instead of 4K, which would make sense, I guess, since it would be generating only 25% of the pixels of a 4K game. But it seems the gap is much wider, and games need to run at 30fps. Why would Valhalla have to run at 1620p/30fps instead of 1080p/60fps? Or why would you have to turn off ray tracing ON TOP of lowering resolution from 4K to 1080p?

Valhalla doesn't need to run at 1620p/30fps; the devs were lazy and used the same 1620p-2160p scaler on Series S, Xbox One X, PS5, and Series X. Because 1620p is over the 1080p-1440p recommendation for Series S, Ubisoft had to drop to 30fps. It could definitely run the game at 60fps, most likely with a 1080p-1440p dynamic scaler, if Ubisoft cared to take the time to implement it.

As for no ray tracing on Series S in DMC V: considering it is terribly optimized on both PS5 and Series X, I'm guessing they didn't want to take the time to optimize a 3rd version of the game when they were already struggling to optimize 2 versions. It doesn't help that they decided to target so many different performance profiles; they have 4 per platform to optimize on XSX and PS5. It's just too much work for a small porting team that is facing covid production issues.

Also, it's worth noting that we have reports from several sources that the Xbox SDK was behind schedule and came in hot, giving devs only a few months to get used to Xbox Series development ahead of launch. I'm guessing that played a factor in these Series S launch-game issues as well.

The good news is that several of these launch games and next-gen updates for last-gen games show the Series S performing as expected: 1080p or 1440p at the same framerate the Series X manages at 4K, exactly what MS designed the Series S to do. Yakuza, for instance, has one of the most demanding engines of any of these launch games, the Dragon Engine, and yet it's managing 1440p at the same framerate as the Series X at 4K.

Thanks for the explanation. It's this kind of thing that sometimes makes me want to become a PC gamer. How difficult is it for the devs to give us the option? Why can't I choose to run at 1080p/60fps? I really don't get it.
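The dynamic scaler being discussed is conceptually simple; here's a toy Python sketch of the idea (a hypothetical frame-time controller, not Ubisoft's actual implementation):

```python
def next_scale(scale, frame_ms, target_ms, lo, hi, step=0.05):
    """Nudge the render scale down when a frame blows its budget, up when
    there's headroom; clamp to the platform's allowed resolution window."""
    if frame_ms > target_ms:
        scale -= step                  # missed budget: drop resolution
    elif frame_ms < 0.9 * target_ms:
        scale += step                  # comfortable headroom: claw it back
    return max(lo, min(hi, scale))

# A Series S profile might clamp to a 1080p-1440p window (as a fraction of 2160p):
scale = 1.0
for ms in (18.0, 18.5, 17.2, 15.9, 14.0):  # simulated frame times vs a 16.7 ms (60fps) target
    scale = next_scale(scale, ms, 16.7, lo=0.5, hi=0.667)
print(round(scale, 3))  # settles inside the 0.5-0.667 window
```

The point is that the lo/hi window is just a pair of per-platform constants; offering a different window (or a 1080p/60 mode) is mostly a tuning and QA cost, which is presumably where the reluctance comes from.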



While the Xbox SX is the more powerful system, and historically I prefer Xbox to PlayStation (PS3 and PS4 would gain features way too late, and they often worked kinda weird: having you sync trophies, external HDD support, etc.), I'm gonna come to Sony's defense. Some people are bragging about how certain BC games load faster on Xbox than on PlayStation. I'm gonna say that, in many cases, the XBO version was a lower-quality game. A 900p game is gonna load faster than a 1080p game. I own an Xbox One S and an Xbox One X and I see it all the time. Those 4K enhanced games have enhanced load times!



mZuzek loves Smeags. 😢

Curious, does the Series X/S APU's GPU allow for GPGPU processing?



d21lewis said:

A 900p game is gonna load faster than a 1080p game. I own an Xbox One S and an Xbox One X and I see it all the time. Those 4K enhanced games have enhanced load times!

If you're suggesting the Xbox One X has longer load times, that's generally not been the case in my experience.

Games with superior assets, though, can take slightly longer to load. That's something 1st-party games tend to offer.




KratosLives said:
Curious, does the Series X/S APU's GPU allow for GPGPU processing?

Yes.
Every Xbox console that has ever been released can do that.



