
Forums - Microsoft Discussion - Series S vs Series X (resolution, performance, etc.)

Why would we have games with better performance on One X than on Series S? I thought the Series S was at least the same power as the One X.

Call of Duty: Cold War

One X: Dynamic 4K/60 fps

Series S: Dynamic 1440p/60 fps, no RT

Assassin's Creed: Valhalla

One X: Dynamic 4K/30 fps

Series S: Dynamic 1188p-1620p/30 fps

Gears 5

One X: Dynamic 4K/60 fps

Series S: Dynamic 720p-1440p/60 fps

Halo: MCC

One X: 4K/60 fps with dips

Series S: 1080p/60 fps

Also just in case you can find it, it would be nice to see if games with unlocked fps mode on One X reach 60fps on Series S, like the performance mode on Dark Souls 3, The Witcher 3 etc.
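For scale, here is a quick sketch of the raw pixel counts behind those ceiling figures. Both machines use dynamic resolution, so real frames often render below these numbers; treat this as a rough bound, not a measurement:

```python
# Ceiling resolutions from the list above; dynamic resolution means
# real frames often render below these, so this is only a rough bound.
def pixels(w, h):
    return w * h

one_x_4k = pixels(3840, 2160)        # One X ceiling: 8,294,400 px
series_s_1440p = pixels(2560, 1440)  # Series S in CoD: 3,686,400 px
series_s_1080p = pixels(1920, 1080)  # Series S in Halo: MCC: 2,073,600 px

print(f"1440p vs 4K: {series_s_1440p / one_x_4k:.0%} of the pixels")  # 44%
print(f"1080p vs 4K: {series_s_1080p / one_x_4k:.0%} of the pixels")  # 25%
```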



Mr Puggsly said:
d21lewis said:

While the Xbox SX is the more powerful system and, historically, I prefer Xbox to PlayStation (PS3 and PS4 would gain features way too late, and they often worked kinda weird: having you sync trophies, external HDD support, etc.), I'm gonna come to Sony's defense. Some people are bragging about how certain BC games load faster on Xbox than on PlayStation. I'm gonna say that, in many cases, the XBO version was a lower-quality game. A 900p game is gonna load faster than a 1080p game. I own an Xbox One S and an Xbox One X and I see it all the time. Those 4K-enhanced games have enhanced load times!

If you're suggesting the Xbox One X has longer load times, that's generally not the case in my experience.

Games with superior assets, though, can take slightly longer to load; that's something 1st-party games tend to offer.

That's what I was saying. The enhanced games have longer loading times than the non-enhanced ones. They tend to take up more HDD space, too. "Regular" games actually load a lot faster on Xbox One X than they do on the S; it's even considered a feature.

I guess the term "enhanced" is where the confusion is. Microsoft considers HDR, 4K, higher frame rates, and faster load times to be enhancements. And rightfully so. I was only talking about games like Gears 4 that get the 4K treatment.



mZuzek loves Smeags. 😢

EnricoPallazzo said:

Why would we have games with better performance on One X than on Series S? I thought the Series S was at least the same power as the One X.


Also just in case you can find it, it would be nice to see if games with unlocked fps mode on One X reach 60fps on Series S, like the performance mode on Dark Souls 3, The Witcher 3 etc.

I was always skeptical that the Series S is at par with the One X in GPU power. Some examples suggest it's at par, others are questionable. Maybe developers still need to learn the hardware?

A few things though. You have to remember the One X is running base One content at a higher resolution, while the Series S is running Series X content at a lower resolution. So they aren't running the exact same content per se. I think Gears Tactics and Yakuza show them being about equal on GPU.

However, I find it odd that a lot of the 1st-party content is running at 1080p. It seems to me that content should be doing 1440p. The significant advantage we are seeing for the Series S is its ability to achieve 60 fps more often.

The Series S runs BC games like a base One S. Therefore anything capped to 30 fps on One S is also capped to 30 fps on the Series S, unless MS uses their BC tech to unlock titles for 60 fps. Their only example so far was running Fallout 4 at 60 fps even though it's capped to 30 fps on all console platforms.

There are a lot of games the base One failed to maintain 60 fps in, though: Hitman, many CoD titles, id shooters, Halo: MCC, etc. Then you have games that ran like trash, such as Just Cause 3, well below 30 fps. That's the kind of stuff people who analyze frame rates will focus on.



Recently Completed
Rage 2
for X1X (4/5) - Rage for 360 (3/5) - Streets of Rage 4 for X1/PC (4/5) - Gears 5 for X1X (5/5) - Mortal Kombat 11 for X1X (5/5) - Doom 64 for N64 (emulator) (3/5) - Crackdown 3 for X1S/X1X (4/5) - Infinity Blade III - for iPad 4 (3/5) - Infinity Blade II - for iPad 4 (4/5) - Infinity Blade - for iPad 4 (4/5) - Wolfenstein: The Old Blood for X1 (3/5) - Assassin's Creed: Origins for X1 (3/5) - Uncharted: Lost Legacy for PS4 (4/5) - EA UFC 3 for X1 (4/5) - Doom for X1 (4/5) - Titanfall 2 for X1 (4/5) - Super Mario 3D World for Wii U (4/5) - South Park: The Stick of Truth for X1 BC (4/5) - Call of Duty: WWII for X1 (4/5) -Wolfenstein II for X1 - (4/5) - Dead or Alive: Dimensions for 3DS (4/5) - Marvel vs Capcom: Infinite for X1 (3/5) - Halo Wars 2 for X1/PC (4/5) - Halo Wars: DE for X1 (4/5) - Tekken 7 for X1 (4/5) - Injustice 2 for X1 (4/5) - Yakuza 5 for PS3 (3/5) - Battlefield 1 (Campaign) for X1 (3/5) - Assassin's Creed: Syndicate for X1 (4/5) - Call of Duty: Infinite Warfare for X1 (4/5) - Call of Duty: MW Remastered for X1 (4/5) - Donkey Kong Country Returns for 3DS (4/5) - Forza Horizon 3 for X1 (5/5)

Mr Puggsly said:

The Series S runs BC games like a base One S. Therefore anything capped to 30 fps on One S is also capped to 30 fps on the Series S, unless MS uses their BC tech to unlock titles for 60 fps. Their only example so far was running Fallout 4 at 60 fps even though it's capped to 30 fps on all console platforms.


Wait... so when you load a current-gen game on Series S, let's say Witcher 3, FFXV, or DS3, you DO NOT have the option to choose a resolution or performance mode like on One X? Those patches applied to One X games do not carry over to Series S, only to Series X? That's really bad.



DonFerrari said:
Pemalite said:

The Series S will possibly always draw the short straw on game optimizations, it's likely not going to be a massive developer priority unless it sells extremely well all generation long.

Honestly I would rather see developers aim for 900P-1080P on the Series S and just push for higher fidelity.

Well, MS expects the Series S to sell much better than the Series X. So that could actually mean that, in the end, the Series X usually won't outperform the PS5 version, while the Series S will be competent enough, because that is where devs/pubs will sell most of the SW on Xbox.

Well, the base consoles PS4 + Xbox One + Xbox One S sold much better than the PS4 Pro and Xbox One X.

Nevertheless the Xbox One X outperformed the PS4 Pro in almost every game.



EnricoPallazzo said:

Wait... so when you load a current-gen game on Series S, let's say Witcher 3, FFXV, or DS3, you DO NOT have the option to choose a resolution or performance mode like on One X? Those patches applied to One X games do not carry over to Series S, only to Series X? That's really bad.

Yes. At least until now, even if the Series S is on par with or actually stronger than the X1X, in BC mode it runs the X1 version of the games. And since most of those had capped frame rates and low capped resolutions, most of the BC content on Series S is quite underwhelming. Imho it should use the X1X versions when available.

Conina said:

Well, the base consoles PS4 + Xbox One + Xbox One S sold much better than the PS4 Pro and Xbox One X.

Nevertheless the Xbox One X outperformed the PS4 Pro in almost every game.

The X1X was like 40% stronger than the PS4 Pro, with probably fewer bottlenecks, and neither had many customizations. The Series X is 15-20% higher in TFLOPS, but for real-world performance we don't have a concrete number. Plus we have seen only the initial wave of games. So it isn't far from reality to expect that the Series X may not show a major gap over PS5 multiplat versions.
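For reference, the paper-spec gaps being compared here can be computed from the widely published peak GPU figures. TFLOPS alone is a crude proxy; clocks, memory bandwidth, and bottlenecks all matter, as the post above notes:

```python
# Published peak FP32 GPU compute (TFLOPS) for each console.
ps4_pro, one_x = 4.2, 6.0
ps5, series_x = 10.28, 12.15

print(f"One X over PS4 Pro: +{one_x / ps4_pro - 1.0:.0%}")  # ~+43%
print(f"Series X over PS5:  +{series_x / ps5 - 1.0:.0%}")   # ~+18%
```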



duduspace11 "Well, since we are estimating costs, Pokemon Red/Blue did cost Nintendo about $50m to make back in 1996"

http://gamrconnect.vgchartz.com/post.php?id=8808363

Mr Puggsly: "Hehe, I said good profit. You said big profit. Frankly, not losing money is what I meant by good. Don't get hung up on semantics"

http://gamrconnect.vgchartz.com/post.php?id=9008994

DonFerrari said:
Pemalite said:
AkimboCurly said:

So yes, empirically you're absolutely right that most were not the full 720p vs. 1080p. I didn't intend it to be read like that was the norm, only that in extremis you get a 50% resolution hit.

The reason I say it has to do with the RAM (and by extension the eSRAM) is because of the way it was partitioned up. I'm not a developer, obviously, but to my limited understanding, unless you can fit your render target into the 32MB buffer of extra-speedy RAM, you're forced to relegate it to the DDR3. This meant that, especially in multiplats, which seldom used the buffer, the Xbox One would construct its frame with DDR3 (which is supposed to be system RAM) while the PS4 was able to use its GDDR5. The bandwidth difference then becomes serious. The new Series S has a similar tiered memory architecture, where people suspect the slower 2GB will be used for the OS. But even the faster memory (8GB of GDDR6) has less bandwidth than the slowest-tier memory in the Series X. So to my mind, unless developers scale their rendering targets nicely, both for the GPU but ALSO for the memory bandwidth, the Series S will get the short end of the stick. In practice that means that cutting texture resolutions AS WELL as internal resolution is basically non-negotiable. Watch Dogs got it right and AC Valhalla got it wrong.
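As a back-of-the-envelope check on that 32MB figure, here is a sketch assuming a common RGBA8 colour target and a 32-bit depth/stencil buffer at 1080p (actual formats and buffer layouts vary by engine, so the exact numbers are illustrative):

```python
# Does a 1080p render-target set fit in the Xbox One's 32 MB eSRAM?
ESRAM_BYTES = 32 * 1024 * 1024

def target_bytes(w, h, bytes_per_pixel=4):
    return w * h * bytes_per_pixel

color = target_bytes(1920, 1080)   # RGBA8 colour buffer: ~7.9 MiB
depth = target_bytes(1920, 1080)   # D24S8 depth/stencil: ~7.9 MiB

forward = color + depth            # simple forward renderer: ~15.8 MiB
deferred = 4 * color + depth       # 4-target G-buffer:      ~39.6 MiB

print(forward <= ESRAM_BYTES)      # True: fits, with room to spare
print(deferred <= ESRAM_BYTES)     # False: must spill into DDR3 or tile
```

This is roughly why a fat deferred pipeline either spilled into DDR3 or had to be tiled through the buffer in passes.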

The eSRAM was basically up to developer choice.
Some developers could have used it as an additional level of cache for the CPU to bolster CPU performance.

Basically that 32MB was a massive limiter; it was necessary to leverage it to get the most out of the machine, but developers learned to work around it by taking a tiled approach over multiple passes to make the absolute most of it.

DDR3 and GDDR5 are both "system memories" and "graphics memories" in the 8th-gen consoles. DDR3 definitely has the latency advantage (remember, DRAM latency is a result of clock rate), which meant CPU tasks had an edge on the Xbox One (plus its clock-rate advantage), and GDDR5 had the bandwidth advantage, which meant graphics duties were simply superior on the PlayStation 4... Plus the PlayStation 4 just had the GPU compute to make up for the CPU deficiency. It all comes down to the developer and engine.

Sadly the Xbox One was GPU-limited more often than not, but when it wasn't and the CPU was the limitation, it definitely held a slight edge in gaming... A certain Assassin's Creed title comes to mind, when lots of actors were on-screen.

Things like alpha effects will be scaled back on the Series S, and resolution will be the first cutback, which will save on bandwidth/fillrate massively. Around 256GB/s of bandwidth is a good number for 1080p, and the Series S fits into that ballpark fairly well, especially when you start to account for delta colour compression, primitive shaders, and draw-stream binning rasterization.

The Series S will possibly always draw the short straw on game optimizations; it's likely not going to be a massive developer priority unless it sells extremely well all generation long.

AkimboCurly said:

Yeah absolutely. Cut internal resolution to save on GPU and cut texture resolution to save on bandwidth. Since CPU won't be a bottleneck I really hope this philosophy is embraced and that developers aren't under pressure to hit 1440p.

Off the top of my head, I remember Golf Club (the one which was supposed to be PGA Tour 15) was 720p on One and 1080p on PS4. Metal Gear Ground Zeroes was also 720p/1080p. You also had compromises like Tomb Raider: Definitive Edition running at 60 fps on PS4 and 30 fps on the One, which is in effect 50% fewer pixels temporally.
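That "fewer pixels temporally" point is just pixel throughput per second; a quick sketch:

```python
# Pixels delivered per second in each version of
# Tomb Raider: Definitive Edition (both ran at 1080p).
def pixels_per_second(w, h, fps):
    return w * h * fps

ps4 = pixels_per_second(1920, 1080, 60)   # 124,416,000 px/s
xbo = pixels_per_second(1920, 1080, 30)   #  62,208,000 px/s

print(xbo / ps4)  # 0.5, i.e. half the pixels over time
```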

Honestly I would rather see developers aim for 900P-1080P on the Series S and just push for higher fidelity.

Well, MS expects the Series S to sell much better than the Series X. So that could actually mean that, in the end, the Series X usually won't outperform the PS5 version, while the Series S will be competent enough, because that is where devs/pubs will sell most of the SW on Xbox.

I don't believe that for a second!

Just buying a memory card would make the Series S (about 350 GB usable storage) cost as much as a Series X! In fact, from what I've seen, there are fewer Series S consoles released into the market, period. I think the Series S is like the "Xbox 360 Core". Yeah, it's cheaper, but there are so many cuts that only the desperate will buy one.




d21lewis said:

I don't believe that for a second!

Just buying a memory card would make the Series S (about 350 GB usable storage) cost as much as a Series X! In fact, from what I've seen, there are fewer Series S consoles released into the market, period. I think the Series S is like the "Xbox 360 Core". Yeah, it's cheaper, but there are so many cuts that only the desperate will buy one.

I wouldn't buy an S; for sure the X is a much better option. But the S outselling the X was said by Phil Spencer himself.




DonFerrari said:

I wouldn't buy an S; for sure the X is a much better option. But the S outselling the X was said by Phil Spencer himself.

*Head explodes*




EnricoPallazzo said:

Wait... so when you load a current-gen game on Series S, let's say Witcher 3, FFXV, or DS3, you DO NOT have the option to choose a resolution or performance mode like on One X? Those patches applied to One X games do not carry over to Series S, only to Series X? That's really bad.

Yep, the Series S plays all Xbox One content like a base Xbox One. Any 60 fps modes exclusive to One X are not going to be on the Series S.

A few things though.

Some games will be patched to improve resolution and boost performance to 60 fps. We are already seeing that with various titles even when relying on BC. I do believe MS can do this simply with approval from the publisher, assuming boosting the frame rate doesn't break the game.

Dark Souls 3 never had a 60 fps patch for Xbox One X. It's 900p/30 fps regardless of which Xbox hardware you use. So MAYBE a simple patch could fix that game.

All OG Xbox content runs at 1440p, while select 360 titles achieve 1440p as well.

So it's wait and see. Hopefully MS will make an effort to improve many games for the Series S specs.


