
Microsoft are pitching the Series S as a lower-resolution version of the X. But with the X's GPU having roughly 3x the raw performance, there should be more differences than that, and I don't think MS are communicating them very well - maybe intentionally.
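Quick napkin maths on that 3x figure, just using the FP32 TFLOPS numbers Microsoft have published (Series X ~12.15, Series S ~4.0). Rough Python sketch, not a benchmark:

# Sanity check on the "about 3x" GPU gap using published FP32 figures.
series_x_tflops = 12.15
series_s_tflops = 4.0
print(f"Series X / Series S compute ratio: {series_x_tflops / series_s_tflops:.2f}x")
# -> roughly 3.04x raw shader throughput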

For instance, my TV is a bit rubbish considering it's only last year's model: it only has HDMI 1.4. That means with the Series X, if I force 4K output (with downsampling), I'm limited to 30fps - not an option. So I'd rather settle for 1080p at 60fps.
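For anyone wondering why HDMI 1.4 caps 4K at 30Hz, here's a rough sketch of the bandwidth sums (ignores blanking intervals and chroma subsampling, so treat the numbers as ballpark):

# Why HDMI 1.4 caps 4K at 30fps: the pixel data alone for 4K60 exceeds
# the ~8.16 Gbps of usable video bandwidth (10.2 Gbps link rate minus
# 8b/10b encoding overhead).
def video_bitrate_gbps(width, height, fps, bits_per_pixel=24):
    return width * height * fps * bits_per_pixel / 1e9

hdmi_14_usable_gbps = 8.16
for fps in (30, 60):
    needed = video_bitrate_gbps(3840, 2160, fps)
    verdict = "fits" if needed < hdmi_14_usable_gbps else "doesn't fit"
    print(f"4K @ {fps}fps needs ~{needed:.1f} Gbps for the pixels alone -> {verdict}")
# 4K30 squeezes in; 4K60 needs HDMI 2.0's ~18 Gbps link.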

However... it's not just frame rate or resolution that will be affected. Polygon counts, texture detail, the amount of ray tracing and the number of effects should all be higher on the X than on the S. In other words, even if you only have a 1080p TV - whether games supersample down to it or not - you're still better off with the X, because all that extra GPU grunt is doing so much more than just resolution and frame rate improvements. Unless I'm missing something?