Otherwise, in pure compute scenarios the Xbox Series X not only has the brute-strength advantage but also some efficiency advantages, such as Variable Rate Shading, which give it the edge for things like global illumination lighting and shading.
You are assuming, like DF did a while ago, that the PS5 doesn't have Variable Rate Shading or its own similar solution, which could be worse, just as good, or even better than DX12U's VRS. It would be interesting if you could list every feature the PS5 does and doesn't support in comparison to the Series X. But I don't think people should assume the PS5 lacks certain features just because Sony isn't openly bragging about them.
Even if the PlayStation 5 doesn't have Variable Rate Shading baked into hardware, developers can implement it in software with their own algorithms; I just don't see it happening, and it would come with overhead.
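To make the idea of a software VRS fallback concrete, here is a minimal sketch of the core decision it would have to make: rate each screen tile by how much detail it contains and shade low-contrast tiles more coarsely. The tile size, contrast thresholds, and rate labels are all invented for illustration; no console or API works exactly like this, and a real implementation would run on the GPU, which is where the overhead mentioned above comes in.

```python
# Hypothetical software shading-rate selector (illustrative only).
# Coarse rates like "4x4" mean one shaded sample covers a 4x4 pixel block.

def pick_shading_rate(tile):
    """Return a shading rate for one tile of luminance values (0-255)."""
    contrast = max(tile) - min(tile)
    if contrast < 8:        # nearly flat (e.g. sky): shade very coarsely
        return "4x4"
    elif contrast < 32:     # low detail: half-rate shading
        return "2x2"
    return "1x1"            # high detail: full-rate shading

def shading_rate_map(luma, width, tile=16):
    """Split a flat luminance buffer into tiles and rate each one."""
    rows = [luma[i:i + width] for i in range(0, len(luma), width)]
    rates = []
    for ty in range(0, len(rows), tile):
        row_rates = []
        for tx in range(0, width, tile):
            block = [v for r in rows[ty:ty + tile] for v in r[tx:tx + tile]]
            row_rates.append(pick_shading_rate(block))
        rates.append(row_rates)
    return rates

# A flat region gets coarse shading; a busy region keeps full rate.
flat = [100] * (16 * 16)
busy = [(x * 37) % 256 for x in range(16 * 16)]
print(pick_shading_rate(flat))  # -> 4x4
print(pick_shading_rate(busy))  # -> 1x1
```

The point of the sketch is the trade-off: the rate map itself costs time to compute every frame, which hardware VRS gets essentially for free.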
But I can only go by the features Microsoft and Sony have championed. If Sony or Microsoft hasn't advertised a certain *important* feature by now, can we really assume they have it baked into hardware?
They are constantly trying to one-up each other in the console stakes... Sony has rightfully talked up the SSD, and Microsoft has talked up its RDNA 2 advantages.
Digital Foundry, though, I wouldn't disregard so readily; they tend to be right more often than not.
I don't think it's a software/tools issue at all that's causing the performance difference. We need to remember Eurogamer's preview: the Gears 5 devs got GeForce RTX 2080 performance out of the XSX after just two weeks of optimization.
And Gears 5 doesn't favor AMD over Nvidia:
On average, the Radeon RX 5700 XT is around 5% faster than the GeForce RTX 2070 than it is in other games. I'm pretty certain I know what is going on, but I'm going to keep it a secret, and it's just speculation anyway. If I'm right, tools won't help the XSX, but it's not all bad for Xbox fans: RTX performance between the consoles should be very close to each other.
The GeForce RTX 2080 isn't that impressive.
Keep in mind that it is 2018 hardware and we are almost in 2021.
Interested to know your findings on "what is going on".
If it's anything like the teraflops list, where you pretty much outlined every single variable so you could claim you were "right"... well, I don't need to remind you how fallacious that line of thinking is.
At least the S lets someone put his/her game in rest mode without it crashing.
Btw, your sig is insane. Not positive insane, but insane insane. Assuming it's not a joke, you do know Mario is the foundation of basically every modern platformer, right?
It does go down to around 576p to run at 100-120fps, though.
Xbox Series S:
It is soooo blurry compared to the PS5 or XSX, it's nuts.
If I were a dev, I would have just dropped the 120fps mode on the Series S.
That's too big a drop in image quality, IMO, for a mode few TVs support.
Who has a brand-new, expensive 120fps TV... but buys a Series S anyway? It's a bad look.
I think it's fine. You don't have to use the 120fps mode, you can turn it off.
Eventually the Xbox Series S/X will get a hardware refresh, and games like this will potentially run at a higher resolution naturally, without the need for a "remaster".
It also makes comparisons fun.
576p is a dog's breakfast, though... but it would probably look good on a CRT at those refresh rates.
@SvennoJ Not sure why we have screen tearing; it's shameful considering we've had vsync for decades now. The narrative always changes: last gen it was 1080p, midway through it was 4K, and now it's 60fps. Whichever console does what better, the story always changes.
I tend to avoid vsync, so I am glad it's not there.
It introduces input lag... and if your framerate drops enough, it can take you from, say, 45fps straight down to 30, which can look jarring.
It's great if you are just hovering at around 33fps and want it locked at 30.
My displays have freesync, so it's a non-issue.
I will personally opt for a little bit of screen tearing over vsync, but that's just me personally.
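The "45fps drops instantly to 30" behavior mentioned above falls out of how double-buffered vsync works: a frame that misses the refresh deadline waits for the next vblank, so the effective framerate snaps to divisors of the refresh rate. A tiny sketch of that arithmetic, with illustrative timings (not measured from any console):

```python
import math

REFRESH_HZ = 60
REFRESH_MS = 1000 / REFRESH_HZ  # ~16.7 ms between vblanks on a 60 Hz display

def vsynced_fps(render_ms):
    """Effective framerate when every frame waits for the next vblank."""
    intervals = math.ceil(render_ms / REFRESH_MS)  # vblanks consumed per frame
    return REFRESH_HZ / intervals

print(vsynced_fps(15))    # fits in one interval -> 60.0 fps
print(vsynced_fps(22.2))  # 45 fps worth of work -> snaps down to 30.0 fps
print(vsynced_fps(34))    # ~30 fps of work that slips -> 20.0 fps
```

This is also why freesync/VRR makes it a non-issue: the display refreshes when the frame is ready, so a 45 fps frame presents at 45 fps instead of snapping to 30 or tearing.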