
By the way, where is this magical Microsoft DLSS equivalent? Not one peep out of them about it regarding Xbox Series X, where they've discussed every other hardware feature ad nauseam.

The Xbox division has never mentioned this tech at all, which is curious. MS did have a presentation on it, but wouldn't you know it, the GPU they used for that demo was Nvidia hardware, not AMD.

If Series X can do that, why not use it for that Minecraft ray tracing demo that tanked the hardware's performance down to 1080p? Surely you could render at 720p native and then scale up to an even better-looking 1440p if the chip were capable of it? You'd get better image quality while actually taxing the system far less, so it kinda raises the question of where exactly that super duper ML tech is.

Because we know that for Nvidia it's here and it's now, no hype, no fuss; there are games using it that you can play right now.

So one has to ask exactly why they're not using it. My guess is that in the PC ML demo they showed, they were using Nvidia's Tensor cores to help achieve that effect. You would think that Sony especially, if something like that were possible with the AMD GPU they're using, would be shouting about it from the rooftops.

How often, exactly, does a major, major hardware feature that dramatically impacts performance go completely unmentioned just 3 months prior to a hardware launch?
