Soundwave said:

That's great and all, but I would say again ... where is it? They've shown really nothing of this on any actual Xbox, which raises a lot of questions. 

I'm guessing their hardware-agnostic implementation of this has drawbacks in performance cost. Otherwise they would be crowing about it from every rooftop. 

Especially if Sony doesn't have an equivalent. Nvidia's DLSS 2.0 is not some smoke-and-mirrors PR buzz; you can run it in actual games right now. Series X is three months from launch and Microsoft has little to nothing to say about a DLSS-like implementation. That is pretty hard to believe given the performance implications something like that has. 

Unless of course it doesn't work as well in real world scenarios (or at least relevant to the AMD hardware) as MS has been saying. 

Of course it has performance costs.
It is basically doing the same thing as nVidia's approach, but instead of doing the processing on the tensor cores, it is doing it in the shaders by leveraging rapid packed math. It's hardware agnostic, remember; it can even be done on Intel integrated graphics.

But it also has performance benefits, because you're rendering at a lower resolution in the first place.
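To put rough numbers on that trade-off: if shading cost scales roughly with pixel count (a simplification), rendering at 1440p plus a fixed upscaling cost can come in well under native 4K. A minimal sketch; every figure below is an illustrative assumption, not measured data:

```python
# Hedged sketch of why ML upscaling can be a net win despite its cost.
# Assumption: raw shading cost scales linearly with pixel count.

def pixels(w, h):
    return w * h

native_4k_ms = 16.0                               # assumed cost to shade a native 4K frame
ratio = pixels(2560, 1440) / pixels(3840, 2160)   # 1440p has 4/9 the pixels of 4K
low_res_ms = native_4k_ms * ratio                 # ~7.1 ms under the scaling assumption
upscale_ms = 5.0                                  # assumed fixed cost of the upscale pass

total = low_res_ms + upscale_ms
print(f"1440p render + upscale: {total:.1f} ms vs native 4K: {native_4k_ms:.1f} ms")
```

Even with a fairly heavy upscale pass, the combined cost stays well below the native-4K figure, which is the whole argument for the approach.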

But yes, it has been demonstrated, they even got nVidia onboard.
https://devblogs.microsoft.com/directx/wp-content/uploads/sites/42/2018/03/WinML_acceleration_GDC.pdf



Either way... I am finding it a bit concerning that I have provided all this evidence and information for this feature and you still seem to be downplaying it substantially.
A.I is just not as important a marketing "buzzword" as the real next-gen features, i.e. hardware ray tracing or SSDs... These days in 2020 the A.I buzzword is used everywhere, even with the camera sensors in modern smartphones. - It's just expected at this point.

But unlike your prior statement... Microsoft has certainly "said" a shit ton about their A.I upscaling... The evidence is in all the links I have provided.

Captain_Yuri said:

Well there is a slight catch as of right now based on Xbox Series X RDNA2 specifications.

Tensor cores are very specialized cores that accelerate INT8 operations (measured in TOPS), which is what DLSS uses according to Digital Foundry. The thing is, the Series X, PS5 and probably the Series S will have cores that can also do them, just not as fast as Nvidia's Tensor cores. Now due to architectural differences, software differences and so on, maybe RDNA 2 might not need them to be as fast, so who knows. I'm not that technical on how DLSS works at a low level loll.

The point is though, with what we know now, if we were to port DLSS over to the Series X, it would take roughly twice as long to run compared to a 2060.

<SNIP>

And there's a bit of an interesting article here too about it:

https://www.eurogamer.net/articles/digitalfoundry-2020-image-reconstruction-death-stranding-face-off

"There's an important point of differentiation between Nvidia's hardware and AMD's, however. The green team is deeply invested in AI acceleration across its entire business and it's investing significantly in die-space on the processor for dedicated AI tasks. AMD has not shared its plans for machine learning support with RDNA 2, and there is some confusion about its implementation in the next-gen consoles. Microsoft has confirmed support for accelerated INT4/INT8 processing for Xbox Series X (for the record, DLSS uses INT8) but Sony has not confirmed ML support for PlayStation 5 nor a clutch of other RDNA 2 features that are present for the next generation Xbox and in PC via DirectX 12 Ultimate support on upcoming AMD products.

Broadly speaking then, the Xbox Series X GPU has around 50 per cent of the RTX 2060's machine learning processing power. A notional DLSS port would see AI upscaling take 5ms to complete, rather than a 2060's circa 2.5ms. That's heavy, but still nowhere near as expensive as generating a full 4K image - and that's assuming that Microsoft isn't working on its own machine learning upscaling solution better suited to console development (spoilers: it is - or at least it was a few years back). In the meantime though, DLSS is the most exciting tech of its type - we're sure to see the technology evolve and for Nvidia to leverage a key hardware/software advantage. The only barrier I can see is its status as a proprietary technology requiring bespoke integration. DLSS only works as long as developers add it to their games, after all.

As exciting as the prospects for machine learning upscaling are, I also expect to see continued development of existing non-ML reconstruction techniques for the next-gen machines - Insomniac's temporal injection technique (as seen in Ratchet and Clank and Marvel's Spider-Man) is tremendous and I'm fascinated to see how this could evolve given access to the PS5's additional horsepower."

With that being said though, there are a lot more months for both Sony, MS and AMD to reveal more and more about their GPUs and other features so things could change.
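The 5ms-vs-2.5ms figures quoted above are easy to sanity-check against a frame budget. A quick back-of-the-envelope calculation, using only the article's estimates (the function name and fps targets are my own illustration):

```python
# Hedged sanity check of the Eurogamer estimates quoted above:
# Series X ~50% of an RTX 2060's ML throughput, so a notional DLSS
# pass takes ~5 ms instead of ~2.5 ms. How much frame time remains?

def remaining_budget(fps, upscale_ms):
    frame_ms = 1000.0 / fps          # total budget per frame
    return frame_ms - upscale_ms     # time left for everything else

for fps, upscale_ms in [(60, 2.5), (60, 5.0), (30, 5.0)]:
    left = remaining_budget(fps, upscale_ms)
    print(f"{fps} fps, {upscale_ms} ms upscale -> {left:.1f} ms left for rendering")
```

At 60fps a 5ms pass eats nearly a third of the 16.7ms budget, which is why the article calls it "heavy" - but at 30fps it's a much smaller slice, and either way it's cheaper than shading a full native 4K frame.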

Even if the Series X and Series S don't have dedicated INT cores, AMD's hardware can do it natively on the shader pipelines anyway.
RDNA natively supports INT8 operations in its shaders (but not via RPM, AFAIK) and will pack two INT16 ops into an INT32... And there is the possibility that RDNA 2 takes that a step further and includes INT4.
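As a toy illustration of what "packing two INT16 ops into an INT32" means, here's a CPU-side simulation. This is purely illustrative - on real hardware the packed operation is a single instruction, and all the helper names here are mine:

```python
# Hedged sketch: simulate packed math, where two independent 16-bit
# values share one 32-bit register lane and a single "instruction"
# operates on both halves at once.

def pack_int16(hi, lo):
    """Pack two unsigned 16-bit values into one 32-bit word."""
    return ((hi & 0xFFFF) << 16) | (lo & 0xFFFF)

def packed_add(a, b):
    """Add two packed words lane-wise; each 16-bit lane wraps independently."""
    lo = ((a & 0xFFFF) + (b & 0xFFFF)) & 0xFFFF
    hi = (((a >> 16) & 0xFFFF) + ((b >> 16) & 0xFFFF)) & 0xFFFF
    return (hi << 16) | lo

def unpack_int16(w):
    """Split a 32-bit word back into its (hi, lo) 16-bit lanes."""
    return ((w >> 16) & 0xFFFF, w & 0xFFFF)

a = pack_int16(1000, 2000)
b = pack_int16(30, 40)
print(unpack_int16(packed_add(a, b)))  # (1030, 2040)
```

One "add" produced two results, which is the whole appeal: lower-precision ML workloads like upscaling get double (INT16) or quadruple (INT8) the throughput out of the same 32-bit ALUs.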

Temporal re-projection is likely to be a big tool next-gen, DLSS isn't a magic bullet that all developers are clamoring for.

freebs2 said:

It's not really Nintendo that should push the boundary of graphics. It's more a question of Amd vs Nvidia technology. I believe a next-gen Switch will have its own advantages compared even to next gen PS and Xbox, just for the eventual use of latest AI technologies developed by Nvidia. But that said, Switch 2 (if it retains the current form factor) will still be a low power / low spec machine so I agree with you, people should not get their hopes too high on the graphical front. 

Nintendo haven't had the most graphically advanced console since the Nintendo 64 back in the 5th console generation.

I think at this point it's expected that Nintendo isn't chasing the best graphics in the industry... But at the end of the day, it doesn't really matter as long as the visuals are "good enough" and the games play amazing.

Last edited by Pemalite - on 11 August 2020


www.youtube.com/@Pemalite