HoloDust said:
A78AEx8 at Switch 2 CPU clocks gives 493/2735 single/multi-core in Geekbench 6. PS4 does 197/990. Switch 2 CPU is not that great, compared to what PC/Android handhelds pull off, let alone PS5, but it's quite solid and much, much better than PS4. Not that it matters much, DLSS is GPU based, not CPU - CPU certainly influences overall performance, just not the DLSS part of it.
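For context, those quoted scores work out to roughly a 2.5x single-core and 2.8x multi-core advantage over the PS4. A quick check, using only the numbers quoted above:

```python
# Ratio check using only the Geekbench 6 scores quoted above.
switch2 = {"single": 493, "multi": 2735}  # A78AE x8 at Switch 2 clocks
ps4 = {"single": 197, "multi": 990}       # PS4

for kind in ("single", "multi"):
    print(f"{kind}-core advantage: {switch2[kind] / ps4[kind]:.2f}x")
# single-core advantage: 2.50x
# multi-core advantage: 2.76x
```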
That said, I didn't want to use Geekbench 6, as it also factors in some GPU functionality; I wanted to compare CPU performance in isolation from the GPU to get a fair picture of the actual CPUs.

As for DLSS: while the work is primarily done by the GPU, a lot of work is still done by the CPU. So CPU performance, and especially the low CPU performance of the Switch 2, will surely have an impact, particularly since the older RTX 2050 seems to need more CPU resources for DLSS than the RTX 4050 does. This looks better optimised on later cards, and the Switch 2 is of the same generation as the RTX 2050, just a much lower-power version. You can also see in PC benchmarks that CPU power affects DLSS frame rates: DLSS performance scales with CPU performance.

DLSS doesn't take the burden of upscaling off the CPU. If it did, you could upscale 360p to 1080p with a much inferior CPU and still get the same frame rates. In reality it takes the burden off the GPU, which can now upscale to 4K with decent frame rates and image quality it could never produce natively; it enables the GPU to punch well above its normal frame rates for that output resolution. The burden on the CPU, however, is greater. Many PCs have more than enough CPU power, so it's not an issue there, but the Switch 2 doesn't, so a lot of the time it's surely going to be a matter of optimising CPU performance.
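To make that concrete, here's a minimal toy model of the frame-time reasoning; every millisecond figure is a made-up placeholder, not a measurement. The assumption is that DLSS cuts per-frame GPU time (fewer pixels to shade, plus a fixed upscale cost) while per-frame CPU time stays the same, so once GPU cost falls below CPU cost, the CPU sets the frame-rate ceiling:

```python
# Toy frame-time model; every millisecond figure is a made-up placeholder.
# Assumption: DLSS cuts per-frame GPU time (fewer pixels to shade, plus a
# fixed upscale cost) but leaves per-frame CPU time untouched.

def fps(cpu_ms: float, gpu_ms: float) -> float:
    """Frame rate is capped by whichever side takes longer per frame."""
    return 1000.0 / max(cpu_ms, gpu_ms)

cpu_ms = 12.0                     # hypothetical CPU cost per frame
native_gpu_ms = 25.0              # hypothetical GPU cost at native 4K
dlss_gpu_ms = 25.0 * 0.35 + 2.0   # ~1/3 of the pixels, plus upscale overhead

print(f"native 4K:        {fps(cpu_ms, native_gpu_ms):5.1f} fps (GPU-bound)")
print(f"DLSS to 4K:       {fps(cpu_ms, dlss_gpu_ms):5.1f} fps (now CPU-bound)")
print(f"DLSS, weaker CPU: {fps(18.0, dlss_gpu_ms):5.1f} fps (CPU sets the ceiling)")
```

The last line is the Switch 2 concern in miniature: DLSS frees the GPU, but a slower CPU still caps the result.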
There was a comment about Hogwarts Legacy having poor DLSS upscaling compared to some other titles, and that could be related to more limited CPU resources in that game reducing the quality of DLSS. We'll get a much more accurate picture in a few days, once people start analysing games on retail hardware.

The fact that Nintendo has released information about optimising the operating system, and about trying to get background tasks onto one CPU core rather than two, makes me think people are likely to be disappointed in the performance initially, and that Nintendo is trying to limit the damage by saying that, like the PS4, it will eventually be better optimised, freeing up more performance for games. We shall see.

If I've remembered rightly, FSR 3 and the like on AMD chipsets take far fewer GPU and CPU resources to upscale, but with much inferior results, while XeSS takes a lot of CPU resources but gives much better results. Surely the fact that XeSS operates on both AMD and Nvidia chipsets shows it's more CPU-bound, with graphics architecture being less of an issue for that upscaling technology.
Yes, while DLSS (Deep Learning Super Sampling) primarily relies on the GPU's Tensor Cores for its AI-powered upscaling, there's a CPU element involved as well, especially with newer features like DLSS Frame Generation.
Elaboration:
Tensor Cores (GPU):
DLSS's core functionality (the AI upscaling and frame generation) is handled by the GPU's Tensor Cores, which are present on the RTX 20, RTX 30, RTX 40, RTX 50, and Quadro RTX series.
CPU's Role (Rendering, Pre-processing):
The CPU is responsible for driving the rendering of the game world and preparing the data DLSS consumes (see the sketch after this list).
For DLSS Frame Generation, the CPU needs to efficiently manage the rendering of the initial frames, as the generated frames rely on that base.
A powerful CPU can keep the GPU fed with work, potentially leading to better performance, especially when DLSS is enabled.
DLSS and Frame Generation:
DLSS Frame Generation, which is available on RTX 40 and RTX 50 series GPUs, boosts frame rates by using AI to generate new frames.
This requires the CPU to efficiently handle the initial frame rendering and provide the necessary input for the AI.
A CPU bottleneck could potentially limit the benefits of DLSS Frame Generation, since the generated frames are built from base frames whose delivery the CPU has to keep up with.
Impact of CPU on DLSS:
In some cases, a weak CPU can bottleneck DLSS performance, especially in CPU-intensive games or at more aggressive DLSS modes, where the lower internal resolution makes the game more likely to be CPU-bound.
While the GPU handles the AI-powered upscaling and frame generation, the CPU's performance can impact the overall experience.
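To make the division of labour above concrete, here's a hedged sketch of a per-frame loop; every class and method name is an illustrative placeholder, not the real NVIDIA NGX/DLSS SDK API:

```python
# Hypothetical per-frame loop; all names are illustrative placeholders,
# NOT real engine or NVIDIA NGX/DLSS SDK calls.

class Game:
    frame = 0
    def update_simulation(self):         # CPU: game logic, physics, animation
        self.frame += 1
    def build_draw_calls(self):          # CPU: command submission
        return [f"draw_{self.frame}"]
    def next_jitter_offset(self):        # CPU: sub-pixel jitter DLSS needs
        return (0.25 * (self.frame % 4), 0.25 * ((self.frame // 2) % 4))

class Gpu:
    def render(self, draws, resolution, jitter):
        # GPU: shade at a low internal resolution (the cheap part)
        return {"color": resolution, "jitter": jitter}
    def dlss_upscale(self, frame, motion_vectors, depth, target):
        # GPU: Tensor Core upscale to the output resolution
        return {"color": target}
    def present(self, frame):
        print("presented at", frame["color"])

game, gpu = Game(), Gpu()
for _ in range(2):
    # --- CPU side: simulation, plus preparing everything DLSS consumes ---
    game.update_simulation()
    draws = game.build_draw_calls()
    jitter = game.next_jitter_offset()
    # --- GPU side: render at low internal res, then Tensor Core upscale ---
    low = gpu.render(draws, resolution=(1280, 720), jitter=jitter)
    out = gpu.dlss_upscale(low, motion_vectors=None, depth=None,
                           target=(3840, 2160))
    gpu.present(out)
```

The point of the sketch is that everything above the GPU section still has to happen every frame at full frame rate, which is why a weak CPU can hold DLSS back even though the upscale itself runs on the Tensor Cores.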