HoloDust said:
Not sure if you watched the video, it was very noticeable, and some days ago DF did an episode on Hogwarts where they've discussed that very scene - standard DLSS 3/4 just doesn't look that bad in 1440p in any setting (and that is resolution Hogwarts is supposedly outputting, in docked mode), so it is very weird that it does on SW2 - as if it has reduced precision or a shorter accumulation window for temporal data, which is why disocclusion artifacts are pronounced so heavily. So maybe there's actual SW2 Ultra Lite DLSS after all, that cuts back on some stuff to be much lighter to run, but in return can't handle some problems as well as standard DLSS. (again, do watch DF video, starting at around 13:15, at full screen preferably).
I personally don't think the Switch 2 has the CPU resources that many are claiming. An 8-core A78C CPU at 1 GHz scores a PassMark of about 1900, although from what has been claimed the Switch 2 doesn't have the 8 MB of cache that a full implementation would have, only 4 MB, so that would reduce performance a bit. The PS4 scores a PassMark of around 1300 for its eight Jaguar cores.

However, the Switch 2 spends more of its CPU performance on system features like GameChat; in fact two of its cores are reserved. I'm assuming the figures being quoted are the short-lived bursts to 1.7 GHz that the Switch 2 is capable of, but that isn't sustained performance, whereas the Jaguar cores run at 1.6 GHz each as their base clock. The PS4 uses one of its cores for the operating system etc, which admittedly came later with a revision to the operating system; originally it was two, like the Switch 2. Being a portable system, the Switch 2 will always be trying to reduce power and lower clocks, whereas when docked that isn't such an issue, it's more thermal management.
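To put those claims side by side, here's a rough back-of-envelope sketch scaling each whole-chip PassMark figure by the fraction of cores left for games. The scores and reserved-core counts are the ones claimed above, not confirmed specs, and assuming PassMark scales roughly linearly with core count:

```python
def game_available(score, total_cores, reserved_cores):
    """Scale a whole-chip PassMark score by the fraction of cores left for games."""
    return score * (total_cores - reserved_cores) / total_cores

# Claims above: Switch 2 reserves 2 of 8 cores; current PS4 firmware reserves 1 of 8.
print(game_available(1900, 8, 2))  # 1425.0
print(game_available(1300, 8, 1))  # 1137.5
```

On those (unverified) numbers, the game-available gap is narrower than the headline whole-chip scores suggest.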
I see claims that the Switch 2 is super powerful in CPU terms, but I really can't see it myself. The Arm Cortex-A78C is an old CPU from the same era as the graphics architecture, and the 'C' mainly denotes enhanced security features, which obviously Nintendo would want. How those enhanced security features affect performance I don't know.
There is a CPU element to upscaling as well as a GPU one, and it's likely some games will reduce the upscaling quality level to cut CPU load, which may be what Hogwarts Legacy is doing.
Incredibly, development kits for the Switch 2 have been with developers since 2019/2020. When Nintendo delayed launching their updated Switch because the Switch 1 was still selling incredibly well, I guess active development would have been paused, but developers have been perfecting their work on the T239 chipset for a long time. So I don't think the Switch 2 is necessarily going to achieve much greater optimisation over the years; to developers, who have had it a very long time, it is old technology.
I guess Nintendo were morally obliged to stick with the T239 to a degree, given all the development work that would already have been done on it.
Just for comparison, an overclocked modded Mariko Switch 1 with its 4 cores running at 2 GHz gets a PassMark score of about 1200, whereas a stock Switch 1 is around 600. So, apart from the stock CPU frequencies of the original Switch, these systems are really not that far apart from each other.
I used a much later ARM A78 chip for comparison, as I couldn't find PassMark information for the older A78C, but they should be comparable, being on the same architecture. I can't imagine the later chip being any more than 5% faster, if that. Obviously you need to adjust the performance down to the much lower clock of the Switch 2, which is about 1 GHz.
https://www.cpubenchmark.net/cpu.php?cpu=ARM+Cortex-A78AE+8+Core+1984+MHz&id=6298
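The clock adjustment above can be sketched as a simple linear rescale. This is a rough upper bound, since real scaling is sub-linear (memory latency doesn't improve with CPU clock), and the 3800 figure is purely illustrative, chosen to be consistent with the ~1900-at-1-GHz estimate above rather than read from the linked page:

```python
def scale_score(score, from_mhz, to_mhz):
    """Linearly rescale a benchmark score to a different clock speed.

    Rough upper bound: real-world scaling is sub-linear because memory
    latency and bandwidth don't scale with CPU frequency.
    """
    return score * to_mhz / from_mhz

# Illustrative only: if the A78AE 8-core result were ~3800 at 1984 MHz
# (hypothetical figure), scaling down to the Switch 2's ~1 GHz gives:
print(round(scale_score(3800, 1984, 1000)))  # 1915
```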
The AMD Jaguar 1.6GHz 8-core CPU used in the PlayStation 4 and Xbox One has a PassMark rating around 1200-1400. It's considered a low-power, efficient processor, but not as powerful as modern CPUs like those in the Ryzen series or the i7-4790K.