
Forums - Nintendo Discussion - Switch 2 motherboard maybe leaked

Kyuu said:

Seems like it'll be a lot weaker than I hoped (PS4 Pro) even though it's launching a year+ later than I expected. Perhaps modern features and optimization will push it close enough. Hopefully the base console won't cost more than $400.

It's hard to say, given how old GCN 2.0 is and how rarely there are direct comparisons between these architectures (controlling for driver optimizations) to give us a good idea of how they compare. That said, one Ampere TFLOP seems to correspond to roughly 1.1-1.3 GCN 2.0 TFLOPs when estimating rasterization performance (not including ray-tracing, neural rendering, etc., of course). The PS4 Pro is capable of about 4.2 TFLOPs, which works out to about 3.23-3.82 Ampere-equivalent TFLOPs once you adjust for rasterization performance per TFLOP. That puts the Switch 2 at around 80-96% of the PS4 Pro's raw theoretical performance before any bottlenecks, depending on which ratio you use. 
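
As a sanity check, that conversion arithmetic can be sketched in a few lines. The 4.2 TFLOPs figure and the 1.1-1.3 ratio come from this post; the ~3.1 TFLOPs docked figure for the Switch 2 is my own placeholder assumption, plugged in only to reproduce the 80-96% range:

```python
# Back-of-envelope GCN 2.0 -> Ampere TFLOP comparison.
# ASSUMPTION: switch2_tflops = 3.1 (docked, Ampere) is a placeholder,
# not a confirmed spec; it is only here to reproduce the 80-96% range.

ps4_pro_tflops = 4.2               # GCN 2.0 (stated above)
ratio_low, ratio_high = 1.1, 1.3   # GCN 2.0 TFLOPs per Ampere TFLOP (rough)
switch2_tflops = 3.1               # assumed docked Ampere TFLOPs

# Express the PS4 Pro in Ampere-equivalent TFLOPs
equiv_low = ps4_pro_tflops / ratio_high    # ~3.23
equiv_high = ps4_pro_tflops / ratio_low    # ~3.82

low_pct = switch2_tflops / equiv_high      # ~81%
high_pct = switch2_tflops / equiv_low      # ~96%
print(f"PS4 Pro ~ {equiv_low:.2f}-{equiv_high:.2f} Ampere-equiv TFLOPs")
print(f"Switch 2 ~ {low_pct:.0%}-{high_pct:.0%} of PS4 Pro, raw")
```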

With DLSS, it shouldn't be too hard for Switch 2 to match or even exceed PS4 Pro level graphics when docked, especially given that it has a better CPU (even with the heavy under-clock) and more available memory. 

This is a rough comparison, but the thing to take away is that in terms of raw performance they're roughly in the same class.

And of course when it comes to modern features (like ray-tracing and neural rendering) the Switch 2 will be able to do things the PS4 Pro couldn't. 



Norion said:
haxxiy said:

CPU will be the bigger challenge for developers 100%.

PS4/XBO ports should be fine, but anything from this gen that isn't already running extremely well on the console CPUs would face huge issues.

How exactly would this compare to the CPU in the PS4/XBO? I assume it'd be more capable but it seems it wouldn't be by much based on what you're saying.

To put things in perspective, a Cortex-A78 cluster (4 cores) gets a Geekbench 6 score of about 1121 single-core and 3016 multi-core at 2 GHz.

An FX-8120 (a desktop CPU with an architecture similar to the 8th-gen consoles') gets about 413 single-core and 1800 multi-core at 3.1 GHz. 

The 8th-generation consoles range from 1.6 GHz to 2.3 GHz, so they're running at much lower clocks than the FX-8120. Jaguar had higher IPC than Bulldozer, but only by about 20-30%. Not enough to make up the difference. 

We're looking at an IPC for the Switch 2's CPU nearly double that of the Jaguar CPUs. 

The A78C also has an advantage over the base A78, in that all cores are on a single cluster and are homogeneous. 

Even at only 1 GHz the Switch 2's CPU should outclass the PS4/Pro/XBO/XBO:S/XBO:X pretty easily. The IPC difference between modern-ish ARM and Jaguar is just too large. 

There is also the matter that game engines are much more efficient at multi-threading loads now than they were during the 8th generation. 
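
The per-clock comparison in this post can be made explicit with a quick sketch. All scores and clocks are the ones quoted above; the 20-30% Jaguar-over-Bulldozer IPC figure is applied at its 25% midpoint. Note that Geekbench flatters ARM somewhat, so the raw ratio comes out well above a conservative "nearly double":

```python
# Per-clock (IPC proxy) comparison from Geekbench 6 single-core scores.
a78_score, a78_clock = 1121, 2.0   # Cortex-A78 @ 2 GHz (quoted above)
fx_score, fx_clock = 413, 3.1      # FX-8120 (Bulldozer) @ 3.1 GHz

a78_per_ghz = a78_score / a78_clock        # ~560 points/GHz
bulldozer_per_ghz = fx_score / fx_clock    # ~133 points/GHz

# Jaguar's IPC is ~20-30% above Bulldozer's; take the 25% midpoint.
jaguar_per_ghz = bulldozer_per_ghz * 1.25  # ~167 points/GHz

print(f"A78:    {a78_per_ghz:.0f} pts/GHz")
print(f"Jaguar: {jaguar_per_ghz:.0f} pts/GHz (estimated)")
print(f"Ratio:  ~{a78_per_ghz / jaguar_per_ghz:.1f}x per clock")
```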

Last edited by sc94597 - on 14 January 2025

sc94597 said:
Norion said:

How exactly would this compare to the CPU in the PS4/XBO? I assume it'd be more capable but it seems it wouldn't be by much based on what you're saying.

To put things in perspective, a Cortex-A78 gets a Geekbench 6 score of about 1121 single-core and 3016 multi-core at 2 GHz.

An FX-8120 (a desktop CPU with an architecture similar to the 8th-gen consoles') gets about 413 single-core and 1800 multi-core at 3.1 GHz. 

The 8th-generation consoles range from 1.6 GHz to 2.3 GHz, so they're running at much lower clocks than the FX-8120. Jaguar had higher IPC than Bulldozer, but only by about 20-30%. Not enough to make up the difference. 

We're looking at an IPC for the Switch 2's CPU nearly double that of the Jaguar CPUs. 

The A78C also has an advantage over the base A78, in that all cores are on a single cluster and are homogeneous. 

Even at only 1 GHz the Switch's CPU should outclass the PS4/Pro/XBO/XBO:S/XBO:X pretty easily. The IPC difference between modern-ish ARM and Jaguar is just too large. 

There is also the matter that game engines are much more efficient at multi-threading loads now than they were during the 8th generation. 

So a notable improvement then, but considering how bad Jaguar is, it'd still be disappointing and way behind the current-gen consoles. Hopefully it ends up being at least somewhat better than this.



Norion said:

So a notable improvement then, but considering how bad Jaguar is, it'd still be disappointing and way behind the current-gen consoles. Hopefully it ends up being at least somewhat better than this.

I am not as pessimistic as some are about it. The Switch 2's CPU, even at only 1 GHz, should be comparable to the mobile i5s from a few generations ago that plenty of people are able to play games on at console-level framerates. For example, I have a ThinkPad with an i5-10310U that should be roughly comparable performance-wise to the Switch 2. With an eGPU (which has its own performance penalties) it's able to play any modern game at >=30fps. 

As a rough test, I am currently downloading Microsoft Flight Simulator 2024 on an old Dell Inspiron with an i5-7300HQ (a 4-core, 4-thread, low-performance CPU) and a GTX 1060 Max-Q to see how it performs. I'm guessing 1080p (upscaled) at 30fps will be doable. The Switch 2's CPU should be slightly better than this old i5 in multi-core and similar in single-core. The GPU should be similar (maybe slightly weaker) in pure rasterization. 

Last edited by sc94597 - on 14 January 2025

haxxiy said:

I mean, MSFS24 is bottlenecked at around 120 fps on a 12900K... which has twice as many cores running three to four times faster, with higher IPC, than the Switch 2's CPU (assuming those clocks hold, again).

That game will definitely be cloud-only or drastically cut back. It's already going to be always online, so the former is more likely, IMO.

I was able to get the game to run on the 4-core, 4-thread, 3.1 GHz i5-7300HQ at 1080p (with FSR Performance) at 30fps in Windows 11. I pretty much just set everything to low and didn't really optimize the graphics. I probably could've toggled on better graphics settings, since the CPU was indeed a bottleneck at times (but most of the time it hovered around 60-80% utilization, with 90%+ GPU utilization on the GTX 1060 Max-Q). Still, I was able to get a consistent-enough 30fps. 1% lows were around 27 FPS. Other than those drops, no noticeable stuttering. 

For context, this is the CPU's Geekbench 6 score: roughly 28% of the Series X-equivalent AMD 4800S in multi-core and 75% of its single-core performance.

Definitely don't think it is an impossible port, considering this. If the Switch 2 shares some of this load with co-processors/GPGPU and the game scales with more cores (which it tends to do), then it seems possible. 

Last edited by sc94597 - on 15 January 2025

Norion said:

How exactly would this compare to the CPU in the PS4/XBO? I assume it'd be more capable but it seems it wouldn't be by much based on what you're saying.

Yeah, that sounds about right.

sc94597 said:

I was able to get the game to run on the 4-core, 4-thread, 3.1 GHz i5-7300HQ at 1080p (with FSR Performance) at 30fps in Windows 11. I pretty much just set everything to low and didn't really optimize the graphics. I probably could've toggled on better graphics settings, since the CPU was indeed a bottleneck at times (but most of the time it hovered around 60-80% utilization, with 90%+ GPU utilization on the GTX 1060 Max-Q). Still, I was able to get a consistent-enough 30fps. 1% lows were around 27 FPS. Other than those drops, no noticeable stuttering. 

Even though synthetic benchmarks tend to flatter ARM processors, that i5 is still faster than a Snapdragon 8350 in PassMark, which in turn is exactly the same CPU architecture as the Switch 2 (1+3+4 Cortex cores) at twice the clocks.

Still a long way to go in terms of optimization, IMO.



sc94597 said:

I was able to get the game to run on the 4-core, 4-thread, 3.1 GHz i5-7300HQ at 1080p (with FSR Performance) at 30fps in Windows 11. I pretty much just set everything to low and didn't really optimize the graphics. I probably could've toggled on better graphics settings, since the CPU was indeed a bottleneck at times (but most of the time it hovered around 60-80% utilization, with 90%+ GPU utilization on the GTX 1060 Max-Q). Still, I was able to get a consistent-enough 30fps. 1% lows were around 27 FPS. Other than those drops, no noticeable stuttering. 

Even though synthetic benchmarks tend to flatter ARM processors, that i5 is still faster than a Snapdragon 8350 in PassMark, which in turn is exactly the same CPU architecture as the Switch 2 (1+3+4 Cortex cores) at twice the clocks.

Still a long way to go in terms of optimization, IMO.

They're pretty close, with the 7300HQ having 10% higher multi-core and the 8350 having 27% higher single-core.  

https://www.cpubenchmark.net/cpu.php?cpu=Snapdragon+8350&id=3930

https://www.cpubenchmark.net/cpu.php?cpu=Intel+Core+i5-7300HQ+%40+2.50GHz&id=2922

But we're expecting the Switch 2 to have quite a different configuration from the 8350 anyway, based on a homogeneous A78C setup in a single 8-core cluster. 

At similar clocks to the 7300HQ, a two-cluster A78C setup has about twice the performance of the i5-7300HQ (and nearly 25% higher IPC). 

1010 single-core and 5335 multi-core in Geekbench 5

https://x.com/TheGalox_/status/1462968579150729222/photo/1

versus 657 single-core and 2483 multi-core in Geekbench 5 (edited scores because I just ran Geekbench 5 on my laptop.) 

At 35-40% of that performance (assuming linear frequency-performance scaling, which we probably shouldn't, but it's all we've got), we're looking at a multi-core score of ~2000 for an A78C @ 1-1.1 GHz vs. 2483 for the i5-7300HQ. That's about 80% of the performance of my i5-7300HQ. 
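
That scaling estimate, using only the numbers in this post (the midpoint reproduces the ~2000 multi-core, ~80% figures above):

```python
# Scale the 8-core A78C Geekbench 5 multi-core score down to the rumored
# Switch 2 clocks using the post's 35-40% linear-scaling assumption.
a78c_mc = 5335     # GB5 multi-core, two-cluster A78C (linked above)
i5_mc = 2483       # GB5 multi-core, i5-7300HQ (author's own laptop run)

for frac in (0.35, 0.40):
    scaled = a78c_mc * frac
    print(f"{frac:.0%} scaling: ~{scaled:.0f} MC, "
          f"{scaled / i5_mc:.0%} of the i5-7300HQ")
```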

----

I'll re-run the test with turbo boost off and the cores pegged at the 2.5 GHz base clock, and see if I can still get around 30 fps. 

Last edited by sc94597 - on 15 January 2025

sc94597 said:

At similar clocks to the 7300HQ, a two-cluster A78C setup has about twice the performance of the i5-7300HQ (and nearly 25% higher IPC). 

1010 single-core and 5335 multi-core in Geekbench 5

https://x.com/TheGalox_/status/1462968579150729222/photo/1

versus 870 single-core and 2764 multi-core in Geekbench 5. 

At 35-40% of that performance (assuming linear frequency-performance scaling, which we probably shouldn't, but it's all we've got), we're looking at a multi-core score of ~2000 for an A78C @ 1-1.1 GHz vs. 2764 for the i5-7300HQ. That's about 70% of the performance. 

My mistake, I looked at a different Orin chipset regarding the CPU configuration.

Still, like I said, synthetic benchmarks flatter ARM immensely compared to x86 - Geekbench even more so than PassMark. That's mainly because x86 natively supports a million more complex instructions than ARM, so in IRL scenarios things can be a bit different.

It's a much better CPU than the Switch 1, though, which is probably what Nintendo cares about.




Kyuu said:

Seems like it'll be a lot weaker than I hoped (PS4 Pro) even though it's launching a year+ later than I expected. Perhaps modern features and optimization will push it close enough. Hopefully the base console won't cost more than $400.

I always find the PS4 Pro a weird reference point for power, since it's more or less tied to PS4 builds but with checkerboarded 4K. No games are meaningfully built around its specs.

I think comparing it to the base PS4 makes a lot more sense, especially as Switch 2 internal resolutions will always be 1080p or less, even when docked. It sounds like it's effectively doubling the PS4's graphical capabilities, which bodes incredibly well for first-party games. Thinking of games like Ghost of Tsushima, The Last of Us Part 2, and God of War Ragnarok... Nintendo's gonna have hardware capable of better graphics than those. That's crazy exciting to think about.

As far as third parties go, I'm not so worried. Most titles are GPU-bound; they'll be a bit soft and lacking some graphical features and texture detail on Switch 2, but will be comparable at a glance at 30fps. Developers shouldn't have much headache, but day-one releases are unlikely to be common. I think we'll often see Switch 2 versions 4-8 months later, aiming for 1080p output docked, upscaled from sub-720p resolutions.

I'll be playing very few third-party games on it, but I'm excited that some of my friends who only game on Nintendo will finally be able to tuck into modern JRPGs and AAA games.

Last edited by Otter - on 15 January 2025

sc94597 said:
Norion said:

So a notable improvement then, but considering how bad Jaguar is, it'd still be disappointing and way behind the current-gen consoles. Hopefully it ends up being at least somewhat better than this.

I am not as pessimistic as some are about it. The Switch 2's CPU, even at only 1 GHz, should be comparable to the mobile i5s from a few generations ago that plenty of people are able to play games on at console-level framerates. For example, I have a ThinkPad with an i5-10310U that should be roughly comparable performance-wise to the Switch 2. With an eGPU (which has its own performance penalties) it's able to play any modern game at >=30fps. 

As a rough test, I am currently downloading Microsoft Flight Simulator 2024 on an old Dell Inspiron with an i5-7300HQ (a 4-core, 4-thread, low-performance CPU) and a GTX 1060 Max-Q to see how it performs. I'm guessing 1080p (upscaled) at 30fps will be doable. The Switch 2's CPU should be slightly better than this old i5 in multi-core and similar in single-core. The GPU should be similar (maybe slightly weaker) in pure rasterization. 

Can that handle Dragon's Dogma 2 alright? Cause I know that pushes the CPU really hard in certain areas, with something like a Ryzen 3600 not performing that well.