
Forums - Nintendo - How Will the Switch 2 Be, Performance-Wise?

 

Switch 2 is out! How do you classify it?

Terribly outdated! - 3 (4.55%)
Outdated - 1 (1.52%)
Slightly outdated - 16 (24.24%)
On point - 37 (56.06%)
High tech! - 7 (10.61%)
A mixed bag - 2 (3.03%)

Total: 66
bonzobanana said:

Obviously Switch 2 has a very inconsistent frame rate at 40 and 60fps despite also having dynamic resolution, according to that video. I thought the PS4 Pro had a super-sampling mode that rendered at 1440p and downsampled to 1080p, but I guess you are saying there is also a 1080p non-super-sampling mode. It's not just shadow quality but draw distance, pop-in, etc., and some additional graphical effects that are enhanced on both PS4 Pro and Xbox One X compared to Switch 2. However, they aren't as locked to 30fps as Switch 2 is when docked, but then Switch 2 has inferior graphics fidelity, so maybe that helps it maintain 30fps. This is from memory, but the PS4 Pro has about 4 teraflops of GPU performance and the Xbox One X 6 teraflops, whereas the Switch 2 docked is around 2.2 teraflops; however, no question the graphics architecture of Switch 2 is superior and benefits from AI upscaling. I know the Xbox One X has basically an RX 580-level GPU, which is still very decent for a budget gaming PC, but of course a PC typically massively exceeds the Xbox One X in CPU performance. It feels like 10 years ago I was reading about the PS4 Pro and Xbox One X versions of Fallout 4. 

Like you, I feel that if DLSS is implemented well the Switch 2 could exceed PS4 Pro in docked mode, but we will see. I feel I've been here before, expecting Bethesda to do a competent job only to be disappointed by the results. Overall I'm still very pleased with the Switch 2 performance level for this game, and I love the variety of frame rate options on Switch 2. Really, performance is as expected, but the pricing is ridiculously high for such an old game. I am really looking forward to playing this at some point when pricing is a fraction of what it is today. I can't imagine it beating the Steam Deck though; the game runs brilliantly on that. I just love Fallout 4 so much I'm happy to start it again on another system and make different choices at the beginning to make it a different experience.

4:55

There was a Boost Mode firmware update before the full resolution upgrade patch that allows the game to run at 1080p with no super sampling on PS4 Pro, and it is only then that it gets a solid 30fps. I don't know where you're getting that the PS4 Pro version has an advantage in draw distance and pop-in versus Switch 2. The Digital Foundry video didn't make that comparison at all. The reduced LOD settings were in comparison to the Series S and PS5 versions, which are improved beyond PS4 Pro. "Looking closer" to the PS4 version is not the same as looking identical to the base PS4 version, and an actual comparison between the PS4 Pro and SW2 would have to be made to say whether the SW2 is closer to PS4 Pro or base PS4 there. 

Using TFLOPs to compare across microarchitectures, especially ones that are many generations separated and from different manufacturers, is nonsense. You've been told this dozens of times in this thread and other threads over the years. And yet again, the reason is that about a third of all game compute on the GPU (not even counting the CPU) is not done with floating-point math. TFLOPs is a good measure if we are talking about scientific computing or pure deep learning. It's not a good measure for gaming without context. Also, your 2.2 TFLOPs number is wrong. The SW2 docked is 3.09 TFLOPs. TFLOPs is and always has been a function of the GPU's max clock rate and core count. Given the reported max GPU clocks are almost certainly correct and so is the 12 SM count, the number is absolutely 3.09 TFLOPs. That doesn't mean the GPU is always running at the max frequency, but this is likewise true of the PS4 Pro and Xbox One X; their GPUs aren't always running at their max frequencies either. Finally, Digital Foundry, whose opinion you seem to trust, confirms the 3.09 TFLOPs.

I don't think anybody can convince you to stop spreading misinformation though. You've been doing it for a long time and persist even after being corrected. But if it were indeed true that the SW2 docked were roughly 55% the performance of a PS4 Pro (what you're suggesting) it wouldn't be able to run this game at 1440p 30fps, like the PS4 Pro. The level of LOD management or reduction in quality would have to be drastic to get it there. You're suggesting that they got the SW2 to run at native 1440p 30fps with the compute level of nearly a base PS4, without drastic cuts to visual quality. And you're talking about Bethesda doing this? Come on. Be real. 

Last edited by sc94597 - on 08 March 2026

sc94597 said:

4:55

There was a Boost Mode firmware update before the full resolution upgrade patch that allows the game to run at 1080p with no super sampling on PS4 Pro, and it is only then that it gets a solid 30fps. I don't know where you're getting that the PS4 Pro version has an advantage in draw distance and pop-in versus Switch 2. The Digital Foundry video didn't make that comparison at all. The reduced LOD settings were in comparison to the Series S and PS5 versions, which are improved beyond PS4 Pro. "Looking closer" to the PS4 version is not the same as looking identical to the base PS4 version, and an actual comparison between the PS4 Pro and SW2 would have to be made to say whether the SW2 is closer to PS4 Pro or base PS4 there. 

Using TFLOPs to compare across microarchitectures, especially ones that are many generations separated and from different manufacturers, is nonsense. You've been told this dozens of times in this thread and other threads over the years. And yet again, the reason is that about a third of all game compute on the GPU (not even counting the CPU) is not done with floating-point math. TFLOPs is a good measure if we are talking about scientific computing or pure deep learning. It's not a good measure for gaming without context. Also, your 2.2 TFLOPs number is wrong. The SW2 docked is 3.09 TFLOPs. TFLOPs is and always has been a function of the GPU's max clock rate and core count. Given the reported max GPU clocks are almost certainly correct and so is the 12 SM count, the number is absolutely 3.09 TFLOPs. That doesn't mean the GPU is always running at the max frequency, but this is likewise true of the PS4 Pro and Xbox One X; their GPUs aren't always running at their max frequencies either. Finally, Digital Foundry, whose opinion you seem to trust, confirms the 3.09 TFLOPs.

I don't think anybody can convince you to stop spreading misinformation though. You've been doing it for a long time and persist even after being corrected. But if it were indeed true that the SW2 docked were roughly 55% the performance of a PS4 Pro (what you're suggesting) it wouldn't be able to run this game at 1440p 30fps, like the PS4 Pro. The level of LOD management or reduction in quality would have to be drastic to get it there. You're suggesting that they got the SW2 to run at native 1440p 30fps with the compute level of nearly a base PS4, without drastic cuts to visual quality. And you're talking about Bethesda doing this? Come on. Be real. 

Geekerwan did the full analysis of Switch 2 hardware and the figure they gave was 2-2.4 teraflops, so I went with 2.2 teraflops. Everyone knows Nintendo does not clock chipsets to their full speed, so repeating this nonsense that Nintendo will clock at the full MHz is ridiculous. Same as the CPUs, where the potential is there to be much higher but they are clocked at 1GHz even when docked. So please stop this ridiculous pro-Nintendo fanboy rubbish which surely no one believes at all.

The video posted stated around PS4 graphics or slightly below, and of course PS4 Pro and Xbox One X are significantly improved on that, albeit at a 30fps cap which at times they struggle to maintain. Yes, gigaflops figures are still relevant but not the full picture. AI upscaling especially can make such figures less relevant, but they still have relevance, especially if you are not using DLSS, like many Switch 2 games, especially Nintendo titles it seems.

However, none of the figures really matter when we can see how the Switch 2 is performing and what level it reaches, and that is PS4-level graphics with a more stable 30fps frame rate. That's what that video stated; what video were you watching? How on earth can you accuse anyone of spreading misinformation when your posts are so clearly unrealistic and pro-Nintendo, with no ability to be rational at all?

The Switch 2 is performing around PS4 base spec level as a portable but maintaining a more stable 30fps frame rate, and docked it's the same PS4 base-level graphics fidelity but at 1440p, again with a more stable 30fps, which you would expect based on the Geekerwan analysis. PS4 Pro and Xbox One X have higher graphics fidelity and features than that but with a less stable frame rate under extreme load. However, the 40fps and 60fps Switch 2 frame rates are not anywhere near as stable, and in portable mode resolution can drop as low as 504p. That is what the video shows us. Nothing controversial or debatable, surely.

You seem to be in denial about the PS4 Pro and Xbox One X improved visuals, but that is just an issue in your head based on your own ridiculous bias.



bonzobanana said:

Geekerwan did the full analysis of Switch 2 hardware and the figure they gave was 2-2.4 teraflops, so I went with 2.2 teraflops. Everyone knows Nintendo does not clock chipsets to their full speed, so repeating this nonsense that Nintendo will clock at the full MHz is ridiculous. Same as the CPUs, where the potential is there to be much higher but they are clocked at 1GHz even when docked. So please stop this ridiculous pro-Nintendo fanboy rubbish which surely no one believes at all.

The video posted stated around PS4 graphics or slightly below, and of course PS4 Pro and Xbox One X are significantly improved on that, albeit at a 30fps cap which at times they struggle to maintain. Yes, gigaflops figures are still relevant but not the full picture. AI upscaling especially can make such figures less relevant, but they still have relevance, especially if you are not using DLSS, like many Switch 2 games, especially Nintendo titles it seems.

However, none of the figures really matter when we can see how the Switch 2 is performing and what level it reaches, and that is PS4-level graphics with a more stable 30fps frame rate. That's what that video stated; what video were you watching? How on earth can you accuse anyone of spreading misinformation when your posts are so clearly unrealistic and pro-Nintendo, with no ability to be rational at all?

The Switch 2 is performing around PS4 base spec level as a portable but maintaining a more stable 30fps frame rate, and docked it's the same PS4 base-level graphics fidelity but at 1440p, again with a more stable 30fps, which you would expect based on the Geekerwan analysis. PS4 Pro and Xbox One X have higher graphics fidelity and features than that but with a less stable frame rate under extreme load. However, the 40fps and 60fps Switch 2 frame rates are not anywhere near as stable, and in portable mode resolution can drop as low as 504p. That is what the video shows us. Nothing controversial or debatable, surely.

You seem to be in denial about the PS4 Pro and Xbox One X improved visuals, but that is just an issue in your head based on your own ridiculous bias.

You need to source that Geekerwan citation. I can't find it anywhere. I am guessing you're taking it from the "simulated Switch 2" test they did and the fact that they roughly compared it to a GTX 1050ti on a Steel Nomad benchmark. If that is the case, then... sigh, I don't have the energy to explain to you how ridiculous that conclusion is. 

The 3.09 TFLOPs is with the actual max GPU clock available to developers (1.007 GHz) as stated in the SDK, not the theoretical max (1.4 GHz) for the chip. If the Switch 2 were able to clock at its theoretical max it would be a ~4.3 TFLOP chip. The SDK leak has also been confirmed by Digital Foundry. The fact you're still arguing about this confirmed fact, a year after it was confirmed, is really dishonest. Again, ask yourself: would a GPU 55% as performant as the PS4 Pro's be able to rasterize Fallout 4 at a solid 1440p 30fps, more solid than the PS4 Pro does? Think critically here rather than regurgitating a poor conclusion from data you've cherry-picked. 

Please quote where the video said "PS4 ... or slightly below." Digital Foundry said it was a closer match to PS4 and had reduced LOD compared to the PS5/Series S versions, not that it was possibly below PS4 or even exactly the same as PS4.


https://www.digitalfoundry.net/reviews/fallout-4-on-switch-2-is-dramatically-improved-over-its-nintendo-direct-debut

Draw distance is more in line with PS4 than Series S or PS5, with more obvious pop-in. Volumetrics are also curtailed with more noise and flicker. Meanwhile, Switch 2's 60fps DRS is more aggressive than Series S overall, especially in using their respective 60fps options.

Also, there is a significant difference between the Xbox One X and PS4 Pro versions of the game, so I don't know why you're pairing them together here. Arguably that difference is larger than the one between PS4 Pro and base PS4.  

Not only does the Xbox One X use its increased power to render a higher resolution, it also renders more. There’s an impressive jump in middle draw distance between the PS4 Pro version and the Xbox One X. Check out the buildings and details in the GIF below. Oddly enough, Digital Foundry also tested the unpatched version of Fallout 4 on the Xbox One X, and it also featured better mid-range draw distance than the PS4 Pro.

I agree the SW2 is roughly on par with a base PS4 in portable mode, and so far we've seen it able to rasterize as well as a PS4 Pro in docked mode for many titles. Not always (especially not when DLSS is applied and resources are allocated to that), but often enough. 

By the way, my primary platform is PC. I own a Switch 2, Series X, PS5, and PS5 Pro, plus a Steam Deck and ROG Ally. Dismissing my position as that of a coping Nintendo fan doesn't work here. 

Edit: Also Digital Foundry said the 40fps mode is very stable, even at stress points. 

Last edited by sc94597 - on 08 March 2026

By the way, this is why extrapolating TFLOPs from a Steel Nomad Light benchmark is a bad idea. 

Here are the TFLOPs and average Steel Nomad Light benchmark scores for various GPUs in that range. Notice that an R9 285 has a similar score to a GTX 1050ti, despite the R9 285 being a 3.29 TFLOPs card and the GTX 1050ti being 2.14 TFLOPs. 

GPU TFLOPS Benchmark
RX 560 2.611 1769
GTX 950 1.825 1708
Radeon 7870 2.56 1530
R9 270x 2.688 1739
GTX 1050 1.862 1830
Radeon HD 7950 2.867 1958
GTX 1630 1.828 1966
GTX 960 2.413 2180
Radeon R9 285 3.29 2434
Radeon HD 7970 3.789 2253
Radeon R9 380 3.476 2492
GTX 1050ti 2.138 2312

In fact if you do a simple linear regression on these 12 data points (with Steel Nomad being the dependent variable), you get an R^2 of only about .287, and an insignificant p-value (assuming .05 alpha.) I hope this table also shows why TFLOPs aren't a good measure of performance, even for synthetic benchmarks like Steel Nomad Light. Even still, using the equation from that simple linear regression you get something like a predicted 2.825 TFLOPs +/- 0.4 TFLOP given a Steel Nomad score of 2205. 
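For anyone who wants to check the regression figures above, here is a minimal Python sketch (stdlib only) that reproduces them from the table in this post. The GPU numbers are just the ones quoted above, so treat the results as illustrative:

```python
# Simple linear regression over the 12 (TFLOPs, Steel Nomad Light) pairs
# from the table in this post.
data = [  # (GPU, TFLOPs, Steel Nomad Light score)
    ("RX 560", 2.611, 1769), ("GTX 950", 1.825, 1708),
    ("Radeon 7870", 2.560, 1530), ("R9 270x", 2.688, 1739),
    ("GTX 1050", 1.862, 1830), ("Radeon HD 7950", 2.867, 1958),
    ("GTX 1630", 1.828, 1966), ("GTX 960", 2.413, 2180),
    ("Radeon R9 285", 3.290, 2434), ("Radeon HD 7970", 3.789, 2253),
    ("Radeon R9 380", 3.476, 2492), ("GTX 1050ti", 2.138, 2312),
]

x = [t for _, t, _ in data]  # TFLOPs
y = [s for _, _, s in data]  # benchmark score
n = len(data)

mx, my = sum(x) / n, sum(y) / n
sxx = sum((xi - mx) ** 2 for xi in x)
syy = sum((yi - my) ** 2 for yi in y)
sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))

# R^2 is the same whichever variable is treated as dependent.
r2 = sxy ** 2 / (sxx * syy)
print(f"R^2 = {r2:.3f}")  # ~0.287: TFLOPs explain under 30% of the variance

# Inverting the fit (TFLOPs as dependent) to "predict" TFLOPs
# from a Steel Nomad Light score of 2205, as described above.
slope = sxy / syy
intercept = mx - slope * my
print(f"predicted TFLOPs at 2205: {intercept + slope * 2205:.2f}")  # ~2.83
```

Running this gives R^2 of about 0.287 and a predicted ~2.83 TFLOPs at a score of 2205, matching the numbers in the post.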

But really we don't need to do this to know what SW2's TFLOPs are. Again, TFLOPs are exactly a function of max clock rate and core count. We know both of these. 

Docked FP32 TFLOPs = 1536 cores × 1.007 GHz × 2 ≈ 3.09 TFLOPs. 

Handheld FP32 TFLOPs = 1536 cores × 561 MHz × 2 ≈ 1.72 TFLOPs. 
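The two formulas above are plain arithmetic (cores × clock × 2 FP32 ops per cycle for a fused multiply-add), so they are easy to sanity-check; a minimal sketch using the core count and clocks quoted in this thread:

```python
# FP32 TFLOPs = shader cores x clock (GHz) x 2 ops/cycle (one FMA = mul + add).
# 1536 cores and the 1.007 GHz / 561 MHz clocks are the figures quoted in
# this thread from the leaked SDK.

def fp32_tflops(cores: int, clock_ghz: float) -> float:
    """Peak FP32 throughput in TFLOPs for a given core count and clock."""
    return cores * clock_ghz * 2 / 1000  # GFLOPs -> TFLOPs

docked = fp32_tflops(1536, 1.007)
handheld = fp32_tflops(1536, 0.561)

print(f"docked:   {docked:.2f} TFLOPs")    # ~3.09
print(f"handheld: {handheld:.2f} TFLOPs")  # ~1.72
```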

Last edited by sc94597 - on 08 March 2026

When it gets to the point of calling others "fanboys" simply for posting factual information, it might be time to take a step back and give it a rest; this endless downplaying is starting to come off as obsessive and toxic at this point.



sc94597 said:

By the way, this is why extrapolating TFLOPs from a Steel Nomad Light benchmark is a bad idea. 

Here are the TFLOPs and average Steel Nomad Light benchmark scores for various GPUs in that range. Notice that an R9 285 has a similar score to a GTX 1050ti, despite the R9 285 being a 3.29 TFLOPs card and the GTX 1050ti being 2.14 TFLOPs. 

GPU TFLOPS Benchmark
RX 560 2.611 1769
GTX 950 1.825 1708
Radeon 7870 2.56 1530
R9 270x 2.688 1739
GTX 1050 1.862 1830
Radeon HD 7950 2.867 1958
GTX 1630 1.828 1966
GTX 960 2.413 2180
Radeon R9 285 3.29 2434
Radeon HD 7970 3.789 2253
Radeon R9 380 3.476 2492
GTX 1050ti 2.138 2312

In fact if you do a simple linear regression on these 12 data points (with Steel Nomad being the dependent variable), you get an R^2 of only about .287, and an insignificant p-value (assuming .05 alpha.) I hope this table also shows why TFLOPs aren't a good measure of performance, even for synthetic benchmarks like Steel Nomad Light. Even still, using the equation from that simple linear regression you get something like a predicted 2.825 TFLOPs +/- 0.4 TFLOP given a Steel Nomad score of 2205. 

But really we don't need to do this to know what SW2's TFLOPs are. Again, TFLOPs are exactly a function of max clock rate and core count. We know both of these. 

Docked FP32 TFLOPs = 1536 cores × 1.007 GHz × 2 ≈ 3.09 TFLOPs. 

Handheld FP32 TFLOPs = 1536 cores × 561 MHz × 2 ≈ 1.72 TFLOPs. 

Well, I've got a laptop with an RTX 2050 mobile, which is basically the Switch 2's GPU architecture but with over twice the power, and compared to the RX 580 (Xbox One X) it is still weaker in real-world tests, as you would expect. The Switch 2 is under half that performance when docked.

https://pc-builds.com/compare/gpu/0Yg1aB/radeon-rx-580/geforce-rtx-2050

Let's not forget Nintendo does not give out full specs, and development hardware is not retail hardware. The Geekerwan analysis is the best we have at the current time. Let's also not forget the Switch 2 is based on a dated, mainly 10nm fabrication process which is very power hungry, and it only comes with a 19Wh battery, well below other mobile gaming devices. This idea that Nintendo has somehow clocked the system to high levels, despite always clocking low previously and simply not having that level of power, doesn't hold up. Even the Geekerwan analysis only gave theoretical peak figures for GPU performance when portable; it is likely below 1 teraflop in real terms a lot of the time.

This is nothing new; we know Nintendo always clocks lower. They always have. It means greater reliability, longer battery life and fewer returns for them; a cooler-running system is a longer-life unit. It's what Nintendo does to maximise profits. Why do we keep getting people pretending Nintendo has clocked to the maximum clocks? Development hardware is always clocked higher than retail hardware, especially where Nintendo is concerned. I don't get why we have to pretend the Switch 2 is more powerful. The Geekerwan analysis was brilliant; it gave us a decent and a REALISTIC viewpoint of the hardware without fanboy nonsense. Nothing we are seeing in how the Switch 2 performs contradicts it.

Ultimately, again, we are in a situation where people just want to believe what they want to believe, which is pure nonsense. The Switch 2 is a decent portable handheld, but it's based on a fabrication process from 2020 and has a tiny battery, and yet people keep pretending it's clocked to high limits. The reason the Switch 2 can perform well is the fantastic DLSS upscaling, and of course it's a fixed platform so it can be fully optimised to work around weaknesses. Some games do this and some do not; quick and dirty ports to Switch 2 show a very low performance level.



curl-6 said:

When it gets to the point of calling others "fanboys" simply for posting factual information, it might be time to take a step back and give it a rest; this endless downplaying is starting to come off as obsessive and toxic at this point.

I guess if your view is that the Geekerwan analysts are Nintendo haters just trying to make the Switch 2 seem weaker than it is, then that would be a valid viewpoint. There is nothing new about people making false claims about Nintendo hardware and its performance level; we had it with the Wii U and Switch, and in the end the people claiming the lower performance level were shown to be right. There is nothing new here. Non-technical people just choose what they want to believe, but that doesn't actually make it true.



bonzobanana said:

Well, I've got a laptop with an RTX 2050 mobile, which is basically the Switch 2's GPU architecture but with over twice the power, and compared to the RX 580 (Xbox One X) it is still weaker in real-world tests, as you would expect. The Switch 2 is under half that performance when docked.

https://pc-builds.com/compare/gpu/0Yg1aB/radeon-rx-580/geforce-rtx-2050

Let's not forget Nintendo does not give out full specs, and development hardware is not retail hardware. The Geekerwan analysis is the best we have at the current time. Let's also not forget the Switch 2 is based on a dated, mainly 10nm fabrication process which is very power hungry, and it only comes with a 19Wh battery, well below other mobile gaming devices. This idea that Nintendo has somehow clocked the system to high levels, despite always clocking low previously and simply not having that level of power, doesn't hold up. Even the Geekerwan analysis only gave theoretical peak figures for GPU performance when portable; it is likely below 1 teraflop in real terms a lot of the time.

This is nothing new; we know Nintendo always clocks lower. They always have. It means greater reliability, longer battery life and fewer returns for them; a cooler-running system is a longer-life unit. It's what Nintendo does to maximise profits. Why do we keep getting people pretending Nintendo has clocked to the maximum clocks? Development hardware is always clocked higher than retail hardware, especially where Nintendo is concerned. I don't get why we have to pretend the Switch 2 is more powerful. The Geekerwan analysis was brilliant; it gave us a decent and a REALISTIC viewpoint of the hardware without fanboy nonsense. Nothing we are seeing in how the Switch 2 performs contradicts it.

Ultimately, again, we are in a situation where people just want to believe what they want to believe, which is pure nonsense. The Switch 2 is a decent portable handheld, but it's based on a fabrication process from 2020 and has a tiny battery, and yet people keep pretending it's clocked to high limits. The reason the Switch 2 can perform well is the fantastic DLSS upscaling, and of course it's a fixed platform so it can be fully optimised to work around weaknesses. Some games do this and some do not; quick and dirty ports to Switch 2 show a very low performance level.

A few things: 

1. The GPU in the PS4 Pro isn't an RX 580 or equivalent to one. Not sure why you're bringing it up, other than maybe you're conflating a Switch 2 vs. PS4 Pro comparison with one against the more performant Xbox One X? A much closer analogue to the PS4 Pro's GPU is a slightly power-limited R9 290 (say, an R9 290 set to 85% of its max clock rate). 

2. An SDK isn't Nintendo releasing specifications to the public; the target audience is video game developers. Nintendo is obviously going to include documentation on what the SW2 is capable of in a development kit package used by game developers. Digital Foundry has already verified the SDK leak as accurate.

And again, the clock rates used to calculate the TFLOPs here, aren't the theoretical maximum (1.4Ghz) but the maximum available to developers for their games (1.007Ghz), so your point of dev kits having more resources is moot here, given that this is specifically a target for games running on the system. 

Make no mistake though, this is a full Ampere GPU, rated for 3.072 TFLOPs when running in docked mode according to Nintendo itself, so by extension that drops to 1.71 TFLOPs when running in mobile mode. TFLOPs comparisons against other devices are irrelevant at this point.

I could see the SW2 being hacked in a few months or years and the max clock rates reporting as such while games are running, and you'll still be in denial somehow. 

3. The Geekerwan simulation is fine. Your extrapolation beyond it isn't. They compared a simulated SW2 (not even the SW2 itself) to other hardware on a synthetic benchmark. They never provided a 2.2 TFLOP number. You erroneously pulled that from the fact that their simulation gave a similar score to a GTX 1050ti, but I showed in the post you just quoted that GPUs with much higher TFLOPs get similar scores -- like the R9 380. Extrapolating from a single data point will lead you to silly conclusions like this. 

It's not "wanting to believe what they want to believe"; it is knowing the facts as they exist based on the evidence we have. #2 above especially is something you're ignoring out of motivated reasoning. The SW2's clock rates have been verified by Digital Foundry; that is a fact. And they treat it as such, like everyone else who isn't stuck in the year 2024. 

Finally, you still haven't answered the core question. If the Switch 2 is 55% as performant as the PS4 Pro (which your motivated reasoning implies), why is it rendering a purely rasterized load (no DLSS) in FO4, at the same resolution and with a better frame-rate than PS4 Pro? A minimal difference in LODs (and that is assuming it even is the case) isn't going to suddenly make hardware that is nearly half as performant pull this off. What explanation do you have for this? 

Last edited by sc94597 - on 09 March 2026

Soundwave said:

Where's FF7 Rebirth and Star Wars Outlaws and Assassin's Creed Shadows on the PS4 Pro? *crickets*

Switch 2 is competently running games a generation ahead of the PS4 (with ray tracing in Star Wars Outlaws' case), with Indiana Jones arriving soon as well... either the PS5 gen is a fraud generation/hardware or the Switch 2 is a generation ahead of the PS4. You can't have it both ways whenever it suits whatever the narrative is supposed to be this week. 

If a fraud gen means games can be ported down, that is every generation going forward. The same will be the case for Switch 3 and most of Nintendo's current Switch 2 line-up.

We've already seen games made for current gen only, ported back to PS4. RE8/RE4 had PS4 versions announced later which looked great. Jedi Survivor was decent outside of cutscenes where the HDD couldn't stream data in time.

The PS5 gen represents a more incremental leap over PS4, but that's 75% down to the reality of diminishing returns. When PS4 games can look as good as TLOU2 and RDR2 (both of which are better looking than many PS5 games at a glance, and than any Switch 2 game), you know most game experiences can be scaled down. And equally you know most devs can't really compete with those games in terms of resources/budget. Overall PS5 experience quality is definitely a step ahead of PS4, but it's not a generational leap of old.

The push for 60fps also means there's more headroom for decent 30fps experiences on weaker hardware.

Looking at the ports, though, you can clearly see that the S2 has architectural/hardware advantages that represent a generational leap over the PS4 Pro in spite of the Pro having more pixel-pushing power, whether it be the SSD, DLSS, or ray tracing capacity.

As opposed to trying to paint a black and white picture in terms of generations, the reality is a bit more gray.

Last edited by Otter - on 09 March 2026

bonzobanana said:
curl-6 said:

When it gets to the point of calling others "fanboys" simply for posting factual information, it might be time to take a step back and give it a rest; this endless downplaying is starting to come off as obsessive and toxic at this point.

I guess if your view is that the Geekerwan analysts are Nintendo haters just trying to make the Switch 2 seem weaker than it is, then that would be a valid viewpoint. There is nothing new about people making false claims about Nintendo hardware and its performance level; we had it with the Wii U and Switch, and in the end the people claiming the lower performance level were shown to be right. There is nothing new here. Non-technical people just choose what they want to believe, but that doesn't actually make it true.

sc94597 already covered the Geekerwan point.

Saying a source may not be inherently accurate doesn't imply they're a "hater".

And actually, many claiming a lower performance level for Switch 1 were wrong; tons of people wrongly claimed, for instance, that it was on par with or even less capable than the PS3 and 360, which was untrue.

Same thing happened with Switch 2, with many claiming it was "just a PS4" which turned out to be false.

For every overestimation there was an underestimation, you're fixating on just the side you disagree with.

Last edited by curl-6 - on 09 March 2026