
Forums - Nintendo - Was Nintendo right to opt out of the graphics arms race?


Was it the right decision?

Yes: 74 (88.10%)
No: 10 (11.90%)
Total: 84



Pemalite said:
Chrkeller said:

This, regarding the bolded part.  That is a premium chipset but will also (at least) double the cost compared to the Switch 2.  Nintendo sacrificed power in favor of price point, which btw is the smart move.  Nintendo was never going to go premium at $600+.  Nintendo does not sell niche products; they sell wide appeal mass market products.  My house has two units, one for me and one for my daughter.  At $600+ we wouldn't own two of them, and my daughter would be buying a lot less software.  

I get that it's a premium part... But it falls into that "price" argument, that you get what you pay for.

The Switch 2 doesn't exist in a bubble where it offers the best price/performance for every single person.
The Switch Lite is half the price and has an amazing game library...
Or you could spend a bit more than the Switch 2 and get 40+ years of games and a dozen console platforms with a PC handheld.
Or you can go all out and get the most powerful handheld device on Earth that can play PS5 games at PS5 quality settings.

So any argument that the Switch 2 is a "premium" device is just blatantly false.

There are many people who can't justify the Switch 2's price when they can buy 2x Switch Lite consoles instead.

Soundwave said:

No one cares about premium, premium is a niche, tiny ass market.

You literally said the Switch 2 is a "Premium product" in a previous post.


The mind-games and backflips are pretty hypocritical.

Chrkeller said:

The bolded part, exactly.  If someone wants to say it outclasses the ps4, fair, it does.  If someone wants to argue it outclasses the ps4 pro, questionable, but still acceptable.  But the attitude that it is leaps and bounds beyond the ps4 is absurd.  

The Switch 2 has multiple hardware advantages when compared against the Playstation 4 or even Playstation 4 Pro.
The Playstation 4 being based on Graphics Core Next hasn't aged particularly well when it comes to newer rendering methods.
The lowest and slowest discrete GPUs today, even on a 64-bit memory bus, will beat the Radeon HD 7850 that is the closest PS4 equivalent.

I.e. DLSS decimates checkerboard rendering.
RT is far better suited to the Switch 2's GPU than to Graphics Core Next due to that architecture's lack of low-precision math (even when accounting for the PS4 Pro's RPM path).

Chrkeller said:

The ps4 pro has a more powerful CPU compared to the Switch 2. The Switch 2 fits between the ps4 and ps4 pro, but with newer tech like RT. So, calling the S2 a ps4 pro with bells/whistles is just accurate... and ironically is exactly what the vast majority of us predicted. Nintendo sacrificed power for price, as we all said they would. It isn't a premium product, because if it were, the unit would have 16 GB of RAM, not 12 GB.

The Switch 2's ARM A78C cores are significantly better than Jaguar.
A78C is based on ARMv8.2-A with better branch prediction, wider execution, and larger caches... I would peg the IPC of A78C at around 3x-4x that of Jaguar at the same clock.

Thus the PS4 CPU would need to operate in the range of 3GHz-4GHz to match the Switch 2's CPUs, not its paltry actual 1.6GHz clock.
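The clock-for-clock arithmetic above can be sanity-checked in a few lines. Note that the 3x-4x IPC ratio is the poster's rough estimate, not a measured benchmark, and the clocks are the figures quoted in this thread:

```python
# Back-of-envelope check: what clock would Jaguar need to match A78C?
# The 3x-4x IPC ratio is a rough estimate from the post, not a benchmark.
a78c_clock_ghz = 1.0      # Switch 2 CPU clock (approximate, per the thread)
jaguar_clock_ghz = 1.6    # PS4 Jaguar clock

for ipc_ratio in (3.0, 4.0):
    # Equivalent Jaguar clock = A78C clock x IPC advantage
    equivalent_ghz = a78c_clock_ghz * ipc_ratio
    speedup_needed = equivalent_ghz / jaguar_clock_ghz
    print(f"IPC ratio {ipc_ratio:.0f}x -> Jaguar would need ~{equivalent_ghz:.1f} GHz "
          f"({speedup_needed:.2f}x its actual clock)")
```

Which reproduces the 3-4 GHz range claimed above.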

The Switch 2 also offloads some CPU processing like decompression, which would consume a core or two on last generation hardware, significantly increasing available CPU resources for other tasks.

The clock on the Switch 2, whilst only 1GHz... sucks. But it's not the end of the world.

It falls significantly short of AMD Zen, however.


Soundwave said:

I also said many times that this was not budget hardware and would likely cost $400 only if you were lucky, likely $450 or more looking at the component leaks. 1536 CUDA cores, 12GB RAM, 256GB high speed internal storage, 8 inch display, this was not some rinky dinky bargain basement cheap hardware, but you had a lot of people here thinking $350 was in play (lol). Because I was looking at the actual leaks and the components therein, not basing things on a stupid "dur hur Nintendo cannott make gud hardwarez this is will beeez just like da Wii You again when Ninty fans got too excitied" line of logic.

It is budget hardware.
It's not even using a fully enabled Tegra Orin chip, it's using a cut-down chip.

The full Tegra Orin is a 2048 CUDA core part running at 1.3GHz.
Switch 2 is a 1536 CUDA core part running at 561MHz mobile, 1GHz docked.

There are instances where the Switch 2 will have less than a THIRD of the performance of a fully enabled Tegra Orin chip.
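The "less than a third" figure checks out on a crude throughput proxy (cores times clock). This ignores memory bandwidth, power limits and architectural detail, so treat it as ballpark arithmetic on the thread's own numbers only:

```python
# Crude throughput proxy: CUDA cores x clock (GHz).
# Ignores bandwidth, TDP and cache differences -- ballpark only.
orin_full      = 2048 * 1.3     # fully enabled Tegra Orin
switch2_mobile = 1536 * 0.561   # Switch 2, handheld clock
switch2_docked = 1536 * 1.0     # Switch 2, docked clock

print(f"Orin vs Switch 2 mobile: {orin_full / switch2_mobile:.2f}x")  # ~3.09x
print(f"Orin vs Switch 2 docked: {orin_full / switch2_docked:.2f}x")  # ~1.73x
```

So in handheld mode the gap does exceed 3x; docked, it narrows to under 2x.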

And that ignores the other elephant in the room... Tegra Orin is old and outdated in Nvidia's Tegra product stack, as Nvidia has Tegra Thor on the market, which would have given Nintendo roughly 2-3x the performance for the same 1536-core chip.

12GB of RAM is small and budget.

The Xbox Series X and Playstation 5 released with 16GB of RAM... 5 years ago.
24-32GB seems to be the sweet spot for DRAM in most devices currently, even PC handhelds.

The 8" display is utter cheap garbage: it's blurry, with poor contrast, colour and response times. It can't even do HDR properly. It is the worst part of the Switch 2... and I would argue it should have been the most important part.
There are much better LCD panels on the market.

Soundwave said:

Switch 2 has features from Ada Lovelace as well, direct from Nvidia's own documentation, which is an even newer architecture than Ampere.

What features? Name them.




lol, the full Orin chip was a design for automobiles running off a *car battery*; if the Switch 2 used it, the system would be comically oversized and cost far more. No shit they didn't use the full size chip. That chip was never designed for a game console, certainly not a portable one.

You can Google the Nvidia hardware leaks; there were Ada Lovelace features like the media block and compression tech that show up in the data leaks for T239, it was discussed here years ago. Looking at the T239 chip under X-ray and seeing that it's quite different from the Tegra T234, who knows how much else they took from the Lovelace designs, because that definitely is not just a stripped-down T234.

People using the Nvidia T234 Orin power calculator, while well intentioned, also turned out to be wrong I think... Switch 2's chip has considerably better power efficiency; how exactly they pulled that off I would like to hear more from Nvidia and Nintendo on. I suspect they copied further features from Ada Lovelace to enable that, because at 8nm with that tiny ass battery they should not be getting these results. Not based on Orin power calculations anyhow.

On the scale of Nintendo hardware, Switch 2 is definitely on the "premium" end, meaning it is not budget hardware like the Wii, Wii U, 3DS, DS, and Game Boy were. It's more in line with what the N64 or GameCube were for their time. I mean, do you want to set up some official terminology just for this board on that? Because I'd be fine with that. Premium PC GPUs are generally what I consider the pricier ones, and yeah, no one really cares that much about those because they're not a big part of the market. Again, if you want to establish some kind of official board nomenclature for everyone to abide by, fine.

Clearly there are huge differences between the Nintendo of 2026 and the Nintendo of 2006, just like there are huge differences between the Nintendo of 1996 and 2006. Nintendo is not one singular, non-changing entity; to the contrary, I think a lot of people don't understand that Nintendo changes radically pretty much with every new long term president. But a lot of people on the internet really don't know their history of the NES/SNES/N64 especially, so they don't understand; this is also why there's a shit ton of idiots on Youtube crying about how Nintendo has changed, lost their identity, etc. etc. People who grew up with the NES/SNES/N64/GameCube could say the same shit about the Wii/Wii U/DS/3DS not being the "real" Nintendo either.

People need to stop using what Nintendo has done in the past 10 years as some kind of holy religious decree that can't change. Yeah, they might do the same thing again, but they might do the complete opposite too (and they have done that in their past also). You have to understand who is the head of the company and the hardware division too; that's not some small detail to gloss over.

Last edited by Soundwave - on 13 January 2026

Soundwave said:
Norion said:

He specifically said your statement about architecture mattering more than raw horsepower was not true, not that architecture wasn't important so he clearly wasn't arguing against those points in that response. If you think he's wrong about raw horsepower mattering more you should actually back up your claim instead of arguing against something he didn't say there. Also bozobanana is right, you really are acting childish over this by implying he might be lying about owning a Switch 2 for no good reason and other nonsense like the GPU comments.

Architecture does matter massively when you are talking about a 7+ year difference in architecture, I didn't think I had to spell that out specifically because I would think people would do their homework and know that GCN 2.0 (the PS4) is 7 years older than Ampere, but apparently I had to point that out as I guess that wasn't common knowledge or common sense? 

Of course there will be a large difference between Ampere and GCN 2.0 (PS4) ... like does that seriously need to be explained and dumbed down? Just like Ampere will be dwarfed by a GPU architecture from 2027 (7 years afterwards). That shouldn't even be a controversial statement at all, jeeezus. 

If you want to side track the discussion, y'know it's not terribly hard to find GCN 2.0 GPUs (PS4 class) ... go see if the configs of those AMD cards that are roughly in line with the PS4 (so like not cherry picking the highest config GCN 2.0 card) can run Star Wars Outlaws, Assassin's Creed Shadows and the like. Frankly I don't care, I don't even know why someone wanted to fall on their sword to defend the PS4's honor in a thread that isn't even about the PS4, lol. Like who cares. This topic isn't about the PS4. 

Why are you consistently conflating his claim that raw horsepower matters more than architecture with something else? Raw horsepower mattering more than architecture doesn't mean that architecture doesn't matter a ton over that length of time. Those are two completely different statements since raw horsepower mattering more and architecture being very important do not contradict each other at all since raw horsepower could be even more important thus both of those things could be simultaneously true.

His response was specifically focused on your claim that architecture matters more than raw horsepower so as I said if you think he's wrong about raw horsepower mattering more you should focus on backing up your initial claim instead of arguing against something he didn't say there.

Last edited by Norion - on 13 January 2026

Norion said:
Soundwave said:

Architecture does matter massively when you are talking about a 7+ year difference in architecture, I didn't think I had to spell that out specifically because I would think people would do their homework and know that GCN 2.0 (the PS4) is 7 years older than Ampere, but apparently I had to point that out as I guess that wasn't common knowledge or common sense? 

Of course there will be a large difference between Ampere and GCN 2.0 (PS4) ... like does that seriously need to be explained and dumbed down? Just like Ampere will be dwarfed by a GPU architecture from 2027 (7 years afterwards). That shouldn't even be a controversial statement at all, jeeezus. 

If you want to side track the discussion, y'know it's not terribly hard to find GCN 2.0 GPUs (PS4 class) ... go see if the configs of those AMD cards that are roughly in line with the PS4 (so like not cherry picking the highest config GCN 2.0 card) can run Star Wars Outlaws, Assassin's Creed Shadows and the like. Frankly I don't care, I don't even know why someone wanted to fall on their sword to defend the PS4's honor in a thread that isn't even about the PS4, lol. Like who cares. This topic isn't about the PS4. 

Why are you consistently conflating his claim that raw horsepower matters more than architecture with something else? Raw horsepower mattering more than architecture doesn't mean that architecture doesn't matter a ton over that length of time. Those are two completely different statements since raw horsepower mattering more and architecture being very important do not contradict each other at all and thus can obviously be simultaneously true.

His response was specifically focused on your claim that architecture matters more than raw horsepower so as I said if you think he's wrong about raw horsepower mattering more you should actually back up your claim instead of arguing against something he didn't say there.

If you're going to conflate a PS4 with a Switch 2, then yes, you bet your ass architecture matters massively. It's a 7 year gap; I shouldn't have to hold someone's hand and explain to them how that is a significant difference, that should just be a matter of fact statement. Raw specs of the PS4 and Switch 2 don't matter that much when you are dealing with a massive architecture difference. But whatever, that's how it goes.

I knew when the actual data leaks from Nvidia were coming out that 1536 CUDA cores, Ampere based with even some Lovelace features, and 12GB RAM was going to be a system that plays way more than just PS4 tier games. If that's what Nintendo was going for they wouldn't have chosen that chip, as it would be ridiculous overkill, and I said as much here.

I said looking at those leaks, this hardware would likely be a game changer especially for Japanese developers as they'd now have a Switch console that can run even PS5 games reasonably well and they'd now have to make a decision on bringing their main console IP to the Switch 2.

What do we know today? Square-Enix committed their mainline FF7 Remake series to Switch 2, including FF7 Rebirth (a PS5 only title) AND even their next mainline FF game, which is FF7 Remake Part III. Resident Evil 9, the new one, is on Switch 2. Now we get data leaks that even Monster Hunter Wilds is coming, which I also said would happen.

That's not just random; I looked at the specs objectively and could see this coming a mile away while other posters were doing their best to spread a bunch of FUD and doomerism in every Switch 2 topic.



Norion said:
Soundwave said:

Architecture does matter massively when you are talking about a 7+ year difference in architecture, I didn't think I had to spell that out specifically because I would think people would do their homework and know that GCN 2.0 (the PS4) is 7 years older than Ampere, but apparently I had to point that out as I guess that wasn't common knowledge or common sense? 

Of course there will be a large difference between Ampere and GCN 2.0 (PS4) ... like does that seriously need to be explained and dumbed down? Just like Ampere will be dwarfed by a GPU architecture from 2027 (7 years afterwards). That shouldn't even be a controversial statement at all, jeeezus. 

If you want to side track the discussion, y'know it's not terribly hard to find GCN 2.0 GPUs (PS4 class) ... go see if the configs of those AMD cards that are roughly in line with the PS4 (so like not cherry picking the highest config GCN 2.0 card) can run Star Wars Outlaws, Assassin's Creed Shadows and the like. Frankly I don't care, I don't even know why someone wanted to fall on their sword to defend the PS4's honor in a thread that isn't even about the PS4, lol. Like who cares. This topic isn't about the PS4. 

Why are you consistently conflating his claim that raw horsepower matters more than architecture with something else? Raw horsepower mattering more than architecture doesn't mean that architecture doesn't matter a ton over that length of time. Those are two completely different statements since raw horsepower mattering more and architecture being very important do not contradict each other at all since raw horsepower could be even more important thus both of those things could be simultaneously true.

His response was specifically focused on your claim that architecture matters more than raw horsepower so as I said if you think he's wrong about raw horsepower mattering more you should focus on backing up your initial claim instead of arguing against something he didn't say there.

Because he is wrong and can't own it.  The 2080 runs with the 5050.  2018 and 2025 release dates, respectively, which is 7 years...

This is how RTX 2080 and RTX 5050 compete in popular games:

  • RTX 2080 is 11% faster in 1080p
  • RTX 2080 is 12% faster in 1440p
  • RTX 2080 is 12% faster in 4K
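Taken at face value (the source of these percentages isn't given), the "X% faster" figures translate into frame rates as follows; the 60 fps baseline is arbitrary, purely for illustration:

```python
# Convert the "X% faster" figures above into illustrative frame rates.
# Percentages are the poster's unsourced numbers; 60 fps baseline is arbitrary.
baseline_5050_fps = 60.0

for resolution, pct_faster in [("1080p", 11), ("1440p", 12), ("4K", 12)]:
    rtx2080_fps = baseline_5050_fps * (1 + pct_faster / 100)
    print(f"{resolution}: RTX 5050 at {baseline_5050_fps:.0f} fps "
          f"-> RTX 2080 at ~{rtx2080_fps:.1f} fps")
```

I.e. a gap of roughly 7 fps on a 60 fps baseline — close enough to call them the same tier, seven years apart.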

Edit

I should add I know raw power matters because I made a stupid mistake. When I first tried PC I jumped on a 3050 sale because I assumed a 3000 series meant better performance than a 2000 series... not so much. Many 2000 series cards outperform a 3050. Live and learn. With Nvidia the last two numbers, within reason, matter way more than the first two.

50 and 60 are budget. 70 and 70 Ti are mid tier. 80 and 90 are premium. I wouldn't trade my 4090 for a 5080, much less a 5070.

Architecture obviously matters and plays a role, but not to the extent people think. Raw power is insanely crucial. I will be stunned if my 4090 doesn't at least match the 6060, and likely it will easily best a 6050. Heck, there's a solid chance it will be at parity with a 6070, or at least not far behind.

Last edited by Chrkeller - on 13 January 2026

“Consoles are great… if you like paying extra for features PCs had in 2005.”

And the repeat continues over and over again.

1. Pretend to laugh because he is in panic
2. Ignore any and every question
3. Talk about how Nintendo changed from 20 years ago, when not asked
4. Say it's not the Wii or Wii U (repeat all consoles for the 45th time), again, not part of any question to him
5. Repeat the same list of games when talking about anything else, mostly hardware
6. Champion Nintendo, poor Nintendo, but nobody is attacking it; he heard it somewhere back then tho
7. Repeat ad aeternum

By this point it would be more fruitful to answer with a random cake recipe, or a list of your favorite movies by genre; the result will be the same, the answer cycle too, but at least someone else may find inspiration to go do something else.



BraLoD said:

And the repeat continues over and over again.

1. Pretend to laugh because he is in panic
2. Ignore any and every question
3. Talk about how Nintendo changed from 20 years ago, when not asked
4. Say it's not the Wii or Wii U (repeat all consoles for the 45th time), again, not part of any question to him
5. Repeat the same list of games when talking about anything else, mostly hardware
6. Champion Nintendo, poor Nintendo, but nobody is attacking it; he heard it somewhere back then tho
7. Repeat ad aeternum

By this point it would be more fruitful to answer with a random cake recipe, or a list of your favorite movies by genre; the result will be the same, the answer cycle too, but at least someone else may find inspiration to go do something else.

Or cite the 10 different predictions I made about the Switch 2 that were correct.

Why are you even in this thread? This isn't a PS4 thread, it's like a guy inserting his wife into a topic so he can get into an argument about a perceived slight about his wife when the topic wasn't even about her to begin with. 

I don't give a shit about the PS4 frankly. It has nothing to do with the Switch 2. Maybe the PS4 can run Star Wars Outlaws; probably it cannot without being seriously retooled, otherwise they probably would make a version of it, given they need every sale they can get. It's not like Ubisoft gets less money for a game on the PS4 versus PS5, so why would they care which platform the sales come from. Maybe the Sega Dreamcast can run it too. Maybe you can port it and tell us. That doesn't have much to do with the Switch 2 or Nintendo either.

If you're going to be in a thread that deals with Nintendo historically, sure you might yeah want to know about their shifting hardware stances over the decades and how that can change and how that relates to Switch 2. Because it's actually part of the topic. 

Nintendo made a decision to get out of the "arms race" but that was 20 plus years ago (hence why 20 years is brought up, not just randomly). That approach worked for a while and then fizzled out as has been discussed. That's not the Nintendo of today, that's a fair distinction to make. That era of Nintendo is as long gone as the NES was when the Wii launched. 

Last edited by Soundwave - on 13 January 2026

Soundwave said:
Norion said:

Why are you consistently conflating his claim that raw horsepower matters more than architecture with something else? Raw horsepower mattering more than architecture doesn't mean that architecture doesn't matter a ton over that length of time. Those are two completely different statements since raw horsepower mattering more and architecture being very important do not contradict each other at all and thus can obviously be simultaneously true.

His response was specifically focused on your claim that architecture matters more than raw horsepower so as I said if you think he's wrong about raw horsepower mattering more you should actually back up your claim instead of arguing against something he didn't say there.

If you're going to conflate a PS4 with a Switch 2, then yes, you bet your ass architecture matters massively. It's a 7 year gap; I shouldn't have to hold someone's hand and explain to them how that is a significant difference, that should just be a matter of fact statement. Raw specs of the PS4 and Switch 2 don't matter that much when you are dealing with a massive architecture difference. But whatever, that's how it goes.

You don't have to hold someone's hand and explain that to them since his response wasn't about that but alright. Just try to respond to what someone is actually saying whenever possible.

Chrkeller said:
Norion said:

Why are you consistently conflating his claim that raw horsepower matters more than architecture with something else? Raw horsepower mattering more than architecture doesn't mean that architecture doesn't matter a ton over that length of time. Those are two completely different statements since raw horsepower mattering more and architecture being very important do not contradict each other at all since raw horsepower could be even more important thus both of those things could be simultaneously true.

His response was specifically focused on your claim that architecture matters more than raw horsepower so as I said if you think he's wrong about raw horsepower mattering more you should focus on backing up your initial claim instead of arguing against something he didn't say there.

Because he is wrong and can't own it.  The 2080 runs with the 5050.  2018 and 2025 release dates, respectively, which is 7 years...

This is how RTX 2080 and RTX 5050 compete in popular games:

  • RTX 2080 is 11% faster in 1080p
  • RTX 2080 is 12% faster in 1440p
  • RTX 2080 is 12% faster in 4K

Edit

I should add I know raw power matters because I made a stupid mistake. When I first tried PC I jumped on a 3050 sale because I assumed a 3000 series meant better performance than a 2000 series... not so much. Many 2000 series cards outperform a 3050. Live and learn. With Nvidia the last two numbers, within reason, matter way more than the first two.

50 and 60 are budget. 70 and 70 Ti are mid tier. 80 and 90 are premium. I wouldn't trade my 4090 for a 5080, much less a 5070.

Architecture obviously matters and plays a role, but not to the extent people think. Raw power is insanely crucial. I will be stunned if my 4090 doesn't at least match the 6060, and likely it will easily best a 6050. Heck, there's a solid chance it will be at parity with a 6070, or at least not far behind.

Yeah, I would absolutely say raw power is more important currently, since a 4090 is gonna age incredibly well; you say 6060, but it should run games better than even the 7060 despite the gap between the two being close to 7 years. The 4090 is just that much of a beast that it's still the clear 2nd best GPU three years later. Also, the 5060 doesn't even beat the 3070, so people who bought the latter at launch and used it for 5 years will have a significantly better experience than people who get and use a 5060 for that long.

GPUs last longer than ever, so if anything architecture is less important than it has ever been compared to raw power. Like, a 2080 Ti is gonna still be running the most demanding games out there okay after it turns 10 years old in late 2028, cause it was given enough power to last over a decade, whereas a 3050 is gonna be struggling massively by the end of this decade despite being a few years younger.

Last edited by Norion - on 13 January 2026

I mean, yes the Switch 2 has several advantages over the PS4 (DLSS, hardware raytracing, more RAM, fast I/O) and as such is a more capable system overall, that much is true.

At the same time, the Switch 2 is still Nintendo staying out of the graphics race just by the fact that it's a portable device, and that it's a $450 mass market machine. It's still a capable piece of kit for the price, but it's not a high end machine, and that's a good thing because if it was it would cost a fortune and sell poorly.



Norion said:
Soundwave said:

If you're going to conflate a PS4 with a Switch 2, then yes, you bet your ass architecture matters massively. It's a 7 year gap; I shouldn't have to hold someone's hand and explain to them how that is a significant difference, that should just be a matter of fact statement. Raw specs of the PS4 and Switch 2 don't matter that much when you are dealing with a massive architecture difference. But whatever, that's how it goes.

You don't have to hold someone's hand and explain that to them since his response wasn't about that but alright. Just try to respond to what someone is actually saying whenever possible.

Chrkeller said:

Because he is wrong and can't own it.  The 2080 runs with the 5050.  2018 and 2025 release dates, respectively, which is 7 years...

This is how RTX 2080 and RTX 5050 compete in popular games:

  • RTX 2080 is 11% faster in 1080p
  • RTX 2080 is 12% faster in 1440p
  • RTX 2080 is 12% faster in 4K

Edit

I should add I know raw power matters because I made a stupid mistake. When I first tried PC I jumped on a 3050 sale because I assumed a 3000 series meant better performance than a 2000 series... not so much. Many 2000 series cards outperform a 3050. Live and learn. With Nvidia the last two numbers, within reason, matter way more than the first two.

50 and 60 are budget. 70 and 70 Ti are mid tier. 80 and 90 are premium. I wouldn't trade my 4090 for a 5080, much less a 5070.

Architecture obviously matters and plays a role, but not to the extent people think. Raw power is insanely crucial. I will be stunned if my 4090 doesn't at least match the 6060, and likely it will easily best a 6050. Heck, there's a solid chance it will be at parity with a 6070, or at least not far behind.

Yeah, I would absolutely say raw power is more important currently, since a 4090 is gonna age incredibly well; you say 6060, but it should run games better than even the 7060 despite the gap between the two being close to 7 years. The 4090 is just that much of a beast that it's still the clear 2nd best GPU three years later. Also, the 5060 doesn't even beat the 3070, so people who bought the latter at launch and used it for 5 years will have a significantly better experience than people who get and use a 5060 for that long.

GPUs last longer than ever, so if anything architecture is less important than it has ever been compared to raw power. Like, a 2080 Ti is gonna still be running the most demanding games out there okay after it turns 10 years old in late 2028, cause it was given enough power to last over a decade, whereas a 3050 is gonna be struggling massively by the end of this decade despite being a few years younger.

100%. The 4090 will absolutely compete with the budget 7000 series. I just wish I had known all this before buying a 3050, but that is on me. Raw power matters, a lot.


