
Forums - Nintendo - Was Nintendo right to opt out of the graphics arms race?


Was it the right decision?

Yes: 74 (88.10%)
No: 10 (11.90%)
Total: 84
Soundwave said:
Chrkeller said:

Bang for buck, the S2 kills portable PCs, imo. The S2 holds its own for half the price. I do not think anybody thinks otherwise, so it is an imaginary battle. The major benefit of PC portables is a Steam library that puts any console to shame. The real limit of mobile PCs is that AMD upscaling is utter crap.

And I think we need to be fair and honest about positions. Nobody is saying the S2 is cheap budget hardware. Nobody is saying the S2 falls massively behind current mobile PCs. People are opposed to the claim that the S2 is premium, because it is not. People are against the claim that the S2 is generations above the PS4, because it isn't.

Yesterday was a shit show, we should do better, and it starts with debating the actual positions people are taking, not imagined positions.

Who cares if someone says the Switch 2 performs comparably to a premium handheld device? So what if someone says that? What Bible did you write that makes that impossible to say? Even if that was someone's opinion, so what? Who died and decreed that 3 or 4 people on this board get to decide which hardware gets categorized as such? All this gatekeeping bullshit about what can be considered good hardware from self-imposed elitists is really tiresome. "People are opposed"? There's like 10 people total that post on this board, lol; it's a sad state of affairs to begin with.

In my opinion this hardware result is very impressive, well beyond even my expectation and I had high expectations for this device, higher than most on this board. Further to that, I would say this is clearly a different Nintendo era under different leadership and it shows, this is much more in line with what the N64 and GameCube were for their day ... which was very impressive hardware for that time, with caveats made considering the Switch 2 is a portable device of course. 

Regarding generations, I was clearly discussing PC architecture and PC architecture generations, Ampere/Lovelace are several generations removed from GCN 2.0 that was used for the PS4. That's not even considering that Nvidia's architecture is generally considered better than AMD's to begin with (they don't have 90% plus GPU marketshare by random luck). If someone here wants to get their panties in a bunch about that, I frankly don't care. Ampere/Lovelace IS several GPU architecture generations beyond AMD GCN 2.0 from 2013. There's nothing you're going to post or anyone is going to post that will change that. 

I am genuinely and sincerely just trying to avoid another disaster. You appeared to be arguing against Perma on something he didn't say. If I am mistaken, that is on me, and apologies.

Not just your opinion; best I can tell, we are all impressed with the S2. Especially the third-party support, way better than I expected. The S2 is excellent.

The 2080 is from 2018, the 5050 from 2025. That is 7 years and multiple generations apart, yet the 2080 has 12% better performance. Fact. So being generations ahead does not offset raw power, fact. We should start accepting facts. Again, I feel bad about playing a role in derailing a really good thread, so let us get it back on topic.



“Consoles are great… if you like paying extra for features PCs had in 2005.”
Chrkeller said:
Soundwave said:

Who cares if someone says the Switch 2 performs comparably to a premium handheld device? So what if someone says that? What Bible did you write that makes that impossible to say? Even if that was someone's opinion, so what? Who died and decreed that 3 or 4 people on this board get to decide which hardware gets categorized as such? All this gatekeeping bullshit about what can be considered good hardware from self-imposed elitists is really tiresome. "People are opposed"? There's like 10 people total that post on this board, lol; it's a sad state of affairs to begin with.

In my opinion this hardware result is very impressive, well beyond even my expectation and I had high expectations for this device, higher than most on this board. Further to that, I would say this is clearly a different Nintendo era under different leadership and it shows, this is much more in line with what the N64 and GameCube were for their day ... which was very impressive hardware for that time, with caveats made considering the Switch 2 is a portable device of course. 

Regarding generations, I was clearly discussing PC architecture and PC architecture generations, Ampere/Lovelace are several generations removed from GCN 2.0 that was used for the PS4. That's not even considering that Nvidia's architecture is generally considered better than AMD's to begin with (they don't have 90% plus GPU marketshare by random luck). If someone here wants to get their panties in a bunch about that, I frankly don't care. Ampere/Lovelace IS several GPU architecture generations beyond AMD GCN 2.0 from 2013. There's nothing you're going to post or anyone is going to post that will change that. 

I am genuinely and sincerely just trying to avoid another disaster. You appeared to be arguing against Perma on something he didn't say. If I am mistaken, that is on me, and apologies.

Not just your opinion; best I can tell, we are all impressed with the S2. Especially the third-party support, way better than I expected. The S2 is excellent.

The 2080 is from 2018, the 5050 from 2025. That is 7 years and multiple generations apart, yet the 2080 has 12% better performance. Fact. So being generations ahead does not offset raw power, fact. We should start accepting facts. Again, I feel bad about playing a role in derailing a really good thread, so let us get it back on topic.

I don't even have much interest in PC architecture generations other than to say they do exist, which is not really debatable. Using the 20-series as a baseline can be misleading because the 20-series was a massive leap forward, the leap where DLSS and tensor core usage and all of Nvidia's AI investment really took off in a tangible way. Not every PC generation is going to have that level of a jump, that doesn't mean PC architectures don't exist or are meaningless. There is a significant difference between AMD GCN 2.0 architecture and Nvidia Ampere/Lovelace, again if someone is upset by that comment, fine. 

The 20-series is also, ironically, probably the beginning of the end of Nvidia giving a crap about gaming, as that level of technology soon started to be eaten up by AI servers, creating the modern AI boom. I don't think going forward Nvidia really gives that much of a crap if gamers whine that the 40, 50, 60, or 70 series are smaller leaps. They don't care now; why should they? Aside from having almost a quasi-monopoly (92%+ market share), they make 100x more money from AI companies than they do from gamers.

Developers can't afford budgets or dev times getting any longer anyway, so you're hitting up against a natural ceiling here to begin with, where both tech producers (Nvidia) and even the content companies (game publishers) have no incentive left to push the envelope that much further. 

But again, none of that has anything to do with this thread's topic and should never have made that much of a fuss to begin with. 

Last edited by Soundwave - on 14 January 2026

Soundwave said:
Chrkeller said:

I am genuinely and sincerely just trying to avoid another disaster. You appeared to be arguing against Perma on something he didn't say. If I am mistaken, that is on me, and apologies.

Not just your opinion; best I can tell, we are all impressed with the S2. Especially the third-party support, way better than I expected. The S2 is excellent.

The 2080 is from 2018, the 5050 from 2025. That is 7 years and multiple generations apart, yet the 2080 has 12% better performance. Fact. So being generations ahead does not offset raw power, fact. We should start accepting facts. Again, I feel bad about playing a role in derailing a really good thread, so let us get it back on topic.

I don't even have much interest in PC architecture generations other than to say they do exist, which is not really debatable. Using the 20-series as a baseline can be misleading because the 20-series was a massive leap forward, the leap where DLSS and tensor core usage and all of Nvidia's AI investment really took off in a tangible way. Not every PC generation is going to have that level of a jump; that doesn't mean PC architectures don't exist or are meaningless. There is a significant difference between AMD GCN 2.0 architecture and Nvidia Ampere/Lovelace; again, if someone is upset by that comment, fine.

The 20-series is also, ironically, probably the beginning of the end of Nvidia giving a crap about gaming, as that level of technology soon started to be eaten up by AI servers, creating the modern AI boom. I don't think going forward Nvidia really gives that much of a crap if gamers whine that the 40, 50, 60, or 70 series are smaller leaps. They don't care now; why should they? Aside from having almost a quasi-monopoly (92%+ market share), they make 100x more money from AI companies than they do from gamers.

But again, none of that has anything to do with this thread's topic and should never have been a debatable point to begin with. 

to my earlier point, we shouldn't argue positions nobody took.  Not a single person said they don't exist.  We all agree they exist.  

Your argument was that a 7-year difference in architecture is massive. Not always, as is the case with the 2080 vs the 5050.

Nobody said they are meaningless. Again, we shouldn't argue imaginary battles. My argument, which I believe Norion agrees with, is that raw power trumps architecture generations in most cases, especially recently. Likely, to your point, that trend will continue. A lot of people, myself included when I built my first gaming rig, rely on accurate information. Posting that architecture generations mean more than raw power is inaccurate and could mislead people looking to get into gaming. It would be tragic for someone to pass up on, say, a 3070 in favor of a 5050 because they read that architecture matters more.

Again, we should debate positions people took. And to be fair, I am not the only one who has pointed this out to you. It is universally agreed this is a problem in this particular thread.

edit

According to benchmarks I googled, a 1080 outperforms a 3050 by 25%. And you are saying the 2000 series was a big jump in architecture, and yet... raw power.
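The benchmark comparisons above amount to simple ratio arithmetic. The sketch below uses placeholder relative-performance scores (illustrative assumptions, not measured benchmark data) to show how an older, bigger card can outscore a newer entry-level one despite being architecture generations behind:

```python
# Illustrative relative-performance scores (placeholder values, NOT real
# benchmark numbers): a newer architecture does not guarantee a faster
# card if the raw compute budget is smaller.
cards = {
    "GTX 1080 (Pascal, 2016)": 100,
    "RTX 3050 (Ampere, 2022)": 80,
    "RTX 2080 (Turing, 2018)": 140,
    "RTX 5050 (Blackwell, 2025)": 125,
}

def relative_gap(a: str, b: str) -> float:
    """Percent by which card `a` outperforms card `b`."""
    return (cards[a] / cards[b] - 1) * 100

# Older high-tier cards beating newer entry cards, per the placeholder scores:
print(round(relative_gap("GTX 1080 (Pascal, 2016)", "RTX 3050 (Ampere, 2022)")))   # 25
print(round(relative_gap("RTX 2080 (Turing, 2018)", "RTX 5050 (Blackwell, 2025)")))  # 12
```

The scores were chosen only to reproduce the 25% and 12% gaps quoted in the posts; real benchmark results vary by workload.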

Last edited by Chrkeller - on 14 January 2026


There is a big difference between AMD GCN 2.0 and Nvidia Ampere/Lovelace. No one here has presented any kind of reasoned point against that. Put it this way: if Nintendo built the Switch 2 around AMD GCN 2.0 architecture, of course there would be a huge fucking difference, and the system probably would be exactly what some here are so desperate for it to be, apparently, which would be a quite underpowered and outdated system that would likely struggle to run modern games in a portable power envelope. Too bad for folks who invested hard in that narrative that Nintendo didn't go that way, I guess.

I used the 7-year gap as a secondary explanation because the poster in question who I was responding to wasn't understanding even that. I'm not a high school teacher; it's not my job to explain something 300 different ways until someone understands it. At some point you either get it or you don't.

Again this is not even what this thread is about, just another case of selective outrage and cherry picking something that isn't even the central point being made. 


Last edited by Soundwave - on 14 January 2026

Soundwave said:

There is a big difference between AMD GCN 2.0 and Nvidia Ampere/Lovelace. No one here has presented any kind of reasoned point against that. Put it this way: if Nintendo built the Switch 2 around AMD GCN 2.0 architecture, of course there would be a huge fucking difference, and the system probably would be exactly what some here are so desperate for it to be, apparently, which would be a quite underpowered and very outdated system that would likely struggle to run modern games in a portable power envelope. Too bad for folks who invested hard in that narrative that Nintendo didn't go that way, I guess.

I used the 7-year gap as a secondary explanation because the poster in question who I was responding to wasn't understanding even that. I'm not a high school teacher; it's not my job to explain something 300 different ways until someone understands it. At some point you either get it or you don't.

Again this is not even what this thread is about, just another case of selective outrage and cherry picking something that isn't even the central point being made. 


You also said the 2000 series was a massive leap, and yet a 1080 outperforms a 3050.  At the end of the day, I can't make you accept facts.  That is exclusively under your remit.  

As for the bolded part, in 2023 I went on record that the S2 would be able to run modern games..  so you win yet another imaginary battle?  



Chrkeller said:
Soundwave said:

There is a big difference between AMD GCN 2.0 and Nvidia Ampere/Lovelace. No one here has presented any kind of reasoned point against that. Put it this way: if Nintendo built the Switch 2 around AMD GCN 2.0 architecture, of course there would be a huge fucking difference, and the system probably would be exactly what some here are so desperate for it to be, apparently, which would be a quite underpowered and very outdated system that would likely struggle to run modern games in a portable power envelope. Too bad for folks who invested hard in that narrative that Nintendo didn't go that way, I guess.

I used the 7-year gap as a secondary explanation because the poster in question who I was responding to wasn't understanding even that. I'm not a high school teacher; it's not my job to explain something 300 different ways until someone understands it. At some point you either get it or you don't.

Again this is not even what this thread is about, just another case of selective outrage and cherry picking something that isn't even the central point being made. 


You also said the 2000 series was a massive leap, and yet a 1080 outperforms a 3050.  At the end of the day, I can't make you accept facts.  That is exclusively under your remit.  

As for the bolded part, in 2023 I went on record that the S2 would be able to run modern games..  so you win yet another imaginary battle?  

You presented a host of opinions over the years here, many/most of them, especially your initial claims, like that the Switch 2 wouldn't be able to run something like FF7 Remake Intergrade (a game that is launching on the system in a few weeks), as one example. You then retreated from those positions and made token concessions here and there, but continued to doom post in virtually every thread about the system.

That's on you, not on me. You want to be mad at me over all that bullshit and gatekeeping, go ahead if it makes you feel better. I'm willing to drop it, but if you want to keep pretending none of that ever happened and keep engaging in these threads, I'm perfectly fine with calling you out too. 

The GCN 2.0 example is actually a wonderful one. Nintendo using GCN 2.0 architecture would have been so sweet for you; that's exactly the type of architecture/chipset choice Nintendo would have had to make to fit perfectly into the outdated/underpowered talking points. Maybe that would have been exactly the design Iwata-era Nintendo might have chosen. Maybe not. Unfortunately for you, I guess, that Nintendo doesn't exist any more; times have changed.

Last edited by Soundwave - on 14 January 2026

OdinHades said:

For me personally, yes, absolutely. I stopped caring about graphics about ten years ago, because I can't see a friggin' difference anymore between low and ultra details. Sure, I can watch some Digital Foundry video that shows me a still image with 16x zoom to show me some detail that looks slightly better than last gen or something. But I prefer to spend my time actually playing games. Graphics have gotten good enough for me to not care about small details anymore. I used to be impressed by something like Crysis or Killzone 2 when those games initially released. But today when I see something like AC: Shadows or whatever the hell might be the shiniest game right now, I just think "looks nice", and that's it. Graphics just don't impress anymore, as the jumps have gotten too small to really make a huge difference. That also applies to path tracing and stuff. Yes, the lighting in path-traced Half-Life 2 looks gorgeous. But it's still Half-Life 2, and I kinda forget about the lighting or other stuff when I'm 30 minutes or so into a game.

Nintendo's route led to a portable system, which is absolutely fantastic for someone like me who travels a lot. Because on a train, on a plane, or in a hotel room, my Switch 2 is indeed infinitely more powerful than my PS5 Pro that is sitting at home doing nothing. Yes, I know the PS Portal is a thing, but I don't like streaming, with all the compression, lost connections, delay and stuff.

I prefer to get games on Switch 2 when they are available, although I do have a PS5 Pro. Portability beats the slight graphics upgrade any day of the week for me.

I'm a firm believer that most people will never notice the difference between 1080p and 4K images/video until you tell them that there is a difference. Or the difference between a game running on PS4 vs. PS5 unless there is an obvious contrast in lighting and FPS. Graphics have advanced so far now that generational leaps just aren't what they used to be, and untrained (or uncaring) eyes have to really squint to make out fine details. I'll readily admit that I'm one of those people.

So many times I'll read message boards or comment sections about how much better a Switch game runs on other platforms and come away with reservations about it. Then when I go watch gameplay in motion or a DF comparison, I come out afterward thinking it's good enough for me; if a resolution downgrade and some extra slowdown and pop-in here and there means I don't have to pay an extra $400 and make room for an extra box, so be it.



Switch 2 kinda ended up about where I expected it to be in terms of power, though a little bit better as I honestly didn't expect we'd get 120fps support given the limitations of a portable form factor.

It's technically correct to say that it's multiple GPU generations (not console generations) ahead of last gen in terms of feature set thanks to DLSS/hardware raytracing/etc, but it does need to be stressed that power also matters and in that regard it's also correct to say it's closer to PS4 in raw power simply due to the physical limits of fitting in a tiny case and running off a battery.

What this means for games is that it is capable of running the same games as PS5/Xbox Series, as evidenced by the likes of Star Wars Outlaws and Assassin's Creed Shadows, provided you adjust settings and such. It's also capable of exceeding PS4 and handling stuff PS4 can't, though not to the point of being an entire (console) generation ahead.



curl-6 said:

Switch 2 kinda ended up about where I expected it to be in terms of power, though a little bit better as I honestly didn't expect we'd get 120fps support given the limitations of a portable form factor.

It's technically correct to say that it's multiple GPU generations (not console generations) ahead of last gen in terms of feature set thanks to DLSS/hardware raytracing/etc, but it does need to be stressed that power also matters and in that regard it's also correct to say it's closer to PS4 in raw power simply due to the physical limits of fitting in a tiny case and running off a battery.

What this means for games is that it is capable of running the same games as PS5/Xbox Series, as evidenced by the likes of Star Wars Outlaws and Assassin's Creed Shadows, provided you adjust settings and such. It's also capable of exceeding PS4 and handling stuff PS4 can't, though not to the point of being an entire (console) generation ahead.

Is there even any Ampere-based GPU from Nvidia that can't run PS5-tier games? I don't think so; even the lowest-specced laptop 30-series cards can run PS5-tier games. The only one where it gets dicey at times is the rare 3050 variant that I think has only 6GB of RAM, but that's purely because it has only 6GB of RAM as a bottleneck.

Switch 2 is the most extreme stress test of the Ampere architecture, but even at like 10 watts you still can't stop it from running games like Star Wars Outlaws even with ray tracing turned on. 

I doubt this is by accident or fluke either. Nintendo worked on this chip as a custom design and got exactly the level of power they wanted this time around, unlike the Switch 1, where they basically had to take a chip off the shelf (Tegra X1) that they didn't assist in developing. T239 is a custom design made especially for Nintendo. I think they definitely asked for a chip that could run PS5/XSS-tier games without being outrageously expensive, because they knew they had a good chance at getting franchises like the console Monster Hunter, Final Fantasy, etc. with how well the Switch 1 sold.

It's highly unlikely that the system just happens to be able to run PS5/XSS games that are already out or coming soon by accident or happy coincidence.

If they wanted just PS4 tier performance or a bit better and something just good enough to make Nintendo's own games look pretty, I don't think this is the chip they would use (again look at how it performs in things like Assassin's Creed Shadows versus a $1000 handheld). They could have done that for way less money with a lighter chip that eats less battery and could've given you a smaller system too. 

Think about Nintendo's hardware division today: most of the old farts are now gone, and the younger guys running the show didn't grow up in the 1950s/60s like Iwata did, when things like a pocket calculator were amazing. They grew up, or were in their prime gaming-in-their-20s era, with the Super Famicom and PlayStation, and probably had gaming PCs in the 90s too. Someone like that is likely not going to have the same thoughts on hardware the old farts did. So that's not terribly surprising, at least to me anyway; things were always going to change a lot with these new people in charge of the company, and with guys like even Miyamoto being semi-retired, focused on movies/theme parks.



Soundwave said:

And literally every console could have used a better chip. The PS5 could have had a better chip. The Gamecube could have had a better chip. The PS2 could have had a better chip. The XBox Series S/X, there were better chips available, why is it for this one console we need to act like Nintendo has committed some kind of mortal sin? The chip they chose performs very well and I have no problem giving them props for that.

No shit. They could always be worse as well.

I place the Switch 2 on the same pedestal as all the other consoles; they have all drawn criticism and praise from me for various aspects. The Switch 2 doesn't get to exist in a vacuum where it doesn't get criticized in areas where it has fallen short.

The Switch 2 has several aspects where it is a regression over the Switch 1 OLED... And those aspects are relevant to criticize.



Soundwave said:

The Nvidia leak has been proven to be correct, unless you think they just randomly guessed the CUDA core number and other facts from out of thin air (in which case maybe they should buy a lottery ticket), so yes the onus then lies on you to disprove that and show where it is incorrect if you have a problem with those conclusions. 

Leaks can only be proven correct in fucking hindsight. They can be right, they can be wrong; they are NOT reliable. JFC.

I made many educated guesses early in the Switch 1's lifecycle about what the Switch 2 hardware was going to be.
E.g. Tegra Orin. Many people refuted that based on how old the chipset would be at the time of an expected Switch 2 launch.

It was an educated guess. I could have been wrong. I could have been right. It's irrelevant.

People who believe in leaks and rumors are actually being foolish.


Soundwave said:

The Orin chip is not suitable to be put 1:1 into a portable game console; it has a lot of shit in it that's useless for a game console, and it's a massive chip for a portable. It has a 455 mm² die, larger than a freaking launch PS5's; there's no way it would get even an hour of battery life unless Nintendo completely changed the design and made a bulkier, more expensive, heavier system for little gain.

Size is an absolute shit excuse I am afraid.

Samsung has half a dozen nodes available to shrink a full-fat Tegra Orin 64GB AGX down to a fraction of its size.

You will also need to list the "irrelevant shit" inside of Orin, otherwise that claim can and will be discarded.

Soundwave said:

This is Switch 2 versus a $1000 portable (is that premium, or what do you want to call that? $1000 is budget friendly?). I would say this level of performance is significantly better than just "OK"; this is holding its own fairly well against an extremely expensive gaming device. This comparison is also, I believe, without the early-release performance patch for the Switch 2, and likely there will be other patches coming which will improve the performance of the Switch 2 version.

I can cherry pick videos as well.

The ROG Ally X has longer battery life than the Switch, and older games that don't rely on upscaling or ray tracing tend to look and run better on the ROG Ally X, as they're better suited to the 4-year-old AMD GPU technology in the ROG.

The Steam Deck runs legacy games like Crysis better than the Switch 2, and it has longer battery life and a far better screen.

The GPD Win 5 decimates all other handhelds on the market. THAT is a premium device that has opted for the best hardware available... It's essentially a portable PS5.




Soundwave said:

Nintendo could easily have charged $600 for this hardware if they really wanted to. Sure, the ROG Ally X is better in some respects, but this is also a lot closer than it has any business being; the ROG Ally X is over 2x the cost of a Switch 2. It's just hilarious how many pretzels some people will twist themselves into to avoid acknowledging that this is probably a very different era of Nintendo hardware. This result is much more in line with systems like the GameCube and N64, which did have impressive hardware performance for their day. And this isn't likely even the best the Switch 2 can ever do; there likely will be better ports, as the system is still early and Nintendo sent out dev kits late, so there will be better examples of the Switch 2's hardware as time goes on. That is undocked only for the Switch 2 also, obviously; docked mode may have better results in some areas.

Nintendo cannot charge anything they like. They need to appeal to the mass market in order to build a sizable install base to sell software.

The ROG Ally X doesn't. It's profitable regardless of how many units it sells... Nor is Asus' entire company tied to a single device or platform.

The Switch 2 is not positioned in the market like the GameCube or Nintendo 64 was.
It's mid-range in the handheld space... It's low-end compared to other consoles, falling roughly in line with the low-end Xbox Series S.

Soundwave said:

To get this result too while the Switch 2 only runs at about 9-10 watts undocked from 8nm whereas the ROG Ally X there is using 20+ watts is also fairly impressive. The new hardware team at Nintendo and Nvidia did some impressive work in getting this level of performance from that low of a power draw (am I allowed to say that? Or is that not allowed here?). Watt for watt I don't think there's anything on the market that gets this performance at 10 watts. 

The ROG Ally X operating at 20 W still gets longer battery life than the Switch 2 at 9 W.

So the higher power draw is actually irrelevant.
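The battery point reduces to simple arithmetic: runtime is roughly battery capacity divided by average power draw, so a device drawing more watts can still run longer if its battery is big enough. The capacity figures below are round illustrative numbers, not official specs for either device:

```python
# Rough runtime model: hours ~= battery capacity (Wh) / average draw (W).
# Capacities are round placeholder figures for illustration only.
def runtime_hours(capacity_wh: float, draw_w: float) -> float:
    return capacity_wh / draw_w

switch2_hours = runtime_hours(20.0, 10.0)  # ~20 Wh pack at ~10 W draw
ally_x_hours = runtime_hours(80.0, 20.0)   # ~80 Wh pack at ~20 W draw

print(switch2_hours, ally_x_hours)  # 2.0 4.0: higher draw, yet longer runtime
```

This is why raw wattage alone doesn't settle the battery-life argument either way; pack size matters just as much.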

Is the Tegra Orin chip good for its die size and power consumption? Absolutely. But let's not pretend the Switch beats the ROG Ally X in every single metric... Because it simply doesn't.
There are many games that run and look better on the ROG Ally X, and you get longer battery life, a far better display... And a bigger games library.
The games are also cheaper on the ROG Ally X, and you don't need to pay a stupid subscription to play online.


Chrkeller said:

Bang for buck, the S2 kills portable PCs, imo. The S2 holds its own for half the price.

Bang for buck, the Switch Lite kills the Switch 2. The Switch Lite is half the price and has a huge (and amazing) games library.

Horses for courses.

Chrkeller said:

On a side note, the S2 pro controller is my new favorite controller.  

Is it actually worth upgrading from the S1 Pro controller? I've been thinking of upgrading but trying to find justification.


Chrkeller said:

Largely agree. I mean, I can see a difference between medium and ultra, but I've gotten to the point where it just doesn't have a punch that I care about. In fact, I find myself using medium/high settings on PC just for the increased fps. For me, it feels like fps is the next big jump. I absolutely loved Prime 4's 120 fps mode. The one thing I love about Nintendo: when possible, they prioritize fps.

I actually preferred to run Prime 4 on my Switch OLED over my Switch 2.
Whilst 120 Hz is nice, it's just not a great experience on that terrible panel.

The OLED contrast just made the game "pop" visually, plus the response time of OLED was simply superior... Not to mention the longer battery life... And I can use my Switch 2 cart in both systems.

I think it's a testament really to how well they optimized the game not just for the Switch 1, but enabling higher options on Switch 2, it's a lean game engine with great art assets.


curl-6 said:

It's technically correct to say that it's multiple GPU generations (not console generations) ahead of last gen in terms of feature set thanks to DLSS/hardware raytracing/etc, but it does need to be stressed that power also matters and in that regard it's also correct to say it's closer to PS4 in raw power simply due to the physical limits of fitting in a tiny case and running off a battery.

Switch 1 was based on Maxwell...

So you had: Pascal (GTX 10xx), Turing (GTX 16xx - RTX 20xx) and finally Ampere (RTX 30xx).

However in PC terms... There is a clear split between Non-RT hardware and RT-Hardware, which was a generational leap.

Arguably, I would place the Switch 2 just a little above the PS4: it's got a better CPU, it's got comparable memory bandwidth (once you account for delta color compression, tile-based rendering, culling, etc.), and it has extra resources to perform new and more advanced rendering tricks that allow the hardware to punch above its paper specifications.




www.youtube.com/@Pemalite