
How Will the Switch 2 Be, Performance-Wise?

 

Switch 2 is out! How do you classify it?

Terribly outdated! - 3 votes (5.26%)
Outdated - 1 vote (1.75%)
Slightly outdated - 14 votes (24.56%)
On point - 31 votes (54.39%)
High tech! - 7 votes (12.28%)
A mixed bag - 1 vote (1.75%)
Total: 57 votes
Pemalite said:

You have twisted my statements to try and fit your narrative.

Nintendo didn't purchase the Tegra X1 as a "Hand me down". That's your false statement.

The Tegra X1 is not "Custom designed" for any form factor. nVidia made a chipset and threw it everywhere they could... Which is why it ended up in tablets, HTPCs, handhelds and automotive.
It's literally not custom designed for anything, it's a general purpose SoC with an emphasis on GPU capability.

nVidia literally every single year... takes a chip, adjusts its clockspeeds and sells it to a different market, form factor or price bracket. Tegra X1 is no different.
Nintendo had to make the decision to be conservative with clocks to keep TDP and battery life within an acceptable "worst-case scenario".
nVidia had nothing to do with that clockspeed decision, that was Nintendo, which is why Switch consoles are capable of overclocking so well.

I never said the X1 was customized for any specific form-factor. In fact, that is my original point. It was designed to be used in many different devices with many different purposes. Also, let's not become the pot calling the kettle black with "you have twisted my statements to try to fit your narrative." You have a tendency to splice statements out of their original context and reply to them as if they aren't part of a stream of thought. You do this often and to many people. Besides, I didn't "twist your statement"; I addressed that you put words in my mouth that I never said, like the idea that hardware was sitting somewhere and then was being used in Switches.

The cost of designing the chip was already done and dusted before the Tegra X1 even released.
The GPU is derived from their Desktop offerings, being Maxwell based.

nVidia then licensed the ARM cores from... Well. ARM.

What do you think happened? nVidia designed the CPU and GPU architectures from scratch? No. No. That didn't occur.

It is no more of a "hand me down" chip than what the Xbox Series X, PlayStation 5 and Switch 2 have... They are all based on already-developed CPU and GPU architectures.

nVidia, AMD and Intel build "libraries" which are blocks of semiconductors that they can line up with other libraries in order to expedite time-to-market chip development by partnering one library up with another.

This is no different than the Switch 2 SoC being based on another nVidia chip used for IoT, cars and more.
It's based once again on ARM IP and uses the PC Ampere GPU architecture... Again, this technology is several years old at this point.

Every single design decision requires a labor force that typically makes six figures per FTE (in the U.S.) to plan, design, and test those decisions. Microarchitecture R&D is separate from the R&D for designing the actual chipsets. There were still FTEs that made decisions about what the specific SoCs look like with the X1, and they cost money to employ. This isn't some binary of "R&D is done when Maxwell [or other micro-architecture] comes out." Tons of decisions are made after that.

And yes, planning the development of a chipset specifically with one platform in mind and its form-factor does require more R&D than just plopping in an already designed general-purpose chip, turning off a few CPU cores, and adjusting clock-rates. Are you seriously arguing otherwise here? 

If I were to make an analogy, it's the difference between making a new video game and making a remaster of a game. In both scenarios you might be using middleware like a game engine (in this analogy, that would be the micro-architecture design that has already been done), but the latter (remaster : adjusting a few clock rates) requires a lot less work, FTEs, and money than the former (developing a new game : designing hardware to fit a specific form-factor and purpose over the course of years).

The Switch 2 SoC isn't based on the T234 in the same way the T210 in the Switch was based on the T210 in the Shield. I'm not sure where you got that idea. It was developed adjacent to it, and the design decisions were made with its form-factor and intended use in mind. Some decisions might have been made for Ampere-based Tegra in general, but not all of these decisions. That's why it doesn't, for example, include specialized edge-compute hardware but does have an FDE. Just because they both use the same basic architecture and share some design choices doesn't mean there isn't different R&D being done for these two chips beyond the micro-architecture and common chipset-family R&D.

Last edited by sc94597 - on 03 May 2025

Conina said:
Oneeee-Chan!!!2.0 said:

I was just thinking after seeing the Cyberpunk 2077 performance on the Switch 2, it's odd that the $449 Switch 2 is better than the $500 Steam Deck 256 GB, yet is criticized as being too expensive.

Well, that is one game so far, which gets a lot of porting effort.

The Steam Deck can play thousands of games which either aren't on Switch or where a Switch 2 upgrade patch isn't certain.

For example: God of War + Ragnarok, The Witcher 3, Resident Evil 2 - 4 Remake, Resident Evil 7 + Village, The Talos Principle 2, Deathloop, Wolfenstein 1 + 2, Doom + Doom Eternal, Horizon 1 + 2, Marvel Spider-Man 1 + MM + 2, Ratchet & Clank: Rift Apart, A Plague Tale Requiem, Forza Horizon 4 + 5, Dark Souls 3, Death Stranding...

I doubt that Cyberpunk is an exceptionally good example. We will probably find that the Switch 2 version is better in many games.

Oneeee-Chan!!!2.0 said:

Of course I checked and saw the actual price.
Yes, but most of them were $450-$550. It must be something like the MSRP for a graphics card.

In the Steam Store, the Steam Decks are always available for MSRP (or below). Free shipping included.

I didn't know they were selling the Steam Deck on the Valve store.
But the current price is $399; it was $529 at launch.



Cyberpunk gameplay video in docked mode.
Probably 1080p.



This is not a bad video comparison of Hogwarts Legacy, specifically Steam Deck vs. Switch 2. This may have been posted before, but I like this one better because he jumps directly back and forth between Switch 2 and Steam Deck; the original video, for whatever dumb reason, opted to throw in Switch 1 as well in the middle of the comparisons.

It's not the best comparison because the time of day in some of the shots is different, but as you go along the video the Switch 2 looks a lot cleaner image-quality-wise, and I think it's using better textures too. It kinda looks like the Steam Deck is on low or ultra-low settings and the Switch 2 is on medium settings with a better resolution (though this may just be the result of DLSS being used).



Soundwave said:

It's not the best comparison because the time of day in some of the shots is different, but as you go along the video the Switch 2 looks a lot cleaner image-quality-wise, and I think it's using better textures too. It kinda looks like the Steam Deck is on low or ultra-low settings and the Switch 2 is on medium settings with a better resolution (though this may just be the result of DLSS being used).

The Steam Deck seems to be set on lowest settings... even the textures, which makes no sense. The Steam Deck has 16 GB RAM and can handle higher textures without any performance decreases.

The screenshots of the Switch 2 version are from the trailer, which will of course be the docked mode, not the handheld mode.

No wonder that the docked Switch 2 version looks better than PC handheld footage in lowest settings.

Footage of the undocked Switch 2 version compared with the PC version with optimized settings on PC handhelds (Steam Deck, ROG Ally, Legion Go...) will be much more interesting.



sc94597 said:

I never said the X1 was customized for any specific form-factor. In fact, that is my original point. It was designed to be used in many different devices with many different purposes. Also, let's not become the pot calling the kettle black with "you have twisted my statements to try to fit your narrative." You have a tendency to splice statements out of their original context and reply to them as if they aren't part of a stream of thought. You do this often and to many people. Besides, I didn't "twist your statement"; I addressed that you put words in my mouth that I never said, like the idea that hardware was sitting somewhere and then was being used in Switches.

You stated Nintendo purchased the Tegra X1 as a "Hand me down". - It's a false statement.

sc94597 said:

Every single design decision requires a labor force that typically makes six figures per FTE (in the U.S.) to plan, design, and test those decisions. Microarchitecture R&D is separate from the R&D for designing the actual chipsets. There were still FTEs that made decisions about what the specific SoCs look like with the X1, and they cost money to employ. This isn't some binary of "R&D is done when Maxwell [or other micro-architecture] comes out." Tons of decisions are made after that.

Good thing the USA isn't the leader in chip manufacturing then.

sc94597 said:

And yes, planning the development of a chipset specifically with one platform in mind and its form-factor does require more R&D than just plopping in an already designed general-purpose chip, turning off a few CPU cores, and adjusting clock-rates. Are you seriously arguing otherwise here? 

The Tegra X1 is a standardized commodity part, like an SSD or RAM.

Nintendo purchases it, sets clocks and voltages as per their design specifications via firmware, then gets ANOTHER company to assemble it.

sc94597 said:

If I were to make an analogy, it's the difference between making a new video game and making a remaster of a game. In both scenarios you might be using middleware like a game engine (in this analogy, that would be the micro-architecture design that has already been done), but the latter (remaster : adjusting a few clock rates) requires a lot less work, FTEs, and money than the former (developing a new game : designing hardware to fit a specific form-factor and purpose over the course of years).

No one designs chips strictly for consoles anymore. They are assemblies of already pre-designed parts.

The days of Cell are over.

sc94597 said:

The Switch 2 SoC isn't based on the T234 in the same way the T210 in the Switch was based on the T210 in the Shield. I'm not sure where you got that idea. It was developed adjacent to it, and the design decisions were made with its form-factor and intended use in mind. Some decisions might have been made for Ampere-based Tegra in general, but not all of these decisions. That's why it doesn't, for example, include specialized edge-compute hardware but does have an FDE. Just because they both use the same basic architecture and share some design choices doesn't mean there isn't different R&D being done for these two chips beyond the micro-architecture and common chipset-family R&D.

Tegra T234 was designed primarily for Nvidia's Jetson AGX Orin for Industrial/HPC applications.

Nintendo took the Switch approach and used an already-designed nVidia chip for the Switch 2, making some clockspeed and voltage changes to fit within its design goals.

The T234 being Ampere- and ARM-based meant a quick time-to-market with low R&D investment, as the architectures were already designed and ratified for other markets.

Conina said:

The Steam Deck seems to be set on lowest settings... even the textures, which makes no sense. The Steam Deck has 16 GB RAM and can handle higher textures without any performance decreases.

The screenshots of the Switch 2 version are from the trailer, which will of course be the docked mode, not the handheld mode.

No wonder that the docked Switch 2 version looks better than PC handheld footage in lowest settings.

Footage of the undocked Switch 2 version compared with the PC version with optimized settings on PC handhelds (Steam Deck, ROG Ally, Legion Go...) will be much more interesting.

Keep in mind we are dealing with the strengths/weaknesses that we typically see when comparing AMD against nVidia, just this time in Handhelds.

That is... AMD Radeons tend to be significantly worse at upscaling and ray tracing, and are less efficient overall.

There are going to be instances where the Steam Deck is faster, but other instances where the Switch 2 is faster. More RAM, for instance, doesn't always guarantee better textures; you need the TMUs and the bandwidth to have the throughput to manage higher-quality textures.

Plus, of course, we have the wild-card that is developer porting; there are many instances where a vastly more powerful PC had its games running worse than the console version. (The reverse has happened as well, but much less often.)
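To put rough numbers on that texture-throughput point, here's a minimal back-of-the-envelope sketch in Python. The figures are purely hypothetical placeholders, not the actual Switch 2 or Steam Deck specs:

```python
# Rough texture-throughput arithmetic with made-up example numbers.
# Peak texel fill rate is roughly TMU count * GPU clock; lots of RAM
# doesn't help if the TMUs and memory bus can't feed the shaders.

def texel_fill_rate_gtexels(tmus: int, gpu_clock_mhz: float) -> float:
    """Peak texel fill rate in GTexels/s."""
    return tmus * gpu_clock_mhz / 1000.0

def memory_bandwidth_gbs(bus_width_bits: int, transfer_rate_mtps: float) -> float:
    """Peak memory bandwidth in GB/s (bus width in bits, transfer rate in MT/s)."""
    return (bus_width_bits / 8) * transfer_rate_mtps / 1000.0

# Hypothetical handheld A: fewer TMUs, higher clock
print(texel_fill_rate_gtexels(tmus=32, gpu_clock_mhz=1000))                # 32.0 GTexels/s
# Hypothetical handheld B: more TMUs, lower clock
print(texel_fill_rate_gtexels(tmus=48, gpu_clock_mhz=560))                 # ~26.9 GTexels/s
# A 128-bit bus at 6400 MT/s, regardless of how many GB sit behind it
print(memory_bandwidth_gbs(bus_width_bits=128, transfer_rate_mtps=6400))   # 102.4 GB/s
```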




www.youtube.com/@Pemalite

Pemalite said:

Keep in mind we are dealing with the strengths/weaknesses that we typically see when comparing AMD against nVidia, just this time in Handhelds.

That is... AMD Radeons tend to be significantly worse at upscaling and ray tracing, and are less efficient overall.

There are going to be instances where the Steam Deck is faster, but other instances where the Switch 2 is faster. More RAM, for instance, doesn't always guarantee better textures; you need the TMUs and the bandwidth to have the throughput to manage higher-quality textures.

Plus, of course, we have the wild-card that is developer porting; there are many instances where a vastly more powerful PC had its games running worse than the console version. (The reverse has happened as well, but much less often.)

I followed the link to the video where the Steam Deck screenshots are from.

They were made with FSR 1.0!

Since its release, the PC version has been upgraded to FSR 2.0, then FSR 2.2, and currently FSR 3.0; they also added frame generation, which allows much higher settings than low.

I just made a few comparison shots with FSR 3.0 on my Steam Deck OLED (settings below); they look a lot better than the ones in the video:
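As a side note on what those FSR presets actually mean for internal render resolution, here is a small sketch assuming the standard FSR 2/3 per-axis scale factors (Quality 1.5x, Balanced 1.7x, Performance 2.0x, Ultra Performance 3.0x) and the Steam Deck's 1280x800 panel as the output target:

```python
# Internal render resolution for the usual FSR 2/3 presets.
# Scale factors are per-axis: internal resolution = output / factor.

FSR_PRESETS = {
    "Quality": 1.5,
    "Balanced": 1.7,
    "Performance": 2.0,
    "Ultra Performance": 3.0,
}

def internal_resolution(out_w: int, out_h: int, factor: float) -> tuple[int, int]:
    return round(out_w / factor), round(out_h / factor)

# Steam Deck's native 1280x800 screen as the output target:
for name, factor in FSR_PRESETS.items():
    w, h = internal_resolution(1280, 800, factor)
    print(f"{name:17s} -> {w}x{h}")
# Quality           -> 853x533
# Balanced          -> 753x471
# Performance       -> 640x400
# Ultra Performance -> 427x267
```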



Pemalite said:
sc94597 said:

I never said the X1 was customized for any specific form-factor. In fact, that is my original point. It was designed to be used in many different devices with many different purposes. Also, let's not become the pot calling the kettle black with "you have twisted my statements to try to fit your narrative." You have a tendency to splice statements out of their original context and reply to them as if they aren't part of a stream of thought. You do this often and to many people. Besides, I didn't "twist your statement"; I addressed that you put words in my mouth that I never said, like the idea that hardware was sitting somewhere and then was being used in Switches.

1. You stated Nintendo purchased the Tegra X1 as a "Hand me down". - It's a false statement.

sc94597 said:

Every single design decision requires a labor force that typically makes six figures per FTE (in the U.S.) to plan, design, and test those decisions. Microarchitecture R&D is separate from the R&D for designing the actual chipsets. There were still FTEs that made decisions about what the specific SoCs look like with the X1, and they cost money to employ. This isn't some binary of "R&D is done when Maxwell [or other micro-architecture] comes out." Tons of decisions are made after that.

2. Good thing the USA isn't the leader in chip manufacturing then.

sc94597 said:

And yes, planning the development of a chipset specifically with one platform in mind and its form-factor does require more R&D than just plopping in an already designed general-purpose chip, turning off a few CPU cores, and adjusting clock-rates. Are you seriously arguing otherwise here? 

3. The Tegra X1 is a standardized commodity part, like an SSD or RAM.

Nintendo purchases it, sets clocks and voltages as per their design specifications via firmware, then gets ANOTHER company to assemble it.

sc94597 said:

If I were to make an analogy, it's the difference between making a new video game and making a remaster of a game. In both scenarios you might be using middleware like a game engine (in this analogy, that would be the micro-architecture design that has already been done), but the latter (remaster : adjusting a few clock rates) requires a lot less work, FTEs, and money than the former (developing a new game : designing hardware to fit a specific form-factor and purpose over the course of years).

4. No one designs chips strictly for consoles anymore. They are assemblies of already pre-designed parts.

The days of Cell are over.

sc94597 said:

The Switch 2 SoC isn't based on the T234 in the same way the T210 in the Switch was based on the T210 in the Shield. I'm not sure where you got that idea. It was developed adjacent to it, and the design decisions were made with its form-factor and intended use in mind. Some decisions might have been made for Ampere-based Tegra in general, but not all of these decisions. That's why it doesn't, for example, include specialized edge-compute hardware but does have an FDE. Just because they both use the same basic architecture and share some design choices doesn't mean there isn't different R&D being done for these two chips beyond the micro-architecture and common chipset-family R&D.

5. Tegra T234 was designed primarily for Nvidia's Jetson AGX Orin for Industrial/HPC applications.

Nintendo took the Switch approach and used an already-designed nVidia chip for the Switch 2, making some clockspeed and voltage changes to fit within its design goals.

The T234 being Ampere- and ARM-based meant a quick time-to-market with low R&D investment, as the architectures were already designed and ratified for other markets.

1. It is not false. The chipset in the original Switch SKU was identical in design to the T210 in the Nvidia Shield TV.

It is not just a matter of using the Shield's chipset as a base and then making specialized modifications for a hybrid closed-ecosystem console; it was the same exact chip design, just with a few CPU efficiency cores disabled (still present on the physical hardware) and lower clock speeds. These changes, as you noted, would be made by Nintendo, not Nvidia.

That is not how most console hardware, even today, inherits existing designs. The T239, for example, is a chipset designed for and used exclusively by the Switch 2. There is no "identical to device X" counterpart in the Switch 2's case. It inherits general design decisions from other Ampere and Tegra Orin chipsets, but it is its own chipset, which had to be specifically designed by (an) Nvidia SoC R&D team(s), using constraints from Nintendo and with its intended use-case in mind.

2. U.S. companies are the leaders in chip design (although not exclusively), and the design work does happen in the U.S. Why do you keep bringing up manufacturing and fabrication? Nvidia is not a chip manufacturer; they're a chip design company. Nintendo's contract with Nvidia is for chip design, and you know this because you say as much in your response here. The T239 would not exist if Nintendo hadn't worked with Nvidia to ask for it to be designed for them; Tegra Orin would just be an SoC family with one fewer SoC. The T210, by contrast, would still exist even if Nintendo had never contracted with Nvidia for the Switch. That's the difference. That is what is meant by "hand me down."

3. Yes, and the T239, comparatively, is not. Which is my point. It's not for purchase by any company other than Nintendo. It was designed specifically for Nintendo and the Switch 2, with their input and constraints, working with Nvidia. No other device will use the T239.

4. See points #1 and #3. There are no other devices that will use the T239. 

5. And the Tegra T234 is not the chipset in the Switch 2. The Tegra T239 that is in the Switch 2 is a different SoC from the T234. It's not the same situation as the original Switch, where the same exact chipset (T210) was popped into the Switch, the efficiency core cluster deactivated (not even removed), and the frequencies changed. The T239 has specialized accelerators for gaming-specific tasks, different SM/core counts, and has the specialized edge-compute accelerators (NVDLA) removed compared to the T234. You don't see anything like this in the T210 vs. T210 (in another device) situation, because that was an actual general-purpose chip meant to be used in tablets (amongst other devices) and not specifically designed for edge compute the way the T234 is.

Nintendo was able to just purchase the T210 from Nvidia as-is, because it worked well enough. They couldn't do that with a T234 (for its size alone). And even then, Nintendo had to disable the efficiency core cluster that physically existed in the Switch, which they wouldn't have done if they had a chipset designed specifically for the Switch, as they do with the T239 for the Switch 2. If the Switch's chipset were semi-custom like the Switch 2's, they'd just leave those cores out from the start and have a different CPU configuration that is fully utilized by the Switch. This is why the T239 doesn't physically have the NVDLA: it's a semi-custom chip for the Switch 2, whereas the T210 was not one for the Switch, and it therefore had redundant/vestigial hardware (like the A53 cluster).

Very few Tegra chipsets in the last few Tegra families are general-purpose in the sense the X1 family was. Most are specifically designed for edge-computing, unlike the X1 family, which was genuinely meant to be used in a variety of devices, including consumer tablets and small form factor boxes. Nvidia basically abandoned the tablet/consumer market after X1, which is why Nintendo couldn't just pick a SOC design 'off the shelf' and say "good enough, we'll just disable some physically extant hardware" this time around. 

Last edited by sc94597 - on 04 May 2025

Nintendo's own hardware designers have said they weren't satisfied with the Switch 1 (Tegra X1) chip's performance, but for the time (likely meaning the cost/availability) it was the best option for the Switch 1 circa 2015, or whenever they had to make the decision on which chip to use. They wanted a better-performing chip.

They are happier with the performance of the Switch 2 chip, which was no doubt designed with their involvement this time. This is straight from Nintendo's own Iwata Asks-style interview with the Switch 2 hardware team.


Nvidia is simply a far bigger and more seasoned company at making these kinds of mobile chips too. The Tegra K1 and X1 were early-days efforts; the chip in the Switch 2 (T239) is by now several generations ahead. Likely Nvidia has simply gotten better at making these kinds of chips (any chip, really). The fact is Nvidia has grown to become one of the largest-market-cap companies on planet Earth.

A company like Sony is a tiny little fart compared to Nvidia these days, and ditto for AMD. Nvidia has the best engineers and monstrous R&D resources available to them now versus 10 years ago, when they were nowhere close to the same market cap. That likely did not hurt the kind of chip Nintendo got this time; Nvidia is now the Rolls-Royce of tech companies, with massive R&D and capex spend, and they can outbid competitors for top-end engineering talent. The Nvidia we are talking about today isn't the same little company they were 10 years ago.

Last edited by Soundwave - on 04 May 2025

Yeah, my original point, in the context of my response to bonzobanana, wasn't even about whether or not the T239 will be a relatively better chipset for the Switch 2 than the T210 was for the Switch 1.

Bonzobanana inquired about "confirmed specs", and my response was to highlight that because the Switch 1 had a T210 that was a "hand me down" (probably should have used "off the shelf" instead) pre-existing Nvidia design, it was a lot easier to get those confirmed specs: we knew them from devices like the Shield TV, and we could reasonably use words like "underclocked" in this context because the two devices use the same exact chipset at different clock frequencies.
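As an illustration of why that comparison was straightforward for the Switch 1, here is a small sketch using the commonly reported Tegra X1 figures (256 CUDA cores; roughly 1000 MHz GPU clock in the Shield TV vs. 768 MHz docked / 307.2 MHz handheld in the Switch); treat the numbers as approximate:

```python
# Same chip, different clocks: peak FP32 throughput scales linearly with
# GPU clock, which is why "underclocked" is a meaningful description here.
# Figures are the commonly reported ones and should be treated as approximate.

CUDA_CORES = 256  # Tegra X1 (Maxwell, 2 SMs)

def peak_fp32_gflops(cores: int, clock_mhz: float) -> float:
    # 2 ops per core per clock (fused multiply-add)
    return 2 * cores * clock_mhz / 1000.0

for device, clock_mhz in [("Shield TV", 1000.0),
                          ("Switch (docked)", 768.0),
                          ("Switch (handheld)", 307.2)]:
    print(f"{device:18s} ~{peak_fp32_gflops(CUDA_CORES, clock_mhz):.0f} GFLOPS")
# Shield TV          ~512 GFLOPS
# Switch (docked)    ~393 GFLOPS
# Switch (handheld)  ~157 GFLOPS
```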

We can't do this as easily for the Switch 2 because it has a semi-custom chipset. No other device has a T239 as of the Switch 2's release, and it wouldn't even make sense to talk about clock rates vs. a T234, because they have different core configurations (confirmed for the GPU; we don't know enough about the CPU to say). The Switch 2 isn't going to be "under-clocked" compared to anything else, because the chipset is specifically designed to only be used in Switch 2s.

The best we have are the "leaked" specs. 

That was the original context of my post, and using the idiom "hand me down" makes sense because we are talking about not just the final configuration set by Nintendo (which was part of the conversation), but whether or not the chipset is the same as in other devices which have confirmed specs (such as the Switch 1 vs. the Shield TV). There is no Shield TV equivalent for the Switch 2. A T234 ≠ a T239. An SoC designed for edge computing can't easily be compared with an SoC designed for gaming, while an SoC that is designed to be general-purpose, and is in fact the same exact SoC used in multiple devices, can be compared across those devices, especially when they are doing similar things (Shield TV & Switch).

Last edited by sc94597 - on 04 May 2025