
Forums - Nintendo Discussion - How will the Switch 2 be, performance-wise?

 

Your expectations

Performance ridiculously ... 0 (0%)
Really below current gen,... 2 (100.00%)
Slightly below current ge... 0 (0%)
On pair with current gen,... 0 (0%)

Total: 2

I'm a little skeptical of this newest Switch 2 leak. While it would be great to have a Switch 1 card slot, it raises a lot of questions. Switch 2 needs fast storage; it can't have last-gen storage speeds again, and I'm doubtful that Nintendo will be able to get faster Switch 2 carts and slower Switch 1 carts to work in the same slot, which would mean they'd need 2 cart slots, which seems fairly pricey to me. On top of that, physical carts with the kind of transfer speeds and sizes that Nintendo is going to need for next-gen games (at least 50 GB to fit the larger next-gen games, and at least 1 GB/s in transfer speed I'd assume, considering Xbox Series is like 2.4 GB/s and PS5 is 5.5 GB/s) seem like they would be prohibitively expensive to produce, unless Nintendo plans to charge $10 more for physical Switch 2 games than for digital.

It seems to me like they're going to have to choose: either make a console with a very high bill of materials and expensive carts that end up cutting into their profit a lot, or settle for last-gen storage speeds, which would make porting PS5/XS games to Switch 2 nearly as difficult as porting PS4/XB1 games to Switch 1 was, or go digital-only on Switch 2 games.
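The storage arithmetic behind this can be sketched quickly. The 2.4 GB/s (Xbox Series) and 5.5 GB/s (PS5) figures are the ones cited above; the 50 GB game size and the 1.0 GB/s cart speed are the post's own assumptions, used here purely for illustration:

```python
# Back-of-envelope read-time arithmetic for the storage speeds discussed above.
# The game size and cart speed are illustrative assumptions, not known specs.
def seconds_to_read(size_gb: float, speed_gb_per_s: float) -> float:
    """Time to stream size_gb of data at a sustained speed_gb_per_s."""
    return size_gb / speed_gb_per_s

game_gb = 50.0  # assumed size of a larger next-gen game
for label, speed in [
    ("hypothetical 1.0 GB/s Switch 2 cart", 1.0),
    ("Xbox Series SSD (2.4 GB/s raw)", 2.4),
    ("PS5 SSD (5.5 GB/s raw)", 5.5),
]:
    print(f"{label}: {seconds_to_read(game_gb, speed):.1f} s for {game_gb:.0f} GB")
```

Even the hypothetical 1 GB/s cart takes nearly a minute to stream a full 50 GB game, which is why sustained speed matters less for one-time loads than for games that stream assets continuously during play.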

Last edited by shikamaru317 - on 11 February 2024

shikamaru317 said:

I'm a little skeptical of this newest Switch 2 leak. While it would be great to have a Switch 1 card slot, it raises a lot of questions. Switch 2 needs fast storage; it can't have last-gen storage speeds again, and I'm doubtful that Nintendo will be able to get faster Switch 2 cards and slower Switch 1 cards to work in the same slot, which would mean they'd need 2 cart slots, which seems fairly pricey to me. On top of that, physical carts with the kind of transfer speeds and sizes that Nintendo is going to need for next-gen games (at 70 GB to fit the larger next-gen games, and at least 1 GB/s in transfer speed I'd assume, considering Xbox Series is like 2.4 GB/s and PS5 is 5.5 GB/s) seem like they would be prohibitively expensive.

It seems to me like they're going to have to choose: either make a console with a very high bill of materials and expensive carts that end up cutting into their profit a lot, or settle for last-gen storage speeds, which would make porting PS5/XS games to Switch 2 nearly as difficult as porting PS4/XB1 games to Switch 1 was.

I mean, didn't the 3DS have one slot too? And didn't it also have faster card speeds than the DS? What's so amazing about that? Game Boy Advance was the same story: one slot for old GB carts and new GBA carts.

This is going to cost $400, and quite possibly there will even be a higher-end SKU that costs more than that ($450?).

It's 2024, you're not paying the same for anything as you were 10 years ago. 



Soundwave said:

An RTX 2050 is PS5-class hardware. Saying it's not would be like saying the Xbox Series S is not in the same generation class as the PS5 ... no one is buying that.

Just because it's not exactly equal doesn't mean it can't run basically the same games.

A 2050 with optimized developer support will run basically every PS5 game. It's not the exact same teraflop performance, but with DLSS it can also render at like half the resolution or less and provide a similar image output.

Anyone unhappy with that level of performance from a portable device is smoking crack cocaine, lol.

Lol, no it isn't. The PS5, from an Nvidia perspective, is close to a 2070 Super. Jesus, you don't know anything at all. The difference between a 50 and a 70, across all generations, is significant.

A 4050 versus a 4070 Super? Yeah, the 4070 Super more than doubles the performance in benchmarks. Just because they are both 4000 series does not make them the same class. Not even remotely accurate.

I'm 100% sure you just parrot random forum posts and have zero actual knowledge of tech.

Last edited by Chrkeller - on 11 February 2024

Chrkeller said:
Soundwave said:

An RTX 2050 is PS5-class hardware. Saying it's not would be like saying the Xbox Series S is not in the same generation class as the PS5 ... no one is buying that.

Just because it's not exactly equal doesn't mean it can't run basically the same games.

A 2050 with optimized developer support will run basically every PS5 game. It's not the exact same teraflop performance, but with DLSS it can also render at like half the resolution or less and provide a similar image output.

Anyone unhappy with that level of performance from a portable device is smoking crack cocaine, lol.

Lol, no it isn't. The PS5, from an Nvidia perspective, is close to a 2070 Super. Jesus, you don't know anything at all.

A 2050 will run basically anything a PS5 will; it's like saying an Xbox Series S is not the same class of hardware as a Series X. A 2050 is about half a PS5 in raw performance but has DLSS on top of that, which means it doesn't have to run games at the same resolution to get a similar-looking image.

You guys are ridiculous if you're complaining about that level of hardware in a portable device, lol, that is like the nerdiest fringe of a fringe ("boo hoo! I can't play a game with no anisotropic filtering!").

Jeezus, how did you live before 2017, lol. It reminds me of a time I was at the Apple Store about a year ago, listening to a parent whose 12-year-old was having a meltdown, saying they couldn't live their life if they were forced to use an iPhone 12, lol. Like, if that was my kid they'd be getting a swift smack upside the head for being a dumbass and reminded that even an iPhone 12 is a luxury.

Portables being able to run modern console games at all is a freaking miracle; that wasn't a normal thing until like a few years ago. An RTX 2050-range portable machine is freaking incredible. In the hands of any developer that really wants to give half an effort and fine-tune specifically for that hardware, it can run basically any modern-gen game, like Alan Wake II, at a reasonable fidelity that is very playable.

Like c'mon, what are we even talking about at this point? The plot has been completely lost if that is somehow not good enough.



Soundwave said:
Chrkeller said:

Lol, no it isn't. The PS5, from an Nvidia perspective, is close to a 2070 Super. Jesus, you don't know anything at all.

A 2050 will run basically anything a PS5 will; it's like saying an Xbox Series S is not the same class of hardware as a Series X. A 2050 is about half a PS5 in raw performance but has DLSS on top of that, which means it doesn't have to run games at the same resolution to get a similar-looking image.

You guys are ridiculous if you're complaining about that level of hardware in a portable device, lol, that is like the nerdiest fringe of a fringe ("boo hoo! I can't play a game with no anisotropic filtering!").

Jeezus, how did you live before 2015, lol. It reminds me of a time I was at the Apple Store about a year ago, listening to a parent whose 12-year-old was having a meltdown, saying they couldn't live their life if they were forced to use an iPhone 12, lol. Like, if that was my kid they'd be getting a swift smack upside the head for being a dumbass and reminded that even an iPhone 12 is a luxury.

Portables being able to run modern console games at all is a freaking miracle; that wasn't a normal thing until like a few years ago. An RTX 2050-range portable machine is freaking incredible. In the hands of any developer that really wants to give half an effort, it can run basically any modern-gen game, like Alan Wake II, at a reasonable fidelity that is very playable.

Lol, I'm not bothering with you. You think a 50 and a 70 are the same class of hardware, lmfao. That says everything there is to say.

That, and you can't follow an argument. Nobody is complaining about a 2050 Switch 2. Pointing out it isn't a 2070 Super isn't a complaint, just a hardcore, undisputed fact.

Last edited by Chrkeller - on 11 February 2024

Chrkeller said:
Soundwave said:

A 2050 will run basically anything a PS5 will; it's like saying an Xbox Series S is not the same class of hardware as a Series X. A 2050 is about half a PS5 in raw performance but has DLSS on top of that, which means it doesn't have to run games at the same resolution to get a similar-looking image.

You guys are ridiculous if you're complaining about that level of hardware in a portable device, lol, that is like the nerdiest fringe of a fringe ("boo hoo! I can't play a game with no anisotropic filtering!").

Jeezus, how did you live before 2015, lol. It reminds me of a time I was at the Apple Store about a year ago, listening to a parent whose 12-year-old was having a meltdown, saying they couldn't live their life if they were forced to use an iPhone 12, lol. Like, if that was my kid they'd be getting a swift smack upside the head for being a dumbass and reminded that even an iPhone 12 is a luxury.

Portables being able to run modern console games at all is a freaking miracle; that wasn't a normal thing until like a few years ago. An RTX 2050-range portable machine is freaking incredible. In the hands of any developer that really wants to give half an effort, it can run basically any modern-gen game, like Alan Wake II, at a reasonable fidelity that is very playable.

Lol, I'm not bothering with you. You think a 50 and a 70 are the same class of hardware, lmfao. That says everything there is to say.

Hardware is defined by the software it runs. They both run the same games, and it's not like one of them is running a PS2-looking version of the game at 10 fps.

For practical purposes, for a normal person, that is the same class of hardware. Normal people are not fucking nerds who pixel-count and look for anisotropic filtering; dorks do that, yes, I don't deny that, but that is a tiny, tiny portion of the general market. It's so small, why are we even spending so much time debating what they care about in relation to a Switch 2? Like, who cares.

2050 performance in a portable, if that's what the Switch 2 is, would be incredible. Anyone complaining about that is out to lunch. A 2050-range portable machine is effectively a modern-gen console on the go.
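The "render at half resolution" argument in this exchange is just pixel arithmetic. Assuming DLSS Performance mode, which renders at half the output resolution on each axis, the GPU only shades a quarter of the output pixels (a sketch of the claim, with 4K output as the illustrative target):

```python
# Pixel-count arithmetic behind the "half the resolution or less" DLSS claim.
# DLSS Performance mode renders at half the output resolution per axis,
# so the GPU shades one quarter of the output pixels before upscaling.
def pixels(width: int, height: int) -> int:
    return width * height

output_4k = pixels(3840, 2160)   # what's shown on screen: 8,294,400 px
internal = pixels(1920, 1080)    # what the GPU actually renders
print(output_4k / internal)      # 4.0 -> shading work drops to ~25% of native
```

That 4x reduction in shaded pixels is why a GPU with roughly half a PS5's raw throughput can still target a similar-looking output image, at the cost of upscaling artifacts.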

Last edited by Soundwave - on 11 February 2024

Soundwave said:
shikamaru317 said:

You are right that high-end GPUs are fairly niche in gaming; the latest Steam Hardware Survey has only around 10% using high-end cards from the last 2 Nvidia GPU gens (so 3070-4090), while AMD's high-end parts from the last 2 GPU gens are under 1% combined on there.

That 10% number is actually a lot higher than it normally would be. A lot of people circa 2018-2021 bought a higher-end GPU than they normally would have because of the COVID lockdown + crypto-bro boom (people thinking they could make money during that).

That 10% number is actually wrong.

The latest Steam Hardware Survey has 18.03% for GA104 + AD104 GPUs and better.

Even if we don't count the 3060 Ti (only a few percent slower than the 3070, same RAM, same chip), it's still 15%.

It's also no secret that RDNA2 + RDNA3 aren't tracked correctly in the Steam survey. Together with the RTX 2080 Ti and 4060 Ti (which are both a bit faster than an RTX 3070), the share of powerful GPUs is well over 20%.



Conina said:
Soundwave said:

That 10% number is actually a lot higher than it normally would be. A lot of people circa 2018-2021 bought a higher-end GPU than they normally would have because of the COVID lockdown + crypto-bro boom (people thinking they could make money during that).

That 10% number is actually wrong.

The latest Steam Hardware Survey has 18.03% for GA104 + AD104 GPUs and better.

Even if we don't count the 3060 Ti (only a few percent slower than the 3070, same RAM, same chip), it's still 15%.

It's also no secret that RDNA2 + RDNA3 aren't tracked correctly in the Steam survey. Together with the RTX 2080 Ti and 4060 Ti (which are both a bit faster than an RTX 3070), the share of powerful GPUs is well over 20%.

It's academic anyway.

GPUs had an unsustainable boom from about 2017-2021, especially because of people being locked inside their homes due to a once-in-100-years global pandemic, and crypto miners buying up every new GPU and then reselling them to people for like half the price once the crypto mining/NFT market went kaput.

What we're seeing in the last 12 months is more indicative of what the GPU market normally is ... it has collapsed from its peak numbers. People have stopped buying. Luckily for Nvidia they have the AI boom to fall back on.



haxxiy said:
Soundwave said:

Problem remains this is a massive waste of SMs. You could get the same performance from 8SMs and just clock a bit higher and have a cheaper chip. Why pay for 12SMs, it's a way larger chip, doesn't really make a whole lot of sense. Like this doesn't even align with things Nintendo has done in the past if that's the argument. 

210 MHz, lol, the Wii's 2001-era GPU has a higher clock than that. These are insanely low clocks for a more expensive and much larger chip for no reason. This is like going out of your way to buy a jumbo popcorn at the theater and paying the $6 premium for it and then eating 10% of it, you could have just bought a freaking regular popcorn and not have paid the extra money. And if the argument for doing so is "well I did that because I'm cheap" ... it's like what? lol. How does that make any sense. 

Because you'd save power. Most hardware runs at just a fraction of how efficient it can be, because it's placed far, far above the optimal point on the voltage vs. frequency curve. Besides, again, the older node would be cheaper even with a larger chip, and the frequencies would be higher than that.

What, you think Nintendo would take on the cost of the die shrink just so it can run at a higher frequency? Just so they could have better graphics?

Reminder: this is the same company that released the Wii U in 2012 with a ~1997-architecture CPU, and underclocked a 15W Tegra by 65% in the Switch's undocked mode, leaving it about 20 times slower than a GTX 1060.

All of that being said... I do think the console can be 5/4 nm, as I said before, and I hope it is. It's just that a lot of people here are fuming and screaming at the mere thought of it, and there's definitely a universe where it could happen.

I mean you could still clock ridiculously low and get the same performance from 8SMs. I'm not even talking about any other node, I'm talking about 8nm strictly.

At 8nm you could clock an 8SM chip 50% higher and the clock speeds would still be very low: 210 MHz increased by 50% is still a very low 315 MHz, lower than the current Switch's undocked spec, lol (the current Switch runs at 384 MHz undocked).
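The 8SM-vs-12SM point reduces to SMs multiplied by clock. As a rough Ampere-style FP32 estimate (128 FP32 lanes per SM, 2 FLOPs per FMA; this formula is a standard back-of-envelope approximation, not a leaked spec), the two configurations land on exactly the same number:

```python
# Rough FP32 throughput estimate for an Ampere-style GPU:
# TFLOPS ~= SMs * clock * 128 FP32 lanes * 2 FLOPs per FMA.
def fp32_tflops(sms: int, clock_mhz: float) -> float:
    return sms * clock_mhz * 1e6 * 128 * 2 / 1e12

wide_and_slow = fp32_tflops(12, 210)   # 12 SMs at 210 MHz
narrow_and_fast = fp32_tflops(8, 315)  # 8 SMs at 210 * 1.5 = 315 MHz
print(wide_and_slow, narrow_and_fast)  # identical: 12 * 210 == 8 * 315
```

Both configurations come out to roughly 0.65 TFLOPS, which is the point of the argument: at those clocks, the extra four SMs buy nothing except die area.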

So why pay for 12SMs? It's a massive chip with a huge number of extra graphics cores for no reason. If the argument is "because Nintendo," that doesn't even make sense; if the logic is "they're cheap" ... so they're cheap ... but they're paying for a more expensive chip with more graphics cores ... because ... ?

I'm not even sure what kind of flexibility Nintendo would have with an 8nm Samsung chip, because from what I've heard, the 8nm process is a dead end that isn't compatible with Samsung's 4nm/5nm processes. Nintendo wouldn't be able to die-shrink for future models the way the Switch got the Mariko revision + Lite model. Nvidia has not booked any Samsung business outside of the 8nm node either, and it doesn't look like they will (they have booked TSMC 3nm, which would die-shrink nicely from TSMC 5nm/4N). So I guess Nintendo is just fine with losing 20-25 million hardware sales right there?

There's a lot about it that just doesn't make sense.