
Forums - Nintendo Discussion - How will the Switch 2 be, performance-wise?

 

Your expectations (poll)

Performance ridiculously ... - 0 votes (0%)
Really below current gen,... - 2 votes (100%)
Slightly below current ge... - 0 votes (0%)
On par with current gen,... - 0 votes (0%)

Total: 2 votes

For the process node part, surely the fact that the Switch used a smaller node than the PS4 and Xbox One suggests that the Switch 2 will probably do the same? Especially considering that the gap between the PS5/Xbox Series and the Switch 2 will be a full 4 years, compared to the previous, smaller gap of 3 years and 3.5 months.
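(As a quick sanity check on those gaps, here's a minimal sketch; the Switch 2 date below is a purely hypothetical placeholder, since nothing has been announced.)

```python
from datetime import date

def gap_in_years(earlier, later):
    """Approximate gap between two launch dates, in years."""
    return (later - earlier).days / 365.25

# Confirmed launch dates
ps4 = date(2013, 11, 15)
switch = date(2017, 3, 3)
ps5 = date(2020, 11, 12)
# Hypothetical placeholder only; no Switch 2 date has been announced.
switch2_guess = date(2025, 3, 1)

print(f"PS4 -> Switch: {gap_in_years(ps4, switch):.2f} years")                   # ~3.3 years
print(f"PS5 -> Switch 2 (guess): {gap_in_years(ps5, switch2_guess):.2f} years")  # ~4.3 years
```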



Norion said:

For the process node part, surely the fact that the Switch used a smaller node than the PS4 and Xbox One suggests that the Switch 2 will probably do the same? Especially considering that the gap between the PS5/Xbox Series and the Switch 2 will be a full 4 years, compared to the previous, smaller gap of 3 years and 3.5 months.

The difference is that newer nodes nowadays are more expensive, not about the same price. Imagine you get 200 dies out of an 8 nm wafer that costs $5,000, while you get 400 dies out of a 5/4 nm wafer that costs $15,000 instead.
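(A rough back-of-the-envelope with those purely illustrative numbers; real wafer prices and yields are not public.)

```python
# Hypothetical figures from the example above, purely illustrative.
wafer_cost_8nm, dies_8nm = 5_000, 200    # $ per wafer, good dies per wafer
wafer_cost_5nm, dies_5nm = 15_000, 400

cost_per_die_8nm = wafer_cost_8nm / dies_8nm   # $25.00 per die
cost_per_die_5nm = wafer_cost_5nm / dies_5nm   # $37.50 per die

print(f"8 nm: ${cost_per_die_8nm:.2f}/die vs. 5/4 nm: ${cost_per_die_5nm:.2f}/die")
# Even with twice as many good dies per wafer, the newer node ends up ~50% pricier per chip here.
```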

So that would be your reason. Plus we all have no idea how the frequency-voltage curve behaves for 8 nm vs. 5 nm at those very low Switch-like frequencies. The difference in efficiency might be smaller than some are thinking.
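(To illustrate why the gap can narrow at low clocks, here's a toy comparison using the standard dynamic-power relation P ≈ C·V²·f. The capacitance and voltage numbers are completely made up, since the real curves for these nodes aren't public.)

```python
def dyn_power(c_eff, volts, freq_ghz):
    """Relative dynamic power, P ~ C * V^2 * f (arbitrary units)."""
    return c_eff * volts ** 2 * freq_ghz

# Made-up operating points near a Switch-like ~1 GHz GPU clock, for illustration only.
p_8nm = dyn_power(c_eff=1.00, volts=0.65, freq_ghz=1.0)  # older node, slightly higher voltage
p_5nm = dyn_power(c_eff=0.75, volts=0.60, freq_ghz=1.0)  # denser node, somewhat lower C and V

print(f"5 nm / 8 nm dynamic power ratio: {p_5nm / p_8nm:.2f}")
# ~0.64 with these numbers: a real gain, but well short of a 2x efficiency jump,
# because both nodes are already close to their minimum voltage at such low clocks.
```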

On the other hand, *maybe* it's just that the dev kits use 8 nm Orin chips, not the final product. Probably no one will know until people tear down a Switch 2, around December or so.

haxxiy said:
Norion said:

For the process node part, surely the fact that the Switch used a smaller node than the PS4 and Xbox One suggests that the Switch 2 will probably do the same? Especially considering that the gap between the PS5/Xbox Series and the Switch 2 will be a full 4 years, compared to the previous, smaller gap of 3 years and 3.5 months.

The difference is that newer nodes nowadays are more expensive, not about the same price. Imagine you get 200 dies out of an 8 nm wafer that costs $5,000, while you get 400 dies out of a 5/4 nm wafer that costs $15,000 instead.

So that would be your reason. Plus we all have no idea how the frequency-voltage curve behaves for 8 nm vs. 5 nm at those very low Switch-like frequencies. The difference in efficiency might be smaller than some are thinking.

On the other hand, *maybe* it's just that the dev kits use 8 nm Orin chips, not the final product. Probably no one will know until people tear down a Switch 2, around December or so.

In that case I hope they choose to eat the higher cost for a more capable system. Knowing Nintendo they wouldn't, but maybe Furukawa's Nintendo would.



sc94597 said:
Chrkeller said:

Halo Infinite runs at low settings, 30 fps, at 1080p on my 3050. Looks pretty god awful. I think you are grossly overestimating what a 2050 can do.

The minimum for AW2 is a 2060....  and minimum isn't a pretty experience in the PC world.  

Hm, something is up with your 3050 system. A 2050/3050 laptop should get about 50 fps at low settings in Halo Infinite at 1080p. 30 fps is roughly what these GPUs achieve at high/ultra settings.

https://youtu.be/HYTHK3VTxIs?si=SS2n-pc3tzQo1BCP

A desktop 3050 should be in the high 90s to 110s at low settings in Halo Infinite.

AW2 also runs at 30-40fps low 1080p on low-TDP 2050/3050 laptops.  

50 fps inside, sure. Outside, not a chance. Trees in particular don't load textures well. The system runs fine in smaller games, but the open world outside brings it down. The 3050 is weak, as is the 2050. I believe your linked video is multiplayer... not the massive open-world campaign. Palworld runs better than campaign Halo Infinite. Also, running games above the VRAM limit, as seen in the video, causes massive stuttering and latency. It may look like a good idea on YouTube but doesn't work in the real world.

Also, not directed at you, but the idea that people can't tell the difference between low, medium and high settings is laughably absurd and flat-out stupid. The difference is stark, immediately perceivable and blatantly obvious. I can't believe the nonsense people convince themselves of.

My wife (massive casual) and I have been playing campaign coop Halo Infinite.  She is on the 3050 while I'm on the 4090 (which maxes Infinite without even trying) and it took her less than a minute to ask why her screen looked like crap. 

At the end of the day, anyone who is happy with 1080p, 30 fps, low settings has my 100% support; personal opinion and preference. I'm just sick and tired of hearing that nobody can tell a difference... that is utter horse crap.

Last edited by Chrkeller - on 10 February 2024

Norion said:

haxxiy said:

In that case I hope they choose to eat the higher cost for a more capable system. Knowing Nintendo they wouldn't, but maybe Furukawa's Nintendo would.

I agree, and I'd even say that would be likely if it wasn't for this sinking feeling that the Switch 2 will be Nintendo-ed in some manner.

They have a long history of getting at least one crucial point very very wrong in their hardware decisions (better than even odds of fumbling their successor platforms, in fact). Let's hope that's not the case here.


Chrkeller said:
sc94597 said:

Hm, something is up with your 3050 system. A 2050/3050 laptop should get about 50 fps at low settings in Halo Infinite at 1080p. 30 fps is roughly what these GPUs achieve at high/ultra settings.

https://youtu.be/HYTHK3VTxIs?si=SS2n-pc3tzQo1BCP

A desktop 3050 should be in the high 90s to 110s at low settings in Halo Infinite.

AW2 also runs at 30-40fps low 1080p on low-TDP 2050/3050 laptops.  

50 fps inside, sure. Outside, not a chance. Trees in particular don't load textures well. The system runs fine in smaller games, but the open world outside brings it down. The 3050 is weak, as is the 2050. I believe your linked video is multiplayer... not the massive open-world campaign. Palworld runs better than campaign Halo Infinite. Also, running games above the VRAM limit, as seen in the video, causes massive stuttering and latency. It may look like a good idea on YouTube but doesn't work in the real world.

Also, not directed at you, but the idea that people can't tell the difference between low, medium and high settings is laughably absurd and flat-out stupid. The difference is stark, immediately perceivable and blatantly obvious. I can't believe the nonsense people convince themselves of.

My wife (massive casual) and I have been playing campaign coop Halo Infinite.  She is on the 3050 while I'm on the 4090 (which maxes Infinite without even trying) and it took her less than a minute to ask why her screen looked like crap. 

At the end of the day, anyone who is happy with 1080p, 30 fps, low settings has my 100% support; personal opinion and preference. I'm just sick and tired of hearing that nobody can tell a difference... that is utter horse crap.

Bullshit, you have a distorted view since you are a hardcore gaming enthusiast who plays on high end PC hardware. The fact that you think your wife who plays Halo on a gaming PC is a casual shows how off base you are.

I've been gaming my whole life and enjoy gaming so much that I've been on this forum for over a decade, and I absolutely cannot immediately tell the difference between different settings or how multiplat games perform on different hardware.

It’s easily noticeable to you because you have trained yourself to see these differences.



When the herd loses its way, the shepherd must kill the bull that leads them astray.

I love the "you can't trust your own eyes on YouTube!" line. There's a good chance the Switch 2 has more usable RAM for gaming than a 2050, too.

Normal people don't really give a crap about this stuff. High-end GPU sales have been tanking; the market was inflated by crypto miners for a few years, got a boost during COVID when everyone was locked in their homes, and is now crashing back down to earth. The really high-end GPUs are maybe 1% of the gaming market; 99% of people aren't in that market and don't care to be.




To be honest, if you're paying for a high-end GPU thinking you're going to get a massive lift on the settings side, you're totally getting suckered, lol. The difference is not even close to being worth paying 3-4x the price. Developers optimize the low settings to be great these days; it's basically what medium+ settings used to be. The reason is, as I stated, they can't just make custom lower-end models, textures and lighting just for a low version; it would cost a lot of money and time to do that, so the low version still gets all the same models, the same basic baked lighting effects, and even the same textures for the most part. You're not going to hire a couple of artists just to redo every texture in a game; that would be insane.

A 3050 is basically a PS5, and with DLSS and better ray tracing (Nvidia > AMD) as well, I would honestly take a 3050 over a PS5.
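(For rough context on that comparison: theoretical FP32 throughput, i.e. shader count × 2 FLOPs per clock × clock speed, puts a desktop 3050 in the same ballpark as the PS5 on paper, though laptop 2050/3050 parts are clocked and power-limited well below that. A quick sketch using published ballpark specs:)

```python
def fp32_tflops(shaders, clock_ghz):
    """Theoretical FP32 throughput: shaders * 2 FLOPs per clock (FMA) * clock in GHz."""
    return shaders * 2 * clock_ghz / 1000

ps5 = fp32_tflops(2304, 2.23)               # 36 CUs * 64 shaders at up to 2.23 GHz -> ~10.3 TFLOPS
rtx_3050_desktop = fp32_tflops(2560, 1.78)  # desktop card at a typical boost clock -> ~9.1 TFLOPS

print(f"PS5: ~{ps5:.1f} TFLOPS, desktop RTX 3050: ~{rtx_3050_desktop:.1f} TFLOPS")
# Paper FLOPS ignore memory bandwidth, VRAM (8 GB vs. 16 GB shared) and console-level optimization.
```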

Last edited by Soundwave - on 11 February 2024

haxxiy said:
Norion said:

In that case I hope they choose to eat the higher cost for a more capable system. Knowing Nintendo they wouldn't, but maybe Furukawa's Nintendo would.

I agree, and I'd even say that would be likely if it wasn't for this sinking feeling that the Switch 2 will be Nintendo-ed in some manner.

They have a long history of getting at least one crucial point very very wrong in their hardware decisions (better than even odds of fumbling their successor platforms, in fact). Let's hope that's not the case here.

I'm decently optimistic that won't happen this time, since the massive failure of the Wii U and the disappointing performance of the 3DS are recent enough that I expect they'll take great care to release an appealing successor, especially now that they don't have a second platform to fall back on if one bombs. The Switch being a fairly capable handheld console for early 2017 helped a lot with it becoming so successful, so I expect they'll wanna repeat that, and I doubt the Joy-Cons will end up being crap again. My only notable concern is another bad gimmick that harms things, but after the Wii U and 3DS I see them being more cautious.



Norion said:
haxxiy said:

I agree, and I'd even say that would be likely if it wasn't for this sinking feeling that the Switch 2 will be Nintendo-ed in some manner.

They have a long history of getting at least one crucial point very very wrong in their hardware decisions (better than even odds of fumbling their successor platforms, in fact). Let's hope that's not the case here.

I'm decently optimistic that won't happen this time, since the massive failure of the Wii U and the disappointing performance of the 3DS are recent enough that I expect they'll take great care to release an appealing successor, especially now that they don't have a second platform to fall back on if one bombs. The Switch being a fairly capable handheld console for early 2017 helped a lot with it becoming so successful, so I expect they'll wanna repeat that, and I doubt the Joy-Cons will end up being crap again. My only notable concern is another bad gimmick that harms things, but after the Wii U and 3DS I see them being more cautious.

It was quite capable for 2017, though really, let's be honest, they were aiming for holiday 2016; Zelda just wasn't finished in time.

For that time window the Switch was pretty high end. If Sony was making a hybrid at the time, they would not have magically been able to pull a much better chip out of their ass. The Tegra X2 (Parker) was around the corner, but that wasn't really a great chip for gaming anyway; the Denver cores are ass for gaming.

For a hybrid, the Switch in March 2017 is not really that far off from what the PS5/XSX were for AMD home chips in fall 2020. AMD had better stuff coming out by the time the PS5 launched, and Nvidia had much better; that wasn't bleeding-edge technology. The only really remarkable thing about the PS5 hardware was the SSD speed, lol, which you can buy for a PC nowadays with no fuss.

The Tegra X1 was a fairly high-end chip, top 2-3 in its class. The Apple A9X was probably a bit better, but that was also only used in $700+ iPad Pros.

Last edited by Soundwave - on 11 February 2024

Soundwave said:
Norion said:

I'm decently optimistic that won't happen this time, since the massive failure of the Wii U and the disappointing performance of the 3DS are recent enough that I expect they'll take great care to release an appealing successor, especially now that they don't have a second platform to fall back on if one bombs. The Switch being a fairly capable handheld console for early 2017 helped a lot with it becoming so successful, so I expect they'll wanna repeat that, and I doubt the Joy-Cons will end up being crap again. My only notable concern is another bad gimmick that harms things, but after the Wii U and 3DS I see them being more cautious.

It was quite capable for 2017, though really, let's be honest, they were aiming for holiday 2016; Zelda just wasn't finished in time.

For that time window the Switch was pretty high end. If Sony was making a hybrid at the time, they would not have magically been able to pull a much better chip out of their ass. The Tegra X2 (Parker) was around the corner, but that wasn't really a great chip for gaming anyway; the Denver cores are ass for gaming.

For a hybrid, the Switch in March 2017 is not really that far off from what the PS5/XSX were for AMD home chips in fall 2020. AMD had better stuff coming out by the time the PS5 launched, and Nvidia had much better; that wasn't bleeding-edge technology. The only really remarkable thing about the PS5 hardware was the SSD speed, lol, which you can buy for a PC nowadays with no fuss.

The Tegra X1 was a fairly high-end chip, top 2-3 in its class. The Apple A9X was probably a bit better, but that was also only used in $700+ iPad Pros.

This post here is pointless, because we don't know what Sony would have done with a portable. When they did compete in handhelds they were light years ahead, and 100% by 2020 Sony would have had a Pro version that would have been more powerful than the Steam Deck. Two different companies with two different mindsets.