
Forums - Nintendo - How Will be Switch 2 Performance Wise?

 

Switch 2 is out! How do you classify it?

Terribly outdated! 3 votes (5.26%)
Outdated 1 vote (1.75%)
Slightly outdated 14 votes (24.56%)
On point 31 votes (54.39%)
High tech! 7 votes (12.28%)
A mixed bag 1 vote (1.75%)

Total: 57
Hardstuck-Platinum said:
Hiku said:

Borderlands 4 is a Gen 9 only game though, so I'm not sure why you brought up that one?

Anyway, new footage of FF7 Remake seems promising. Not sure what the framerate is here, but it looks fine

Just because it's 9th gen only doesn't mean we can't bring it up with regard to performance issues. Does BDL4 get a free pass, absolved of all criticism, just because it's 9th gen? I'm just saying that now it's Elden Ring; in October this topic will resurface with BDL4's release. That's what I suspect, anyway.

Yes, expectations should be different if it's a port from newer and more powerful consoles. The Switch 2 shouldn't have any problems running games from the 8th gen just like the Switch 1 shouldn't have any problems running games from the PS360 era.

I'm not willing to give From the benefit of the doubt here because they suck at optimizing their games.



 

Hiku said:
Hardstuck-Platinum said:

Just because it's 9th gen only doesn't mean we can't bring it up with regard to performance issues. Does BDL4 get a free pass, absolved of all criticism, just because it's 9th gen? I'm just saying that now it's Elden Ring; in October this topic will resurface with BDL4's release. That's what I suspect, anyway.

No, but you put it in the sentence after mentioning Elden Ring. If a PS4 game isn't reaching 30, is it surprising that a Gen 9 game isn't?
Almost as if suggesting that Borderlands 4 is in the same generation/category.

"It's slower than a Volvo. Even a Ferrari is faster."

Kinda like that.

Yeah, I see, but you have to remember that BDL4 isn't targeting the 9th gen console 60fps framerate standard. So, if you can't have 60, you at least need a solid 30. Randy Pitchford has admitted, though, that it's not a solid 30. So basically, I was suggesting that BDL4 not being a solid 30 is similar enough to Elden Ring not being a solid 30.



When analysing any of these ports, context is key.

Ports from radically different hardware rarely show a system at its best, and that's especially true when (a) they're ports from more powerful hardware and (b) it's early in a system's life and devs haven't had the time to really tailor their games to the system. (We know, for example, that during the development of some of these ports, devs didn't even know the final specs of Switch 2, as per Koei Tecmo.)

There's also the fact that the Nintendo version of something like Madden or Borderlands 4 is not going to receive anywhere near the amount of care and attention as say the PS5 version, because developers have limited resources and will concentrate them on the versions they expect to sell the most.

I wouldn't really hold up any of the games arriving in the system's first year as definitive examples of its maximum capability; the games that show that will arrive later in its life, and will mostly be built from the ground up for the hardware.

Last edited by curl-6 - on 23 August 2025

Pemalite said:
Chrkeller said:

Rebirth was shockingly well optimized on PC.  I ran max at native 4k and indoors locked 120 fps.  Outdoors 100 to 120 fps.  

As for Remake, not even sure my gpu fans turned on.  

Madden, FC, Elden, Remake all 30 fps.  Doesn't shock me, was worried about memory bandwidth being a bottleneck. 

I am a bit surprised Remake doesn't have a 40 fps mode, the game isn't particularly demanding.

I would hope you could hit 4k, 120fps with a 4090... That's the point of a 4090/5090 tier GPU.

curl-6 said:

I wouldn't say any of those are necessarily indicative of a bottleneck; Madden and FC are 30 cos they are ports of the PS5 and Xbox Series versions, so just generally built for stronger hardware (they don't seem to be very good ports either, as frame pacing is all over the place). Elden Ring is a From Soft title, which are always a bit of a mess, and Remake is based on the Intergrade version, so the upgrades there probably use up the system's extra resources.

Switch 2 does have lower bandwidth than recent home consoles in order to preserve battery life, but I don't think I'd describe it as a bottleneck per se as it doesn't seem to be holding back the rest of the system overly much, relative to say Switch 1.


Bandwidth is a difficult thing to quantify in this day and age, especially as GPUs get smarter and more efficient with tile-based rendering, compression, procedural generation, large caches, streaming, neural rendering and more.

AMD has managed to match/beat nVidia in many areas with the RX 9000 series vs RTX 5000 series at certain tiers, despite having a significant bandwidth deficit, thanks to sticking with cheaper GDDR6 over more expensive GDDR7.

Chrkeller said:

You want to think 102 GB/s isn't a bottleneck, go ahead. You are wrong. There is a reason Strix Halo is going after 256 GB/s.

Perma argued with me over this; post launch, even he has made comments that 102 GB/s will limit fps and resolution, because it will.

And yes, within an ecosystem of hardware, doubling fps requires doubling bandwidth.

Frames per SECOND.

GB per SECOND.

Memory bandwidth significantly affects frames per second (FPS) in gaming and graphical applications, as it determines how quickly data can be transferred between the GPU and its memory.

Sure, computing is complex, but claiming 102 GB/s isn't going to limit fps is nonsense.
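The "per SECOND" framing can be put into numbers with a toy model. A minimal sketch, assuming a hypothetical fixed amount of memory traffic per rendered frame (the 1.7 GB figure is illustrative only, not a measured Switch 2 value):

```python
def traffic_gbs(bytes_per_frame, fps):
    """Memory traffic in GB/s for a fixed per-frame working set."""
    return bytes_per_frame * fps / 1e9

# Illustrative only: if each frame moved ~1.7 GB of data,
# 30 fps would demand ~51 GB/s and 60 fps ~102 GB/s.
print(traffic_gbs(1.7e9, 30))  # 51.0
print(traffic_gbs(1.7e9, 60))  # 102.0
```

At a fixed per-frame cost, traffic scales linearly with fps, which is the proportionality the argument rests on; the counterargument below is that the per-frame cost itself is not fixed.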


Bandwidth is only a bottleneck if you make it a bottleneck.

The Switch 2 has fixed and quantifiable hardware that developers can work around... You would sooner be GPU compute bound before you are bandwidth limited if your game is running lots of shaders.

A GPU is a sum of its parts, not one factor... Which I think is the aspect you are ultimately missing here.

Will the Switch 2's GPU be bandwidth limited in some scenarios? Absolutely. But a game's rendering load is extremely dynamic; bandwidth isn't always going to be the limiting factor.

Many rendering loads tend to be "bursty" in nature when it comes to bandwidth demands, i.e. you need lots of bandwidth to fill up the GPU's caches, but then the demands drop off; you aren't going to require the chip's full bandwidth 24/7, especially once those working sets are loaded into the chip's cache.
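One way to picture the cache argument: only misses generate DRAM transfers, so sustained bandwidth demand depends heavily on hit rate. A toy model with all numbers assumed purely for illustration:

```python
def dram_traffic_gbs(accesses_per_frame, hit_rate, line_bytes, fps):
    """Sustained DRAM traffic after the cache filters out hits;
    only misses travel over the memory bus."""
    misses = accesses_per_frame * (1.0 - hit_rate)
    return misses * line_bytes * fps / 1e9

# Assumed: 100M memory accesses per frame, 128-byte cache lines, 60 fps.
print(dram_traffic_gbs(100e6, 0.0, 128, 60))  # no cache at all: 768.0 GB/s
print(dram_traffic_gbs(100e6, 0.9, 128, 60))  # 90% hit rate: ~76.8 GB/s
```

Under these made-up numbers, a 90% hit rate cuts the external bandwidth requirement by an order of magnitude, which is why on-chip cache size matters as much as the raw GB/s figure.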

Chrkeller said:

But look at what developers are doing across the three main sectors of fidelity.

1) Resolution impacts bandwidth. Is the S2 rendering games at 360p, looking like The Witcher 3 did on the S1? Nope. Rendering resolution, in many cases, is quite high.

Everything impacts bandwidth.

However... If you break up your scene into tiles, you need less bandwidth, which is why Maxwell was able to beat GCN.
It's called doing more with less.
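A rough sketch of why tiling is "doing more with less", under the simplifying assumption that an immediate-mode GPU does a DRAM read and write per overlapping fragment, while a tile-based GPU blends in on-chip tile memory and writes each pixel out once:

```python
BPP = 4  # bytes per pixel, 32-bit colour target

def immediate_mode_bytes(pixels, overdraw, bpp=BPP):
    # Every overlapping fragment reads and writes the framebuffer in DRAM.
    return pixels * overdraw * 2 * bpp

def tile_based_bytes(pixels, overdraw, bpp=BPP):
    # Blending stays in on-chip tile memory; DRAM sees one final write per pixel.
    return pixels * bpp

pixels_1080p = 1920 * 1080
over = 4  # assumed average overdraw
print(immediate_mode_bytes(pixels_1080p, over) / 1e6)  # ~66.4 MB per frame
print(tile_based_bytes(pixels_1080p, over) / 1e6)      # ~8.3 MB per frame
```

This ignores texture and geometry traffic entirely, but it shows the mechanism: the framebuffer traffic that scales with overdraw never leaves the chip on a tiler.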

Chrkeller said:

2) Image quality impacts bandwidth. Is the S2 rendering games with rebuilt assets like Hogwarts on the S1? Nope, image quality (especially textures) is quite high.

Keep in mind the Switch 2 has a faster CPU and a faster, more modern GPU capable of significantly more advanced effects; you cannot chalk up the image quality gain of Hogwarts to just bandwidth.

Chrkeller said:

3) Fps impacts bandwidth. Is the S2 running games at a reduced fps compared to current gen? YES.

Many games run at similar framerates as the Xbox Series S... But FPS can often be impacted by the CPU being insufficient, rather than bandwidth.


Chrkeller said:

Additionally, there is no point in having the CPU/GPU render images that cannot be transferred in a timely manner.  

Maybe we have to agree to disagree.  I think the GPU is actually above where I thought it would be.  But it is limited by bandwidth.  

That's what caches are for.

Remember the Radeon 9060XT has 320GB/s of bandwidth and sits just below the 5060 Ti which has 448GB/s of bandwidth, that's a deficit of 128GB/s.
Which tells us that architectural efficiency is often more important than just unadulterated pure bandwidth.

We could take my old RX 580... I upgraded to the RX 6600 XT, same 256GB/s bandwidth, and doubled my performance even at 1440p... I then upgraded to the Radeon RX 9060 XT, which has 320GB/s of bandwidth, an extra 64GB/s... and doubled performance again. That's a fourfold increase in performance for only a 64GB/s (a quarter!) increase in bandwidth.

Or let's go back to the Radeon 5870 vs the Radeon 7850.
The 5870 has 153GB/s of bandwidth, the Radeon 7850 also has 153GB/s of bandwidth.

The 7850 can beat the 5870 by over 50% with the same memory bandwidth, regardless of resolution.

Efficiency is often the deciding factor over pure black and white bandwidth numbers.
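Taking the post's own relative-performance claims at face value (a 1.0x baseline is assumed for the RX 580; these are the poster's figures, not benchmarks), performance per unit of bandwidth roughly triples across those upgrades:

```python
# (name, bandwidth in GB/s, relative performance claimed in the post)
cards = [
    ("RX 580",     256, 1.0),  # assumed baseline
    ("RX 6600 XT", 256, 2.0),  # "doubled my performance"
    ("RX 9060 XT", 320, 4.0),  # "doubled performance again"
]

for name, bw, perf in cards:
    # Normalise so the RX 580's 256 GB/s scores 1.0x.
    print(f"{name}: {perf / bw * 256:.2f}x performance per unit of bandwidth")
```

The normalised ratios come out to 1.0x, 2.0x and 3.2x, which is the "efficiency over raw numbers" point in quantitative form.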

You need to play more games. FF16 was a mess. 1440p and in the 70s for fps... so yeah, I was pleased with Rebirth.

Last edited by Chrkeller - on 23 August 2025

i7-13700k

Vengeance 32 gb

RTX 4090 Ventus 3x E OC

Switch OLED

I beat DK last night. Just under 30 hours. A few boss fights were almost exclusively stuck at 30 fps. There was an entire level that mostly ran 30 fps as well.

The instant drop to 30 fps is awful. I wish Nintendo would stop programming their games to go from 60 directly to 30.

DK's fps was not as consistent as Odyssey's, in my experience. Late game, the drops become a somewhat regular occurrence.




Chrkeller said:

I beat DK last night. Just under 30 hours. A few boss fights were almost exclusively stuck at 30 fps. There was an entire level that mostly ran 30 fps as well.

The instant drop to 30 fps is awful. I wish Nintendo would stop programming their games to go from 60 directly to 30.

The climax of both Void Kong fights dropped to 30 for me, but what "entire level" are you referring to, cos I don't recall that happening in my playthrough.

What you're describing is double buffer v-sync, which is where if the game can't hit its frame time, it waits for the next screen refresh, which means 60 falls to 30. 

As I understand it, there are a couple of solutions to this; you can submit your frame mid-refresh, resulting in screen tearing; Nintendo seems to really dislike this as none of their games have it.

You can also hold a frame in reserve at all times and permanently run one frame behind; this eliminates both screen tearing and hard drops to 30, but it introduces input lag; Nintendo tends to place a premium on responsive gameplay, so input delay was probably a no-no for them.

VRR of course can fix the issue entirely, but for whatever reason its implementation on Switch 2 currently doesn't work. This is unfortunate, but also not unheard of; VRR on PS5 was busted for more than 4 years before finally being fixed a couple months ago.
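The 60-to-30 behaviour described above falls out of the math directly: with double-buffered v-sync on a 60 Hz display, a frame that misses the ~16.7 ms budget waits for the next whole refresh, so the presentable rates are locked to 60, 30, 20, 15... A minimal model of that mechanism:

```python
import math

def displayed_fps(frame_time_ms, refresh_hz=60):
    """Double-buffered v-sync: each frame occupies a whole number of refreshes."""
    refresh_ms = 1000 / refresh_hz
    slots = math.ceil(frame_time_ms / refresh_ms)  # refreshes consumed per frame
    return refresh_hz / slots

print(displayed_fps(16.0))  # within budget: 60.0
print(displayed_fps(17.0))  # misses by ~0.3 ms: drops straight to 30.0
print(displayed_fps(34.0))  # two refreshes missed: 20.0
```

Note how a miss of a fraction of a millisecond halves the displayed rate; that cliff is exactly what tearing, triple buffering, and VRR each trade away in their own manner.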



curl-6 said:
Chrkeller said:

I beat DK last night. Just under 30 hours. A few boss fights were almost exclusively stuck at 30 fps. There was an entire level that mostly ran 30 fps as well.

The instant drop to 30 fps is awful. I wish Nintendo would stop programming their games to go from 60 directly to 30.

The climax of both Void Kong fights dropped to 30 for me, but what "entire level" are you referring to, cos I don't recall that happening in my playthrough.

What you're describing is double buffer v-sync, which is where if the game can't hit its frame time, it waits for the next screen refresh, which means 60 falls to 30. 

As I understand it, there are a couple of solutions to this; you can submit your frame mid-refresh, resulting in screen tearing; Nintendo seems to really dislike this as none of their games have it.

You can also hold a frame in reserve at all times and permanently run one frame behind; this eliminates both screen tearing and hard drops to 30, but it introduces input lag; Nintendo tends to place a premium on responsive gameplay, so input delay was probably a no-no for them.

VRR of course can fix the issue entirely, but for whatever reason its implementation on Switch 2 currently doesn't work. This is unfortunate, but also not unheard of; VRR on PS5 was busted for more than 4 years before finally being fixed a couple months ago.

I don't want to spoil it for others, but a couple of the SL16xx ran horribly.

There are plenty of fixes and it would be nice if Nintendo would take them on.  We shouldn't have hardware in 2025 that goes from 60 directly to 30 fps.

Still a really good game.  But the last few hours were not as polished nor fun.  




Chrkeller said:
curl-6 said:

The climax of both Void Kong fights dropped to 30 for me, but what "entire level" are you referring to, cos I don't recall that happening in my playthrough.

What you're describing is double buffer v-sync, which is where if the game can't hit its frame time, it waits for the next screen refresh, which means 60 falls to 30. 

As I understand it, there are a couple of solutions to this; you can submit your frame mid-refresh, resulting in screen tearing; Nintendo seems to really dislike this as none of their games have it.

You can also hold a frame in reserve at all times and permanently run one frame behind; this eliminates both screen tearing and hard drops to 30, but it introduces input lag; Nintendo tends to place a premium on responsive gameplay, so input delay was probably a no-no for them.

VRR of course can fix the issue entirely, but for whatever reason its implementation on Switch 2 currently doesn't work. This is unfortunate, but also not unheard of; VRR on PS5 was busted for more than 4 years before finally being fixed a couple months ago.

I don't want to spoil it for others, but a couple of the SL16xx ran horribly.

There are plenty of fixes and it would be nice if Nintendo would take them on.  We shouldn't have hardware in 2025 that goes from 60 directly to 30 fps.

Still a really good game.  But the last few hours were not as polished nor fun.  

VRR on Switch 2 does absolutely need to be fixed, yeah. If that were working properly, then the dips in DK might be smoothed over considerably, not to mention it would open the door to properly paced 40fps modes in Switch 2 games, which would be a nice option to have when possible.

Hopefully it doesn't take as long to be fixed as it did on PS5.



curl-6 said:
Chrkeller said:

I don't want to spoil it for others, but a couple of the SL16xx ran horribly.

There are plenty of fixes and it would be nice if Nintendo would take them on.  We shouldn't have hardware in 2025 that goes from 60 directly to 30 fps.

Still a really good game.  But the last few hours were not as polished nor fun.  

VRR on Switch 2 does absolutely need to be fixed, yeah. If that were working properly, then the dips in DK might be smoothed over considerably, not to mention it would open the door to properly paced 40fps modes in Switch 2 games, which would be a nice option to have when possible.

Hopefully it doesn't take as long to be fixed as it did on PS5.

It shouldn't take long to get fixed, given VRR has been a standard in electronics since 2020/21. I get struggling with new tech like RT... but this is old technology.

Dropping from 60 to the mid 40s wouldn't have been too bad.

For BOTW, I haven't noticed any drops and I'm around 100 hours in.




curl-6 said:
Chrkeller said:

I beat DK last night. Just under 30 hours. A few boss fights were almost exclusively stuck at 30 fps. There was an entire level that mostly ran 30 fps as well.

The instant drop to 30 fps is awful. I wish Nintendo would stop programming their games to go from 60 directly to 30.

The climax of both Void Kong fights dropped to 30 for me, but what "entire level" are you referring to, cos I don't recall that happening in my playthrough.

What you're describing is double buffer v-sync, which is where if the game can't hit its frame time, it waits for the next screen refresh, which means 60 falls to 30. 

As I understand it, there are a couple of solutions to this; you can submit your frame mid-refresh, resulting in screen tearing; Nintendo seems to really dislike this as none of their games have it.

You can also hold a frame in reserve at all times and permanently run one frame behind; this eliminates both screen tearing and hard drops to 30, but it introduces input lag; Nintendo tends to place a premium on responsive gameplay, so input delay was probably a no-no for them.

VRR of course can fix the issue entirely, but for whatever reason its implementation on Switch 2 currently doesn't work. This is unfortunate, but also not unheard of; VRR on PS5 was busted for more than 4 years before finally being fixed a couple months ago.

This feels like a black and white issue as far as user experience goes, and I think Nintendo is failing here.

Almost every developer has abandoned double buffer v-sync, even without VRR. No one wants to experience a game jumping back and forth between the extremes of 30 and 60 fps. The hard drop to 30 introduces input delay anyway, as the game would likely otherwise be in the 40s/50s. I think a variable frame rate would be less distracting in these moments, unless the frame graph in DK would actually spend most of its time in the low 30s anyway.
Almost every developer has abandoned double buffer vsync, even wothout VRR. No one one wants to experience a game jumping back and forth between the extreme of 30 & 60fps. The hard drop to 30 introduces a delay input anyway as the game likely otherwise be in the 40s/50s. I think a variable frame rate would be less distracting in these moments unless the frame graph in DK would actually spend most of its time in the low 30s anyway.