
Forums - Nintendo - How Will Switch 2 Be, Performance-Wise?

 

Switch 2 is out! How do you classify it?

Terribly outdated! 3 5.26%
 
Outdated 1 1.75%
 
Slightly outdated 14 24.56%
 
On point 31 54.39%
 
High tech! 7 12.28%
 
A mixed bag 1 1.75%
 
Total: 57
curl-6 said:
Chrkeller said:

And it is holding the system back in terms of resolution and fps. Resolution can be offset via DLSS, but there is no offset for the fps part.

In terms of picture fidelity, didn't you tell me Cyber matches (and sometimes beats out) the Series S? But the fps is half (when comparing performance modes). Bandwidth is the bottleneck for fps.

fps is frames per second. A second is a time unit; a frame is a still picture. One picture per second = x, then two pictures per second is 2x. Ten pictures per second is 10x, 30 is 30x, 60 is 60x, 120 is 120x. A picture is a data file.

It is a bottleneck. I really don't see how anybody can think otherwise. In order for the bandwidth to not be a bottleneck, fidelity would have to be significantly sacrificed, including resolution. If the S2, in something like Madden, is going to have fidelity similar to the Series S, fps will be 30 because of the bandwidth.
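The "two pictures per second is 2x" point is just linear scaling, and it can be sketched as a back-of-envelope calculation. This is illustrative only (it assumes a plain 32-bit color buffer and counts nothing but framebuffer writes; real frames also move textures, depth, and intermediate targets, so actual traffic is many times higher):

```python
# Back-of-envelope: GB/s of framebuffer write traffic alone at a given
# resolution and framerate. Illustrative numbers only -- real frames also
# read/write textures, depth buffers, and intermediate render targets.

def framebuffer_gbps(width, height, fps, bytes_per_pixel=4):
    """GB/s needed just to write one 32-bit color buffer per frame."""
    return width * height * bytes_per_pixel * fps / 1e9

per_30 = framebuffer_gbps(1920, 1080, 30)  # ~0.25 GB/s for the color buffer alone
per_60 = framebuffer_gbps(1920, 1080, 60)  # exactly double: traffic scales linearly with fps
```

Whatever the real per-frame traffic is, doubling the framerate doubles it, which is the core of the argument.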

The problem here is you are taking one aspect of a system's technical makeup as if it's the only factor, while ignoring the numerous other components.

Many things other than bandwidth can bottleneck a system and limit performance, from CPU draw calls to pixel and texel fillrate to asset streaming.

Bandwidth is only one factor among many.

But look at what developers are doing across the three main sectors of fidelity.

1) Resolution impacts bandwidth. Is the S2 rendering games at 360p, looking like The Witcher 3 on the S1? Nope. Rendering resolution, in many cases, is quite high.

2) Image quality impacts bandwidth. Is the S2 rendering games with rebuilt assets like Hogwarts on the S1? Nope; image quality (especially textures) is quite high.

3) fps impacts bandwidth. Is the S2 running games at a reduced fps compared to current gen? YES.

Third party developers, thus far, are addressing the memory bandwidth bottleneck by dropping fps... this is literally happening, it is a fact.

Additionally, there is no point in having the CPU/GPU render images that cannot be transferred in a timely manner.  

Maybe we have to agree to disagree.  I think the GPU is actually above where I thought it would be.  But it is limited by bandwidth.  
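The "no point rendering images that cannot be transferred in time" reasoning is the classic bottleneck model: a frame is only done when every stage has done its part, so the slowest stage sets the framerate. A minimal sketch with hypothetical per-stage numbers (not real measurements of any console):

```python
# Hypothetical fps each subsystem could sustain on its own. The values are
# made up purely to illustrate the bottleneck argument, not measured specs.
stages = {"cpu": 70, "gpu": 55, "bandwidth": 35}

def effective_fps(stages):
    """The whole pipeline runs at the pace of its slowest stage."""
    return min(stages.values())

limiter = min(stages, key=stages.get)  # the stage that caps the framerate
```

With these made-up numbers, a faster GPU changes nothing until the bandwidth figure rises, which is exactly the shape of the disagreement in this thread.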



i7-13700k

Vengeance 32 gb

RTX 4090 Ventus 3x E OC

Switch OLED

Chrkeller said:
curl-6 said:

The problem here is you are taking one aspect of a system's technical makeup as if it's the only factor, while ignoring the numerous other components.

Many things other than bandwidth can bottleneck a system and limit performance, from CPU draw calls to pixel and texel fillrate to asset streaming.

Bandwidth is only one factor among many.

But look at what developers are doing across the three main sectors of fidelity.

1) Resolution impacts bandwidth. Is the S2 rendering games at 360p, looking like The Witcher 3 on the S1? Nope. Rendering resolution, in many cases, is quite high.

2) Image quality impacts bandwidth. Is the S2 rendering games with rebuilt assets like Hogwarts on the S1? Nope; image quality (especially textures) is quite high.

3) fps impacts bandwidth. Is the S2 running games at a reduced fps compared to current gen? YES.

Third party developers, thus far, are addressing the memory bandwidth bottleneck by dropping fps... this is literally happening, it is a fact.

There are things other than bandwidth that can limit framerate, from CPU load to pixel/polygon fillrate.

It will depend on the game of course; in the right circumstances (say, a game that pushes a ton of alpha transparencies at a high resolution) bandwidth could become the limiting factor, but I expect that in most ports of PS/Xbox games, it'll be more the CPU that necessitates 30fps.



curl-6 said:
Chrkeller said:

But look at what developers are doing across the three main sectors of fidelity.

1) Resolution impacts bandwidth. Is the S2 rendering games at 360p, looking like The Witcher 3 on the S1? Nope. Rendering resolution, in many cases, is quite high.

2) Image quality impacts bandwidth. Is the S2 rendering games with rebuilt assets like Hogwarts on the S1? Nope; image quality (especially textures) is quite high.

3) fps impacts bandwidth. Is the S2 running games at a reduced fps compared to current gen? YES.

Third party developers, thus far, are addressing the memory bandwidth bottleneck by dropping fps... this is literally happening, it is a fact.

There are things other than bandwidth that can limit framerate, from CPU load to pixel/polygon fillrate.

It will depend on the game of course; in the right circumstances (say, a game that pushes a ton of alpha transparencies at a high resolution) bandwidth could become the limiting factor, but I expect that in most ports of PS/Xbox games, it'll be more the CPU that necessitates 30fps.

Sure, but we are talking Elden, 7 Remake, Cyber, Madden, etc. All 30 fps; I can't fathom the CPU being why... wait for it... memory bandwidth is.

Something like Snake Eater 3, yeah, the CPU is going to take a hit. But until stuff like Snake Eater, Wilds or Outlaws are out, it is hard to determine the CPU clock speed impact.

But we do have a spread of PS4 games... you think the S2's CPU is a bottleneck with PS4 games? I don't. It is the memory bandwidth.




Chrkeller said:
curl-6 said:

There are things other than bandwidth that can limit framerate, from CPU load to pixel/polygon fillrate.

It will depend on the game of course; in the right circumstances (say, a game that pushes a ton of alpha transparencies at a high resolution) bandwidth could become the limiting factor, but I expect that in most ports of PS/Xbox games, it'll be more the CPU that necessitates 30fps.

Sure, but we are talking Elden, Remake, Cyber, Madden, etc. All 30 fps; I can't fathom the CPU being why... wait for it... memory bandwidth is.

Something like Snake Eater 3, yeah, the CPU is going to take a hit.

Elden/FF7 Remake/Cyberpunk all target 30fps on PS4, so with Switch 2's CPU being only a bit better than PS4's (lower clocks and one fewer available core, but more efficient per clock), 60 is out of reach no matter how much you cut back the graphics.

Madden is a PS5/Xbox Series port, so while it may not look CPU heavy, it is made for CPUs with much more juice than Switch 2's, so it could be that its animation system, for instance, is quite heavy, or that the graphical effects it uses eat up too many GPU cycles. It's hard to say without an in-depth analysis, and I haven't seen any. (Might have missed it though.)
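The CPU argument comes down to frame-time budgets: 60 fps allows about 16.7 ms per frame, 30 fps about 33.3 ms, and graphics settings cannot buy back CPU time. A quick sketch (the 20 ms figure is a hypothetical CPU frame cost, not a measurement of any game):

```python
# Frame-time budgets: if CPU-side work (simulation, animation, draw
# submission) alone exceeds the 60 fps budget, no graphics cut gets you
# to 60. The 20 ms example below is hypothetical.

def max_fps(cpu_frame_ms):
    """Framerate ceiling imposed by CPU time per frame."""
    return 1000.0 / cpu_frame_ms

budget_60_ms = 1000.0 / 60  # ~16.7 ms per frame at 60 fps
budget_30_ms = 1000.0 / 30  # ~33.3 ms per frame at 30 fps

ceiling = max_fps(20.0)  # a 20 ms CPU frame caps out at 50 fps -> shipped as locked 30
```

This is why a game can look graphically modest yet still be stuck at 30: the ceiling is set by whichever budget the CPU work fits inside.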



Chrkeller said:
curl-6 said:

There are things other than bandwidth that can limit framerate, from CPU load to pixel/polygon fillrate.

It will depend on the game of course; in the right circumstances (say, a game that pushes a ton of alpha transparencies at a high resolution) bandwidth could become the limiting factor, but I expect that in most ports of PS/Xbox games, it'll be more the CPU that necessitates 30fps.

Sure, but we are talking Elden, 7 Remake, Cyber, Madden, etc. All 30 fps; I can't fathom the CPU being why... wait for it... memory bandwidth is.

Something like Snake Eater 3, yeah, the CPU is going to take a hit. But until stuff like Snake Eater, Wilds or Outlaws are out, it is hard to determine the CPU clock speed impact.

But we do have a spread of PS4 games... you think the S2's CPU is a bottleneck with PS4 games? I don't. It is the memory bandwidth.

I think you're right. It's the only thing that can explain Elden Ring running especially badly in handheld mode, but apparently much better in docked mode. The PS4 has 176 GB/s of memory bandwidth, and that's really good for a 2013 console. The Switch 2's bandwidth in handheld is only 68 GB/s, and docked is 102 GB/s. A big difference from the PS4 in handheld, but the CPU difference from the PS4 in handheld isn't that big. It's the only explanation there is.
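Putting those figures side by side as ratios makes the gap concrete (taking the commonly cited 176 GB/s for PS4's GDDR5 and the widely reported LPDDR5X numbers for Switch 2):

```python
# Quoted bandwidth figures, expressed as ratios against the PS4.
PS4_GBPS = 176          # PS4 GDDR5, commonly cited figure
S2_DOCKED_GBPS = 102    # Switch 2 docked, as reported
S2_HANDHELD_GBPS = 68   # Switch 2 handheld, as reported

docked_vs_ps4 = S2_DOCKED_GBPS / PS4_GBPS      # ~0.58x of PS4
handheld_vs_ps4 = S2_HANDHELD_GBPS / PS4_GBPS  # ~0.39x of PS4 -- the gap the post describes
```

So handheld mode has well under half the PS4's raw bandwidth, which is the asymmetry the post is pointing at.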




Memory bandwidth (or rather, GPU resources in general) would explain Elden Ring if they don't drop the resolution in handheld mode, which would be ridiculous. It would be a From Software move to keep it 1080p (likely with DRS) in both docked and handheld mode, lol. And we know Elden Ring's DRS implementation is broken.

Really, a good port would've implemented DLSS in a similar way to Cyberpunk 2077. It probably could even get the same variable framerates as the other consoles, 30-50 or 40-60 (maybe a stretch), depending on where you set the internal resolution.

As it is, docked mode is probably underutilizing the GPU's resources and handheld mode over-utilizing/being bottlenecked by them.



Hardstuck-Platinum said:
Chrkeller said:

Sure, but we are talking Elden, 7 Remake, Cyber, Madden, etc. All 30 fps; I can't fathom the CPU being why... wait for it... memory bandwidth is.

Something like Snake Eater 3, yeah, the CPU is going to take a hit. But until stuff like Snake Eater, Wilds or Outlaws are out, it is hard to determine the CPU clock speed impact.

But we do have a spread of PS4 games... you think the S2's CPU is a bottleneck with PS4 games? I don't. It is the memory bandwidth.

I think you're right. It's the only thing that can explain Elden Ring running especially badly in handheld mode, but apparently much better in docked mode. The PS4 has 176 GB/s of memory bandwidth, and that's really good for a 2013 console. The Switch 2's bandwidth in handheld is only 68 GB/s, and docked is 102 GB/s. A big difference from the PS4 in handheld, but the CPU difference from the PS4 in handheld isn't that big. It's the only explanation there is.

Exactly. And if I recall correctly, the CPU clock for handheld is higher than docked... it can't be CPU limited. It is a bandwidth issue.




We also have to remember that handheld mode just has less GPU compute in general (about half as much).

Handheld mode and docked mode work so well because the resources in handheld mode are about half of what they are in docked mode. So ostensibly you could halve the internal resolution (as this scales linearly, more or less) and be able to run games at the same or similar settings and framerate. If From Software did that, then the game probably would have run fine.

If they haven't, they might even still do that, since the game doesn't even have a release date, and it is as simple as changing a config and performance testing. 

But something tells me that they are lazily using their poor DRS implementation to try to manage the internal resolutions.  
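The halve-the-resolution arithmetic is simple to check: GPU cost scales roughly linearly with pixel count, and half of 1920x1080 is exactly the pixel count of 1280x810 (a resolution picked here purely as an illustration of a plausible handheld target or DRS midpoint, not a known setting of any port):

```python
# Half the GPU compute pairs naturally with half the pixels, since
# per-frame GPU cost scales roughly linearly with pixel count.

def pixels(w, h):
    return w * h

docked_pixels = pixels(1920, 1080)    # 2,073,600 pixels at 1080p
handheld_budget = docked_pixels // 2  # 1,036,800 -- half the compute, half the pixels

# 1280x810 hits that budget exactly (illustrative target, not a known spec):
assert pixels(1280, 810) == handheld_budget
```

That is the "simple as changing a config" fix the post describes: pick an internal resolution whose pixel count matches the reduced compute budget.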

 



sc94597 said:

Memory bandwidth (or rather, GPU resources in general) would explain Elden Ring if they don't drop the resolution in handheld mode, which would be ridiculous. It would be a From Software move to keep it 1080p (likely with DRS) in both docked and handheld mode, lol. And we know Elden Ring's DRS implementation is broken.

Really, a good port would've implemented DLSS in a similar way to Cyberpunk 2077. It probably could even get the same variable framerates as the other consoles, 30-50 or 40-60 (maybe a stretch), depending on where you set the internal resolution.

As it is, docked mode is probably underutilizing the GPU's resources and handheld mode over-utilizing/being bottlenecked by them.

I get Curl's stance, and this is the reason. People are expecting a level of bespoke port that is simply not going to happen and maybe doesn't make sense. Developers are not going to go around targeting 40-50fps performance profiles on S2 that require further reduced assets in other areas.

I think it'll be hard to highlight a single bottleneck when none of these games are designed/intended to run at 60fps on Switch 2 level hardware. Bandwidth aside, the question is: do the other components look like they reflect a doubling of PS4's real-world performance? A 50% gain does not turn a 30fps game into a 60fps one.
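The 50%-gain point is worth making explicit in frame-time terms (the 1.5x factor is just the hypothetical gain from the post, not a measured spec):

```python
# Why ~1.5x the PS4's performance doesn't turn 30 fps into 60: a 33.3 ms
# PS4 frame sped up 1.5x still takes ~22.2 ms, well over the 16.7 ms a
# 60 fps frame allows. Doubling the framerate needs a full 2x speedup.
ps4_frame_ms = 1000 / 30              # ~33.3 ms per frame at 30 fps
faster_frame_ms = ps4_frame_ms / 1.5  # ~22.2 ms after a 50% gain
budget_60_ms = 1000 / 60              # ~16.7 ms per frame at 60 fps

still_misses_60 = faster_frame_ms > budget_60_ms  # True: still a 30 fps game
```

So a 50% faster system ships the same 30 fps cap, just with more headroom for resolution or settings.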

Last edited by Otter - on 22 August 2025

Otter said:
sc94597 said:

Memory bandwidth (or rather, GPU resources in general) would explain Elden Ring if they don't drop the resolution in handheld mode, which would be ridiculous. It would be a From Software move to keep it 1080p (likely with DRS) in both docked and handheld mode, lol. And we know Elden Ring's DRS implementation is broken.

Really, a good port would've implemented DLSS in a similar way to Cyberpunk 2077. It probably could even get the same variable framerates as the other consoles, 30-50 or 40-60 (maybe a stretch), depending on where you set the internal resolution.

As it is, docked mode is probably underutilizing the GPU's resources and handheld mode over-utilizing/being bottlenecked by them.

I get Curl's stance, and this is the reason. People are expecting a level of bespoke port that is simply not going to happen and maybe doesn't make sense. Developers are not going to go around targeting 40-50fps performance profiles on S2 that require further reduced assets in other areas.

I think it'll be hard to highlight a single bottleneck when none of these games are designed/intended to run at 60fps on Switch 2 level hardware. Bandwidth aside, the question is: do the other components look like they reflect a doubling of PS4's real-world performance? A 50% gain does not turn a 30fps game into a 60fps one.

That was my point. Take Elden, a PS4 game, and pop it on the S2. The S2 has a better CPU, thus that can't be the bottleneck. The S2 has a good deal better GPU, that can't be the bottleneck. The S2 has 3x the amount of RAM, that can't be the bottleneck. So what is the bottleneck? What is the one aspect of the S2, especially in handheld, that is behind the PS4? And yeah, per reports handheld mode has major fps issues.

I'll concede that many here know tech better than I do, but this seems pretty simple.  

Edit

The simple fix would be to reduce rendering resolution to free up bandwidth, but From is rather lazy and doesn't seem to be using DLSS.

Last edited by Chrkeller - on 22 August 2025
