
Forums - Nintendo - How Will be Switch 2 Performance Wise?

 

Switch 2 is out! How do you classify it?

Terribly outdated! 3 (5.26%)
Outdated 1 (1.75%)
Slightly outdated 14 (24.56%)
On point 31 (54.39%)
High tech! 7 (12.28%)
A mixed bag 1 (1.75%)

Total: 57
curl-6 said:
Chrkeller said:

My 4090 doesn't even handle real RT that well. I wouldn't expect much from an RT perspective on the S2. I keep RT on low with my 4090 because high RT is an fps killer.

The capability is there; how it is utilized is going to be down to developer choice. The PS5/Xbox Series don't have amazing RT capabilities either, honestly, yet there are still a number of titles that utilize it. Star Wars Outlaws uses ray-traced global illumination on other platforms, so unless they've totally redone the lighting model, the Switch 2 port of that will be an example.

Yeah, but that isn't really my point. RT on consoles is low. When RT is low, the quality difference between baked-in lighting and RT is negligible. RE4 is an example of this: RT on or off is minor at best. RT shows a stark difference when high, which consoles can't handle, especially the S2. We are a generation away from seeing the real benefits of RT. Low RT versus off makes no difference in most games. Half the time I pause the screen and have to hunt to even notice what low RT does. Now, extreme RT is gorgeous but also gives my 4090 20 fps.



i7-13700k

Vengeance 32 gb

RTX 4090 Ventus 3x E OC

Switch OLED

Chrkeller said:
curl-6 said:

The capability is there; how it is utilized is going to be down to developer choice. The PS5/Xbox Series don't have amazing RT capabilities either, honestly, yet there are still a number of titles that utilize it. Star Wars Outlaws uses ray-traced global illumination on other platforms, so unless they've totally redone the lighting model, the Switch 2 port of that will be an example.

Yeah, but that isn't really my point. RT on consoles is low. When RT is low, the quality difference between baked-in lighting and RT is negligible. RE4 is an example of this: RT on or off is minor at best. RT shows a stark difference when high, which consoles can't handle, especially the S2. We are a generation away from seeing the real benefits of RT. Low RT versus off makes no difference in most games. Half the time I pause the screen and have to hunt to even notice what low RT does. Now, extreme RT is gorgeous but also gives my 4090 20 fps.

If we're talking perception, one could argue that, say, going from 1080p on PS4 to 4K on PS5 isn't a huge perceptible difference due to diminishing returns, but there's still a major technical difference between the two; the same goes for realtime vs baked shadows, volumetrics vs alpha for things like smoke/fog, etc.

Personally, I find the difference between even console RT and 8th gen rasterized lighting can be pretty significant in games like say Metro Exodus.



Chrkeller said:
curl-6 said:

The capability is there; how it is utilized is going to be down to developer choice. PS5/Xbox Series don't have amazing RT capabilities either honestly, yet there are still a number of titles that utilize it. Star Wars Outlaws is using Ray traced global illumination on other platforms, so unless they've totally redone the lighting model, the Switch 2 port of that will be an example. 

Yeah, but that isn't really my point. RT on consoles is low. When RT is low, the quality difference between baked-in lighting and RT is negligible. RE4 is an example of this: RT on or off is minor at best. RT shows a stark difference when high, which consoles can't handle, especially the S2. We are a generation away from seeing the real benefits of RT. Low RT versus off makes no difference in most games. Half the time I pause the screen and have to hunt to even notice what low RT does. Now, extreme RT is gorgeous but also gives my 4090 20 fps.

I will guess you've seen this video, but if you haven't...

There is no fallback... as in, most games that have an RT option use it for certain things, with a lot of things still being solved the other way if RT is disabled. Not so for SWO.



curl-6 said:
Chrkeller said:

Yeah, but that isn't really my point. RT on consoles is low. When RT is low, the quality difference between baked-in lighting and RT is negligible. RE4 is an example of this: RT on or off is minor at best. RT shows a stark difference when high, which consoles can't handle, especially the S2. We are a generation away from seeing the real benefits of RT. Low RT versus off makes no difference in most games. Half the time I pause the screen and have to hunt to even notice what low RT does. Now, extreme RT is gorgeous but also gives my 4090 20 fps.

If we're talking perception, one could argue that, say, going from 1080p on PS4 to 4K on PS5 isn't a huge perceptible difference due to diminishing returns, but there's still a major technical difference between the two; the same goes for realtime vs baked shadows, volumetrics vs alpha for things like smoke/fog, etc.

Personally, I find the difference between even console RT and 8th gen rasterized lighting can be pretty significant in games like say Metro Exodus.

Preaching to the choir. Resolution is overrated. I can't tell the difference between 1440p and 4K. I tend to go 1440p at 100 fps over 4K at 60 fps. Resolution is diminishing returns. Fps aren't. 120 fps is bliss.

Perhaps the reason we aren't aligned is I'm all about fps.  30 fps can go **** itself.  

Last edited by Chrkeller - on 14 August 2025


HoloDust said:
Chrkeller said:

Yeah, but that isn't really my point. RT on consoles is low. When RT is low, the quality difference between baked-in lighting and RT is negligible. RE4 is an example of this: RT on or off is minor at best. RT shows a stark difference when high, which consoles can't handle, especially the S2. We are a generation away from seeing the real benefits of RT. Low RT versus off makes no difference in most games. Half the time I pause the screen and have to hunt to even notice what low RT does. Now, extreme RT is gorgeous but also gives my 4090 20 fps.

I will guess you've seen this video, but if you haven't...

There is no fallback... as in, most games that have an RT option use it for certain things, with a lot of things still being solved the other way if RT is disabled. Not so for SWO.

I am aware, and to be clear, I am not disputing that the S2 does RT. I am disputing the S2 doing RT with a stable framerate at a good resolution. And SWO, thus far, is showing this: clear dips into the low 20s and very clearly lower resolution.

Last edited by Chrkeller - on 14 August 2025



Chrkeller said:
curl-6 said:

If we're talking perception, one could argue that, say, going from 1080p on PS4 to 4K on PS5 isn't a huge perceptible difference due to diminishing returns, but there's still a major technical difference between the two; the same goes for realtime vs baked shadows, volumetrics vs alpha for things like smoke/fog, etc.

Personally, I find the difference between even console RT and 8th gen rasterized lighting can be pretty significant in games like say Metro Exodus.

Preaching to the choir. Resolution is overrated. I can't tell the difference between 1440p and 4K. I tend to go 1440p at 100 fps over 4K at 60 fps. Resolution is diminishing returns. Fps aren't. 120 fps is bliss.

Perhaps the reason we aren't aligned is I'm all about fps.  30 fps can go **** itself.  

Both are diminishing returns; a lot of people do not care for 120fps. I was playing Penny's Big Breakaway at 120fps and didn't even realise. The console and PC spaces are also very different in terms of audience expectations.



Otter said:
Chrkeller said:

Preaching to the choir. Resolution is overrated. I can't tell the difference between 1440p and 4K. I tend to go 1440p at 100 fps over 4K at 60 fps. Resolution is diminishing returns. Fps aren't. 120 fps is bliss.

Perhaps the reason we aren't aligned is I'm all about fps.  30 fps can go **** itself.  

Both are diminishing returns; a lot of people do not care for 120fps. I was playing Penny's Big Breakaway at 120fps and didn't even realise. The console and PC spaces are also very different in terms of audience expectations.

A lot of people said the same about 60fps, and now almost every game has a performance mode. That shows people do care. Performance is a must these days for several genres; it affects gameplay and response time in a meaningful way. The Switch 2 has not shown it can deliver much better performance than the PS4 in most ports, which is why DF and many others think it's much closer to the PS4. Personally I think it's all opinion based; if you care about performance, then it's a fair opinion.

Last edited by redkong - on 14 August 2025

redkong said:
Otter said:

Both are diminishing returns; a lot of people do not care for 120fps. I was playing Penny's Big Breakaway at 120fps and didn't even realise. The console and PC spaces are also very different in terms of audience expectations.

A lot of people said the same about 60fps, and now almost every game has a performance mode. That shows people do care. Performance is a must these days for several genres; it affects gameplay and response time in a meaningful way. The Switch 2 has not shown it can deliver much better performance than the PS4 in most ports, which is why DF and many others think it's much closer to the PS4. Personally I think it's all opinion based; if you care about performance, then it's a fair opinion.

This.  




redkong said:
Otter said:

Both are diminishing returns; a lot of people do not care for 120fps. I was playing Penny's Big Breakaway at 120fps and didn't even realise. The console and PC spaces are also very different in terms of audience expectations.

A lot of people said the same about 60fps, and now almost every game has a performance mode. That shows people do care. Performance is a must these days for several genres; it affects gameplay and response time in a meaningful way. The Switch 2 has not shown it can deliver much better performance than the PS4 in most ports, which is why DF and many others think it's much closer to the PS4. Personally I think it's all opinion based; if you care about performance, then it's a fair opinion.

60fps has been around and common on consoles since the '80s: most SNES games were 60fps, and most PS2 games were 60fps. 60fps has always been the default for fighting and racing games, and almost all multiplayer games throughout the PS4 generation were 60fps too.

For this reason I don't think it makes sense to compare 60fps to 120fps, but more to the point, diminishing returns mean the jump from 60 > 120 is not as important as the jump from 30 > 60... Just as people care less about resolution jumps, the same applies to FPS. It's not exempt.

There's a reason why 20fps was never accepted beyond the PS1/N64 era (1995-2001), whereas 30fps was accepted for 20 years (2000-2020) and honestly still is... When the next Zelda or GTA releases at 30fps, people will still rate it a 10 and have the time of their lives... The same would not happen to a game locked to 20fps like Ocarina of Time was, lol.

Slight side point, but I think there's actually a very interesting conversation to be had about how 60fps has been common throughout every generation bar one, yet it was the PS5/Series X generation where console gamers suddenly acted like they discovered it for the first time. I'd point to in-game toggles and social media as the culprits, but that's a conversation for later.

Last edited by Otter - on 14 August 2025