
How Will Switch 2 Be, Performance-Wise?

 

Poll: Switch 2 is out! How do you classify it?

Terribly outdated! - 3 (5.26%)
Outdated - 1 (1.75%)
Slightly outdated - 14 (24.56%)
On point - 31 (54.39%)
High tech! - 7 (12.28%)
A mixed bag - 1 (1.75%)
Total: 57
Biggerboat1 said:
Chrkeller said:

It always amazes me when people won't accept being wrong. The article demonstrates the average player benefits from fps above 60 fps. You said the "majority" of players don't. Your statement was wrong. Pretty simple. Btw, there are other articles as well. As fps goes up, performance goes up. When you said "the majority," it was based on nothing and is factually untrue.

Also, enjoy how you picked the graph of perception but ignored the graphs that show an uptick in the objective performance criteria.

Like I said, it isn't the look of 120 fps, it is the increased accuracy and more responsive controls. The article proves it is a real thing.

From what I can see in this study, the performance and perceived differences from increasing framerate are the very definition of diminishing returns.

You said that diminishing returns don't apply to increases from 60 to 120, but that's exactly what these results show.

The 'QoE' and 'Score' differences between 60 > 90, and certainly 90 > 120, border on negligible.

Not sure how you can interpret those graphs as meaning the opposite...

Because the jump from 60 fps to 90 fps is significant. I already stated that 90 fps to 120 fps is diminished. But those arguing that going above 60 fps doesn't have a meaningful impact are factually wrong. Pretty simple.

Edit

What the article demonstrates is that most don't visually perceive a difference, but the performance metrics show a jump in play between 60 and 90.
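The frame-time arithmetic behind this back-and-forth is easy to check. A minimal sketch in Python (the fps steps are just the ones discussed above, not figures from the study):

```python
# Frame time in milliseconds: time = 1000 / fps
for fps in (30, 60, 90, 120):
    print(f"{fps:>3} fps -> {1000 / fps:.1f} ms per frame")

# 30 -> 60 fps saves 16.7 ms per frame, 60 -> 90 saves 5.6 ms,
# and 90 -> 120 saves only 2.8 ms: each step buys less latency,
# which is the diminishing-returns shape being argued about.
```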

Last edited by Chrkeller - on 16 August 2025

i7-13700k

Vengeance 32 GB

RTX 4090 Ventus 3x E OC

Switch OLED

Chrkeller said:

I would like to see Pikmin 4 and LM3 upgraded to 60 fps. But yeah, when feasible, Nintendo drops fidelity and goes after fps, which is lovely.

Yeah, those games at 60fps would look damn nice; they're already pretty good looking even at 30.

I'm keen to see what, say, a Luigi's Mansion 4 or Pikmin 5 could look like on Switch 2; it could be something really special.

And agreed, given the constraints of portable hardware, it is nice that they place a higher priority on performance than you'd expect; I feel like most devs under the same restrictions would opt for 30 in most cases.



Bumblaster said:

I've tried 120 FPS and it looks EXACTLY the same to me as 60 FPS. You must have weird, sensitive eyeballs!

It mostly has to do with mouse controls on PCs. 60 Hz became "standard" on PC monitors only when LCDs were introduced; before that, on CRT monitors, I remember usually running them at 80-100 Hz depending on resolution. So the 60 Hz of the LCD era felt like somebody had suddenly put small rocks on your hands; thus, gamers ran games without vsync most of the time and accepted screen tearing as a necessary evil.

So yeah, 60 Hz is more than enough for a gamepad, but pretty lousy for mouse controls.



Chrkeller said:

It always amazes me when people won't accept being wrong. The article demonstrates the average player benefits from fps above 60 fps. You said the "majority" of players don't. Your statement was wrong. Pretty simple. Btw, there are other articles as well. As fps goes up, performance goes up. When you said "the majority," it was based on nothing and is factually untrue.

Also, enjoy how you picked the graph of perception but ignored the graphs that show an uptick in the objective performance criteria.

Like I said, it isn't the look of 120 fps, it is the increased accuracy and more responsive controls. The article proves it is a real thing.


I have no issue being wrong, and honestly I don't even look at it as being wrong half the time, because healthy discussions focus on organic exchange and broader meaning/understanding. You are right that I picked the graph on perception, but that's because that was quite literally my point lol. People struggle to detect the difference.

The original exchange was literally someone saying 120fps is a waste of resources for them:

"I'm a normal gamer, not a 'competitive gamer'. 120fps is a waste of resources for me."


You said diminishing returns do not exist with high FPS and I said they do, which is illustrated by the study you grabbed.

"There comes a point, however, where across the spectrum of gamers, the vast majority cannot detect that extra 10-20ms of lag."


Now, I shat out "10-20ms" as a number, but I was just communicating the point of diminishing returns: that at a certain level, a reduction in latency (let's say to 10ms from 16ms) is not discernible to the majority. Looking at the study (which is very small, I might add), I feel like that's a pretty fair assessment.

The perceived improvement in smoothness actually dipped going from 60fps to 90fps (meaning participants couldn't tell the difference and mistook the higher framerate for being worse), and only increased marginally at 120fps.

Additionally, these are averages, which is not the same as talking about the "average gamer". If 9 people rate a game 9/10 and 1 person rates it 10/10, its average is 9.1, but the majority rated it a 9; it doesn't mean the average participant rated it 9.1. The "vast majority" here is really the mode: the value that appears most frequently in a dataset. This study doesn't actually report that, but it's safe to assume it sits where the curve flatlines and the diminishing returns take effect.
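That mean-versus-mode distinction is easy to demonstrate with the made-up ratings above (a minimal sketch in Python; the numbers are the hypothetical ones from this post, not from the study):

```python
from statistics import mean, mode

# Nine people rate the game 9/10, one rates it 10/10 (hypothetical numbers)
ratings = [9] * 9 + [10]

print(mean(ratings))  # 9.1 -> the "average" rating
print(mode(ratings))  # 9   -> what the vast majority actually said
```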

I take your point that even when there is a negligible perceived difference, there is still likely an objective performance difference for most of the participants up until 90-120fps. I do think that is specifically relevant to mouse/keyboard input, which is what this study is based on, and specifically to a game that requires twitchy aiming (just look at those tiny targets in the game they played lol).

Just to weave the meaning back into this, I think it is very fair to say the vast majority of console gamers will not benefit from 120fps whilst playing on consoles with a controller. And especially if it's not a shooter, I don't think they'd benefit at all.

Last edited by Otter - on 16 August 2025

PSVR2 demonstrates the difference between 60 and 120 all the time. All the 60 fps reprojected games have a completely different feel and look between stick turning (60 fps) and head turning (120 fps). There is no smooth analog turning at 60fps, hence snap turning, or increasing the stick turn rate to treat it like mouse look (or turning very slowly). Very different from sitting on a merry-go-round at high speed and getting natural motion blur (before throwing up lol).

But it depends on screen size, with VR obviously having the largest possible screen. Or rather, it depends on the distance that things 'skip' between frames.

Your eyes pick up multiple frames above 30 fps. The human eye's "frame rate" is flexible; depending on light input and other variables, it sits anywhere between 30 and 60 fps. Your eyes also track moving objects, keeping the same area of your retina focused on a moving object to collect 'data' on what you're looking at.

At 60 fps, if you turn circles with the analog stick (a constant angular turn rate), you see multiple frames overlapping. Natural motion blur doesn't work, since your eyes get exposed to discrete frames that get blended together on your retina. The same happens when following a moving object across the screen, like a mouse pointer. The more frames, the smaller the steps it makes and the easier it is to follow and look at. The bigger the steps, the more it jumps across your retina and the more you see multiple mouse pointers at once. Just move the mouse pointer back and forth over a black background: at 144 Hz I can see about 8-10 discrete pointers if I move the mouse up and down quickly. If I change my display to 60 Hz, I only see 5-6 pointers with the same movement. So double images (seeing multiple frames together) already start above 12 fps. The human eye has high persistence.
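That pointer test can be modelled very roughly: if the retina holds an image for some persistence window, the number of pointer copies you see at once is about persistence times refresh rate. A back-of-envelope sketch in Python (the ~60 ms persistence figure is an assumption picked to land near the counts above, not a measured constant):

```python
def visible_copies(refresh_hz: float, persistence_s: float = 0.06) -> float:
    """Rough count of simultaneous 'ghost' images of a fast-moving
    pointer: the frames still lingering on the retina at once."""
    return refresh_hz * persistence_s

for hz in (60, 144):
    print(f"{hz} Hz -> ~{visible_copies(hz):.0f} overlapping pointers")
# 60 Hz -> ~4, 144 Hz -> ~9 (same ballpark as the 5-6 and 8-10 observed above)
```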

The only perfect frame rate is the one where the fastest-moving object doesn't skip a single pixel while moving across the screen. Of course, there is also an upper limit to being able to track objects: you can follow a baseball with your eyes, not a bullet.

The maximum speed for smooth pursuit in adults is around 100°/s, which translates to roughly 8.7 m/s (about 19.5 mph) for an object 5 meters away.

100 degrees is about the FOV of VR headsets. So at 2000x2040 per eye, you need 2,000 frames per second to be able to follow objects without any judder. At 20/20 vision (60 pixels per degree), over 100 degrees that's 6,000 pixels per second you can track, i.e. 6,000 frames per second to eliminate any judder. That's the upper limit, but at the same time we can perceive motion at 12 fps; that was the going rate for animated movies and stop motion.
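Those figures can be reproduced in a few lines (a sketch in Python; the 100°/s pursuit limit, 5 m distance, 60 px/deg acuity, 100° FOV and 2000 px panel are the numbers quoted above):

```python
import math

PURSUIT_DEG_S = 100   # max smooth-pursuit speed, deg/s
DISTANCE_M = 5        # object distance in meters
ACUITY_PX_DEG = 60    # 20/20 vision, pixels per degree
FOV_DEG = 100         # rough VR headset field of view
PANEL_PX = 2000       # horizontal pixels per eye

# Linear speed of an object 5 m away crossing your view at 100 deg/s
speed_ms = math.radians(PURSUIT_DEG_S) * DISTANCE_M
print(f"{speed_ms:.1f} m/s = {speed_ms * 2.237:.1f} mph")  # ~8.7 m/s, ~19.5 mph

# Frames per second needed so the object never skips a pixel per frame
print(PANEL_PX / FOV_DEG * PURSUIT_DEG_S, "fps on the headset panel")  # 2000.0
print(ACUITY_PX_DEG * PURSUIT_DEG_S, "fps at full 20/20 acuity")       # 6000
```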

Of course, all that does is turn seeing multiple mouse pointers into seeing a streak/smear. Hence motion blur works to make things appear to move more smoothly. However, motion blur should work together with eye movement so that smooth pursuit still works. Eye-tracked dynamic motion blur is a better solution than brute-forcing frame rate.


For gaming, the better you can track targets with your eyes, the better you can aim at them; hence higher fps is better for competitive gaming. The reaction-time difference is negligible if input lag is already well below human reaction time (~200ms). Movement simply looks clearer and sharper at higher fps.




One thought I've had in the past, after experiencing VR and being somebody highly susceptible to car/motion sickness in general, is that I wonder if a lot of non-gamers are more sensitive to low frame-rates. 

When I was growing up, my mother loved to play 2D games (Donkey Kong Country, Super Mario, etc.). When games transitioned to 3D, she could play relatively fixed-camera games like Crash Bandicoot, but nothing with a fully movable camera.

She'd get motion sickness from any 3D game with a controllable camera, which was probably because of the low framerates of early 3D games. That's when she stopped playing any new titles.

I wonder if there is a sort of survivorship bias, where people who have been playing games at low frame-rates with no issues are more resistant to low framerates in general, and those people tend to play games more.

It would be interesting to see if people who didn't enjoy 3D games in that transition can enjoy them now that higher framerates are the standard.



I think Nintendo needs to fix the screen with a firmware update before VRR. DF said it's so bad that 60fps doesn't even feel like 60fps; it feels like 60fps with frame gen.

Last edited by redkong - on 16 August 2025

https://www.testufo.com/



i7-13700k

Vengeance 32 GB

RTX 4090 Ventus 3x E OC

Switch OLED

I have genuinely only noticed ghosting a few times, and that was when it was all the rage to knock every aspect of the Switch's hardware. Once the screen's poor response time was no longer the trendy thing to moan about, I stopped noticing ghosting. I'm sure it's there, but if you don't notice it, then who cares? It also means Nintendo will get loads of new buyers, the ones who do notice it, for the inevitable OLED model with better response times.



It seems like Unreal Engine 5 will really struggle on Switch 2. Borderlands 4 will be missing split-screen co-op and will run at "mostly 30fps". Split Fiction is 30fps and tops out at 914p; for reference, the Series S version is 1080p/60fps with better lighting. Sparking! Zero is 810p/30fps. Fortnite, with no Nanite or Lumen, is like the last-gen version, which is not good. The gap between Series S and Switch 2 on Unreal 5 is massive.