
Forums - Nintendo Discussion - How will the Switch 2 be, performance-wise?

 

Your expectations (poll)

- Performance ridiculously ... — 0 (0%)
- Really below current gen,... — 2 (100.00%)
- Slightly below current ge... — 0 (0%)
- On par with current gen,... — 0 (0%)

Total: 2
Chrkeller said:
Soundwave said:

The PS5 is what it is. There's no magic it has, if it did it would run Cyberpunk 2077 better than it does, unless you are saying they purposely gimped the port of the game. 1440p/60 fps at medium tier settings is the max it can do. Maybe you can squeeze a touch more, but I seriously doubt the Cyberpunk 2077 developers left like 30-40% performance on the floor, if the PS5 was able to run that game at like 60 fps/4K, I'm sure they would be happy to do it. It can't do that because resolutions above 1440p are incredibly taxing on any GPU and eat up a disproportionate amount of power to make an already clean looking image even sharper. 

Just like a 2050 isn't a PS5 exactly, a PS5 certainly isn't a high end GPU either. 

If we're going to bring a 2050 into this, a 2050 runs anything a PS5 can with some dropped settings, sure, but there's no game a PS5 runs right now that a 2050 wouldn't be able to run in a playable state, barring exclusivity nonsense. It's not like the PS5 is running anything at High/Ultra settings, its ray tracing abilities are shit too, and that's going to be the standard for the next 4-5 years really, like it or not. PS5 Pro is never going to touch sales of the base model. 

Pretty sure everyone agrees.  A 1070 runs 95% of the same games as a 4090.  Running the same games is common; it isn't 1990 anymore.  Running the same games doesn't equate to the same class of hardware. 

Exactly, and it will depend on the game and developer. Many games run on Switch that I would call a very unpleasant experience.  FromSoftware, for example, is bad at making games run well, and I imagine their games running on a 2050 would be a mess of a port.
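The resolution-cost point in the quoted post can be roughed out with quick pixel arithmetic. This is just a sketch; the assumption that rendering cost scales roughly linearly with pixel count is mine, not from the thread:

```python
# Pixel counts for common resolutions; comparing 4K to 1440p shows why
# the last resolution step is so expensive. Linear cost-per-pixel is an
# assumption, not a measured GPU result.
RESOLUTIONS = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K":    (3840, 2160),
}

def pixels(name: str) -> int:
    """Total pixels per frame for a named resolution."""
    w, h = RESOLUTIONS[name]
    return w * h

# 4K is 2.25x the pixels of 1440p: sharpening an already-clean image
# eats a disproportionate share of GPU power.
print(f"4K / 1440p pixel ratio: {pixels('4K') / pixels('1440p'):.2f}x")
```

Under that (rough) linear assumption, going from 1440p to 4K more than doubles the per-frame workload, which matches the "incredibly taxing" claim above.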



Chrkeller said:
Soundwave said:

The PS5 is what it is. There's no magic it has, if it did it would run Cyberpunk 2077 better than it does, unless you are saying they purposely gimped the port of the game. 1440p/60 fps at medium tier settings is the max it can do. Maybe you can squeeze a touch more, but I seriously doubt the Cyberpunk 2077 developers left like 30-40% performance on the floor, if the PS5 was able to run that game at like 60 fps/4K, I'm sure they would be happy to do it. It can't do that because resolutions above 1440p are incredibly taxing on any GPU and eat up a disproportionate amount of power to make an already clean looking image even sharper. 

Just like a 2050 isn't a PS5 exactly, a PS5 certainly isn't a high end GPU either. 

If we're going to bring a 2050 into this, a 2050 runs anything a PS5 can with some dropped settings, sure, but there's no game a PS5 runs right now that a 2050 wouldn't be able to run in a playable state, barring exclusivity nonsense. It's not like the PS5 is running anything at High/Ultra settings, its ray tracing abilities are shit too, and that's going to be the standard for the next 4-5 years really, like it or not. PS5 Pro is never going to touch sales of the base model. 

Pretty sure everyone agrees.  A 1070 runs 95% of the same games as a 4090.  Running the same games is common; it isn't 1990 anymore.  Running the same games doesn't equate to the same class of hardware. 

Well that's pushing to an extreme, but a PS5 isn't much more powerful than a 2050 to begin with. It's maybe double the performance and that doesn't mean jack shit today because resolution and frame rate basically can eat that difference up in a second, so those two pieces of hardware are the same class of hardware. 

It's not like a PS5 is running Alan Wake II on Ultra with ray tracing, not even close. 

In my day simply flipping from low to medium settings is not a generational leap, not sure why I'm supposed to believe that is a different class of hardware today. 



Soundwave said:
Chrkeller said:

Pretty sure everyone agrees.  A 1070 runs 95% of the same games as a 4090.  Running the same games is common; it isn't 1990 anymore.  Running the same games doesn't equate to the same class of hardware. 

Well that's pushing to an extreme, but a PS5 isn't much more powerful than a 2050 to begin with. It's maybe double the performance and that doesn't mean jack shit today because resolution and frame rate basically can eat that difference up in a second, so those two pieces of hardware are the same class of hardware. 

It's not like a PS5 is running Alan Wake II on Ultra with ray tracing, not even close. 

In my day simply flipping from low to medium settings is not a generational leap, not sure why I'm supposed to believe that is a different class of hardware today. 

Your opinions are your own.  I don't consider a 2070s being 153% higher in benchmarking against a 2050 mobile "not much more powerful."  But to each their own. 

I feel 1440p at 60 fps is a massive jump compared to 1080p at 30 fps at the same settings.  Add in better settings, low to medium...  but hey, your opinion is your opinion.
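The "1440p/60 vs 1080p/30" comparison above can be quantified as raw pixel throughput. A back-of-the-envelope sketch; actual GPU cost does not scale perfectly linearly with pixels or frames:

```python
def pixel_throughput(width: int, height: int, fps: int) -> int:
    """Pixels rendered per second at a given resolution and frame rate."""
    return width * height * fps

base = pixel_throughput(1920, 1080, 30)  # 1080p at 30 fps
jump = pixel_throughput(2560, 1440, 60)  # 1440p at 60 fps

# Roughly 3.56x the raw pixel work, before any settings changes.
print(f"1440p/60 vs 1080p/30: {jump / base:.2f}x pixel throughput")
```

By this crude measure, the 1440p/60 target asks for about 3.5x the per-second pixel work of 1080p/30, which is why one side of the thread calls it a massive jump.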



Chrkeller said:
Soundwave said:

Well that's pushing to an extreme, but a PS5 isn't much more powerful than a 2050 to begin with. It's maybe double the performance and that doesn't mean jack shit today because resolution and frame rate basically can eat that difference up in a second, so those two pieces of hardware are the same class of hardware. 

It's not like a PS5 is running Alan Wake II on Ultra with ray tracing, not even close. 

In my day simply flipping from low to medium settings is not a generational leap, not sure why I'm supposed to believe that is a different class of hardware today. 

Your opinions are your own.  I don't consider a 2070s being 153% higher in benchmarking against a 2050 mobile "not much more powerful."  But to each their own.  

That's not a different class of hardware, especially if the latter also has the benefit of being able to be played portably. 

A Dreamcast is a different class of hardware from an N64 (and that was a 2-year difference in hardware back then), not like going to a PC and going from Low to Medium settings and bumping resolution from 1080p to 1440p and I'm supposed to fall over in amazement over that, lol. 

Even when comparing to the past, you really need a 5-6x leap in hardware to be in a different class entirely. A 150% performance uptick over something like a Super NES back in the day wouldn't magically give you a "different class of hardware" ... it would just be SNES-class hardware even back then. 

Today it's even worse, with most of these chips using most of their performance overhead just on resolution and frame rate alone (and ray tracing makes it even worse). There's no actual difference in the base assets as there might have been in games of the past, because it costs so much time/money to make the actual assets (models, textures, etc.) today. 

If Nintendo released a version of Breath of the Wild that runs at 1080p or even 1440p/60 fps am I supposed to fall out of my seat in amazement? No, I wouldn't. It's more like "oh ok, that's nice". It's not mind blowing though, not even close. And then on top of that if you had to lose portability to get that performance ... nah, forget it. I'd rather buy BOTW at 900p + only 30 fps but with the upside of being able to play it anywhere I want. 

Last edited by Soundwave - on 13 February 2024

For sure, people have their own experience and appreciation for what is considered a big leap. I think almost everyone can appreciate the difference between low/high quality settings with a resolution & FPS boost thrown in. I think the point is that it's not a generational leap, or what will define a system. It's more of a polished version of the same experience (most of the time).

I think for the general gaming public, especially on console, a big leap isn't really defined by settings (low vs high textures etc) but instead the overall target experience of what the developer is able to accomplish.

FPS and resolution alone don't make a huge jump. Consoles have constantly gone between 30-60fps, as opposed to a consistent push for 60fps, so clearly people don't treat 60fps as a generational leap. The only place this doesn't apply is fast-paced FPSs, where 60 became the default last gen. Resolution can constitute a big jump, but it's always been coupled with other significant improvements in overall rendering, and once we hit HD the gains were not as significant.

[image comparison] PS4 HD remaster btw (so on PS2 it looked worse)

[image comparison] Now jumping from FFXV to XVI
This is what hardware-defining gaps are, and the first 3 can't be achieved by changing settings. Evidently, by the last example, the gaps are getting smaller, hence there is less need for Switch 2 to punch toe to toe with PS5. If it can land somewhere between FFXV and FFXVI in terms of a port, that is probably a comparable experience to the average console gamer. (Not the best image of FFXVI, BTW.)

Last edited by Otter - on 13 February 2024

Soundwave said:
Chrkeller said:

Your opinions are your own.  I don't consider a 2070s being 153% higher in benchmarking against a 2050 mobile "not much more powerful."  But to each their own.  

That's not a different class of hardware, especially if the latter also has the benefit of being able to be played portably. 

A Dreamcast is a different class of hardware from an N64 (and that was a 2-year difference in hardware back then), not like going to a PC and going from Low to Medium settings and bumping resolution from 1080p to 1440p and I'm supposed to fall over in amazement over that, lol. 

Even when comparing to the past, you really need a 5-6x leap in hardware to be in a different class entirely. A 150% performance uptick over something like a Super NES back in the day wouldn't magically give you a "different class of hardware" ... it would just be SNES-class hardware even back then. 

Today it's even worse, with most of these chips using most of their performance overhead just on resolution and frame rate alone (and ray tracing makes it even worse). There's no actual difference in the base assets as there might have been in games of the past, because it costs so much time/money to make the actual assets (models, textures, etc.) today. 

If Nintendo released a version of Breath of the Wild that runs at 1080p or even 1440p/60 fps am I supposed to fall out of my seat in amazement? No, I wouldn't. It's more like "oh ok, that's nice". It's not mind blowing though, not even close. And then on top of that if you had to lose portability to get that performance ... nah, forget it. I'd rather buy BOTW at 900p + only 30 fps but with the upside of being able to play it anywhere I want. 

I agree with the bold part.  While higher resolutions are nice and a consistent framerate is vital for gameplay, when it comes to the actual graphics it's the assets themselves that really matter to me.  I'd rather see improved base assets than a jump to 4k or 120 fps.

When it comes to the Switch 2, the main thing I want to see is a generational leap over the current Switch.  I want to see Mario look a generation above Odyssey, Zelda a generation above Tears of the Kingdom, Metroid a generation above Prime Remastered, etc.  In addition, I want the 3rd party ports to be as high quality as the Switch ports of Doom, Doom Eternal, Witcher 3, Nier Automata, Crysis, Persona 5, etc.  If it has to top out at 1080p to do that, I don't care.



Soundwave said:
Chrkeller said:

Your opinions are your own.  I don't consider a 2070s being 153% higher in benchmarking against a 2050 mobile "not much more powerful."  But to each their own.  

That's not a different class of hardware, especially if the latter also has the benefit of being able to be played portably. 

A Dreamcast is a different class of hardware from an N64 (and that was a 2-year difference in hardware back then), not like going to a PC and going from Low to Medium settings and bumping resolution from 1080p to 1440p and I'm supposed to fall over in amazement over that, lol. 

Even when comparing to the past, you really need a 5-6x leap in hardware to be in a different class entirely. A 150% performance uptick over something like a Super NES back in the day wouldn't magically give you a "different class of hardware" ... it would just be SNES-class hardware even back then. 

Today it's even worse, with most of these chips using most of their performance overhead just on resolution and frame rate alone (and ray tracing makes it even worse). There's no actual difference in the base assets as there might have been in games of the past, because it costs so much time/money to make the actual assets (models, textures, etc.) today. 

If Nintendo released a version of Breath of the Wild that runs at 1080p or even 1440p/60 fps am I supposed to fall out of my seat in amazement? No, I wouldn't. It's more like "oh ok, that's nice". It's not mind blowing though, not even close. And then on top of that if you had to lose portability to get that performance ... nah, forget it. I'd rather buy BOTW at 900p + only 30 fps but with the upside of being able to play it anywhere I want. 

Fair.  I'm different.  I would buy Breath of the Wild day 1 at 1440p and 60 fps.  I loathe 30 fps.  Frankly, I don't even like 60 fps that much; 120 fps is so amazing.

Last edited by Chrkeller - on 13 February 2024

Chrkeller said:
Soundwave said:

That's not a different class of hardware, especially if the latter also has the benefit of being able to be played portably. 

A Dreamcast is a different class of hardware from an N64 (and that was a 2-year difference in hardware back then), not like going to a PC and going from Low to Medium settings and bumping resolution from 1080p to 1440p and I'm supposed to fall over in amazement over that, lol. 

Even when comparing to the past, you really need a 5-6x leap in hardware to be in a different class entirely. A 150% performance uptick over something like a Super NES back in the day wouldn't magically give you a "different class of hardware" ... it would just be SNES-class hardware even back then. 

Today it's even worse, with most of these chips using most of their performance overhead just on resolution and frame rate alone (and ray tracing makes it even worse). There's no actual difference in the base assets as there might have been in games of the past, because it costs so much time/money to make the actual assets (models, textures, etc.) today. 

If Nintendo released a version of Breath of the Wild that runs at 1080p or even 1440p/60 fps am I supposed to fall out of my seat in amazement? No, I wouldn't. It's more like "oh ok, that's nice". It's not mind blowing though, not even close. And then on top of that if you had to lose portability to get that performance ... nah, forget it. I'd rather buy BOTW at 900p + only 30 fps but with the upside of being able to play it anywhere I want. 

Fair.  I'm different.  I would buy Breath of the Wild day 1 at 1440p and 60 fps.  I loathe 30 fps.  Frankly, I don't even like 60 fps that much; 120 fps is so amazing.

I agree with you. 30fps feels like a slideshow after getting used to 60fps. I'm good without 120fps, as I don't wanna spend money on a PC right now; I got a second job and am barely gaming now. Honestly, we are at the point where these current-gen consoles are so powerful that it's all about how they use art style and 60fps.

Last edited by zeldaring - on 13 February 2024

zeldaring said:
Chrkeller said:

Fair.  I'm different.  I would buy Breath of the Wild day 1 at 1440p and 60 fps.  I loathe 30 fps.  Frankly, I don't even like 60 fps that much; 120 fps is so amazing.

I agree with you. 30fps feels like a slideshow after getting used to 60fps. I'm good without 120fps, as I don't wanna spend money on a PC right now; I got a second job and am barely gaming now. Honestly, we are at the point where these current-gen consoles are so powerful that it's all about how they use art style and 60fps.

Absolutely.  60 fps is a sweet spot.  120, while nice, comes at a huge cost.  30 fps is just antiquated.

And 4k is overrated.  1440p is the sweet spot.  

As far as I'm concerned 1080p 30 fps can go **** itself.  

Last edited by Chrkeller - on 13 February 2024

h2ohno said:
Soundwave said:

That's not a different class of hardware, especially if the latter also has the benefit of being able to be played portably. 

A Dreamcast is a different class of hardware from an N64 (and that was a 2-year difference in hardware back then), not like going to a PC and going from Low to Medium settings and bumping resolution from 1080p to 1440p and I'm supposed to fall over in amazement over that, lol. 

Even when comparing to the past, you really need a 5-6x leap in hardware to be in a different class entirely. A 150% performance uptick over something like a Super NES back in the day wouldn't magically give you a "different class of hardware" ... it would just be SNES-class hardware even back then. 

Today it's even worse, with most of these chips using most of their performance overhead just on resolution and frame rate alone (and ray tracing makes it even worse). There's no actual difference in the base assets as there might have been in games of the past, because it costs so much time/money to make the actual assets (models, textures, etc.) today. 

If Nintendo released a version of Breath of the Wild that runs at 1080p or even 1440p/60 fps am I supposed to fall out of my seat in amazement? No, I wouldn't. It's more like "oh ok, that's nice". It's not mind blowing though, not even close. And then on top of that if you had to lose portability to get that performance ... nah, forget it. I'd rather buy BOTW at 900p + only 30 fps but with the upside of being able to play it anywhere I want. 

I agree with the bold part.  While higher resolutions are nice and a consistent framerate is vital for gameplay, when it comes to the actual graphics it's the assets themselves that really matter to me.  I'd rather see improved base assets than a jump to 4k or 120 fps.

When it comes to the Switch 2, the main thing I want to see is a generational leap over the current Switch.  I want to see Mario look a generation above Odyssey, Zelda a generation above Tears of the Kingdom, Metroid a generation above Prime Remastered, etc.  In addition, I want the 3rd party ports to be as high quality as the Switch ports of Doom, Doom Eternal, Witcher 3, Nier Automata, Crysis, Persona 5, etc.  If it has to top out at 1080p to do that, I don't care.

It's not like you get nothing out of the trade-off on a hybrid. The huge addition is that you're getting a portable version of every game that you can play anywhere you want. 

Frankly, that's a much bigger deal to a lot of people than having a resolution bump + 60 fps. 

Want proof? Would anyone seriously claim that a Switch that was not a hybrid, but could run all its games like BOTW at 1440p + 60 fps, would sell anywhere close to the current Switch? 

It wouldn't. It probably would have trouble selling even over 20 million units instead of the 150+ million the Switch is going to finish at. A lot more people are interested in a Breath of the Wild at 900p + 30 fps AND portable + home play versus a version that would be maybe 1440p + 60 fps but losing the portability. 

There's no game I've ever played in my life that was ugly or not fun to play at 1080p + 30 fps that suddenly became good looking and fun to play at 1440p + 60 fps. 

It's a nice-to-have. It's not a generational shift, or even close to that. It's also funny how we don't have every PS5 thread flooded with people saying the 4090 is a much, much better piece of hardware, that you shouldn't enjoy PS5 games because they can't possibly run as well as they could on a 4090, that the PS5 is clearly a generation behind a 4090, etc. For every game on the PS5 that's available on PC, a 4090 can run it at a better resolution and frame rate than a PS5, but no one seems compelled to put that into every PS5 discussion. Interesting how the gatekeeping only applies to one console and one console only. 

Last edited by Soundwave - on 13 February 2024