
Forums - Nintendo Discussion - How will the Switch 2 be, performance-wise?

 

Your expectations

Performance ridiculously ... — 0 (0%)

Really below current gen,... — 2 (100.00%)

Slightly below current ge... — 0 (0%)

On par with current gen,... — 0 (0%)

Total: 2
Chrkeller said:
160rmf said:

Well, if it was a crap show, she would quit playing. You are the one using terms that don't resonate with reality.

Fair enough.  I'll change my verbiage from "crap" to "obvious lower fidelity."

Better?  

Good. Now people who say "not being able" should instead say "not caring enough," and we can move this thread along just fine.



 

 

We reap what we sow


Deleted, to move thread along.  160rmf has a valid perspective.

Last edited by Chrkeller - on 11 February 2024

i7-13700k

Vengeance 32 gb

RTX 4090 Ventus 3x E OC

Switch OLED

160rmf said:
Chrkeller said:

Fair enough.  I'll change my verbiage from "crap" to "obvious lower fidelity."

Better?  

Good. Now people who say "not being able" should instead say "not caring enough," and we can move this thread along just fine.

Works for me.  Nintendo is my favorite developer, and the Switch 2 will be a day-one purchase as soon as I can get one. A 2050 will make their style of games look brilliant and will be a massive improvement. My biggest hope for the Switch 2 is better audio; surround sound on the Switch was disappointing.




zeldaring said:
160rmf said:

Well, if it was a crap show, she would quit playing. You are the one using terms that don't resonate with reality.

That's his perspective, coming from a person who loves graphics and high framerates. A Samsung Galaxy 8 is a crappy old phone by today's standards, but some people don't care, and it's perfectly fine for them.

The term you are looking for is "high end".

Remember when high end cellphone technology actually made the standard look like prehistoric shit? Because I do.

Last edited by 160rmf - on 11 February 2024

 

 


zeldaring said:

Yes, it's not worth continuing to argue about this. We will get info soon, but it's just like with past Nintendo consoles: Nintendo fans say it doesn't make sense, then the specs come out and they land at the lowest end of predictions every time.

A Nintendo hater crying over facts won't change reality. We already know it has 12 SMs, and THAT. DOESN'T. WORK. AT. 8NM.



Blazerz said:

A Nintendo hater crying over facts won't change reality. We already know it has 12 SMs, and THAT. DOESN'T. WORK. AT. 8NM.

There are 8 nm Orins with 16 SMs in the market.

They have a 15W configuration that still manages to run at higher clocks than the docked Switch (while being comparable to it power budget-wise).

Cut that in half to 210 MHz, and assuming the voltage is at a minimum, you already get close to an undocked Switch's power budget with an even higher SM count and much faster memory/tensor cores than the Switch 2 is going to have.
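The clock-and-voltage reasoning above can be sketched with the usual CMOS approximation that dynamic power scales linearly with frequency and quadratically with voltage. The 15 W figure comes from the post; the 420 MHz baseline (implied by "cut that in half to 210 MHz") and the 0.8x voltage drop are illustrative assumptions, not measured Orin numbers:

```python
# Rough dynamic-power scaling sketch, assuming P ~ C * V^2 * f.
# The 15 W budget is from the post; clocks and voltages are hypothetical.

def dynamic_power(base_power_w, base_clock_mhz, new_clock_mhz,
                  base_voltage=1.0, new_voltage=1.0):
    """Scale power linearly with frequency and quadratically with voltage."""
    f_ratio = new_clock_mhz / base_clock_mhz
    v_ratio = new_voltage / base_voltage
    return base_power_w * f_ratio * v_ratio ** 2

# Halving a hypothetical 420 MHz config to 210 MHz at the same voltage
# already halves dynamic power; lowering voltage too helps quadratically.
print(dynamic_power(15.0, 420, 210))            # 7.5 W at the same voltage
print(dynamic_power(15.0, 420, 210, 1.0, 0.8))  # 4.8 W at 0.8x voltage
```

That quadratic voltage term is why running near minimum voltage gets a wide chip down toward a handheld power budget so quickly.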



An RTX 2050 is PS5-class hardware. Saying it's not would be like saying the Xbox Series S is not in the same generation class as the PS5 ... no one is buying that.

Just because it's not exactly equal doesn't mean it can't run basically the same games.

A 2050 with optimized developer support will run basically every PS5 game. It doesn't have the exact same teraflop performance, but with DLSS it can render at half the resolution or less and still provide a similar image output.
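The arithmetic behind that DLSS claim is simple: rendering at half the output resolution per axis means shading only a quarter of the pixels before upscaling. The 4K target and 50% axis scale below are illustrative numbers, not measured settings:

```python
# Back-of-envelope sketch of DLSS-style upscaling savings.
# Output resolution and scale factor are illustrative assumptions.

def pixel_fraction(scale):
    """Fraction of output pixels actually rendered at a given per-axis scale."""
    return scale ** 2

output = (3840, 2160)                            # hypothetical 4K output target
internal = tuple(int(d * 0.5) for d in output)   # 50% per-axis internal render

print(internal)              # (1920, 1080)
print(pixel_fraction(0.5))   # 0.25 -> roughly 4x less shading work
```

A similar image at roughly a quarter of the shading cost is why the raw teraflop gap matters less than it looks on paper.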

Anyone unhappy with that level of performance from a portable device is smoking crack cocaine, lol.



haxxiy said:
Blazerz said:

A Nintendo hater crying over facts won't change reality. We already know it has 12 SMs, and THAT. DOESN'T. WORK. AT. 8NM.

There are 8 nm Orins with 16 SMs in the market.

They have a 15W configuration that still manages to run at higher clocks than the docked Switch (while being comparable to it power budget-wise).

Cut that in half to 210 MHz, and assuming the voltage is at a minimum, you already get close to an undocked Switch's power budget with an even higher SM count and much faster memory/tensor cores than the Switch 2 is going to have.

The problem remains that this is a massive waste of SMs. You could get the same performance from 8 SMs clocked a bit higher and have a cheaper chip. Why pay for 12 SMs? It's a far larger chip; it doesn't really make a whole lot of sense. This doesn't even align with things Nintendo has done in the past, if that's the argument.

210 MHz, lol; the Wii's GPU, built on 2001-era technology, has a higher clock than that. These are insanely low clocks for a more expensive and much larger chip, for no reason. This is like going out of your way to buy a jumbo popcorn at the theater, paying the $6 premium for it, and then eating 10% of it; you could have just bought a regular popcorn and not paid the extra money. And if the argument for doing so is "well, I did that because I'm cheap" ... it's like, what? lol. How does that make any sense?
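The "wide and slow versus narrow and fast" tradeoff being argued here can be sketched with the standard FP32 throughput formula (2 ops per core per clock, 128 CUDA cores per Ampere SM). The 210 MHz and 315 MHz clocks are the hypothetical configurations from the post, not confirmed specs:

```python
# Sketch of the SM-count vs. clock tradeoff: equal FP32 throughput from
# two different configurations. 128 cores/SM is the Ampere figure; the
# clocks are the post's hypotheticals.

def fp32_gflops(sms, clock_mhz, cores_per_sm=128):
    """Peak FP32 GFLOPS: 2 ops * cores * SMs * clock."""
    return 2 * cores_per_sm * sms * clock_mhz / 1000

wide_slow   = fp32_gflops(12, 210)  # 12 SMs at 210 MHz
narrow_fast = fp32_gflops(8, 315)   # 8 SMs at 315 MHz

print(wide_slow, narrow_fast)  # both ~645 GFLOPS
```

On paper the two are identical; the counterargument is that the wider chip hits that number at lower clocks and voltage, i.e. lower power, which is the whole dispute in this exchange.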

Last edited by Soundwave - on 11 February 2024

Soundwave said:

I love the "you can't trust your own eyes on YouTube!". There's a good chance the Switch 2 has more usable RAM for gaming than a 2050, also.

Normal people don't really give a crap about this stuff. High-end GPU sales have been tanking; they were inflated by crypto miners for a few years, got a boost during COVID when everyone was locked in their homes, and are now crashing back down to earth. The market for the really high-end GPUs is maybe 1% of the gaming market; 99% of people aren't in that market and don't care to be.




To be honest, if you're paying for a high-end GPU thinking you're going to get a massive lift on the settings side, you're totally getting suckered, lol. The difference is not even close to being worth paying 3-4x the price. Developers optimize the low setting to be great these days; it's basically what medium+ settings used to be. The reason, as I stated, is that they can't make custom lower-end models, textures, and lighting just for a low version; that would cost a lot of money and time. So the low version still gets the same models, the same basic baked lighting effects, and for the most part even the same textures. You're not going to hire a couple of artists just to redo every texture in a game; that would be insane.

A 3050 is basically a PS5, and with DLSS and better ray tracing (Nvidia > AMD), I would honestly take a 3050 over a PS5.

You are right that high-end GPUs are fairly niche in gaming. The latest Steam Hardware Survey has only around 10% using high-end cards from the last two Nvidia GPU generations (so 3070-4090), while AMD's high-end parts from the last two generations are under 1% combined. Most of the market share is in the low-to-mid range, which is why it is odd that Nvidia didn't release a low-end card this generation; they decided to rename the 4050 to the 4060 and charged $300 for it. So there is no proper replacement for the aged 3050, which is easily outclassed by both AMD and Intel Arc cards in the same price range, like the Radeon 6600 and 6600 XT and the Arc A580 and A750.

The 3050 definitely isn't on par with the PS5, though. Most people agree that PS5 performance falls somewhere around the Radeon 6650, which easily outclasses the 3050 in rasterization and edges it out in ray tracing as well, even though AMD isn't great at ray tracing. The closest Nvidia cards to PS5 performance currently are the 3060 and 2070, which come in just below the PS5 in rasterization while besting it in ray tracing, and the 2070 Super, which should about equal the PS5 in rasterization while besting it in ray tracing.

Last edited by shikamaru317 - on 11 February 2024

shikamaru317 said:
Soundwave said:

I love the "you can't trust your own eyes on YouTube!". There's a good chance the Switch 2 has more usable RAM for gaming than a 2050, also.

Normal people don't really give a crap about this stuff. High-end GPU sales have been tanking; they were inflated by crypto miners for a few years, got a boost during COVID when everyone was locked in their homes, and are now crashing back down to earth. The market for the really high-end GPUs is maybe 1% of the gaming market; 99% of people aren't in that market and don't care to be.




To be honest, if you're paying for a high-end GPU thinking you're going to get a massive lift on the settings side, you're totally getting suckered, lol. The difference is not even close to being worth paying 3-4x the price. Developers optimize the low setting to be great these days; it's basically what medium+ settings used to be. The reason, as I stated, is that they can't make custom lower-end models, textures, and lighting just for a low version; that would cost a lot of money and time. So the low version still gets the same models, the same basic baked lighting effects, and for the most part even the same textures. You're not going to hire a couple of artists just to redo every texture in a game; that would be insane.

A 3050 is basically a PS5, and with DLSS and better ray tracing (Nvidia > AMD), I would honestly take a 3050 over a PS5.

You are right that high-end GPUs are fairly niche in gaming. The latest Steam Hardware Survey has only around 10% using high-end cards from the last two Nvidia GPU generations (so 3070-4090), while AMD's high-end parts from the last two generations are under 1% combined. Most of the market share is in the low-to-mid range, which is why it is odd that Nvidia didn't release a low-end card this generation; they decided to rename the 4050 to the 4060 and charged $300 for it. So there is no proper replacement for the aged 3050, which is easily outclassed by both AMD and Intel Arc cards in the same price range.

The 3050 definitely isn't on par with the PS5, though. Most people agree that PS5 performance falls somewhere around the Radeon 6650 or 6700, both of which easily outclass the 3050 in rasterization and edge it out in ray tracing as well, even though AMD isn't great at ray tracing. The closest Nvidia cards to PS5 performance currently are the 3060 and 2070, roughly equaling it in rasterization while besting it in ray tracing.

That 10% number is actually a lot higher than it normally would be. A lot of people circa 2018-2021 bought a higher-end GPU than they normally would have because of the COVID lockdown plus the crypto-bro boom (people thinking they could make money during it).

We're seeing now with those two factors gone, new GPU sales have gone down the toilet. 

A 3050 doesn't have to have the exact same performance; my point is that it's within the same hardware class. No one says the Xbox Series S is a different generation from the Xbox Series X or PS5. And if the Series S had DLSS on top of what it already does, that gap would shrink even smaller.

You guys are frankly spoiled today with hardware. When I was growing up, the idea of a portable device even being in the same neighborhood as a home console was unbelievable. The idea of a portable with even half the power of an N64 or PlayStation back in the day would not even have been comprehensible to people, lol. People freaked out over the PSP being even somewhat comparable to a PS2, and a 3050 is a lot closer to a PS5 than a PSP is to a PS2. Especially with how games use power today, you can take more or less the same game with basically the same assets, bring the resolution down, and drop the frame rate from 60 to 30 fps, and that wipes out most of what the extra power was being used for.
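The resolution-plus-framerate point above is just multiplication: pixel throughput is width × height × fps, so the cuts compound. The specific resolutions below (4K60 for the home console, 720p30 for the handheld) are illustrative examples, not actual target specs:

```python
# Rough pixel-throughput comparison behind the "drop resolution and
# halve the frame rate" argument. Resolutions/framerates are illustrative.

def pixels_per_second(width, height, fps):
    """Raw pixels the GPU must produce per second at a given output."""
    return width * height * fps

console_like  = pixels_per_second(3840, 2160, 60)  # hypothetical 4K60 target
handheld_like = pixels_per_second(1280, 720, 30)   # hypothetical 720p30 target

print(console_like / handheld_like)  # 18.0 -> ~18x less raw pixel throughput
```

An 18x cut in raw pixel work, from the same assets, is how much headroom those two knobs alone buy a portable port.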

We take for granted what portable hardware can do today. People would have shit themselves if, circa 2008, you showed them a handheld running a modern game like Call of Duty for Xbox 360, even at half the resolution and half the frame rate with no anti-aliasing but otherwise the same assets and the same game. People's heads would have exploded. The first iPhone, which was ridiculously expensive in 2007 and which only a few people could afford, had a screen resolution of 320x480, lol.

If someone is complaining about 3050, or even 2050, level hardware in a portable device that doesn't cost $1000+ and can be thrown into a coat pocket, then lol, at some point you've just lost the plot. That's not even "hardcore gamer" stuff; that's the nerdy fringe of a fringe of a fringe.

Last edited by Soundwave - on 11 February 2024