
Forums - Nintendo Discussion - How will the Switch 2 be, performance-wise?

 

Your expectations

Performance ridiculously ... — 0 (0%)
Really below current gen,... — 2 (100.00%)
Slightly below current ge... — 0 (0%)
On par with current gen,... — 0 (0%)

Total: 2
Louie said:

I think what most people are interested in is: "Will this console be able to play the games I like at decent settings?" That's basically the question things revolve around. And I think for the vast majority of people playing a game at 1080p @ 30fps is perfectly fine. We already saw that many people are willing to sacrifice graphical fidelity with a lot of major Switch ports.

I think it will.

There will be differences compared to the other consoles, obviously, but the big question is: will people be bothered by them?

People are using the term graphical "jump" as if we are going to see something like Black Ops on DS vs. PS3.

Just to put it in perspective, I still think Infamous: Second Son holds up to this day as a beautiful game. People on this site don't seem to realize how much they overestimate their ability to perceive the visual evolution of games over these recent generations.

Last edited by 160rmf - on 06 February 2024

 

 

We reap what we sow

HoloDust said:

Once you throw in Ray/Path tracing, voxel based worlds for persistence and destructibility (and I'm not talking Minecraft sized cubes, but quite small, for fine details), and AI agents for NPCs (and that's what will all eventually happen), there is never enough power under the hood.

That is something I want to see and to perceive, instead of just chasing the best visual performance.



 

 


The famous comparison that really highlights what people are talking about when referring to diminishing returns.

Obviously an untextured, fairly undetailed mesh has its limits in communicating the point, but back in the day a generational leap meant immediate and obvious ways of making a night-and-day difference to a character model, a texture, an environment. That doesn't apply as much anymore.



You won't suddenly get a hugely noticeable difference by doubling the polys on Ellie's character model or increasing the texture detail. The scenes are already populated to a degree that feels authentic; doubling the number of meshes within the environment won't suddenly make the game more believable. The rendering techniques on skin, hair, and pretty much every texture are already very close to objective realism. The baked lighting is amazing and doesn't need ray tracing to make the scene fully convincing.

Last edited by Otter - on 06 February 2024
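Otter's point about geometry hitting a ceiling can be made concrete with a back-of-the-envelope calculation: once a model's triangles approach roughly one per covered pixel, doubling the count only adds sub-pixel detail. The helper below and all of its numbers are purely illustrative, not taken from any actual game:

```python
# Why doubling polygon counts stops being visible: past a certain density,
# extra triangles are smaller than the pixels that display them.

def triangles_per_pixel(poly_count, screen_coverage, width=1920, height=1080):
    """Average triangles per covered pixel for a model filling
    `screen_coverage` (0..1) of a width x height framebuffer."""
    covered_pixels = width * height * screen_coverage
    return poly_count / covered_pixels

# A hypothetical 100k-triangle character filling a quarter of a 1080p screen:
base = triangles_per_pixel(100_000, 0.25)     # ~0.19 triangles per pixel
doubled = triangles_per_pixel(200_000, 0.25)  # ~0.39 -- same silhouette on screen
print(f"{base:.2f} -> {doubled:.2f} triangles per pixel")
```

Either way the model reads the same at normal viewing distance, which is the diminishing-returns argument in one number.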

Otter said:

The famous comparison that really highlights what people are talking about when referring to diminishing returns.

Obviously an untextured, fairly undetailed mesh has its limits in communicating the point, but back in the day a generational leap meant immediate and obvious ways of making a night-and-day difference to a character model, a texture, an environment. That doesn't apply as much anymore.



You won't suddenly get a hugely noticeable difference by doubling the polys on Ellie's character model or increasing the texture detail. The scenes are already populated to a degree that feels authentic; doubling the number of meshes within the environment won't suddenly make the game more believable. The rendering techniques on skin, hair, and pretty much every texture are already very close to objective realism. The baked lighting is amazing and doesn't need ray tracing to make the scene fully convincing.

The polygon wars died a decade ago. It is all about lighting, shadows, particles, volumetric effects, scattering, fps, resolution, texture quality, anisotropic filtering, anti-aliasing, etc. All of which eat memory bandwidth and VRAM.

I have to disagree that texture quality doesn't matter. RE2, as an example, immediately looks better at the 8 GB texture setting when compared to the lowest one. Higher settings also reduce stuttering. Most important is anisotropic filtering; 16x compared to bilinear is incredible.

Last edited by Chrkeller - on 06 February 2024
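The "texture quality eats VRAM" point is easy to sanity-check: an uncompressed texture's footprint is width × height × bytes per texel, and the mip chain that anisotropic filtering samples from adds roughly a third on top (1/4 + 1/16 + ... ≈ 1/3). A rough sketch, with an illustrative helper name; real games use block-compressed formats that shrink these figures considerably:

```python
# Approximate VRAM footprint of one uncompressed texture.

def texture_vram_bytes(width, height, bytes_per_texel=4, mipmapped=True):
    """Base level is width * height * bytes_per_texel; a full mip
    chain multiplies that by roughly 4/3."""
    base = width * height * bytes_per_texel
    return int(base * 4 / 3) if mipmapped else base

# A 4096x4096 RGBA8 texture: 64 MiB base, ~85 MiB with a full mip chain.
print(texture_vram_bytes(4096, 4096, mipmapped=False) / 2**20)  # 64.0
print(texture_vram_bytes(4096, 4096) / 2**20)                   # ~85.3
```

A few dozen textures at that size is how an 8 GB texture pool fills up, which is why the setting trades VRAM directly for sharpness.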

Chrkeller said:
Biggerboat1 said:

You're contradicting yourself: you say that the graphics jump isn't apparent on consoles, but then go on to say there's still a long way to go before consoles hit diminishing returns.

Diminishing returns doesn't mean no returns.

And the fact that your example relies on a gap bigger than a generational console jump proves my point, not yours...

You also said yourself that many of these advances rely on good/modern/large displays. What proportion of the market has access to a large high-end display, capable of 120fps?

Diminishing returns implies, to me, that graphical jumps are gone. My point is that isn't true. Consoles, especially the Switch 2, struggle with memory bandwidth. A 3050 is going to be 200 GB/s, while the PS5 is 400 and a 4090 is 1,000. Once consoles catch up on bandwidth there will be a massive jump. The gap from PS5 to PS6 will be bigger than PS4 to PS5. It isn't diminishing; this gen is just weak because of market conditions.

120 Hz panels are pretty common these days. A 10 GB still image at 30 fps requires bandwidth of 300 GB/s; 60 fps is 600 and 120 fps is 1,200. The Switch 2 is going to be very noticeably limited.

I think people are forgetting about the GPU shortage and underestimating how much cross-gen ports held the PS5 back. I could see a PS6 hitting 1,000 GB/s memory bandwidth, and the jump will be impressive.

Edit

By definition, diminishing returns approach an asymptote. We are not seeing this with graphics; we are seeing an exponential curve. There is a major difference.
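The bandwidth figures in the post above are straightforward arithmetic: data touched per frame times frames per second. A quick sketch reproducing those numbers (the 10 GB per frame is the poster's own assumption about total memory traffic per frame, not an actual framebuffer size):

```python
# Sustained bandwidth needed if rendering each frame touches
# `data_per_frame_gb` of memory traffic: bandwidth = data * fps.

def required_bandwidth_gbps(data_per_frame_gb, fps):
    return data_per_frame_gb * fps

for fps in (30, 60, 120):
    print(f"{fps} fps -> {required_bandwidth_gbps(10, fps)} GB/s")  # 300, 600, 1200
```

This is why doubling the target frame rate at fixed quality doubles the bandwidth requirement, and why caches and compression exist to cut the per-frame traffic in practice.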

Your definition of diminishing returns is incorrect - see pic. You're creating a straw man to give yourself wiggle room.

Bolded/underlined 1, are you talking about a bigger jump in perceived improvement? If not, you're entirely missing the point of the convo.

Bolded/underlined 2, I specifically cited Spiderman 1 vs Spiderman 2 as an example, which avoids this issue. Do you really think the average gamer could play those two games and instantly assume they must be playing on different-gen consoles?

Bolded/underlined 3, you're moving the goalposts, it's not about technology not improving, it's about the diminishing returns of the perceived difference in improved technology.

Otter has given a good example, which seems to go straight over your head. His same point could be applied to all of the elements you listed (lighting, shadows, particles, volumetric effects, scattering, fps, resolution, texture quality, anisotropic filtering, anti-aliasing, etc.)




I chose Series S, as I think it'll be closer to that than to the PS4 Pro, or somewhere in the middle. I think Nintendo will concentrate more on the handheld aspect of the Switch 2, because from the beginning of the Switch's life it was considered a console but was always a handheld that really replaced the 3DS. So the hardware won't need 4K resolution, and it probably won't even bother with ray tracing; that wouldn't make sense with the art style of Mario or Zelda. The handheld will benefit more from having 60 Hz or even 120. That would be a great advantage to advertise. But it's also hard to say whether Nintendo wants 3rd-party games to run just as well. Even old PS4 games look shocking on the Switch. Does Nintendo need 3rd parties? They seem to sell OK without GTA or COD.



Biggerboat1 said:
Chrkeller said:

Diminishing returns implies, to me, that graphical jumps are gone. My point is that isn't true. Consoles, especially the Switch 2, struggle with memory bandwidth. A 3050 is going to be 200 GB/s, while the PS5 is 400 and a 4090 is 1,000. Once consoles catch up on bandwidth there will be a massive jump. The gap from PS5 to PS6 will be bigger than PS4 to PS5. It isn't diminishing; this gen is just weak because of market conditions.

120 Hz panels are pretty common these days. A 10 GB still image at 30 fps requires bandwidth of 300 GB/s; 60 fps is 600 and 120 fps is 1,200. The Switch 2 is going to be very noticeably limited.

I think people are forgetting about the GPU shortage and underestimating how much cross-gen ports held the PS5 back. I could see a PS6 hitting 1,000 GB/s memory bandwidth, and the jump will be impressive.

Edit

By definition, diminishing returns approach an asymptote. We are not seeing this with graphics; we are seeing an exponential curve. There is a major difference.

Your definition of diminishing returns is incorrect - see pic. You're creating a straw man to give yourself wiggle room.

Bolded/underlined 1, are you talking about a bigger jump in perceived improvement? If not, you're entirely missing the point of the convo.

Bolded/underlined 2, I specifically cited Spiderman 1 vs Spiderman 2 as an example, which avoids this issue. Do you really think the average gamer could play those two games and instantly assume they must be playing on different-gen consoles?

Bolded/underlined 3, you're moving the goalposts, it's not about technology not improving, it's about the diminishing returns of the perceived difference in improved technology.

Otter has given a good example, which seems to go straight over your head. His same point could be applied to all of the elements you listed (lighting, shadows, particles, volumetric effects, scattering, fps, resolution, texture quality, anisotropic filtering, anti-aliasing, etc.)

Play at 120 fps on ultra settings and tell me there isn't a massive and immediately perceivable difference. That's all I have to say at this point. Give it a try yourself.

Edit 

And the other point would be that we have all given our positions. I'll stick with the Switch 2 struggling with next-gen games like Rebirth, Alan Wake 2, GTA 6, etc. Time will tell which of us is correct.

Last edited by Chrkeller - on 06 February 2024

DF still seems to be assuming 8nm.



Oneeee-Chan!!! said:

DF still seems to be assuming 8nm.

Pretty obvious that's what it's gonna be. I would say 8nm is 90% locked in, given everything we know and how Nintendo operates.



8nm is the default process to assume, since Ampere is manufactured on it. But I wouldn't be surprised if Nintendo decided to throw us a curveball and used TSMC 4nm instead.