
Forums - Nintendo Discussion - Will the Switch 2 finally be powerful enough and popular enough to get Nintendo all the top games?

curl-6 said:
Mar1217 said:

Anyway as usual, Nintendo and close partners are the master of their hardware so it'll be fun to see what they'll achieve this time around.

Yeah that's what I'm most looking forward to.

Given what studios like EPD, Retro, Next Level, etc were able to squeeze out of the Switch, the thought of what they can pull off with a generationally more powerful chipset is tantalizing.

I'm of the opinion that the best looking PS4/XBO games still look great; seeing a new Zelda or Xenoblade with the level of fidelity of something like The Last of Us Part 2 or Ghost of Tsushima, now that I'd love to see.

Yes, the early days of a new console are certainly the moment to get excited about this kind of graphical jump, especially since there arguably hasn't been one this big since the jump from Wii to Wii U. (Or 3DS to Switch, if you want to be the "acshooly" guy)

Anywoo, I don't know about fidelity on the level of TLOU2 or GoT, because Nintendo tends not to go for those flashy realistic graphics. Their games will probably gain a density and clarity we haven't had before at least, and their differing art styles might pop even more than before. I'm already trying to wrap my head around what something like the next Mario Kart is supposed to look like.

Monolith Soft is gonna be cooking though, I presume; I'm guessing they're going for their usual grand open-space adventure RPG with semi-realistic environments while pushing an absurd number of characters and effects on screen lol.



Switch Friend Code : 3905-6122-2909 

Mar1217 said:
curl-6 said:

Yeah that's what I'm most looking forward to.

Given what studios like EPD, Retro, Next Level, etc were able to squeeze out of the Switch, the thought of what they can pull off with a generationally more powerful chipset is tantalizing.

I'm of the opinion that the best looking PS4/XBO games still look great; seeing a new Zelda or Xenoblade with the level of fidelity of something like The Last of Us Part 2 or Ghost of Tsushima, now that I'd love to see.

Yes, the early days of a new console are certainly the moment to get excited about this kind of graphical jump, especially since there arguably hasn't been one this big since the jump from Wii to Wii U. (Or 3DS to Switch, if you want to be the "acshooly" guy)

Anywoo, I don't know about fidelity on the level of TLOU2 or GoT, because Nintendo tends not to go for those flashy realistic graphics. Their games will probably gain a density and clarity we haven't had before at least, and their differing art styles might pop even more than before. I'm already trying to wrap my head around what something like the next Mario Kart is supposed to look like.

Monolith Soft is gonna be cooking though, I presume; I'm guessing they're going for their usual grand open-space adventure RPG with semi-realistic environments while pushing an absurd number of characters and effects on screen lol.

Oh I know Nintendo's games aren't gonna look like GoT or TLOU2 stylistically, I meant more in terms of just technical fidelity. Though some first party devs might push a somewhat realistic style, like say Retro did for Prime Remastered/Prime 4.

And yeah it's been ages since we saw a big graphical leap for Nintendo's titles, I can't wait to see it.

Mar1217 said:

That would actually be a fair point... if the games were actually optimized for the platform with a closed API, but they're not, since the Steam Deck works through a compatibility layer. The Switch successor would benefit from that optimization, plus possibly better bandwidth and a more efficient Nvidia chipset.

An example of this is Switch vs the Nvidia Shield TV. They have the exact same Tegra X1 chip at their core, and the Shield TV runs at higher clocks, but the Switch performs significantly better in most instances due to having a low level API and games being more specifically tailored to the hardware.



Chrkeller said:

Evidence meaning like Space Marine 2 and Dragon's Dogma 2 not running on the Steam Deck?

I mean we can keep dancing but I don't agree with your stance.  Is it a requirement that we agree?  

Seems to me we should be able to agree to disagree.

Let's leave opinions out of it and stick to facts and evidence that I have presented.

You aren't actually disagreeing with me specifically, you are disagreeing with evidence without providing better evidence... Which actually just means you are flat-out wrong.
I've been running you around in circles and nothing you have stated has been a compelling argument so far.

And no, it's not a requirement that we agree. I don't actually debate with people to change their minds; I debate so that others perusing the thread can see the arguments put forth and the evidence presented, and hopefully they opt for the better argument's position or gain some additional insight into the issue. Again: evidence.

Space Marine 2 and Dragon's Dogma 2 run fine on the ROG Ally X, an AMD Radeon handheld with less than 120GB/s of bandwidth.





So if AMD Radeon powered handhelds can run them, then a more efficient Tegra should be able to as well.
Again, the ROG Ally is running a full-blown Windows install, so compatibility is good, but performance is lower than a more bespoke option running a custom API and OS.

curl-6 said:

An example of this is Switch vs the Nvidia Shield TV. They have the exact same Tegra X1 chip at their core, and the Shield TV runs at higher clocks, but the Switch performs significantly better in most instances due to having a low level API and games being more specifically tailored to the hardware.

Interesting you mention the Shield TV.
The OS in the Switch is definitely leaner and cleaner with much less rubbish running in the background.
But the NVN API has also proven itself to be very capable, much more so than high-level APIs like OpenGL that most Android developers use, which come with a big performance impact, especially when lots of draw calls are thrown around.

But the Switch has had some horrendous-looking games as well, which likely highlights developers using OpenGL/Vulkan or struggling with the NVN API.
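To put a rough number on the draw-call point, here's a toy model (my own illustration; the per-call costs are invented for the sake of argument, not measured NVN or OpenGL figures) of how CPU-side submission overhead scales with call count:

```python
def frame_cpu_time_ms(draw_calls: int, overhead_us_per_call: float) -> float:
    """CPU time spent just issuing draw calls for one frame."""
    return draw_calls * overhead_us_per_call / 1000.0

# Hypothetical per-call costs: a high-level API doing validation and state
# tracking on every call, vs a low-level API replaying prebuilt command buffers.
scene_calls = 2000
print(frame_cpu_time_ms(scene_calls, 25.0))  # 50.0 ms: already past a 30 fps budget
print(frame_cpu_time_ms(scene_calls, 2.0))   #  4.0 ms: plenty of headroom
```

Same GPU, same scene; only the cost of talking to the driver changes, which is the gap a low-level API like NVN closes.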



--::{PC Gaming Master Race}::--

Pemalite said:
curl-6 said:

An example of this is Switch vs the Nvidia Shield TV. They have the exact same Tegra X1 chip at their core, and the Shield TV runs at higher clocks, but the Switch performs significantly better in most instances due to having a low level API and games being more specifically tailored to the hardware.

Interesting you mention the Shield TV.
The OS in the Switch is definitely leaner and cleaner with much less rubbish running in the background.
But the NVN API has also proven itself to be very capable, much more so than high-level APIs like OpenGL that most Android developers use, which come with a big performance impact, especially when lots of draw calls are thrown around.

But the Switch has had some horrendous-looking games as well, which likely highlights developers using OpenGL/Vulkan or struggling with the NVN API.

Yeah, there are definitely some ugly Switch games out there, most of which seem to result from games built for PS4/XBO/PS5/XBS being harshly cut down to size rather than carefully nipped and tucked to fit.

Not all studios have the necessary resources or capabilities to get the best out of the hardware it seems; it takes a lot of work and very talented programmers to pull off results like Ace Combat 7 or Sniper Elite 4 which make smart and finely tuned concessions in all the right places to keep the experience looking and feeling intact.



Pemalite said:
Chrkeller said:

Evidence meaning like Space Marine 2 and Dragon's Dogma 2 not running on the Steam Deck?

I mean we can keep dancing but I don't agree with your stance.  Is it a requirement that we agree?  

Seems to me we should be able to agree to disagree.

Let's leave opinions out of it and stick to facts and evidence that I have presented.

You aren't actually disagreeing with me specifically, you are disagreeing with evidence without providing better evidence... Which actually just means you are flat-out wrong.
I've been running you around in circles and nothing you have stated has been a compelling argument so far.

And no, it's not a requirement that we agree. I don't actually debate with people to change their minds; I debate so that others perusing the thread can see the arguments put forth and the evidence presented, and hopefully they opt for the better argument's position or gain some additional insight into the issue. Again: evidence.

Space Marine 2 and Dragon's Dogma 2 run fine on the ROG Ally X, an AMD Radeon handheld with less than 120GB/s of bandwidth.





So if AMD Radeon powered handhelds can run them, then a more efficient Tegra should be able to as well.
Again, the ROG Ally is running a full-blown Windows install, so compatibility is good, but performance is lower than a more bespoke option running a custom API and OS.

curl-6 said:

An example of this is Switch vs the Nvidia Shield TV. They have the exact same Tegra X1 chip at their core, and the Shield TV runs at higher clocks, but the Switch performs significantly better in most instances due to having a low level API and games being more specifically tailored to the hardware.

Interesting you mention the Shield TV.
The OS in the Switch is definitely leaner and cleaner with much less rubbish running in the background.
But the NVN API has also proven itself to be very capable, much more so than high-level APIs like OpenGL that most Android developers use, which come with a big performance impact, especially when lots of draw calls are thrown around.

But the Switch has had some horrendous-looking games as well, which likely highlights developers using OpenGL/Vulkan or struggling with the NVN API.

The ROG Ally X also has 24 GB of RAM and costs $800 with the Z1 Extreme chipset. So yeah, it has little to do with the S2, because the S2 will be much cheaper and less powerful. Funny thing is I knew that would be your response, and part of it is sad given the comparison is disingenuous.

At the end of the day, can I see developers looking at the S2 maxed out at 112 GB/s and reacting to a potential port with "nah, too much effort"? Yeah, I can. Funny thing is most people in this thread agree... but for some reason I have your panties twisted.

I stand by my position. It would be nice if you channelled your inner Elsa and stopped pulling a Zeldaring.

And ironically you support my position, but your fragile ego is in the way. What is the best way to offset low memory bandwidth? Increase the RAM. And what do you hope? That the 12 GB rumor is wrong and it ends up being 16 GB... to offset the memory bandwidth bottleneck... which is the bottleneck I highlighted on PAGE 1. But YOUR temper tantrum has us on page 21.

I'm not wrong. 

Last edited by Chrkeller - on 08 September 2024

i7-13700k

Vengeance 32 gb

RTX 4090 Ventus 3x E OC

Switch OLED

Chrkeller said:

The ROG Ally X also has 24 GB of RAM and costs $800 with the Z1 Extreme chipset. So yeah, it has little to do with the S2, because the S2 will be much cheaper and less powerful. Funny thing is I knew that would be your response, and part of it is sad given the comparison is disingenuous.

You are shifting the goal post.
The original argument was whether 120GB/s of bandwidth was up to the task. And it is. I've proven it.

Nothing to do with the CPU, nothing to do with RAM capacity, nothing to do with the GPU's capabilities; it was RAM bandwidth.

But if you would really like to see a handheld console with 16GB of Ram run those cherry-picked games...

Here is the Legion Go handheld with 16GB of Ram playing the same games. Same performance.




Keep in mind the Switch 2.0 is likely going to be Nvidia powered, so it's going to have a more efficient ARM CPU and a more efficient Tegra GPU, which can return better results... and a more efficient operating system with better low-level APIs that incentivize efficiency over a bloated Windows/DirectX12 setup.

Chrkeller said:

At the end of the day, can I see developers looking at the S2 maxed out at 112 GB/s and reacting to a potential port with "nah, too much effort"? Yeah, I can. Funny thing is most people in this thread agree... but for some reason I have your panties twisted.

Does it actually matter who has agreed? There are many hilarious instances where groups or even the majority of individuals got together and "agreed" on something that turned out to be wrong. Even when electing a new leader for their nation.

If you surround yourself with people who think like you, then you are going to feed into your own confirmation biases and ignore facts and evidence... It's a reinforcement of your own beliefs rather than a challenge of it.

The perfect example is flat-earthers: individuals who believe the Earth is flat will often surround themselves with other flat-earthers and use each other to reinforce their belief systems, whilst dismissing the evidence provided by science as incorrect.

This is the exact same logic you are using here.

Chrkeller said:

What is the best way to offset low memory bandwidth? Increase the RAM. And what do you hope? That the 12 GB rumor is wrong and it ends up being 16 GB... to offset the memory bandwidth bottleneck... which is the bottleneck I highlighted on PAGE 1. But YOUR temper tantrum has us on page 21.

I'm not wrong. 

You don't "offset" low memory bandwidth with increased RAM capacity.

You mitigate it with another tier of cache, i.e. eSRAM/eDRAM/L4.

This is why AMD has Infinity Cache on its GPUs, this is why Microsoft had eDRAM on the Xbox 360, this is why Nintendo opted for eDRAM on the Wii U, and the GameCube had 1T-SRAM, and more.
https://www.anandtech.com/show/16202/amd-reveals-the-radeon-rx-6000-series-rdna2-starts-at-the-highend-coming-november-18th/2
https://www.anandtech.com/show/1864/inside-microsoft-s-xbox-360/8
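As a rough illustration of why a cache tier helps, here's a toy model (my own sketch, not from the linked articles; all figures are hypothetical) of the blended bandwidth a GPU effectively sees when some fraction of accesses hit a fast on-die cache:

```python
def effective_bandwidth(hit_rate: float, cache_bw: float, dram_bw: float) -> float:
    """Blended bandwidth when `hit_rate` of accesses are served from cache.

    Simplified model: cache hits stream from the on-die tier, misses from DRAM.
    """
    return hit_rate * cache_bw + (1.0 - hit_rate) * dram_bw

# Hypothetical figures: ~102 GB/s LPDDR-class DRAM vs ~1000 GB/s on-die SRAM.
# Even a 50% hit rate several-times multiplies the effective bandwidth.
print(effective_bandwidth(0.5, 1000.0, 102.0))  # 551.0
```

This is the same logic behind AMD's "effective bandwidth" framing for Infinity Cache: the external bus stays narrow, but the average access is much faster.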

And as demonstrated in my previous quote... reducing RAM capacity from 24GB to 16GB had no impact on performance.

RAM capacity never increases performance. It only prevents a "reduction in performance" when there is more data than can be held in RAM... This is basic computing knowledge that has been established over the last 40 years.
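A toy model of that capacity point (my own sketch; the numbers are invented for illustration): capacity only matters once the working set no longer fits, at which point you start paying for streaming data in from slower storage:

```python
def frame_time_ms(working_set_gb: float, ram_gb: float,
                  base_ms: float = 16.7, stall_ms_per_gb: float = 40.0) -> float:
    """Frame time stays flat until the working set spills out of RAM."""
    spill_gb = max(0.0, working_set_gb - ram_gb)
    return base_ms + spill_gb * stall_ms_per_gb

# A 10 GB working set runs identically on 16 GB and 24 GB machines...
assert frame_time_ms(10, 16) == frame_time_ms(10, 24)
# ...extra capacity only prevents a slowdown once the data no longer fits.
```

That's the whole distinction: more RAM moves the cliff further out; it doesn't raise the plateau.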

You are hilariously wrong.

More evidence:



--::{PC Gaming Master Race}::--

Pemalite said:
Chrkeller said:

The ROG Ally X also has 24 GB of RAM and costs $800 with the Z1 Extreme chipset. So yeah, it has little to do with the S2, because the S2 will be much cheaper and less powerful. Funny thing is I knew that would be your response, and part of it is sad given the comparison is disingenuous.

You are shifting the goal post.
The original argument was whether 120GB/s of bandwidth was up to the task. And it is. I've proven it.

Nothing to do with the CPU, nothing to do with RAM capacity, nothing to do with the GPU's capabilities; it was RAM bandwidth.

But if you would really like to see a handheld console with 16GB of Ram run those cherry-picked games...

Here is the Legion Go handheld with 16GB of Ram playing the same games. Same performance.




Keep in mind the Switch 2.0 is likely going to be Nvidia powered, so it's going to have a more efficient ARM CPU and a more efficient Tegra GPU, which can return better results... and a more efficient operating system with better low-level APIs that incentivize efficiency over a bloated Windows/DirectX12 setup.

Chrkeller said:

At the end of the day, can I see developers looking at the S2 maxed out at 112 GB/s and reacting to a potential port with "nah, too much effort"? Yeah, I can. Funny thing is most people in this thread agree... but for some reason I have your panties twisted.

Does it actually matter who has agreed? There are many hilarious instances where groups or even the majority of individuals got together and "agreed" on something that turned out to be wrong. Even when electing a new leader for their nation.

If you surround yourself with people who think like you, then you are going to feed into your own confirmation biases and ignore facts and evidence... It's a reinforcement of your own beliefs rather than a challenge of it.

The perfect example is flat-earthers: individuals who believe the Earth is flat will often surround themselves with other flat-earthers and use each other to reinforce their belief systems, whilst dismissing the evidence provided by science as incorrect.

This is the exact same logic you are using here.

Chrkeller said:

What is the best way to offset low memory bandwidth? Increase the RAM. And what do you hope? That the 12 GB rumor is wrong and it ends up being 16 GB... to offset the memory bandwidth bottleneck... which is the bottleneck I highlighted on PAGE 1. But YOUR temper tantrum has us on page 21.

I'm not wrong. 

You don't "offset" low memory bandwidth with increased RAM capacity.

You mitigate it with another tier of cache, i.e. eSRAM/eDRAM/L4.

This is why AMD has Infinity Cache on its GPUs, this is why Microsoft had eDRAM on the Xbox 360, this is why Nintendo opted for eDRAM on the Wii U, and the GameCube had 1T-SRAM, and more.
https://www.anandtech.com/show/16202/amd-reveals-the-radeon-rx-6000-series-rdna2-starts-at-the-highend-coming-november-18th/2
https://www.anandtech.com/show/1864/inside-microsoft-s-xbox-360/8

And as demonstrated in my previous quote... reducing RAM capacity from 24GB to 16GB had no impact on performance.

RAM capacity never increases performance. It only prevents a "reduction in performance" when there is more data than can be held in RAM... This is basic computing knowledge that has been established over the last 40 years.

You are hilariously wrong.

More evidence:

Lol, bandwidth is a bottleneck, which will push some 3rd party developers away, especially the lazy ones.

Games are only going to get more demanding, not less.  Think less "today" and more 5 years out.

Doesn't increase performance but prevents a reduction in performance... lol. Sounds like different sides of the same coin, but fair enough, I'll endeavour to use "prevents a reduction."

Nothing about my posts is wrong.  I stand by it.  But I'm sure you will zeldaring me again.

Edit

Legion Go is $600 with a Z1...  I don't see the S2 being a $600 system especially with the dock.

Simply put, I think two things can be true, meaning they aren't mutually exclusive. The S2 will be well positioned, but some are expecting way too much performance.

My argument is the hardware will push some 3rd party away.  If you disagree then by default you think no 3rd party companies are going to be disinterested in the S2....  good luck.  

You seem to be under the impression that I'm claiming ports can't happen.  That isn't my claim.  I'm saying many AAA won't happen, not they can't happen.  You might want to understand an argument before going full stalker mode.

To emphasize what should be obvious, I said I don't think it "will" get many newer ports.  I never said the hardware made it impossible.  

Last edited by Chrkeller - on 08 September 2024

i7-13700k

Vengeance 32 gb

RTX 4090 Ventus 3x E OC

Switch OLED

Chrkeller said:
Pemalite said:

You are shifting the goal post.
The original argument was whether 120GB/s of bandwidth was up to the task. And it is. I've proven it.

Nothing to do with the CPU, nothing to do with RAM capacity, nothing to do with the GPU's capabilities; it was RAM bandwidth.

But if you would really like to see a handheld console with 16GB of Ram run those cherry-picked games...

Here is the Legion Go handheld with 16GB of Ram playing the same games. Same performance.




Keep in mind the Switch 2.0 is likely going to be Nvidia powered, so it's going to have a more efficient ARM CPU and a more efficient Tegra GPU, which can return better results... and a more efficient operating system with better low-level APIs that incentivize efficiency over a bloated Windows/DirectX12 setup.

Chrkeller said:

At the end of the day, can I see developers looking at the S2 maxed out at 112 GB/s and reacting to a potential port with "nah, too much effort"? Yeah, I can. Funny thing is most people in this thread agree... but for some reason I have your panties twisted.

Does it actually matter who has agreed? There are many hilarious instances where groups or even the majority of individuals got together and "agreed" on something that turned out to be wrong. Even when electing a new leader for their nation.

If you surround yourself with people who think like you, then you are going to feed into your own confirmation biases and ignore facts and evidence... It's a reinforcement of your own beliefs rather than a challenge of it.

The perfect example is flat-earthers: individuals who believe the Earth is flat will often surround themselves with other flat-earthers and use each other to reinforce their belief systems, whilst dismissing the evidence provided by science as incorrect.

This is the exact same logic you are using here.

You don't "offset" low memory bandwidth with increased RAM capacity.

You mitigate it with another tier of cache, i.e. eSRAM/eDRAM/L4.

This is why AMD has Infinity Cache on its GPUs, this is why Microsoft had eDRAM on the Xbox 360, this is why Nintendo opted for eDRAM on the Wii U, and the GameCube had 1T-SRAM, and more.
https://www.anandtech.com/show/16202/amd-reveals-the-radeon-rx-6000-series-rdna2-starts-at-the-highend-coming-november-18th/2
https://www.anandtech.com/show/1864/inside-microsoft-s-xbox-360/8

And as demonstrated in my previous quote... reducing RAM capacity from 24GB to 16GB had no impact on performance.

RAM capacity never increases performance. It only prevents a "reduction in performance" when there is more data than can be held in RAM... This is basic computing knowledge that has been established over the last 40 years.

You are hilariously wrong.

More evidence:

Lol, bandwidth is a bottleneck, which will push some 3rd party developers away, especially the lazy ones.

Games are only going to get more demanding, not less.  Think less "today" and more 5 years out.

Doesn't increase performance but prevents a reduction in performance... lol. Sounds like different sides of the same coin, but fair enough, I'll endeavour to use "prevents a reduction."

Nothing about my posts is wrong.  I stand by it.  But I'm sure you will zeldaring me again.

Edit

Legion Go is $600 with a Z1...  I don't see the S2 being a $600 system especially with the dock.

Simply put, I think two things can be true, meaning they aren't mutually exclusive. The S2 will be well positioned, but some are expecting way too much performance.

My argument is the hardware will push some 3rd party away.  If you disagree then by default you think no 3rd party companies are going to be disinterested in the S2....  good luck.  

You seem to be under the impression that I'm claiming ports can't happen.  That isn't my claim.  I'm saying many AAA won't happen, not they can't happen.  You might want to understand an argument before going full stalker mode.

To emphasize what should be obvious, I said I don't think it "will" get many newer ports.  I never said the hardware made it impossible.  

There are most definitely some wires crossed between both of your arguments here, and yet you have still not used actual facts to substantiate your point or refute his side of the argument.

Most of what you said is speculation based on anecdotal evidence and your own feelings.

The moment you used "Yet, the Steam Deck can't run games like Space Marine 2 or Dragon's Dogma 2," you implied that a similar device wouldn't be able to support them.

Yet a Legion Go is capable of doing it with a less efficient GPU/CPU than an eventual Switch 2 with a custom Nvidia chipset and API.

Despite the fact you both acquiesced to the bandwidth being a potential bottleneck, real-life applications as demonstrated by Pemalite prove that, apart from the "lazy devs" as you call them, these ports are definitely not out of the realm of possibility for the Switch successor if the currently leaked specs are true.

Probably even less so in the future, considering how efficient and accustomed to the API some of these devs will become.

Even from a proportional standpoint, the Switch 2 hypothetically sits closer to the 9th gen consoles' bandwidth than the Switch 1 did to its 8th gen counterparts. Yet the Switch still got a port of Hogwarts Legacy.

You're certainly not "wrong" that callous publishers unwilling to have their devs work on a sound port are a thing that's gonna happen. There's the money side of the business.

Nonetheless, the point you've tried to get across throughout this whole thread makes me think you genuinely believe porting those current "next-gen" games would be a herculean task that outweighs the cost-benefit of the porting job.

Imo, there's going to be much less "friction," since the industry is already dabbling in ARM architecture ports for mobile devices in general. Tools are more readily available for the task than they were when the Switch came out.

Anyway, I don't think there's much more to be said about the argument in question. Otherwise, it's just gonna circle back around for another page or two.



Switch Friend Code : 3905-6122-2909 

Mar1217 said:
Chrkeller said:

Lol, bandwidth is a bottleneck, which will push some 3rd party developers away, especially the lazy ones.

Games are only going to get more demanding, not less.  Think less "today" and more 5 years out.

Doesn't increase performance but prevents a reduction in performance... lol. Sounds like different sides of the same coin, but fair enough, I'll endeavour to use "prevents a reduction."

Nothing about my posts is wrong.  I stand by it.  But I'm sure you will zeldaring me again.

Edit

Legion Go is $600 with a Z1...  I don't see the S2 being a $600 system especially with the dock.

Simply put, I think two things can be true, meaning they aren't mutually exclusive. The S2 will be well positioned, but some are expecting way too much performance.

My argument is the hardware will push some 3rd party away.  If you disagree then by default you think no 3rd party companies are going to be disinterested in the S2....  good luck.  

You seem to be under the impression that I'm claiming ports can't happen.  That isn't my claim.  I'm saying many AAA won't happen, not they can't happen.  You might want to understand an argument before going full stalker mode.

To emphasize what should be obvious, I said I don't think it "will" get many newer ports.  I never said the hardware made it impossible.  

There are most definitely some wires crossed between both of your arguments here, and yet you have still not used actual facts to substantiate your point or refute his side of the argument.

Most of what you said is speculation based on anecdotal evidence and your own feelings.

The moment you used "Yet, the Steam Deck can't run games like Space Marine 2 or Dragon's Dogma 2," you implied that a similar device wouldn't be able to support them.

Yet a Legion Go is capable of doing it with a less efficient GPU/CPU than an eventual Switch 2 with a custom Nvidia chipset and API.

Despite the fact you both acquiesced to the bandwidth being a potential bottleneck, real-life applications as demonstrated by Pemalite prove that, apart from the "lazy devs" as you call them, these ports are definitely not out of the realm of possibility for the Switch successor if the currently leaked specs are true.

Probably even less so in the future, considering how efficient and accustomed to the API some of these devs will become.

Even from a proportional standpoint, the Switch 2 hypothetically sits closer to the 9th gen consoles' bandwidth than the Switch 1 did to its 8th gen counterparts. Yet the Switch still got a port of Hogwarts Legacy.

You're certainly not "wrong" that callous publishers unwilling to have their devs work on a sound port are a thing that's gonna happen. There's the money side of the business.

Nonetheless, the point you've tried to get across throughout this whole thread makes me think you genuinely believe porting those current "next-gen" games would be a herculean task that outweighs the cost-benefit of the porting job.

Imo, there's going to be much less "friction," since the industry is already dabbling in ARM architecture ports for mobile devices in general. Tools are more readily available for the task than they were when the Switch came out.

Anyway, I don't think there's much more to be said about the argument in question. Otherwise, it's just gonna circle back around for another page or two.

I used SM2 as an example because, at least right now, it does prevent a mobile device from running it. Bear in mind SM2 recommends a 2060, and there are already a few 2025 titles that recommend a 2080, which is a substantial step up. If the Switch 2 launches in 2025 and has a 7-year life, I do question what ports will look like over time. In 2028, will games recommend a 3070? A 3080? Early-gen ports won't be an issue at all for the S2; my concern has only ever been late-gen ports, but perhaps you are correct and tools will still make porting simple. I expect the Switch 2 to be on the market past 2030.

But yeah, not much to say really.  Other than people need to respectfully agree to disagree.



i7-13700k

Vengeance 32 gb

RTX 4090 Ventus 3x E OC

Switch OLED

Chrkeller said:
Mar1217 said:

There are most definitely some wires crossed between both of your arguments here, and yet you have still not used actual facts to substantiate your point or refute his side of the argument.

Most of what you said is speculation based on anecdotal evidence and your own feelings.

The moment you used "Yet, the Steam Deck can't run games like Space Marine 2 or Dragon's Dogma 2," you implied that a similar device wouldn't be able to support them.

Yet a Legion Go is capable of doing it with a less efficient GPU/CPU than an eventual Switch 2 with a custom Nvidia chipset and API.

Despite the fact you both acquiesced to the bandwidth being a potential bottleneck, real-life applications as demonstrated by Pemalite prove that, apart from the "lazy devs" as you call them, these ports are definitely not out of the realm of possibility for the Switch successor if the currently leaked specs are true.

Probably even less so in the future, considering how efficient and accustomed to the API some of these devs will become.

Even from a proportional standpoint, the Switch 2 hypothetically sits closer to the 9th gen consoles' bandwidth than the Switch 1 did to its 8th gen counterparts. Yet the Switch still got a port of Hogwarts Legacy.

You're certainly not "wrong" that callous publishers unwilling to have their devs work on a sound port are a thing that's gonna happen. There's the money side of the business.

Nonetheless, the point you've tried to get across throughout this whole thread makes me think you genuinely believe porting those current "next-gen" games would be a herculean task that outweighs the cost-benefit of the porting job.

Imo, there's going to be much less "friction," since the industry is already dabbling in ARM architecture ports for mobile devices in general. Tools are more readily available for the task than they were when the Switch came out.

Anyway, I don't think there's much more to be said about the argument in question. Otherwise, it's just gonna circle back around for another page or two.

I used SM2 as an example because, at least right now, it does prevent a mobile device from running it. Bear in mind SM2 recommends a 2060, and there are already a few 2025 titles that recommend a 2080, which is a substantial step up. If the Switch 2 launches in 2025 and has a 7-year life, I do question what ports will look like over time. In 2028, will games recommend a 3070? A 3080? Early-gen ports won't be an issue at all for the S2; my concern has only ever been late-gen ports, but perhaps you are correct and tools will still make porting simple. I expect the Switch 2 to be on the market past 2030.

But yeah, not much to say really.  Other than people need to respectfully agree to disagree.

I guess it's a valid concern, one I would even join you on if I actually had any interest in most of those graphics-intensive experiences, especially the ones coming from the western hemisphere.

Though I also believe we're reaching an age where fewer devs and publishers will go out of their way for top-of-the-line graphics. Scalability will be even more expected in the future than it is now.

Anywoo, the main experiences I care about on the Switch successor will mostly be the first party games. Those are about to see quite a leap coming off the aging Switch.

I believe it's the same for you, considering you'll get most of your 3rd party games on PC anyway :P



Switch Friend Code : 3905-6122-2909