
Forums - Nintendo Discussion - VGC: Switch 2 Was Shown At Gamescom Running Matrix Awakens UE5 Demo

zeldaring said:

The chance it has the 2024 tech is like 1%. Nintendo loves to make a profit from day one, and the T239 is already a huge jump from the original Tegra X1, so there's no reason for them to go super high-tech, especially when every rumor says it won't even be an OLED display. I'll happily eat crow though, and might even consider getting a Switch 2 day 1, but it's not the Nintendo way.

The issue is that Orin is getting old (almost 6 years old from Nvidia's perspective by 2024), and Nvidia cancelled Atlan (Orin's successor, which was supposed to release next year). It's possible that Nvidia wants to deprecate Orin, given its age, which would mean the entire manufacturing process is set to expire (no new orders) so that investment can be made elsewhere. 

If that is the case then Nvidia and Nintendo would have to find a new deal, and given the way GPU binning works it wouldn't necessarily cost a lot more to use a Lovelace chip. In fact, it might actually cost more in the long term to use the older technology if Nvidia needs to keep manufacturing processes open five years from now that would otherwise have been shut down, just because they support the Switch 2. See (note that the main target for Tegra chips is auto manufacturers): https://getjerry.com/insights/why-cant-automakers-use-newer-chips-in-stock

"Even the chip manufacturers themselves are requesting that car manufacturers update their technology and make the switch to chips that are easier to produce."

I don't see how the Switch 2 not having an OLED helps your point. Cost savings from not having an OLED could be used elsewhere.

Just for context the architecture (Maxwell) used for the Tegra X1 in the Switch was only 2 years old when the Switch released. Lovelace will be 2 years old in late 2024. 

Last edited by sc94597 - on 10 September 2023

zeldaring said:
Chrkeller said:

To each their own, but not only do I think PC gamers figured out the diminishing returns of resolution, they figured it out way before console gamers did.  Think about it.  The PS5 and Xbox are pushing for 4K...  PC gamers have long settled at 1440p and are pushing for 120 fps...  because 1440p to 4K is a minimal jump.  Meanwhile, 30 fps to 120 fps is the Grand Canyon.  PC gamers are also chasing strong effects, as seen with ray tracing.

 Though perhaps we are saying the same thing.  Resolution has hit a ceiling.  The future is high fps, RT (when done properly), etc.  

When I say diminishing returns I'm talking about on a small screen. TOTK looks nice on a small screen; on a TV, not so much. At the same time, Dead Space and Hogwarts Legacy look like shit on the Steam Deck, but for the most part I hope the Switch 2 won't see games as ugly as The Witcher 3 or Doom. 

How close are you sitting to that big screen? The bigger the screen, the further away you're meant to sit from it. I sit about 3-4 meters away from a 55-inch screen, and unless you're really looking, it's hard to tell there is a difference even between a DVD and a Blu-ray. 
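
As a rough back-of-the-envelope check on the viewing-distance point, here is a minimal sketch. The ~60 pixels-per-degree figure commonly cited for 20/20 acuity and the 3.5 m distance (the middle of the range mentioned above) are assumptions for illustration only:

```python
import math

def pixels_per_degree(diagonal_in, horizontal_px, distance_m, aspect=(16, 9)):
    """Approximate pixels per degree of visual angle for a flat 16:9 screen."""
    aw, ah = aspect
    width_m = diagonal_in * 0.0254 * aw / math.hypot(aw, ah)  # screen width in meters
    px_width_m = width_m / horizontal_px                      # width of one pixel
    one_px_deg = 2 * math.degrees(math.atan(px_width_m / 2 / distance_m))
    return 1.0 / one_px_deg

# 55-inch screen viewed from 3.5 m; 20/20 vision resolves roughly 60 px/degree,
# so once both formats exceed that, the difference is hard to perceive.
for label, px in [("1080p", 1920), ("4K", 3840)]:
    print(f"{label}: ~{pixels_per_degree(55, px, 3.5):.0f} pixels per degree")
```

At that distance both formats land well past the acuity threshold, which is consistent with the point above.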

zeldaring said:
sc94597 said:

It isn't clear what you think a video about 4k DLSS has to do with the post you're quoting. 

Nobody is talking about running the game at a 4k target. 1080p was the target being discussed. 

I don't know where you got the idea that "DLSS looks better at 30fps", but it is wrong. There is an argument to be made that the higher the framerate, the better a temporal solution holds up (because the motion vectors are closer to the mark), but with DLSS (as opposed to heuristic-based TAAU) this is a marginal factor. 

Also, none of my posts have been talking about anything close to miracles. A portable system likely releasing at the end of 2024, running at about the performance of a low-end 75W desktop GPU based on a 2018 architecture, isn't a "miracle." It's the by-product of two die shrinks (12nm -> 7nm -> 5nm) and the fact that the VRAM bottleneck of said GPU wouldn't be an issue for the portable system (4GB -> 8-10GB). It's precisely what one would expect to happen. 

You expect that in a console that will be $349 and the same size as a Switch? Not to mention Nintendo is gonna be using something that should have come out in 2020-2021 but they cancelled their plans, so it's not like it's 2024 tech. To me it sounds like wishful thinking and not being realistic. Aside from that, if you watch the video he says DLSS works much better at 30fps if you wanna use ultra performance. 

He didn't say that; he said that if you're targeting 4K output, 30fps is better because it leaves more render time per frame. However, if you want to aim for 60fps, perhaps the max output Nintendo will opt for in their games is 1440p. This is all based on their guess of what the tech will be.
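
To make the render-time argument concrete, here is a minimal frame-budget sketch. The 3 ms upscale cost is a placeholder assumption, not a measured DLSS figure; the point is only that a roughly fixed post-processing cost consumes a much larger share of a 60fps budget than a 30fps one:

```python
def budget_share(target_fps, fixed_upscale_ms):
    """Split a frame budget into upscale cost and remaining render time."""
    frame_budget_ms = 1000.0 / target_fps
    remaining_ms = frame_budget_ms - fixed_upscale_ms
    return frame_budget_ms, remaining_ms, fixed_upscale_ms / frame_budget_ms

for fps in (30, 60):
    budget, remaining, share = budget_share(fps, fixed_upscale_ms=3.0)  # assumed 3 ms
    print(f"{fps} fps: {budget:.1f} ms budget, {remaining:.1f} ms left to render, "
          f"upscale = {share:.0%} of the frame")
```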



Soundwave said:
Chrkeller said:

People need to stop using teraflops. It is a worthless measurement. The PS4 Pro has more flops than the Series S; the PS4 Pro isn't more powerful.

You can't have it both ways. 

The reason the XBox Series S is able to display graphics "better" than an XBox One X, despite the XB1X having higher teraflop performance, is because the Series S has a more modern feature set; so even though it has only a third of the teraflop performance of the Series X, it can run the same games. 

But when it's pointed out that the Switch 2 will have a similar kind of massive improvement over the PS4 by being a more modern architecture (probably actually better than the PS5/XBX/XBS), the excuse becomes "well, that's not really a big deal, it's still a PS4". 

Yet the XBox Series S, of course, gets the benefit of the doubt. How convenient. A more modern feature set makes a massive difference; it can even basically push one console (Series S) into what's considered a different generation from another (XBox One X). 

I also would not be so sure that something like the XBox One X can't run modern games either. The Pro models for the PS4/XB1 are stuck basically having to run PS4/XB1 versions of games with just resolution/RAM improvements; they were never intended to have exclusive software built specifically for their Polaris hardware (which is a lot better than the PS4/XB1's GCN2 garbage). No one's made a game expressly for the XBox One X ... I bet it could run a game like Starfield at 720p probably just fine, but since it gets no exclusives separate from the base XBox One, it'll never happen (Sony/MS have also basically cleared all stock of these models). 

The Xbox One X absolutely could not run modern games like Starfield. You're ignoring the massive difference in CPU and I/O performance, which matters WAY more for why the Series S is ahead of the Xbox One X. Starfield pushes CPUs hard, which is why it's locked to 30fps on Xbox Series, and it doesn't run properly on a hard drive, with the game occasionally freezing for 1-2 seconds when portions of the map get loaded. You're clearly ignorant when it comes to this stuff, so you should do some more research before making more posts on this sort of topic.

Last edited by Norion - on 10 September 2023
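
For readers following the teraflops back-and-forth above: the headline number is just shader count × 2 ops per clock (FMA) × clock speed, and it says nothing about how much useful work each FLOP does on a given architecture. Below is a minimal sketch using the public shader counts and clocks of the two consoles mentioned; the per-FLOP efficiency factors are made-up placeholders purely to show why the raw comparison breaks down:

```python
def tflops(shader_cores, clock_ghz, ops_per_clock=2):
    """Peak single-precision TFLOPS: cores * ops per clock (FMA = 2) * clock."""
    return shader_cores * ops_per_clock * clock_ghz / 1000.0

# Public shader counts and GPU clocks (approximate).
ps4_pro = tflops(2304, 0.911)   # GCN-era GPU
series_s = tflops(1280, 1.565)  # RDNA 2 GPU
print(f"PS4 Pro  ~{ps4_pro:.1f} TFLOPS")
print(f"Series S ~{series_s:.1f} TFLOPS")

# Hypothetical per-FLOP efficiency factors (placeholders, not measurements)
# to illustrate how a newer architecture can do more with fewer FLOPS.
for name, peak, eff in [("PS4 Pro", ps4_pro, 1.0), ("Series S", series_s, 1.3)]:
    print(f"{name}: effective ~{peak * eff:.1f} 'GCN-equivalent' TFLOPS")
```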

Norion said:
Soundwave said:

You can't have it both ways. 

The reason the XBox Series S is able to display graphics "better" than an XBox One X, despite the XB1X having higher teraflop performance, is because the Series S has a more modern feature set; so even though it has only a third of the teraflop performance of the Series X, it can run the same games. 

But when it's pointed out that the Switch 2 will have a similar kind of massive improvement over the PS4 by being a more modern architecture (probably actually better than the PS5/XBX/XBS), the excuse becomes "well, that's not really a big deal, it's still a PS4". 

Yet the XBox Series S, of course, gets the benefit of the doubt. How convenient. A more modern feature set makes a massive difference; it can even basically push one console (Series S) into what's considered a different generation from another (XBox One X). 

I also would not be so sure that something like the XBox One X can't run modern games either. The Pro models for the PS4/XB1 are stuck basically having to run PS4/XB1 versions of games with just resolution/RAM improvements; they were never intended to have exclusive software built specifically for their Polaris hardware (which is a lot better than the PS4/XB1's GCN2 garbage). No one's made a game expressly for the XBox One X ... I bet it could run a game like Starfield at 720p probably just fine, but since it gets no exclusives separate from the base XBox One, it'll never happen (Sony/MS have also basically cleared all stock of these models). 

The Xbox One X absolutely could not run modern games like Starfield. You're ignoring the massive difference in CPU and I/O performance, which matters WAY more for why the Series S is ahead of the Xbox One X. Starfield pushes CPUs hard, which is why it's locked to 30fps on Xbox Series, and it doesn't run properly on a hard drive, with the game occasionally freezing for 1-2 seconds when portions of the map get loaded. You're clearly ignorant when it comes to this stuff, so you should do some more research before making more posts on this sort of topic.

Firstly, my main point in that post stands; I notice you don't want to debate any part of it. 

On Starfield, it runs on a ROG Ally (a portable hybrid machine). I think it probably could run on an XBox One X, but there were really no games made just for the XBox One X, so I don't think we'll ever really know what its full potential was. 

The main reason it may not work on the XBox One X is not the compute strength of the GPU (which is decent) but the shit Jaguar CPU cores that it inherited from the PS4/XB1. And the Switch 2 will likely blow those crap Jaguar cores out of the water too (ARM A78). 



Soundwave said:
Norion said:

The Xbox One X absolutely could not run modern games like Starfield. You're ignoring the massive difference in CPU and I/O performance, which matters WAY more for why the Series S is ahead of the Xbox One X. Starfield pushes CPUs hard, which is why it's locked to 30fps on Xbox Series, and it doesn't run properly on a hard drive, with the game occasionally freezing for 1-2 seconds when portions of the map get loaded. You're clearly ignorant when it comes to this stuff, so you should do some more research before making more posts on this sort of topic.

Firstly, my main point in that post stands; I notice you don't want to debate any part of it. 

On Starfield, it runs on a ROG Ally (a portable hybrid machine). I think it probably could run on an XBox One X, but there were really no games made just for the XBox One X, so I don't think we'll ever really know what its full potential was. 

The problem for the One X and PS4 Pro was that their CPUs were pretty trash and therefore bottlenecked their very decent graphics processors.

On paper the GPU in the One X was on par with an RX 580, but in practice, in any CPU-bound game, performance wasn't anywhere near what you'd expect from a 580.

It'd be like trying to run Starfield on a PC with a heavily underclocked FX-8100 + RX 580. The closest thing I can find to that is this overclocked FX-8300 + RX 5700 XT (90% better than an RX 580), and it isn't the most stable 30fps experience. 

Compare that to how the 5700 XT runs the game with a decent CPU.  

The ROG Ally has an excellent CPU/GPU combo in comparison. The GPU isn't as good as the Xbox One X's (about 25% slower), but the CPU doesn't bottleneck it into a stuttery mess. 

Edit: Oops, missed that you made that point in your post. 
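
A crude way to see why the pairing matters more than the GPU alone: in a mixed workload, frame rate is roughly capped by whichever of the CPU or GPU takes longer per frame. The millisecond costs below are placeholder values, not measurements of any of the machines discussed:

```python
# Crude bottleneck model: achievable fps is limited by the slower of the
# CPU frame time and the GPU frame time. All numbers are placeholders.
def achievable_fps(cpu_ms_per_frame, gpu_ms_per_frame):
    return 1000.0 / max(cpu_ms_per_frame, gpu_ms_per_frame)

configs = {
    "strong GPU + weak CPU (One X-like pairing)": (45.0, 20.0),
    "weaker GPU + strong CPU (Ally-like pairing)": (12.0, 28.0),
}
for name, (cpu_ms, gpu_ms) in configs.items():
    print(f"{name}: ~{achievable_fps(cpu_ms, gpu_ms):.0f} fps "
          f"(CPU {cpu_ms} ms, GPU {gpu_ms} ms per frame)")
```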



sc94597 said:
zeldaring said:

The chance it has the 2024 tech is like 1%. Nintendo loves to make a profit from day one, and the T239 is already a huge jump from the original Tegra X1, so there's no reason for them to go super high-tech, especially when every rumor says it won't even be an OLED display. I'll happily eat crow though, and might even consider getting a Switch 2 day 1, but it's not the Nintendo way.

The issue is that Orin is getting old (almost 6 years old from Nvidia's perspective by 2024), and Nvidia cancelled Atlan (Orin's successor, which was supposed to release next year). It's possible that Nvidia wants to deprecate Orin, given its age, which would mean the entire manufacturing process is set to expire (no new orders) so that investment can be made elsewhere. 

If that is the case then Nvidia and Nintendo would have to find a new deal, and given the way GPU binning works it wouldn't necessarily cost a lot more to use a Lovelace chip. In fact, it might actually cost more in the long term to use the older technology if Nvidia needs to keep manufacturing processes open five years from now that would otherwise have been shut down, just because they support the Switch 2. See (note that the main target for Tegra chips is auto manufacturers): https://getjerry.com/insights/why-cant-automakers-use-newer-chips-in-stock

"Even the chip manufacturers themselves are requesting that car manufacturers update their technology and make the switch to chips that are easier to produce."

I don't see how the Switch 2 not having an OLED helps your point. Cost savings from not having an OLED could be used elsewhere.

Just for context the architecture (Maxwell) used for the Tegra X1 in the Switch was only 2 years old when the Switch released. Lovelace will be 2 years old in late 2024. 

Remember there is still a chip shortage, and Nintendo wants to have as many Switches as possible. I wouldn't be surprised if Nintendo has been stockpiling these Orin chips for a year or two so they don't have any supply problems, which would be smart.

Nintendo went with the Tegra because they got a mind-blowing deal, and those chips were basically useless to Nvidia.

Last edited by zeldaring - on 10 September 2023

zeldaring said:
sc94597 said:

The issue is that Orin is getting old (almost 6 years old from Nvidia's perspective by 2024), and Nvidia cancelled Atlan (Orin's successor, which was supposed to release next year). It's possible that Nvidia wants to deprecate Orin, given its age, which would mean the entire manufacturing process is set to expire (no new orders) so that investment can be made elsewhere. 

If that is the case then Nvidia and Nintendo would have to find a new deal, and given the way GPU binning works it wouldn't necessarily cost a lot more to use a Lovelace chip. In fact, it might actually cost more in the long term to use the older technology if Nvidia needs to keep manufacturing processes open five years from now that would otherwise have been shut down, just because they support the Switch 2. See (note that the main target for Tegra chips is auto manufacturers): https://getjerry.com/insights/why-cant-automakers-use-newer-chips-in-stock

"Even the chip manufacturers themselves are requesting that car manufacturers update their technology and make the switch to chips that are easier to produce."

I don't see how the Switch 2 not having an OLED helps your point. Cost savings from not having an OLED could be used elsewhere.

Just for context the architecture (Maxwell) used for the Tegra X1 in the Switch was only 2 years old when the Switch released. Lovelace will be 2 years old in late 2024. 

Remember there is still a chip shortage, and Nintendo wants to have as many Switches as possible. I wouldn't be surprised if Nintendo has been stockpiling these Orin chips for a year or two so they don't have any supply problems, which would be smart.

Moving to 5nm would actually make more sense under the conditions of a chip shortage. Sony switched to 6nm to alleviate its shortage, for example. Why? Because the 5nm and 6nm nodes have fewer fabrication defects than the 7nm node.  

https://www.tomshardware.com/news/playstation-5-refresh-boasts-new-6nm-amd-oberon-plus-soc

Lastly, Angstronomics also observes that the PS5 is the first of the big three current-gen consoles to get a 6nm chip and that Sony is getting nearly 50% more PS5 chips per wafer than Microsoft with its Xbox Series X processors. Even so, Sony, with its cheaper silicon bill and new lower BOM, recently pushed price hikes worldwide (except in the U.S.).

It is also an example of how going to a recent node could save money. 

https://www.pcmag.com/news/new-ps5-model-uses-more-efficient-oberon-plus-6nm-chip

The move to a 6nm chip means the logic transistor density increased by 18.8% and the die size shrunk from 300 square millimeters to just 270 (roughly 15% smaller). Combined, it means the CPU requires less power and produces less heat, which led to Sony introducing a smaller, cheaper cooling solution. The other benefit of the smaller chip for Sony is the fact 20% more of them can be produced per wafer, with little difference in production cost.
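
As a rough sanity check on the quoted wafer numbers, here is a minimal dies-per-wafer and yield sketch. The defect density is an assumed placeholder (real per-node figures aren't public), and the standard approximation below ignores scribe lines and edge exclusion:

```python
import math

def dies_per_wafer(die_area_mm2, wafer_diameter_mm=300):
    """Standard approximation for gross die candidates per wafer."""
    r = wafer_diameter_mm / 2
    return (math.pi * r**2 / die_area_mm2
            - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

def poisson_yield(die_area_mm2, defects_per_cm2):
    """Simple Poisson yield model: fraction of dies with zero defects."""
    return math.exp(-defects_per_cm2 * die_area_mm2 / 100.0)

# Die sizes from the quoted article; the defect density is an assumed placeholder.
for label, area in [("7nm Oberon, 300 mm^2", 300.0), ("6nm Oberon Plus, 270 mm^2", 270.0)]:
    gross = dies_per_wafer(area)
    good = gross * poisson_yield(area, defects_per_cm2=0.1)
    print(f"{label}: ~{gross:.0f} candidates, ~{good:.0f} good dies per 300 mm wafer")

# The area shrink alone buys roughly 10-15% more die candidates; a lower defect
# density on the newer node (not modelled here) pushes the gap further toward
# the ~20% figure the article cites.
```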

Last edited by sc94597 - on 10 September 2023

Soundwave said:
Norion said:

The Xbox One X absolutely could not run modern games like Starfield. You're ignoring the massive difference in CPU and I/O performance, which matters WAY more for why the Series S is ahead of the Xbox One X. Starfield pushes CPUs hard, which is why it's locked to 30fps on Xbox Series, and it doesn't run properly on a hard drive, with the game occasionally freezing for 1-2 seconds when portions of the map get loaded. You're clearly ignorant when it comes to this stuff, so you should do some more research before making more posts on this sort of topic.

Firstly, my main point in that post stands; I notice you don't want to debate any part of it. 

On Starfield, it runs on a ROG Ally (a portable hybrid machine). I think it probably could run on an XBox One X, but there were really no games made just for the XBox One X, so I don't think we'll ever really know what its full potential was. 

The main reason it may not work on the XBox One X is not the compute strength of the GPU (which is decent) but the shit Jaguar CPU cores that it inherited from the PS4/XB1. And the Switch 2 will likely blow those crap Jaguar cores out of the water too (ARM A78). 

Because there was nothing to say about it; it's a fact that the Switch 2 will benefit from architecture improvements, while the part I replied to was completely wrong. And again, it absolutely would not run on a One X; it's not a "may." The ROG Ally has a much better CPU and an SSD, while the One X has a terrible CPU and a slow hard drive, and those things matter massively. A somewhat capable GPU combined with a very low-end CPU and a slow hard drive is not fit to run modern games designed around modern hardware, like Starfield.

You're right that the Switch 2 will destroy the Xbox One and PS4 in terms of CPU performance, but again, I think you need to do more research on this subject before getting so involved in arguments over it.

Last edited by Norion - on 11 September 2023

I very much doubt the Switch 2 is going to have more "realized power per TFLOP" than the Series S or X, let alone the PS5. Being more advanced and recent will not make up for the downsides associated with its handheld nature. So even when disregarding the deficiencies in CPU, storage system, and RAM, I would expect the Switch 2's GPU to be less efficient than the current consoles' GPUs.

The PS5's so-called "RDNA 1.5" has proven to be more performant "per TFLOP" (and per buck) than the Series S or X with their Velocity Architecture, ML acceleration, DirectStorage, hardware-accelerated VRS, true RDNA 2, and other buzzwords. The PS5's supposedly inferior RDNA implementation (which was supposed to overheat and throttle to just 8 TFLOPS according to "experts") is literally more efficient according to real-world results.

Even the PS4, with its outdated GPU design and terrible CPU, holds its own against the Steam Deck, which also has twice the RAM (but at lower speeds). Per TFLOP, the PS4's GPU is more or less as performant/efficient as the much more technologically advanced Steam Deck GPU.

I think some of you are underestimating handheld limitations.
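
For what it's worth, "realized power per TFLOP" can be made concrete by normalizing a measured frame rate by peak TFLOPS for the same game and settings. Every number below is a made-up placeholder purely to show the metric, not a real benchmark result:

```python
# "Realized performance per TFLOP" = measured fps / peak TFLOPS, same game and
# settings on each system. All values are placeholders, not measurements.
systems = {
    "home console A": {"tflops": 10.0, "fps": 60.0},
    "home console B": {"tflops": 4.0,  "fps": 24.0},
    "handheld C":     {"tflops": 1.6,  "fps": 8.0},
}
for name, s in systems.items():
    print(f"{name}: {s['fps'] / s['tflops']:.1f} fps per TFLOP")
```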



Kyuu said:

I very much doubt the Switch 2 is going to have more "realized power per TFLOP" than the Series S or X, let alone the PS5. Being more advanced and recent will not make up for the downsides associated with its handheld nature. So even when disregarding the deficiencies in CPU, storage system, and RAM, I would expect the Switch 2's GPU to be less efficient than the current consoles' GPUs.

The PS5's so-called "RDNA 1.5" has proven to be more performant "per TFLOP" (and per buck) than the Series S or X with their Velocity Architecture, ML acceleration, DirectStorage, hardware-accelerated VRS, true RDNA 2, and other buzzwords. The PS5's supposedly inferior RDNA implementation (which was supposed to overheat and throttle to just 8 TFLOPS according to "experts") is literally more efficient according to real-world results.

Even the PS4, with its outdated GPU design and terrible CPU, holds its own against the Steam Deck, which also has twice the RAM (but at lower speeds). Per TFLOP, the PS4's GPU is more or less as performant/efficient as the much more technologically advanced Steam Deck GPU.

I think some of you are underestimating handheld limitations.

Exactly, all it takes is some common sense. Look at the Steam Deck, which is probably losing money on hardware. The goal was to build a handheld PC, and the thing is massive compared to the Switch. You think Nintendo, which hasn't cared about power for 20 years and killed it with the Switch, is gonna go all out and produce the most powerful handheld by miles in a much smaller form factor?