
VGC: Switch 2 Was Shown At Gamescom Running Matrix Awakens UE5 Demo

Soundwave said:

Updated the title to fix the error there.

More updates today: ResetEra industry insider Nate the Great has put out a podcast and says he has also heard the Matrix Awakens demo was shown, and he sheds more light on the Zelda: BOTW demo:



He says The Matrix Awakens was running with ray tracing and without ray reconstruction. I don't really know how this is possible, but that's what he's saying. If that's the case, then I almost wonder if this has to be Lovelace architecture and not Ampere. Kopite, when he first leaked the Tegra T239, said it was Lovelace, but I think we all just assumed Ampere? If it's Lovelace, I think that means it has to be a 4nm chip, because Lovelace is 4nm or lower only, from what I understand.

Lovelace did start shipping in fall 2022 (Nvidia 40 series cards), so if the Switch 2 is fall 2024, that would be a two-year gap ... which isn't actually too far off from the Tegra X1 releasing in 2015 (Maxwell, 20nm process) and the Switch launching in early 2017 (probably not unreasonable to think Nintendo was targeting holiday 2016 and just missed it because the software wasn't ready). The other can of worms Lovelace opens up is that it would potentially open the door to DLSS Frame Generation (I said potentially), as that is supported by the Lovelace architecture, which could alter performance quite a bit in a good way.
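To put rough numbers on what Frame Generation could buy, here's a minimal sketch; the 30 fps base figure is purely hypothetical, not a leaked spec:

```python
# Sketch of what DLSS Frame Generation means for presented frame rate.
# The base render rate here is hypothetical, for illustration only.

def presented_fps(rendered_fps: float, frame_gen: bool) -> float:
    """Frame Generation inserts one AI-interpolated frame between each
    pair of rendered frames, roughly doubling the presented rate."""
    return rendered_fps * 2 if frame_gen else rendered_fps

print(presented_fps(30.0, frame_gen=False))  # 30.0 -- native
print(presented_fps(30.0, frame_gen=True))   # 60.0 -- with FG
# Caveat: input latency still tracks the 30 fps render rate.
```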

The Zelda: BOTW demo was 4K 60 fps using DLSS, but the interesting quirk was that all the load time was eliminated. So, as I've said before, I suspect Nintendo is using some faster internal storage, maybe UFS 3.1 or 4? An internal NVMe seems like it would be too expensive ... but who knows.
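For a sense of scale, a rough back-of-the-envelope using spec-sheet peak sequential reads (the 3 GB asset size is a made-up example, and real loads also hinge on CPU decompression):

```python
# Back-of-the-envelope load-time comparison using spec-sheet peak
# sequential read speeds. The 3 GB asset size is hypothetical, and
# real-world loads also depend on decompression, seeks, etc.

SPEEDS_MB_S = {
    "Switch-class eMMC (approx.)": 300,
    "UFS 3.1": 2100,
    "UFS 4.0": 4200,
    "NVMe (PCIe 3.0 x4)": 3500,
}

ASSET_MB = 3000  # hypothetical 3 GB of level data

for name, mb_s in SPEEDS_MB_S.items():
    print(f"{name}: {ASSET_MB / mb_s:.1f} s")
# Switch-class eMMC (approx.): 10.0 s
# UFS 3.1: 1.4 s
# UFS 4.0: 0.7 s
# NVMe (PCIe 3.0 x4): 0.9 s
```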



A person on Reddit posted a very detailed speculative case for why 4nm makes the most sense, from both a business and a technological perspective, for both Nintendo and Nvidia.

https://pastebin.com/V5nTeh4h

The prices of NVMe drives have been crashing in the last 6 months or so. A regular consumer can get a 512GB one for about $25, and Nintendo would get it even cheaper.



sc94597 said:
Soundwave said:

Updated the title to fix the error there.

More updates today: ResetEra industry insider Nate the Great has put out a podcast and says he has also heard the Matrix Awakens demo was shown, and he sheds more light on the Zelda: BOTW demo:



He says The Matrix Awakens was running with ray tracing and without ray reconstruction. I don't really know how this is possible, but that's what he's saying. If that's the case, then I almost wonder if this has to be Lovelace architecture and not Ampere. Kopite, when he first leaked the Tegra T239, said it was Lovelace, but I think we all just assumed Ampere? If it's Lovelace, I think that means it has to be a 4nm chip, because Lovelace is 4nm or lower only, from what I understand.

Lovelace did start shipping in fall 2022 (Nvidia 40 series cards), so if the Switch 2 is fall 2024, that would be a two-year gap ... which isn't actually too far off from the Tegra X1 releasing in 2015 (Maxwell, 20nm process) and the Switch launching in early 2017 (probably not unreasonable to think Nintendo was targeting holiday 2016 and just missed it because the software wasn't ready). The other can of worms Lovelace opens up is that it would potentially open the door to DLSS Frame Generation (I said potentially), as that is supported by the Lovelace architecture, which could alter performance quite a bit in a good way.

The Zelda: BOTW demo was 4K 60 fps using DLSS, but the interesting quirk was that all the load time was eliminated. So, as I've said before, I suspect Nintendo is using some faster internal storage, maybe UFS 3.1 or 4? An internal NVMe seems like it would be too expensive ... but who knows.



A person on Reddit posted a very detailed speculative case for why 4nm makes the most sense, from both a business and a technological perspective, for both Nintendo and Nvidia.

https://pastebin.com/V5nTeh4h

Yeah, I'm starting to think that too, especially with the Matrix demo. It's actually not far off from what the Tegra X1 Maxwell was, even for early 2017, to be honest ... it was still a cutting-edge chip when it came out in the Switch. Really, only the Apple A9X, which was a monster processor for Apple, was comparable at the time.

When people act like the Tegra X1 was some garbage chip that Nintendo just slapped into the Switch, that's pretty misleading. The Tegra X1 was the best mobile chip, period, when it released in 2015; it was a monster chip. And it was still very high end by late 2016/early 2017 for the Switch. Mobile chip tech has boomed in the years since then too. Look at what Apple is doing today: their M1 Max gets the performance of high-end PC laptops with 30 series cards at less than 1/3 of the power consumption (incredible).

If Sony had made a Vita 2 or PSP3 in 2017, they would not have been able to significantly outperform a Tegra X1; it's not just a case of "well, Nintendo used this ancient tech". For the time, it was a pretty big-ticket processor in the mobile world.

So I'm not sure why some people are so hung up on the idea that Nintendo could not use a higher-end mobile processor for 2024. They already did exactly that in 2017. It just so happens that mobile tech has improved a lot since then, and Nvidia's feature set has improved massively since 2015 with things like RTX and DLSS.

Last edited by Soundwave - on 11 September 2023

Chrkeller said:

The Steam Deck runs FFVII at 720p with lower quality settings. I still don't get how this matches the PS4. If the argument is that the differences don't matter, no issue, given that's subjective. But it is entirely false to say it matches from a technical perspective. And 720p in handheld is fine; on a large TV, no thanks. I'll stick with my home console/PC.

I still think we are conflating two things: whether "matching" means there are no differences, or whether people mean they don't personally care about the differences. Those are two completely different arguments.

Here is a demo of Unreal 4 for Tegra; it looks like it can match PS4 graphics, just like people are thinking the Switch 2 will match consoles because of the Unreal 5 demo.

Conclusion: demos are useless, and the Zelda demo tells us very little. It would be really impressive if it was 4K native.



This is the Unreal Engine demo (Elemental) for the Tegra X1. I would say it's not unreasonable for the Switch to pull this off, nor would anyone fall out of their chair if there was a Switch game with a real-time cutscene like that. It's at the higher end of what you would expect from the hardware, but I don't think people would say it's impossible:

Mobile chip tech has gotten massively better over the last 7-8 years; the Tegra X1 and Apple A9X really kicked it off, and today you see processors like the M2 Max, which are ridiculous.

Getting to 16nm and lower really was a game changer for mobile chip tech.

Last edited by Soundwave - on 11 September 2023

Soundwave said:

This is the Unreal Engine demo (Elemental) for the Tegra X1. I would say it's not unreasonable for the Switch to pull this off, nor would anyone fall out of their chair if there was a Switch game with a real-time cutscene like that. It's at the higher end of what you would expect from the hardware, but I don't think people would say it's impossible:

Mobile chip tech has gotten massively better over the last 7-8 years; the Tegra X1 and Apple A9X really kicked it off, and today you see processors like the M2 Max, which are ridiculous.

Getting to 16nm and lower really was a game changer for mobile chip tech.

The point is, you would look at that demo and think the Switch is capable of PS4-like graphics in real-world performance, when it wasn't really close.



zeldaring said:
Soundwave said:

This is the Unreal Engine demo (Elemental) for the Tegra X1. I would say it's not unreasonable for the Switch to pull this off, nor would anyone fall out of their chair if there was a Switch game with a real-time cutscene like that. It's at the higher end of what you would expect from the hardware, but I don't think people would say it's impossible:

Mobile chip tech has gotten massively better over the last 7-8 years; the Tegra X1 and Apple A9X really kicked it off, and today you see processors like the M2 Max, which are ridiculous.

Getting to 16nm and lower really was a game changer for mobile chip tech.

The point is, you would look at that demo and think the Switch is capable of PS4-like graphics in real-world performance, when it wasn't really close.

I mean, it can display that level of graphics, so ...? I don't think anyone watches that and goes "yeah, no way can a Tegra X1 do that" today.

Unfortunately, I don't think we got to see max Tegra X1 performance from the Switch either, because the 20nm process was a bit of a curse for Nvidia (it ran too inefficiently and then got overshadowed by 16nm). At its full 1 GHz clock the chip should output about 512 GFLOPS, not the ~393 GFLOPS the Switch gets at its 768 MHz docked clock; the Mariko Switch is actually capable of going higher than that (probably around 600 GFLOPS).
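Those figures fall out of the standard FP32 formula (CUDA cores × 2 FMA ops per clock × clock speed); quick sketch below, where the ~1.2 GHz Mariko clock is my assumption, not a confirmed spec:

```python
# FP32 throughput for the Tegra X1's Maxwell GPU (256 CUDA cores):
#   GFLOPS = cores x 2 ops per clock (fused multiply-add) x clock in GHz

def fp32_gflops(cores: int, clock_ghz: float) -> float:
    return cores * 2 * clock_ghz

TX1_CORES = 256
print(round(fp32_gflops(TX1_CORES, 1.000), 1))  # 512.0 -- full Tegra X1 spec clock
print(round(fp32_gflops(TX1_CORES, 0.768), 1))  # 393.2 -- Switch docked GPU clock
print(round(fp32_gflops(TX1_CORES, 1.200), 1))  # 614.4 -- assumed Mariko-capable clock
```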

A 4nm or 5nm node is really going to help them; either of those is way more efficient and a better node than 20nm was.

Last edited by Soundwave - on 11 September 2023

sc94597 said:
zeldaring said:

You expect that in a console that will be $349 and the same size as a Switch? Not to mention Nintendo is going to be using something that should have come out in 2020-2021, but they cancelled those plans, so it's not like it's 2024 tech.

The latest rumor is that the Switch 2 is going to have a Lovelace GPU (that's 2024 technology, as Tegra Lovelace chips aren't going to be in anything until then).

https://www.notebookcheck.net/Optimistic-Nintendo-Switch-2-specs-leak-puts-forward-huge-CPU-and-GPU-changes-that-would-render-Tegra-T239-obsolete.748000.0.html

If it does indeed have that chip rather than the T239, then yes, it is possible to get GTX 1650-level performance in a 20W form factor ("Switch-size"), especially when you alleviate the biggest bottleneck the 1650 has in recent titles (4GB of VRAM).
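Rough math on why that's plausible (a sketch; the desktop 1650 figures are ballpark numbers, and 20W is the rumored whole-console budget):

```python
# How much perf-per-watt improvement does "GTX 1650 in 20W" imply?
# Desktop GTX 1650: ~2.9 TFLOPS FP32 at ~75W board power (ballpark).

GTX1650_TFLOPS = 2.9
GTX1650_WATTS = 75
HANDHELD_WATTS = 20  # rumored whole-console power budget

desktop_eff = GTX1650_TFLOPS * 1000 / GTX1650_WATTS  # ~39 GFLOPS/W
needed_eff = GTX1650_TFLOPS * 1000 / HANDHELD_WATTS  # ~145 GFLOPS/W

print(f"required perf/W gain: {needed_eff / desktop_eff:.1f}x")  # ~3.8x
```

A ~3.8x efficiency jump is aggressive, but it's the kind of gain two full node shrinks plus a newer architecture are meant to deliver.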

Again, this isn't magic; it's the result of reducing the size of transistors from 12nm to 5nm, and of course having more VRAM available.

FFVII (the game we were talking about) is very demanding on VRAM. In the video, we see that the 4GB of VRAM was bottlenecking the compute load of the 1650 (it was at 80% utilization). The Switch 2 won't have this issue if it has the rumored 12GB of total unified memory: 8GB could be allocated to graphics, 4GB to the OS.

Actually, what makes you think Nintendo will opt to use 4GB out of the 12 for their OS, when the previous one enjoyed quite a minimalistic menu OS that ran pretty much everything smoothly (except the eShop) on 1GB, if not less?

Sounds like it'd be a waste of potential performance if the unified memory pool rumor turns out to be true ...

Anywoo, quite comical how the debate has turned from the Tegra T239 to the Lovelace series, which could mean the Switch successor is more capable than initially thought when the thread began.

These speculations need to continue!
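For what it's worth, here's how a couple of hypothetical splits of the rumored 12GB pool would stack up; none of these allocations are confirmed, they just frame the debate above:

```python
# Hypothetical splits of the rumored 12 GB unified memory pool.
# None of these allocations are confirmed -- illustration only.

TOTAL_GB = 12

splits = {
    "8 GB graphics / 4 GB OS": {"games+graphics": 8, "OS": 4},
    "leaner, Switch-1-style OS": {"games+graphics": 11, "OS": 1},
}

for name, alloc in splits.items():
    assert sum(alloc.values()) == TOTAL_GB, name
    share = alloc["games+graphics"] / TOTAL_GB
    print(f"{name}: {share:.0%} of the pool left for games")
# 8 GB graphics / 4 GB OS: 67% of the pool left for games
# leaner, Switch-1-style OS: 92% of the pool left for games
```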




The other big elephant in the room is that this is really the most underwhelming generational leap of probably any generation of video games, going back to at least the NES, if not before that.

A lot of these supposed "next-gen" PS5/XSX games just look like PS4 games with a coat of gloss put on them and a few bells and whistles ... so a system that's "supposed to be" just a PS4, but with a feature set that can add that coat of gloss (fancier lighting here, a few other effects there, maybe even some ray tracing with DLSS), is basically in the same neighborhood anyway.

PS4-tier games already can require 5-7+ years of development time; the budget, even at PS4 fidelity, can get so large that it can bankrupt or badly damage a studio if a game underperforms; and as for visual fidelity, while yes you can do better, I think many devs have basically deemed it "good enough".

The PS5/XSX games just seem to spend all their processing power budget on increasing the resolution and sometimes frame rate, and if there's anything left over, maybe some ray tracing, but AMD GPUs suck ass at ray tracing. I don't think developers really want to increase their budgets, dev time, or team sizes any further, so it's easier just to pump all your resources into resolution and frame rate.

A lot of the reaction to Spider-Man 2 for PS5 (an exclusive) is that the game, while looking great ... doesn't really look like a full generational leap over Miles Morales, not even close really. The other part is just diminishing returns: once you reach a certain level of realism (PS4), it becomes harder to push further unless you are exponentially increasing the graphics quality (not just frame rate + resolution).

But that is a good thing for Switch 2, because if it's basically PS4-level fidelity but you add some shiny lighting/shadows and up the textures a bit, all of a sudden you have something that can pass for what a PS5 game looks like. You may not get 60 fps, but 30 fps to 60 fps is not a freaking generational leap. No way, no how, never has been, never was, never will be. It would be like saying the first Xbox was a generational leap over the PS2; that's just ridiculous.

I mean, full stop, the difference between the Dreamcast (a console released in 1998) and the N64 (a console released in 1996) is bigger than this PS4-to-PS5 jump. Shit, even Mario 64 and Wave Race 64 looked like an immediate, large leap over the PS1. Same with PS2 vs. Dreamcast. Even GameCube versus PS2: Star Wars Rogue Squadron 2 immediately looked like a fairly notable step beyond what we were seeing on the PS2. PS3 was a massive lift past PS2, and PS4 as well, though here diminishing returns started to seep in, I think. I get it, resolution and frame rate are massive resource sucks, diminishing returns, reflections in a puddle that you have to stop and stare at to appreciate, etc. All that stuff is important, but still, man, what a lame generation this is.



Mar1217 said:
sc94597 said:

The latest rumor is that the Switch 2 is going to have a Lovelace GPU (that's 2024 technology, as Tegra Lovelace chips aren't going to be in anything until then).

https://www.notebookcheck.net/Optimistic-Nintendo-Switch-2-specs-leak-puts-forward-huge-CPU-and-GPU-changes-that-would-render-Tegra-T239-obsolete.748000.0.html

If it does indeed have that chip rather than the T239, then yes, it is possible to get GTX 1650-level performance in a 20W form factor ("Switch-size"), especially when you alleviate the biggest bottleneck the 1650 has in recent titles (4GB of VRAM).

Again, this isn't magic; it's the result of reducing the size of transistors from 12nm to 5nm, and of course having more VRAM available.

FFVII (the game we were talking about) is very demanding on VRAM. In the video, we see that the 4GB of VRAM was bottlenecking the compute load of the 1650 (it was at 80% utilization). The Switch 2 won't have this issue if it has the rumored 12GB of total unified memory: 8GB could be allocated to graphics, 4GB to the OS.

Actually, what makes you think Nintendo will opt to use 4GB out of the 12 for their OS, when the previous one enjoyed quite a minimalistic menu OS that ran pretty much everything smoothly (except the eShop) on 1GB, if not less?

Sounds like it'd be a waste of potential performance if the unified memory pool rumor turns out to be true ...

Anywoo, quite comical how the debate has turned from the Tegra T239 to the Lovelace series, which could mean the Switch successor is more capable than initially thought when the thread began.

These speculations need to continue!

Just wait till next week, when people start expecting the Switch 2 to be on PS6 Pro level!



I don't hate the rumors and speculation around the next console, but I just want the thing announced already. I'm really anxious to see the lineup and the "unbearable" performance difference with multiplatform games.

I bet by the time we get to see the games playing in our hands, 1080p will look repugnant.