
VGC: Switch 2 Was Shown At Gamescom Running Matrix Awakens UE5 Demo

Pemalite said:
zeldaring said:

You know why he won't agree, but I guess we're so bored that we keep responding.

A PS3 Pro is exactly where the Switch is, which is not a bad thing at all. I mean, just look at all the ports from Wii U to Switch and from PS4 to PS4 Pro. I think PS3 Pro is actually where it lines up perfectly.

There are many hardware differences between a PS3 and a PlayStation 4 outside of raw performance, aimed at improving efficiency and visuals, many of which the Switch has or sometimes even exceeds.

* Polymorph Engines. (Tessellation)
* Shader Model 6.7 (PS3 was SM3, PS4 SM6.5)
* Vertex Shader 5.0
* Geometry Shaders
* Support for higher resolution (16k textures) - Not to be confused with display resolution.
* Compute Shaders.

And more. It's not just about raw throughput, but about hardware feature sets and efficiency.

So calling the Switch a "Playstation 3 Pro" is highly disingenuous; it's nothing like it at either a high or a low level.

Oneeee-Chan!!! said:

The GPU of the PS3 is equivalent to a 7600GT.
If there were a PS3 Pro, it would be equivalent to a 7800GTX or 7900GTX.

The PS3 had a GeForce 7-class GPU with a core layout of 24:8:24:8 (pixel shaders : vertex shaders : texture mapping units : render output units).
Core: 550MHz. Memory bandwidth: 20.8GB/s.

Its GPU core is a like-for-like match for the GeForce 7800 GTX 512MB... but with less than half the bandwidth and half the ROPs.
7800GTX: 24:8:24:16 (pixel shaders : vertex shaders : texture mapping units : render output units).
Core: 550MHz. Memory bandwidth: 54.4GB/s.

The 7600GT has half the functional units, with a core layout of 12:5:12:8 @ 560MHz and 22.6GB/s of memory bandwidth.

In non-memory-bandwidth-limited (i.e. shader-heavy) scenarios, the PS3 should be able to get fairly close to the 7800GTX, but once you bog it down with alpha effects, it will come up short.
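To put rough numbers on that comparison, here is a small Python sketch that turns the core layouts and clocks quoted above into theoretical peak texel and pixel rates. The formulas are the usual simplified peak-rate math (units × clock), not measured benchmarks, and the only inputs are the figures listed in this post.

```python
# Rough peak-rate math from the core layouts quoted above (theoretical only, no benchmarks).
# Layout tuples: (pixel shaders, vertex shaders, TMUs, ROPs, core MHz, memory GB/s)
gpus = {
    "PS3 RSX":        (24, 8, 24,  8, 550, 20.8),
    "7800 GTX 512MB": (24, 8, 24, 16, 550, 54.4),
    "7600 GT":        (12, 5, 12,  8, 560, 22.6),
}

for name, (ps, vs, tmu, rop, mhz, bw) in gpus.items():
    texel_rate = tmu * mhz / 1000.0   # peak Gtexels/s (shader/texture-heavy work)
    pixel_rate = rop * mhz / 1000.0   # peak Gpixels/s (what alpha-heavy scenes lean on)
    print(f"{name:15s} texel {texel_rate:5.1f} GT/s | pixel {pixel_rate:4.1f} GP/s | bandwidth {bw:4.1f} GB/s")
```

On this toy math the RSX and the 7800 GTX come out identical on the texture/shader side, while the halved ROPs and bandwidth are exactly where the alpha-heavy case falls behind.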

Oneeee-Chan!!! said:

Really?
I had heard that the GPU specs were changed just before the PS3 launch.

Besides, aren't the GPUs of the PS3 and Xbox 360 almost equivalent?

If the PS3 GPU is equivalent to a 7900 GT, then which GPU is equivalent to the Xbox 360 GPU?

I was thinking the X800.

The Xbox 360's GPU was more powerful; it was the CPU where the PS3 had a commanding lead.

The Xbox 360's GPU was closely related to the Radeon 2900, but with a lower core clock and memory bandwidth.

AMD backported the Radeon 2900 design for Microsoft and made it a custom chip.

sc94597 said:

A 1.7GHz boost is only possible in the top-powered model (80W). The lower-TDP models (the one in the video was a 45W model with a max boost clock of 1.35GHz) have much lower base and boost clocks.

https://www.notebookcheck.net/NVIDIA-GeForce-RTX-3050-Laptop-GPU-Benchmarks-and-Specs.513790.0.html

I was only talking about the 80W model, the highest-end configuration.

sc94597 said:

Also the RTX 3050 mobile is VRAM limited with only 4GB. In a game like Final Fantasy 7 Intergrade (among many other more recent releases) 4GB is a significant bottleneck at 1080p.

The 3060 with 6GB isn't much better.

It's nVidia's M.O.

sc94597 said:

A Lovelace Tegra with a TDP of about 15-20W and 8GB of allocated video memory available could be competitive with an RTX 3050 mobile @ 35W-45W, especially on a closed platform where game optimization can be targeted. I'd expect it to be even more true as games continue to become VRAM hogs.

"

I don't think it necessarily will happen, but it is the upper-limit of possibilities. 

Edit: It wouldn't be surprising at all if the performance difference between the Switch 2 and an RTX 3050 35W was less than the performance difference between the RTX 3050 35W and the RTX 3050 80W. 

You might be on the money, but without confirmation it's just a hypothesis.
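As a purely illustrative sketch of why the 4GB VRAM point above matters at 1080p, here is a toy budget. Every figure in it is an assumption chosen for illustration (the render-target formats and counts, the streamed-asset pool size), not data from FF7 Intergrade or any other real game; the point is only that full-resolution buffers are small next to the streamed texture/asset pool.

```python
# Purely illustrative 1080p VRAM budget -- every number here is an assumption, not
# data from any real game.
WIDTH, HEIGHT = 1920, 1080
MB = 1024 * 1024

def targets_mb(bytes_per_pixel: float, count: int = 1) -> float:
    """Size of `count` full-resolution render targets, in MB."""
    return WIDTH * HEIGHT * bytes_per_pixel * count / MB

budget_mb = {
    "g-buffer (assumed 4x RGBA16F)":      targets_mb(8, 4),
    "depth + motion vectors (assumed)":   targets_mb(4, 2),
    "post-process chain (assumed 3x)":    targets_mb(8, 3),
    "shadow maps (assumed)":              64.0,
    "streamed textures/meshes (assumed)": 2800.0,
    "misc buffers (assumed)":             256.0,
}

total = sum(budget_mb.values())
for item, mb in budget_mb.items():
    print(f"{item:36s} {mb:7.1f} MB")
print(f"{'total':36s} {total:7.1f} MB  (vs 4096 MB on a 4GB card)")
```

The full-resolution buffers barely register; it's the streamed asset pool that fills the card, which is why going from a 4GB allocation to a larger one matters more than raw compute in these cases.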

zeldaring said:

It's not really a worthless measure; it's just that the Series S has a vastly better CPU and everything else.

No. It's literally a worthless measure.

RDNA3 on a flop-for-flop basis will perform worse than an RDNA2 part, because AMD introduced a dual-issue pipeline.

That means it can "theoretically" have twice the flops, but it will never have twice the performance, as it can only issue a second instruction when the compiler can extract one from the wavefront.
And when that can't happen, you only get "half the flops".

This is the same wall AMD ran into with its prior VLIW/TeraScale designs, and why it opted for Graphics Core Next in the first place; it becomes very driver/compiler heavy.

This is the dumbed down version of course.

The Series S actually has a better GPU than the Playstation 4 Pro.

Teraflop numbers you see floating around are all theoretical, not real world benchmarked numbers.
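A toy sketch of that dual-issue point: the lane count, clock and co-issue rate below are made-up round numbers, not the specs of any real RDNA3 part, and are only meant to show how the spec-sheet teraflop figure and the usable figure diverge.

```python
# Toy model of dual-issue FLOPS: the spec-sheet number assumes a second instruction
# can always be co-issued; real workloads only manage that some of the time.
# All inputs are assumed round numbers, not the specs of any real chip.

def tflops(fp32_lanes: int, clock_ghz: float, co_issue_rate: float) -> float:
    """Peak FP32 TFLOPS given how often a second instruction is actually co-issued."""
    base = fp32_lanes * 2 * clock_ghz / 1000.0   # FMA counts as 2 ops per lane per clock
    return base * (1.0 + co_issue_rate)

lanes, clock = 2048, 2.0
print(f"spec sheet (always dual-issues):  {tflops(lanes, clock, 1.0):5.1f} TFLOPS")
print(f"plausible real workload (~30%):   {tflops(lanes, clock, 0.3):5.1f} TFLOPS")
print(f"worst case (never co-issues):     {tflops(lanes, clock, 0.0):5.1f} TFLOPS")
```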

When we say "PS3 Pro", we mean an estimate of something like the jump from PS4 to PS4 Pro. Now look at the results of the ports and it pretty much lines up. I don't care about all the tech talk if the games aren't lining up, because everyone has a story when it comes to tech. How many Wii U ports did the Switch run at double the framerate? Basically zero. Meanwhile the PS4 Pro did run some PS4 games at double the framerate, so calling it a "Pro" is actually more than it deserves.



sc94597 said:

The Steam Deck comparison is really silly.

1. The Steam Deck, as impressive as it is, runs games through a compatibility layer (Proton) that imposes on average a 10% performance hit. Valve has done a lot to reduce that hit in optimized titles, but basically any game they haven't fine-tuned for is going to take a 10% hit on the platform versus running directly in Linux. This is a software problem, not a hardware one. If games were developed directly for Linux/SteamOS, the Steam Deck would gain a significant performance boost in many of them.

2. By the time the Switch 2 releases, the Steam Deck will be almost three years old, and we'll probably be talking about an incoming Steam Deck 2 release at that point.

3. The Steam Deck has about the same power demands as a Nintendo Switch (maxing out at a 15W TDP).

4. Despite looking like one, the Steam Deck is not a closed platform. It runs games that had to be optimized for a wide range of hardware.

A Switch 2 released on a 5nm process node is going to get efficiency gains, by the simple fact that 5nm provides about 30% better performance-per-watt than 7nm. This can be used to reduce cooling (while giving the same performance as a Steam Deck), to improve performance while requiring the same cooling, or a little bit of both.

Things often don't make sense if you don't know basic details about them. 

Yes exactly. 

Saying "a Steam Deck can't even beat a PS4" is misleading. The Steam is a two way street. You're never going to see it's power fully unleashed probably because no one is going to make a game that is expressly made for the Steam Deck's hardware and really code down to the metal to optimize everything for it. Then there is performance lost with the Proton layer and all that among other factors. 

That's the price you pay to basically be a PC ... the upside is you get broad compatibility with hundreds/thousands of existing games. Which is the only way that product can work, you can't expect Valve or Lenovo or Asus to make games like Nintendo, Sony, MS, etc. and get 3rd party support like bespoke platform does. 
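To put the 5nm trade-off sc94597 described into numbers, here is a sketch using the ~30% performance-per-watt figure from that post, with an assumed 15W baseline; neither number is a confirmed Switch 2 spec.

```python
# How the ~30% performance-per-watt gain quoted above can be spent.
# Baseline figures are assumptions for illustration, not confirmed Switch 2 specs.
PERF_PER_WATT_GAIN = 1.30   # 5nm vs 7nm, per the post quoted above
BASELINE_WATTS = 15.0       # assumed Steam-Deck-like handheld power budget
BASELINE_PERF = 1.0         # normalised performance at that budget

# Option A: hold performance, spend the gain on power/heat
watts_same_perf = BASELINE_WATTS / PERF_PER_WATT_GAIN

# Option B: hold power, spend the gain on performance
perf_same_watts = BASELINE_PERF * PERF_PER_WATT_GAIN

print(f"same performance: ~{watts_same_perf:.1f} W instead of {BASELINE_WATTS:.0f} W")
print(f"same power:       ~{perf_same_watts:.2f}x performance at {BASELINE_WATTS:.0f} W")
```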



Soundwave said:
Saying "a Steam Deck can't even beat a PS4" is misleading. The Steam Deck is a two-way street. You're never going to see its power fully unleashed, probably because no one is going to make a game expressly for the Steam Deck's hardware and really code down to the metal to optimize everything for it. Then there is the performance lost to the Proton layer, among other factors.

I mean, if we're talking AAA multiplatform games, no one is coding down to the metal; those days are over, except for a very few exclusives and Rockstar. The most optimization probably goes to the PS5, since that's where third-party games sell the most.



The Steam Deck runs FF7 at 720p with lower quality settings. I still don't get how this matches the PS4. If the argument is that the differences don't matter, no issue, since that is subjective. But it is entirely false to say it matches from a technical perspective. And 720p in handheld is fine; on a large TV, no thanks. I'll stick with my home console/PC.

I still think we are confusing "matching" with whether that means no differences, or whether people mean they don't personally care about the differences. Those are two completely different arguments.




Chrkeller said:

The Steam Deck runs FF7 at 720p with lower quality settings. I still don't get how this matches the PS4. If the argument is that the differences don't matter, no issue, since that is subjective. But it is entirely false to say it matches from a technical perspective. And 720p in handheld is fine; on a large TV, no thanks. I'll stick with my home console/PC.

Red Dead 2 looks really bad on the Steam Deck, but yeah, I guess he forgets some people like to game on 40-plus-inch screens. Also, Elden Ring: while I did play the PS4 version at a locked 60fps, the graphics look much better on the PS5 version and really do add to the game; sucks about the framerate though.




By the way, is the thread title correct?
I was told the location was Gamescom.
What is the difference between this and Eurocom?



Oneeee-Chan!!! said:

By the way, is the thread title correct?
I was told the location was Gamescom.
What is the difference between this and Eurocom?

Where did you hear Eurocom? There is no convention or relevant company by that name.

Edit: Wait what, why does the thread title say Eurocom?



Jules98 said:
Where did you hear Eurocom? There is no convention or relevant company by that name.

Edit: Wait what, why does the thread title say Eurocom?

https://gamrconnect.vgchartz.com/post.php?id=9468285




Updated the title to fix the error there.

More updates today: ResetEra industry insider Nate the Great has put out a podcast saying he has also heard the Matrix Awakens demo was shown, and he sheds more light on the Zelda: BOTW demo:



He says The Matrix Awakens was running with ray tracing and uses no ray reconstruction. I don't really know how that's possible, but that's what he's saying. If that's the case, then I almost wonder if this has to be the Lovelace architecture and not Ampere. Kopite, when he first leaked the Tegra T239, said it was Lovelace, but I dunno, I think we just assumed Ampere? If it's Lovelace, I think that means it has to be a 4nm chip, because Lovelace is 4nm or lower only, from what I understand.

Lovelace did start shipping in fall 2022 (Nvidia 40 series cards), so if the Switch 2 is fall 2024, it would be a two-year gap there ... it isn't actually too far off from the Tegra X1 releasing in 2015 (Maxwell, 20nm process) and the Switch launching in early 2017, I guess (probably not unreasonable to think Nintendo was targeting holiday 2016 and just missed it due to software not being ready). The other can of worms Lovelace opens up is that it would potentially open the door to DLSS Frame Generation (I said potentially), as that is supported by the Lovelace architecture, which could alter performance quite a bit in a good way.

The Zelda: BOTW demo was 4K 60fps using DLSS, but the interesting quirk was that all the load time was eliminated. So, as I've said before, I suspect Nintendo is using some faster internal storage, maybe UFS 3.1 or 4? An internal NVMe drive seems like it would be too expensive ... but who knows.
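For a sense of how much faster storage alone could cut load times, here is a rough sketch. The throughput figures are ballpark sequential-read numbers for each storage class and the 3GB scene size is made up; none of this is a confirmed Switch 2 spec, and it ignores decompression and CPU overhead.

```python
# Rough load-time comparison for an assumed 3 GB of scene data, ignoring decompression
# and CPU overhead. Throughputs are ballpark sequential reads for each storage class,
# not confirmed Switch 2 specs.
SCENE_GB = 3.0

storage_mb_per_s = {
    "Switch 1 eMMC (assumed ~300 MB/s)": 300,
    "UFS 3.1 (assumed ~2100 MB/s)":      2100,
    "UFS 4.0 (assumed ~4000 MB/s)":      4000,
    "NVMe SSD (assumed ~3500 MB/s)":     3500,
}

for name, mb_s in storage_mb_per_s.items():
    seconds = SCENE_GB * 1024 / mb_s
    print(f"{name:36s} ~{seconds:4.1f} s")
```

Even with those caveats, the gap between eMMC-class and UFS-class throughput is large enough that "loading screens basically disappear" is plausible without an NVMe drive.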



Last edited by Soundwave - on 11 September 2023


The BOTW demo is 4K60 with DLSS.
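If the demo really was 4K60 via DLSS, the internal render resolution depends on the DLSS mode. The per-axis scale factors below are the standard published ones for the DLSS 2-era quality modes; which mode (if any) the demo used is not confirmed.

```python
# Internal render resolutions for a 3840x2160 DLSS output, using the standard per-axis
# scale factors of the DLSS 2 quality modes. Which mode the BOTW demo used (if DLSS was
# involved at all) is not confirmed.
OUTPUT = (3840, 2160)

modes = {
    "Quality (x0.667)":           2 / 3,
    "Balanced (x0.58)":           0.58,
    "Performance (x0.50)":        0.50,
    "Ultra Performance (x0.333)": 1 / 3,
}

for name, scale in modes.items():
    w, h = round(OUTPUT[0] * scale), round(OUTPUT[1] * scale)
    print(f"{name:28s} renders at about {w}x{h}")
```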