bonzobanana said:
Of course ARM cores are used on different fabrication processes. My point was that the performance figures are given for the best fabrication process, and to use the same A78C cores you have to have a different design, often downgraded to work on older fabrication processes. The roots of the Switch 2's T239, with its tensor core design and reduced cache, are of the same age and generation, going back to final silicon in 2020/21. You can't just inject later tech into the design, and it's unfair to compare the same or similar CPUs on different fabrication processes. That shows you how these CPUs have to be heavily redesigned to work on different fabrication processes, and are often feature-limited on older fabrication processes.
|
Are you just pulling things out of the aether now? You don't need a "different design" that is "often downgraded to work on older fabrication processes." What do you even mean by "downgraded" here? What is downgraded? What performance figures are you even talking about? All a process node tells us is how much power a given design will use at a set of frequencies, and the dimensions of the physical transistors in the chip. The actual design of an A78 core is the same design whether it is on 3nm Samsung or 5nm Samsung. Likewise with an A78AE or an A78C, which are special-case iterations of the A78. And yes, while a core's architecture isn't infinitely compatible with all process nodes, that doesn't mean ARM is targeting a specific fab process when they design a core. Usually they target a range of fabs with which the core's design is compatible.
All of this is moot, because we are talking about actual real-world performance on an 8nm chip. The Geekerwan benchmark isn't some hypothetical figure given to us by ARM, Samsung, Nvidia, or Nintendo. It is an actual benchmark on physical hardware of A78AE cores (also on an 8nm node) clocked at 1.1 GHz and 1 GHz for handheld and docked mode respectively.
|
We are seeing the Switch 2 natively render at very low resolutions and then upscale with DLSS to make its performance competitive. If it really had the spec you claimed, why does it need to render at 360p, at times below the original Switch's lowest rendering resolution? The Switch 2 only has a 20Wh battery; it cannot afford the performance you believe it has, and there are no indications it can reach it anyway. It is DLSS that is saving the day by rendering at such low resolutions.
|
Which is typically something you can only do when the workload is GPU-bound. Dropping the resolution wouldn't do anything if the CPU were the issue, i.e., if we were seeing a CPU bottleneck or an over-utilized CPU. You keep bringing up the fact that the Switch 2 only has a 20Wh battery. Okay, but most platforms that have A78C cores have even smaller power budgets than the Switch 2. Most of these platforms are either smartphones that can't run at 5-7W because they'll get too hot without active cooling, or ThinkPad laptops that need to last 10-15 hours when the efficiency cores are being utilized and also only have a 5-7W power budget when running off charge, even if their batteries are larger.
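The GPU-bound vs. CPU-bound point can be illustrated with a toy frame-time model (the numbers below are made up for illustration, not Switch 2 measurements): a frame isn't done until both the CPU and GPU work for it finish, so lowering resolution only shrinks the GPU side.

```python
# Toy frame-pacing model: frame time is dominated by whichever of the
# CPU or GPU takes longer. All millisecond figures here are invented.

def frame_time_ms(cpu_ms: float, gpu_ms: float) -> float:
    """A frame can't be presented until both CPU and GPU work finish."""
    return max(cpu_ms, gpu_ms)

def gpu_ms_at_scale(gpu_ms_native: float, resolution_scale: float) -> float:
    """Assume GPU cost scales roughly with pixel count (scale squared)."""
    return gpu_ms_native * resolution_scale ** 2

# GPU-bound case: CPU 10 ms, GPU 25 ms at native resolution.
# Halving each resolution axis (1/4 the pixels) helps a lot.
print(frame_time_ms(10, gpu_ms_at_scale(25, 1.0)))  # 25.0 ms (~40 fps)
print(frame_time_ms(10, gpu_ms_at_scale(25, 0.5)))  # 10 ms (~100 fps)

# CPU-bound case: CPU 25 ms, GPU 10 ms. Dropping resolution changes nothing.
print(frame_time_ms(25, gpu_ms_at_scale(10, 0.5)))  # 25 ms either way
```

This is why a game rendering at 360p to hit its frame-rate target is evidence of a GPU-limited workload, not a starved CPU.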
| You have to get back to reality: a 10/8nm fabrication process, a very low capacity battery, and a peak SoC power budget of only 4-6W to stay within the 10W maximum once you allow for the screen, which is going to be around 4-6W on its own. The Switch 2 is based around a very low performance, power-efficient Nvidia chipset from 2020/2021 on a fabrication process of that time. You are throwing loads of data into your replies, most of which isn't relevant to the Switch 2, its fabrication process, and its age of design. |
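As a quick sanity check on the quoted power figures (all inputs are the poster's claimed numbers, not measured values), the arithmetic works out like this:

```python
# All figures below are the poster's claims, not measurements.
battery_wh = 20.0      # claimed battery capacity
system_draw_w = 10.0   # claimed peak whole-system draw (SoC + screen + rest)
soc_draw_w = 5.0       # midpoint of the claimed 4-6W SoC budget

runtime_h = battery_wh / system_draw_w
print(f"Runtime at peak draw: {runtime_h:.1f} h")              # 2.0 h
print(f"SoC share of system draw: {soc_draw_w / system_draw_w:.0%}")  # 50%
```

A 20Wh battery at a sustained 10W total draw gives about two hours, so the claimed budget is at least internally consistent; the dispute below is about what an A78-class CPU can do *within* that budget, not the budget itself.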
I am in reality. We literally see a benchmark where an A78AE (on the exact same node as the T239) clocked at 1 GHz is performing at the level you are saying is not possible. The Geekbench 6 numbers aren't a lie, and you have still not addressed them. You tried to pivot when they were brought up, suggesting that they included a GPU benchmark.
| The Switch 2 has about 3-4x the CPU performance of the original Switch, a huge upgrade. It has a GPU capable of 5-6x the docked performance in graphics teraflops, and 3x the memory. It's a huge generational leap, but it's still having to render more ambitious games at resolutions as low as 360p. You need to get real rather than assume the very best possible performance for every bit of Switch 2 spec. It's a low-cost, low-performance design from 2021 given a boost with more system memory and storage, plus of course DLSS upscaling technology. |
Again, this has already been mentioned to you. Running a game at 360p and upscaling it to 720p with DLSS is not the same burden as running a game at 360p and doing nothing. The tensor cores and CUDA cores (not the CPU) have to do work to upscale that image. The graphics settings can also be scaled higher if the image is clean enough going from 360p to 720p. It is also important to note that 360p is a minimum internal resolution when combined with dynamic resolution scaling, not the average internal resolution of the game, which is more like 500p-600p depending on the specific game we're talking about. This is also not atypical for modern handheld platforms. To get games into playable states on the Steam Deck you often have to set the internal resolution in the 300-500p range and upscale with FSR. Cyberpunk 2077, for example, only runs at 40fps at low graphics settings if you use FSR 2.1 Balanced at 720p, which renders internally at roughly 750x425, a sub-540p resolution.
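The internal-resolution math above is easy to reproduce. The per-axis scale factors below are the published FSR 2 quality-mode ratios (DLSS 2 uses very similar ones); the 720p output target is just the example from this thread.

```python
# Internal render resolution for temporal upscalers (FSR 2 / DLSS 2).
# Ratios are the published FSR 2 quality modes: output axis / internal axis.
MODES = {
    "Quality": 1.5,
    "Balanced": 1.7,
    "Performance": 2.0,
    "Ultra Performance": 3.0,
}

def internal_res(out_w: int, out_h: int, mode: str) -> tuple:
    """Internal (pre-upscale) resolution for a given output and mode."""
    ratio = MODES[mode]
    return round(out_w / ratio), round(out_h / ratio)

for mode in MODES:
    w, h = internal_res(1280, 720, mode)
    print(f"720p {mode}: renders at {w}x{h} (~{w * h / 1e6:.2f} MP)")
# Balanced at a 720p target comes out around 753x424, i.e. roughly a
# third of the 0.92 MP output image actually gets rendered.
```

So "360p internal, 720p output" is simply a 2x-per-axis upscale, squarely in the range these upscalers were designed for.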
| I'm sure as time goes on we will get a much better analysis of retail Switch 2s and what they are capable of. Today is only launch day; in the coming weeks and months we will get a lot more information on why the Switch 2 is limited to rendering at such low resolutions and relying on upscaling so much. |
It's "relying on upscaling so much" because DLSS is a modern feature set that it would be dumb not to use. You can push your GPU harder (by dialing settings up or keeping them high) without much loss in graphics or image quality while maintaining your resolution target when using DLSS. Why wouldn't every game use it, and use it heavily?
Last edited by sc94597 - on 05 June 2025