
Forums - Gaming Discussion - How much do you care about the graphical leap between consoles at this point?

Soundwave said:
curl-6 said:

DLSS can help with pixel fillrate, but to be fair there is more to it than that: bandwidth, CPU, SSD, etc.

I'm sure Switch 2 can exceed PS4 but I'm very skeptical of it getting "close to PS5."

Are you also skeptical that the current Switch can get "close to PS4"? Because it already does, with games like The Witcher 3 and DOOM. You can also see that games like Resident Evil 3 Remake and Star Wars Battlefront run on the portable GPD Win 2, which is not much better than the existing Switch.

DLSS makes that far easier, and it eases your bandwidth too when you only have to render at a fraction of the resolution to boot.

The Arm Cortex-A78 CPU core, which releases later this year, will be comparable to AMD's Zen 2 cores. By 2023 that CPU will be cheap.

You are also underestimating that the Switch 2 will likely use a better architecture than the PS5's GPU. Switch 2 would likely be an Ampere- or Orin-based part ... PS5 is RDNA2 ... RDNA2 can't even beat Nvidia's 2-year-old Turing architecture. A *laptop* version (which is less powerful than the desktop version) of the RTX 2080 outperformed the PS5 on that much-ballyhooed Unreal Engine 5 test (40 fps vs 30 fps), lol.

The SSD is another overhyped thing: UFS 3.1, which is mobile flash storage, can reach 3GB/sec, which is faster than what the Xbox Series X is using, and Apple has flat-out been using NVMe drives in iPhones/iPads for 5 years now.

If Nvidia gives Nintendo a Switch 2 chip that is to 2023 what the Tegra X1 was to 2015, with DLSS, yes, you are going to get PS5-level games. They may not render at anywhere close to the same resolution, but if your eyes can't tell the difference anyway, what does it matter? I mean shit, that 540p DLSS image of Control honestly looked sharper and cleaner to me than native 1080p. I've seen other tests where 576p was scaled up to 1440p and it looks very close. It's ridiculous. Even N64-era (1990s) resolutions like 512x288 would look playable for an undocked mode if need be.

At bold.

What? I can't even. Not even close.

It's not overhyped; it's a huge performance increase overall. Coming from an ultra-slow 2.5" HDD in the PS4/Xbox One consoles, storage was a massive bottleneck last generation. And really? Comparing PCs to consoles is disingenuous. SSDs have been around for years, and yeah, it isn't a new thing, but the gaming industry as a whole hasn't moved on yet because the software hasn't caught up to the hardware; this goes for game design too, as HDDs have been the standard storage device for consoles and PCs for the last decade or more.
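To put rough numbers on that bottleneck, here's a quick back-of-the-envelope sketch (the drive speeds are ballpark figures for each class of storage, not measurements of any specific console):

```python
# Rough time to stream a 10 GB load at typical sequential read speeds.
# All speeds are ballpark class figures, not measured values.
drives_mb_per_s = {
    'PS4-era 2.5" HDD (~75 MB/s)': 75,
    "SATA SSD (~500 MB/s)": 500,
    "UFS 3.1 flash (~3000 MB/s)": 3000,
    "Console NVMe (~5500 MB/s)": 5500,
}

load_gb = 10
for name, speed in drives_mb_per_s.items():
    seconds = load_gb * 1024 / speed
    print(f"{name}: {seconds:6.1f} s")
```

Even with conservative figures, the jump from HDD to any flash storage is over an order of magnitude, which is the real point: last gen's bottleneck was the HDD itself, not which flavour of flash replaces it.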

And I wouldn't count on Nintendo offering the latest and greatest chipsets from Nvidia. They've been going with trusted older chipsets, tried and tested, for the last few generations. That makes for a more affordable console and a more profitable one for them. I would think around PS4 (overall performance) with the Ampere feature set is more plausible in a few years for Switch 2, which is enough.

Also, why the constant downplaying? We haven't even seen any games in action that take advantage of the new consoles, the Xbox Series X or PS5, yet.

Last edited by hinch - on 25 May 2020


The Switch using an "old chip" is not true. Relative to the portable chips available at the time, the Tegra X1 was as cutting edge as the PS4 or XB1 were; technology-wise, the Switch is more in line with the N64 and GameCube. They were aiming for a 2016 launch too; 2017 only happened because they needed a couple of extra months for software.

The Wii, for example, really could not run any "real" high-end 360/PS3 games ... the Switch can run games like The Witcher 3, DOOM, Dragon Quest XI, NBA 2K, Wolfenstein, and Mortal Kombat, which for a mobile chip is quite impressive.

Another obvious benefit of DLSS, even if you're not making high-end games, is that you don't need to optimize separately for docked/undocked anymore if you don't want to. A Nintendo developer making, say, a Mario Party game could just make one version of the game at 960x540 (low resolution).

DLSS will then simply take that and use its AI algorithm to create a 720p-1080p image undocked and a 1080p-1440p+ image docked. No real work has to be done by the developer. This will save time and manpower when developing all Switch games, not just the higher-end ones, by eliminating the need to optimize two versions of every game.
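The pixel counts behind that are easy to sanity-check. A quick sketch (resolutions are the ones from the post above; assuming render cost scales roughly linearly with pixel count, which is a simplification):

```python
def pixels(width, height):
    """Total pixel count of a render target."""
    return width * height

render = pixels(960, 540)     # the single internal render resolution
undocked = pixels(1280, 720)  # DLSS output target, undocked
docked = pixels(1920, 1080)   # DLSS output target, docked

# Fraction of the output's pixels that are actually rendered natively.
print(f"vs 720p undocked: {render / undocked:.2%} of the pixels")
print(f"vs 1080p docked:  {render / docked:.2%} of the pixels")
```

So a single 540p render target is only a quarter of the 1080p pixel work, which is where the claimed fill-rate and bandwidth headroom comes from.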



If the rumoured advances of DLSS 3.0 are true, it really will do wonders. Assuming the Switch 2 is still years away and Nvidia uses next-gen 7nm tech for its mobile GPUs, I can see the Switch being able to run games at 540p or higher using DLSS without much issue while the PS5/XSX aim for 4K. Obviously at lower settings than both the PS5/XSX. But we will have to wait and see, since you can never know with Nintendo. I do have more confidence in Nvidia's hardware decisions, though...



                  

PC Specs: CPU: 7800X3D || GPU: Strix 4090 || RAM: 32GB DDR5 6000 || Main SSD: WD 2TB SN850

Captain_Yuri said:
If the rumoured advances of DLSS 3.0 are true, it really will do wonders. Assuming the Switch 2 is still years away and Nvidia uses next-gen 7nm tech for its mobile GPUs, I can see the Switch being able to run games at 540p or higher using DLSS without much issue while the PS5/XSX aim for 4K. Obviously at lower settings than both the PS5/XSX. But we will have to wait and see, since you can never know with Nintendo. I do have more confidence in Nvidia's hardware decisions, though...

I think Switch 2 may actually have 5nm available if we're talking 2023. 

Apple is moving to 5nm this fall for its iPhones; in three years' time, 5nm will be common and mature for mobile chips.

Maybe we see a version of the current Switch (and Lite) move to 7nm in another refresh around fall 2021, for example; 5nm for the Switch 2; then the Switch 2 Lite could work at 2nm/3nm, say, 2 years later.



Soundwave said:
Captain_Yuri said:
If the rumoured advances of DLSS 3.0 are true, it really will do wonders. Assuming the Switch 2 is still years away and Nvidia uses next-gen 7nm tech for its mobile GPUs, I can see the Switch being able to run games at 540p or higher using DLSS without much issue while the PS5/XSX aim for 4K. Obviously at lower settings than both the PS5/XSX. But we will have to wait and see, since you can never know with Nintendo. I do have more confidence in Nvidia's hardware decisions, though...

I think Switch 2 may actually have 5nm available if we're talking 2023. 

Apple is moving to 5nm this fall for its iPhones; in three years' time, 5nm will be common and mature for mobile chips.

Maybe we see a version of the current Switch (and Lite) move to 7nm around fall 2021 in another refresh for example, 5nm for Switch 2, then the Switch 2 Lite could work at 2nm/3nm say 2 years later. 

Yeah, 5nm would be even crazier, and because of the low power requirements and how efficient Nvidia GPUs are with power consumption, they could put a more powerful GPU in there instead of only 256 cores. Also, it is rumoured that with next gen, Nvidia will have RTX across its entire lineup of GPUs. Again, we will see how true that is, but if it is, I can see the Switch 2 having Tensor cores for sure.



                  


hinch said:

I would think around PS4 (overall performance) with the Ampere feature set is more plausible in a few years for Switch 2, which is enough.

I would think (and hope) that Switch 2 turns out quite a bit more capable than the PS4. By the time it launches, the PS4 will probably be a decade old, and its CPU in particular was low end even for 2013.

I'm no expert, but surely by 2022/2023 Nvidia can deliver a mobile SoC that outperforms a 10-year-old console; the Tegra X1 came out 10 years after the Xbox 360 and 9 years after the PS3, and it outperforms them.

Last edited by curl-6 - on 25 May 2020

JRPGfan said:


The Jaguar core is pretty damn weak by today's standards.

Jaguar was pretty damn weak when it first released.
It was a chip destined for netbooks and tablets, made to be as cheap as possible with low power consumption; it was AMD's worst CPU in a historically terrible CPU lineup from AMD.

Soundwave said:

The Switch using an "old chip" is not true. Relative to the portable chips available at the time, the Tegra X1 was as cutting edge as the PS4 or XB1 were; technology-wise, the Switch is more in line with the N64 and GameCube. They were aiming for a 2016 launch too; 2017 only happened because they needed a couple of extra months for software.

Nah.
Tegra X1 came out in 2015, the Switch came out in 2017.

Two years in the mobile technology world is a stupidly long time, which is why nVidia essentially moved out of that market; they couldn't keep up with the likes of Qualcomm.

In 2016, nVidia was demonstrating the Tegra X2 and had already outlined the chip:
https://www.anandtech.com/show/10596/hot-chips-2016-nvidia-discloses-tegra-parker-details

So Nintendo "targeting" a 2016 launch isn't an excuse, I'm afraid, for why the X2 was never used.

The Tegra X1 gets dominated by Apple's A9 processor from the same year in all graphics duties.
https://www.anandtech.com/show/9972/the-google-pixel-c-review/3

Even the Samsung Galaxy Note 5, with its Exynos 7420, was able to dominate the Tegra in a bunch of benchmarks, especially CPU tasks.

But then we need to remember that, unlike the Pixel C, the Switch ended up having its clockrates castrated.

* CPU clock was reduced from 1,900MHz to 1,020MHz, or an 86.26% reduction.
* GPU clock was reduced from 1,000MHz to 768MHz docked, 468MHz portable. - A reduction of 30% and 113% respectively.

That is going to affect benchmarks... If we were to apply a 30% performance reduction to the benchmarks, we are probably looking at around a Microsoft Surface 3/Galaxy Tab S2 with its Exynos 5433/Snapdragon 652 SoC.

Sure, the Switch gets an advantage from having a more efficient software stack and more mature drivers and APIs, but at a hardware level, this is what we are looking at.

Part of the issue is that nVidia was just not innovating fast enough: the Tegra X1 was still stuck on 20nm while Qualcomm had already moved to 14nm with the Snapdragon 821 in 2016, and even to 10nm in late 2016/early 2017 with the Snapdragon 835.
https://www.notebookcheck.net/Qualcomm-announces-Snapdragon-835-processor-built-on-Samsung-s-10-nm-process.184643.0.html

Even the updated Tegra X1+ featured in the "new" Nintendo Switch is still only using 16nm FF rather than a cutting-edge process; this has left performance on the table.

Either way, something like the Snapdragon 845 with an underclocked CPU would have probably beaten the Switch handily in every scenario.

Soundwave said:

I think Switch 2 may actually have 5nm available if we're talking 2023. 

Apple is moving to 5nm this fall for its iPhones; in three years' time, 5nm will be common and mature for mobile chips.

Maybe we see a version of the current Switch (and Lite) move to 7nm around fall 2021 in another refresh for example, 5nm for Switch 2, then the Switch 2 Lite could work at 2nm/3nm say 2 years later. 

The main reason the Switch was moved from 20nm to 16nm FF was cost: 20nm was slowly being deprecated due to under-utilization, and TSMC wanted to retool those fabs to modernize them.
There are no guarantees that will occur again this generation.

The Switch is in lockstep with nVidia's development cadence; the 16nm Tegra X1 was already documented a good 12+ months before the Switch revision occurred, as the Tegra ends up in other products too.
https://www.eurogamer.net/articles/digitalfoundry-2019-switch-new-tegra-x1-silicon-comes-into-focus

That said, battery life is also very much dependent on a multitude of factors... The display plays a big role, and the Switch has an extremely inefficient display by modern standards.
A 1080p OLED panel would be a big upgrade in clarity, colour, contrast and brightness, whilst offering significant power savings.


Don't get me wrong, the Switch as a console is a damn solid device; its hardware is a little lacking, as it was never cutting edge. It just manages to do more with less because of strong art direction from developers and a very optimized software stack that the likes of Android typically don't get, which means the Switch is able to punch a little above its weight.
I am talking purely from a hardware point of view, though... and from there the Switch has always fallen short of being the best.



--::{PC Gaming Master Race}::--

Soundwave said:

The Switch using an "old chip" is not true. Relative to the portable chips available at the time, the Tegra X1 was as cutting edge as the PS4 or XB1 were; technology-wise, the Switch is more in line with the N64 and GameCube. They were aiming for a 2016 launch too; 2017 only happened because they needed a couple of extra months for software.

The Wii, for example, really could not run any "real" high-end 360/PS3 games ... the Switch can run games like The Witcher 3, DOOM, Dragon Quest XI, NBA 2K, Wolfenstein, and Mortal Kombat, which for a mobile chip is quite impressive.

Another obvious benefit of DLSS, even if you're not making high-end games, is that you don't need to optimize separately for docked/undocked anymore if you don't want to. A Nintendo developer making, say, a Mario Party game could just make one version of the game at 960x540 (low resolution).

DLSS will then simply take that and use its AI algorithm to create a 720p-1080p image undocked and a 1080p-1440p+ image docked. No real work has to be done by the developer. This will save time and manpower when developing all Switch games, not just the higher-end ones, by eliminating the need to optimize two versions of every game.

The Tegra X1 launched back in 2015, the Tegra X2 a year after. The Switch came out two years later still using the launch X1 (Maxwell-based); Pascal GPUs were already out by the time the Switch was available to buy, and that is a whole generational leap. Like I said, they like tried-and-tested mobile technology that has been used and applied in other products. It has been shown historically again and again with the DS, 3DS, etc. They like to play it safe and won't go with cutting edge.

Yeah, it's quite impressive to see those games on that handheld, but that's it. Mostly ports from the past generation, plus a few others from this gen. If only they had gone with the Tegra X2, it would have been a beastly handheld.

As great as DLSS is, we have no idea if the new Switch will support it. Hell, they might even use another SoC altogether. All I can say is that no mobile technology is going to reach performance equivalent to the PS5/Xbox Series X in a few years.



Pemalite said:

* CPU clock was reduced from 1,900MHz to 1,020MHz, or an 86.26% reduction.
* GPU clock was reduced from 1,000MHz to 768MHz docked, 468MHz portable. - A reduction of 30% and 113% respectively.

[...]

Hey, I guess you made a slight mistake by putting the increase instead of the decrease, since you can't reduce something by 113%.



duduspace11 "Well, since we are estimating costs, Pokemon Red/Blue did cost Nintendo about $50m to make back in 1996"

http://gamrconnect.vgchartz.com/post.php?id=8808363

Mr Puggsly: "Hehe, I said good profit. You said big profit. Frankly, not losing money is what I meant by good. Don't get hung up on semantics"

http://gamrconnect.vgchartz.com/post.php?id=9008994

Azzanation: "PS5 wouldn't sold out at launch without scalpers."

DonFerrari said:

Hey, I guess you made a slight mistake by putting the increase instead of the decrease, since you can't reduce something by 113%.

Yeah, I did. But essentially you are decreasing by more than half. Couldn't be bothered to redo the calculation, but my point still stands.
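For the record, the arithmetic is quick to redo (plain percent-change math, using the clock figures already posted):

```python
def pct_decrease(old, new):
    """Percentage drop going from old to new."""
    return (old - new) / old * 100

def pct_increase(old, new):
    """Percentage rise going from old to new."""
    return (new - old) / old * 100

# GPU clock figures from the earlier post (portable mode).
print(f"1000 MHz -> 468 MHz: {pct_decrease(1000, 468):.1f}% reduction")
print(f"468 MHz -> 1000 MHz: {pct_increase(468, 1000):.1f}% increase")
# A reduction can never exceed 100%; the "113%" figure was the
# increase measured in the opposite direction.
```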


