
How will the Switch 2 be, performance-wise?

 

Your expectations (poll)
  • Performance ridiculously ... — 0 (0%)
  • Really below current gen, ... — 2 (100%)
  • Slightly below current ge... — 0 (0%)
  • On par with current gen, ... — 0 (0%)
Total: 2
Soundwave said:
sc94597 said:

It almost never makes sense to run a game at native resolution these days. DLSS Quality/FSR Ultra Quality (sometimes even FSR Quality) often give better image quality and frame rates than native in their current forms.

So if the target is 1080p and the native resolution is 720p, often you get a better than native 1080p image quality.

For handheld mode this image quality isn't just "okay", I think it is pretty great. I play games on my Rog Ally at this resolution often and the image is very clean.  

Combine it with a VRR screen (hopefully Nintendo goes this route) that lets you play at 40 Hz at minimum, and you have an excellent gaming handheld. High-end for 2023, and probably mid-range for 2024 when the Switch 2 likely releases, given that next-gen APUs are releasing soon. 

For larger displays, a 1440p target from, say, 840p/900p (for Nintendo games) isn't too bad at console-gaming distances. For PC gaming, sure, you probably want at least an internal 1080p -> 1440p, but if somebody is playing on a 50-inch TV (probably the average size), sitting a few meters away, the image shouldn't be too bad. 
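For concreteness, these upscaler quality presets map to fixed per-axis scale factors, so the internal resolutions are easy to compute. A quick sketch in Python (the scale factors are the ones commonly documented for DLSS 2 and FSR; treat the mapping as illustrative):

```python
# Internal render resolutions implied by common upscaler quality presets.
# Per-axis scale factors as commonly documented for DLSS 2 / FSR.
PRESETS = {
    "DLSS Quality": 1 / 1.5,        # ~0.667
    "DLSS Balanced": 0.58,
    "DLSS Performance": 0.50,
    "FSR Ultra Quality": 1 / 1.3,   # ~0.769
}

def internal_res(target_w: int, target_h: int, scale: float) -> tuple:
    """Resolution the GPU actually renders before the upscaler runs."""
    return round(target_w * scale), round(target_h * scale)

for name, s in PRESETS.items():
    w, h = internal_res(1920, 1080, s)
    print(f"{name}: renders {w}x{h}, outputs 1920x1080")
# DLSS Quality at a 1080p target renders ~1280x720 internally,
# which is exactly the 720p -> 1080p case described above.
```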

This will be a substantial upgrade from the original Switch. Because the Series S and other gaming handhelds exist, 3rd party support is obviously going to be more viable than it was for the Switch and Wii relative to their competitors. 

I think this is something most people in this thread can agree on, independently of the semantic argument over "comparable." 

FSR is shit compared to DLSS too. DLSS provides a very clean-looking image that most people won't be able to tell apart from actual native res, and on top of that you're getting basically free anti-aliasing, something current Switch games suffer from a lack of. It's laughable to me that as a Switch 2 dev you'd want to brute force 4-8x the pixel count for no real gain in visual fidelity. Even 540p to 1080p is very, very acceptable, and that's coming from me, an image quality enthusiast. 
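The "4-8x the pixel count" figure is straightforward arithmetic; a minimal sketch of the pixel budgets involved:

```python
# Pixel counts behind the "brute force 4-8x the pixel count" claim.
RES = {"540p": (960, 540), "720p": (1280, 720),
       "1080p": (1920, 1080), "1440p": (2560, 1440)}

base = RES["540p"][0] * RES["540p"][1]
for name, (w, h) in RES.items():
    px = w * h
    print(f"{name}: {px:>9,} px ({px / base:.1f}x 540p)")
# 1080p is exactly 4x the pixels of 540p and 1440p is ~7.1x,
# which is the 4-8x range a native-res target would have to brute force.
```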

I was so disgusted by the TV/OLED section at the Best Buy where my friend worked that I made him get me the remotes for all their flagship big-screen TVs and recalibrated the settings on each one manually. His manager even offered me a job, but I'm not working minimum wage in retail, no way, lol. 

If 540p-to-1080p DLSS can pass as acceptable on a big-screen display, "Joe Fucking Average" gamer is going to be more than fine with it on an 8-inch screen; it's not even worth arguing. For TV mode, once you give DLSS even 720p of input pixels, it can produce good visuals on very large displays. 

Keep in mind we are talking about a general public who don't even know what a damn real 4K image looks like most of the time. Most people don't understand that Netflix "4K" streams have a dog shit bit rate, lower than even a 1080p physical Blu-ray, and many PS5/XSX games aren't doing native 4K either. Most people don't know native 4K from their ass.

Nintendo is going to be more than fine with DLSS from much lower resolutions; 99% of people are never going to know any better and will think they're just playing at native resolution. 

FSR at 1440p -> 4K is almost as good as DLSS.
The area where DLSS shines, compared to FSR, is low-res upscaling,
like 540p -> 1080p etc. (that's where DLSS kicks FSR's arse).

No joke, but at higher resolutions DLSS vs FSR is near identical... especially if you're sitting a bit away from the TV/monitor.
You need to zoom way in and take screenshots/recordings to spot minor differences between the two.



Chrkeller said:

I think people really need to understand that resolution is just one piece of visuals. Many other factors matter: fps, texture quality, anisotropic filtering, lighting, shadows, reflections, SSAA, etc. All of which eat memory bandwidth.

Point being, 1080p 120 fps ultra settings versus 1440p 30 fps low settings... not a contest. The former wins by a landslide.

Resolution is being given too much emphasis.

Resolution is the easiest way to scale around memory bandwidth bottlenecks, though. We see this with the Rog Ally and its 100 GB/s of memory bandwidth: the simplest way to scale frame rate is to reduce internal resolution by changing the FSR setting or target resolution. You can scale the other things too, but the Rog Ally (and probably the Switch 2 as well) already runs games at low-medium PC settings, so there isn't much else to scale there. 

The Rog Ally is effectively a 720p -> 900p/1080p machine that plays demanding games at low-medium settings at either a fixed 30fps or a variable 30-45 fps in its 15W mode. Given the Switch 2's specs we should expect similar; depending on the Switch 2's target TDP, maybe a bit more like the Rog Ally's 25W mode, which can do a variable 40-60 fps at 720p -> 900p/1080p in demanding titles. I think that is what we should expect from the Switch 2 for demanding AAA 3rd party games. For Nintendo's first party games, which tend to rely on less demanding assets but nice visual effects, I think 900p -> 1440p docked is definitely a possibility. 
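As a rough illustration of why internal resolution is the go-to lever: when a game is GPU/bandwidth-bound, frame time scales roughly with the number of shaded pixels. A toy model (the 30fps baseline is a made-up example, and real gains land lower because CPU and fixed per-frame costs don't scale):

```python
# Toy model: frame rate vs. internal resolution for a GPU-bound game,
# assuming frame time scales linearly with shaded pixels.
def scaled_fps(base_fps: float, base_res: tuple, new_res: tuple) -> float:
    base_px = base_res[0] * base_res[1]
    new_px = new_res[0] * new_res[1]
    return base_fps * base_px / new_px

# Hypothetical: 30 fps at internal 1080p, dropped to internal 720p.
print(scaled_fps(30, (1920, 1080), (1280, 720)))  # 67.5, an upper bound
```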

It is not unreasonable to expect the Switch 2 to be comparable to a mid-range 2024 gaming handheld, just as non-hybrid consoles are comparable to mid-range PCs of the year they release. 



JRPGfan said:
Soundwave said:

FSR is shit compared to DLSS too. DLSS provides a very clean-looking image that most people won't be able to tell apart from actual native res, and on top of that you're getting basically free anti-aliasing, something current Switch games suffer from a lack of. It's laughable to me that as a Switch 2 dev you'd want to brute force 4-8x the pixel count for no real gain in visual fidelity. Even 540p to 1080p is very, very acceptable, and that's coming from me, an image quality enthusiast. 

I was so disgusted by the TV/OLED section at the Best Buy where my friend worked that I made him get me the remotes for all their flagship big-screen TVs and recalibrated the settings on each one manually. His manager even offered me a job, but I'm not working minimum wage in retail, no way, lol. 

If 540p-to-1080p DLSS can pass as acceptable on a big-screen display, "Joe Fucking Average" gamer is going to be more than fine with it on an 8-inch screen; it's not even worth arguing. For TV mode, once you give DLSS even 720p of input pixels, it can produce good visuals on very large displays. 

Keep in mind we are talking about a general public who don't even know what a damn real 4K image looks like most of the time. Most people don't understand that Netflix "4K" streams have a dog shit bit rate, lower than even a 1080p physical Blu-ray, and many PS5/XSX games aren't doing native 4K either. Most people don't know native 4K from their ass.

Nintendo is going to be more than fine with DLSS from much lower resolutions; 99% of people are never going to know any better and will think they're just playing at native resolution. 

FSR at 1440p -> 4K is almost as good as DLSS.
The area where DLSS shines, compared to FSR, is low-res upscaling,
like 540p -> 1080p etc. (that's where DLSS kicks FSR's arse).

No joke, but at higher resolutions DLSS vs FSR is near identical... especially if you're sitting a bit away from the TV/monitor.
You need to zoom way in and take screenshots/recordings to spot minor differences between the two.

Well the Switch 2 should never be rendering at higher resolutions to begin with. It's a monumental waste. 

720p to 1440p DLSS looks terrific. Most normal gamers sitting on their couch several feet away from even a 65-77 inch display are not going to know any better. 

540-to-1080p DLSS is a laugh. You can approximate this if you have DLSS titles like Alan Wake II or Cyberpunk 2077: set the game to "windowed" mode on your full-size monitor. In my case I was getting a display image of roughly 13 inches on my 4K PC monitor (way larger than the Switch 2's hypothetical 8-inch display, but at least in the same ballpark), and even at 13 inches, toggling back and forth between native 1080p and 540-to-1080p DLSS, it's extremely hard to tell a difference. 
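That 13-inch figure checks out: a 1080p window on a 4K panel covers half the width and half the height, hence half the diagonal. A quick sketch (the 27-inch panel size is my assumption, based on the monitor mentioned later in the thread):

```python
# Apparent size of a native-resolution window on a higher-res panel.
def window_diagonal(panel_diag_in: float, panel_w: int, window_w: int) -> float:
    # Assumes the window and panel share the same aspect ratio.
    return panel_diag_in * window_w / panel_w

print(window_diagonal(27, 3840, 1920))  # 13.5 inches, roughly the figure above
```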

A developer would be utterly stupid IMO to try and brute force native res on a machine like the Switch 2. You're asking the system to push so many more pixels for almost no benefit. If I'm a Switch 2 developer working on even a remotely challenging port for the system, I'm sticking with 540p (1080p DLSS) and 720p (1440p DLSS) as my main resolutions and not a single pixel more. 

A ROG Ally is basically a portable PS5/XSX... like, on no planet can you seriously state that's "just a PS4" when it's running Starfield and Alan Wake II in playable states at only 15 watts. In indoor areas it even climbs above 40 fps in Starfield, and in some areas of AW2 as well. We have portables that can already run PS5/XSX-tier games; we've had them for a while. The Steam Deck is almost 2 years old. 

Last edited by Soundwave - on 12 January 2024

sc94597 said:
Chrkeller said:

I think people really need to understand that resolution is just one piece of visuals. Many other factors matter: fps, texture quality, anisotropic filtering, lighting, shadows, reflections, SSAA, etc. All of which eat memory bandwidth.

Point being, 1080p 120 fps ultra settings versus 1440p 30 fps low settings... not a contest. The former wins by a landslide.

Resolution is being given too much emphasis.

Resolution is the easiest way to scale around memory bandwidth bottlenecks, though. We see this with the Rog Ally and its 100 GB/s of memory bandwidth: the simplest way to scale frame rate is to reduce internal resolution by changing the FSR setting or target resolution. You can scale the other things too, but the Rog Ally (and probably the Switch 2 as well) already runs games at low-medium PC settings, so there isn't much else to scale there. 

The Rog Ally is effectively a 720p -> 900p/1080p machine that plays demanding games at low-medium settings at either a fixed 30fps or a variable 30-45 fps in its 15W mode. Given the Switch 2's specs we should expect similar; depending on the Switch 2's target TDP, maybe a bit more like the Rog Ally's 25W mode, which can do a variable 40-60 fps at 720p -> 900p/1080p in demanding titles. I think that is what we should expect from the Switch 2 for demanding AAA 3rd party games. For Nintendo's first party games, which tend to rely on less demanding assets but nice visual effects, I think 900p -> 1440p docked is definitely a possibility. 

It is not unreasonable to expect the Switch 2 to be comparable to a mid-range 2024 gaming handheld, just as non-hybrid consoles are comparable to mid-range PCs of the year they release. 

I fully agree with everything you stated. My point, which you seem to agree with, is that the Switch 2 is going to be low to medium settings. The PS5 and Series X are going to be medium to high.  



Chrkeller said:
sc94597 said:

Resolution is the easiest way to scale around memory bandwidth bottlenecks, though. We see this with the Rog Ally and its 100 GB/s of memory bandwidth: the simplest way to scale frame rate is to reduce internal resolution by changing the FSR setting or target resolution. You can scale the other things too, but the Rog Ally (and probably the Switch 2 as well) already runs games at low-medium PC settings, so there isn't much else to scale there. 

The Rog Ally is effectively a 720p -> 900p/1080p machine that plays demanding games at low-medium settings at either a fixed 30fps or a variable 30-45 fps in its 15W mode. Given the Switch 2's specs we should expect similar; depending on the Switch 2's target TDP, maybe a bit more like the Rog Ally's 25W mode, which can do a variable 40-60 fps at 720p -> 900p/1080p in demanding titles. I think that is what we should expect from the Switch 2 for demanding AAA 3rd party games. For Nintendo's first party games, which tend to rely on less demanding assets but nice visual effects, I think 900p -> 1440p docked is definitely a possibility. 

It is not unreasonable to expect the Switch 2 to be comparable to a mid-range 2024 gaming handheld, just as non-hybrid consoles are comparable to mid-range PCs of the year they release. 

I fully agree with everything you stated. My point, which you seem to agree with, is that the Switch 2 is going to be low to medium settings. The PS5 and Series X are going to be medium to high.  

PS5 and Series X are more like a mix of low, medium, and high settings these days, but they are targeting sub-1440p (native) at 60fps or near/above-1440p (native) at 30fps in most current demanding AAA titles. 

For example, in Starfield (internal 1296p upscaled to 4K on Series X, 30fps target), these are the equivalent PC settings to Series X:

https://www.eurogamer.net/digitalfoundry-2023-starfield-on-pc-is-the-best-way-to-play-but-the-game-still-requires-a-lot-of-work

A Rog Ally is more like low-medium on all settings at 720p -> 1080p. The Series S is similar to the Rog Ally in terms of features (some better, some worse), but at 900p -> 1440p. 

The Switch 2 likely won't ever get Starfield (not entirely impossible, given Microsoft wants to work with Nintendo more), but I'd assume it would be similar to the Rog Ally if it ever did. 



Soundwave said:

Why would you ever run any game at native resolution on Switch 2 to begin with? There's no point when you have DLSS, undocked really never needs to go above 540p as far as I'm concerned. Yes native looks slightly better, but not good enough that it's worth forcing the system to render 4x the pixels. On top of that DLSS gives you basically a "free" form of anti-aliasing. Running at native + wasting resources on top of that for AA is just brain dead, in fact I would postulate that DLSS implementation is the automatic default for Switch 2 dev kits, the system will be designed to run with that on. 

1) DLSS is proprietary. Nintendo would need a licence to use it on their console, with a potentially corresponding cost; whether Nintendo enables the technology remains to be seen.
2) DLSS would break backwards compatibility if Nintendo's next console (aka Switch 3) doesn't use nVidia technology.
3) DLSS gets its best resolve the more data you feed it, and you will always introduce artifacts.
4) DLSS needs to be implemented by the developer; Nintendo doesn't get a choice in that. Some developers will still prefer FSR or their own internal upscaling algorithms, so DLSS will be on a game-by-game basis.

So the better question is... Why hedge all your bets on DLSS?

DLSS isn't the magic answer for all things performance. It's not going to turn a Playstation 4 into a Playstation 4 Pro; it doesn't work like that.

In the end, games and game engines will still use dynamic resolution scaling, but the UI will continue to sit at native output and look clean and sharp.

Soundwave said:

Especially on a freaking small 7-8 inch display, the regular joe, and even most "game enthusiast" joes, are not really going to know or care that their game is actually only rendering from 540p. Shit, I think you could go even lower than that. 

The most notable improvement going from 720P to 1080P on a 7-8" screen is the reduction in general rendering artifacts like stair-stepping caused by aliasing.

Otherwise, I agree, the difference would be marginal... However, not everything in a game runs at the native output resolution; many rendering aspects are done at half or quarter resolution, like shadows... so those would see marked improvements from running at a higher resolution.

So on a Switch 1, a quarter-resolution shadow resolve in a 480P game would actually be 240P, and you *do* notice that in a game, as the shadows are chunky blobs.

Soundwave said:

Yes you can move sliders around on PC games that have a performance overhead to "match" lower console settings, that doesn't really have anything to do with what I'm saying. I'm saying if the Steam Deck version of Ratchet & Clank runs at 30-40 fps, if Insomniac sat down with a team of 20-30 people who worked on the port for 6-7 months JUST for that one hardware, do I think they could get that up to a solid locked 40 fps and/or maybe even bump the settings from Low to Medium 30 fps ... yes, I do. Optimization does matter.

No one builds games to the metal anymore, so you don't get that "hardware optimization" these days.

When was the last game you remember being built in assembly? ;) I rest my case.

It just doesn't happen due to:
1) Time.
2) Cost.
3) Marginal improvements as compilers these days are extremely good.
4) Compilers can optimize for certain hardware nuances and features.
5) Consoles aren't a single device. Xbox Series S and X, Playstation 5 and potentially a Pro console? Plus supporting the Xbox One and Playstation 4 and PC? Bit of a mess to optimize for.

PC also gets game optimizations; there is a reason why nVidia, AMD and Intel release graphics drivers extremely frequently to optimize games for the PC.

Soundwave said:

Actually I will say 720p to 1440p DLSS does look great. It does look like you are playing something very close to real 1440p. 

It doesn't. Often the temporal information just isn't there for finer details like particles, hair, etc.

The best result you get with DLSS is to run a game at your native resolution, then use DLSS on top to upscale, then downsample back to your native resolution.

Soundwave said:

I've tried it and tested it on a 77-inch display, and on a 27-inch PC monitor that I sit right in front of too; it looks great either way. I have larger displays in my house than "Joe Average" does. 

Then what do we call the smallest display in my home at 85"?

Either way, display size itself is actually irrelevant.

If you are viewing your panel from 20 meters away, then someone with a 40" TV and sitting a meter away will be able to perceive more information.
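That point can be made concrete with a bit of visual-acuity arithmetic. A sketch assuming the conventional ~60 pixels-per-degree limit of human acuity (the threshold is an approximation, and panel dimensions assume 16:9):

```python
import math

# Horizontal pixels of detail the eye can actually resolve from a given seat.
def resolvable_pixels(diag_in: float, res_w: int, res_h: int,
                      distance_m: float, acuity_ppd: float = 60.0) -> int:
    aspect = res_w / res_h
    width_m = diag_in * 0.0254 * aspect / math.hypot(aspect, 1.0)
    angle_deg = math.degrees(2 * math.atan(width_m / (2 * distance_m)))
    # Pixels distinguishable across the panel, capped at the panel itself.
    return min(res_w, round(angle_deg * acuity_ppd))

print(resolvable_pixels(40, 3840, 2160, 1.0))   # ~2870: most of the 4K image
print(resolvable_pixels(85, 3840, 2160, 20.0))  # ~320: the panel is tiny to the eye
```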

Soundwave said:

You have to push DLSS down to about 360p (which is ridiculously low) for the image quality to actually look bad. From 540p it starts to look more than good enough for a small-screen display; 720p to 1440p looks very good; 1080p to 4K looks fantastic. Someone would have to present a pretty compelling case, as far as I'm concerned, for why you would ever really want to render above 540p undocked and 720p-1080p docked on Switch 2. There's no point. Even 1080p is kind of ridiculous; I think 900p DLSS would be more than good enough. You can get very nice image quality from just 1280x720 pixels going to 1440p, and yes, I'm talking about big-screen TVs. 

The quality of the DLSS resolve varies from game to game... and even from scene to scene within a game. There are scenarios where DLSS starts to break up and fall apart, especially in fine-grain details.


I can assure you, no PC gamer with a decent GPU is running games at 720P and using DLSS to take it to 1440P or 4K, because in the end, native is simply better.

Soundwave said:

For actual fucking Nintendo fans, you should be very pleased if the Switch 2 has DLSS technology. It is the real fucking deal, way better than that jaggy shit called FSR 2.0/3.0. You're going to be getting fantastic image quality at a fraction of the pixel budget, so much so that I believe there's no point in rendering natively on the Switch 2 at all. You are in for a treat. 

No guarantees that developers will use DLSS, making your proclamations of DLSS as the best thing since sliced bread ultimately redundant.

Keep in mind there are other upscaling technologies rolling out. We are in the era of AI inferencing.

I would not want to see Nintendo games locked into and tied to a proprietary technology; it's asking for trouble.

Louie said:

Would Switch 2 in that case be able to handle a significant proportion of new third party games released on Xbox Series and PS5 with acceptable visual sacrifices? And would it be able to handle a bigger proportion of third party games than Switch 1 during its run? 

Yes.

Just in the same way the current Switch is able to handle "Xbox One" titles: there are significant cutbacks, but it can do it.

Games can be cut back to extremely bare-bones levels and run on some very antiquated PC hardware. There is actually a community of PC gamers who take the latest and greatest games and mod/tweak them to run on the slowest hardware they can.

For example, take Elder Scrolls Oblivion or Fallout 3, which were Xbox 360 tier titles... PC gamers managed to mod those games and run them on original Xbox-tier hardware (i.e. Pentium 3/GeForce 3).



--::{PC Gaming Master Race}::--

Conina said:
Hiku said:

And these modern portable gaming devices are bigger and heavier than the Switch.

The Steam Deck, for example, weighs nearly twice that of an OLED Switch.

  • 398g = Switch LCD (including JoyCons)
  • 420g = Switch OLED (including JoyCons)
  • 640g = Steam Deck OLED
  • 669g = Steam Deck LCD

So the launch Steam Deck is 68% heavier than the launch Switch.
And the Steam Deck OLED is only 52% heavier than a Switch OLED.

Actually the OLED difference is even a bit less according to my kitchen scale.
My Switch OLED weighs 424g and my Steam Deck OLED 635g (both including a microSD card), so the Steam Deck OLED is only 50% heavier.

Also the Steam Deck is much more comfortable to hold for adults with "normal sized" hands... if I want similar ergonomics, I use my Switch OLED with Hori Split controllers... which adds some weight. My Steam Deck OLED (635g) is only 33% heavier than my Switch OLED with Hori Split controllers (477g).
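For anyone who wants to check the percentages in this exchange, a trivial sketch:

```python
# Weight comparisons quoted above (grams).
weights = {
    "Switch LCD": 398, "Switch OLED": 420,
    "Steam Deck OLED": 640, "Steam Deck LCD": 669,
    "Switch OLED (kitchen scale)": 424,
    "Steam Deck OLED (kitchen scale)": 635,
    "Switch OLED + Hori Split": 477,
}

def pct_heavier(a: str, b: str) -> int:
    return round((weights[a] / weights[b] - 1) * 100)

print(pct_heavier("Steam Deck LCD", "Switch LCD"))    # 68
print(pct_heavier("Steam Deck OLED", "Switch OLED"))  # 52
print(pct_heavier("Steam Deck OLED (kitchen scale)",
                  "Switch OLED (kitchen scale)"))     # 50
print(pct_heavier("Steam Deck OLED (kitchen scale)",
                  "Switch OLED + Hori Split"))        # 33
```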

And with Hori Split Controllers, the size difference is also negligible:

Maybe it's a different model, but I was looking at this article, which has the Steam Deck at 748.4g.

Sizing up the Legion Go vs. Switch, Steam Deck, and ROG Ally | Windows Central

But yeah, weight doesn't necessarily determine how comfortable a device is to hold.

Last edited by Hiku - on 12 January 2024

Soundwave said:
JRPGfan said:

FSR at 1440p -> 4K is almost as good as DLSS.
The area where DLSS shines, compared to FSR, is low-res upscaling,
like 540p -> 1080p etc. (that's where DLSS kicks FSR's arse).

No joke, but at higher resolutions DLSS vs FSR is near identical... especially if you're sitting a bit away from the TV/monitor.
You need to zoom way in and take screenshots/recordings to spot minor differences between the two.

Well the Switch 2 should never be rendering at higher resolutions to begin with. It's a monumental waste. 

720p to 1440p DLSS looks terrific. Most normal gamers sitting on their couch several feet away from even a 65-77 inch display are not going to know any better. 

540-to-1080p DLSS is a laugh. You can approximate this if you have DLSS titles like Alan Wake II or Cyberpunk 2077: set the game to "windowed" mode on your full-size monitor. In my case I was getting a display image of roughly 13 inches on my 4K PC monitor (way larger than the Switch 2's hypothetical 8-inch display, but at least in the same ballpark), and even at 13 inches, toggling back and forth between native 1080p and 540-to-1080p DLSS, it's extremely hard to tell a difference. 

A developer would be utterly stupid IMO to try and brute force native res on a machine like the Switch 2. You're asking the system to push so many more pixels for almost no benefit. If I'm a Switch 2 developer working on even a remotely challenging port for the system, I'm sticking with 540p (1080p DLSS) and 720p (1440p DLSS) as my main resolutions and not a single pixel more. 

A ROG Ally is basically a portable PS5/XSX... like, on no planet can you seriously state that's "just a PS4" when it's running Starfield and Alan Wake II in playable states at only 15 watts. In indoor areas it even climbs above 40 fps in Starfield, and in some areas of AW2 as well. We have portables that can already run PS5/XSX-tier games; we've had them for a while. The Steam Deck is almost 2 years old. 

The bolded part is where we disagree. You seem to think running the same games makes hardware "basically" the same. An RTX 2060 runs anything that an RTX 4090 does, but saying the RTX 2060 is "basically" an RTX 4090 is nonsense. I mean, 1,920 CUDA cores versus 16,384 is a slaughter.

Likely we need to agree to disagree. But I think "running the same games" is a terrible metric for hardware power. We live in an age where game engines are scalable: a wide variety of grossly different hardware can run the same game, just at massively different levels of fidelity.

Last edited by Chrkeller - on 13 January 2024

Chrkeller said:

The bolded part is where we disagree. You seem to think running the same games makes hardware "basically" the same. An RTX 2060 runs anything that an RTX 4090 does, but saying the RTX 2060 is "basically" an RTX 4090 is nonsense. I mean, 1,920 CUDA cores versus 16,384 is a slaughter.

Likely we need to agree to disagree. But I think "running the same games" is a terrible metric for hardware power. We live in an age where game engines are scalable: a wide variety of grossly different hardware can run the same game, just at massively different levels of fidelity.

Anyone who thinks a Rog Ally is a portable PS5 is living in an alternate reality.

Considering you -can- run Starfield on a Core i7 2700K and a Radeon HD 7850... which is basically what a base Playstation 4 is...

And still get 40fps on low at 1080P... Does that mean a Playstation 4 is basically a Playstation 5?
https://www.gpucheck.com/game-gpu/starfield/amd-radeon-hd-7850/intel-core-i7-2700k-3-50ghz/ultra

You can also do 30fps in Cyberpunk.
https://www.youtube.com/watch?app=desktop&v=5hV2I6bcWyQ

Or how about Hogwarts Legacy?
https://www.youtube.com/watch?app=desktop&v=fgmjlVeQKaI

Keep in mind the PS4 class GPU in these tests.

What this showcases is that games running on a certain "set" of hardware aren't representative of another set of hardware; rather, it showcases how scalable games are these days, where they will scale from a mid-range 10+ year old GPU to a top-of-the-line 2024 GPU.



--::{PC Gaming Master Race}::--

Pemalite said:
Chrkeller said:

The bolded part is where we disagree. You seem to think running the same games makes hardware "basically" the same. An RTX 2060 runs anything that an RTX 4090 does, but saying the RTX 2060 is "basically" an RTX 4090 is nonsense. I mean, 1,920 CUDA cores versus 16,384 is a slaughter.

Likely we need to agree to disagree. But I think "running the same games" is a terrible metric for hardware power. We live in an age where game engines are scalable: a wide variety of grossly different hardware can run the same game, just at massively different levels of fidelity.

Anyone who thinks a Rog Ally is a portable PS5 is living in an alternate reality.

Considering you -can- run Starfield on a Core i7 2700K and a Radeon HD 7850... which is basically what a base Playstation 4 is...

And still get 40fps on low at 1080P... Does that mean a Playstation 4 is basically a Playstation 5?
https://www.gpucheck.com/game-gpu/starfield/amd-radeon-hd-7850/intel-core-i7-2700k-3-50ghz/ultra

You can also do 30fps in Cyberpunk.
https://www.youtube.com/watch?app=desktop&v=5hV2I6bcWyQ

Or how about Hogwarts Legacy?
https://www.youtube.com/watch?app=desktop&v=fgmjlVeQKaI

Keep in mind the PS4 class GPU in these tests.

What this showcases is that games running on a certain "set" of hardware aren't representative of another set of hardware; rather, it showcases how scalable games are these days, where they will scale from a mid-range 10+ year old GPU to a top-of-the-line 2024 GPU.

Exactly. Well said and thanks for the links.  

My experience comes from comparing my laptop (i5 CPU, RTX 3050 with 6 GB VRAM, 16 GB RAM) to my tower (i7 CPU, RTX 4070 with 12 GB VRAM, 32 GB RAM)... yeah, they both run the same games. No, they are not the same tier. 

Like you said, games scale easily, but my tower decimates my laptop.