haxxiy said:

The (undocked) Switch is a mobile GPU from 2015 running at 160 MHz, consuming no more than 4-5 watts.

307.2 MHz is the lowest clock the Switch's GPU operates at.
17.7 W is actually the highest measured power consumption, and that includes charging the battery.

11 W is your typical whole-system power consumption...

https://www.anandtech.com/show/11181/a-look-at-nintendo-switch-power-consumption

8-10 W is likely what the SoC itself is drawing, depending on clock rates. (I.e., not all games run the GPU at 307.2 MHz!)

Power consumption is a difficult thing to quantify, but the GPU certainly doesn't operate at that clock speed... and the SoC certainly does draw more than 4-5 watts depending on clock rate. A rough breakdown is sketched below.
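For what it's worth, here's a minimal back-of-the-envelope sketch of how an ~11 W system figure and an 8-10 W SoC estimate can coexist. Every component figure below is an assumption for illustration, not a measurement from the AnandTech article:

# Subtract assumed non-SoC loads from the measured whole-system draw.
# All component figures are illustrative guesses, not measurements.
system_power_w = 11.0   # typical whole-system draw cited above
display_w      = 1.0    # assumed screen draw (portable mode)
wifi_misc_w    = 0.5    # assumed Wi-Fi, storage, audio, etc.
fan_w          = 0.3    # assumed cooling fan
vrm_loss_w     = 1.0    # assumed power-conversion losses

soc_estimate_w = system_power_w - (display_w + wifi_misc_w + fan_w + vrm_loss_w)
print(f"Estimated SoC draw: ~{soc_estimate_w:.1f} W")  # ~8.2 W, in the 8-10 W range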

haxxiy said:

The power consumption improvements from node to node continue to diminish, and that's already factoring in the fact that these chips have considerably lower transistor densities than the node would theoretically allow (meaning leakage is a significant issue even at standard temperatures, so these chips need better electron flow).

Well, no. Nodes don't work like that... Leakage isn't a significant issue on 16/14 nm FinFET anyway, let alone on the 20 nm planar process the base Switch used.
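To put rough numbers on that: CMOS power splits into a switching term and a leakage term, roughly P = a*C*V^2*f + V*I_leak. A toy sketch with invented figures (the leakage currents below are assumptions, not real foundry data) shows why FinFET's lower leakage makes the static term a small slice of the total:

# Toy CMOS power decomposition. Constants are invented for illustration
# only; they are not real 20 nm / 16 nm process parameters.
def power_split(c_eff, volts, freq_hz, i_leak, activity=0.2):
    dynamic = activity * c_eff * volts**2 * freq_hz  # a * C * V^2 * f
    static  = volts * i_leak                         # V * I_leak
    return dynamic, static

for label, i_leak in [("20nm planar (assumed)", 0.5), ("16nm FinFET (assumed)", 0.1)]:
    dyn, stat = power_split(c_eff=2e-8, volts=0.9, freq_hz=1e9, i_leak=i_leak)
    print(f"{label}: dynamic {dyn:.2f} W, leakage {stat:.2f} W "
          f"({stat / (dyn + stat):.0%} of total)")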

haxxiy said:

That's a 30x gap in power consumption which can't be solved by any amount of underclocking, lower resolution and upcoming nodes / architectures. Certainly not on the scale of a console generation or two, at least. As you mention, of course, a more advanced portable console than the Switch, with its own SoC, could potentially receive downgraded ports from the next generation, as the Switch received ports this generation.

Generally correct. A design needs to be built with low power consumption in mind, as the chip needs an optimal layout to achieve certain power/clock targets.

In saying that... every design has an "efficiency curve". For example, Vega 64 was generally regarded as a power hog, but underclock and undervolt that chip and it actually offers really good performance per watt. AMD decided to throw efficiency out the window in order to drive clock rates home for more performance, trying to keep up with NVIDIA.
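A toy sketch of that curve, using invented clock/voltage points (not real Vega 64 telemetry), shows why backing off the top of the curve improves performance per watt:

# Performance scales roughly linearly with clock, while dynamic power
# scales with V^2 * f, and the top clocks need disproportionate voltage.
# All operating points and the capacitance constant are invented.
C_EFF = 1.5e-7  # arbitrary effective-capacitance constant (assumed)

points = [
    ("stock",        1536, 1.20),  # (label, clock MHz, volts) - assumed
    ("undervolted",  1400, 1.00),
    ("underclocked", 1200, 0.90),
]
for label, mhz, volts in points:
    perf  = mhz                           # proxy: performance ~ clock
    power = C_EFF * volts**2 * mhz * 1e6  # ~ C * V^2 * f
    print(f"{label:>12}: {power:5.0f} W, perf/W = {perf / power:.2f}")

With these made-up numbers, the undervolted and underclocked points both beat stock on perf/W by a wide margin, which is exactly the trade-off AMD made in the other direction.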

haxxiy said:

But I don't see Sony indulging in such an adventure considering its costs. Especially when streaming from either console or server to a portable screen will be a feasible, "good enough" alternative solution.

Like the Wii U?

haxxiy said:

1 - There is no magic sauce. In all likelihood, GPUs releasing the same year will match or significantly exceed its performance-per-watt ratio, since PC suppliers work with better-binned parts than console manufacturers. This has happened every generation, by the way.

Navi will likely be replaced by the time next-gen lands on PC... and it will likely end up being the successor to AMD's Vega 7 (Radeon VII).

Alby_da_Wolf said:

All the aforementioned points work in favour of a PS5 Hybrid. The only users that could be moderately disappointed are the graphics whores, but the most die-hard ones would only find high-end PC graphics acceptable anyway.

"Graphics Whores" (I am one) Are actually a sizable part of the market, Sony and Microsoft even recognized that and released mid-generation console updates this generation. - The PC has also thrived thanks to it's constant technical superiority as well... So I wouldn't underestimate that demographic.



--::{PC Gaming Master Race}::--