Pemalite said:
Soundwave said:

Uh what? 

Both the Switch and the Shield console run at 768 MHz. Nintendo just opted to lop the 1 GHz off because it was useless; the chip downclocks to 768 MHz after a few minutes of any effective gameplay, so it isn't really a 1 GHz chip to begin with.

Sorry. Hadn't had enough coffee so didn't clarify what I meant in the best way possible.

The Switch's GPU runs at 307.2 MHz.
It only runs at 768 MHz when:
1) A developer opts to use it,
AND
2) The console is docked.

Same goes for its memory bandwidth. You lose a massive amount of bandwidth unless a developer opts to enable the higher memory clock.

Ergo, it would not be far-fetched to say that most games will likely use the slower 307.2 MHz mode, less than half the speed of the Shield TV, even while docked. It is up to the developers, and not all of them are proactive in supporting all of a system's various nuances.
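To put that clock gap in numbers, here's a quick back-of-the-envelope sketch (assuming, as a rough rule, that GPU throughput on the same chip scales linearly with clock):

    # Relative throughput of the Switch's two GPU clock modes, assuming
    # throughput scales roughly linearly with clock on the same chip.
    portable_clock_mhz = 307.2  # default clock quoted above
    docked_clock_mhz = 768.0    # opt-in docked clock
    print(f"{portable_clock_mhz / docked_clock_mhz:.0%}")  # -> 40%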

******
Plus the Shield has a CPU that is twice as fast, running at a full 2 GHz rather than the Switch's anemic 1 GHz.
That costs power, don'cha know? The Shield TV is more powerful than the Switch; again, not an apples-to-apples comparison.
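As a rough rule (a textbook approximation, not a measurement of either chip), CMOS dynamic power scales with clock and with the square of voltage, and higher clocks usually demand higher voltage, so a 2 GHz CPU costs a lot more than 2x the power of a 1 GHz one. A toy sketch with made-up voltages:

    # Toy sketch: CMOS dynamic power scales as P ~ C * V^2 * f.
    # The voltages below are illustrative assumptions, not measured values.
    def relative_power(freq_ghz, volts, base_freq_ghz=1.0, base_volts=0.8):
        return (freq_ghz / base_freq_ghz) * (volts / base_volts) ** 2

    print(relative_power(2.0, 1.0))  # ~3.1x the power of 1 GHz at 0.8 V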


Soundwave said:
If you're saying the undocked mode is less powerful .... well no shit. It has to run on a battery. The Shield console plugged into wall power consumes 19 watts at max load ... do you have any idea the size of the battery you would need to run such a chip for even 3 hours?


You are asking me this question? Did you forget the math and proof I provided in the other thread already? ;)

Soundwave said:
Of course the mobile version downclocks; how the fuck would it run for even an hour otherwise? 19 watts for 3 hours = a 57 Wh battery would be required for even 3 hours of play. No tablet, not even the giant-sized 12-inch iPad, has a battery that large.
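(For reference, the arithmetic in that quote, plus what 57 Wh means in the mAh ratings phone and tablet batteries usually carry; the ~3.7 V nominal lithium-cell voltage is an assumption, not a spec from either device:)

    # Checking the quoted battery math; 3.7 V nominal cell voltage
    # is a typical lithium-ion figure, assumed here for the conversion.
    energy_wh = 19.0 * 3.0                 # 57 Wh for 3 hours at 19 W
    capacity_mah = energy_wh / 3.7 * 1000  # ~15,400 mAh
    print(energy_wh, round(capacity_mah))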


Because there is more to an SoC than just the GPU. *yawn*
Remember the power consumption test that AnandTech did on *just* the Tegra's GPU, where it only consumed 1.5 W? Yeah. Thought so.

As for the iPad: we have been over this. It is not an accurate representation of the Switch or the Tegra; it is a completely different beast entirely. That horse has been beaten to death already.

The Tegra X1 alone uses 10 watts, sometimes more. There's no way it's drawing only 1.5 watts doing anything other than playing SNES Virtual Console.

There's nothing Nintendo could do to change any of that either. 

Also, other high-end mobile GPUs throttle like crazy too ... the Apple A10, which is basically the highest-end mobile SoC from anyone else, throttles a lot:

http://wccftech.com/apple-a10-fusion-gpu-breakdown/

The A10 uses a souped-up GT7600, which PowerVR rates at roughly Xbox 360 strength, though even that is probably misleading because it throttles a lot. It only hits that performance envelope under absolutely ideal conditions and can only hold it for minutes before it has to clock down.

Performance well beyond a PS3/360, *sustained* without throttling, in a mobile chip running on battery power simply isn't easily doable today; you would need a form factor closer to a laptop, with a battery well over 10,000 mAh and a large active fan.

The Snapdragon 820 is the other big mobile chip, and it is basically a wash with the Tegra X1:

http://wccftech.com/snapdragon-820-benchmarks/

There is no mythical mobile chip right now that pushes PS4/XB1 graphics. It doesn't exist, sorry. Even Nvidia's claims for the Tegra X1 are somewhat dubious .... it's not really a 500 GFLOPS processor, since it can only hold that performance for maybe 8-10 minutes max. It's a 384 GFLOPS GPU at best, which is better than an Xbox 360, but to hit even that it still needs to gobble down 10+ watts, so the entire SoC is approaching 20 watts total, which is a monstrous amount of electricity for a mobile chip.
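For what it's worth, those FP32 figures fall out of a simple formula: shader cores x 2 ops per clock (fused multiply-add) x clock speed. A sketch assuming the Tegra X1's commonly cited 256 CUDA cores, which lands near the ballpark numbers above:

    # FP32 throughput sketch: GFLOPS = cores * 2 (FMA) * clock in GHz.
    # 256 CUDA cores is the commonly cited count for the Tegra X1.
    def gflops_fp32(cores, clock_ghz):
        return cores * 2 * clock_ghz

    print(gflops_fp32(256, 1.0))    # 512 GFLOPS at the full 1 GHz clock
    print(gflops_fp32(256, 0.768))  # ~393 GFLOPS at a sustained 768 MHz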

The chip has to be downclocked to 307.2 MHz in portable mode; why would you think it's remotely possible to have it run at full clock there? You would get about 1 hour of battery life even with a relatively large-ish battery (say 5,000-6,000 mAh).
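The battery-life math behind that, as a rough sketch (the ~3.7 V nominal cell voltage is an assumption; the ~20 W full-clock draw is the figure from above):

    # Rough runtime estimate at full clock. 3.7 V nominal cell voltage
    # is assumed; ~20 W total SoC draw is the figure quoted above.
    battery_wh = 6000 * 3.7 / 1000  # 6,000 mAh pack -> 22.2 Wh
    print(battery_wh / 20.0)        # ~1.1 hours of play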