Soundwave said:

It may use less power from the wall but it's not as if magically a 20 watt SoC is going to become a 8 watt SoC.

You sure about that? Decent PSUs in the PC space typically top out at around 80% efficiency.

Cheaper power supplies can be anywhere from 50-60% efficient... and that's before getting into efficiency curves, temperatures, ripple, etc.

So a 20w draw at the wall could mean only 16w actually reaching the hardware at 80% efficiency, or 10w if the PSU is only 50% efficient. Still think it's insignificant?

Electrical Engineering 101.
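Just to make the arithmetic explicit, here's a minimal sketch, assuming a flat efficiency figure (real PSUs vary with load and temperature) and treating Soundwave's 20w as the wall-side reading:

```python
# Minimal sketch: power actually delivered to the components for a given
# wall-side reading, assuming a flat PSU efficiency (real curves vary).
def delivered_power(wall_watts: float, efficiency: float) -> float:
    return wall_watts * efficiency

for eff in (0.80, 0.60, 0.50):
    print(f"20w at the wall, {eff:.0%} efficient PSU -> "
          f"{delivered_power(20, eff):.0f}w at the components")
# 80% -> 16w, 60% -> 12w, 50% -> 10w
```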

Soundwave said:


There's a reason why most tablet makers don't use a Tegra X1 and Nvidia was having problems finding vendors for the chip ... it's too beastly of a chip and consumes way too much power. That makes it better suited for the tasks of being a game machine than a tablet.

Nvidia is also expensive. It's not just power consumption that OEMs need to contend with.

Plus, most OEMs have their own SoCs.
Apple builds its own, Samsung builds its own (and together they are a massive share of the entire market), Huawei builds its own, etc.
Then you have Qualcomm for OEMs who want a high-end chip but don't build SoCs themselves, like Microsoft, HTC, etc.
MediaTek and Allwinner target the more budget-oriented devices.

Intel tried to crack the market with its x86 Atom chips and, other than a couple of design wins, it was a colossal disappointment. Everyone had deals and contracts in place, so even where Intel was faster, cheaper, and used less power, OEMs went with what they knew.
Nvidia is in the same predicament.
Nvidia did gain relevance at a few points, such as with Tegra 2 and 3, but that was mostly down to Nvidia pricing low to shift inventory.

Plus, Nvidia's CPU performance is pretty terrible in the ARM space, all things considered. Their A57 + A53 big.LITTLE configuration was a dime a dozen, and thus not attractive for a high-end handset.
Thankfully the Big.Super core layout in their latest Tegra is turning that on its head.

Soundwave said:

Running the Unreal Engine 4 demo, Nvidia themselves said the GPU alone consumes 10 watts of electricity. Then you have to factor in the RAM, CPU, WiFi, LCD screen, active fan, etc. all needing power too.

The GPU block itself uses about 1.5w of power. I've already linked you to AnandTech's testing showing this.

Soundwave said:

You're probably well past 15 watts ... 15 watts is a ridiculous power consumption for a mobile device.

But the Switch wouldn't be using 15 watts. Or 20 watts. Are you forgetting about everything else?

Soundwave said:

Nvidia is also somewhat loose in how they define power consumption, they admit for example that even the older Tegra K1 can spike to over 10 watts of consumption itself when pushed to its higher peaks. Don't believe all the hype on power consumption with these chips, if they were really getting like 50 GFLOPS per watt, Sony and MS would be using them for their consoles too as that performance is way better the console GPUs they're using. My guess is in reality peak performance and electrical usage is not close to the ideal scenarios companies like Nvidia and PowerVR like to tout.


Uh. You do know that the GeForce GTX 1080 is an 8873 Gflop (boost) GPU, right? It has a 180w TDP. (And uses close to that in power.)

8873 / 180 = ~49.3 Gflops per watt for Pascal.

The Radeon Pro Duo has 16384 Gflops of total performance with a 350w TDP, for ~46.8 Gflops per watt.
The Radeon R9 Nano is 8192 Gflops at a 175w TDP, for ~46.8 Gflops per watt.
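Worked the same way for all three cards, as a quick sketch (theoretical boost-clock Gflops over TDP; sustained clocks and actual board power will obviously differ):

```python
# Theoretical single-precision Gflops per watt of TDP for each card.
cards = {
    "GeForce GTX 1080": (8873, 180),
    "Radeon Pro Duo":   (16384, 350),
    "Radeon R9 Nano":   (8192, 175),
}
for name, (gflops, tdp) in cards.items():
    print(f"{name}: {gflops / tdp:.1f} Gflops per watt")
# GTX 1080 -> 49.3, Pro Duo -> 46.8, R9 Nano -> 46.8
```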

But here is the kicker.
AMD stated that on the Radeon R9 290X, the GDDR5 memory was responsible for 15-20% of the card's total power consumption, roughly 50 watts, so the dedicated DRAM on a GPU consumes a lot of power.

So Pascal actually *DOES* exceed 50 Gflops of theoretical single-precision floating point performance per watt if you discard the dedicated RAM for the GPU.
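As a rough sketch of that adjustment, applying AMD's 15-20% memory share from the R9 290X to the 1080's 180w TDP (the 1080 uses GDDR5X, so this is a ballpark assumption, not a measured figure):

```python
# Ballpark: GPU-core-only Gflops per watt once the memory's assumed share
# of board power (15-20%, borrowed from AMD's R9 290X figure) is excluded.
gflops, tdp = 8873, 180
for mem_share in (0.15, 0.20):
    core_watts = tdp * (1 - mem_share)
    print(f"memory at {mem_share:.0%}: {gflops / core_watts:.1f} Gflops per watt")
# ~58.0 at 15%, ~61.6 at 20% -- comfortably past 50
```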

Not that flops actually mean much in the real world; a GPU with a far bigger theoretical flops number can still end up slower in actual workloads. But it's interesting blowing your assertion into a billion pieces.


Soundwave said:

They know they can get away with it because most people only use their tablet for low level gaming and web browsing ... Switch does not have that luxury, it needs to be able to process relatively high end 3D graphics and that's where performance comes crashing down to the reality of what's possible with a 6000 MaH battery. 

Even if you could put a ridiculous sized 12,000 MaH battery and made this thing a giant device, there are overheating concerns too, this thing has to be held by little 8-year-olds, it can't be pushing itself to the peak and getting hot to the touch. This is also likely why Nintendo does not allow the system to be held in docked mode, really if they wanted just TV streaming a cheap device ala the Google Chromecast type thing would've worked fine. They don't want kids holding the device when its running full tilt because it gets hot IMO.

 


The facts say otherwise. The technology exists, today, for better performance than the Switch in a handheld.
All I have to do is whisper seductively "16nm FinFET" to prove my point, end of discussion.

Nintendo cheaped out, plain and simple. They cheaped out on the Wii's hardware, the Wii U's hardware, and the 3DS's hardware; this is just a continuation of that trend.
Nintendo doesn't like to use capital-intensive high-end hardware, and that's fine for some, but I am a hardware enthusiast who tends to expect more for the dollarydoos being placed down on the table.

bdbdbd said:

Also, the wireless controllers are going to consume power, as well as possible 3/4G and Wi-Fi. Hardwarewise the tablets and smartphones should be more simple devices than Switch.

The controllers have their own batteries. Otherwise you wouldn't be able to undock them from the Switch.



--::{PC Gaming Master Race}::--