Pemalite said:
EricHiggin said:

If I'm understanding this correctly, you're saying a 20w power supply connected to a device with a 20w-rated SoC wouldn't actually allow it to pull 20w, but only 16w due to 80% efficiency? That is mostly incorrect, though you do have the efficiency portion of it right.

In the case of Switch, if you have a power supply that is rated for 20w, that means it can output a maximum of 20w of DC power. DC is what electronics use. Now, since power supplies are not 100% efficient, and the best you can get is around 80%, that 20w power supply actually has an input rating of 25w AC. The power lines and all buildings carry AC power. (I know some have DC, but it's so new and minuscule, let's not cause confusion.)

This is due to the law of conservation of energy, the first law of thermodynamics. When AC power from the wall is converted into DC power for the electronic device, some energy is lost as heat. This is why power supplies used to run so much hotter than they do now: old power supplies were around 50% efficient, so a large amount of heat was created. Today's power supplies are around 80% efficient, sometimes higher, so they create much less heat.
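The arithmetic above can be sketched in a few lines (a hypothetical helper, just illustrating the efficiency figures in this post):

```python
def ac_input_watts(dc_output_watts, efficiency):
    """AC power drawn at the wall for a given DC output.

    Conservation of energy: input = output + losses (heat),
    so input = output / efficiency.
    """
    return dc_output_watts / efficiency

# A 20w-rated (DC output) supply at 80% efficiency draws 25w AC:
print(ac_input_watts(20, 0.80))  # → 25.0

# An old ~50% efficient supply would waste as much as it delivers:
print(ac_input_watts(20, 0.50))  # → 40.0
```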

Now for the battery: if the DC voltage of the battery doesn't match the voltage some of the electronics in the device need, it has to be converted. But DC-to-DC conversion is done quite differently from AC-to-DC, and to keep it short, the losses are quite minimal. All of this is of course calculated when designing the system, so supplying enough power, whether through the power supply or the battery, will feed the device whatever it requires.

If I'm way off base, and this isn't what you meant, I apologize in advance.

I am well aware of the laws of conservation and thermodynamics. (Energy can neither be created nor destroyed, only transferred or transformed.)

And FYI. We likely have our wires crossed.

You cannot pull 25w at the wall, because it's only pulling 20w; that is what it was recorded at.
And you can't consume 20w of power at the SoC level if you are consuming 20w of power at the wall, due to energy losses.

bdbdbd said:

I understand your point to an extent. But what makes you think Nintendo isn't pushing boundaries of the tech?

The 6.2", 720P screen.

bdbdbd said:

And what did the consumer win by MS boosting the clockrate and freeing DRAM? Umm... Nothing, I believe.

Better graphics, better framerates.

bdbdbd said:
I really don't know what it is that the Switch should be competitive with, as the device is so different from anything currently on the market.

It should be competitive with everything on the market. That is what it's competing against.

ElPresidente7 said:
I really don't get people's obsession with the architecture.
The main difference between Pascal and Maxwell is just the fabrication process. While Maxwell was fabbed on 28nm, Switch will likely be 20nm. From an architectural point of view, Pascal and Maxwell are rather similar.

Tegra Maxwell is 20nm Planar.
Tegra Pascal is 16nm Finfet. (Basically 20nm with Finfet.)
Tegra Volta is 16nm Finfet and doubles Tegra Pascal.

20nm Planar is not as energy efficient as 16nm/14nm Finfet, which could have allowed for more performance and lower power consumption, and better than just a couple of hours of battery life.

nomad said:

You've said the GPU has more TDP headroom. No, it does not.

The test was conducted with an underclocked GPU. The test was not to show how much the GPU consumes, but how efficient it can be. It shows that the X1 GPU's performance at 1.5W is comparable to the iPad Air 2's (A8X) GPU, which runs at 2.5W.

Exactly. And that is the point I'm trying to make.

And the way the GPU can get more TDP headroom is by reducing the CPU clock rate. (Which comes with a reduction in voltage, which has a direct relationship with power consumption.)
Remember: the more power-hungry the CPU you use is, the less power you can reserve for the GPU, and vice versa, if you wish to stay within a certain power limit.
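That trade-off can be sketched with made-up numbers (the power figures below are purely illustrative, not Switch measurements):

```python
def gpu_budget_watts(total_soc_watts, cpu_watts):
    """Within a fixed SoC power limit, whatever the CPU doesn't
    draw is headroom the GPU can use, and vice versa."""
    return total_soc_watts - cpu_watts

# Illustrative only: assume a 10w SoC power limit.
print(gpu_budget_watts(10.0, 4.0))  # lower CPU clocks → 6.0w left for the GPU
print(gpu_budget_watts(10.0, 6.0))  # higher CPU clocks → 4.0w left for the GPU
```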

Soundwave said:

Even at 5 watts, that's 100 GFLOPS/watt, if Nvidia was actually achieving that, everyone would be using Nvidia. 

You could have a 1.4 teraflop machine for just 14 watts, the XBox One die shrunk to 14nm/16nm (even smaller than the Tegra X1's 20nm) still consumes over 60 watts ... there's no way Nvidia has such magic up their sleeves, especially with a chip that is about two years old. 

Generation 1 Graphics Core Next is inefficient, and was never optimized for the 14/16nm processes.
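The arithmetic in the quote above can be checked directly (the ~500 GFLOPS figure is the one the quote implies, not an independent measurement):

```python
def gflops_per_watt(gflops, watts):
    """Simple efficiency figure of merit: throughput per watt."""
    return gflops / watts

# The quote assumes ~500 GFLOPS at 5 watts:
print(gflops_per_watt(500, 5))  # → 100.0 GFLOPS/watt

# At that efficiency, 1400 GFLOPS (1.4 teraflops) would need:
print(1400 / gflops_per_watt(500, 5))  # → 14.0 watts
```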

Pavolink said:
*raises hand*
Question: is it possible for the Switch to be a little more powerful than the Wii U, but have noticeably better effects, like particles, lighting, even textures, that will make games look a lot better than Wii U games?

There is absolutely no doubt that the Switch is superior to the Wii U, even if it had fewer flops.
The only question that remains is... by how much.

Conversely, even after we discovered the clock rates, the device's performance position, between the Wii U and Xbox One, hasn't actually changed; it is just closer to the Wii U now.

Expect to see improvements in every aspect of a game... With the biggest benefits being to Texturing (Thanks to better compression, more DRAM.) and geometry. (Thanks to the superior Polymorph engines.)

ElPresidente7 said:

Again .. Pascal or Maxwell does not really make the difference (see my previous comment). Please just stop using these buzzwords if you have no idea what is behind them.

Pascal did bring improvements over Maxwell, although it's more evolutionary than revolutionary... But in something as small as Tegra, things like 4th-gen Delta Colour Compression can mean some nice gains when you are only dealing with 20-25GB/s of bandwidth.
nVidia pegs the gains at around 20%.
Pascal also has better Async Compute performance.
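To put that ~20% figure in context, here's a rough sketch of what a compression gain does to effective bandwidth (reading the 20% as an effective-bandwidth uplift, which is an assumption on my part):

```python
def effective_bandwidth_gbs(raw_gbs, compression_gain):
    """Colour compression reduces memory traffic, so the bus
    behaves as if it were roughly (1 + gain) times wider."""
    return raw_gbs * (1 + compression_gain)

# 25GB/s of raw bandwidth with a ~20% compression gain
# behaves roughly like a 30GB/s bus:
print(effective_bandwidth_gbs(25.0, 0.20))  # ≈ 30 GB/s
```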

AsGryffynn said:

Just to be clear, when docked, it should be at least twice as fast as the Wii U, so it will do 1080p...

To be clear: the PlayStation 2, GameCube and original Xbox *could* do 1080P rendering, even if they couldn't technically output it.

It is completely up to the developers... If you have developers pushing graphical fidelity or only building for the Portable mode, then the Switch isn't likely to hit 1080P.

The Switch will do 1080P, if only because the Wii U did. Whether devs use it or not remains up to them...

It was Britain, it is America, tomorrow France and next year, the world... 

Warning: This poster has a very negative opinion of Sony and Nintendo, Idea Factory and companies Tecmo Koei, EA, BioWare, Blizzard, Treyarch, Infinity Ward, Kadokawa and Sega. If you have very positive views of these and a negative view of Microsoft or Bethesda Game Studios, AVOID ENGAGEMENT AT ALL COSTS!