
Tegra X1 (Switch GPU?) Throttles To 768 MHz (Eurogamer Correct Again?)

Soundwave said:
nomad said:

You've said the GPU has more TDP headroom. No, it does not.

That 1.5W figure is a misinterpretation of the data. The test was conducted with an underclocked GPU; it was not meant to show how much the GPU consumes, but how efficient it can be. It shows that the X1 GPU's performance at 1.5W is comparable to the iPad Air 2's (A8X) GPU, which runs at 2.5W. It's unbelievable for a device that regularly hovers at ~15W under load to have a GPU drawing only 1.5W when that GPU takes up a significant portion of the SoC's die area. The A57s are power hungry (which explains the Switch's heavy downclock), but not that hungry. The X1 GPU regularly performs at over 2x the A8X in graphics benchmarks, so, assuming power scales linearly with performance (it doesn't), that would be 3W; since the scaling is nonlinear, let's say 4W. Even then, I think that's conservative. I wouldn't be surprised if it was >5W at full clock speed.
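Worked through with the post's own assumptions (the 2x performance multiple and the nonlinear bump are the poster's estimates, not measurements):

\[
P_{\text{linear}} \approx 1.5\,\text{W} \times 2 = 3\,\text{W}, \qquad P_{\text{actual}} > 3\,\text{W} \;\Rightarrow\; \sim 4\text{-}5\,\text{W at full clocks,}
\]

since power grows faster than linearly with performance once voltage has to rise along with frequency.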

 

Even at 5 watts, that's 100 GFLOPS per watt. If Nvidia were actually achieving that, everyone would be using Nvidia.

You could have a 1.4 teraflop machine for just 14 watts. The Xbox One, die-shrunk to 14nm/16nm (even smaller than the Tegra X1's 20nm), still consumes over 60 watts... there's no way Nvidia has that kind of magic up their sleeves, especially with a chip that is about two years old.
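For reference, the arithmetic behind those figures (taking the X1's ~512 FP32 GFLOPS at its full ~1GHz clock as the assumed baseline):

\[
\frac{512\ \text{GFLOPS}}{5\ \text{W}} \approx 102\ \text{GFLOPS/W}, \qquad \frac{1400\ \text{GFLOPS}}{100\ \text{GFLOPS/W}} = 14\ \text{W}.
\]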

No, this is incorrect. Dynamic power consumption rises superlinearly with clock frequency, because higher clocks also demand higher voltage. A good GFLOPS/watt ratio is very much possible with a very low-clocked GPU or CPU.
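In the standard first-order CMOS approximation (treating voltage as roughly proportional to frequency within the DVFS range, which is an assumption):

\[
P_{\text{dyn}} \approx C\,V^2 f, \qquad V \propto f \;\Rightarrow\; P_{\text{dyn}} \propto f^3,
\]

so halving the clock cuts dynamic power far more than it cuts FLOPS, which is exactly why performance-per-watt looks so good at low clocks.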

Oh, and by the way: Mdave also showed that the CPU throttles the GPU and vice versa. His tests also give a good reason for the 1GHz clock rate of the CPU. It seems that Nintendo simply fixed the CPU and GPU clocks at a reasonable frequency that the system can maintain for a long time without throttling.

If you read the thread, it seems we are still quite a bit away from having the whole picture of what's going on inside the Switch. Nintendo typically does quite a bit of customization to its chips. Even a 16nm process isn't out of the question, and that would give room for quite a bit more sauce.




*raises hand*
Question: is it possible for the Switch to be a little more powerful than the Wii U, but have noticeably better effects, like particles, lighting, even textures, that will make games look a lot better than Wii U games?



Proud to be the first cool Nintendo fan ever

Number ONE Zelda fan in the Universe

DKCTF didn't move consoles

Prediction: No Zelda HD for Wii U, quietly moved to the successor

Predictions for Nintendo NX and Mobile


Pavolink said:
*raises hand*
Question: is it possible for the Switch to be a little more powerful than the Wii U, but have noticeably better effects, like particles, lighting, even textures, that will make games look a lot better than Wii U games?

Of course everything shares the same processing power, unless you have extra processors handling those effects. However, a lighter OS (in terms of background tasks) and not needing to run two screens simultaneously would free up some resources for effects. You could also stream textures from the card. Working around bottlenecks would improve performance as well.



Not in Chinese.

Nor in Japanese.

But we'll step it out in style, the Korean way.

 

Nintendo games sell only on Nintendo systems.

Miyamotoo said:
Scisca said:

Let's be serious, everything they've said has been agreed upon and Nintendo takes full responsibility for it.

The Switch chip is Nvidia's chip and they talked about it, while, like I wrote: "Nintendo still hasn't said anything about the power, tech, architecture, or capabilities of the Switch". The clear fact is that Nvidia said that, not Nintendo, so you can't in any case say that Nintendo lied about it when they still haven't said anything.

If spinning it that way makes you feel better. But there are such things as NDAs, so if NV said this, it means Nintendo allowed them to say it. Either way, it'd be a disappointment and would make me think really hard about the purchase, most probably making me wait for a redesign. If the Switch is based on Pascal, I'm getting it as soon as I can get my hands on it.



Wii U is a GCN 2 - I called it months before the release!

My Vita to-buy list: The Walking Dead, Persona 4 Golden, Need for Speed: Most Wanted, TearAway, Ys: Memories of Celceta, Muramasa: The Demon Blade, History: Legends of War, FIFA 13, Final Fantasy HD X, X-2, Worms Revolution Extreme, The Amazing Spiderman, Batman: Arkham Origins Blackgate - too many no-gaemz :/

My consoles: PS2 Slim, PS3 Slim 320 GB, PSV 32 GB, Wii, DSi.

Pavolink said:
*rises hand*
Question: is it posible for the Switch to be a little more powerful than the Wii U, but having noticeable better effects, like particles, lighting, even textures, that will make games look a lot better than Wii U games?

The most likely scenario seems to be that the Switch is very similar in power to Nvidia's Shield. That is already quite a bit above the Wii U, even undocked.

It is possible (though not likely) that Nintendo chose a better fabrication process (16nm). In that case the GPU would produce less heat, and there would be room for quite a bit more, like an additional SM. The Switch GPU would then have about 500 GFLOPS and therefore be quite potent.
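For context, the usual FP32 throughput formula, counting a fused multiply-add as two FLOPS (the 768MHz figure is the rumoured docked clock, so treat these numbers as assumptions):

\[
\text{GFLOPS} = \text{cores} \times 2 \times f_{\text{GHz}}: \quad 256 \times 2 \times 0.768 \approx 393, \qquad 384 \times 2 \times 0.768 \approx 590,
\]

so the stock two SMs land near 400 GFLOPS, and a hypothetical third SM at slightly lower clocks would land around the ~500 GFLOPS mentioned above.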

The number of available CPU cores is also unclear at the moment. It could be that all 4 cores are available to devs and there are additional cores just for the OS, or it could be that the 4 cores are shared between the OS and games.



Scisca said:
Miyamotoo said:

The Switch chip is Nvidia's chip and they talked about it, while, like I wrote: "Nintendo still hasn't said anything about the power, tech, architecture, or capabilities of the Switch". The clear fact is that Nvidia said that, not Nintendo, so you can't in any case say that Nintendo lied about it when they still haven't said anything.

If spinning it that way makes you feel better. But there are such things as NDAs, so if NV said this, it means Nintendo allowed them to say it. Either way, it'd be a disappointment and would make me think really hard about the purchase, most probably making me wait for a redesign. If the Switch is based on Pascal, I'm getting it as soon as I can get my hands on it.

Again... Pascal or Maxwell does not really make a difference (see my previous comment). Please just stop using these buzzwords if you have no idea what's behind them.



Scisca said:
Miyamotoo said:

The Switch chip is Nvidia's chip and they talked about it, while, like I wrote: "Nintendo still hasn't said anything about the power, tech, architecture, or capabilities of the Switch". The clear fact is that Nvidia said that, not Nintendo, so you can't in any case say that Nintendo lied about it when they still haven't said anything.

If spinning it that way makes you feel better. But there are such things as NDAs, so if NV said this, it means Nintendo allowed them to say it. Either way, it'd be a disappointment and would make me think really hard about the purchase, most probably making me wait for a redesign. If the Switch is based on Pascal, I'm getting it as soon as I can get my hands on it.

Actually, you are the one trying to spin things here by claiming that Nintendo is somehow lying (lol). How can you say they are lying when they still haven't said anything!? The clear fact is that Nintendo still hasn't said anything about the Switch hardware; only Nvidia has.



Emme said:
Just to be clear: the Wii U did have some games run at 1080p, right? And the Switch will never exceed 720p, right?

Just to be clear, when docked it should be at least twice as fast as the Wii U, so it will do 1080p...

Before any of you have a meltdown, here are the X1's specs... 

http://www.anandtech.com/show/8811/nvidia-tegra-x1-preview/2



EricHiggin said:

If I'm understanding this correctly, you're saying a 20W power supply connected to a device with a 20W-rated SoC wouldn't allow it to actually pull 20W, but would only be able to pull 16W due to 80% efficiency? That is mostly incorrect. You have the efficiency portion of it correct, though.

In the case of the Switch, if you have a power supply that is rated for 20W, that means it will be able to output 20W of DC power max; DC is what electronics use. Now, since power supplies are not 100% efficient, and the best you can get is around 80%, what that actually means is that the 20W power supply has an input rating of 25W AC. The power lines and all buildings carry AC power (I know some have DC, but it's so new and minuscule, let's not cause confusion).

This is due to the law of conservation of energy and the first law of thermodynamics. When AC power from the wall is transformed into DC power for the electronic device, some energy is lost as heat. This is why power supplies used to run so much hotter than they do now: old power supplies were around 50% efficient, so a large amount of heat was created. Today's power supplies are around 80% efficient, sometimes higher, and create much less heat.

Now for the battery: if the DC voltage of the battery doesn't match the voltage of some of the electronics in the device, it needs to be converted, but DC-to-DC conversion is done quite differently from AC-to-DC, and to make it short, the losses in a DC-to-DC conversion are quite minimal. All of this is, of course, calculated when designing the system, so supplying enough power, whether through the power supply or the battery, will feed the device whatever it requires.

If I'm way off base, and this isn't what you meant, I apologize in advance.


I am well aware of the laws of conservation and thermodynamics. (Energy can neither be created nor destroyed, only transferred or transformed.)

And FYI, we likely have our wires crossed.

You cannot pull 25W at the wall, because it's only pulling 20W; that is what it was recorded at.
You can't consume 20W of power at the SoC level if you are consuming 20W of power at the wall, due to energy losses.
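A minimal sketch of that accounting (the 80% figure is the adapter efficiency assumed earlier in the thread):

    # Adapter efficiency assumed in the discussion above (hypothetical).
    EFFICIENCY = 0.80

    def dc_output(wall_watts: float, efficiency: float = EFFICIENCY) -> float:
        """DC power actually available to the device for a given AC draw at the wall."""
        return wall_watts * efficiency

    print(dc_output(20.0))  # 16.0 -> ~16W at the SoC when 20W is measured at the wall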


bdbdbd said:

I understand your point to an extent. But what makes you think Nintendo isn't pushing the boundaries of the tech?

The 6.2", 720P screen.

bdbdbd said:


And what did the consumer win by MS boosting the clock rate and freeing up DRAM? Umm... nothing, I believe.

Better graphics, better framerates.

bdbdbd said:
I really don't know what it is that the Switch should be competitive with, as the device is so different from anything currently on the market.

It should be competitive with everything on the market. That is what it's competing against.


ElPresidente7 said:
I really don't get people's obsession with the architecture.
The main difference between Pascal and Maxwell is just the fabrication process: while Maxwell was fabbed on 28nm, Switch will likely be 20nm. From an architectural point of view, Pascal and Maxwell are rather similar.


Tegra Maxwell is 20nm planar.
Tegra Pascal is 16nm FinFET (basically 20nm with FinFET).
Tegra Volta is 16nm FinFET and doubles Tegra Pascal.

20nm planar is not as energy efficient as 16nm/14nm FinFET, which could have allowed for more performance and lower power consumption, giving better than just a couple of hours of battery life.

nomad said:

You've said the GPU has more TDP headroom. No, it does not.

The test was conducted with an underclocked GPU. The test was not meant to show how much the GPU consumes, but how efficient it can be. It shows that the X1 GPU's performance at 1.5W is comparable to the iPad Air 2's (A8X) GPU, which runs at 2.5W.

Exactly. And that is the point I'm trying to make.

And the way the GPU can have more TDP headroom is by reducing the CPU clock rate (which comes with a reduction in voltage, and voltage has a direct relationship with power consumption).
Remember: the more power-hungry the CPU you use, the less power you can reserve for the GPU, and vice versa, if you wish to stay within a certain power limit.
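A minimal sketch of that trade-off (every wattage here is hypothetical, purely to illustrate the split):

    # Hypothetical shared power budget for the SoC (CPU + GPU combined).
    SOC_BUDGET_W = 10.0

    def gpu_headroom(cpu_watts: float, soc_budget: float = SOC_BUDGET_W) -> float:
        """Power left over for the GPU once the CPU has taken its share."""
        return soc_budget - cpu_watts

    # Downclocking the CPU (e.g. 4.0W -> 2.0W) frees 2W of headroom for the GPU.
    print(gpu_headroom(4.0))  # 6.0
    print(gpu_headroom(2.0))  # 8.0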

Soundwave said:

Even at 5 watts, that's 100 GFLOPS per watt. If Nvidia were actually achieving that, everyone would be using Nvidia.

You could have a 1.4 teraflop machine for just 14 watts. The Xbox One, die-shrunk to 14nm/16nm (even smaller than the Tegra X1's 20nm), still consumes over 60 watts... there's no way Nvidia has that kind of magic up their sleeves, especially with a chip that is about two years old.

First-generation Graphics Core Next is inefficient, and was never optimized for the 14nm/16nm processes.

Pavolink said:
*raises hand*
Question: is it possible for the Switch to be a little more powerful than the Wii U, but have noticeably better effects, like particles, lighting, even textures, that will make games look a lot better than Wii U games?


There is absolutely no doubt that the Switch is superior to the Wii U, even if it had fewer FLOPS.
The only question that remains is... by how much.

Either way, even after we discovered the clock rates, the device's performance position, between the Wii U and the Xbox One, hasn't actually changed; it just sits closer to the Wii U now.

Expect to see improvements in every aspect of a game... with the biggest benefits being to texturing (thanks to better compression and more DRAM) and geometry (thanks to the superior PolyMorph engines).

ElPresidente7 said:

Again... Pascal or Maxwell does not really make a difference (see my previous comment). Please just stop using these buzzwords if you have no idea what's behind them.

Pascal did bring improvements over Maxwell, although it's more evolutionary than revolutionary... But in something as small as Tegra, features like 4th-generation delta colour compression can mean some nice gains when you are only dealing with 20-25GB/s of bandwidth.
Nvidia pegs the gains at around 20%.
Pascal also has better async compute performance.
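For a sense of scale, a ~20% average saving behaves like ~20% extra effective bandwidth (Nvidia's figure is a claimed average, so real savings vary by workload):

\[
20\ \text{GB/s} \times 1.20 = 24\ \text{GB/s effective}, \qquad 25\ \text{GB/s} \times 1.20 = 30\ \text{GB/s effective.}
\]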

AsGryffynn said:

Just to be clear, when docked it should be at least twice as fast as the Wii U, so it will do 1080p...

To be clear: the PlayStation 2, GameCube, and original Xbox *could* do 1080p rendering (even if they couldn't technically output it).

It is completely up to the developers... if developers push graphical fidelity, or build only for portable mode, then the Switch isn't likely to hit 1080p.



--::{PC Gaming Master Race}::--

Pemalite said:
EricHiggin said:

If I'm understanding this correctly, you're saying a 20W power supply connected to a device with a 20W-rated SoC wouldn't allow it to actually pull 20W, but would only be able to pull 16W due to 80% efficiency? That is mostly incorrect. You have the efficiency portion of it correct, though.

In the case of the Switch, if you have a power supply that is rated for 20W, that means it will be able to output 20W of DC power max; DC is what electronics use. Now, since power supplies are not 100% efficient, and the best you can get is around 80%, what that actually means is that the 20W power supply has an input rating of 25W AC. The power lines and all buildings carry AC power (I know some have DC, but it's so new and minuscule, let's not cause confusion).

This is due to the law of conservation of energy and the first law of thermodynamics. When AC power from the wall is transformed into DC power for the electronic device, some energy is lost as heat. This is why power supplies used to run so much hotter than they do now: old power supplies were around 50% efficient, so a large amount of heat was created. Today's power supplies are around 80% efficient, sometimes higher, and create much less heat.

Now for the battery: if the DC voltage of the battery doesn't match the voltage of some of the electronics in the device, it needs to be converted, but DC-to-DC conversion is done quite differently from AC-to-DC, and to make it short, the losses in a DC-to-DC conversion are quite minimal. All of this is, of course, calculated when designing the system, so supplying enough power, whether through the power supply or the battery, will feed the device whatever it requires.

If I'm way off base, and this isn't what you meant, I apologize in advance.


I am well aware of the laws of conservation and thermodynamics. (Energy can neither be created nor destroyed, only transferred or transformed.)

And FYI, we likely have our wires crossed.

You cannot pull 25W at the wall, because it's only pulling 20W; that is what it was recorded at.
You can't consume 20W of power at the SoC level if you are consuming 20W of power at the wall, due to energy losses.

If the wall measurement before the power supply is 20W, assuming that's basically near max output, you would be correct that approximately 16W is the max power the device/SoC could be pulling. Wires crossed, well said; I clearly misunderstood.