Tegra X1 (Switch GPU?) Throttles To 768 MHz (Eurogamer Correct Again?)


KLAMarine said:

Is this because of the Android OS?

Not really. Android OS is fairly flexible.

But the word on the street is that NVIDIA has done a lot of work customizing the software stack for the Switch, leveraging Vulkan and optimizing its drivers.

Plus the Switch will have games built for the hardware and not an entire platform, so developers can leverage the chip's various strengths and weaknesses to get the most out of it.

The Shield TV mostly just receives last-generation ports, usually running at a higher resolution and framerate. It's really uninspiring.



--::{PC Gaming Master Race}::--

Pemalite said:
Soundwave said:

The Tegra X1 uses 10 watts alone, sometimes more. There's no way it's using 1.5 watts doing nothing more demanding than playing SNES Virtual Console. 


Are you calling AnandTech, one of the best sources of technology information on the internet, a liar?

Sure. Whatever, mate. I have provided evidence of this before, so if you failed to check the article out, that is your own fault; go on believing otherwise.

Soundwave said:

Also, other high-end mobile GPUs throttle like crazy too ... the Apple A10, which is basically the highest-end competing mobile SoC, throttles a lot:

http://wccftech.com/apple-a10-fusion-gpu-breakdown/

The A10 uses a souped-up GT7600, which PowerVR rates as about as strong as an Xbox 360, though even that is probably misleading because it throttles a lot. It only hits that performance envelope under absolutely ideal conditions and can only hold it for minutes before it has to clock down.

 

Why do you keep using Apple as a comparison?
The A10 has a CPU that is much faster than Tegra's and actually runs at its full rate.
It has more memory bandwidth than Tegra.
It has a higher-resolution display.
It has various modems and logic for various tasks.

Apple has also modified the A10's GPU, thrown out various blocks, and implemented its own proprietary design.

Come on, I expect better than that from you. When will you learn that such comparisons are utterly pointless and not representative of each other?


Soundwave said:
Performance well beyond a PS3/360 *sustained*, without throttling, in a mobile chip running on battery power simply isn't easily doable today; you would need a form factor more equivalent to a laptop, with a battery well over 10,000 mAh and a large active fan.

You keep saying that. But Tegra is built on an old 20nm planar process. It doesn't even use FinFET.

The Switch is also using a CPU that operates at only 1 GHz. (Massive power saving there.)
It uses a small 6-7" 720p screen. (Power saving there too.)
It doesn't have any LTE/3G/2G modems. (Power saving there.)

Ergo, it is entirely possible to have performance that exceeds the Switch's, sustained, without a 10,000 mAh battery.

Soundwave said:

The Snapdragon 820 is the other big mobile chip, and that is basically a wash with the Tegra X1:

http://wccftech.com/snapdragon-820-benchmarks/

There is no mythical mobile chip right now that pushes PS4/XB1 graphics. It doesn't exist, sorry. Even NVIDIA's Tegra X1 claims are somewhat dubious ... it's not really a 500 GFLOP processor, since it can only hold that performance for like 8-10 minutes max. It's a 384 GFLOP GPU at best, which is better than an Xbox 360, but to hit that it still needs to gobble down 10+ watts, so your entire SoC is approaching 20 watts total, which is a monstrous amount of electricity for a mobile chip.

You just lost all credibility by using FLOPS in such a pointless, inaccurate manner. Please stop it.
You do know that a 500 GFLOP GPU can be faster than a 1 TFLOP GPU, right? You do know that there is more to a GPU than its theoretical single-precision floating-point performance, right?

And not once have I ever stated that I expect PlayStation 4/Xbox One graphics out of the Switch. I expect "good enough", which the Switch clearly is not.
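For context on where the competing GFLOPS numbers in this thread come from: theoretical FP32 throughput is just shader cores × 2 (one fused multiply-add per cycle) × clock. A minimal sketch, assuming the commonly cited 256-CUDA-core Tegra X1 configuration:

```python
# Theoretical FP32 throughput in GFLOPS: cores * 2 ops (FMA) * clock in GHz.
# The 256-core figure is the commonly cited Tegra X1 configuration.
def gflops(cuda_cores, clock_mhz):
    return cuda_cores * 2 * clock_mhz / 1000

print(gflops(256, 1000))  # full ~1 GHz clock -> 512.0 GFLOPS
print(gflops(256, 768))   # throttled to 768 MHz -> ~393 GFLOPS
print(gflops(256, 307))   # Switch portable clock -> ~157 GFLOPS
```

Which is exactly why peak FLOPS alone is a poor performance metric: the formula says nothing about how long those clocks are sustained, about memory bandwidth, or about architectural efficiency.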

Soundwave said:
The chip has to be downclocked to 307 MHz in portable mode; why would you think it's remotely possible to have it run at full clock portably? You would get 1 hour of battery life even with a relatively large-ish battery (say 5,000-6,000 mAh).

It doesn't have to be downclocked to 307 MHz. That is what Nintendo chose. There is a difference.

The technology exists today where the chip could have been faster and use less power. That is an undeniable fact.

Yes it does have to be downclocked. At full clock it would kill the battery of an iPad Pro (12-inch model) in just over 2 hours, and I'm not even accounting for the electricity the screen needs (since the Shield TV doesn't have one). That would likely be another 1-1.5 watts, so you're actually talking about battery life in the range of 1 hour 50 minutes or something. 

That's not even remotely feasible for a portable device. 

And no, there isn't a mobile chip available today that runs way better than an X1 without large-scale throttling at max performance. 

The chip you want is something like a 10nm next-gen Tegra, but that's not coming until next year, and probably late next year at that. 
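The battery math behind these estimates is just pack energy divided by system draw. A rough sketch, where the 6,000 mAh capacity and 3.7 V nominal cell voltage are illustrative assumptions, not confirmed Switch specs:

```python
# Runtime estimate: energy (Wh) / system draw (W) = hours.
# mAh alone isn't energy; multiply by nominal cell voltage (Li-ion ~3.7 V).
def runtime_hours(capacity_mah, cell_voltage, system_watts):
    energy_wh = capacity_mah / 1000 * cell_voltage
    return energy_wh / system_watts

# Hypothetical 6,000 mAh pack at 3.7 V = 22.2 Wh.
print(runtime_hours(6000, 3.7, 20))  # ~20 W full-clock draw: roughly 1.1 h
print(runtime_hours(6000, 3.7, 5))   # ~5 W throttled draw: roughly 4.4 h
```

The draw figures are hypothetical, but the shape of the argument is visible: at full-clock power levels, a tablet-sized battery empties in an hour or so.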



Pemalite said:
KLAMarine said:

Is this because of the Android OS?

Not really. Android OS is fairly flexible.

But the word on the street is that NVIDIA has done a lot of work customizing the software stack for the Switch, leveraging Vulkan and optimizing its drivers.

Plus the Switch will have games built for the hardware and not an entire platform, so developers can leverage the chip's various strengths and weaknesses to get the most out of it.

The Shield TV mostly just receives last-generation ports, usually running at a higher resolution and framerate. It's really uninspiring.

Makes sense. This partnership with Nintendo is probably very valuable to NVIDIA, so they'll want to help make sure the Switch is as good as it can be, all things considered.

Thanks for the info.



Are we really STILL getting into multi-page scrums over speculation when the actual answers are a few days from being revealed?



NoirSon said:
Are we really STILL getting into multi-page scrums over speculation when the actual answers are a few days from being revealed?

I doubt Nintendo is going to reveal specs, so everything will come down to how Just Dance Switch compares to the PS4/XBone versions.



“Simple minds have always confused great honesty with great rudeness.” - Sherlock Holmes, Elementary (2013).

"Did you guys expected some actual rational fact-based reasoning? ...you should already know I'm all about BS and fraudulence." - FunFan, VGchartz (2016)

Soundwave said:

Yes it does have to be downclocked. At full clock it would kill the battery of an iPad Pro (12-inch model) in just over 2 hours, and I'm not even accounting for the electricity the screen needs (since the Shield TV doesn't have one). That would likely be another 1-1.5 watts, so you're actually talking about battery life in the range of 1 hour 50 minutes or something. 

That's not even remotely feasible for a portable device. 

And no, there isn't a mobile chip available today that runs way better than an X1 without large-scale throttling at max performance. 

The chip you want is something like a 10nm next-gen Tegra, but that's not coming until next year, and probably late next year at that. 

1) The Shield TV is not representative of the Tegra in the Switch. Have you heard of a thing called binning? No?

If you are basing that on the ~20W Tegra power consumption recorded by AnandTech here:
http://www.anandtech.com/show/9289/the-nvidia-shield-android-tv-review/9

Do you fully understand the implications of recording power consumption from the wall?
I'm happy to educate you on the topic and on how it also means the Shield TV actually uses less than 20W of power.

2) Again you use another device as some kind of representation of the Switch.
The iPad Pro has a faster CPU (uses more power), a larger screen (uses more power), a higher-resolution screen (uses more power), a more powerful backlight to drive that screen (uses more power), and various modems and logic in its large SoC (uses more power).

Now what part of the iPad Pro using more power are you unable to understand and comprehend? It is not an apples-to-apples (pun intended) comparison.

3) Yes, there are mobile chips that, even whilst throttled, are still faster than Tegra. Maybe if you didn't use GFLOPS as your basis for determining performance, you would understand that a little better.

4) No, I am not talking about "next-gen Tegra". Next-gen Tegra won't be 10nm either.
The next-gen Tegra is Tegra Xavier, based on Volta; it doubles the performance of Tegra P1 (Pascal) for the same amount of power and on the same node (16nm FinFET).
I.e. vastly superior to the Switch, and "close enough" to the Xbox One in many aspects.

I don't think you fully comprehend how much of an improvement the 16nm/14nm FinFET processes bring over the 20nm planar process in terms of power characteristics.
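On the "power from the wall" point: a wall-socket reading includes the AC adapter's conversion losses, so the device itself always draws less than the meter shows. A minimal sketch, where the ~85% adapter efficiency is an assumed, typical figure, not a measured one:

```python
# Wall readings include AC/DC conversion losses, so actual device draw is
# lower. The 85% efficiency figure is a typical assumption, not measured.
def device_draw(wall_watts, psu_efficiency=0.85):
    return wall_watts * psu_efficiency

# A ~20 W wall reading implies roughly 17 W at the device, and the SoC
# itself is only a subset of that (RAM, storage, and I/O also draw power).
print(device_draw(20.0))  # 17.0
```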



--::{PC Gaming Master Race}::--

The Shield TV throttles its GPU speed lower than the Nintendo Switch does after a few minutes of gaming.

Nintendo Switch docked mode > NVIDIA Shield TV in terms of performance.



Pemalite said:
Soundwave said:

Yes it does have to be downclocked. At full clock it would kill the battery of an iPad Pro (12-inch model) in just over 2 hours, and I'm not even accounting for the electricity the screen needs (since the Shield TV doesn't have one). That would likely be another 1-1.5 watts, so you're actually talking about battery life in the range of 1 hour 50 minutes or something. 

That's not even remotely feasible for a portable device. 

And no, there isn't a mobile chip available today that runs way better than an X1 without large-scale throttling at max performance. 

The chip you want is something like a 10nm next-gen Tegra, but that's not coming until next year, and probably late next year at that. 

1) The Shield TV is not representative of the Tegra in the Switch. Have you heard of a thing called binning? No?

If you are basing that on the ~20W Tegra power consumption recorded by AnandTech here:
http://www.anandtech.com/show/9289/the-nvidia-shield-android-tv-review/9

Do you fully understand the implications of recording power consumption from the wall?
I'm happy to educate you on the topic and on how it also means the Shield TV actually uses less than 20W of power.

2) Again you use another device as some kind of representation of the Switch.
The iPad Pro has a faster CPU (uses more power), a larger screen (uses more power), a higher-resolution screen (uses more power), a more powerful backlight to drive that screen (uses more power), and various modems and logic in its large SoC (uses more power).

Now what part of the iPad Pro using more power are you unable to understand and comprehend? It is not an apples-to-apples (pun intended) comparison.

3) Yes, there are mobile chips that, even whilst throttled, are still faster than Tegra. Maybe if you didn't use GFLOPS as your basis for determining performance, you would understand that a little better.

4) No, I am not talking about "next-gen Tegra". Next-gen Tegra won't be 10nm either.
The next-gen Tegra is Tegra Xavier, based on Volta; it doubles the performance of Tegra P1 (Pascal) for the same amount of power and on the same node (16nm FinFET).
I.e. vastly superior to the Switch, and "close enough" to the Xbox One in many aspects.

I don't think you fully comprehend how much of an improvement the 16nm/14nm FinFET processes bring over the 20nm planar process in terms of power characteristics.

It may use less power from the wall, but it's not as if a 20-watt SoC is magically going to become an 8-watt SoC. 

There's a reason why most tablet makers don't use a Tegra X1, and NVIDIA was having problems finding vendors for the chip ... it's too beastly a chip and consumes way too much power. That makes it better suited to being a game machine than a tablet. 

Running the Unreal Engine 4 demo, Nvidia themselves said the GPU alone consumes 10 watts of electricity. Then you have to factor in the RAM, CPU, WiFi, LCD screen, active fan, etc. all needing power too. 

You're probably well past 15 watts ... 15 watts is a ridiculous power consumption for a mobile device. 

NVIDIA is also somewhat loose in how they define power consumption; they admit, for example, that even the older Tegra K1 can spike to over 10 watts of consumption when pushed to its higher peaks. Don't believe all the hype on power consumption with these chips; if they were really getting like 50 GFLOPS per watt, Sony and MS would be using them for their consoles too, as that performance is way better than the console GPUs they're using. My guess is that in reality peak performance and electrical usage are not close to the ideal scenarios companies like NVIDIA and PowerVR like to tout. 

They know they can get away with it because most people only use their tablet for low-level gaming and web browsing ... the Switch does not have that luxury. It needs to be able to process relatively high-end 3D graphics, and that's where performance comes crashing down to the reality of what's possible with a 6,000 mAh battery. 

Even if you could put in a ridiculously sized 12,000 mAh battery and make this thing a giant device, there are overheating concerns too. This thing has to be held by little 8-year-olds; it can't be pushing itself to the peak and getting hot to the touch. This is also likely why Nintendo does not allow the system to be handheld in docked mode; really, if they wanted just TV streaming, a cheap device à la the Google Chromecast would've worked fine. They don't want kids holding the device when it's running full tilt because it gets hot, IMO. 
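Soundwave's "well past 15 watts" estimate is just a sum of component draws. A sketch with illustrative numbers; only the ~10 W GPU figure is cited in the thread (from NVIDIA's UE4 demo), the rest are assumptions for illustration:

```python
# Hypothetical full-clock system power budget. All figures are illustrative
# assumptions except the ~10 W GPU number NVIDIA quoted for the UE4 demo.
budget_watts = {
    "gpu": 10.0,      # NVIDIA's quoted figure under the UE4 demo
    "cpu": 2.5,       # assumed
    "ram": 1.5,       # assumed
    "wifi": 0.5,      # assumed
    "screen": 1.5,    # assumed panel + backlight
    "fan_misc": 1.0,  # assumed
}
print(sum(budget_watts.values()))  # 17.0 -> well past 15 W total
```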



Well, that makes sense and actually gives us a good reason why Nintendo locked the GPU to 768 MHz.



Soundwave said:
FunFan said:
It throttles for the same reason any other modern chip in the world does: heat buildup due to a lack of proper cooling.

Both the Switch and the Shield console have an active fan for cooling. It must need something much larger to be able to run at full clock. 

Just goes to show, I guess, that getting even Wii U/PS3/360-tier graphics or a bit better is still pretty tough, even in a mobile chip. 

 

Soundwave said:
barneystinson69 said:

Doesn't seem to be that strong of a GPU. Definitely worried about how it's going to play AAA titles.

It's gonna be basically Wii U/360-tier performance with some better effects and whatnot. 

Which isn't bad at all for a portable machine, and still pretty good for the more cartoony style of graphics, which is what Nintendo prefers. 

I was watching footage of that Halo LEGO demo that was never released, and man, that game looked really good just running off an Xbox 360. I think for the majority of Nintendo games, much better than the Wii U isn't really needed. They wouldn't invest the money to make their games look that much better anyway. 

Why do you keep saying that's Wii U/360-tier performance when it's obvious that isn't true at all!? The Switch will have around 3x the real-world performance of the Wii U/360: its GPU has around double the power on paper but a bigger real-world advantage because it's the latest NVIDIA tech/architecture, almost 10 years newer than the Wii U/360's, and the same goes for the ARM A57 CPU, while it will most likely have 6x more RAM for games than the Wii U had.

So I really don't understand how exactly that is "basically Wii U/360-tier performance with some better effects and whatnot", when the Switch will be noticeably stronger than the Wii U/360 even in portable mode!?