
Forums - Nintendo Discussion - Tegra X1 (Switch GPU?) Throttles To 768 MHz (Eurogamer Correct Again?)

Pemalite said:
Soundwave said:

Looks like Eurogamer was probably correct on these leaked Switch clock speeds they got. 


No one doubts the Eurogamer leak. Well. No one credible anyway.

Soundwave said:

A user at NeoGAF named MDave has an Nvidia Shield console, which has the Tegra X1 (Maxwell) GPU that is supposedly powering the Switch too. 

He ran some tests on the chip, and some interesting things popped up ... most notably, the chip will run at a full 1 GHz clock speed, but only for a couple of minutes (even with a fan). It throttles itself down to 768 MHz after a few minutes of running:

http://www.neogaf.com/forum/showpost.php?p=227883703&postcount=8800

So this is probably the max a Tegra X1 can hit in sustainable performance (without overheating). 768 MHz is the exact clock speed Eurogamer got for the Switch in docked mode. 

EDIT: Meant 768 MHz guys, not GHz. Typo.

Not a valid comparison. We have been over this and why.
So I need to ask, what are you trying to justify?

Soundwave said:
It also goes to show that a lot of these benchmarks given for mobile chips are misleading.

All these chips downclock like crazy. If the Tegra X1 in the Shield has to downclock, and that's with a rather large chassis and an active fan for cooling, then mobile chips in smartphones or tablets likely downclock even lower.

Even thermally throttled, it is still over twice as fast as the Switch in the higher-performing docked mode.
And the Switch has more TDP headroom for the GPU.

Even a full-rate, older Kepler-based Tegra is faster than the Switch. Most flagship handsets are faster than the Switch.

The Switch just is not a high-end device... Even in the mobile world. And there is *zero* reason for it other than Nintendo desiring to cut corners/costs.

Uh what? 

Both the Switch and the Shield console run at 768 MHz. Nintendo just opted to lop off the 1 GHz mode because it was useless; the chip downclocks to 768 MHz for any extended gameplay past a few minutes, so it isn't really a 1 GHz chip to begin with. 

If you're saying the undocked mode is less powerful .... well no shit. It has to run on a battery. The Shield console plugged into wall power consumes 19 watts at max load ... do you have any idea the size of the battery you would need to run such a chip for even 3 hours? 

Of course the mobile version downclocks; how the fuck would it run for even an hour otherwise? 19 watts for 3 hours means a 57 Wh battery would be required for just 3 hours of play. No tablet, not even the giant 12-inch iPad, has a battery that large. 
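The arithmetic above is just energy = power × time. A quick sketch of it (the 19 W figure is the one quoted for the Shield at the wall; carrying it over unchanged to a handheld is an assumption, since a portable's display and clocks differ):

```python
# Rough battery sizing for the figures quoted above.
# Assumption: the Shield's 19 W wall-power draw carries over unchanged
# to a hypothetical portable, which overstates things but shows the scale.

def battery_wh_needed(load_watts: float, hours: float) -> float:
    """Watt-hours of battery required to sustain a load for a duration."""
    return load_watts * hours

print(battery_wh_needed(19, 3))  # 57.0 Wh, the figure in the post
```

For comparison, full-size tablet batteries top out in the 30-40 Wh range, which is why the sustained-19 W scenario doesn't fit a handheld form factor.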



Rab said:
Found this interesting

https://www.youtube.com/watch?v=giqDQrfdkFA

My reactions to that vid:

1. x86 processors are great only because Intel processors are great. The Jaguar CPU architecture was weaksauce even when it came out in 2013, regardless of being x86. Not saying that the Switch CPU is better, but I doubt there is going to be a big difference in performance.

2. They say storage is not that important, but they aren't considering all the people who buy digital only.

3. Where did they get the 2.0 GHz CPU number?



“Simple minds have always confused great honesty with great rudeness.” - Sherlock Holmes, Elementary (2013).

"Did you guys expected some actual rational fact-based reasoning? ...you should already know I'm all about BS and fraudulence." - FunFan, VGchartz (2016)

Soundwave said:

Uh what? 

Both the Switch and the Shield console run at 768 MHz. Nintendo just opted to lop off the 1 GHz mode because it was useless; the chip downclocks to 768 MHz for any extended gameplay past a few minutes, so it isn't really a 1 GHz chip to begin with.

Sorry. Hadn't had enough coffee, so I didn't clarify what I meant in the best way possible.

The Switch is 307.2 MHz.
It only runs at 768 MHz when:
1) A developer opts to use it.
AND
2) Whilst docked.

Same goes for its memory bandwidth. You lose a massive amount of bandwidth unless a developer opts to enable it.

Ergo, it would not be far-fetched to say that most games will likely use the slower 307.2 MHz mode, which is half the speed of the Shield TV, even while docked. It is up to the developers, and not all of them are proactive in supporting all of a system's various nuances.

Plus the Shield has a CPU that is twice as fast, running at a full-rate 2 GHz, not the Switch's anemic 1 GHz CPU.
That costs power, don'cha know? The Shield TV is more powerful than the Switch; again, not an apples-to-apples comparison.

The good part about it all is, even though the Switch is at a massive performance disadvantage, I fully expect games to look better than any game on Shield, which mostly receives lazy ports of last gen titles.

Soundwave said:
If you're saying the undocked mode is less powerful .... well no shit. It has to run on a battery. The Shield console plugged into wall power consumes 19 watts at max load ... do you have any idea the size of the battery you would need to run such a chip for even 3 hours?


You are asking me this question? Did you forget the math and proof I provided in the other thread already? ;)

Soundwave said:
Of course the mobile version downclocks; how the fuck would it run for even an hour otherwise? 19 watts for 3 hours means a 57 Wh battery would be required for just 3 hours of play. No tablet, not even the giant 12-inch iPad, has a battery that large.


Because there is more to a SoC than only the GPU. *yawn*
Remember the power consumption test that Anandtech did on *just* Tegra's GPU, where it only consumed 1.5 W of power? Yeah. Thought so.

As for the iPad. We have been over this. It is not an accurate representation of the Switch or Tegra. It is a completely different beast entirely. This horse has been beaten to death already.




--::{PC Gaming Master Race}::--

ktay95 said:
Captain_Yuri said:
Yeah, I don't think there is any doubt about Eurogamer's hardware leaks. They got the Pro, Scorpio, and Switch specs right, so.

Hitler was from Europe, can we really trust them??

Damn you StarOcean

Didn't the US illegally invade Iraq, and don't they have Trump in the Oval Office? I wouldn't dare trust the US either, nor Russia, for obvious reasons. What do the Japanese say about the Nintendo Switch?



Please excuse my (probally) poor grammar

Pemalite said:
Soundwave said:

Uh what? 

Both the Switch and the Shield Console run at 768 MHz, Nintendo just opted to lop the 1 GHz off becuase it was useless, it down clocks to 768 MHz for any effective gameplay past a few minutes, so it isn't really a 1 GHz chip to begin with.

Sorry. Hadn't had enough coffee so didn't clarify what I meant in the best way possible.

The Switch is 307.2 MHz.
It only runs at 768 MHz when:
1) A developer opts to use it.
AND
2) Whilst docked.

Same goes for its memory bandwidth. You lose a massive amount of bandwidth unless a developer opts to enable it.

Ergo, it would not be far-fetched to say that most games will likely use the slower 307.2 MHz mode, which is half the speed of the Shield TV, even while docked. It is up to the developers, and not all of them are proactive in supporting all of a system's various nuances.

Plus the Shield has a CPU that is twice as fast, running at a full-rate 2 GHz, not the Switch's anemic 1 GHz CPU.
That costs power, don'cha know? The Shield TV is more powerful than the Switch; again, not an apples-to-apples comparison.


Soundwave said:
If you're saying the undocked mode is less powerful .... well no shit. It has to run on a battery. The Shield console plugged into wall power consumes 19 watts at max load ... do you have any idea the size of the battery you would need to run such a chip for even 3 hours?


You are asking me this question? Did you forget the math and proof I provided in the other thread already? ;)

Soundwave said:
Of course the mobile version downclocks; how the fuck would it run for even an hour otherwise? 19 watts for 3 hours means a 57 Wh battery would be required for just 3 hours of play. No tablet, not even the giant 12-inch iPad, has a battery that large.


Because there is more to a SoC than only the GPU. *yawn*
Remember the power consumption test that Anandtech did on *just* Tegra's GPU, where it only consumed 1.5 W of power? Yeah. Thought so.

As for the iPad. We have been over this. It is not an accurate representation of the Switch or Tegra. It is a completely different beast entirely. This horse has been beaten to death already.

The Tegra X1 uses 10 watts alone, sometimes more. There's no way it's only using 1.5 watts doing anything other than playing SNES Virtual Console. 

There's nothing Nintendo could do to change any of that either. 

Also, other high-end mobile GPUs throttle like crazy too ... the Apple A10, which is basically the highest-end competing mobile SoC, throttles a lot:

http://wccftech.com/apple-a10-fusion-gpu-breakdown/

The A10 uses a souped-up GT7600, which PowerVR rates as about as strong as an Xbox 360, though even that is probably misleading because it throttles a lot. It only hits that performance envelope in absolutely ideal conditions and can only hold it for minutes before it has to clock down. 

Performance well beyond a PS3/360 *sustained*, without throttling, in a mobile chip running on battery power simply isn't easily doable today; you would need a form factor closer to a laptop, with a battery well over 10,000 mAh and a large active fan. 

The Snapdragon 820 is the other big mobile chip, and it's basically a wash with the Tegra X1:

http://wccftech.com/snapdragon-820-benchmarks/

There is no mythical mobile chip right now that pushes PS4/XB1 graphics. It doesn't exist, sorry. Even Nvidia's Tegra X1 claims are somewhat dubious ... it's not really a 500 GFLOP processor, since it can only hold that performance for maybe 8-10 minutes max. At the throttled 768 MHz clock it's closer to a 390 GFLOP GPU, which is better than an Xbox 360, but to hit even that it still needs to gobble down 10+ watts, so the entire SoC approaches 20 watts total, which is a monstrous amount of electricity for a mobile chip. 
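Those GFLOP figures fall out of the standard peak-throughput formula: shader cores × 2 (one fused multiply-add counts as two FLOPs) × clock. A sketch, assuming the Tegra X1's published count of 256 CUDA cores; these are paper peaks, not sustained throughput:

```python
# Theoretical peak FP32 throughput: cores x 2 (FMA) x clock in GHz.
# Assumption: Tegra X1's published 256 CUDA cores; real sustained
# performance is lower once the chip thermally throttles.

TEGRA_X1_CORES = 256

def peak_gflops(cores: int, clock_ghz: float) -> float:
    return cores * 2 * clock_ghz

print(peak_gflops(TEGRA_X1_CORES, 1.0))    # 512.0 -> the "~500 GFLOP" claim
print(peak_gflops(TEGRA_X1_CORES, 0.768))  # 393.216 at the throttled clock
```

The gap between the 1 GHz and 768 MHz numbers is exactly the throttling being argued about in this thread.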

The chip has to be downclocked to 307 MHz in portable mode, so why would you think it's remotely possible to have it run at full clock on battery? You would get maybe 1 hour of battery life even with a relatively large battery (say 5,000-6,000 mAh). 




It's already widely known that the Switch is weak. And at this point it's widely known that Eurogamer is pretty reliable. Time to embrace the horror. The Switch isn't a god box.



oniyide said:
bdbdbd said:

I'm just quoting a +1 here. It really makes no sense for everyone to run the same games. In terms of AAA games there's only a handful of publishers that make them out of which three are console manufacturers themselves, and they take years to finish anyway, so this can't be a very profitable market in the long term, especially when the majority of the world isn't even interested in the AAA games.

I wouldn't go so far as to say people aren't interested, especially considering how well COD does every year, and people are still buying GTA 5 across all four SKUs. We just have enough systems that people are already buying them for. Them being on a Ninty system isn't going to help them or Ninty much. Look at the sales of the games that came to Wii U. People here love to say that they were gimped, or use it to prove that there is no interest in those games. There is interest; I just think people would rather buy them on their non-Ninty system. 

I was just thinking how many people do not buy them, not how many do. The AAA games industry is a small box with high costs in the big picture. Essentially we're talking about games most people do not even want to play.

People play the games on the system they're available for. Also, if you already have a system that plays game A, you're not going to get yourself another system for the same game.



Ei Kiinasti.

Eikä Japanisti.

Vaan pannaan jalalla koreasti.

 

Nintendo games sell only on Nintendo system.

Soundwave said:

The Tegra X1 uses 10 watts alone, sometimes more. There's no way it's only using 1.5 watts doing anything other than playing SNES Virtual Console. 


Are you calling Anandtech a liar? One of the best sources of information for technology on the internet?

Sure. Whatever, mate. I have provided evidence of this before. So if you failed to check the article out, then that is your own fault; go on believing otherwise.

Soundwave said:

Also, other high-end mobile GPUs throttle like crazy too ... the Apple A10, which is basically the highest-end competing mobile SoC, throttles a lot:

http://wccftech.com/apple-a10-fusion-gpu-breakdown/

The A10 uses a souped-up GT7600, which PowerVR rates as about as strong as an Xbox 360, though even that is probably misleading because it throttles a lot. It only hits that performance envelope in absolutely ideal conditions and can only hold it for minutes before it has to clock down.

 

Why do you keep using Apple as a comparison?
The A10 has a CPU that is much faster than Tegra's and actually runs at its full rate.
It has more memory bandwidth than Tegra.
It has a higher resolution display.
It has various modems and logic for various tasks.

Apple has also modified the A10's GPU, thrown out various blocks, and implemented its own proprietary design.

Come on, I expect better than that from you. When will you learn that such comparisons are utterly pointless and not representative of each other?


Soundwave said:
Performance well beyond a PS3/360 *sustained*, without throttling, in a mobile chip running on battery power simply isn't easily doable today; you would need a form factor closer to a laptop, with a battery well over 10,000 mAh and a large active fan.

You keep saying that. But Tegra is built on an old 20nm planar process. It doesn't even use FinFET.

The Switch is also using a CPU that is operating at only 1 GHz. (Massive power saving there.)
It uses a small 6-7" 720p screen. (Power saving there too.)
It doesn't have any LTE/3G/2G modems. (Power saving there.)

Ergo, it is entirely possible to have performance that exceeds the Switch's, sustained, without a 10,000 mAh battery.

Soundwave said:

The Snapdragon 820 is the other big mobile chip, and it's basically a wash with the Tegra X1:

http://wccftech.com/snapdragon-820-benchmarks/

There is no mythical mobile chip right now that pushes PS4/XB1 graphics. It doesn't exist, sorry. Even Nvidia's Tegra X1 claims are somewhat dubious ... it's not really a 500 GFLOP processor, since it can only hold that performance for maybe 8-10 minutes max. At the throttled 768 MHz clock it's closer to a 390 GFLOP GPU, which is better than an Xbox 360, but to hit even that it still needs to gobble down 10+ watts, so the entire SoC approaches 20 watts total, which is a monstrous amount of electricity for a mobile chip.

You just lost all credibility by using FLOPs in such a pointless, inaccurate manner. Please stop it.
You do know that a 500 GFLOP GPU can be faster than a 1 TFLOP GPU, right? You do know that there is more to a GPU than its theoretical single-precision floating-point performance, right?
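That point comes down to achieved utilization: real throughput is the theoretical peak scaled by how much of the GPU the workload can actually keep busy. A toy illustration; the utilization percentages below are invented purely for this example, as in practice they depend on memory bandwidth, architecture, drivers, and the workload:

```python
# Toy model: achieved throughput = theoretical peak x achieved utilization.
# The utilization figures here are made up for illustration only.

def achieved_gflops(peak: float, utilization: float) -> float:
    return peak * utilization

# A 500 GFLOP GPU sustaining 80% utilization outruns a 1 TFLOP GPU
# that is bandwidth-starved down to 35%.
print(achieved_gflops(500, 0.80))   # around 400
print(achieved_gflops(1000, 0.35))  # around 350
```

Which is why comparing paper FLOP numbers across different architectures, as done throughout this thread, only ever gives a rough upper bound.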

And not once have I ever stated that I expect Playstation 4/Xbox One graphics out of the Switch. I expect "good enough" which the Switch clearly is not.

Soundwave said:
The chip has to be downclocked to 307 MHz in portable mode, so why would you think it's remotely possible to have it run at full clock on battery? You would get maybe 1 hour of battery life even with a relatively large battery (say 5,000-6,000 mAh).

It doesn't have to be downclocked to 307 MHz. That is what Nintendo chose. There is a difference.

The technology exists today where the chip could have been faster and use less power. That is an undeniable fact.



--::{PC Gaming Master Race}::--

Pemalite said:
Soundwave said:

Uh what? 

Both the Switch and the Shield Console run at 768 MHz, Nintendo just opted to lop the 1 GHz off becuase it was useless, it down clocks to 768 MHz for any effective gameplay past a few minutes, so it isn't really a 1 GHz chip to begin with.

Sorry. Hadn't had enough coffee so didn't clarify what I meant in the best way possible.

The Switch is 307.2 MHz.
It only runs at 768 MHz when:
1) A developer opts to use it.
AND
2) Whilst docked.

Same goes for its memory bandwidth. You lose a massive amount of bandwidth unless a developer opts to enable it.

Ergo, it would not be far-fetched to say that most games will likely use the slower 307.2 MHz mode, which is half the speed of the Shield TV, even while docked. It is up to the developers, and not all of them are proactive in supporting all of a system's various nuances.

Plus the Shield has a CPU that is twice as fast, running at a full-rate 2 GHz, not the Switch's anemic 1 GHz CPU.
That costs power, don'cha know? The Shield TV is more powerful than the Switch; again, not an apples-to-apples comparison.

The good part about it all is, even though the Switch is at a massive performance disadvantage, I fully expect games to look better than any game on Shield, which mostly receives lazy ports of last gen titles.

Is this because of the Android OS?



bdbdbd said:
oniyide said:

I wouldn't go so far as to say people aren't interested, especially considering how well COD does every year, and people are still buying GTA 5 across all four SKUs. We just have enough systems that people are already buying them for. Them being on a Ninty system isn't going to help them or Ninty much. Look at the sales of the games that came to Wii U. People here love to say that they were gimped, or use it to prove that there is no interest in those games. There is interest; I just think people would rather buy them on their non-Ninty system. 

I was just thinking how many people do not buy them, not how many do. The AAA games industry is a small box with high costs in the big picture. Essentially we're talking about games most people do not even want to play.

People play the games on the system they're available for. Also, if you already have a system that plays game A, you're not going to get yourself another system for the same game.

Actually, this made me really curious: what is the market share of AAA games? I'm sure someone has made a nice pie chart :D