
Tegra X1 (Switch GPU?) Throttles To 768 MHz (Eurogamer Correct Again?)

Pemalite said:

Because there is more to a SoC than only the GPU. *yawn*
Remember the power consumption test that Anandtech did on *just* Tegra's GPU and it only consumed 1.5w of power? Yeah. Thought so.

You've said the GPU has more TDP headroom. No, it does not.

That 1.5W figure is a misinterpretation of the data. The test was run with an underclocked GPU, and the point wasn't to show how much the GPU consumes but how efficient it can be: at 1.5W the X1's GPU matches the performance of the iPad Air 2's (A8X) GPU running at 2.5W. It's unbelievable that a device which regularly hovers at ~15W under load would have only a 1.5W GPU, when that GPU takes up a significant portion of the SoC's die area. The A57s are power hungry (which explains the Switch's heavy downclock), but not that hungry. The X1's GPU regularly performs over 2x the A8X in graphics benchmarks, so assuming power scales linearly (it doesn't), that would be 3W; since it's nonlinear, let's say 4W. Even then, I think that's conservative; I wouldn't be surprised if it's >5W at full clock speed.
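Roughly, that scaling argument looks like this as a back-of-the-envelope sketch in Python (the exponent here is just an assumption about how steep the nonlinear penalty is, not a measured value):

    def full_speed_power(measured_w, perf_multiplier, exponent=1.0):
        # Extrapolate GPU power from the capped-clock measurement to full performance.
        # exponent = 1.0 is the linear case; > 1.0 models the nonlinear penalty.
        return measured_w * perf_multiplier ** exponent

    print(full_speed_power(1.5, 2.0, 1.0))             # 3.0 W if power scaled linearly
    print(round(full_speed_power(1.5, 2.0, 1.5), 1))   # ~4.2 W with a superlinear penalty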

 



nomad said:
Pemalite said:

Because there is more to a SoC than only the GPU. *yawn*
Remember the power consumption test that Anandtech did on *just* Tegra's GPU and it only consumed 1.5w of power? Yeah. Thought so.

You've said the GPU has more TDP headroom. No, it does not.

That 1.5W figure is a misinterpretation of the data. The test was run with an underclocked GPU, and the point wasn't to show how much the GPU consumes but how efficient it can be: at 1.5W the X1's GPU matches the performance of the iPad Air 2's (A8X) GPU running at 2.5W. It's unbelievable that a device which regularly hovers at ~15W under load would have only a 1.5W GPU, when that GPU takes up a significant portion of the SoC's die area. The A57s are power hungry (which explains the Switch's heavy downclock), but not that hungry. The X1's GPU regularly performs over 2x the A8X in graphics benchmarks, so assuming power scales linearly (it doesn't), that would be 3W; since it's nonlinear, let's say 4W. Even then, I think that's conservative; I wouldn't be surprised if it's >5W at full clock speed.

 

Even at 5 watts, that's 100 GFLOPS/watt; if Nvidia was actually achieving that, everyone would be using Nvidia. 

You could have a 1.4 teraflop machine for just 14 watts. The Xbox One, die-shrunk to 14nm/16nm (an even smaller process than the Tegra X1's 20nm), still consumes over 60 watts ... there's no way Nvidia has that kind of magic up their sleeves, especially with a chip that is about two years old. 
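To put rough numbers on that (a sanity check only; the 512 GFLOPS figure is the X1's advertised FP32 peak at its full ~1 GHz clock, and the 5W is just the estimate from the post above):

    tegra_fp32_gflops = 512   # Tegra X1 advertised FP32 peak at ~1 GHz
    claimed_gpu_watts = 5     # the GPU power estimate quoted above

    efficiency = tegra_fp32_gflops / claimed_gpu_watts
    print(round(efficiency), "GFLOPS per watt")   # ~102, i.e. roughly 100 GFLOPS/W

    # At that efficiency, an Xbox One class GPU (~1.3-1.4 TFLOPS) would need only:
    print(round(1400 / efficiency), "W")          # ~14 W, versus 60+ W for the real console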



All this talk, and the only thing that's gonna matter to the average consumer is how pretty the games look. That's not to say we shouldn't talk about this or lower our expectations (I'd prefer a powerhouse too), but let's remember what's really important here - how many major games we will miss out on with the Switch platform.



AngryLittleAlchemist said:
All this talk, and the only thing that's gonna matter to the average consumer is how pretty the games look.

Well, specs do tend to have a direct correlation with how "pretty" games look.



curl-6 said:
AngryLittleAlchemist said:
All this talk, and the only thing that's gonna matter to the average consumer is how pretty the games look.

Well, specs do tend to have a direct correlation with how "pretty" games look.

Don't talk that way to me, Curl. 



AngryLittleAlchemist said:
curl-6 said:

Well, specs do tend to have a direct correlation with how "pretty" games look.

Don't talk that way to me, Curl. 

What way? I'm not saying specs are all that determines beauty, simply that, for example, PS4 games are on average prettier than Wii U games. Yes, people care about how pretty games are, but that in turn means that they tend to care about specs also.



curl-6 said:
AngryLittleAlchemist said:

Don't talk that way to me, Curl. 

What way? I'm not saying specs are all that determines beauty, simply that, for example, PS4 games are on average prettier than Wii U games. Yes, people care about how pretty games are, but that in turn means that they tend to care about specs also.

Calm down lol, I was joking. 

 

Sorry. Every once in a while I find myself making generic-ass statements like that. It's somewhat disingenuous, I'll concede. 

 

It really depends on how much worse the games will look scaled back, and we won't know until we see them. I think the average consumer could probably overlook scaled-back graphics though... especially when the difference is from high to medium (but that's just UE4... just an engine, and a specific one at that). I think even a 720p standard resolution for docked, hardware-intensive games wouldn't do much. I think what's more concerning will be what customers read on the back of the box (i.e. 900p, 720p, sub-HD, etc.) rather than what they see. Customers might not be able to tell the difference from a resolution standpoint, but will care when you're upfront about it. Maybe I'm not giving them enough credit. Then again, there are a lot of people who still can't see the difference between 30 and 60 fps.



AngryLittleAlchemist said:
curl-6 said:

What way? I'm not saying specs are all that determines beauty, simply that, for example, PS4 games are on average prettier than Wii U games. Yes, people care about how pretty games are, but that in turn means that they tend to care about specs also.

Calm down lol, I was joking. 

Sorry. Every once in a while I find myself making generic-ass statements like that. It's somewhat disingenuous, I'll concede. 

It really depends on how much worse the games will look scaled back, and we won't know until we see them. I think the average consumer could probably overlook scaled-back graphics though... especially when the difference is from high to medium (but that's just UE4... just an engine, and a specific one at that). I think even a 720p standard resolution for docked, hardware-intensive games wouldn't do much. I think what's more concerning will be what customers read on the back of the box (i.e. 900p, 720p, sub-HD, etc.) rather than what they see. Customers might not be able to tell the difference from a resolution standpoint, but will care when you're upfront about it. Maybe I'm not giving them enough credit. Then again, there are a lot of people who still can't see the difference between 30 and 60 fps.

My bad, I'm absolutely terrible at detecting sarcasm or jokes. XD



curl-6 said:
AngryLittleAlchemist said:

Calm down lol, I was joking. 

Sorry. Every once in a while I find myself making generic-ass statements like that. It's somewhat disingenuous, I'll concede. 

It really depends on how much worse the games will look scaled back, and we won't know until we see them. I think the average consumer could probably overlook scaled-back graphics though... especially when the difference is from high to medium (but that's just UE4... just an engine, and a specific one at that). I think even a 720p standard resolution for docked, hardware-intensive games wouldn't do much. I think what's more concerning will be what customers read on the back of the box (i.e. 900p, 720p, sub-HD, etc.) rather than what they see. Customers might not be able to tell the difference from a resolution standpoint, but will care when you're upfront about it. Maybe I'm not giving them enough credit. Then again, there are a lot of people who still can't see the difference between 30 and 60 fps.

My bad, I'm absolutely terrible at detecting sarcasm or jokes. XD

Lol, I felt like a dick when you didn't get what I was saying XD 



Soundwave said:
nomad said:

You've said the GPU has more TDP headroom. No, it does not.

That 1.5W figure is a misinterpretation of the data. The test was run with an underclocked GPU, and the point wasn't to show how much the GPU consumes but how efficient it can be: at 1.5W the X1's GPU matches the performance of the iPad Air 2's (A8X) GPU running at 2.5W. It's unbelievable that a device which regularly hovers at ~15W under load would have only a 1.5W GPU, when that GPU takes up a significant portion of the SoC's die area. The A57s are power hungry (which explains the Switch's heavy downclock), but not that hungry. The X1's GPU regularly performs over 2x the A8X in graphics benchmarks, so assuming power scales linearly (it doesn't), that would be 3W; since it's nonlinear, let's say 4W. Even then, I think that's conservative; I wouldn't be surprised if it's >5W at full clock speed.

 

Even at 5 watts, that's 100 GFLOPS/watt; if Nvidia was actually achieving that, everyone would be using Nvidia. 

You could have a 1.4 teraflop machine for just 14 watts. The Xbox One, die-shrunk to 14nm/16nm (an even smaller process than the Tegra X1's 20nm), still consumes over 60 watts ... there's no way Nvidia has that kind of magic up their sleeves, especially with a chip that is about two years old. 

100 GFLOPS/W, wow, I wasn't even thinking it through, just throwing numbers out there. We all know FLOPS aren't directly comparable across the board, even between the X1 and desktop Maxwell-based graphics cards, but that number is still unreal. On that note, I take back my initial estimate figures.