
Tegra X1 (Switch GPU?) Throttles To 768 MHz (Eurogamer Correct Again?)

Soundwave said:

It may use less power from the wall, but it's not as if a 20 watt SoC is magically going to become an 8 watt SoC.

You sure about that? Decent PSUs in the PC space typically top out at around 80% efficiency.

Cheaper power supplies can be anywhere from 50-60% efficient... and this is without getting into efficiency curves, temperatures, ripple, etc.

So a 20w draw at the wall could mean only 16w actually reaching the hardware at 80% efficiency, or 10w if the PSU is 50% efficient. Still think it's insignificant?

Electrical Engineering 101.
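To put rough numbers on the conversion-loss point, here's a minimal sketch; the 20w wall figure and the efficiency values are just the ones from this discussion, not measurements of any real device:

```python
# DC power actually delivered for a given wall-side (AC) draw.
# The PSU's conversion losses are dissipated as heat, so:
# dc_delivered = ac_draw * efficiency.

def dc_delivered(ac_watts: float, efficiency: float) -> float:
    """Power reaching the device's DC rails for a given AC wall draw."""
    return ac_watts * efficiency

wall_draw = 20.0  # watts measured at the wall (figure from the post above)
for eff in (0.80, 0.60, 0.50):
    print(f"{eff:.0%} efficient PSU: {dc_delivered(wall_draw, eff):.1f}w reaches the hardware")
# 80% -> 16.0w, 60% -> 12.0w, 50% -> 10.0w
```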

Soundwave said:


There's a reason why most tablet makers don't use a Tegra X1 and Nvidia was having problems finding vendors for the chip ... it's too beastly of a chip and consumes way too much power. That makes it better suited for the tasks of being a game machine than a tablet.

nVidia is also expensive. It's not just power consumption that OEMs need to contend with.

Plus, most OEMs have their own SoCs.
Apple builds its own, Samsung builds its own (and together they are a massive share of the entire market), Huawei builds its own, etc.
Then you have Qualcomm for OEMs who want a high-end chip but don't make SoCs themselves, like Microsoft, HTC, etc.
MediaTek and Allwinner target the more budget-oriented devices.

Intel tried to crack the market with its x86 Atom chips, which, aside from a couple of design wins, were a colossal disappointment. Everyone had deals and contracts in place; even when Intel was faster, cheaper and used less power, OEMs went with what they knew.
nVidia is in the same predicament.
nVidia did gain relevance at a few points, such as with Tegra 2 and 3, but that was mostly due to nVidia having lower prices to shift inventory.

Plus, nVidia's CPU performance is pretty terrible in the ARM space, all things considered. Their A57 + A53 big.LITTLE configuration was a dime a dozen, and thus not attractive in a high-end handset.
Thankfully their Big.Super core layout with their latest Tegra is turning that on its head.

Soundwave said:

Running the Unreal Engine 4 demo, Nvidia themselves said the GPU alone consumes 10 watts of electricity. Then you have to factor in the RAM, CPU, WiFi, LCD screen, active fan, etc. all needing power too.

The GPU block itself uses about 1.5w of power. I have already linked you to AnandTech's testing showing this.

Soundwave said:

You're probably well past 15 watts ... 15 watts is a ridiculous power consumption for a mobile device.

But the Switch wouldn't be using 15 watts. Or 20 watts. Are you forgetting about everything else?

Soundwave said:

Nvidia is also somewhat loose in how they define power consumption; they admit, for example, that even the older Tegra K1 can spike to over 10 watts of consumption when pushed to its peaks. Don't believe all the hype on power consumption with these chips. If they were really getting something like 50 GFLOPS per watt, Sony and MS would be using them for their consoles too, as that performance is way better than the console GPUs they're using. My guess is that in reality, peak performance and electrical usage are not close to the ideal scenarios companies like Nvidia and PowerVR like to tout.


Uh. You do know that the Geforce GTX 1080 is an 8873 Gflop (boost) GPU, right? It has a 180w TDP. (And uses close to that in power.)

8873 / 180 = 49.29 for Pascal.

Radeon Pro Duo has 16380 Gflops of total performance with a 350w TDP, for ~47 Gflops per watt.
Radeon R9 Nano is 8192 Gflops at a 175w TDP, also ~47 Gflops per watt.

But here is the kicker.
AMD stated that with the Radeon R9 290X, the GDDR5 memory was responsible for 15-20% of the card's total power consumption, roughly 50 watts, so the DRAM on a GPU consumes a lot of power.

So Pascal actually *DOES* exceed 50Gflop of theoretical single precision floating point performance per watt if you discard the dedicated RAM for the GPU.

Not that flops actually mean much in the real world. You can have a 100 Teraflop GPU end up slower than a 100 Gflop GPU. But it's interesting blowing up your assertion into a billion pieces.
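The arithmetic is easy to check. A quick sketch (TDP-based, so these are theoretical peak flops per rated watt, not measured efficiency; the 15% memory-power share is the assumption borrowed from the R9 290X figure above):

```python
# Theoretical single-precision GFLOPS per watt of TDP for the GPUs cited above.
gpus = {
    "GeForce GTX 1080 (boost)": (8873, 180),
    "Radeon Pro Duo":           (16380, 350),
    "Radeon R9 Nano":           (8192, 175),
}

for name, (gflops, tdp) in gpus.items():
    print(f"{name}: {gflops / tdp:.1f} GFLOPS per watt")
# GTX 1080: ~49.3, Pro Duo: ~46.8, R9 Nano: ~46.8

# If ~15% of board power feeds the dedicated RAM (the R9 290X figure above),
# the GPU core alone clears 50 GFLOPS per watt comfortably:
print(f"GTX 1080 core only: {8873 / (180 * 0.85):.1f} GFLOPS per watt")  # ~58.0
```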


Soundwave said:

They know they can get away with it because most people only use their tablet for low-level gaming and web browsing ... Switch does not have that luxury. It needs to be able to process relatively high-end 3D graphics, and that's where performance comes crashing down to the reality of what's possible with a 6000 mAh battery.

Even if you could put in a ridiculously sized 12,000 mAh battery and made this thing a giant device, there are overheating concerns too. This thing has to be held by little 8-year-olds; it can't be pushing itself to the peak and getting hot to the touch. This is also likely why Nintendo does not allow the system to be held in docked mode. Really, if they wanted just TV streaming, a cheap device à la the Google Chromecast would've worked fine. They don't want kids holding the device when it's running full tilt because it gets hot, IMO.

 


The facts say otherwise. The technology exists. Today. For better performance than the Switch in a handheld.
All I have to do is whisper seductively "16nm FinFET" to prove my point, end of discussion.

Nintendo cheaped out, plain and simple. They cheaped out on the Wii U's hardware, they cheaped out on the Wii's hardware, they cheaped out on the 3DS hardware; this is just a continuation of that trend.
Nintendo doesn't like to use capital-intensive high-end hardware, and that's fine for some, but I am a hardware enthusiast who does tend to expect more for the dollarydoos being placed down on the table.
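For a sense of scale on the 6000 mAh figure quoted above, here's a back-of-the-envelope runtime sketch (the 3.7v cell voltage and the whole-system power draws are assumptions for illustration, not Switch specs):

```python
# Rough runtime from battery capacity: convert mAh to watt-hours via the
# (assumed) cell voltage, then divide by whole-system power draw.

def runtime_hours(capacity_mah: float, cell_voltage: float, system_watts: float) -> float:
    watt_hours = capacity_mah / 1000 * cell_voltage
    return watt_hours / system_watts

for draw in (5.0, 10.0, 15.0):  # candidate whole-system draws in watts (assumed)
    print(f"6000 mAh @ {draw:.0f}w: {runtime_hours(6000, 3.7, draw):.1f} hours")
# ~4.4 h at 5w, ~2.2 h at 10w, ~1.5 h at 15w
```

Which is exactly why a handheld has to keep its total draw in the single digits if it wants multi-hour battery life.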

bdbdbd said:

Also, the wireless controllers are going to consume power, as well as possibly 3G/4G and Wi-Fi. Hardware-wise, tablets and smartphones should be simpler devices than the Switch.

The controllers have their own batteries. Otherwise you wouldn't be able to undock them from the Switch.



--::{PC Gaming Master Race}::--

Pemalite said:

Nintendo cheaped out, plain and simple. They cheaped out on the Wii U's hardware, they cheaped out on the Wii's hardware, they cheaped out on the 3DS hardware; this is just a continuation of that trend.
Nintendo doesn't like to use capital-intensive high-end hardware, and that's fine for some, but I am a hardware enthusiast who does tend to expect more for the dollarydoos being placed down on the table.

bdbdbd said:

Also, the wireless controllers are going to consume power, as well as possibly 3G/4G and Wi-Fi. Hardware-wise, tablets and smartphones should be simpler devices than the Switch.

The controllers have their own batteries. Otherwise you wouldn't be able to undock them from the Switch.

Yeah, the controllers have their own batteries, but judging by the Switch trailer, the controllers are wireless, meaning you need some sort of receiver/transmitter for them, and I'd be willing to bet it consumes power.

If you're a high-end tech enthusiast, what's the point of debating the tech in video game consoles when you know they're not high-end anyway? I can understand doing it for the sake of discussion, but even then it's pointless if your only argument is that "company X shouldn't be making this product for its customers".



Not in Chinese.

Nor in Japanese.

But we stomp it out in fine Korean style.

 

Nintendo games sell only on Nintendo systems.

Miyamotoo said:
DonFerrari said:

Not talking to anyone in particular, hence no quotes or naming. It is true for any fanbase or person that when you expect too much, you can get frustrated with the result.

Agreed, it's always better to have low expectations, but sometimes it's hard to contain the hype. :D

Well, I always end up overhyped a lot of the time... Sony usually delivered on my hype, or close to it. The only issue I saw with some of the Switch rumors is that they didn't make sense in the real world (power vs consumption), and that Nintendo since the Wii hasn't been worried about being very powerful (we even had a speech from the CEO saying they weren't going for power with the NX).

I don't remember you being one of the people buying into rumors of very powerful HW, but certainly, if I were a Nintendo fan, I would allow myself to often think "well, it could be powerful, I would like it, so perhaps I can get excited".

Snip

Pemalite said:

I see you talking often about Gflops not being everything, and on the surface I can understand that: GHz isn't a precise measure of capacity either, Gflops aren't the only variable, and you also have bandwidth, efficiency, coding, etc...

But so as not to stay on the hyperbole of a 100Tflop GPU running worse than a 100Gflop one... in the real world, what would the expected deviations be? Like, would a last-gen GPU performing at 1Tflop be roughly equivalent to a current-gen GPU at 850Gflops, etc.?



duduspace11 "Well, since we are estimating costs, Pokemon Red/Blue did cost Nintendo about $50m to make back in 1996"

http://gamrconnect.vgchartz.com/post.php?id=8808363

Mr Puggsly: "Hehe, I said good profit. You said big profit. Frankly, not losing money is what I meant by good. Don't get hung up on semantics"

http://gamrconnect.vgchartz.com/post.php?id=9008994

Azzanation: "PS5 wouldn't sold out at launch without scalpers."

DonFerrari said:
Miyamotoo said:

Agreed, it's always better to have low expectations, but sometimes it's hard to contain the hype. :D

Well, I always end up overhyped a lot of the time... Sony usually delivered on my hype, or close to it. The only issue I saw with some of the Switch rumors is that they didn't make sense in the real world (power vs consumption), and that Nintendo since the Wii hasn't been worried about being very powerful (we even had a speech from the CEO saying they weren't going for power with the NX).

I don't remember you being one of the people buying into rumors of very powerful HW, but certainly, if I were a Nintendo fan, I would allow myself to often think "well, it could be powerful, I would like it, so perhaps I can get excited".

 

 

Snip

We already know more or less the power of the Switch. I am hyped about the games, to see rumored games and games we don't know anything about running on Switch.



Miyamotoo said:

my expectations about Switch are very realistic and actually on par with what we know about Switch.

Like when you predicted Switch would be 600 Gigaflops or more, claimed bonzo's prediction of 400 was unrealistically low, and said that Switch would "get lots of PS4/Xbox One games with no noticeable downgrades"?



Scisca said:
If it turns out to be true that Switch uses the old Maxwell chip instead of Pascal, and the older ARM CPU... I think I'm gonna hold off on my purchase. If it's true, I'm gonna be pissed at Nintendo for lying to me, since they talked about the top GPU architecture, which Maxwell isn't even close to being. If it's Pascal, I'm gonna buy it; if it's Maxwell, I'm gonna wait for the inevitable (imho) revision that will be using newer technology. If only for battery life.

Nintendo, don't mess this up.

This train of thought really confuses me. Why don't you just buy the device based on the games and features it has? Why is the difference between Pascal/Maxwell so important?



When the herd loses its way, the shepherd must kill the bull that leads them astray.

Miyamotoo said:
Scisca said:
If it turns out to be true that Switch uses the old Maxwell chip instead of Pascal, and the older ARM CPU... I think I'm gonna hold off on my purchase. If it's true, I'm gonna be pissed at Nintendo for lying to me, since they talked about the top GPU architecture, which Maxwell isn't even close to being. If it's Pascal, I'm gonna buy it; if it's Maxwell, I'm gonna wait for the inevitable (imho) revision that will be using newer technology. If only for battery life.

Nintendo, don't mess this up.

Lol, Nintendo never said that.

Sources are saying that Pascal couldn't be done in time (Tegra Pascal isn't on the market yet), and even Nvidia's newest Shield devices (introduced a few days ago) are using the Tegra X1 Maxwell chip. The only real difference between Pascal and Maxwell Tegra is efficiency, and sources say Nintendo is aiming for 5-8 hours of battery life with the Switch.

"The Nintendo Switch will be using the same architecture as the world's top performing Geforce graphics cards"

So yeah, they did. It's not on the market? Not my problem. I'm not the one who lied about the architecture. There's no spinning this; Pascal architecture would be better in every way: battery life, performance, and it would allow for cheaper production and an earlier price drop.

I hate it when companies bs me like this.



Wii U is a GCN 2 - I called it months before the release!

My Vita to-buy list: The Walking Dead, Persona 4 Golden, Need for Speed: Most Wanted, TearAway, Ys: Memories of Celceta, Muramasa: The Demon Blade, History: Legends of War, FIFA 13, Final Fantasy HD X, X-2, Worms Revolution Extreme, The Amazing Spiderman, Batman: Arkham Origins Blackgate - too many no-gaemz :/

My consoles: PS2 Slim, PS3 Slim 320 GB, PSV 32 GB, Wii, DSi.

zorg1000 said:
Scisca said:
If it turns out to be true that Switch uses the old Maxwell chip instead of Pascal, and the older ARM CPU... I think I'm gonna hold off on my purchase. If it's true, I'm gonna be pissed at Nintendo for lying to me, since they talked about the top GPU architecture, which Maxwell isn't even close to being. If it's Pascal, I'm gonna buy it; if it's Maxwell, I'm gonna wait for the inevitable (imho) revision that will be using newer technology. If only for battery life.

Nintendo, don't mess this up.

This train of thought really confuses me. Why don't you just buy the device based on the games and features it has? Why is the difference between Pascal/Maxwell so important?

Battery life and performance mainly. The kicker being that going for Maxwell makes a future Pascal revision a no-brainer, so I might as well wait for it.



Wii U is a GCN 2 - I called it months before the release!

My Vita to-buy list: The Walking Dead, Persona 4 Golden, Need for Speed: Most Wanted, TearAway, Ys: Memories of Celceta, Muramasa: The Demon Blade, History: Legends of War, FIFA 13, Final Fantasy HD X, X-2, Worms Revolution Extreme, The Amazing Spiderman, Batman: Arkham Origins Blackgate - too many no-gaemz :/

My consoles: PS2 Slim, PS3 Slim 320 GB, PSV 32 GB, Wii, DSi.

bdbdbd said:

Yeah, the controllers have their own batteries, but judging by the Switch trailer, the controllers are wireless, meaning you need some sort of receiver/transmitter for them, and I'd be willing to bet it consumes power.

It will likely use Bluetooth, which is extremely energy-efficient.

bdbdbd said:

If you're a high-end tech enthusiast, what's the point of debating the tech in video game consoles when you know they're not high-end anyway? I can understand doing it for the sake of discussion, but even then it's pointless if your only argument is that "company X shouldn't be making this product for its customers".

First and foremost, I am a PC gamer. I do enjoy my tech.

But I also wish for Nintendo, Microsoft and Sony to succeed and be competitive; competition breeds innovation and allows tech to advance even more rapidly.
And just so we are clear, there is more to tech than just the performance of a system.

Thus, I will happily ridicule any platform which doesn't strive to push boundaries in technology. That's not a bad thing either; it's a good thing. These companies need constructive criticism to change, get better, and appeal to our wallets.
Being apologetic does nothing.

For example, Microsoft was heavily ridiculed for the Xbox One: its price, its performance. So Microsoft boosted the clock rate of its SoC, got rid of Kinect (which freed up GPU resources and DRAM), optimized its various software stacks... You name it. The consumer won.

DonFerrari said:
Pemalite said:

I see you talking often about Gflops not being everything, and on the surface I can understand that: GHz isn't a precise measure of capacity either, Gflops aren't the only variable, and you also have bandwidth, efficiency, coding, etc...

But so as not to stay on the hyperbole of a 100Tflop GPU running worse than a 100Gflop one... in the real world, what would the expected deviations be? Like, would a last-gen GPU performing at 1Tflop be roughly equivalent to a current-gen GPU at 850Gflops, etc.?

You are correct. It is hyperbole.
But it's not impossible; I was using it as an example.

If you took a 100 Petaflop GPU and gave it no cache, no GDDR memory, 1 Render Output Pipeline (ROP) of questionable capability, 1 Texture Mapping Unit of questionable capability, and 1MB/s of bandwidth to system RAM... then it will be slower than a 100Gflop GPU which suffers from none of those shortfalls.

If we take a Radeon 5870, for example, with its 2720 Gflops of performance, and compare it against a Radeon 7850, which is 1761 Gflops, the Radeon 7850 is going to be faster, even in compute tasks, despite being almost a Teraflop behind on paper.
But don't take my word for it: http://www.anandtech.com/bench/product/1062?vs=1076

The Radeon 7850 is based on Graphics Core Next, which is arguably the true successor to the Radeon 5870's VLIW5 architecture as well. (VLIW4 was a cost-efficient reworking of VLIW5.)

Flops alone tell us nothing. It's literally a theoretical number that ignores the rest of the GPU.
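To see why in miniature, here's a roofline-style sketch: attainable throughput is capped by min(peak compute, bandwidth × arithmetic intensity). Both "GPUs" below are invented for illustration, as is the workload's intensity figure:

```python
# Roofline model: attainable GFLOP/s = min(peak, bandwidth * flops_per_byte).

def attainable_gflops(peak_gflops: float, bandwidth_gbs: float, flops_per_byte: float) -> float:
    return min(peak_gflops, bandwidth_gbs * flops_per_byte)

intensity = 4.0  # flops per byte of memory traffic for some workload (assumed)

monster = attainable_gflops(100_000_000, 0.001, intensity)  # 100 Pflops peak, 1MB/s
modest  = attainable_gflops(100.0, 100.0, intensity)        # 100 Gflops peak, 100GB/s

print(f"100 Pflop GPU on 1MB/s:   {monster} GFLOP/s attainable")  # 0.004
print(f"100 Gflop GPU on 100GB/s: {modest} GFLOP/s attainable")   # 100.0
```

The starved monster never gets fed; the modest part runs at its full theoretical rate.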



--::{PC Gaming Master Race}::--

Pemalite said:
Soundwave said:

It may use less power from the wall, but it's not as if a 20 watt SoC is magically going to become an 8 watt SoC.

You sure about that? Decent PSUs in the PC space typically top out at around 80% efficiency.

Cheaper power supplies can be anywhere from 50-60% efficient... and this is without getting into efficiency curves, temperatures, ripple, etc.

So a 20w draw at the wall could mean only 16w actually reaching the hardware at 80% efficiency, or 10w if the PSU is 50% efficient. Still think it's insignificant?

Electrical Engineering 101.

As a licensed electrician, it would kill me not to say anything about this statement, which, if I understand it correctly, is incorrect.

If I'm understanding this correctly, you're saying a 20w power supply connected to a device with a 20w-rated SoC wouldn't allow it to actually pull 20w, but would only be able to pull 16w due to 80% efficiency? That is mostly incorrect. You have the efficiency portion of it correct, though.

In the case of the Switch, if you have a power supply that is rated for 20w, that means it will be able to output 20w of DC power max. DC is what electronics use. Now, since power supplies are not 100% efficient, and the best you can get is around 80%, what that actually means is that the 20w power supply has an input rating of 25w AC. The power lines and all buildings have AC power (I know some have DC, but it's so new and minuscule, let's not cause confusion).

This is due to the law of conservation of energy and the first law of thermodynamics. When AC power from the wall is transformed into DC power for the electronic device, there are some energy losses in the form of heat. This is why power supplies used to run so much hotter than they do now. Old power supplies were around 50% efficient, so a large amount of heat was created. Today's power supplies are around 80%, and sometimes higher, so they create much less heat.

Now for the battery: if the DC voltage of the battery doesn't match the voltage of some of the electronics in the device, then it needs to be converted, but DC-to-DC conversion is done quite differently than AC-to-DC, and to make it short, the losses in a DC-to-DC conversion are quite minimal. This is of course all calculated when designing the system, so supplying enough power, whether through the power supply or the battery, will feed the device properly with whatever it requires.
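A small worked example of the relationship described above (the 20w figure is the one from this thread; the efficiency values are typical ballpark numbers, not measurements):

```python
# Conservation of energy: AC drawn from the wall = DC delivered / efficiency,
# with the difference dissipated in the supply as heat.

def ac_input(dc_out_watts: float, efficiency: float) -> float:
    return dc_out_watts / efficiency

dc_load = 20.0  # watts of DC the supply must deliver
for eff in (0.80, 0.50):
    ac = ac_input(dc_load, eff)
    print(f"{eff:.0%} efficient: {ac:.1f}w AC in, {ac - dc_load:.1f}w lost as heat")
# 80% -> 25.0w in (5.0w heat); 50% -> 40.0w in (20.0w heat)
```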

If I'm way off base and this isn't what you meant, I apologize in advance.