
Wii U CPU + GPU clock speeds increased? Is it even possible?

fillet said:
BlkPaladin said:
hsrob said:
First question. Is it now drawing 45W of power versus the ~35W it was using initially? Surely that can't be too hard for someone to figure out.


All it takes is a watt meter, a bit of math, and then factoring out the other devices' drain (disc drive, etc.).


35W -> 45W would be nowhere near enough for an increase from 1.5GHz to 3GHz+ on the CPU.

Google some power draw stats on desktop CPUs; it doesn't add up.

I already brought that up in an earlier post. Using basic math, 35W to 45W is only about a 29% increase, and that's the theoretical maximum headroom, some of which is lost to heat and other inefficiencies anyway.

I was just answering how you would figure out whether it's drawing more power. I forgot to add that you'd also need a baseline reading from a machine that hasn't had the patch, so you have something to compare against.
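A quick sketch of that arithmetic; all the figures below are placeholders, you'd plug in your own watt-meter readings:

```python
# Rough sketch of the watt-meter comparison described above.
# All readings are placeholder values, not actual measurements.

def console_draw(total_w: float, other_devices_w: float) -> float:
    """Subtract the drain of other components (disc drive, etc.) from the meter reading."""
    return total_w - other_devices_w

baseline = console_draw(total_w=35.0, other_devices_w=3.0)  # reading from an unpatched machine
patched = console_draw(total_w=45.0, other_devices_w=3.0)   # reading from a machine after the update

increase_pct = (patched - baseline) / baseline * 100
print(f"Power draw change: {increase_pct:.1f}%")            # ~31% with these made-up numbers
```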




I overclocked my i7 to 10GHz with a little firmware tweak so this is totally feasible too.



BaldrSkies said:
I overclocked my i7 to 10GHz with a little firmware tweak so this is totally feasible too.

What are you cooling it with? Liquid hydrogen? I doubt even liquid nitrogen could keep it stable at 10 GHz for any length of time...



JEMC said:
Zero999 said:
JEMC said:

The rumored new speed for the CPU is unbelievable. Really. No one can believe that they almost tripled its speed with no serious consequences.

But a more modest increase could be possible. Heck, Nintendo said the Wii U was designed to use about 45W when playing, yet when it launched and the media tested it, they recorded a consumption of "only" 35W.

Is it that uncommon for a new console not to be pushed to its limits? Also, I think 75W is the Wii U's limit.

About the CPU/GPU clock increase, I have no idea whether it can be done or not, but I've read it wouldn't cause any problems if the cooling and the rest of the system were designed with that in mind to begin with. If true, it would come as a surprise to everyone, since so far increasing a console's power was only possible with add-ons.

No, that's the rated wattage for the power brick.

From those 72 or 75W you have to subtract what is lost to the efficiency of the power brick (most new PC power supplies are around 85% efficient or higher), then there's the overhead that must be left for safety reasons (you don't really want it running at 100% of its capacity all the time), plus other things that also cost you a few watts.


Doesn't work like that.

A PSU has a rated maximum it can provide to a system; the efficiency loss/wasted energy happens BEFORE that output, so it has no bearing at all on anything we are discussing (though it might be a good topic on a Greenpeace-type forum).

A power brick rated at 75W can provide a consistent 75W of power to a device. The Wii U obviously wouldn't be running at that limit 100% of the time, but it is perfectly "OK" from a design perspective to draw 75W 100% of the time, as that is what it is rated for. That's not the hard "limit" either; the actual limit would be higher than 75W, but 24/7 use can't be guaranteed at a higher draw than that.

Computer PSUs are the same: a PSU rated at "650W" has to be able to provide 650W continuously to the system (with caveats about certain rails only being allowed a certain share of that power draw, but that's a separate PC thing). The actual draw at the wall is going to be around 750-800W on a fairly decent PSU, which is where the 80%+ efficiency rating comes from.

Note, as said above, those "80%+" certified PSUs aren't just outputting 80% of 650W, for example :)
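A rough back-of-the-envelope version of the same point; the efficiency figures here are just assumptions for illustration, not measured values:

```python
# Sketch: a PSU's rating is the power it can DELIVER to the device; efficiency
# losses happen on the wall side, before that output. Efficiency values assumed.

def wall_draw(rated_output_w: float, efficiency: float) -> float:
    """Power pulled from the wall socket to deliver the rated output."""
    return rated_output_w / efficiency

print(wall_draw(75, 0.85))    # ~88 W at the wall for a 75 W brick at an assumed 85% efficiency
print(wall_draw(650, 0.80))   # ~813 W at the wall for a "650 W" PC PSU at 80% efficiency
```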



It's certainly possible. The Wii U may have been running under spec so that they could make their claims of "most efficient" console, which I'm sure matters to someone out there in the world. It wouldn't be tough to raise its clock speed and power consumption if it was designed to run at that speed from the outset. It could very well be that Nintendo was running it below max, since all of the games so far were either Mario and Nintendo Land or current-gen ports. Perhaps with more demanding games on the way, and with some bugs worked out, it was time to up the power of the system.

I do think that the rumor sounds like garbage though. It seems completely unfounded.



BaldrSkies said:
I overclocked my i7 to 10GHz with a little firmware tweak so this is totally feasible too.


Useless post of the day goes to you. As this guy said ^, that's not even possible on liquid nitrogen.



I think Neogaf debunked this.

Maybe it's possible, but this update didn't do anything.




Wait, people are taking my post seriously?

It's about as credible as the numbers posted in this rumor, though, I suppose...



Nem said:
fillet said:
Nem said:
fillet said:
KHlover said:
Why shouldn't it be? The PSP also did this.


PSP was underclocked by default for power reasons.

That is to say, it was running under specification, so clock speeds weren't "increased"; they were just set as high as they should have been in the first place.

Don't know if that's the case with the Wii U; it's incredibly unlikely, or unbelievable (genuinely, not in the "amazing, unbelievable" way).

Going from circa 1.5GHz to 3.xx GHz is asking a bit much; it's hard to take seriously that Nintendo would have released the console at only 1.5GHz, considering the moaning the low CPU speed has caused and the frame rate issues it gets blamed for in the current ports.

There's no reason for the Wii U to have been underclocked like the PSP was; power obviously isn't an issue, as it doesn't run on batteries.

 

Yes there was. Because of the nuclear disaster, there are very pressing concerns about energy consumption in Japan.

I am sure Nintendo was aware of this and tried to avoid any bad situations with the media.

 

Though I have to admit, this is a masterful move by Nintendo if it's true. They should have revealed it sooner, though.

That's up there with the old UFO conspiracy theories for believability.

...edit...unless you're joking, hope so!!!!!???


I think you're not showing much tact here. It has been said that it was; unfortunately, I don't recall where I read it.

It is a very serious matter, and I don't understand why you're treating it like it's nothing. It's a big deal in Japan because they want to shut down the nuclear plants; there has been a huge debate about it in the country. It's only natural that a Japanese company would take that into its marketing analysis.

What do you think would happen if there were a nuclear meltdown near your town? How would people feel about the need for nuclear power and how to reduce that need? Nintendo is branded as a family-friendly company. They can't be involved in such scandals; it would severely damage their image and business.

And am I saying that's why they hid the true clock speed? No. But it was a very serious concern when designing the system for the Japanese market.

You misinterpret my words. I'm not disputing that companies might take a general stance on that, but I don't believe for a second that there's any direct evidence to support such a direction at a noticeable, measurable level.

You give the world of business too much credit for caring about being green unless it makes the business look good. If this turned out to be true, it would contradict your own point: a company releasing a "green" product only to "unleash the beast within" with something that draws at least 25% more power (which this would have to do, since power draw grows faster than linearly as CPU clock speed increases)... is hardly going to look good to the public.
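As a rough illustration of why the draw wouldn't scale gently; the capacitance and voltage figures below are made-up placeholders, not Wii U numbers:

```python
# Sketch of CMOS dynamic power: P ~ C * V^2 * f.
# Pushing the clock higher usually also requires a higher voltage, so power
# grows faster than linearly with frequency. All values are illustrative.

def dynamic_power(capacitance_f: float, voltage_v: float, freq_hz: float) -> float:
    return capacitance_f * voltage_v ** 2 * freq_hz

C = 1e-9  # assumed switched capacitance, in farads

base = dynamic_power(C, 1.00, 1.5e9)     # ~1.5 GHz at an assumed 1.00 V
doubled = dynamic_power(C, 1.15, 3.0e9)  # ~3 GHz with a modest assumed voltage bump

print(f"Relative power at double the clock: {doubled / base:.1f}x")  # ~2.6x, not just 2x
```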

I'm not anti-green or any of that; it just sounds like a theory with no evidence.

If you could post a link to back up your statement, though, I'd be grateful. It would be an interesting read, and I have no problem changing my apparently narrow-minded view, which I know I can be prone to sometimes.

(seriously, that's not a piss take) :)



fillet said:
BlkPaladin said:
hsrob said:
First question. Is it now drawing 45W of power versus the ~35W it was using initially? Surely that can't be too hard for someone to figure out.


All it takes is a watt meter, a bit of math, and then factoring out the other devices' drain (disc drive, etc.).


35W -> 45W would be nowhere near enough for an increase from 1.5GHz to 3GHz+ on the CPU.

Google some power draw stats on desktop CPUs; it doesn't add up.

I wasn't suggesting that would be proof of the claims in the OP; I don't for a moment believe the CPU numbers. But if it's not drawing any more power than before, that would surely rule out any kind of up/over-clocking. If the power draw has increased, then clearly something has changed that might warrant further investigation.