
Forums - Nintendo - Talk about amateur hour journalism.

 

So, let me preface this by explaining that I feel the developers are completely misunderstanding the Wii U. As with any new generation of consoles, there are bound to be some kinks the developers don’t yet understand with the new hardware. Out of the gate, let’s tackle the main complaint: the “slow” CPU. You see, the clock speed of the CPU is slow. In fact, it’s intentionally slow. It’s true that the CPU of the Wii U is a Power-based custom IBM processor. For those wondering, it’s not Power 7 based. Essentially, it means it’s built upon the same technology as the Wii. That’s on purpose, mind you.

The CPU is more than just the Wii’s version times three. For starters, it uses a newer technology from IBM called 45nm SOI process. This is a massive improvement over the 90nm SOI process the Wii version had. Essentially, this makes the processor “overclock” at higher end speeds while reducing power consumption. Essentially, it makes the Wii U’s processor run even faster than it’s clock speeds with higher efficiency and lower power output. Of course, that’s only going to make the processor run just a bit better, so it still is fundamentally slow. However, the CPU uses a significant amount of eDRAM. What this allows the Wii U to do, essentially, is move more data at once compared to the Wii processor times three. You can store more data in the eDRAM and as such, move more at once as well. So, the processor is slower, but similar to the Power 7 tech, it moves more data at once at that slower rate than the standard processor at such rates.

Again though, the CPU is not Power 7 based, so it only increases the performance level by a certain amount. Certainly though, the performance increase does fundamentally make it have a higher efficiency than whats in the current gaming consoles. It just won’t win any awards in the next generation. However, thanks in large part to the super beefy GPU (by console standards), the Wii U doesn’t need a beefy processor. The GPU is based on a Radeon HD 5670. As the console breakdown shows, thanks to even more eDRAM, the Wii U version of the Radeon HD 5670 that has been custom built is actually more powerful than the original PC version of the card. This is extremely vital, because the HD 5670 is in fact a GPGPU.

A GPGPU are quickly become standard in gaming rig PCs for a few reasons, but chief among them is the fact that GPGPU’s actually handle several functions that the CPU traditionally does, except it does them better. In layman’s terms, some of the most CPU (processor) heavy applications like Physics are actually handled by the Graphics Card (GPU). This means, in a Wii U, while the CPU can handle these tasks, the console is actually built around the notion that the GPU will tackle said tasks.

The downfall of the GPGPU is that if it’s not being used in the fashion it’s intended to, which means taking on heavy graphical processing and extreme mathematical equations to perform in the game (aka, if you’re not heavily using the GPU and it’s processing), it actually performs slower. So if you run the bare bones through it and throw everything else at the processor, it not only overstrains a processor not meant to handle such functions, it actually causes the GPU to under-perform, and as such… creates problems. This is chiefly why games like Mass Effect 3 are having issues, and why Batman Arkham City Armored Edition is having massive frame rate issues.

In essence, it’s not that the Wii U is underpowered. Rather, it’s the the Wii U uses more advanced technology that actually takes away the stress normally associated with a CPU, and in turn lowers power consumption, heat level, and ultimately leads to producing some of the best graphical capabilities on the market. Essentially, with everything customly aimed towards gaming, the Wii U is like a middle of the road gaming PC. A middle of the road gaming rig absolutely destroys the PS3 and Xbox 360.

Now, this doesn’t mean everything is peachy. Unless the PS4 and the 720 are being silly, both should also feature GPGPU’s like the Wii U, and will both likely have better processors than the Wii U. So, the Wii U will be behind, but it’s unlikely the PS4 and 720 will feature GPU’s that are any beefier than the Wii U. In fact, since they may have better processors (or, should based on normal conventions), it’s conceivable the other consoles in turn have less powerful GPGPU’s since the processor can help maintain some of what is lost.

I know, it’s a lot of technical jargon, but armed with this information you can see that the Wii U is actually going to hold its own quite nicely. It just requires people, like the DICE dev, to fundamentally change the way in which they make games take advantage of the hardware. Similar issues arose with the PS3 and its CPU back in the day, and now the issue rises up with the Wii U and its GPU.

In the future, developers would be wise to truly take hold of the beefy GPGPU in the Wii U and push it as hard as they can. With all the extra eDRAM running around, and all the processing power the GPU has to handle such aspects like Physics, it’s no wonder current generation games are struggling. Current generation (or last generation for us Wii U owners) are extremely CPU based games. Heavily reliant on the CPU pushing through. In the next generation, this is going to change significantly. The Wii U is already there. It’s just going to take time for console developers to get into the mindset to take advantage of the GPGPU featured in the Wii U, just like what will happen in the other next generation consoles.

In layman’s terms: The Wii U is a sexy, misunderstood, true next generation gaming system that requires developers to change the way in which they program their games in order to get significantly better looking, and better performing, games than what are currently out there. Any concerns over a slow processor are completely negated by the fact the Graphics Card on the Wii U can actually handle some of the larger processes, like physics, usually reserved for the processor.

The Wii U is going to be just fine folks. While it wont be as powerful as the PS4 or 720, that beefy GPGPU is going to ensure the Wii U can stay up with the new systems that are coming. Naturally, the biggest reason to go against console convention and use a GPGPU like the Wii U does is costs. GPU’s are just a lot cheaper, and when you start to be able to toss the larger processes at them like Physics, it pretty much increases the efficiency of the entire console 10 fold. The Wii U is doing something that the PS3 and 360 can only dream of doing presently. Truly, they have to catch up, as do the developers, because the way the hardware processes games is changing, and the Wii U is perfectly set up for that shift.

 

 

I thought so much of this was out-and-out wrong and ridiculous, showing a complete lack of even the most basic understanding of hardware, that it needed to be posted here....

Is this guy really trying to educate developers about CPU power because he thinks they misunderstand it!!!?

Articles like this just go to show, natural selection really has left us for good.

 

HD 5670 a GPGPU? Bullshit.

45nm process Vs 90nm process means CPU will be faster at same clock speed? Bullshit.

Explanation of benefit of eDRAM to CPU? Bullshit.

Something about the Wii-U being equivalent to a mid-range PC!!!? Bullshit, the PC version of the GPU in the Wii-U cost £50 2.5 years ago....Fuck it, it's so full of shit I'm starting to feel like I'm reading the comments from a fanboy who's deliberately trying to wind people up.

Haven't seen an article this bad on something of this nature in a while, though.




Problem sounds very similar to the PS3 when it first came out. So to make sure I understand Wii U games are underperforming due to the games that were optimized for ps360 not being optimized for Wii U, which is similar to 360 vs PS3 early this gen...still kinda going on currently?



Essentially, this makes the processor “overclock” at higher end speeds while reducing power consumption.

The main effects of recent shrinks are cost and power consumption, not clocks. Overclocking is also the wrong word.
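To make the die-shrink point concrete, here's a toy calculation of dynamic CPU power, which scales roughly as P ≈ C·V²·f. All numbers below are invented for illustration, not real Wii U figures: the shrink cuts switched capacitance and allows a lower voltage, so the same clock costs far less power, but a shrink by itself doesn't speed the core up.

```python
# Rough sketch of dynamic CPU power: P ~ capacitance * voltage^2 * frequency.
# All numbers are hypothetical, for illustration only.

def dynamic_power(capacitance, voltage, freq_ghz):
    """Relative dynamic power, P = C * V^2 * f (arbitrary units)."""
    return capacitance * voltage ** 2 * freq_ghz

# A 90nm part vs a 45nm shrink at the SAME clock: the shrink lowers
# switched capacitance and permits a lower voltage...
p_90nm = dynamic_power(capacitance=1.0, voltage=1.3, freq_ghz=0.729)
p_45nm = dynamic_power(capacitance=0.5, voltage=1.0, freq_ghz=0.729)

print(f"relative power at the same clock: {p_45nm / p_90nm:.2f}x")
# ...but it does not, by itself, make the core do more work per cycle.
```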

Essentially, it makes the Wii U’s processor run even faster than it’s clock speeds with higher efficiency and lower power output. 

Process has nothing to do with efficiency.

What this allows the Wii U to do, essentially, is move more data at once

No. It allows the Wii U to access that data in fewer cycles than a RAM fetch.
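To put the latency point in numbers, here's a toy average-memory-access-time (AMAT) model. The cycle counts and hit rate are made up, not actual Wii U figures; the point is that a big on-die cache cuts cycles per access, which is a different thing from bandwidth.

```python
# Toy average memory access time (AMAT) model: a large on-die eDRAM cache
# pays off by servicing most accesses in a few cycles instead of a full
# RAM fetch. Latencies and hit rate below are invented for illustration.

def amat(hit_rate, cache_cycles, ram_cycles):
    """AMAT = hit_rate * cache latency + (1 - hit_rate) * RAM latency."""
    return hit_rate * cache_cycles + (1.0 - hit_rate) * ram_cycles

no_cache = amat(hit_rate=0.0, cache_cycles=0, ram_cycles=200)
big_edram = amat(hit_rate=0.95, cache_cycles=20, ram_cycles=200)

print(f"no cache: {no_cache:.0f} cycles, large eDRAM: {big_edram:.0f} cycles")
# Fewer cycles per access, not "more data at once": bandwidth is a
# separate number, and it can still be a weak spot.
```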

So, the processor is slower, but similar to the Power 7 tech, it moves more data at once at that slower rate than the standard processor at such rates.

Its memory bandwidth is actually a major weak spot.

Again though, the CPU is not Power 7 based, so it only increases the performance level by a certain amount. 

We don't know anything about the architecture. It might be more modern than Power7. Who knows.

Certainly though, the performance increase does fundamentally make it have a higher efficiency than whats in the current gaming consoles. 

No. The 360 and this are both on the same process node and a similar architecture, so they are roughly equal in efficiency, whether measured per transistor, per dollar, or per watt.

The Wii U version of the Radeon HD 5670 that has been custom built is actually more powerful than the original PC version of the card

No it isn't, because it consumes much less power than a 5670. I suspect it either has fewer shaders or has been downclocked.

A GPGPU are quickly become standard in gaming rig PCs for a few reasons, but chief among them is the fact that GPGPU’s actually handle several functions that the CPU traditionally does, except it does them better. 

GPGPU is the ability of the GPU to perform general-purpose calculations. It's not something extra on the GPU. Running those things on a CPU is more power efficient. PhysX, the only reasonable GPU physics implementation, is proprietary to Nvidia. The other solutions suck.
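"General-purpose calculations" just means running arbitrary data-parallel math on the shader cores. As a CPU-side stand-in (no GPU needed), the same model can be sketched in plain Python; on a real GPGPU each index would be one work-item of an OpenCL/CUDA kernel. The function names here are mine, purely illustrative.

```python
# GPGPU in a nutshell: the GPU runs one instance of a small "kernel"
# function per data element, in parallel. This CPU-side sketch mimics
# that model; a real version would be an OpenCL/CUDA kernel launch.

def saxpy_kernel(i, a, x, y):
    """One work-item: computes a * x[i] + y[i]."""
    return a * x[i] + y[i]

def launch(kernel, n, *args):
    """Stand-in for a GPU kernel launch over n work-items."""
    return [kernel(i, *args) for i in range(n)]

x = [1.0, 2.0, 3.0]
y = [10.0, 20.0, 30.0]
result = launch(saxpy_kernel, len(x), 2.0, x, y)
print(result)  # [12.0, 24.0, 36.0]
```

This is a capability baked into every modern GPU's architecture, not a separate component you could add to or remove from the Wii U.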

The console is actually built around the notion that the GPU will tackle said tasks.

No it isn't.

The downfall of the GPGPU is that if it’s not being used in the fashion it’s intended to, which means taking on heavy graphical processing and extreme mathematical equations to perform in the game (aka, if you’re not heavily using the GPU and it’s processing), it actually performs slower.

GPGPU capabilities are in all modern GPUs from AMD and Nvidia and not using them therefore doesn't slow anything. You can't take them out.

 Rather, it’s the the Wii U uses more advanced technology that actually takes away the stress normally associated with a CPU

The Wii U will still want to run game physics on the CPU for power and dev time reasons.

and in turn lowers power consumption, heat level, and ultimately leads to producing some of the best graphical capabilities on the market.

GPGPU isn't going to lower power consumption.

Essentially, with everything customly aimed towards gaming, the Wii U is like a middle of the road gaming PC.

It's nowhere near that capable.

Unless the PS4 and the 720 are being silly, both should also feature GPGPU’s like the Wii U, and will both likely have better processors than the Wii U. 

They have no choice. All graphics cards they could choose are GPGPU capable.

So, the Wii U will be behind, but it’s unlikely the PS4 and 720 will feature GPU’s that are any beefier than the Wii U.

The PS4 and 720 will have much better GPUs, because they will not be limiting themselves to 30W for the whole console.
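Back-of-the-envelope on the power-budget point: at similar efficiency (GFLOPS per watt), raw GPU throughput scales roughly with the watts you give it. The figures below are invented to show the order of the gap, not measured numbers for any console.

```python
# If two GPUs are built on similar technology, throughput scales roughly
# with the power budget. All figures are hypothetical, for illustration.

def gpu_throughput_gflops(power_watts, gflops_per_watt):
    """Crude throughput estimate at a fixed efficiency."""
    return power_watts * gflops_per_watt

wii_u_like = gpu_throughput_gflops(power_watts=15, gflops_per_watt=10)
bigger_box = gpu_throughput_gflops(power_watts=80, gflops_per_watt=10)

print(f"~{bigger_box / wii_u_like:.1f}x more GPU throughput")
# A console drawing several times the power can field a much beefier GPU,
# regardless of any "GPGPU" label.
```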

it’s conceivable the other consoles in turn have less powerful GPGPU’s

GPGPU is not more or less 'powerful'. It's a capability of modern architectures that probably will not be used in any 8th gen console for economic reasons.

It just requires people, like the DICE dev, to fundamentally change the way in which they make games take advantage of the hardware.

No. Wii U is similar in programming model to Wii and in architecture to either Wii or 360. It is easy to program for and I think we're close to getting the max out of the hardware already. This is nothing like as bad as PS3 was.

In the future, developers would be wise to truly take hold of the beefy GPGPU in the Wii U and push it as hard as they can. 

They won't.

Current generation (or last generation for us Wii U owners) are extremely CPU based games. Heavily reliant on the CPU pushing through. In the next generation, this is going to change significantly.

Wow something actually true. Modern PC games are more GPU bound now, indeed. Devs will probably need to adjust for weak Wii U CPU and stronger GPU.

Any concerns over a slow processor are completely negated by the fact the Graphics Card on the Wii U can actually handle some of the larger processes, like physics, usually reserved for the processor.

Again no.

The Wii U is going to be just fine folks. While it wont be as powerful as the PS4 or 720, that beefy GPGPU is going to ensure the Wii U can stay up with the new systems that are coming.

GPGPU is irrelevant to whether it will keep up. Which it won't, because those two will be much more expensive and use several times the power.

Naturally, the biggest reason to go against console convention and use a GPGPU like the Wii U does is costs. 

GPGPU isn't something added to the Wii U, per above.

 GPU’s are just a lot cheaper, and when you start to be able to toss the larger processes at them like Physics, it pretty much increases the efficiency of the entire console 10 fold

It really doesn't. Even Nvidia haven't been able to persuade many game companies to adopt PhysX and that's the best implementation (other than terrible quality video encoders)

The Wii U is doing something that the PS3 and 360 can only dream of doing presently.

No.



Xxain said:
Problem sounds very similar to the PS3 when it first came out. So to make sure I understand Wii U games are underperforming due to the games that were optimized for ps360 not being optimized for Wii U, which is similar to 360 vs PS3 early this gen...still kinda going on currently?

I think this is exactly the situation here.



Soleron said:

 

*snip*

 


That was a legendary post and echoed my every thought as I read the article. Only put much much better than I could hope to.

His last paragraph is especially amusing: he treats GPGPU as a "new" development, when the GPU inside the Wii-U isn't even a GPGPU, and GPGPU-style processing isn't even new anyway; it has been around at the consumer level for about five years now. He then goes on to say that developers need to get with the times and learn to code for the Wii-U.

OMG, this is beyond awful, it's just......christ.




@Soleron - Maybe he'll write you back?



tl;dr for me. Summary?



Slimebeast said:
tl;dr for me. Summary?



My first post.



Slimebeast said:
tl;dr for me. Summary?


You should read it. Basically it's about a very ignorant person preaching about the power of the Wii-U although he doesn't have a clue what he's talking about; he's factually incorrect, chronologically incorrect.

He thinks, for example, that reducing the process a CPU is made on automatically speeds up the processor. He thinks he knows more than developers, basically, when in fact he knows less than some guy sitting in a dressing gown munching peanuts and drinking cold coffee who doesn't work. (That'll be me.)



Xxain said:
Problem sounds very similar to the PS3 when it first came out. So to make sure I understand Wii U games are underperforming due to the games that were optimized for ps360 not being optimized for Wii U, which is similar to 360 vs PS3 early this gen...still kinda going on currently?

Wii U is actually relatively easy and familiar to code for; it is underperforming due to a slow CPU and weak memory bandwidth. These things can't really be addressed by 'optimising', and certainly not by 'GPGPU'. If the PS4 and 720 are as powerful as they can be for $300-400*, the Wii U will be far behind them.

*Subtract the $100+ GamePad from the Wii U sale price