
Talk about amateur hour journalism.

fillet said:
Soleron said:

 

Essentially, this makes the processor “overclock” at higher end speeds while reducing power consumption.

The main effects of recent shrinks are cost and power consumption, not clocks. Overclocking is also the wrong word.

Essentially, it makes the Wii U’s processor run even faster than it’s clock speeds with higher efficiency and lower power output. 

Process has nothing to do with efficiency.

What this allows the Wii U to do, essentially, is move more data at once

No. It allows the Wii U to access that data in fewer cycles than a RAM fetch.
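To make that concrete: data sitting in eDRAM (or cache) comes back in far fewer cycles than a trip out to main RAM. A crude, self-contained C sketch of the effect - every name, size, and step count here is arbitrary illustration, with the CPU cache standing in for eDRAM:

    // Crude latency sketch (all sizes and step counts are arbitrary): chase a
    // randomised pointer chain so every load depends on the previous one,
    // first through a small working set that stays cache-resident, then a
    // large one that forces a DRAM fetch on almost every step.
    #include <stdio.h>
    #include <stdlib.h>
    #include <time.h>

    static volatile size_t sink;  /* keeps the chase loop from being optimised away */

    static size_t rnd(size_t bound) {
        /* two rand() calls so we can index more than RAND_MAX slots; crude but fine here */
        return (((size_t)rand() << 16) ^ (size_t)rand()) % bound;
    }

    static double chase(size_t n, size_t steps) {
        size_t *next = (size_t *)malloc(n * sizeof(size_t));
        size_t i, s, idx = 0;
        clock_t start;
        double secs;
        for (i = 0; i < n; i++) next[i] = i;
        /* Sattolo's algorithm: a single-cycle permutation, so the chain visits every slot */
        for (i = n - 1; i > 0; i--) {
            size_t j = rnd(i);
            size_t t = next[i]; next[i] = next[j]; next[j] = t;
        }
        start = clock();
        for (s = 0; s < steps; s++) idx = next[idx];
        secs = (double)(clock() - start) / CLOCKS_PER_SEC;
        sink = idx;
        free(next);
        return secs;
    }

    int main(void) {
        const size_t steps = 20 * 1000 * 1000;
        printf("32 KB set:  %.3f s\n", chase(32 * 1024 / sizeof(size_t), steps));
        printf("256 MB set: %.3f s\n", chase((size_t)256 * 1024 * 1024 / sizeof(size_t), steps));
        return 0;
    }

On a typical machine the large working set runs several times slower per step, purely because each access misses and has to go out to DRAM.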

So, the processor is slower, but similar to the Power 7 tech, it moves more data at once at that slower rate than the standard processor at such rates.

Its memory bandwidth is actually a major weak spot.

Again though, the CPU is not Power 7 based, so it only increases the performance level by a certain amount. 

We don't know anything about the architecture. It might be more modern than Power7. Who knows.

Certainly though, the performance increase does fundamentally make it have a higher efficiency than whats in the current gaming consoles. 

No. The 360 and this are both on the same process node and similar architecture, so they are roughly equal in efficiency per transistor or for cost or power.

The Wii U version of the Radeon HD 5670 that has been custom built is actually more powerful than the original PC version of the card

No it isn't, because it consumes much less power than a 5670. I suspect it is either not as many shaders or has been downclocked.

A GPGPU are quickly become standard in gaming rig PCs for a few reasons, but chief among them is the fact that GPGPU’s actually handle several functions that the CPU traditionally does, except it does them better. 

GPGPU is the ability of the GPU to make general calculations. It's not something extra on the GPU. Running those things on a CPU is more power efficient. PhysX, the only reasonable GPU Physics implementation, is proprietary to Nvidia. The other solutions suck.
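For illustration, "general calculations" just means ordinary data-parallel code running as a GPU kernel. A minimal CUDA sketch - the kernel name and sizes are made up, and CUDA is only the example here, not what a Wii U title would actually use:

    // Minimal GPGPU sketch: an ordinary arithmetic job (adding two arrays)
    // written as a CUDA kernel. There is no separate "GPGPU unit"; the same
    // shader hardware that draws pixels runs this general-purpose code.
    #include <cstdio>
    #include <cstdlib>
    #include <cuda_runtime.h>

    __global__ void vecAdd(const float *a, const float *b, float *c, int n) {
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        if (i < n) c[i] = a[i] + b[i];
    }

    int main() {
        const int n = 1 << 20;
        const size_t bytes = n * sizeof(float);
        float *ha = (float *)malloc(bytes);
        float *hb = (float *)malloc(bytes);
        float *hc = (float *)malloc(bytes);
        for (int i = 0; i < n; i++) { ha[i] = 1.0f; hb[i] = 2.0f; }

        float *da, *db, *dc;
        cudaMalloc((void **)&da, bytes);
        cudaMalloc((void **)&db, bytes);
        cudaMalloc((void **)&dc, bytes);
        cudaMemcpy(da, ha, bytes, cudaMemcpyHostToDevice);
        cudaMemcpy(db, hb, bytes, cudaMemcpyHostToDevice);

        const int threads = 256;
        vecAdd<<<(n + threads - 1) / threads, threads>>>(da, db, dc, n);
        cudaMemcpy(hc, dc, bytes, cudaMemcpyDeviceToHost);  // copy back implies sync

        printf("hc[0] = %.1f (expect 3.0)\n", hc[0]);
        cudaFree(da); cudaFree(db); cudaFree(dc);
        free(ha); free(hb); free(hc);
        return 0;
    }

The same shader ALUs that draw pixels execute this, which is why you can't "add" or "remove" GPGPU from a modern GPU.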

The console is actually built around the notion that the GPU will tackle said tasks.

No it isn't.

The downfall of the GPGPU is that if it’s not being used in the fashion it’s intended to, which means taking on heavy graphical processing and extreme mathematical equations to perform in the game (aka, if you’re not heavily using the GPU and it’s processing), it actually performs slower.

GPGPU capabilities are in all modern GPUs from AMD and Nvidia and not using them therefore doesn't slow anything. You can't take them out.

Rather, it’s that the Wii U uses more advanced technology that actually takes away the stress normally associated with a CPU

The Wii U will still want to run game physics on the CPU for power and dev time reasons.

and in turn lowers power consumption, heat level, and ultimately leads to producing some of the best graphical capabilities on the market.

GPGPU isn't going to lower power consumption.

essentially, with everything customly aimed towards gaming, the Wii U is like a middle of the road gaming PC.

It's nowhere near that capable.

Unless the PS4 and the 720 are being silly, both should also feature GPGPU’s like the Wii U, and will both likely have better processors than the Wii U. 

They have no choice. All graphics cards they could choose are GPGPU capable.

So, the Wii U will be behind, but it’s unlikely the PS4 and 720 will feature GPU’s that are any beefier than the Wii U.

The PS4 and 720 will have much better GPUs, because they will not be limiting themselves to 30W for the whole console.

it’s conceivable the other consoles in turn have less powerful GPGPU’s

GPGPU is not more or less 'powerful'. It's a capability of modern architectures that probably will not be used in any 8th gen console for economic reasons.

It just requires people, like the DICE dev, to fundamentally change the way in which they make games take advantage of the hardware.

No. Wii U is similar in programming model to Wii and in architecture to either Wii or 360. It is easy to program for and I think we're close to getting the max out of the hardware already. This is nothing like as bad as PS3 was.

In the future, developers would be wise to truly take hold of the beefy GPGPU in the Wii U and push it as hard as they can.

They won't.

Current generation (or last generation for us Wii U owners) are extremely CPU based games. Heavily reliant on the CPU pushing through. In the next generation, this is going to change significantly.

Wow something actually true. Modern PC games are more GPU bound now, indeed. Devs will probably need to adjust for weak Wii U CPU and stronger GPU.

Any concerns over a slow processor are completely negated by the fact the Graphics Card on the Wii U can actually handle some of the larger processes, like physics, usually reserved for the processor.

Again no.

The Wii U is going to be just fine folks. While it wont be as powerful as the PS4 or 720, that beefy GPGPU is going to ensure the Wii U can stay up with the new systems that are coming.

GPGPU is irrelevant to whether it will keep up. Which it won't, because those two will be much more expensive and use several times the power.

Naturally, the biggest reason to go against console convention and use a GPGPU like the Wii U does is costs. 

GPGPU isn't something added to the Wii U, per above.

 GPU’s are just a lot cheaper, and when you start to be able to toss the larger processes at them like Physics, it pretty much increases the efficiency of the entire console 10 fold

It really doesn't. Even Nvidia haven't been able to persuade many game companies to adopt PhysX, and that's the best GPGPU implementation there is (the only other notable GPGPU use being terrible-quality video encoders).

The Wii U is doing something that the PS3 and 360 can only dream of doing presently.

No.

 


That was a legendary post and echoed my every thought as I read the article. Only put much much better than I could hope to.

His last paragraph is especially amusing: GPGPU being a "new" development, when the GPU inside the Wii-U isn't even a GPGPU, and it isn't even a new development anyway; GPGPU-style processing has been around on a consumer level for about 5 years now. He then goes on to say that developers need to get with the times and learn to code for the Wii-U.

OMG, this is beyond awful, it's just......christ.


Actual GPGPU has been around. Even before the Xbox 360, which also uses it.

fail is fail?



"Excuse me sir, I see you have a weapon. Why don't you put it down and let's settle this like gentlemen"  ~ max


It's not journalism lol.

It's blogging, by guys in their 30's who make 15k a year lol...



ninetailschris said:
fillet said:
...


Actual GPGPU has been around. Even before the Xbox 360, which also uses it.

fail is fail?


GPGPU in the practical sense has been around for about 4 years, first debuting on Fermi from Nvidia, I think. There are different interpretations of it in real terms. Technically speaking, running a physics API that can be accelerated by the GPU could be considered GPGPU, but that isn't running actual code on the GPU itself. The Xenos is not a GPGPU, and I don't even think the HD 5670 is, to be honest.

I'm not that clued up to be honest; I just remember the buzz when the Fermi architecture from Nvidia was shown and there was a big hubbub about GPGPU processing. I believe that was 3-4 years ago.



Great read, I knew the CPU is better than what so-called developers are saying.
It's a nice piece of hardware and so is the GPU...



VITA 32 GIG CARD.250 GIG SLIM & 160 GIG PHAT PS3

Not really commenting on the blog, but the Wii U does have a GPGPU, and yes, if used properly it would have way more than enough processing power to handle anything being developed for the consoles currently available.



Soleron said:

 

No. Wii U is similar in programming model to Wii and in architecture to either Wii or 360. It is easy to program for and I think we're close to getting the max out of the hardware already. This is nothing like as bad as PS3 was.

 

I hope you're kidding.



@ninetailschris - If we go back in history, even the Amiga's blitter can be considered GPGPU in a way, but I think in the modern sense a GPU can hardly be called a GPGPU if it can't handle double-precision math. Which none of the potential GPUs inside the Wii U can.
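For what it's worth, you can see that cutoff directly on PC parts: on the Nvidia side, double precision only exists from compute capability 1.3 up. A quick CUDA device-query sketch (purely illustrative, and it obviously says nothing about the AMD part inside the Wii U):

    // Report whether each CUDA device can do double-precision math at all.
    // On Nvidia hardware, doubles require compute capability 1.3 or later;
    // earlier consumer parts (pre-GT200) simply have no fp64 units.
    #include <cstdio>
    #include <cuda_runtime.h>

    int main() {
        int count = 0;
        cudaGetDeviceCount(&count);
        for (int d = 0; d < count; d++) {
            cudaDeviceProp prop;
            cudaGetDeviceProperties(&prop, d);
            bool fp64 = (prop.major > 1) || (prop.major == 1 && prop.minor >= 3);
            printf("%s: compute %d.%d, double precision: %s\n",
                   prop.name, prop.major, prop.minor, fp64 ? "yes" : "no");
        }
        return 0;
    }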

That does not mean devs can't offload some things to it (as it seems they already have with WO3), as well as to the audio DSP (which, @Snowdog, stands for Digital Signal Processor, not Sound), but expecting that it will solve all the problems of an underpowered CPU is... optimistic, to put it mildly.

Both the NextBox and the PS4 will have very powerful CPUs inside, and devs will use them to their full potential, no matter how powerful their GPUs are (and if rumours are true, those could turn out to be quite powerful, as in 3-4x as powerful as the Wii U's) - there is simply a myriad of tasks in game code that are better suited to CPU execution.

That said, I don't doubt for one second that Nintendo's R&D knew exactly what they were doing, and made a console that will best serve their 1st parties and their business philosophy, so I'm expecting titles like Zelda to have no issues and look stunning as always.



the_dengle said:
Soleron said:

 

No. Wii U is similar in programming model to Wii and in architecture to either Wii or 360. It is easy to program for and I think we're close to getting the max out of the hardware already. This is nothing like as bad as PS3 was.

 

I hope you're kidding.

He's either kidding or is a few pence short of a quid. The architecture is completely new, nothing like the Wii or the 360. Even typing 'Wii and 360' in that sentence above makes no sense, as the Wii and 360 have different architectures from each other: the Wii has a GPU with fixed functions to display shader effects, and the 360 has a GPU with programmable shaders to display shader effects.

To say that developers are close to getting the max out of a console at launch is a ridiculous statement. Developers have only had final dev kits for the last 6 months; the majority of development has been done on unfinished/underclocked dev kits with constant SDK revisions, which will still go on for the next few years. You'll have developers close to getting the max out of the hardware during the console's third generation of games, not before.



snowdog said:
the_dengle said:
...

He's either kidding or is a few pence short of a quid. The architecture is completely new, nothing like the Wii or the 360. Even typing 'Wii and 360' in that sentence above makes no sense, as the Wii and 360 have different architectures from each other: the Wii has a GPU with fixed functions to display shader effects, and the 360 has a GPU with programmable shaders to display shader effects.

To say that developers are close to getting the max out of a console at launch is a ridiculous statement. Developers have only had final dev kits for the last 6 months; the majority of development has been done on unfinished/underclocked dev kits with constant SDK revisions, which will still go on for the next few years. You'll have developers close to getting the max out of the hardware during the console's third generation of games, not before.

In 2005, the Xbox 360 had the whole new idea of multiple threads and programmable shaders. Nothing like them had been seen before. The PS3's Cell was also a completely new design that we had no idea how to program for.

The Wii U consists of a GPU every programmer is familiar with, a memory architecture (eDRAM+RAM) the same as the Wii, and a CPU architecture that is either very similar to the Wii or to the 360 (don't know yet). IBM didn't do a whole new custom architecture like they did for Cell.

So I'm speaking in relative terms. It will be a lot faster to get games close to optimised than it was for either the 360 or the PS3. As we saw from Nintendo with the GC, some of their very first games were among the GC's best looking because they used the GC's shader effects, so first-party games will get there even sooner.

What I'm saying is don't expect it to improve from where it is now to CLEARLY ahead of the 360 and PS3. It's not capable of that.

--

My point about GPGPU was that, even though the Wii U's GPU is capable of it, it will not be used to the extent that it will help performance. Even high-end PC games don't use it in that way except for PhysX. GPGPU in the current meaning arose in the GPU generation after the PS3/360 came out: Nvidia 8000 and AMD 2000 series. 



Ok, I don't know about many of the said details. The thing is: I already find it impressive that the ports, rushed as they have been, are so good. Developers have had the other consoles for years, and the first games for, say, the X360 looked like cripes (Perfect Dark Zero, anyone?).

Besides, as a computer gamer, I'll take better graphics over a faster CPU any time.