
Forums - Nintendo - Wii U's power is great!

gentii said:

x86 architecture runs on what is called CISC, Complex Instruction Set Computing. This gives developers a lot of resources at the beginning, but very little room to grow. Power-based architecture runs on what is called RISC, Reduced Instruction Set Computing. The memory footprint is much smaller and much faster than that of a CISC-based x86 CPU, which is the type used in the XB1 and PS4.

RISC and CISC both have advantages AND disadvantages. Modern x86 CPUs and APUs are effectively hybrids, decoding CISC instructions into RISC-like micro-ops internally to take advantage of both approaches.



freedquaker said:
gentii said:

PS4 and Xbox One run on x86 while the Wii U runs on PowerPC. x86 architecture runs on what is called CISC. ...Power-based architecture runs on what is called RISC. ...The memory footprint is much smaller and much faster than that of a CISC-based x86 CPU, which is the type used in the XB1 and PS4.... The Wii U does not need 4GB of RAM to do the same level of performance as the PS4 and XB1 ...The bandwidth of the eDRAM in the Wii U can be clocked as high as 1TB per second ... PowerPC architecture is superior to that of x86, ... If the Wii U had an 8-core PowerPC CPU and 8GB of RAM along with the 32MB of eDRAM it has it would absolutely rip the PS4 and XB1 to shreds.

 

 

This is an unbelievably ignorant post, obviously written by someone who knows very little about the issues mentioned, and wrong on almost all counts.

a) PowerPC architecture is not necessarily more efficient than x86; it depends. Yes, RISC is generally better at floating point performance (which is important for games) and PowerPC is based on RISC, but a processor's efficiency does not come from the instruction set alone, but from a lot more. On most of the criteria that matter today, PowerPC architecture is ancient, not even comparable to x86. This is why IBM stopped developing it (at least actively).

b) The memory footprint of RISC, on the contrary, is higher than CISC's. Programs written for RISC will almost always take more space and use more memory, simply because the instructions are simpler (Reduced ISC) and you need more of them to do the same job! What an ignorant claim!

c) Memory size has hardly anything to do with performance. It's more about the size of the graphics assets, game content, etc. X360 and XB1 may have the same size worlds, but with significantly higher quality and more interactivity on XB1; the same goes for PS3 vs PS4.

d) The Wii U has 3 CPU cores, but only one is the main CPU, while the others are tiny coprocessors. The PS4 has coprocessors as well, which probably beat the shit out of the Wii U's. Those coprocessors mean nothing in today's 8-core consoles anyway! The ARM coprocessor in the PS4 is not there for extra performance but for power efficiency!

e) PowerPC is most definitely INFERIOR to x86. Superiority is not only about efficiency but also architectural flexibility, scalability, manufacturing process, etc. The only real rival to x86 today is the ARM architecture. By the way, the Wii U's CPU architecture is based on a 1998-99 design, and is ancient in every sense of the word.

f) "If the Wii U had an octo-core processor with 8 GB RAM..."? Well, that is a BIG "IF". You cannot just magically have 8 cores in a processor; you need to design around that, and you need to be able to address 8 GB. That's not that easy, and PowerPC today doesn't seem to be handling it. You cannot just say "if" without real-life examples, or even anything that comes close!!!

g) The claim that the Wii U's eDRAM can be clocked as high as 1 TB/s is utter bullshit. No need for further comment! My goodness!

h) Finally, your whole argument rests on the complete misconception that "a console's performance depends on the CPU, and RISC is better than CISC" mumbo jumbo. This is utterly wrong, because CPU performance is rarely the deciding factor in a console's performance; it is usually the GPU and RAM bandwidth, and on both counts the Wii U is several magnitudes slower than the PS4 & XB1.

In summary, the whole OP is wrong; the only shred of truth is that RISC is more efficient than CISC (not PowerPC vs x86), and even that has almost no relevance to the performance of the Wii U and other consoles today.

 

Actually, your comment on the Wii U's eDRAM is utter bullshit, no need for further comment, but I can explain that with this:

 

See?

10.4GB/s for 1MB on the old GameCube.

Why would it be impossible to have 1 terabyte per second of bandwidth with 32MB of eDRAM that is ten years newer, at 40nm instead of 180nm, and running at 550MHz instead of 165MHz?

 

Not to mention that those 1024 bits per 4MB macro aren't any surprise, because even the GameCube's eDRAM had 512 bits for its 1MB of texture cache.
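These figures can be sanity-checked with simple bus-width × clock arithmetic. A minimal sketch; the 1024-bit-per-4MB-macro layout and 550MHz clock are the poster's claims, not confirmed specs, while the GameCube's 512-bit texture cache bus at its 162MHz GPU clock reproduces the quoted 10.4GB/s:

```python
# Peak bandwidth = (bus width in bits / 8) * clock rate.
# Hardware figures below are taken from the posts above; the Wii U
# numbers are the poster's claims, not confirmed specifications.

def peak_bandwidth_gbps(bus_bits, clock_hz):
    """Peak bandwidth in GB/s for a bus of bus_bits at clock_hz (1 GB = 1e9 bytes)."""
    return bus_bits / 8 * clock_hz / 1e9

# GameCube 1MB texture cache: 512-bit bus at the 162 MHz GPU clock
gc = peak_bandwidth_gbps(512, 162e6)          # ~10.4 GB/s, the quoted figure

# Claimed Wii U layout: eight 4MB macros x 1024 bits each, at 550 MHz
wiiu = peak_bandwidth_gbps(8 * 1024, 550e6)   # 563.2 GB/s

print(f"GameCube: {gc:.1f} GB/s, claimed Wii U: {wiiu:.1f} GB/s")
```

Note that this yields 563.2GB/s, the lower bound quoted later in the thread; the 1TB/s figure would require roughly double the width or dual-ported (simultaneous read/write) macros.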

 

As for the PowerPC, you must take into account that it is a custom design, so it has parts the 750 doesn't, and it also performs better. Just so you know, it is very unlikely to be just a 3-core Broadway, and not only because of mentions of new things like full out-of-order support: Broadway and the 750 were not designed for a multicore environment (the BeBox was already a good example of that in the past), so their performance in one would be very low. To be prepared for multicore there would have to be many changes in hardware, like the cache controllers for cache coherence, or changes in the bus at the hardware level, etc. So it would be easier to just take a new processor that already supports what you need and make minor modifications for compatibility with some of the stuff Broadway packed. Doing it the other way around is possible, but could be more difficult and could also mean more money invested.



vivster said:
snowdog said:
vivster said:

Deja vu. Just replace every instance of WiiU with PS3 and PowerPC with Cell.

"Games look bad because game devs are lazy and unskilled"

Also, someone should tell him that the CPU only has a supporting role in producing graphics. On the graphics side the Wii U uses a slow and outdated AMD chip.

gentii said:

If the Wii U had an 8-core PowerPC CPU and 8GB of RAM along with the 32MB of eDRAM it has it would absolutely rip the PS4 and XB1 to shreds.

This one cracked me up the most^^


Latte isn't outdated; we know for a fact that it has a DX11-equivalent feature set, and we have no idea what the clock speed is.

Latte's performance is actually VERY impressive for what we believe its power draw to be. I'm still convinced that Nintendo have evolved the GameCube/Wii TEV unit in some way, so that porting PS3/PS4/360/One code is easily done while giving 'free' use of common shader operations, without the nonstandard rendering pipeline that Nintendo's last two consoles suffered from.

There's definitely something going on under the hood that we're unaware of otherwise the likes of Super Mario 3D World, Mario Kart 8, X, Bayonetta 2 and SSBU wouldn't be possible.

We've seen a great deal of games running at native 720p, 60fps and with v-sync enabled, which is impressive. There hasn't been a single developer to my knowledge that has complained about the GPU, and plenty have praised it.

There hasn't been a dev complaining because no one would dare compare it to the other next-gen consoles.

So no one knows, huh?

http://en.wikipedia.org/wiki/Latte_(graphics_chip)

Best case scenario is an HD6000 @ 550MHz, which was already outdated at the time of launch, when its successor had already been on the market for a year. To top it off, the HD6000 wasn't even a good architecture when it was still relevant.

Nintendo should've waited another year with the Wii U and decided on a proper architecture and current hardware. But instead they were so full of themselves that they thought they could get away with anything. That was their conscious decision and it shouldn't be defended.


There are comparisons. Just read this mini thread about it...

WiiU vs PS4 GPU power, architectural differences and efficiency

http://gamrconnect.vgchartz.com/thread.php?id=173912&page=1



Playstation 5 vs XBox Series Market Share Estimates

Regional Analysis  (only MS and Sony Consoles)
Europe     => XB1 : 23-24 % vs PS4 : 76-77%
N. America => XB1 :  49-52% vs PS4 : 48-51%
Global     => XB1 :  32-34% vs PS4 : 66-68%

Sales Estimations for 8th Generation Consoles

Next Gen Consoles Impressions and Estimates

megafenix said:
freedquaker said:
 

 

 

Actually, your comment on the Wii U's eDRAM is utter bullshit, no need for further comment, but I can explain that with this:

 

See?

10.4GB/s for 1MB on the old GameCube.

Why would it be impossible to have 1 terabyte per second of bandwidth with 32MB of eDRAM that is ten years newer, at 40nm instead of 180nm, and running at 550MHz instead of 165MHz?

 

Not to mention that those 1024 bits per 4MB macro aren't any surprise, because even the GameCube's eDRAM had 512 bits for its 1MB of texture cache.


So you nitpicked one thing, which you misunderstood, so your whole effort was unnecessary. I wasn't referring to the eDRAM technology as a whole, just its implementation on the Wii U. It's possible, but completely unnecessary for the Wii U, and simply "not there".

Regardless, the bandwidth of the eDRAM is the LEAST of the worries inside the machine. There are countless other limitations. It's a relatively efficient implementation of decade-old (or older) technology (the CPU design is over a decade old, the GPU half a decade old, etc.).




marley said:
Kane1389 said:
marley said:
Kane1389 said:
MegaDrive08 said:
Kane1389 said:
MegaDrive08 said:
Lol, let these Sony gamers have their screen resolutions and frames per second, because that's all they have. It doesn't take away the fact that PlayStation is a boring brand and they've been playing the same way since 1995. Sony will never have the essence of being a video games company; they're just an imitation. You're playing a PlayStation 1 with better graphics, that's what you have there.

 



Yes a wiimote and nunchuk, which Sony again tried to imitate, remember!

Sony had a motion controller in 2001 actually

Wow, if you want to play that game - Nintendo had a motion controller in 1989 actually. 

Does that matter? Those motion control inputs were vastly different. The Move controller was incredibly similar, and tacked on only after the obvious financial success of the Wiimote.

You are talking about the Power Glove? That wasn't made by Nintendo :)

I didn't say it was made by Nintendo. I said they had a motion controller in 1989. Nintendo did release the Power Pad in the '80s, which is also a motion controller (not that it really matters).

No one was claiming that Nintendo invented (or manufactured) the first motion input. The claim was that the PlayStation Move was created to mimic the Wiimote because of the clear success it was having in the marketplace. No one can prove that it was or wasn't a copy, but it certainly seems pretty clear that it was.

The Power Pad wasn't a motion controller by any stretch of the imagination. It's not anything like a Wiimote or Sony's wand, so I don't know why you brought it up.

 

By definition, it's not a copy if you've already done it before. So no, we can say with full certainty that it's not a copy.



freedquaker said:
megafenix said:
freedquaker said:
 

 

 

Actually, your comment on the Wii U's eDRAM is utter bullshit, no need for further comment, but I can explain that with this:

 

See?

10.4GB/s for 1MB on the old GameCube.

Why would it be impossible to have 1 terabyte per second of bandwidth with 32MB of eDRAM that is ten years newer, at 40nm instead of 180nm, and running at 550MHz instead of 165MHz?

 

Not to mention that those 1024 bits per 4MB macro aren't any surprise, because even the GameCube's eDRAM had 512 bits for its 1MB of texture cache.


So you nitpicked one thing, which you misunderstood, so your whole effort was unnecessary. I wasn't referring to the eDRAM technology as a whole, just its implementation on the Wii U. It's possible, but completely unnecessary for the Wii U, and simply "not there".

Regardless, the bandwidth of the eDRAM is the LEAST of the worries inside the machine. There are countless other limitations. It's a relatively efficient implementation of decade-old (or older) technology (the CPU design is over a decade old, the GPU half a decade old, etc.).


It may seem unnecessary to you, but not to the developers.

here

http://www.ign.com/articles/2005/05/20/e3-2005-microsofts-xbox-360-vs-sonys-playstation-3?page=3

"""

E3 2005: Microsoft's Xbox 360 vs. Sony's PlayStation 3

With Sony's specs out, Microsoft has sent us its comparative analysis. What's the outcome?

Bandwidth 
The PS3 has 22.4 GB/s of GDDR3 bandwidth and 25.6 GB/s of RDRAM bandwidth for a total system bandwidth of 48 GB/s. The Xbox 360 has 22.4 GB/s of GDDR3 bandwidth and 256 GB/s of EDRAM bandwidth for a total of 278.4 GB/s total system bandwidth.

Why does the Xbox 360 have such an extreme amount of bandwidth?

Even the simplest calculations show that a large amount of bandwidth is consumed by the frame buffer. For example, with simple color rendering and Z testing at 550 MHz the frame buffer alone requires 52.8 GB/s at 8 pixels per clock. The PS3's memory bandwidth is insufficient to maintain its GPU's peak rendering speed, even without texture and vertex fetches.

The PS3 uses Z and color compression to try to compensate for the lack of memory bandwidth. The problem with Z and color compression is that the compression breaks down quickly when rendering complex next-generation 3D scenes.

HDR, alpha-blending, and anti-aliasing require even more memory bandwidth. This is why Xbox 360 has 256 GB/s bandwidth reserved just for the frame buffer. This allows the Xbox 360 GPU to do Z testing, HDR, and alpha blended color rendering with 4X MSAA at full rate and still have the entire main bus bandwidth of 22.4 GB/s left over for textures.

CONCLUSION 
When you break down the numbers, Xbox 360 has provably more performance than PS3. Keep in mind that Sony has a track record of over promising and under delivering on technical performance. The truth is that both systems pack a lot of power for high definition games and entertainment.

However, hardware performance, while important, is only a third of the puzzle. Xbox 360 is a fusion of hardware, software and services. Without the software and services to power it, even the most powerful hardware becomes inconsequential. Xbox 360 games—by leveraging cutting-edge hardware, software, and services—will outperform the PlayStation 3. 

"""

Doesn't need it, huh?

And how come ports work fine on the Wii U if it doesn't have bandwidth on par with the 360's eDRAM or better? (Don't bring up the framerate, because that has more to do with the CPU, and the framerate isn't that different from the 360's anyway, just more unstable in most of the lazy ports, but not in ports where some optimization was done, like Need for Speed.)

The machine may have limits with the traditional approach of just rendering polygons, but using tessellation you can achieve better graphics, and we have to wait and see; of course bandwidth and latency are important things to take into account besides raw power. Even if the Wii U is about 400 or 500 gigaflops, that should be enough, because AMD already showed that by trading off about 30% of your GPU's performance (or 33fps) you can achieve about 400x more polygons with tessellation.

Come on dude, everybody knows that GPUs need bandwidth, lots of bandwidth; be it low, medium or high end, that doesn't change.

And remember, Shin'en can store a 1080p framebuffer in 16MB of eDRAM on the Wii U. So if the Xbox 360 needs its whole 10MB for 720p, wouldn't that mean the Wii U's eDRAM packs more bandwidth? (Do the math: if 1080p = 16MB, then 720p = 7.11MB.)
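The parenthetical "do the math" step is just scaling the claimed 16MB 1080p footprint by the ratio of pixel counts. A sketch of that arithmetic (the 16MB figure is the poster's claim about Shin'en, not a verified spec):

```python
# Scale a claimed framebuffer footprint by the ratio of pixel counts.
# The 16MB-at-1080p input is the poster's claim, not a verified figure.

def scaled_footprint_mb(footprint_mb, src_res, dst_res):
    """Footprint at dst_res, assuming it scales linearly with pixel count."""
    src_px = src_res[0] * src_res[1]
    dst_px = dst_res[0] * dst_res[1]
    return footprint_mb * dst_px / src_px

mb_720p = scaled_footprint_mb(16, (1920, 1080), (1280, 720))
print(round(mb_720p, 2))  # 7.11, the figure in the post
```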

 




@freedquaker, you really need to stop thinking of Espresso as being 3 Broadways duct-taped together and Latte as being 5 years old. Whilst Espresso is related to Broadway, you should remember that Broadway is a single-core processor unable to be pushed to 1GHz or more.

Have you actually seen the die shots of both..?

As with Latte, Espresso is a completely custom chip. Unfortunately the Latte discussion thread got locked on GAF, but I think the Espresso one is still going. You might want to give them a read and find out just how little we know about both chips.

And it should also be noted that Broadway is a VERY capable CPU anyway.



snowdog said:
@freedquaker, you really need to stop thinking of Espresso as being 3 Broadways duct-taped together and Latte as being 5 years old. Whilst Espresso is related to Broadway, you should remember that Broadway is a single-core processor unable to be pushed to 1GHz or more.

Have you actually seen the die shots of both..?

As with Latte, Espresso is a completely custom chip. Unfortunately the Latte discussion thread got locked on GAF, but I think the Espresso one is still going. You might want to give them a read and find out just how little we know about both chips.

And it should also be noted that Broadway is a VERY capable CPU anyway.


Actually, it can be based on Broadway, but it can't just be a Broadway, because the 750 line was not made for multicore, and that's a hardware problem, not a software one. We can see that here at IBM:

https://www-01.ibm.com/chips/techlib/techlib.nsf/techdocs/291C8D0EF3EAEC1687256B72005C745C#10

"

1.10 Can these processors be used in SMP designs? 

750CXe/FX/GX (and other 750 processors) can work in an SMP environment; it just takes extra work in the software and OS kernel, and there will be extra bus traffic. The fundamental problem is that the cache management instructions (in particular dcbf, dcbst, dcbi) only operate on the local CPU's caches by default; they are not broadcast on the 60x bus for other processors to see unless ABE is set. Other SMP-capable PowerPC implementations broadcast these operations so they act on all caches in the system. In addition, the 750 family doesn't broadcast TLB invalidations, and it doesn't snoop instructions on the bus, so it wouldn't pick up these other operations even if they were broadcast. So using these processors in an SMP design would require having the software and OS ensure that each CPU in the system performed each of these tasks every time it needed to be done by one of the CPUs.

Basically, SMP operation can be done, but it will require a lot of software overhead, which may impact overall performance for both the kernel and user application code. As with other performance characteristics, it will depend heavily on the application. We have no quantitative data, but if two MEI processors are used without consideration to how tasks are partitioned between the processors, there will be a penalty due to shared data that will be continuously flushed out of one processor when the other processor needs it, along with the maintenance problems of tlbie and dcb operations. If tasks can be partitioned such that there is very little data sharing, then there will be correspondingly very little overhead for maintaining coherency between the two processors.

"

 

MEI processors are not prepared for multicore; you need a MESI- or MERSI-coherent processor for that. So either Nintendo made lots of changes to the CPU for multicore, or they just picked another processor and made the necessary changes so it understands Broadway code.
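The coherence problem IBM describes can be shown with a toy model: a 750-style write-back cache that doesn't broadcast its flushes leaves stale data visible to the other CPU until software issues the equivalent of dcbf itself. This is purely illustrative, not an accurate model of the real hardware:

```python
# Toy write-back cache illustrating the IBM FAQ above: on the 750 family,
# cache management ops like dcbf act only on the LOCAL cache, so software
# must flush shared data explicitly before another CPU reads it.
# Purely illustrative; not a model of the actual 750 microarchitecture.

class Cache:
    def __init__(self, memory):
        self.memory = memory  # shared backing store
        self.lines = {}       # addr -> value, write-back, dirty until flushed

    def write(self, addr, value):
        self.lines[addr] = value          # dirty line stays local, not broadcast

    def read(self, addr):
        if addr in self.lines:
            return self.lines[addr]
        return self.memory.get(addr, 0)   # miss: fetch whatever memory holds

    def dcbf(self, addr):
        """Flush one line to memory (affects only this CPU's cache)."""
        if addr in self.lines:
            self.memory[addr] = self.lines.pop(addr)

memory = {}
cpu0, cpu1 = Cache(memory), Cache(memory)

cpu0.write(0x100, 42)
stale = cpu1.read(0x100)   # 0: cpu0's dirty line was never written back

cpu0.dcbf(0x100)           # software-managed coherence: explicit local flush
fresh = cpu1.read(0x100)   # 42: memory is now up to date
print(stale, fresh)
```

A MESI/MERSI machine would snoop cpu0's dirty line on cpu1's read automatically; here that burden falls entirely on software, which is exactly the overhead the FAQ warns about.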



snowdog said:
@freedquaker, you really need to stop thinking of Espresso as being 3 Broadways duct-taped together and Latte as being 5 years old. Whilst Espresso is related to Broadway, you should remember that Broadway is a single-core processor unable to be pushed to 1GHz or more.

Have you actually seen the die shots of both..?

As with Latte, Espresso is a completely custom chip. Unfortunately the Latte discussion thread got locked on GAF, but I think the Espresso one is still going. You might want to give them a read and find out just how little we know about both chips.

And it should also be noted that Broadway is a VERY capable CPU anyway.

Just read this; it is a very reasonable and realistic comparison...

WiiU vs PS4 GPU power, architectural differences and efficiency

http://gamrconnect.vgchartz.com/thread.php?id=173912&page=1




Even if the Wii U is just about 400 to 500 gigaflops and has between 563.2GB/s and a terabyte per second of bandwidth for its 32MB of eDRAM, it can still provide graphics that look ahead of the previous generation, but only if that power is used correctly on new things like tessellation, deferred shading, better textures and so on.