
Clearing up a major misconception about PowerPC

walsufnir said:
captain carot said:

As for Espresso, it likely isn't a 970. It's not a stock 750 PPC either. There were already changes made for Gekko, like some SIMD, FPU...

I go with marcan in this context, as he is quite a knowledgeable guy when it comes to Wii/Wii U hardware, and much of what we know about Espresso comes from him and his findings. In general it's of course not a stock 750 but derived from it, and I guess if he says it's a 750 we can likely believe him on this.

 

Edit: Concerning the ISA I found this: http://www.radgametools.com/bnkhist.htm

"Added Wii-U support for Bink 2 - play 30 Hz 1080p or 60 Hz 720p video! We didn't think this would be possible - the little non-SIMD CPU that could!" I didn't know it doesn't feature SIMD...


The SIMD and FPU changes were actually semi-official information from IBM years ago, back when Nintendo tended to be less secretive. I even had some saved websites with stuff like that, but lost them.

The basic 750 did not have SMP or SIMD; Intel had only just brought out MMX at the time. It is official that some changes were made for Gekko and that the SIMD was very limited, AFAIR: something like 32-bit operations only (paired singles) via the FPU. Still, that is basic SIMD, just not as specialized as MMX or 3DNow! were, and even those two are ancient now.
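To make the paired-singles idea concrete, here is a minimal C sketch (the function and data are made up for illustration, nothing from an actual SDK). On a plain 750 the loop body is two scalar FPU multiplies; a compiler targeting the Gekko/Espresso paired-single extension can keep two 32-bit floats in one FPU register and cover both with a single instruction such as ps_mul:

/* Scale an even-length array of 32-bit floats.
 * Plain 750: one scalar multiply per element.
 * Gekko/Espresso paired singles: a suitable compiler can load,
 * multiply and store two floats per iteration in one go. */
void scale_pairs(float *v, int n, float s)
{
    for (int i = 0; i + 1 < n; i += 2) {
        v[i]     *= s;   /* two independent 32-bit operations... */
        v[i + 1] *= s;   /* ...that map onto one paired-single multiply */
    }
}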

As for Espresso, I'm just saying that changes we don't know about would be possible, and there actually could be enough space on the die for some additions. It's not like I don't believe people like marcan. But everyone who knows 100% for sure has put his name under an NDA, which leaves some things in the dark.

And that goes not only for the CPU but for the GPU as well. Some devs have been talking about a Shader Model 5-like feature set, but said the API would need some workarounds. That would mean Latte is not based on an R700 GPU but on Evergreen. Funnily enough, the 5550 is an almost perfect fit: ROPs, TMUs, number of shader units, and actually even the power consumption fit.

 

There are just too many things about the Wii U that we don't know 100% for sure, or don't know at all and can only make educated guesses about.


That said, it definitely isn't powerful.



WolfpackN64 said:

I know this is in the Nintendo Discussion, but that's because they still use PowerPC chips in their console.

Nearly every time a topic touches on the internals of the Wii U, some comments bring up the "dated PowerPC architecture". This is complete bollocks. The Wii U chipset is derived from a dated PowerPC chip (Espresso, derived from Broadway, derived from Gekko, derived from the PowerPC 750), that is correct, but the PowerPC architecture itself is NOT dated.

The PowerPC architecture saw the light of day in 1992, and its current form, Power ISA v2.07, was released in 2013. For comparison, x86 started in 1978, with the most recent major extension, x86-64 (AMD64), arriving in 2003.

There, needed to get that off my chest.

Newly designed hardware is never dated, really, but that doesn't change the fact that the industry is currently moving away from that style of hardware and on to a different one (even though that tech is dated as well, apparently). PowerPC hardware is dated in the sense that last year's car model is dated.




Arkaign said:
In essence, Power8 is a jackhammer made from old melted-down cannonballs when you need a sharp chisel (unless you run a datacenter with existing infrastructure well suited to it and already up and running on 7+). You must remember that the Power8 is a 22nm product at 250+W TDP, lol.

 

Guess what: it's 22nm on purpose. 250+W TDP? Who cares - it's installed in a climate-controlled area anyway.

You simply don't have any clue about what's needed to drive high-throughput transaction applications. That's why IBM and Oracle/Sun are still around and have no serious competition to speak of - and they will gain an even bigger lead over "simple" x86 CPUs as they put software into silicon...



fleischr said:
Good post.

Every time this topic comes around, there's always some moron who spouts a bunch of nonsense and puts out the myth that the WiiU's internals are exactly the same as the GameCube's.

 

It doesn't matter how old the Wii U's architecture is. The fact is that using PowerPC is outdated, since x86 and ARM are far easier to program for and have become the standard architectures of gaming.



Captain_Tom said:
fleischr said:
Good post.

Every time this topic comes around, there's always some moron who spouts a bunch of nonsense and puts out the myth that the WiiU's internals are exactly the same as the GameCube's.

 

It doesn't matter how old the Wii U's architecture is. The fact is that using PowerPC is outdated, since x86 and ARM are far easier to program for and have become the standard architectures of gaming.


Far easier to program for? What? Especially ARM... no, the Power ISA has been known for many years and IBM is known for especially good compilers.



walsufnir said:
Captain_Tom said:
fleischr said:
Good post.

Every time this topic comes around, there's always some moron who spouts a bunch of nonsense and puts out the myth that the WiiU's internals are exactly the same as the GameCube's.

 

It doesn't matter how old the Wii U's architecture is. The fact is that using PowerPC is outdated, since x86 and ARM are far easier to program for and have become the standard architectures of gaming.


Far easier to program for? What? Especially ARM... no, the Power ISA has been known for many years and IBM is known for especially good compilers.

At the very least they are better known by game programmers, and thus easier. All major game engines are built to work on them, making a port to the Wii U annoying.



Captain_Tom said:

At the very least they are better known by game programmers, and thus easier. All major game engines are built to work on them, making a port to the Wii U annoying.

How does that factor into making it easier to program for those devices when a compiler can simply generate instructions for multiple ISAs?
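As a rough illustration of that point (a made-up hello-world, with example GCC cross-toolchain names that are assumed to be installed), the same portable C source builds for x86, PowerPC or ARM simply by switching compilers:

/* hello_isa.c - identical source for every target ISA */
#include <stdio.h>

int main(void)
{
    printf("Same C source, different instruction set underneath.\n");
    return 0;
}

/* Example builds, assuming these cross toolchains are available:
 *   gcc hello_isa.c -o hello_x86                      (native x86-64)
 *   powerpc-linux-gnu-gcc hello_isa.c -o hello_ppc    (PowerPC)
 *   aarch64-linux-gnu-gcc hello_isa.c -o hello_arm    (64-bit ARM)
 */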



fatslob-:O said:
Captain_Tom said:

At the very least they are better known by game programmers, and thus easier. All major game engines are built to work on them, making a port to the Wii U annoying.

How does that factor into making it easier to program for those devices when a compiler can simply generate instructions for multiple ISAs?


LOL it's not just that "Simple."  I am only saying what has been repeated by numerous devs...and common sense of course.



Captain_Tom said:
fatslob-:O said:

How does that factor into making it easier to program for those devices when a compiler can simply generate instructions for multiple ISAs?


LOL it's not just that "Simple."  I am only saying what has been repeated by numerous devs...and common sense of course.


Not common sense. Common sense is that compilers do the hard work for the programmer. The most important thing a developer can do is study how their data structures, data and program flow interact with the CPU caches and act accordingly (a small example is sketched below).

So: since the Wii U has MORE L2 cache per core than the XOne and PS4 cores, code runs more efficiently on the Wii U cores. But the HD twins more than make up for it with more cores and - of course - a better GPU.

BTW: take a look at the PS4 developer presentations. The small L2 cache is hurting BIG time when CPU AND GPU are accessing the GDDR...

Or the other way round: the Wii U's big L2 cache, big eDRAM and balanced CPU core / GPU let it deliver more than most people expected from such a configuration...
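Here is a minimal sketch of the cache point (hypothetical array and sizes, nothing console-specific): the same data walked in two different orders can differ massively in cache misses, which hurts far more on a core with a small L2 than on one with a big L2 plus eDRAM:

#include <stddef.h>

#define N 1024

/* Row-major 2D array: elements of a row are adjacent in memory. */
static float grid[N][N];

/* Cache-friendly: walks memory sequentially, so every cache line
 * pulled in from RAM is fully used before it gets evicted. */
float sum_row_major(void)
{
    float sum = 0.0f;
    for (size_t r = 0; r < N; ++r)
        for (size_t c = 0; c < N; ++c)
            sum += grid[r][c];
    return sum;
}

/* Cache-hostile: jumps N * sizeof(float) bytes between accesses,
 * touching a new cache line almost every time and thrashing a small L2. */
float sum_column_major(void)
{
    float sum = 0.0f;
    for (size_t c = 0; c < N; ++c)
        for (size_t r = 0; r < N; ++r)
            sum += grid[r][c];
    return sum;
}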



Captain_Tom said:


LOL it's not just that "Simple."  I am only saying what has been repeated by numerous devs...and common sense of course.

So you base your argument on hearsay rather than technical aspects. OK then...