fatslob-:O said:

@Bold you realize that GCN is an abbreviation for two things: GameCube (Nintendo) or Graphics Core Next, which is the newest GPU architecture from AMD.

"you're still making it sound like Wii U's GPGPU capabilities aren't too useful, why? "

You already know the reason, say it with me: VLIW for GPU compute is inefficient :P (BTW, VLIW is practically bad in almost all cases for compute.)

"Anyway, even though floating point is supposed to be Espresso's weak points, if used correctly it's not so bad at all. I recall  that a NeoGaf member tested several CPUs in floating point / SIMD, and a PPC750CL (Broadway) didn't do too bad at all compared to other CPUs. (if you havne't seen it before, I'll link you to the post, and if I could find it.) And if Espresso is 3 Broadways with higher clocks and more cache, then it's not to hard to estimate how capable Espresso should be."

Whoa there bro, I wasn't talking about the IBM Espresso in my last post. It's fine for what it does, but don't go hoping for mind-blowing in-game physics.

We're talking about AMD GPU architecture, right? How the hell did you get GameCube out of that? I'm talking about Graphics Core Next. If I ever refer to the GameCube when speaking of AMD architecture (no idea why I would do that), I'll use the actual term "GameCube".

*sigh* Dude, I've known that VLIW4/5 are inefficient for compute compared to GCN since GCN was announced, but you're still making it seem like it might as well be ignored. If it's documented in the Wii U SDK, and if it works how they want, then developers will use it. If it's too much work to even bother with, then they won't.
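And to be clear on what "inefficient" actually means here: the usual argument is about instruction-level parallelism. VLIW4/5 only hits full throughput when the compiler can pack four or five independent operations into each wide instruction, while GCN's SIMD units don't depend on that packing. Here's a rough sketch of the difference in plain C (purely illustrative, not Wii U SDK code; the loops and names are made up):

```c
#include <stddef.h>

/* Lots of independent work per iteration: a VLIW compiler can fill
 * most of its 4/5 slots, so utilization is decent. */
void independent_ops(float *a, float *b, float *c, float *d, size_t n)
{
    for (size_t i = 0; i < n; i++) {
        a[i] = a[i] * 2.0f;   /* these four operations don't depend   */
        b[i] = b[i] + 1.0f;   /* on each other, so they can be packed */
        c[i] = c[i] - 3.0f;   /* into one wide VLIW instruction       */
        d[i] = d[i] * 0.5f;
    }
}

/* A serial dependency chain: each step needs the previous result, so
 * most VLIW slots go empty and the hardware sits idle.  GCN-style
 * issue doesn't waste width here, which is one reason it behaves
 * better on irregular compute workloads. */
float dependent_chain(const float *x, size_t n)
{
    float acc = 1.0f;
    for (size_t i = 0; i < n; i++)
        acc = acc * x[i] + 0.5f;  /* depends on the previous iteration */
    return acc;
}
```

Graphics shaders tend to look like the first loop, which is why VLIW was fine for them; general compute has a lot more of the second, which is the whole point of GCN.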

Well, I did say that the CPU's floating point / SIMD wasn't its strongest point, and that GPGPU could allow some leeway. You then said that we shouldn't expect anything amazing from the CPU because GPGPU won't help with the things that are best done on the CPU, which I agreed with. That post in particular was a response to your comment about the CPU not being able to do much with floating point / SIMD type code. Compared to what we'll see from the PS4/X1 CPUs in a core-for-core comparison (since they're the Wii U's competition), it's not far off.
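For what it's worth, by "leeway" I mean a split roughly like the one below: wide, branch-free floating-point loops are the sort of thing a GPGPU path (however the Wii U SDK actually exposes it) can absorb, while branchy, pointer-chasing game logic stays on Espresso no matter what. A hypothetical sketch in plain C; the functions and data layout are made up:

```c
#include <stddef.h>

typedef struct { float x, y, z; } vec3;

/* Embarrassingly parallel, pure floating-point math: every particle is
 * independent, so this is the sort of loop that could be moved off the
 * CPU onto GPU compute (a good GPGPU candidate). */
void integrate_particles(vec3 *pos, const vec3 *vel, size_t n, float dt)
{
    for (size_t i = 0; i < n; i++) {
        pos[i].x += vel[i].x * dt;
        pos[i].y += vel[i].y * dt;
        pos[i].z += vel[i].z * dt;
    }
}

/* Branchy, serial decision-making (AI, scripting, game state): the
 * kind of code GPGPU doesn't help with, which is why the CPU still
 * matters even with compute offload. */
typedef struct enemy { int hp; int state; struct enemy *target; } enemy;

void update_enemy(enemy *e)
{
    if (e->hp <= 0) {
        e->state = 0;               /* dead  */
    } else if (e->target && e->target->hp > 0) {
        e->state = 1;               /* chase */
    } else {
        e->state = 2;               /* idle  */
    }
}
```

Nobody's expecting the first loop to turn Espresso into a SIMD monster; the point is just that the heaviest float work doesn't have to live on the CPU in the first place.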