forethought14 said:
fatslob-:O said:

 
@Bold I got it from this "And why are you comparing Wii U to GCN at all?"

Next time, refer to the Wii U as VLIW.

I don't know about you, bro, but a 64-bit SIMD engine is pathetic. The PS2 had a 128-bit SIMD engine, so why not the Wii U?

BTW, the PS4 and X1 probably feature a 256-bit SIMD engine.

Now that I look at the CPU more and more, it's just a modified IBM Broadway or Gekko, only this time it has 3 cores with higher clock speeds and maybe some new instruction sets. That same IBM Broadway processor also features the same 64-bit SIMD engine as the IBM Gekko. Shame on Nintendo for not moving on to a more advanced CPU architecture; instead they kept it practically the same as the Gamecube to get the backwards compatibility.

We were talking about GPGPU, and then we strayed to how VLIW is inefficient compared to GCN. Later, I asked why you were comparing Wii U (VLIW) to GCN when they're not the same architecture. I'm still confused about how you brought "Gamecube" into this. I merely stated that GPGPU will play an important factor in all 3 consoles, with GCN being easier to use than VLIW. If you thought I was talking about Gamecube, well, your reading comprehension skills failed you at that line.

http://www.neogaf.com/forum/showpost.php?p=50767125&postcount=3756

Pathetic? Well, maybe compared to an i5 or another multi-hundred-dollar CPU, but for a 2 × 32-bit paired-singles CPU (which is technically a "form" of SIMD) clocked at 1.24 GHz (if we use Broadway as a base and just increase clocks), it's actually very competitive with a more modern design like Bobcat, the predecessor to the Jaguar (2 of the 8 cores are reserved) that we'll be seeing in PS4/X1. Core for core, taking advantage of each other's capabilities (and based on these tests), Espresso and Jaguar should not be too far off.

You don't refer to VLIW as "Wii U"; use proper codenames next time on your part to avoid confusion.

As for why I compare it to GCN: GPU architectures are getting more modern by the day, and while that's not the Wii U's fault, it's going up against next-gen consoles, so it's only fair. If I wanted, I could have compared it to Fermi, because they're pretty similar, and Kepler isn't too far off either, even though it lost its hardware scheduler.

If we take the fact that the IBM Broadway can pull off 2.9 Gflops and scale it to the IBM Espresso, it can only muster up about 15 Gflops.

BTW, what is featured in the PS4/X1 will be able to pull off around 100 Gflops.
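The scaling above can be sanity-checked with a back-of-the-envelope peak-FLOPS formula. A rough sketch (the clock speeds and FLOPS-per-cycle figures below are commonly cited estimates from this thread era, not official specs):

```python
# Rough theoretical peak: GFLOPS = cores * clock (GHz) * FLOPS per core per cycle.
# All figures are unofficial, commonly cited estimates.

def peak_gflops(cores, clock_ghz, flops_per_cycle):
    return cores * clock_ghz * flops_per_cycle

# Broadway: 1 core @ 0.729 GHz, paired singles with fused multiply-add -> 4 FLOPS/cycle
broadway = peak_gflops(1, 0.729, 4)   # ~2.9 GFLOPS

# Espresso: 3 Broadway-like cores @ ~1.24 GHz (rumored clock)
espresso = peak_gflops(3, 1.24, 4)    # ~15 GFLOPS

# Jaguar (PS4/X1): 8 cores @ ~1.6 GHz, 128-bit FP units -> ~8 FLOPS/cycle
jaguar = peak_gflops(8, 1.6, 8)       # ~102 GFLOPS

print(broadway, espresso, jaguar)
```

These are theoretical ceilings; sustained throughput in real code is far lower on all three chips.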

You overestimated its capabilities.

Edit: Holy crap, I just realized something: the PS4/X1's CPU is about 30% as strong as the Wii U's GPU.

Edit 2: How does that sh#t happen? Like, seriously, I thought a GPU was supposed to be wayyyyy stronger than a CPU, especially since it's VLIW, which is like a godsend for Gflops/watt and price. I think Nintendo got ripped off by AMD LOLOL.

Edit 3: This thing is being destroyed by those new Haswell CPUs, which feature up to 400 Gflops of performance from the CPU alone, and that's not even counting the iGPU.

Edit 4: Dude, this thing is weaker than a PS3 (flops-wise, of course), which can pull off 400 Gflops combined, compared to the Wii U's 365 Gflops combined.
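The combined Wii U total above can be roughly reconstructed from the commonly rumored specs (320 shader ALUs at 550 MHz for the GPU; none of these numbers are official, so treat this as a sketch of the arithmetic, not confirmed hardware data):

```python
# Unofficial, rumor-based peak estimates for the Wii U's combined GFLOPS.

# GPU ("Latte"): rumored 320 ALUs @ 0.55 GHz, 2 FLOPS per ALU per cycle (multiply-add)
wiiu_gpu = 320 * 0.55 * 2            # ~352 GFLOPS

# CPU ("Espresso"): 3 cores @ ~1.24 GHz, 4 FLOPS/cycle (paired singles + FMA)
wiiu_cpu = 3 * 1.24 * 4              # ~15 GFLOPS

wiiu_combined = wiiu_gpu + wiiu_cpu  # ~367 GFLOPS, close to the "365" figure quoted

# The "30%" claim from Edit 1: Jaguar CPU peak vs. Wii U GPU peak
jaguar = 8 * 1.6 * 8                 # ~102 GFLOPS
ratio = jaguar / wiiu_gpu            # ~0.29

print(wiiu_combined, ratio)
```

The PS3's "400 Gflops combined" figure is taken as quoted in the post; published Cell/RSX estimates vary widely depending on what is counted.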

Edit 5: Now I seriously know why 4A Games didn't bother with a port of Metro: Last Light, seeing as this is inadequate for any sort of rigid-body physics in that game.

BTW, I don't hate the Wii U; I expect some good games on it too.