Mazty said:
Viper1 said:

The only claim we have about the CPU is that it appears to be clocked slower. And that makes sense. But being clocked slower does not make it weaker. Just compare AMD to Intel back in the late 90's, or Intel against AMD since the mid 2000's.

If a dev is using brute clock speed to handle AI routines or the number of enemies on screen, then yes, a slower clock may be an issue. But it's not about how high your clock rate is, it's about how much you can do with each clock cycle. If a dev isn't yet familiar with working with the CPU, they may not yet be able to squeeze out that extra performance.

For example, say a CPU runs at 1 GHz and can perform 100 operations per clock cycle. That's 100 billion operations per second. Say a second CPU runs at just 600 MHz but can perform 200 operations per clock cycle. That second CPU is actually capable of 120 billion operations per second. See how the second CPU is actually more powerful than the one with the faster clock rate?
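The napkin math above, as a quick Python sketch (the clock rates and ops-per-cycle figures are the hypothetical ones from the example, not real hardware specs):

```python
# Peak throughput = clock rate x operations completed per clock cycle.
# Numbers below are the made-up ones from the example, not real CPUs.
def ops_per_second(clock_hz: int, ops_per_cycle: int) -> int:
    """Peak operations per second for a given clock and per-cycle width."""
    return clock_hz * ops_per_cycle

cpu_a = ops_per_second(1_000_000_000, 100)  # 1 GHz, 100 ops/cycle
cpu_b = ops_per_second(600_000_000, 200)    # 600 MHz, 200 ops/cycle

print(cpu_a)  # 100 billion ops/s
print(cpu_b)  # 120 billion ops/s: slower clock, higher throughput
```

The slower-clocked CPU wins on peak throughput, which is the whole point: clock rate alone tells you nothing.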

As for the GPU, well, we already know it's more powerful.  That's been stated by several developers.

So again, your own definition still grants the Wii U "next gen" status.

Notably better is subjective. What's notable to you, to me and to everyone else simply won't be the same. You'd need a defined specification increase, and the problem is that's a damn hard thing to actually calculate. Total system FLOPS is one way, but even that doesn't work, because the PS3 is a FLOPS beast that can't use them all in game. FLOP for FLOP, the PS3 crushes the X360, but you can't see that in game at all.

Since you really can't define a parameter, one that can be amicably agreed upon, to denote next generation, the predecessor/successor relationship of flagship consoles has been the industry-accepted definition for a long while now.


Low clock speeds, low power consumption, tiny heat sink. 
The CPU will suck.

The Wii U doesn't show any sizeable improvement in technology. It's not running anything close to even the 5870, or even the 5770. I have a GTX 560 Ti that's now almost 2 years old. It wasn't a just-released card when I bought it, and yet my PC will still be considerably more powerful in every way than a "next-gen" console. Should that be the case? I don't think so.

Funny, I'm using a tiny CPU on a computer in my workshop with a lower clock speed, low power consumption and a tiny heat sink, and it's more powerful than the 6 year old AMD CPU I have in another desktop next to it. If you don't know anything about how CPUs work, try not to discuss them too much. You are making yourself look bad.

PCs have a combination of a 12-13 month generation cycle and power budgets a console could only dream of. That HD 5870 has a TDP of 228 watts. Your GTX 560 Ti is 170 watts. Not even the PS4 and Next X will have GPUs that eat up that many watts. In fact, it will barely even be half of that.

I highly advise you to just stop.   Go back and do some research.  Learn about PC's, CPU's, GPU's, game consoles, etc...   Because every point you are trying to make is making you look bad.  And I'm being serious.   I'd rather have a good debate with you when you know what you are talking about rather than debate you as you are now because I'm starting to feel sorry for you.



The rEVOLution is not being televised