
Nintendo - NX power

Pemalite said:

But it's still based on P6. The P6 evolved into the Pentium II and Pentium III... Then, after a momentary lapse of judgement with NetBurst, it was used in the Pentium M, which then formed the basis of Core Solo/Duo. - Then Intel developed the Core architecture, an iteration of P6.

The point was, despite them all sharing the same "starting point", they all increased performance over the Pentium Pro for one reason or another; sometimes those changes were fundamentally tiny and Intel simply took advantage of clock speeds and better interface technologies.

Fact of the matter is, Nintendo's PowerPC-derived CPU would have gone through a few revisions, respins and changes for the Wii U; it's a CPU based on Broadway, not an exact clone with higher clocks... And let's be honest, Broadway/Gekko weren't completely inadequate anyway, they managed to get the job done.
In the end I personally think the Wii U has a better CPU-to-GPU performance balance than the Xbox One or PlayStation 4 does, even if it is still laughable in the grand scheme of things.

To put it into perspective anyway...
A Tualatin-core Pentium III released in 2001 would, on a clock-for-clock, core-for-core basis, still be faster than an in-order Intel Atom released almost a decade later.

The starting point doesn't matter much, and sure, Intel took advantage of Dennard scaling, but those days are over ...

@Bold The problem with that statement is that Marcan didn't disclose that it wasn't a clone. He straight up said it was three Broadways on a die!

I think it is the X1 that has the best CPU-to-GPU performance ratio. In terms of floating point performance, the X1 has the least skewed ratio when taking 176 GFLOPS as a fair estimate for Latte. In terms of integer performance, it is also the X1, since Nintendo's PPC 750 derivative is even weaker in that aspect than in floating point. For branching, I think this is where Espresso may have an advantage, but AMD's VLIW5 architecture was notorious for being poor in that aspect. With GCN you can actually write highly performant uber-shader code just like on other modern GPU architectures, and it even supports indirect branching too, which further puts VLIW5 to shame. You can very much get more CPU-like performance out of the HD twins in other ways, like programming a GPU such as GCN as if it were a CPU! After all, the only things special to a GPU are its fixed-function units ...
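The FLOPS-ratio argument above can be sketched with rough peak numbers. A minimal back-of-the-envelope script follows; everything in it except the 176 GFLOPS Latte figure quoted above is my own assumption (clock speeds, FLOPs per cycle, GPU peaks), so treat it as an illustration, not gospel:

```python
# Rough peak-FP estimates in GFLOPS. All figures are assumptions
# except the 176 GFLOPS Latte estimate quoted in the post.
def peak_gflops(cores, ghz, flops_per_cycle):
    return cores * ghz * flops_per_cycle

# (CPU GFLOPS, GPU GFLOPS) per console
systems = {
    # Espresso: 3 PPC 750-derived cores @ ~1.24 GHz, paired singles -> 2 FLOPs/cycle
    "Wii U": (peak_gflops(3, 1.24, 2), 176.0),
    # Xbox One: 8 Jaguar cores @ 1.75 GHz, ~8 FLOPs/cycle; ~1.31 TFLOPS GPU
    "Xbox One": (peak_gflops(8, 1.75, 8), 1310.0),
    # PS4: 8 Jaguar cores @ 1.6 GHz; ~1.84 TFLOPS GPU
    "PS4": (peak_gflops(8, 1.6, 8), 1843.0),
}

for name, (cpu, gpu) in systems.items():
    print(f"{name}: GPU/CPU ratio ~ {gpu / cpu:.1f}x")
```

With these assumed numbers the Xbox One comes out with the least skewed GPU-to-CPU ratio and the Wii U with the most skewed, which is the point being argued; swap in your own estimates and the ratios recompute.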

Pemalite said:

Subpixel Morphological Anti-Aliasing, or SMAA, gained traction too. It was used in Watch Dogs, Dying Light and other games. Pretty much every Unreal Engine 3-powered game late in the generation used MLAA or a variation of it.

For the majority of last gen it was FXAA; SMAA only started getting interesting in the current gen ...

Pemalite said:

Xenos wasn't VLIW. Or "Very Long Instruction Word" in the traditional TeraScale sense.

In VLIW, an "SPU" or "Unit" is broken down into 4-5 processing units, each capable of executing individual instructions in parallel.

Now, you *could* trace that back to the Radeon 9000 days, before the Radeon X1900 series, before the Radeon X800 series, but Xenos being TeraScale-like? It's not.
Xenos shares more of its fundamental architecture principles with the Radeon X19xx series, which is actually a good thing; the Radeon HD 2xxx series was a dog.

It definitely has some differences from TeraScale, but Xenos is most certainly VLIW ...

Pemalite said:

I used to think so too. But I suggest you take a look at Morrowind with tessellation; you might just walk away surprised at the difference that a "Primitive Tessellator" can make.

Games as well as hardware used to be very different back then. Most meshes at that time were made up of only hundreds of vertices, and TruForm was only built for small amounts of data expansion so that devs wouldn't go around abusing it too much. Games today are compute-limited, and increasing the number of fragments will have a large impact on shading and rasterization performance ...
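To see why cranking the factor hurts, here is a toy sketch of the geometry amplification: uniformly tessellating a triangle with factor f splits each edge into f segments and yields roughly f² sub-triangles. The mesh size and factors below are made-up numbers for illustration:

```python
def tess_triangles(patches, factor):
    # Uniform subdivision: splitting each edge into `factor` segments
    # turns one triangle patch into factor**2 smaller triangles.
    return patches * factor ** 2

base = 500  # hypothetical low-poly mesh: 500 triangles
for f in (1, 4, 16, 64):
    print(f"factor {f:2d}: {tess_triangles(base, f):,} triangles")
```

A 500-triangle mesh at factor 64 balloons into over two million micro-triangles, each of which still has to be rasterized and shaded, which is where the shading and rasterization cost mentioned above comes from.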

Pemalite said:

Normally I would say "Wait for the software". - But considering how poorly the console is selling... Everyone has abandoned that ship.

From my own playing around with a Radeon HD 6450, you would be surprised how well you can make something like Unigine Heaven look, even with low tessellation factors, whilst keeping it at 30fps.

We did wait for more software, and you'd be hard-pressed to find anyone arguing that the Wii U has a definitive or absolute edge over the sub-HD twins in performance ...

An HD 6450 is pathetic even in Unigine Heaven ...