
http://www.beyond3d.com/content/articles/94/

Good article on why ray-tracing isn't necessarily the way of the future. And before anyone slags off the site, these guys are some of the best analysts of graphics architectures and techniques around.

OT, but x86 fails on GFlops per mm², which indirectly becomes GFlops per $. PowerPC is much more cost effective, cramming more compute into each mm² of silicon. For a company like Nintendo, I'd say that matters more than having the bleeding edge. Furthermore, in many ways IBM's fabs are more advanced than Intel's, keeping its processors on up-to-date processes.
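To make the perf-per-$ point concrete, here's a quick back-of-the-envelope sketch (the wafer cost, usable area, densities and yields below are made-up placeholders, not real Intel or IBM figures). The idea is just that once die cost roughly tracks area divided by yield, compute density per mm² falls straight through to compute per dollar:

```python
# Hypothetical perf-per-dollar arithmetic; every number here is a
# placeholder for illustration, not a real Intel/IBM/Nintendo figure.

WAFER_COST = 5000.0     # assumed $ per 300 mm wafer
WAFER_AREA = 70_000.0   # assumed usable mm^2 per wafer

def gflops_per_dollar(gflops_per_mm2, die_mm2, yield_rate):
    """Perf/$ falls out of perf density once die cost ~ area / yield."""
    dies_per_wafer = WAFER_AREA / die_mm2
    cost_per_good_die = WAFER_COST / (dies_per_wafer * yield_rate)
    return gflops_per_mm2 * die_mm2 / cost_per_good_die

# A denser PPC-style part vs a fatter x86-style part, same raw GFlops:
print(gflops_per_dollar(gflops_per_mm2=1.0, die_mm2=120, yield_rate=0.8))  # ~11.2
print(gflops_per_dollar(gflops_per_mm2=0.6, die_mm2=200, yield_rate=0.8))  # ~6.7
```

With these made-up inputs both chips deliver 120 GFlops, but the denser die comes out nearly twice as cheap per GFlop, which is exactly the kind of margin a console vendor cares about.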

And lastly, Larrabee is currently vapourware. The first version never saw the light of day, and there is no guarantee the second version will get out the door. The only reason Intel focusses on raytracing so much is that, by its very design, Larrabee 1 sucked at rasterisation; the main justification Intel had for shitcanning Larrabee 1 was that its performance just wasn't competitive with its competitors' GPUs for the amount of silicon required.

If we think the Wii2 will come out in 2012-13, the design will need to be finalised by mid-2011 at the latest. There has been no indication from Intel that Larrabee 2 would be anywhere close to ready by then.

Personally, I think there is more chance Nintendo will get a Cell-lite off IBM than getting x86 from Intel. That way compatibility is maintained (the Wii's Broadway is already a PowerPC part), performance is massively increased, and, more importantly, project risk is reduced. Going forward I see consoles relying less and less on raw CPU power and more and more on GPU power, in line with PCs.