Soleron said: I think AMD provides for that need better. Not now, of course, but with Llano. |
BTW, nothing forbids an AMD-IBM joint venture doing a GPU + PPC CPU instead of a GPU + x86 CPU chip. The AMD-IBM relationship is very good.
Something tells me that Nintendo will not abandon IBM; it has been working well for them so far, and I think it will work in the next gen too...
Alby_da_Wolf said:
BTW, nothing forbids an AMD-IBM joint venture doing a GPU + PPC CPU instead of a GPU + x86 CPU chip. The AMD-IBM relationship is very good. |
I suppose not, but Fusion (done right) is an extremely complex piece of design, and I doubt you could just swap the CPU out for a different architecture altogether. The market for the chips would be 50-100m units, and I'm not sure that's enough to fund the design effort.
So assuming they're using an AMD GPU, it makes sense to use either Llano or Ontario (Bobcat + GPU) as the design is already done and they get the integration work at low cost. It'll still be on an IBM fabrication process (32nm SOI).
Squilliam said:
PowerPC is likely out because I believe Nintendo will want to have a combined CPU+GPU on the same die. It means a great savings in power, heat, packaging size and board complexity. It only really leaves AMD and Intel as viable options. |
If the machine was designed for realtime ray-tracing it wouldn't necessarily need a GPU. Furthermore, the papers I've read involving the RPU (a different ray-tracing chip) seem to conclude that the kind of architecture you would want to pair with ray-tracing hardware is a so-called "broadband" architecture (high memory bandwidth).
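To make that concrete, here's the inner loop of any ray tracer, a ray-sphere intersection test, as a minimal Python sketch (my own illustration, not anything from the RPU papers). Every single ray burns a pile of floating-point ops *and* has to fetch scene data, which is exactly why those papers push high-bandwidth "broadband" designs:

```python
import math

def ray_sphere_t(ox, oy, oz, dx, dy, dz, cx, cy, cz, r):
    """Nearest positive hit distance t along the ray, or None on a miss.
    Solves |o + t*d - c|^2 = r^2, assuming the direction d is unit length."""
    lx, ly, lz = cx - ox, cy - oy, cz - oz        # vector from origin to centre
    tca = lx*dx + ly*dy + lz*dz                   # projection of it onto the ray
    d2 = lx*lx + ly*ly + lz*lz - tca*tca          # squared closest-approach distance
    if d2 > r*r:
        return None                               # ray passes outside the sphere
    thc = math.sqrt(r*r - d2)
    t = tca - thc                                 # nearer of the two roots
    return t if t > 0 else None                   # None if the hit is behind us

# A ray from the origin along +z hits a unit sphere centred at (0, 0, 5)
# at z = 4, i.e. t = 4.0.
t = ray_sphere_t(0, 0, 0, 0, 0, 1, 0, 0, 5, 1.0)
```

That's roughly 15-20 FLOPs for one ray against one sphere; a real scene runs millions of rays per frame against a big acceleration structure, so the workload is dominated by FP throughput and (incoherent) memory traffic rather than by anything a fixed-function raster GPU provides.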
alephnull said:
If the machine was designed for realtime ray-tracing it wouldn't necessarily need a GPU. Furthermore, the papers I've read involving the RPU (a different ray-tracing chip) seem to conclude that the kind of architecture you would want to pair with ray-tracing hardware is a so-called "broadband" architecture (high memory bandwidth). |
Nintendo is not a graphics technology company. As we've seen with the Wii and DS, they don't feel extra graphics power beyond a certain point (at the expense of cost, power and development effort) sells the console or its games.
Ray-tracing is far into the 'technology for its own sake' bracket, because its addition wouldn't sell those games that drive console momentum (Wii Sports, Wii Fit, NSMB Wii).
They'll let competitors pioneer this kind of thing, and instead focus on the technologies that actually matter (touch screens, motion control).
Nintendo chose fixed-function graphics hardware for the Wii when unified-shader chips were available. If Nintendo were going to use ray-tracing, they would need either a CPU a few orders of magnitude faster at FP ops than any current CPU, or a GPU with a fundamentally more flexible architecture than any present GPU. Neither exists at retail in 2010, 2011 or 2012 within a small enough power envelope or price (Larrabee drew 300W before it was cancelled, and still wasn't fast enough to raytrace anything).
http://www.beyond3d.com/content/articles/94/
Good article on why ray-tracing isn't necessarily the way of the future. And before anyone slags off the site, these guys are some of the best analysts of graphical architectures and techniques around.
OT, x86 fails on GFLOPS per mm², which indirectly becomes GFLOPS per dollar. PowerPC is much more cost-effective, cramming more compute into each mm² of silicon. For a company like Nintendo, I'd say that matters more than having the bleeding edge. Furthermore, in many ways IBM's fabs are more advanced than Intel's, keeping the processors on up-to-date processes.
And lastly, Larrabee is currently vapourware. The first version never saw the light of day, and there is no guarantee the second version will get out the door. The only reason Intel focuses on raytracing so much is that, by its very design, Larrabee 1 sucked at rasterisation; the main justification Intel gave for shitcanning Larrabee 1 was that its performance just wasn't competitive with contemporary GPUs for the amount of silicon required.
If we think the Wii2 will come out in 2012-13, the design will need to be finalised by mid-2011 at the latest. There has been no indication from Intel that Larrabee 2 would be close to ready by then.
Personally, I think there is more chance Nintendo gets a Cell-lite from IBM than x86 from Intel. That way compatibility is maintained, performance is massively increased, and more importantly, project risk is reduced. Going forward I see consoles relying less and less on raw CPU power and more and more on GPU power, in line with PCs.
Ray tracing is a tech feature. It won't help gaming get better.
Soleron said: I think AMD provides for that need better. Not now, of course, but with Llano. |
/thread
Soleron nailed why I think Fusion is the best option for Nintendo.
alephnull said:
If the machine was designed for realtime ray-tracing it wouldn't necessarily need a GPU. Furthermore, the papers I've read involving the RPU (a different ray-tracing chip) seem to conclude that the kind of architecture you would want to pair with ray-tracing hardware is a so-called "broadband" architecture (high memory bandwidth). |
I like the idea of Larrabee as an acceptable compromise, as it can do everything. However, it looks like the jig is up and my nefarious scheme for stimulating debate about more technical ideas in the Nintendo forum is coming to an end.
Tease.
Squilliam said:
I like the idea of Larrabee as an acceptable compromise, as it can do everything. However, it looks like the jig is up and my nefarious scheme for stimulating debate about more technical ideas in the Nintendo forum is coming to an end. |
They already missed a raft of release dates, then all but cancelled it a few months ago (it should reappear in another form at some point, but...). Nintendo wouldn't bet on an unproven chip; they're not Sony.