
Nintendo Discussion: Nintendo + Intel will be the killer next generation combo.

Soleron said:

I think AMD provides for that need better. Not now, of course, but with Llano.

Consoles need a good GPU more than a CPU, and what they really need is low cost and therefore tight integration. Llano has four improved Phenom II cores and ~400 SPs of DX11-class graphics power, which is equivalent to a current HD56xx chip. The on-die memory controller eliminates the traditional integrated graphics bottleneck.

Llano is a two-chip [CPU+GPU, SB] solution (as opposed to Intel's three: [CPU + sucky GPU, good GPU, SB]). Intel's graphics are nowhere near good enough to power a console, so it would require an AMD or Nvidia graphics chip anyway, whereas AMD's solution gives you a good-enough CPU with an AMD GPU. Larrabee is not going to be ready in time; even Intel's 2013 CPU (Haswell) doesn't use a Larrabee GPU.

Llano's power use will also finally be competitive with Intel's mobile offerings, unlike current AMD chips, due to power gating and the C1E state.

Better yet, the technology is proven. Nintendo is conservative, but Llano has an existing CPU and an existing GPU. The new part is the low power focus and integration work. Llano is very late Q4 2010 or early Q1 2011 too so it would easily fit with a late 2011 or 2012 Wii successor, if Nintendo started planning this.

Essentially, if Nintendo is to carry on using an AMD GPU like the last three consoles, AMD will easily be able to tempt them into throwing a CPU on as well, given the unrivalled integration.

BTW, nothing forbids an AMD-IBM joint venture doing a GPU + PPC CPU instead of a GPU + x86 CPU chip. AMD's relationship with IBM is very good.



Stwike him, Centuwion. Stwike him vewy wuffly! (Pontius Pilate, "Life of Brian")
A fart without stink is like a sky without stars.
TGS, Third Grade Shooter: brand new genre invented by Kevin Butler exclusively for Natal WiiToo Kinect. PEW! PEW-PEW-PEW! 
 



Something tells me that Nintendo will not abandon IBM; it has been working well for them so far, and I think it will work in the next gen too.



GO PATS! 2012 THE YEAR OF NEW ENGLAND PATRIOTS'S 4TH SUPER BOWL!

A patriot to the end. GO PATS!

Now playing> THE LAST STORY (Wii) Best RPG I EVER PLAYED. *-*

Nintendo could u please just take my money and give me back my 3DS?!

Alby_da_Wolf said:
Soleron said:

...

BTW, nothing forbids an AMD-IBM joint venture doing a GPU + PPC CPU instead of a GPU + x86 CPU chip. AMD's relationship with IBM is very good.

I suppose not, but Fusion (done right) is an extremely complex piece of design and I doubt you could just swap out the CPU like that, to a different architecture altogether. The market for the chips would be 50-100m units, and I'm not sure if that's enough to fund the design effort.

So assuming they're using an AMD GPU, it makes sense to use either Llano or Ontario (Bobcat + GPU) as the design is already done and they get the integration work at low cost. It'll still be on an IBM fabrication process (32nm SOI).



Squilliam said:
Joelcool7 said:

Nintendo's a loyal company; they will likely stick with IBM. Not to mention backwards compatibility issues. It's better to stick with PowerPC!

PowerPC is likely out, because I believe Nintendo will want a combined CPU+GPU on the same die. It means great savings in power, heat, packaging size and board complexity. That really only leaves AMD and Intel as viable options.

If the machine was designed for realtime ray-tracing it wouldn't necessarily need a GPU. Furthermore, the papers I've read involving the RPU (a different ray-tracing chip) seem to conclude that the kind of architecture you would want to pair with ray-tracing hardware is a so-called "broadband" architecture (high memory bandwidth).
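To make that point concrete, here's a minimal sketch (my own illustrative Python, not taken from any actual console or RPU design) of why a ray tracer is dominated by raw FP work rather than raster hardware: each ray test is just dot products and a square root.

```python
import math

def ray_sphere_hit(origin, direction, center, radius):
    """Return the distance to the nearest hit, or None on a miss.

    Pure floating-point math: ray tracing is dominated by dot
    products and square roots like these, which is why FP throughput
    and memory bandwidth matter more than rasterisation hardware.
    """
    # Vector from ray origin to sphere centre
    oc = [o - c for o, c in zip(origin, center)]
    a = sum(d * d for d in direction)
    b = 2.0 * sum(o * d for o, d in zip(oc, direction))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4.0 * a * c  # discriminant of the quadratic
    if disc < 0.0:
        return None             # ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / (2.0 * a)
    return t if t > 0.0 else None

# A ray along +z hits a unit sphere centred 5 units away at t = 4
print(ray_sphere_hit((0, 0, 0), (0, 0, 1), (0, 0, 5), 1.0))  # → 4.0
```

A real tracer repeats this millions of times per frame against an acceleration structure, which is where the "broadband" memory-bandwidth demand comes from.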



alephnull said:
...

If the machine was designed for realtime ray-tracing it wouldn't necessarily need a GPU. Furthermore, the papers I've read involving the RPU (a different ray-tracing chip) seem to conclude that the kind of architecture you would want to pair with ray-tracing hardware is a so-called "broadband" architecture (high memory bandwidth).

Nintendo is not a graphics technology company. As we've seen with the Wii and DS, they don't feel extra graphics power beyond a certain point (at the expense of cost, power and development effort) sells the console or its games.

Ray-tracing is far into the 'technology for its own sake' bracket, because its addition wouldn't sell those games that drive console momentum (Wii Sports, Wii Fit, NSMB Wii).

They will let competitors do this kind of thing first, so they can do the technologies that matter (touch screen, motion control) first.

 

Nintendo chose fixed-function graphics hardware for the Wii even when unified-shader chips were available. If Nintendo were going to use ray tracing, they would need a CPU faster than all current CPUs by a few orders of magnitude at FP ops, or a GPU more flexible in fundamental architecture than any present GPU. Neither exists at retail in 2010, 2011 or 2012 in a small enough power envelope or at a low enough price (Larrabee was 300W before it was cancelled, and still not fast enough to ray-trace anything).
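A rough back-of-envelope (every number below is a round assumption of mine, not a measurement) shows the scale of the FP gap for even a modest realtime target:

```python
# Back-of-envelope FP budget for realtime ray tracing at 720p.
# Every number here is a rough illustrative assumption, not a spec.
pixels = 1280 * 720          # one 720p frame
fps = 30                     # realtime target
rays_per_pixel = 100         # primary rays plus shadow/bounce samples
flops_per_ray = 3000         # acceleration-structure traversal + hit tests

needed_gflops = pixels * fps * rays_per_pixel * flops_per_ray / 1e9
cpu_2010_gflops = 100        # ballpark for a fast 2010 desktop CPU

print(needed_gflops)                    # ≈ 8294 GFLOPS required
print(needed_gflops / cpu_2010_gflops)  # ≈ 83x a fast 2010 CPU
```

Crank the samples per pixel up to film-quality levels and the shortfall grows by further orders of magnitude, which is the point.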




http://www.beyond3d.com/content/articles/94/

Good article on why ray-tracing isn't necessarily the way of the future. And before anyone slags off the site, these guys are some of the best analysts of graphical architectures and techniques around.

OT: x86 fails on a GFLOPS-per-mm² basis, which indirectly becomes GFLOPS per dollar. PowerPC is much more cost-effective, cramming more FP power into each mm² of silicon. For a company like Nintendo, I'd say that matters more than having the bleeding edge. Furthermore, in many ways IBM's fabs are more advanced than Intel's, keeping the processors on up-to-date processes.
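The GFLOPS-per-mm² point is simple division; with purely illustrative numbers (assumed for the sake of the arithmetic, not vendor specs) it looks like:

```python
def flops_density(gflops, die_mm2):
    """GFLOPS per mm^2 of silicon -- a rough proxy for FLOPS per
    dollar, since chip cost scales with die area (and yields fall
    as dies get bigger)."""
    return gflops / die_mm2

# Hypothetical assumed figures, NOT real parts: a dense FP-focused
# PowerPC-style design vs. a big out-of-order x86 with large caches.
ppc_like = flops_density(200.0, 120.0)
x86_like = flops_density(100.0, 250.0)
print(ppc_like / x86_like)  # → ~4.2x the FP throughput per mm^2
```

Whatever the exact figures, the ratio is what a cost-driven console vendor would be comparing.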

And lastly, Larrabee is currently vapourware. The first version never saw the light of day, and there is no guarantee the second will get out the door. The only reason Intel focuses on ray tracing so much is that, by its very design, Larrabee 1 sucked at rasterisation; the main justification Intel had for shitcanning Larrabee 1 was that its performance just wasn't competitive with its competitors' GPUs for the amount of silicon required.

If we think the Wii 2 will come out in 2012-13, the design will need to be finalised by mid-2011 at the latest. There has been no indication from Intel that Larrabee 2 would be close to ready by then.

Personally, I think there is more chance of Nintendo getting a Cell-lite from IBM than getting x86 from Intel. That way compatibility is maintained, performance is massively increased, and, more importantly, project risk is reduced. Going forward I see consoles relying less and less on raw CPU power and more and more on GPU power, in line with PCs.



Ray tracing is a tech feature. It won't help gaming get better.



A flashy-first game is awesome when it comes out. A great-first game is awesome forever.

Plus, just for the hell of it: Kelly Brook at the 2008 BAFTAs

Soleron said:

I think AMD provides for that need better. Not now, of course, but with Llano.

Consoles need a good GPU more than a CPU, and what they really need is low cost and therefore tight integration. Llano has four improved Phenom II cores and ~400 SPs of DX11-class graphics power, which is equivalent to a current HD56xx chip. The on-die memory controller eliminates the traditional integrated graphics bottleneck.

Llano is a two-chip [CPU+GPU, SB] solution (as opposed to Intel's three: [CPU + sucky GPU, good GPU, SB]). Intel's graphics are nowhere near good enough to power a console, so it would require an AMD or Nvidia graphics chip anyway, whereas AMD's solution gives you a good-enough CPU with an AMD GPU. Larrabee is not going to be ready in time; even Intel's 2013 CPU (Haswell) doesn't use a Larrabee GPU.

Llano's power use will also finally be competitive with Intel's mobile offerings, unlike current AMD chips, due to power gating and the C1E state.

Better yet, the technology is proven. Nintendo is conservative, but Llano has an existing CPU and an existing GPU. The new part is the low power focus and integration work. Llano is very late Q4 2010 or early Q1 2011 too so it would easily fit with a late 2011 or 2012 Wii successor, if Nintendo started planning this.

Essentially, if Nintendo is to carry on using an AMD GPU like the last three consoles, AMD will easily be able to tempt them into throwing a CPU on as well, given the unrivalled integration.

/thread

Soleron nailed why I think the Fusion is the best option for Nintendo.



alephnull said:
Squilliam said:
Joelcool7 said:

Nintendo's a loyal company; they will likely stick with IBM. Not to mention backwards compatibility issues. It's better to stick with PowerPC!

PowerPC is likely out, because I believe Nintendo will want a combined CPU+GPU on the same die. It means great savings in power, heat, packaging size and board complexity. That really only leaves AMD and Intel as viable options.

If the machine was designed for realtime ray-tracing it wouldn't necessarily need a GPU. Furthermore, the papers I've read involving the RPU (a different ray-tracing chip) seem to conclude that the kind of architecture you would want to pair with ray-tracing hardware is a so-called "broadband" architecture (high memory bandwidth).

I like the idea of Larrabee as an acceptable compromise, as it can do everything. However, it looks like the jig is up and my nefarious scheme for stimulating debate about more technical ideas in the Nintendo forum is coming to an end.



Tease.

Squilliam said:
...

I like the idea of Larrabee as an acceptable compromise, as it can do everything. However, it looks like the jig is up and my nefarious scheme for stimulating debate about more technical ideas in the Nintendo forum is coming to an end.

They already missed a raft of release dates, then all but cancelled it a few months ago (it should reappear in another form at some point, but...). Nintendo wouldn't bet on an unproven chip; they're not Sony.