joeorc said:
Soleron said:
...

I don't think it's about direct replacement of GPUs; it's about helping the dedicated GPU.

The way I see it, it's a way to help the GPU get better results: offload the processes that are better suited to the CPU, and let the GPU do what it does best, which is draw.

Yes, the quality of the chip could indeed let it be used in embedded systems to lower overall cost, but I think the main goal is for it to overcome some of the memory-wall problems that plague current designs.

Intel seems to be pushing it as a GPU replacement, not a CPU replacement or a third processor. They're signing up board partners the way AMD/Nvidia have. They were even targeting next-gen consoles, though delays and missed performance targets stopped that.

The eventual goal, for both AMD and Intel, is to put the GPU on the CPU die and then use each one for the tasks it's suited for. Hence AMD's Fusion and Intel's GPU die on the CPU package.

Larrabee is certainly a high-end product, considering it needs its own card and will draw something like 300W. For low-power and embedded, Intel is putting its G4x-type graphics on the CPU package (Clarkdale and Arrandale: dual-core Nehalem + GPU; Pine Trail: Atom + GPU).