
Intel courting Microsoft/Sony for Larrabee (GPU) deal

alephnull said:
Deneidez said:
If you have a car and a boat, they will perform much better than a boatcar. Similarly, 3x CELLs would have been nothing compared to the RSX's graphical abilities.

1) Not necessarily true; just look at the performance gain AMD achieved when they moved the memory controller on-chip, back in their days of obvious superiority.

Perform better at what? Maybe I want to drive through the Everglades; the boatcar would win hands down.

Anyway, modern GPUs are all turning into boatcars anyhow.

Except GPUs aren't sacrificing their graphical performance for general-purpose computing. Their graphical performance is just being wrapped up for general-purpose computing.



alephnull said:
Squilliam said:
alephnull said:

This all assumes Intel can even make a decent compiler for it. Remember the Itanic?

 

They are basing it on the x86 P1 processor. Since it has an x86 lineage, I doubt they will be hurting for a compiler.

The Itanic was a completely new architecture; this is x86-based.

 

 

I think you may be confusing the microarchitecture and the ISA (http://en.wikipedia.org/wiki/Instruction_set). The Cell uses the POWER ISA; does that mean it can use the same compiler as the Xenon, which also uses the POWER ISA?

They most likely based their first compiler on existing POWER ISA tools. The Cell uses a combination of old and new elements, and the new elements make up a larger proportion of the whole than they do in Larrabee.
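To make the ISA-vs-microarchitecture point concrete, here's a rough C sketch (illustrative only; the exact compilers and flags are my assumptions, not something from this thread). Code written against the shared POWER/VMX (AltiVec) instruction set can be built with the same compiler support for the Cell's PPE and the Xenon, however different their microarchitectures are, while the Cell's SPEs are a separate ISA and need their own toolchain and intrinsics:

/* Illustration only: the same source can target any chip that implements
 * the POWER + VMX (AltiVec) ISA, e.g. the Cell's PPE or the Xenon,
 * regardless of how different the microarchitectures underneath are.
 * Typically built with something like gcc -maltivec (toolchain-dependent). */
#include <altivec.h>

vector float add4(vector float a, vector float b)
{
    return vec_add(a, b);   /* one VMX add, the same on PPE and Xenon */
}

/* The Cell's SPEs use a different ISA entirely, so the equivalent code
 * needs the SPU toolchain (spu-gcc) and different intrinsics, e.g.:
 *
 *   #include <spu_intrinsics.h>
 *   vec_float4 add4(vec_float4 a, vec_float4 b) { return spu_add(a, b); }
 *
 * Same vendor, same box, but a new ISA means new compiler work. */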

 




Squilliam said:
Rainbird said:
Squilliam said:
MrBubbles said:
Would this be a good deal for Microsoft, or are there better options?

They could make Intel give them a pretty nice CPU as part of the deal and get it CHEAP. It would be quite an awesome deal for both parties if it did go through.

Intel gets one of the biggest/best software tool development companies in the world working on their chip and gains mass-market adoption. Microsoft gets a cheap deal. So it's a win/win if they can agree, really.

 

Except it could put Microsoft in the position so well known from the PS3, with complaining developers. And as far as I understand, Larrabee is a combined CPU and GPU...?

 

 

Except Microsoft makes excellent tools and has awesome developer relations. If they had made it, most likely the tools available in 2005 would have been much better.

 

Easy, maybe,

but they're memory whores.

Look at Visual Studio, UGH, it's a pain to work even with .NET in it.



Deneidez said:
alephnull said:
Deneidez said:
If you have a car and a boat, they will perform much better than a boatcar. Similarly, 3x CELLs would have been nothing compared to the RSX's graphical abilities.

1) Not necessarily true; just look at the performance gain AMD achieved when they moved the memory controller on-chip, back in their days of obvious superiority.

Perform better at what? Maybe I want to drive through the Everglades; the boatcar would win hands down.

Anyway, modern GPUs are all turning into boatcars anyhow.

Except GPUs aren't sacrificing their graphical performance for general-purpose computing. Their graphical performance is just being wrapped up for general-purpose computing.

From my understanding of the literature (which is not fantastic), there is a tradeoff; however, the flexibility is considered worth it. I am not an EE, so take my opinion with a grain of salt, but I do work with some EE guys who hold this opinion. It makes sense to me: invariants simplify problems.
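To put the "invariants simplify problems" point in code terms, here's a toy C sketch (purely illustrative; the function names and blend modes are made up, not taken from any real GPU). The same blend operation is written once with the mode fixed in advance, the way fixed-function hardware bakes decisions in, and once with the mode as a runtime input, the way fully programmable hardware has to stay general:

/* Toy illustration of "invariants simplify problems": a specialised path
 * with the decision made up front versus a general path that carries the
 * decision on every call. */
#include <stdint.h>

/* Invariant known up front: always additive blending. */
static inline uint8_t blend_fixed_add(uint8_t dst, uint8_t src)
{
    uint16_t sum = (uint16_t)dst + src;
    return sum > 255 ? 255 : (uint8_t)sum;
}

/* Fully general: mode decided at runtime, so every pixel pays for the choice. */
enum blend_mode { BLEND_ADD, BLEND_MULTIPLY };

static inline uint8_t blend_programmable(enum blend_mode m, uint8_t dst, uint8_t src)
{
    switch (m) {
    case BLEND_ADD: {
        uint16_t sum = (uint16_t)dst + src;
        return sum > 255 ? 255 : (uint8_t)sum;
    }
    case BLEND_MULTIPLY:
        return (uint8_t)(((uint16_t)dst * src) / 255);
    }
    return dst;
}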


I think Intel will have to show first that Larrabee can do what they claim it can. x86 chips are good multipurpose chips, so they make good CPUs, but if you just need a limited instruction set specialised for graphics output, the current ATI/Nvidia chips have a lot more power.
So there is still the question of how well a highly parallelised x86-based chip (which Larrabee should be) can compete with chips built purely for graphics. You can also see how much power the current Nvidia graphics chips have when you compare PhysX performance on them with PhysX performance on an Intel CPU, or even on a dedicated PhysX card (yes, Nvidia graphics cards destroy even those specialised physics boards in these benchmarks, and AMD GPUs would most likely do the same if they were supported).

Add to that Intel's history of big announcements for graphics chips that never lived up to expectations once they came out and finally ended up as their chipset graphics, and you can see why many in the industry are taking a wait-and-see position on Larrabee. If Larrabee can deliver what Intel promises then I'm sure it will sell, but people want to see first whether it delivers.
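For a sense of what "doing graphics in software on x86" actually costs, here's a minimal, hypothetical C sketch of the kind of inner-loop work a software renderer has to spend general-purpose instructions on (a triangle coverage test; the function names are mine, purely for illustration). Conventional GPUs handle this in dedicated rasteriser hardware, and Larrabee's bet was that many simple cores with wide vector units could brute-force it in software:

/* Hypothetical sketch: a scalar edge-function coverage test, the sort of
 * work a software renderer on general-purpose cores must spend instructions
 * on, while a conventional GPU does it in fixed-function rasteriser
 * hardware. Larrabee's answer was many cores plus wide vector units, so
 * imagine 16 of these tests running per vector instruction. */
#include <stdint.h>

/* Signed area term for edge a->b and point p: >= 0 means p lies on the
 * inside of the edge for a counter-clockwise triangle. */
static int32_t edge(int32_t ax, int32_t ay, int32_t bx, int32_t by,
                    int32_t px, int32_t py)
{
    return (bx - ax) * (py - ay) - (by - ay) * (px - ax);
}

/* Returns nonzero if pixel (px, py) is covered by triangle (v0, v1, v2). */
int pixel_covered(const int32_t v0[2], const int32_t v1[2],
                  const int32_t v2[2], int32_t px, int32_t py)
{
    return edge(v0[0], v0[1], v1[0], v1[1], px, py) >= 0 &&
           edge(v1[0], v1[1], v2[0], v2[1], px, py) >= 0 &&
           edge(v2[0], v2[1], v0[0], v0[1], px, py) >= 0;
}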




How will this affect backwards compatibility? MSFT, if they do this, will have had 3 different GPUs, one in each of its 3 consoles.



If Intel covers the whole of the hardware side of things (design, chips, etc.), then Microsoft would be free to do what's best: tap into the billions of dollars' worth of software engineering talent and interface research, and dream up something nice to compete with the Wii 2/PS4.

It's quite the partnership: it would let Microsoft leverage its x86 strength, and it should make the console darn easy to develop for.

An 8-core Sandy Bridge x86 CPU (the Nehalem successor) vs. Cell 2... which one would developers want to use? xD

@Halo gamer: it would have full backwards compatibility with the Xbox 1 (same architecture). The Xbox 360 would too, if they spent some time working on the compatibility, *I think*.

Quaiky "i think intel will have to show first that larrabee can do what they claim it can. x86 chips are good multipurpose chips so good cpus, but if you need just a limited instruction set thats specialised on doing gfx output the current ati/nvidia chips have a lot more power.
So there is still the question how good a highly parralellised x86 based chip (which larrabee should be) can compete with the chips that are specially built for just gfx purposes. you can also see how much power the current gfx chips from nvidia have as an example when you compare physx performance on them with physx performance on a intel cpu or even a physx card (yes nviddia gfx cards destroy even these specialised physics boards in tehse ebnchmarks, and amd gpus would most likely do it too if they would be supported)

add to that the history of intel when it comes to big announcements for gfx chips that neevr lived up to expectations once tehy came out and finally ended in their chipset graphics, and then you see why many in the industry are on a wait and see position when it comes to larrabee. if larrabee can deleiver what intel promises then i'm sure they will sell, but people want to see first if it delivers what it promises."

Intel's problem is a chicken-and-egg relationship. The power is likely going to be there: Larrabee is specced to have 1 TFLOP of double precision power, compared to 120 GFLOPS from the R700 and 20 from the Cell 1. The problem is that if they don't get developers working on it, tools won't be developed to exploit its power, and if its power isn't exploited, developers won't work on it.

Don't worry about PhysX; it's a highly parallel architecture, so it's likely to do fine with Intel's own brand of physics (Havok), and they don't actually need to run PhysX anyway.
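For what it's worth, headline numbers like "1 TFLOP" just fall out of core count x vector width x FLOPs per lane per cycle x clock speed. Here's a hedged back-of-envelope sketch in C; the figures are purely illustrative assumptions, not an official Larrabee spec:

/* Rough peak-throughput arithmetic with made-up illustrative numbers
 * (NOT an official Larrabee spec): peak FLOPS is simply
 * cores x SIMD lanes x FLOPs per lane per cycle x clock rate. */
#include <stdio.h>

int main(void)
{
    double cores       = 32.0;  /* assumed number of simple in-order cores */
    double simd_lanes  = 16.0;  /* assumed 512-bit vectors of 32-bit floats */
    double flops_cycle = 2.0;   /* assumed multiply-add per lane per cycle */
    double clock_ghz   = 1.0;   /* assumed clock speed */

    double peak_tflops = cores * simd_lanes * flops_cycle * clock_ghz / 1000.0;
    printf("Peak: %.2f TFLOPS under these assumptions\n", peak_tflops);
    return 0;
}

With those made-up numbers you land around 1 TFLOP, which is the kind of arithmetic behind the figures quoted above.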

 




A CPU/GPU combo would do wonders to reduce the price of a console... when are they going to have the hard drive built onto the CPU?



Squilliam said:
Noobie said:
A PS4 with 4 Cell chips, each having 8 SPUs, and a 32-to-48-core Larrabee GPU... that would be insane... but hugely expensive too, I think.

You got the insane part right, and that's why they wouldn't do it.

 

 

Sony will do the Cell CPU part in the PS4. I'd go with 8 of them (so 64 SPUs) at 6 GHz (IBM has already run Cells at this speed). It'll be really, really cheap by then, and devs will be really, really good with the Cell architecture by then.

Larrabee, who knows.

 



Squilliam said:

 

 

Except Microsoft makes excellent tools and has awesome developer relations. If they had made it, most likely the tools available in 2005 would have been much better.

Except the lack of tools is a one-off mistake that Sony can learn from and overcome.  And it already has, as evidenced by the tools it's developing and giving away, as well as, apparently, one free visit from the Sony CELL experts (I think some are in Santa Monica, some in London) to devs wanting to see how to best optimize their shit.  I certainly wouldn't count on Sony making a similar mistake (lack of early tools/support) next gen.