
Rumour: Playstation 4 GPU is by Intel

Wow! Would be an epic fight!

Big bad Sony/Intel versus underdog MS/AMD.

Go AMD!




I'd find it hard to believe. Intel's not a big player in the GPU market, and its history with its own graphics cores is very spotty. The only thing that would lend some credence to the rumour is that Sony needs a partner to drive down the cost of the PS4, even if it takes a hit performance-wise. The other reason to use Intel is the hope that Nvidia or ATI will counter with a better offer or chip.



hduser said:

I'd find it hard to believe. Intel's not a big player in the GPU market, and its history with its own graphics cores is very spotty. The only thing that would lend some credence to the rumour is that Sony needs a partner to drive down the cost of the PS4, even if it takes a hit performance-wise. The other reason to use Intel is the hope that Nvidia or ATI will counter with a better offer or chip.

Intel's completely redesigning their GPU. It will be nothing like their integrated graphics. They've put up to 48 modified Pentium cores on a chip, redesigned their FP capability and memory bus, and ended up with a workable GPU that runs both PC software AND graphics.

To give you an idea of the performance:

"Intel's SIGGRAPH 2008 paper describes simulations of Larrabee's projected performance.[7] Graphs show how many 1 GHz Larrabee cores are required to maintain 60 FPS at 1600x1200 resolution in several popular games. Roughly 25 cores are required for Gears of War with no antialiasing, 25 cores for F.E.A.R with 4x antialiasing, and 10 cores for Half-Life 2: Episode 2 with 4x antialiasing. It is likely that Larrabee will run faster than 1 GHz, so these numbers are conservative.[13] Another graph shows that performance on these games scales nearly linearly with the number of cores up to 32 cores. At 48 cores the performance scaling is roughly 90% of linear."

So 48 cores will be enough to run next-gen console games for certain.
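
To put rough numbers on that, here's my own back-of-envelope arithmetic (a sketch only, assuming the paper's quoted figures of 48 cores, ~90% scaling, and 25 cores for the heaviest game hold up in real silicon):

    #include <cstdio>

    int main() {
        // Figures quoted from Intel's SIGGRAPH 2008 Larrabee paper (see above).
        const double cores         = 48.0;  // largest simulated configuration
        const double scaling       = 0.90;  // ~90% of linear scaling at 48 cores
        const double cores_for_gow = 25.0;  // 1 GHz cores for Gears of War at 60 FPS, 1600x1200

        const double effective = cores * scaling;            // effective 1 GHz cores of throughput
        const double headroom  = effective / cores_for_gow;  // margin over the heaviest quoted game

        // Prints: 43.2 effective cores, about 1.7x the Gears of War requirement.
        printf("%.1f effective cores, %.1fx headroom\n", effective, headroom);
        return 0;
    }

In other words, roughly 43 cores' worth of throughput against a 25-core requirement, and that's before any clock speed above 1 GHz.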

 






Soleron said:
...

So 48 cores will be enough to run next-gen console games for certain.


48 cores? Does each of these cores need to be accessed individually, or can devs just throw code at the GPU and let it sort it out itself?




It would be fun if Sony chose a pricey but unproven hardware solution again.

Instead of "OMG the Cell" this gen, it would be a constant "OMG raytracing" if Sony put Laughabee in the PS4.



Soleron said:
nojustno said:
http://www.techradar.com/news/gaming/sony-shoots-down-intel-gpu-in-ps4-rumours-525563

/thread

Yep, I did say it was a rumour. But notice they didn't deny it outright.

And Intel's GPU is in about the same state as any potential ATI or Nvidia future GPU. All of them are equally 'vapourware' from the perspective of a console designer. It's not impossible that a next-gen console will use Larrabee.

 

 

Strictly, that's true.

However, future projects from the two major suppliers of GPUs and a future GPU from Intel aren't exactly the same.





Current-gen game collection uploaded on the profile, full of win and good games; also most of my PC games. Lucasfilm Games/LucasArts 1982-2008 (Requiescat In Pace).

Mistershine said:
...

 

48 cores? Does each of these cores need to be accessed individually, or can devs just throw code at the GPU and let it sort it out itself?

Both (if everything goes to plan). Intel will ship a standard DirectX/OpenGL driver that 'looks' like a normal GPU to a game and divides the work across the cores, OR you can program each core individually like a CPU (see the sketch below). I think they'll also have their own special API, like CUDA or Stream, where the task splitting is semi-automated.
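
To illustrate the 'program each core like a CPU' half of that: the cores are x86, so in principle it's the same model as multithreading on a PC. Below is a minimal sketch using standard C++ threads purely as a stand-in; Larrabee's actual native API was never published, and the names here (shade_slice, the per-core pixel striping) are my own invention:

    #include <cstdio>
    #include <thread>
    #include <vector>

    // Hypothetical per-core task: each "core" shades every num_cores-th pixel.
    // Ordinary C++ threading standing in for Larrabee's CPU-style model; the
    // real chip would layer its wide vector units on top of code like this.
    void shade_slice(int core_id, int num_cores, std::vector<float>& framebuffer) {
        for (std::size_t px = core_id; px < framebuffer.size(); px += num_cores)
            framebuffer[px] = 0.5f;  // stand-in for actual shading work
    }

    int main() {
        const int num_cores = 48;                     // one thread per rumoured core
        std::vector<float> framebuffer(1600 * 1200);  // resolution from the quoted benchmarks

        std::vector<std::thread> cores;
        for (int id = 0; id < num_cores; ++id)
            cores.emplace_back(shade_slice, id, num_cores, std::ref(framebuffer));
        for (auto& t : cores) t.join();

        printf("shaded %zu pixels on %d threads\n", framebuffer.size(), num_cores);
        return 0;
    }

The DirectX/OpenGL driver path would hide all of that: the game submits normal draw calls, and the driver does the splitting.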

 



Do we need a rumour for everything? I'm getting sick of all the rumours. No offense to the thread creator.



My Mario Kart Wii friend code: 2707-1866-0957

NJ5 said:
Do we need a rumour for everything? I'm getting sick of all the rumours. No offense to the thread creator.

I think it's fun reading rumours, as long as you don't take them seriously. But I agree it can be annoying (and I'm well aware I'm guilty of it too). Perhaps all rumours should go in Off Topic regardless of subject? Could that be a forum rule?