
Rumour: Playstation 4 GPU is by Intel

Intel GPUs aren't that great... besides, Sony isn't in any position to make another console, financially.




What? PS4? But, that thing won't be out 'till late 2016. Kinda early to start talking about it already, now isn't it?



Nintendo Network ID: Cheebee   3DS Code: 2320 - 6113 - 9046

 

I really didn't understand any of that technical speak.



I hope my 360 doesn't RRoD
         "Suck my balls!" - Tag courtesy of Fkusmot

@Soleron: those are simulated cores, not the real thing!

@Thread: There's no real reason to duplicate general-purpose compute between a Cell variant and LRB; they would be better off with a simple but fast GPU with better perf per mm^2 than LRB, and keeping the compute operations on the Cell processor.



Tease.

I think Sony might consider making their own GPU again instead of using Nvidia, since Nvidia is underperforming this gen for them.



GAMERTAG IS ANIMEHEAVEN X23

PSN ID IS : ANIMEREALM 

PROUD MEMBER OF THE RPG FAN CLUB THREAD

ALL-TIME FAVORITE JRPG IS : LOST ODYSSEY

http://gamrconnect.vgchartz.com/thread.php?id=52882&page=1

Bitmap Frogs said:
You mean Sony has decided to use an untested architecture without available tools (except for an extremely small number of people) and no track record to speak of?

After the troubles they've been having because of the exotic Cell?

I don't think so. I'm willing to eat crow on this one a few years down the road if it turns out to be true, though...

MS continuing their partnership with ATI is interesting, given how intertwined MS, the X360 and DirectX are. This symbiotic relationship will probably become tighter down the road...

Doesn't that fit Sony's current approach of making things hard to code for?  I thought Kaz said that is their secret ingredient.

 



richardhutnik said:
Bitmap Frogs said:
You mean Sony has decided to use an untested architecture without available tools (except for an extremely small number of people) and no track record to speak of?

After the troubles they've been having because of the exotic Cell?

I don't think so. I'm willing to eat crow on this one a few years down the road if it turns out to be true, though...

MS continuing their partnership with ATI is interesting, given how intertwined MS, the X360 and DirectX are. This symbiotic relationship will probably become tighter down the road...

Doesn't that fit Sony's current approach of making things hard to code for?  I thought Kaz said that is their secret ingredient.

 

 

Indeed.

Such an approach worked well for them in the past (or rather, it didn't harm them enough; I'm not sure which is correct). However, these days, after the Cell hardships, they might have a different perspective on the issue. To be honest, I'm just speculating on Sony execs following the pendulum swing, which is something that might not necessarily happen.





Current-gen game collection uploaded on the profile, full of win and good games; also most of my PC games. Lucasfilm Games/LucasArts 1982-2008 (Requiescat In Pace).

If I were to explain all the reasons why Sony won't go with Larrabee, my post would span two pages. To keep things short, I will just list a few and clarify them on request.

1.) Another unproven architecture will tick off developers.
2.) Intel wouldn't let Sony own the IP, meaning the chip would always be expensive. Remember Nvidia and the Xbox?
3.) Sony adding Larrabee to the PS4 would help Intel push it to developers, but it wouldn't be a tenth as effective as if Microsoft did it. Unlike Microsoft, Sony doesn't use DirectX or OpenGL; they use their own graphics API, which would make PS4 -> PC ports hard.
4.) It would break BC with the PS3.

#2 and #1 are really the biggest ones.



Good news Everyone!

I've invented a device which makes you read this in your head, in my voice!

Soleron said:
hduser said:

I'd find it hard to believe. Intel's not a big player in the GPU market, and its history with its own graphics core is very spotty. The only thing that would lend some credence to the rumour is that Sony needs a partner to drive down the cost of the PS4, even though it might take a hit performance-wise. The other reason to use Intel is to hope that Nvidia or ATI can counter with a better offer or chip.

Intel's completely redesigning their GPU. It will be nothing like their integrated graphics. They've put up to 48 modified Pentium cores on a chip, redesigned their FP capability and memory bus, and ended up with a workable GPU that runs both PC software AND graphics.

To give you an idea of the performance:

"Intel's SIGGRAPH 2008 paper describes simulations of Larrabee's projected performance.[7] Graphs show how many 1 GHz Larrabee cores are required to maintain 60 FPS at 1600x1200 resolution in several popular games. Roughly 25 cores are required for Gears of War with no antialiasing, 25 cores for F.E.A.R with 4x antialiasing, and 10 cores for Half-Life 2: Episode 2 with 4x antialiasing. It is likely that Larrabee will run faster than 1 GHz, so these numbers are conservative.[13] Another graph shows that performance on these games scales nearly linearly with the number of cores up to 32 cores. At 48 cores the performance scaling is roughly 90% of linear."

So 48 cores will be enough to run next-gen console games for certain.
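
A quick back-of-the-envelope check of those figures (my own arithmetic, not from the paper; the scaling-efficiency and clock assumptions are labelled in the comments):

    # Rough scaling arithmetic implied by the quoted SIGGRAPH figures.
    # Assumptions (mine, not Intel's): performance scales linearly with
    # clock speed, and the "roughly 90% of linear" efficiency applies
    # at 48 cores.

    def projected_fps(cores, clock_ghz, cores_for_60fps_at_1ghz, efficiency=1.0):
        # FPS estimate relative to the paper's 60 FPS at 1 GHz baseline.
        return 60.0 * cores * clock_ghz * efficiency / cores_for_60fps_at_1ghz

    # Gears of War, no AA: 25 one-GHz cores give 60 FPS (the baseline).
    print(projected_fps(25, 1.0, 25))        # 60.0
    # 48 cores at 1 GHz with ~90%-of-linear scaling:
    print(projected_fps(48, 1.0, 25, 0.9))   # ~103.7
    # The same chip at a hypothetical 2 GHz clock:
    print(projected_fps(48, 2.0, 25, 0.9))   # ~207.4

So even at a conservative 1 GHz, the quoted numbers put a 48-core part well above the 60 FPS target for those games.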

 

I must agree more with hduser: a GPU based on modified general-purpose Pentium cores, able to run both x86 software and graphics, may look like a good idea at first thought, but on second thought there's a high risk it will end up either a jack of all trades that isn't really outstanding at graphics, or a bloated device with a horrible performance/power-consumption ratio.

I may be wrong, but Intel's record in this field is nothing to write home about.

 



Stwike him, Centuwion. Stwike him vewy wuffly! (Pontius Pilate, "Life of Brian")
A fart without stink is like a sky without stars.
TGS, Third Grade Shooter: brand new genre invented by Kevin Butler exclusively for Natal WiiToo Kinect. PEW! PEW-PEW-PEW! 
 


They put a lot of R&D into getting the CELL to work. Unless the CELL has been more trouble for Sony than they've let on (a lot more), they're going to want to stick with it and optimize their investment, especially in this day of fiscal conservatism. It'll be all CELL for the PS4, as they'll have it doing what it was originally supposed to do: serve as the integrated solution.

 

Unless things have gone vastly wrong with the CELL, Sony has every incentive to keep using it, and to use it for graphics as well.



Monster Hunter: pissing me off since 2010.