
Dual GPU processing...for future consoles?

I think that Sony/MS will go with one chip with multiple cores (integrated CPU and GPU, and in the near future CPU, GPU and HDD; that already exists now with a 1TB HDD). Something like Larrabee maybe, but a little different. The major plus is MUCH faster data transmission and lower power consumption. I can see a PS4 with an integrated CPU and GPU in one chip with, let's say, 60 cores. This is going to happen. It's only a matter of time.



disolitude said:
Mr Khan said:

My thoughts were that we were moving away from separate GPUs entirely and towards integrated chipsets. I imagine that Sony is at least going to have the CELL pull the whole workload for the PS4, like it was originally intended to do for the PS3, and certainly not move to multi-GPU rendering.


I honestly don't see that happening.

I mean...sure, they can try that and spend billions of dollars developing a CPU which can substitute for a GPU. In the meantime, the Xbox 720 will just phone ATI and have them deliver a current-gen GPU.

I think Sony has learned their lesson with the PS3 and all the money they lost. I read that the Cell's yield rate when they started making PS3s was somewhere around 25%...with 75% of the chips being a waste. And this is with the downgraded Cell processor, which is missing one of the SPEs...

Haven't they already done that? They've basically spent the money and proven that you can get excellent (for a console) graphics performance with a relatively weak GPU. Devs are now familiar enough with t3h C3ll that it's not going to be a barrier to game development for the PS4 to keep going along the same path, and given the main development work has been done, it's not going to take billions in R&D to beef up the horsepower enough to rid themselves entirely of 3rd party GPUs. The PS4 could figuratively be 2 PS3s duct taped together, only with the GPU removed, couldn't it? I bet Sony could beef up the Cell (is 2 x Cell 1.5 a possibility?) into a console cheaper than what MS would have to pay for a current-gen CPU/GPU combination.

With the 360 having a superior GPU but developers struggling to match PS3 graphics, and (at present) unable to surpass them, Sony is basically showing the world that GPUs can be a thing of the past. So why would Sony turn around and pursue a dual-GPU route in any way, shape or form?

Perhaps Sony has a bigger picture in mind with diminishing the importance of GPUs, or eventually making them entirely irrelevant, in its console. If you can eliminate as much reliance on 3rd party hardware as possible then it means the profits are going to you, not them. Or rather your losses are diminished.



“The fundamental cause of the trouble is that in the modern world the stupid are cocksure while the intelligent are full of doubt.” - Bertrand Russell

"When the power of love overcomes the love of power, the world will know peace."

Jimi Hendrix

 

brazylianwisnia said:
I think that Sony/MS will go with one chip with multiple cores (integrated CPU and GPU, and in the near future CPU, GPU and HDD; that already exists now with a 1TB HDD). Something like Larrabee maybe, but a little different. The major plus is MUCH faster data transmission and lower power consumption. I can see a PS4 with an integrated CPU and GPU in one chip with, let's say, 60 cores. This is going to happen. It's only a matter of time.

My personal guess is Llano, AMD's 32nm CPU+GPU+Northbridge chip due 2011.



In theory a second GPU can actually give a boost of over 100%. If the load is divided right, it can take away certain restrictions or bottlenecks that you run into when trying to have one GPU do a lot of things at the same time.

The thing is, no PC game can be fully optimized for multi-GPU systems, since most PCs don't have such a setup. But if a console used it, developers could fully optimize the games for it.

It could also be a cheaper solution, since you could use GPUs that are a bit lower in power but have proven fabrication processes and much higher yields, as opposed to trying to get the maximum out of an (unknown) chip.

Also, with every die shrink you save twice as much, since you're using two chips (and two slower chips would each use a smaller die anyway).
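To put rough numbers on the "divide the load right" point above, here is a toy cost model (plain Python; the millisecond figures are invented for illustration, not measurements). It splits each frame into a serial part that can't be divided (scene setup, sync, compositing) and a parallel part that can, and shows how the boost from extra GPUs depends on how small the serial part is:

    # Toy per-frame cost model for split-frame rendering across multiple GPUs.
    # All numbers are made up for illustration; they are not benchmarks.

    SERIAL_MS = 4.0     # work that can't be split: scene setup, sync, compositing
    PARALLEL_MS = 28.0  # rasterization/shading work that scales with screen area

    def frame_time_ms(num_gpus):
        """Frame time when the parallel work is divided evenly across GPUs."""
        return SERIAL_MS + PARALLEL_MS / num_gpus

    for gpus in (1, 2, 3):
        t = frame_time_ms(gpus)
        boost = frame_time_ms(1) / t - 1.0
        print(f"{gpus} GPU(s): {t:.1f} ms/frame ({1000 / t:.0f} fps), boost vs 1 GPU: {boost:.0%}")

With these made-up numbers the second GPU gives roughly a 78% boost; the smaller the serial slice gets (which is exactly the kind of thing a fixed console spec lets developers optimize for), the closer the second GPU gets to the full 100%.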



binary solo said:
disolitude said:
Mr Khan said:

My thoughts were that we were moving away from separate GPUs entirely and towards integrated chipsets. I imagine that Sony is at least going to have the CELL pull the whole workload for the PS4, like it was originally intended to do for the PS3, and certainly not move to multi-GPU rendering.


I honestly don't see that happening.

I mean...sure, they can try that and spend billions of dollars developing a CPU which can substitute for a GPU. In the meantime, the Xbox 720 will just phone ATI and have them deliver a current-gen GPU.

I think Sony has learned their lesson with the PS3 and all the money they lost. I read that the Cell's yield rate when they started making PS3s was somewhere around 25%...with 75% of the chips being a waste. And this is with the downgraded Cell processor, which is missing one of the SPEs...

Haven't they already done that? They've basically spent the money and proven that you can get excellent (for a console) graphics performance with a relatively weak GPU. Devs are now familiar enough with t3h C3ll that it's not going to be a barrier to game development for the PS4 to keep going along the same path, and given the main development work has been done, it's not going to take billions in R&D to beef up the horsepower enough to rid themselves entirely of 3rd party GPUs. The PS4 could figuratively be 2 PS3s duct taped together, only with the GPU removed, couldn't it? I bet Sony could beef up the Cell (is 2 x Cell 1.5 a possibility?) into a console cheaper than what MS would have to pay for a current-gen CPU/GPU combination.

With the 360 having a superior GPU but developers struggling to match PS3 graphics, and (at present) unable to surpass them, Sony is basically showing the world that GPUs can be a thing of the past. So why would Sony turn around and pursue a dual-GPU route in any way, shape or form?

Perhaps Sony has a bigger picture in mind with diminishing the importance of GPUs, or eventually making them entirely irrelevant, in its console. If you can eliminate as much reliance on 3rd party hardware as possible then it means the profits are going to you, not them. Or rather your losses are diminished.

As pointed out already, Cell is dead.

Sony sold Cell completely to IBM, the whole rights. IBM decided to stop production of Cell entirely. No more R&D.

Why? Again, as pointed out in this thread: GPUs on PC eat Cell alive. They now have GPUs doing the kind of multiprocessing Cell was designed for much, much more efficiently, and CPU manufacturers have CPUs doing the rest of the tasks much better than Cell can.

The future is clearly GPUs being made more into CPUs and helping with those tasks.

If the PS4 is to have a Cell, Sony needs to front up a lot of money for R&D, without IBM funding this time, to make Cell advancements. Yet it's pointless, considering GPU multiprocessing is clearly the way forward.



disolitude said:

What do you guys think? Will this ever become a reality down the road?

Most PC users are familiar with this... Buy a video card, and when it's time to upgrade, add another of the same type, which gives a GPU processing boost of up to 80%. Here is an example of some video cards and their performance with Crysis. The Nvidia GTX 260, for example, went from 25 fps with 1 card, to 36 fps with 2 cards, to a whopping 57 fps with 3 GPUs in one machine.

I am frankly surprised that neither Microsoft nor Sony has provided us with the ability to connect 2 360s/PS3s together and have them share the graphics workload for a game.

Obviously each game would still have to run fine on a single console, but by doing this they could offer enthusiast gamers native 1080p resolution, 3D gaming, a 60 frames per second frame rate...etc.

So what do you guys think? Will consoles ever go down this road? It sure would be nice to see...

Actually, I thought I saw an article about GT5 Prologue where you could do that and get a 1440p resolution.
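For what it's worth, the scaling in those quoted Crysis figures is easy to work out. A quick sketch (plain Python; it assumes the 36 fps figure is the two-card result, as the progression implies):

    # Scaling worked out from the Crysis fps figures quoted above.
    fps = {1: 25, 2: 36, 3: 57}  # number of cards -> frames per second

    for cards, rate in fps.items():
        speedup = rate / fps[1]
        efficiency = speedup / cards  # fraction of ideal linear scaling retained
        print(f"{cards} card(s): {rate} fps, {speedup:.2f}x vs 1 card, {efficiency:.0%} of linear")

So even tripling the hardware only gets about 2.3x the frame rate on PC; closing that gap is exactly what console-level optimization for a fixed multi-GPU setup would be aiming at.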



 

If the cost-benefit is positive and worth it, then it will happen. Imo it is a great idea which would allow for a very long timeframe between releases, and from a purely business standpoint you want to avoid that if you can.



ssj12 said:
disolitude said:

What do you guys think? Will this ever become a reality down the road?

Most PC users are familiar with this... Buy a video card, and when it's time to upgrade, add another of the same type, which gives a GPU processing boost of up to 80%. Here is an example of some video cards and their performance with Crysis. The Nvidia GTX 260, for example, went from 25 fps with 1 card, to 36 fps with 2 cards, to a whopping 57 fps with 3 GPUs in one machine.

I am frankly surprised that neither Microsoft nor Sony has provided us with the ability to connect 2 360s/PS3s together and have them share the graphics workload for a game.

Obviously each game would still have to run fine on a single console, but by doing this they could offer enthusiast gamers native 1080p resolution, 3D gaming, a 60 frames per second frame rate...etc.

So what do you guys think? Will consoles ever go down this road? It sure would be nice to see...

Actually, I thought I saw an article about GT5 Prologue where you could do that and get a 1440p resolution.


Sometimes it helps when you read the whole thread... I did see that, and I mentioned it in 2 different posts. That was something a little different from GPU scaling though.

That involved 4 PS3s each rendering a different quarter of the screen at 1080p...and combining the images into 1 giant screen. And I'd love to see this tech available for purchase as well...lol
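That quarter-per-console split is straightforward to picture. Here is a minimal sketch (plain Python; the console names are hypothetical placeholders) of assigning four 1080p quadrants of one combined image, one per PS3, following the description in the post above:

    # Minimal sketch: assign four 1080p quadrants of one combined image,
    # one per console. Console names are hypothetical placeholders.

    TILE_W, TILE_H = 1920, 1080   # each PS3 renders a full 1080p frame
    GRID_COLS, GRID_ROWS = 2, 2   # 2x2 arrangement of tiles

    consoles = ["ps3-a", "ps3-b", "ps3-c", "ps3-d"]

    for index, console in enumerate(consoles):
        col, row = index % GRID_COLS, index // GRID_COLS
        x, y = col * TILE_W, row * TILE_H  # top-left corner in the combined image
        print(f"{console}: renders {TILE_W}x{TILE_H}, placed at ({x}, {y})")

    print(f"combined image: {GRID_COLS * TILE_W}x{GRID_ROWS * TILE_H}")

Each console only ever renders a normal 1080p frame; the "giant screen" comes purely from where each console's output is placed in the final image.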



flacomeza said:
disolitude said:
Soleron said:
disolitude said:
...


I agree that having dual GPUs in a single console is not an option due to heat, power consumption...etc. However, don't you think having 2 consoles connected in a way where they can share the GPU workload could provide some benefits?

You are thinking about it from a PC perspective...and yeah, it doesn't make sense for a console.

And where is the market that would pay for it? I always try to think in economic terms, not technological ones.

Even if a few thousand people are prepared to pay for it, a two-tier hardware capability like this is completely against the concept of a console anyway.

Well, if a company is making money on a console...and you need to buy 2 consoles and 2 games to make this work, then there is some potential to sell this to the public. Crytek, Naughty Dog, Epic and other top-tier developers would probably jump at the opportunity to get a performance edge in the console market.

I am not saying I have the business plan for this model or anything...but I am hoping that someone at Microsoft or Sony does :)

Or what if, because they have the extra processing power of another console, they don't max out the tech of a single console? I see a little evil in this formula...


Then the developers would be giving an inferior-looking game to 90% of the people playing it. I doubt more than 10% of console users would buy 2 systems and 2 copies of the game for some visual benefits...