
Forums - Sony Discussion - Why Sony should also use a Cell Processor for PS5 (x86+Cell coprocessor)

vivster said:

I know I shouldn't be, but I'm still surprised by your lack of basic technological understanding. Well, they do say hate comes from ignorance, so that explains a lot.

We do have a lot of people on VGC who are really knowledgeable in this stuff. Instead of fighting them you should consult them first before doing threads like these.

Is that your argument against mine? Throwing out some ad hominem?

ClassicGamingWizzz said:
Op convinced me with the amazing quality arguments and research, top notch, learned a lot too.

Make ps consoles great again, bring back the power of cell !!!

When AMD sends over its Jaguar, they are not sending the best. They are not sending Ryzen, they are sending  processors that have lots of problems and they are bringing those problems. They are bringing lag, they are bringing 20 frames. They are rapists and some, I assume, are good CPUs, but I speak to Digital Foundry frame rate guards and they are telling us what we are getting haha




I was hoping for a Cell processor in the PS4. Thing is, they brought in Mark Cerny around 2009, when a lot of developers were still struggling with the Cell and with ports, especially third parties. By the time they were working on the PS4's components, most if not all devs had become pros with the Cell processor, and you had guys like Gabe Newell who at first talked shit about the Cell, then released Portal 2 on PS3 arguably better than the Xbox version, and we saw a trend of multiplats looking better on PS3. Sadly it was too late by then; the PS4 was already in the works. Had Mark Cerny or Sony execs known how well received the Cell would become, I feel they would have continued with the Cell processor technology. Maybe they would have gone with 16 or 24 Cell units on an advanced architecture. But the cost of the console would probably be higher.



Would probably be a devs nightmare.



duduspace11 "Well, since we are estimating costs, Pokemon Red/Blue did cost Nintendo about $50m to make back in 1996"

http://gamrconnect.vgchartz.com/post.php?id=8808363

Mr Puggsly: "Hehe, I said good profit. You said big profit. Frankly, not losing money is what I meant by good. Don't get hung up on semantics"

http://gamrconnect.vgchartz.com/post.php?id=9008994

Azzanation: "PS5 wouldn't sold out at launch without scalpers."

KratosLives said:
I was hoping for a Cell processor in the PS4. Thing is, they brought in Mark Cerny around 2009, when a lot of developers were still struggling with the Cell and with ports, especially third parties. By the time they were working on the PS4's components, most if not all devs had become pros with the Cell processor, and you had guys like Gabe Newell who at first talked shit about the Cell, then released Portal 2 on PS3 arguably better than the Xbox version, and we saw a trend of multiplats looking better on PS3. Sadly it was too late by then; the PS4 was already in the works. Had Mark Cerny or Sony execs known how well received the Cell would become, I feel they would have continued with the Cell processor technology. Maybe they would have gone with 16 or 24 Cell units on an advanced architecture. But the cost of the console would probably be higher.

The Cell was never well received, except maybe by Kojima and Sony's first parties.

Developers simply got used to its needlessly complicated architecture as the gen went on. 

Going the X86 route was the smart thing to do. Now developers don't have to throw money into learning yet another alien console setup, and can simply focus on optimizing their games for platforms that all speak the same language.



pxrocks said:
They can, but it's going to make the NGPS more expensive, which may slow down its adoption rate. They'd rather use those resources to actually make the NGPS more powerful and use that power to soft-emulate PS3 games. Placing two CPUs inside a gaming console isn't good for anything. IMO they should make the NGPS as affordable as they can. If it launches at $350 worldwide, it's guaranteed to break some records.

Technically the PlayStation 4 has two CPUs anyway.

You have the 8-core Jaguar group (which is actually two quad-core clusters), and then you have the secondary ARM CPU with its own DDR3 RAM to assist with background tasks.

Teeqoz said:

Even then, the whole BC argument is stupid... A PS5 will easily be powerful enough to emulate the PS3 if Sony wants it (the XBO can, after all, emulate the 360. Now, while the PS3 is afaik a lot harder to emulate, the PS5 should also be a lot more powerful than the XBO...). Not to mention PS2 and PS1. That only leaves the PS4 which is easily achieved by sticking to x86.

The PlayStation 3 emulator is actually further along than the Xbox 360 emulator, so the Cell hasn't been a hindrance in that regard.

captain carot said:

Thing is, everything where CELL really excelled falls under those parallel workloads. Cloth simulation, for instance: you wouldn't actually run that on a CPU today. In other cases CELL was kind of a nightmare for programmers.

Many things CELL was intended for are simply done GPGPU-style today; for the rest there are far more efficient approaches.

Not really. Modern CPUs not only have superior serial performance to the Cell, they have significantly superior parallel capabilities as well. Hyper-Threading has been a massive boon for parallel processing, and right now the PC seems to be in the middle of the "Core Wars", thanks in part to AMD of course.

However, both pale in comparison to a modern GPU's capabilities on that front.
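For what it's worth, the diminishing returns of simply piling on CPU cores can be sketched with Amdahl's law. The numbers below are purely illustrative, not measurements of any real CPU:

```python
def amdahl_speedup(parallel_fraction, n_cores):
    """Amdahl's law: ideal speedup on n_cores when only part of a workload parallelizes."""
    serial_fraction = 1.0 - parallel_fraction
    return 1.0 / (serial_fraction + parallel_fraction / n_cores)

# A workload that is 90% parallel gains less and less from extra cores:
print(round(amdahl_speedup(0.9, 8), 2))   # 8 cores  -> 4.71x
print(round(amdahl_speedup(0.9, 64), 2))  # 64 cores -> 8.77x
```

This is why both wide CPUs and GPUs only pay off when the workload actually parallelizes; the serial part caps everything.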

Ruler said:

Microsoft won't be around when the PS5 launches.

Lies.

Ruler said:

But I tell you one thing: they wouldn't composite the extra power from a Cell processor inside a PS5 by just adding in a Xeon from the 360 for their next Xbox, because that would turn the BC roadmap they have taken so far into a waste.

You are still under the false illusion that the Cell is actually powerful? Really? Are you being serious or are you just trolling?

Ruler said:

What you just don't get is that Ryzen is pretty much all Sony could get out of AMD in 2019, looking at how expensive they are now.

False.

Ruler said:

It doesn't just work that way; you can't simply add in an extra core to a CPU. You either buy an affordable 8-core, or a super expensive 12- or 16-core, looking at Ryzen.

False.

Ruler said:

the Cell is just being an extra processor for some extra graphics.

If you think you have ample knowledge about Cell, then please describe those "extra graphics" to me.
What effects are you talking about, exactly?

Zkuq said:
I'm no expert, but I'm fairly sure combining even exactly identical processors isn't easy to do efficiently. There's a reason even multicore development is challenging, let alone multiprocessor development. Here we have a suggestion to combine a fairly standard processor with possibly the hardest processor ever to develop for in a console. I'd say this sounds like a recipe for a disaster.

Combining identical processors has actually been a thing for over a decade, but there are different approaches to doing so, each with its own caveats.

You have systems with two separate physical processors; these tend to be the hardest to program for, as there are a ton of caveats, chiefly communication between the two chips.

Then you have two separate physical chips on one processor package. This is the approach Intel took with some Core 2 Quad processors, packaging two dual-cores to make a quad-core, and it worked well; from a developer's standpoint you would never have known.
AMD took that same idea to the extreme with Threadripper, putting multiple CPU dies on one processor, which lets them scale core counts to 16+ cores and 32 threads.

And for the Xbox One, Xbox One X, PlayStation 4 and PlayStation 4 Pro, AMD took two quad-core Jaguar units and combined them into an 8-core processor. That has its own caveats, especially when it comes to cross-communication between the two quad-core clusters, but for the most part it has been a non-issue for developers.
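The cross-cluster caveat can be illustrated with a toy cost model. The latency numbers here are invented for the example, not real Jaguar figures:

```python
def avg_access_cost(local_ns, remote_ns, remote_fraction):
    """Average memory-access cost when some fraction of accesses cross to the other cluster."""
    return local_ns * (1.0 - remote_fraction) + remote_ns * remote_fraction

# Keeping a thread's working set inside its own cluster keeps the average cost low:
print(avg_access_cost(20, 100, 0.05))  # mostly local accesses
print(avg_access_cost(20, 100, 0.50))  # half the accesses cross clusters
```

Schedulers and engine programmers mitigate this by pinning related threads and their data to the same cluster, which is why it stays a mostly invisible problem.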

KratosLives said:
Had Mark Cerny or Sony execs known how well received the Cell would become, I feel they would have continued with the Cell processor technology. Maybe they would have gone with 16 or 24 Cell units on an advanced architecture. But the cost of the console would probably be higher.

The Cell was never well received.

Ruler said:

Yes, you can have as good a GPU as you want; it won't achieve 60fps while still using the same Jaguar CPU clocked at 1.6-2.2GHz. The Xbox One X is the definitive proof of that. Microsoft had all the time in the world to fix it; they went that way because 4K 30fps was the goal from the beginning.

That is completely up to the developer and where the game is bottlenecked. If you are GPU-bound, then doubling GPU performance can mean the difference between 30fps and 60fps, regardless of the CPU being used.

Ruler said:

So now i have to philosophy about the numbers 4 and 18? I can tell you what i know about this topic, that 4 is almost a fourth of the number 18 thats all you need to know about.

I mean if this is all too complicated for you and me, I can also express it visually by showing some pictures of the Last of Us on PS3  and ask you if it can hold a candle against Uncharted 4?

You are not even making any sense.
You can have a GPU with more GFLOPS perform slower than a GPU with fewer GFLOPS. Do you wish for me to provide some evidence for this, as I have in other threads?
Because the GPU industry is littered with examples where this is the case.
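As a toy illustration of why peak GFLOPS alone doesn't decide real-world speed: what matters is how much of that peak a workload actually sustains, which depends on memory bandwidth, architecture, and drivers. The utilization figures below are invented for the example:

```python
def effective_gflops(peak_gflops, sustained_utilization):
    """Rough effective throughput: peak rate scaled by the fraction a workload actually extracts."""
    return peak_gflops * sustained_utilization

# A 'weaker' GPU on paper can win if it sustains a higher fraction of its peak:
gpu_a = effective_gflops(4000, 0.5)  # 4 TFLOPS peak, 50% sustained
gpu_b = effective_gflops(3000, 0.8)  # 3 TFLOPS peak, 80% sustained
print(gpu_a, gpu_b)  # GPU B comes out ahead despite the lower peak
```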

Ruler said:
So are you, because you never provide any evidence for anything

The difference is, I can. And if you desire for me to do so, I will be happy to provide evidence for all my points going forward.

However, I think over my years on VGChartz I have conducted myself with enough proficiency that my knowledge of technology can be taken with a degree of seriousness.
Unless you are claiming you have more intimate knowledge of these topics than I do?

Ruler said:
So what if it is using PowerPC? Just like Macs did, or the Wii U. Doesn't mean it's bad; just look at Wii U games like Bayonetta 2 or the latest Zelda, all praised for their graphics and scale, and that wasn't even a Cell.

The Wii U and Switch don't have good graphics.
That doesn't mean the games can't have great artistic flair. Did you not watch Digital Foundry's breakdown of Zelda's imagery?
There were a ton of graphical sacrifices made.

But because you desire evidence... Here you go.
Sub-720p resolution: http://www.eurogamer.net/articles/digitalfoundry-2017-zelda-breath-of-the-wild-uses-dynamic-resolution-scaling

Drops to 20fps were a thing: http://www.eurogamer.net/articles/2017-03-31-zelda-patch-improves-framerate-on-switch

Poor Texture Filtering: http://www.eurogamer.net/articles/digitalfoundry-2017-the-legend-of-zelda-breath-of-the-wild-face-off

Poor Anti-Aliasing, draw distance on things like grass also leaves much to be desired, low-quality shadowing.



Ruler said:
There are various benchmark showcasing how the Cell can render a lot of stuff without the GPU.

And? All CPUs can render. But if you think that rendering demo is somehow superior to what RSX or a modern CPU can give us... then you are kidding yourself.

Here is Unreal, which could run on a 300MHz Pentium II; think: worse than the original Xbox's CPU.



Doesn't mean the Cell is great at rendering. (And if the image quality in the video you posted, with no physics, high-quality lighting, shadowing, particles, AI and so on, is anything to go by... eww.)

Ruler said:
Yeah an integer calculation designed for x86.

What the hell?
https://en.wikipedia.org/wiki/Integer


Ruler said:
Sure, the Jaguar is great, but any processor would be with 8 gigs of RAM and a powerful GPU. Put that with the Cell and you would see the same performance in many games, if not better in some.

Except, no.
Also... RAM has no processing capability of its own, so it doesn't actually "speed up" anything by itself.

I demand you provide evidence that the Cell would deliver the same or better performance as Jaguar when paired with a powerful GPU and plenty of RAM.

Because the games say otherwise:
Battlefield 4 supports more than twice as many players per multiplayer match on PS4 (64) as on PS3 (24), and player count is very much CPU-driven, not memory- or GPU-driven.
See here: http://www.bfcentral.net/bf4/battlefield-4-ps3/
And here: https://battlelog.battlefield.com/bf4/news/view/bf4-launches-on-ps4-with-64-players/

Ruler said:
Like PUBG, as an example: it would probably run better on the Cell than on a Jaguar processor, simply because it's designed for dual- or quad-core processors on PC, not 8 cores.

I have PUBG on PC. I have 12 threads. I can assure you, PUBG utilizes them all.



Ruler said:
And that was my main point, that the Cell wasn't supposed to render the entire game if it's just a co-processor.

Just like Fetch. Cell is never going to happen, so stop trying to make it happen.

Ruler said:

It doesn't matter what is better than XDR2; XDR2 is better than GDDR5 and is needed for the Cell. All the other types of RAM aren't out there in any relevant form and are probably ultra expensive too.

But you made the statement that XDR2 is the best. You were wrong.
Here, go brush up on your RAM tech; clearly your information is stuck a decade in the past.

https://en.wikipedia.org/wiki/High_Bandwidth_Memory
https://www.anandtech.com/show/9969/jedec-publishes-hbm2-specification
https://www.anandtech.com/show/9266/amd-hbm-deep-dive

Note: GDDR5X and GDDR6 reach 16Gbps per pin, in stark contrast to the roughly 12Gbps per pin of XDR2.
https://www.anandtech.com/show/12186/micron-finishes-gddr6-internal-qualification

Not to mention, even DDR2 on a wide enough bus can be made faster than GDDR5 or XDR2 anyway.
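That bus-width point is just arithmetic: peak bandwidth is the per-pin rate times the bus width, divided by eight bits per byte. A quick sketch (the bus widths below are hypothetical examples, not any specific product):

```python
def bandwidth_gbs(bus_width_bits, gbps_per_pin):
    """Peak memory bandwidth in GB/s: one pin per bus bit, 8 bits per byte."""
    return bus_width_bits * gbps_per_pin / 8

# Per-pin speed alone doesn't decide the winner; bus width matters just as much:
print(bandwidth_gbs(256, 12))  # 384.0 GB/s, an XDR2-like 12 Gbps/pin on a 256-bit bus
print(bandwidth_gbs(512, 6))   # 384.0 GB/s, half the pin rate on a twice-as-wide bus
```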


Ruler said:

Yes it has everything to do with Developers and also gamers cheerleading for their favorite hardware monopolies.

I demand evidence for your baseless conspiracy theory.

Ruler said:

Look at PS4 exclusives Uncharted 4, Driveclub and Bloodborne; they look beyond anything that runs on any Nvidia hardware, and that despite the weak Jaguar CPUs.

Get back to me when they are 4K 60fps.
All the games listed would look better on PC running high-end nVidia graphics.

Ruler said:

Software optimisation is all that matters; consoles aren't affected by that because it's only one or two pieces of hardware of the same brand, in this case on ''weak'' AMD architecture.

You make it sound like the PC doesn't get any kind of optimization. That would indeed be an ignorant assumption on your behalf.
Here an AMD driver increased performance by up to 8%:
http://www.guru3d.com/articles-pages/radeon-crimson-driver-december-2016-performance-analysis,1.html

Or hows about a 20% increase?
https://techreport.com/review/29357/amd-radeon-software-crimson-edition-an-overview

Or hows about the performance increase that DirectX 12 brought to the table, bringing low-level, console-like efficiency to the PC?
https://www.extremetech.com/gaming/246377-new-directx-11-vs-directx-12-comparison-shows-uneven-results-limited-improvements

Or Hows about the performance increase that Vulkan brought to the table, based upon AMD's Mantle tech?
https://www.anandtech.com/show/11223/quick-look-vulkan-3dmark-api-overhead

Or hows about the performance gains with Windows 10?
https://www.digitaltrends.com/computing/windows-10-review-gaming-performance/

Or when games release patches to increase performance.
https://www.extremetech.com/gaming/246840-new-ashes-singularity-update-substantially-boosts-ryzen-performance

Don't ever make the ignorant assumption that the PC never gets any kind of optimization.



--::{PC Gaming Master Race}::--

KLAMarine said:
I'm just here to read up on what tech-heads make of this suggestion.

Did you come to a conclusion after reading what the tech-heads had to say?



If you demand respect or gratitude for your volunteer work, you're doing volunteering wrong.

Pemalite said: 
Ruler said:

...

Lies.

Ruler said:

...

Are you being serious or are you just trolling?

Ruler said:

...

False.

Ruler said:

...

False.

Ruler said:

...

You are not even making any sense.

Ruler said:
...

The difference is. I can. - And if you desire for me to do so, I will be happy to provide evidence for all my points going forward.
Unless you are claiming you have more intimate knowledge on these topics than I?

Ruler said:
...

And? All CPU's can render. But if you think that render video is somehow superior to what RSX or a modern CPU can give us... Then you are kidding yourself.

Ruler said:
...

What the hell?

Ruler said:
...

Except, no.

Ruler said:
...

Just like Fetch. Cell is never going to happen, so stop trying to make it happen.

Ruler said:

I...

But you made the statement that XDR2 is the best. You were wrong.

Ruler said:

...

I demand evidence for your baseless conspiracy theory.

Ruler said: 

...

You make it sound like PC doesn't get any kind of optimization? That would indeed be an ignorant assumption on your behalf.

This is hilarious. Ruler is like console Trump^^

*edited for your viewing pleasure*




I'm not quoting that huge post, but Ruler: that PS3 video of the Cell rendering a city looks like shit even for the era. The Wii could practically render a city like that.



I think a CPU bottleneck is more desirable than a sales bottleneck. The Cell was a problem for developers, and I don't think it's a good idea to repeat that mistake. The PS4 is easy to develop for and port to; that is intentional, and I think it was the best choice.

The only thing preventing 60fps is the developers/publishers themselves. If they get more power, they will once again focus on resolution and graphics rather than frame rate.



Pemalite said:


captain carot said:

Thing is, everything where CELL really excelled falls under those parallel workloads. Cloth simulation, for instance: you wouldn't actually run that on a CPU today. In other cases CELL was kind of a nightmare for programmers.

Many things CELL was intended for are simply done GPGPU-style today; for the rest there are far more efficient approaches.

Not really. Modern CPUs not only have superior serial performance to the Cell, they have significantly superior parallel capabilities as well. Hyper-Threading has been a massive boon for parallel processing, and right now the PC seems to be in the middle of the "Core Wars", thanks in part to AMD of course.

However, both pale in comparison to a modern GPU's capabilities on that front.

 

 


Ruler said:
Like PUBG, as an example: it would probably run better on the Cell than on a Jaguar processor, simply because it's designed for dual- or quad-core processors on PC, not 8 cores.

I have PUBG on PC. I have 12 threads. I can assure you, PUBG utilizes them all.



 

First, we're in the same boat about multiple cores, for games as well as apps. I was referring to the stuff where CELL actually seems to shine, like cloth simulation and other work that parallelizes that well. If not strictly necessary, you wouldn't do that on the CPU.

In other cases, as far as I've always understood CELL, the architecture can be a massive hindrance. Even IBM's improved PowerXCell didn't work out as planned in the end.

But the PS3's CELL actually depends on dividing the workload across as many SPEs as possible, which wasn't that easy and sometimes didn't work at all. So games with good multithreading should definitely benefit from multicore CPUs if they were built to use all the SPEs. That doesn't give CELL any advantage over 'real' multicore CPUs though, at least in my understanding.