Squilliam said:
Deneidez said:
Squilliam said:
Deneidez said:
Squilliam said:
It doesn't do floating point code as well as a GPU.

Actually, it sometimes does the job better than a GPU for floating-point calculations; it really depends on the application. For raster rendering, though, it's kind of useless, and that's usually all a GPU does.

Certainly not in raw throughput. I've been reading up on the compute shader in the Direct3D SDK and it's looking pretty cool. By the time the PS4 is out we'll probably be looking at the draft spec, if not the final result, for Direct3D 12. I was talking more about the future, since the software to take advantage of the Cell's unique abilities hasn't been written yet either.

Well, it has been shown that the Cell can do raytracing nicely compared to a regular CPU or GPU:

http://web.archive.org/web/20080123121030/http://gametomorrow.com/blog/index.php/2007/09/05/cell-vs-g80/

(GPU architecture just isn't made for raytracing. :P)
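
To make the divergence point concrete: a raytracer's inner loop branches per ray, which is exactly what lockstep GPU hardware of that era dislikes. Here's a minimal C sketch of a ray-sphere test (the Ray/Sphere types and intersect() are invented for illustration, not taken from any real tracer; the ray direction is assumed normalized). On a G80 the 32 threads of a warp execute in lockstep, so when neighbouring rays disagree at the early-out test both paths get serialized, whereas each SPE runs its own independent control flow over its share of rays.

[code]
#include <math.h>

/* Hypothetical types for illustration; direction (dx,dy,dz) is normalized. */
typedef struct { float ox, oy, oz, dx, dy, dz; } Ray;
typedef struct { float cx, cy, cz, r; } Sphere;

/* Nearest positive hit distance along the ray, or -1.0f on a miss. */
static float intersect(const Ray *ray, const Sphere *s)
{
    float ox = ray->ox - s->cx;
    float oy = ray->oy - s->cy;
    float oz = ray->oz - s->cz;
    float b  = ox * ray->dx + oy * ray->dy + oz * ray->dz;
    float c  = ox * ox + oy * oy + oz * oz - s->r * s->r;
    float disc = b * b - c;
    if (disc < 0.0f)          /* early out: most rays miss, and      */
        return -1.0f;         /* neighbouring rays diverge right here */
    float t = -b - sqrtf(disc);
    return (t > 0.0f) ? t : -1.0f;
}
[/code]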

 

Anyway, there's no real need for raytracing or extra floating-point power in today's games, so from a gaming point of view it really doesn't matter.

 

I predict that MikeB will show up soon and start pasting things he doesn't really understand himself.

And ray-tracing isn't made for real time!

 

I remember seeing a real-time raytracing demo using the CBE a few months back. A quick Google search yields all kinds of info from developers/programmers on real-time raytracing using the PS3 (a rough sketch of how these demos split the work follows the links):

http://www.gametrailers.com/player/usermovies/227820.html (clustered PS3s running Linux)

http://eric_rollins.home.mindspring.com/ray/ray.html (individual programmer project)

http://www.alphaworks.ibm.com/tech/irt (interactive raytracer for CBE)
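
We don't have the iRT source, but the general shape of these demos is easy to sketch: rays are independent, so the frame splits into bands with no synchronization between workers, which is why the workload scales across the six SPEs PS3 Linux exposes and even across clustered machines. Below is a hedged approximation in plain C with POSIX threads standing in for SPE contexts; trace_pixel() is a stub, and none of the names come from the actual projects.

[code]
#include <pthread.h>
#include <stdio.h>

#define WIDTH   720
#define HEIGHT  480
#define WORKERS 6   /* one per SPE available under PS3 Linux */

static unsigned int framebuffer[WIDTH * HEIGHT];

/* Stub: a real tracer would shoot a ray per pixel here. */
static unsigned int trace_pixel(int x, int y)
{
    return (unsigned int)(x ^ y);
}

typedef struct { int y0, y1; } Band;

static void *render_band(void *arg)
{
    Band *b = arg;
    for (int y = b->y0; y < b->y1; y++)
        for (int x = 0; x < WIDTH; x++)
            framebuffer[y * WIDTH + x] = trace_pixel(x, y);
    return NULL;
}

int main(void)
{
    pthread_t tid[WORKERS];
    Band bands[WORKERS];
    int rows = HEIGHT / WORKERS;

    /* Bands share nothing, so they need no locking: this independence
     * is what lets raytracing scale across SPEs and clustered PS3s. */
    for (int i = 0; i < WORKERS; i++) {
        bands[i].y0 = i * rows;
        bands[i].y1 = (i == WORKERS - 1) ? HEIGHT : (i + 1) * rows;
        pthread_create(&tid[i], NULL, render_band, &bands[i]);
    }
    for (int i = 0; i < WORKERS; i++)
        pthread_join(tid[i], NULL);

    printf("rendered %dx%d in %d bands\n", WIDTH, HEIGHT, WORKERS);
    return 0;
}
[/code]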

So the CBE is not the problem for the PS3. The issue is mainly the lack of memory and the reliance upon GPUs that simply age too quickly to stay on the edge of graphical development.

The RSX is the rough equivalent of an Nvidia G70/7800 GPU, commonly used in high-end gaming PCs back in 2006. Emphasis: back in 2006. Most of the cards based on that GPU shipped with only 256MB of VRAM as well.

You'd be hard-pressed to find a single serious gamer using a G70-based graphics solution unless they were using an old laptop or were too poor to afford a cheap G94- or G92-based video card for even $100. Most serious gamers have already moved forward two generations, to an R770/4000-series ATI GPU or a GTX 200-series Nvidia GPU.

 

So the point is pretty much that, as end users, most non-tech-savvy individuals are not going to see the merits of the CBE as a developing technology.

The gullible will buy it simply because they're overly susceptible to marketing claims and want to boast about having something they don't even fully understand.