
The PS3 not performing as badly as some claim?

MikeB said: I don't think all game developers are lazy, especially not the many coming from an Amiga background, like the guys at Factor 5 (of Turrican fame, developing Lair for the PS3 right now). The classic Amiga is still popular within the demoscene, and such developers tend to push every last ounce of performance out of ancient hardware configurations, and actually have fun doing so. Look at it as a rewarding challenge. For instance, the people behind Max Payne/Futuremark (demoscene group Maturefurk) made an excellent demo for the Amiga in 2001, winning the competition at Assembly, the biggest demoscene event in the world. The Lapsuus demo, on a 50 MHz 68k Amiga with the AGA chipset (released in 1992): http://youtube.com/watch?v=2aoIsWxGWHQ

As for Jorge (Jürgen Schober), a software engineer posting at AmigaWorld who also toured an early version of AmigaOS4 around Europe in 2003 (Article here): he has been very excited about the Cell since he got his PS3 (he moved to the US). He thinks the Cell is a lot of fun to experiment with and offers lots of potential. He also thinks Linux already runs well on the PS3, and OS4 should run even better since it is optimized to run efficiently (it already runs on an old Amiga 1200, released in 1992, expanded with a 166 MHz PPC, 128 MB RAM and ancient video hardware).
I haven't seen that one before. Sure blows the PS1 out of the water graphically.



PSN ID: Kwaad


I fly this flag in victory!


MikeB said: A quick port of OS4 to the PS3 should run faster than any A1 or Peg II.
So far this is right. The real problem in the case of the Cell lies in a different place. Each component by itself is quite fast, and the components are not so difficult to program. The real problem is their interaction. The Cell was in fact designed with some very special purposes in mind:

1. Cluster computing. You write a program that calculates the changes in a certain node up to the next timestep, then you load this program into the SPEs. The PPE, on the other hand, acts as a kind of intelligent cache: it loads the current data of the nodes and hands it to free SPEs. The SPEs return their computed values to the PPE, which gives them their next values and writes the results back to memory. The results become the start values for the next timestep.

2. Pipeline. You divide a problem into certain independent steps, where one step calculates the start values for the next. You implement an SPE program for each of these steps, then load the SPEs with those programs. Every time all the SPEs are finished, they pass their results on to the next SPE.

These are the basic design targets of the whole Cell. To support them, each SPE has its own local memory of 256 KB, and the SPEs and the PPE are connected through a common ring bus.

The problem of the Cell in the PS3: what do these concepts have to do with a game? It is really hard to imagine a game that can be implemented in the first model. But the second model shows the real problem. You have to find parts that can run independently of each other and consume nearly the same amount of time, and you would like to have exactly as many independent parts as you have available SPEs. You have a problem if you have even one more, or if one of your parts does not fit into the 256 KB of local memory. The SPEs only work efficiently if they work most of the time in their local memory. If they need a result from another SPE, they have to send a message over the bus and wait for the reply; if they have to load values from main memory directly, they have to wait even longer.

And if you look at these demands, you will see that they are a bit unrealistic. In most cases the steps will not finish in the same amount of time; in reality, some steps will be calculated much faster than others. And even if you were able to find a solution in which only a few parts were two times faster than the others, it would be a totally static solution. You are working on a game: the demands change rapidly, and there are always game designers demanding changes because the game does not feel right. You do not have the time to find perfect static solutions; instead, you have to be able to react to changes. Big (but not too big, because of the 256 KB limit) monolithic structures can't work this way. It is easier to work with more, smaller units, but then you have to load the programs first.

In fact the Xbox 360 could have a big advantage here: its three cores (more or less twins of the PPE) have a unified cache. Since the processor is not forced to swap memory into a local store, you can rapidly create multiple working clients of the same type on the different cores. You don't have to deal with the code differences between SPE and PPE, and you don't even have to supervise the different processes so tightly, because the processor will do it for you.

If you look at the articles, the explanations about the use of the Cell stay very abstract in most cases, and with good reason: there are no simple methods if you want to work SPE-based.
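To make the pipeline model above concrete, here is a minimal PPE-side sketch using IBM's libspe2 from the Cell SDK. The stage binaries (stage1.elf, stage2.elf) are hypothetical placeholders assumed to be built separately with spu-gcc, and the inter-stage DMA/mailbox traffic a real pipeline needs is omitted; this only shows how stage programs get loaded onto SPEs and run concurrently.

```c
/* PPE-side sketch of the "pipeline" model described above, using
 * IBM's libspe2. The SPE programs (stage1.elf, stage2.elf) are
 * illustrative names, not part of the SDK. */
#include <libspe2.h>
#include <pthread.h>
#include <stdio.h>

#define NUM_STAGES 2

static const char *stage_images[NUM_STAGES] = { "stage1.elf", "stage2.elf" };

/* spe_context_run() blocks until the SPE program stops, so each
 * pipeline stage gets its own PPE thread. */
static void *run_stage(void *arg)
{
    spe_context_ptr_t ctx = (spe_context_ptr_t)arg;
    unsigned int entry = SPE_DEFAULT_ENTRY;

    /* argp/envp would normally carry effective addresses of the
     * buffers each stage streams through its 256 KB local store. */
    if (spe_context_run(ctx, &entry, 0, NULL, NULL, NULL) < 0)
        perror("spe_context_run");
    return NULL;
}

int main(void)
{
    spe_context_ptr_t ctx[NUM_STAGES];
    pthread_t thread[NUM_STAGES];

    for (int i = 0; i < NUM_STAGES; i++) {
        spe_program_handle_t *img = spe_image_open(stage_images[i]);
        if (!img) { perror("spe_image_open"); return 1; }

        ctx[i] = spe_context_create(0, NULL);  /* one context per SPE */
        spe_program_load(ctx[i], img);         /* load stage program  */
        pthread_create(&thread[i], NULL, run_stage, ctx[i]);
    }

    /* The stages now run concurrently. If one stage is slower than
     * the others, the whole pipeline stalls at its pace -- exactly
     * the load-balancing trap described above. */
    for (int i = 0; i < NUM_STAGES; i++) {
        pthread_join(thread[i], NULL);
        spe_context_destroy(ctx[i]);
    }
    return 0;
}
```

The host side links against -lspe2 -lpthread. Notice there is nothing here that balances work between stages; keeping all the SPEs equally busy is left entirely to the programmer, which is the point of the post above.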



@ Kwaad

I haven't seen that one before. Sure blows the PS1 out of the water graphically.
Actually the Amiga 1200 was turned into a CD-based games console in 1993, called the Amiga CD32. Sadly, most games were straight A500 and A1200 ports without much enhancement (so an A1200 game which took one diskette on the A1200, like for instance Naughty Ones, would use 0.1% of the available CD storage space). Still, the CD32 managed to secure over 50% of the fledgling CD-ROM market in the UK in 1993 and 1994, outselling the Mega-CD, the Philips CD-i and even PC CD-ROM sales, until Commodore went bankrupt in 1994 due to losses in its PC branch. There were some good game releases like Banshee, The Chaos Engine and Alien Breed: Tower Assault, and there are a couple of similarities with the PS3! The CD32 could be expanded with a keyboard, mouse, disk drive, hard drive, FMV (movies), etc., turning the console into a genuine home computer running AmigaOS 3.1! http://www.amiga.org/modules/myalbum/photo.php?lid=2798&cid=20

Sadly the CD32 had a lot more potential than was ever realized; the much older Amiga 500 (1 MB RAM, 7 MHz) from the 80s, however, was pushed much closer to its limits by game developers. Famous A500 demo (880 KB diskette), State of the Art: http://www.youtube.com/watch?v=ay2Qyc45Uks&mode=related&search=



Naughty Dog: "At Naughty Dog, we're pretty sure we should be able to see leaps between games on the PS3 that are even bigger than they were on the PS2."

PS3 vs 360 sales

For something like that, I can't help but admit it's very impressive.



Nobody is crazy enough to accuse me of being sane.

@ vanguardian1 The PS3 could solve the Amiga community's current hardware problems; the OS is finally done, but where's the hardware? The A1s served their purpose as a development platform for moving 68k code to PPC, but such hardware offers limited potential for actually expanding the userbase if targeted at ordinary people. While these systems were still available, about 1500 A1 systems were sold together with the OS4 developer pre-release. These motherboards (the mini-ITX MicroA1 with onboard graphics, sound, a 256 MB RAM module and an 800 MHz G3) alone cost more than a complete PS3 system! PS3s are surely going to sell more units than all Amiga models combined. Consider the price of even a low-end Amiga 500: this computer launched for $595.95 in 1987, which, adjusted for inflation, translates roughly into $1056.25 for 2006. From that perspective the PS3's (multi-functional) launch price doesn't sound bad; add $100 for AmigaOS and it's still well below the A500's launch figure.
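As a rough sanity check on that inflation figure (assuming US CPI-U annual averages of about 113.6 for 1987 and 201.6 for 2006; these values are not from the original post):

$$\$595.95 \times \frac{\mathrm{CPI}_{2006}}{\mathrm{CPI}_{1987}} \approx \$595.95 \times \frac{201.6}{113.6} \approx \$1058$$

which lands within a couple of dollars of the $1056.25 quoted above.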



Naughty Dog: "At Naughty Dog, we're pretty sure we should be able to see leaps between games on the PS3 that are even bigger than they were on the PS2."

PS3 vs 360 sales


That's true, but then again, the PS3 will still be mainly a game console, and $600 is still a lot of money for a lot of us. I'd like one, but at this rate I and many others probably won't own one till 2010 or 2011. :-/



Nobody is crazy enough to accuse me of being sane.

Interesting news is expected to come out of GDC 2007 today. Interesting info regarding the Cell and Linux: "This week at Game Developers Conference IBM will show a Linux based PS3 real-time rendering a complex (3 million triangle) urban landscape, at 1080p resolution, using only software rendering techniques" Weblink



Naughty Dog: "At Naughty Dog, we're pretty sure we should be able to see leaps between games on the PS3 that are even bigger than they were on the PS2."

PS3 vs 360 sales

MikeB said: Interesting news is expected to come out of GDC 2007 today. Interesting info regarding the Cell and Linux: "This week at Game Developers Conference IBM will show a Linux based PS3 real-time rendering a complex (3 million triangle) urban landscape, at 1080p resolution, using only software rendering techniques" Weblink
I was actually impressed until I read the article, MikeB. You left this part out of your quote:

"Even though the PS3’s RSX is inaccessible under Linux the smart little system will reach out across the network and leverage multiple IBM QS20 blades to render the complex model, in real-time, with software based ray-tracing. Using IBM’s scalable iRT rendering technology, the PS3 is able to decompose each frame into manageable work regions and dynamically distribute them to blades or other PS3s for rendering. These regions are then further decomposed into sub-regions by the blade’s Cell processors and dynamically dispatched to the heavy lifting SPEs for rendering and image compression. Finished encoded regions are then sent back to the PS3 for Cell accelerated decompression, compositing, and display."

The Cell renders that scene with the aid of multiple Cells from IBM QS20 blade servers. The way you quoted it, I was under the impression that the Cell rendered the scene completely on its own.
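The decomposition IBM describes (frame into regions, regions into sub-regions for the SPEs) is a classic tile-splitting scheme. Here is a small conceptual sketch of just the tiling and dispatch step; every name in it (Tile, dispatch, the 64-pixel tile size, the worker count) is illustrative, and nothing is taken from IBM's actual iRT API:

```c
/* Conceptual sketch of splitting a 1080p frame into work regions
 * for distributed rendering. All types and names are illustrative. */
#include <stdio.h>

#define FRAME_W 1920
#define FRAME_H 1080
#define TILE    64          /* 64x64 pixel work regions */

typedef struct { int x, y, w, h; } Tile;

/* Stand-in for handing a region to a blade or a local SPE. */
static void dispatch(const Tile *t, int worker)
{
    printf("worker %d: render tile (%d,%d) %dx%d\n",
           worker, t->x, t->y, t->w, t->h);
}

int main(void)
{
    int workers = 8, next = 0;

    /* Walk the frame in tile-sized steps; edge tiles are clipped
     * so the decomposition covers exactly 1920x1080. */
    for (int y = 0; y < FRAME_H; y += TILE) {
        for (int x = 0; x < FRAME_W; x += TILE) {
            Tile t = { x, y,
                       (x + TILE > FRAME_W) ? FRAME_W - x : TILE,
                       (y + TILE > FRAME_H) ? FRAME_H - y : TILE };
            dispatch(&t, next);
            next = (next + 1) % workers;  /* round-robin for brevity */
        }
    }
    return 0;
}
```

A real scheduler would hand tiles to whichever blade or SPE is idle rather than round-robin, and the rendered, compressed tiles would flow back for decompression and compositing, as the article describes.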



http://www.ps3forums.com/showthread.php?t=22858 That link is an excellent read. I don't think it is biased, and if you can stomach the whole thing it is good info to know. The PS3 by far has the most computing power, the most graphics power, the fastest memory transfer, and the most room for development to grow into in the future. Much like the PS2 (which has now achieved graphics better than the Xbox and GameCube; see God of War II if you don't believe me), the PS3 has room to grow.

Also, if you listen to certain developers (Factor 5), they make strong arguments as to why the PS3 does make sense for developers if they put the work in. (Some of you might remember Factor 5 did the Rogue Squadron series, which people went insane over on GameCube because of how real the movies looked and how the gameplay felt.) They are making a title named Lair, and if anyone has seen the screens, it looks excellent. I believe they mentioned they chose the PS3 because of its graphics, computational power, and motion control. They flat out said the game would not be possible on the Wii (due to the size of the levels, the graphics, and the AI involved), and they didn't do it on the 360 because it didn't make sense without motion control. You're flying a dragon; obviously it might be nice to have motion control. They have some good interviews where they talk about the PS3 architecture and taking advantage of it when programming for the PS3.