
Killzone 2 VS Gears of War 2 through the eyes of a programmer

jetrii said:
Final-Fan said:
So you're saying the people using PS3 clusters are just ignorant?
Not at all. Like I said, GPUs are not fit for every task and there are times when a PS3 cluster would be more efficient and effective. However, GPGPU is fairly new to a lot of people and they may not know too much about it, or may not have the funding or time to sit down and relearn everything. Instead, they decide to go with a familiar system, even if they have to get accustomed to the Cell processor.

Please stop putting words into my mouth.

Wait, what?  How is it a familiar system if they have to get accustomed to it? 

Also, when you said "There are tasks which GPUs are not fit for but most of the PS3 projects I've seen involve tasks which a GPU can do much more efficiently for much less", I don't think it's entirely unreasonable to conclude that you meant that (most) people using PS3 clusters were ignorant of the superior GPU solution.  Apparently that's not an accurate assessment of your position, but that's a reason why I sometimes rephrase someone's position and spit it back -- to see if I'm understanding it correctly.  I guess I'll avoid that technique with you, but I don't think I need to apologize for it. 



Tag (courtesy of fkusumot): "Please feel free -- nay, I encourage you -- to offer rebuttal."
- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
My advice to fanboys: Brag about stuff that's true, not about stuff that's false. Predict stuff that's likely, not stuff that's unlikely. You will be happier, and we will be happier.

"Everyone is entitled to his own opinion, but not his own facts." - Sen. Pat Moynihan
- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
The old smileys: ; - ) : - ) : - ( : - P : - D : - # ( c ) ( k ) ( y ) If anyone knows the shortcut for , let me know!
- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
I have the most epic death scene ever in VGChartz Mafia.  Thanks WordsofWisdom! 


@jetrii

Oh I see, thanks for clearing that up. I wanted to get into developing games and all that but decided to go down a different career path :( Still working with computers anyway, so not a total loss...

Nice thread, man, keep it up.



Final-Fan said:
jetrii said:
Final-Fan said:
So you're saying the people using PS3 clusters are just ignorant?
Not at all. Like I said, GPUs are not fit for every task and there are times when a PS3 cluster would be more efficient and effective. However, GPGPU is fairly new to a lot of people and they may not know too much about it, or may not have the funding or time to sit down and relearn everything. Instead, they decide to go with a familiar system, even if they have to get accustomed to the Cell processor.

Please stop putting words into my mouth.

Wait, what?  How is it a familiar system if they have to get accustomed to it? 

Also, when you said "There are tasks which GPUs are not fit for but most of the PS3 projects I've seen involve tasks which a GPU can do much more efficiently for much less", I don't think it's entirely unreasonable to conclude that you meant that (most) people using PS3 clusters were ignorant of the superior GPU solution.  Apparently that's not an accurate assessment of your position, but that's a reason why I sometimes rephrase someone's position and spit it back -- to see if I'm understanding it correctly.  I guess I'll avoid that technique with you, but I don't think I need to apologize for it. 

Systems with multiple CPUs have been around for decades. Although multiple processing units may be fairly new to games, multi-threading is the standard for scientific applications and other high-performance applications. Those developers are familiar with that kind of system; they are just not familiar with the Cell processor. However, using the Cell processor for something that already scales well across multiple CPUs is not too hard, and it doesn't take long to become familiar with it for those purposes. Using the Cell processor for something that is typically linear (games) is what's difficult.
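
To make "scales well across multiple CPUs" concrete, here is a minimal sketch in generic C++ (my own illustration, not code from any actual cluster project): an embarrassingly parallel reduction split across worker threads. On a Cell you would hand each chunk to an SPE instead of a std::thread, but the shape of the problem is the same, which is why scientific codes tend to port over relatively painlessly.

    #include <algorithm>
    #include <cstddef>
    #include <numeric>
    #include <thread>
    #include <vector>

    // Sum a per-element computation over a large array by splitting the index
    // range across hardware threads. Each worker owns a disjoint chunk, so the
    // hot loop needs no locking.
    double parallel_sum_of_squares(const std::vector<double>& data) {
        unsigned n_threads = std::max(1u, std::thread::hardware_concurrency());
        std::vector<double> partial(n_threads, 0.0);
        std::vector<std::thread> workers;

        std::size_t chunk = data.size() / n_threads;
        for (unsigned t = 0; t < n_threads; ++t) {
            std::size_t begin = t * chunk;
            std::size_t end   = (t + 1 == n_threads) ? data.size() : begin + chunk;
            workers.emplace_back([&, begin, end, t] {
                double acc = 0.0;
                for (std::size_t i = begin; i < end; ++i)
                    acc += data[i] * data[i];   // stand-in for real per-element work
                partial[t] = acc;
            });
        }
        for (auto& w : workers) w.join();
        return std::accumulate(partial.begin(), partial.end(), 0.0);
    }

The point of the example: the work divides into independent chunks, so adding processing units (more CPUs, more SPEs, more cluster nodes) keeps helping. A game loop full of dependent, branchy logic doesn't divide like this, which is the "typically linear" problem mentioned above.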

I apologize if I seemed rude or angry; trust me, I'm perfectly calm. In context, it seemed like you were saying that in a negative way, not as a question to clarify what I meant. I guess it's just one of those things that are difficult to convey with text. Had you said "ignorant of a newer solution," then yes, I would have agreed. Calling them just ignorant sounded very negative.

@WereKitten

That was an interesting post. I'm a little too busy to search online for something I saw a while back, but I will certainly do so when I get home in 2 hours.

 

 

 



Good news Everyone!

I've invented a device which makes you read this in your head, in my voice!

I see.



Tag (courtesy of fkusumot): "Please feel free -- nay, I encourage you -- to offer rebuttal."
- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
My advice to fanboys: Brag about stuff that's true, not about stuff that's false. Predict stuff that's likely, not stuff that's unlikely. You will be happier, and we will be happier.

"Everyone is entitled to his own opinion, but not his own facts." - Sen. Pat Moynihan
- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
The old smileys: ; - ) : - ) : - ( : - P : - D : - # ( c ) ( k ) ( y ) If anyone knows the shortcut for , let me know!
- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
I have the most epic death scene ever in VGChartz Mafia.  Thanks WordsofWisdom! 

@jetrii, who owns the rights to the Cell technology? Is it a three-way split, or do Toshiba/Sony own it with IBM as the contractor? And if so, does IBM own the PowerXCell 8i, seeing as it has the Power branding, or do the other companies have an interest in it? Could it, or a future version, be the basis for, say, a PS4?



Research shows Video games  help make you smarter, so why am I an idiot

mjk45 said:
@jetrii, who owns the rights to the Cell technology? Is it a three-way split, or do Toshiba/Sony own it with IBM as the contractor? And if so, does IBM own the PowerXCell 8i, seeing as it has the Power branding, or do the other companies have an interest in it? Could it, or a future version, be the basis for, say, a PS4?

 

IBM, Toshiba, and Sony all have ownership of the Cell processor. However, I am not sure how the IP is split. In 2006 they renewed their partnership to extend the alliance until 2011 or so. Sony is the only one of the three that doesn't have any factories to make the Cell processor; they have to rely on other companies to make it for them. Don't quote me on this, but as I understand it, if one of the three companies makes improvements to the Cell processor, the other two companies have the right to benefit from it. Again, I may be wrong on this.

The PowerXCell 8i's most dramatic change is the way it handles double-precision floating-point calculations. IBM bumped the peak performance from about 14 gigaflops to over 100 gigaflops. Personally, although I don't want Sony to go with the Cell processor, I think it would be best for them if they did: they can retain backward compatibility and it is a powerful chip. I'd prefer a simple, inexpensive CPU and a monster GPU to handle the graphics/GPGPU, but I know it's not likely, at least not from Sony.
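
Roughly where those two numbers come from, going by the published peak figures (my own back-of-the-envelope math, so treat it as ballpark):

    Original Cell:  8 SPEs * ~1.8 gigaflops each (double precision, 3.2 GHz) = ~14.6 gigaflops
    PowerXCell 8i:  8 SPEs * 12.8 gigaflops each (double precision, 3.2 GHz) = 102.4 gigaflops

The per-SPE jump comes from the fully pipelined double-precision unit in the reworked SPEs; single-precision peak stays roughly the same.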

@WereKitten

Swapping a highly detailed model for a less detailed one is not unusual; in fact, quite a lot of games do that. While I understand that in an FPS the character does get closer to walls and other objects, you also have to consider that the player is always looking at the model. In Uncharted, that is a 30,000-polygon character that always has to be rendered. I have yet to play a TPS in which the character didn't look fugly after 10 minutes. In GoW and MGS4 they are just too close and the AA is not nearly high enough to mask the jaggies. However, like I said, I can understand your argument and you did raise a few points which I had not considered, but I still think that, overall, FPS typically look better.
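
For anyone following along, the model swapping being described is basically distance-based LOD (level-of-detail) selection. A minimal sketch in generic C++ (my own illustration with made-up thresholds, not code from Uncharted or any other game):

    #include <cmath>

    struct Vec3 { float x, y, z; };

    struct Model { /* vertex/index buffers, materials, ... */ };

    // One asset authored at several polygon budgets, e.g. 30k / 8k / 2k triangles.
    struct LodSet {
        const Model* levels[3];        // levels[0] = highest detail
        float switch_distance[2];      // hypothetical thresholds, in world units
    };

    static float distance_between(const Vec3& a, const Vec3& b) {
        float dx = a.x - b.x, dy = a.y - b.y, dz = a.z - b.z;
        return std::sqrt(dx * dx + dy * dy + dz * dz);
    }

    // Pick which version of the mesh to draw this frame based on camera distance.
    // A third-person game can tune switch_distance tightly because the hero sits
    // in a known range from the camera; an FPS has no such guarantee for whatever
    // ends up in the middle of the screen.
    const Model* select_lod(const LodSet& lod, const Vec3& camera, const Vec3& object_pos) {
        float d = distance_between(camera, object_pos);
        if (d < lod.switch_distance[0]) return lod.levels[0];
        if (d < lod.switch_distance[1]) return lod.levels[1];
        return lod.levels[2];
    }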



Good news Everyone!

I've invented a device which makes you read this in your head, in my voice!

@jetrii, time changes everything, but what's wrong with, say, the latest-development-cycle Cell at the time coupled with a monster GPU, rather than a simple CPU and a monster GPU? Yes, you could have a relatively cheap and fast CPU, but the latest Cell should also be smaller, more powerful, and hopefully cheaper.
Another question: at the moment Intel makes the fastest and best chips in the PC market, so why aren't they a serious player in the console CPU market? And if Sony goes for a conventional CPU, who would you like to be the supplier?



Research shows Video games  help make you smarter, so why am I an idiot

mjk45 said:

@jetrii, time changes everything, but what's wrong with, say, the latest-development-cycle Cell at the time coupled with a monster GPU, rather than a simple CPU and a monster GPU? Yes, you could have a relatively cheap and fast CPU, but the latest Cell should also be smaller, more powerful, and hopefully cheaper.
Another question: at the moment Intel makes the fastest and best chips in the PC market, so why aren't they a serious player in the console CPU market? And if Sony goes for a conventional CPU, who would you like to be the supplier?

 

Sony won't make the same mistake again. There won't be a 600-dollar PS4 if Sony wants to stay in the industry. The money spent on the Cell processor could be spent on a cheaper CPU and the remaining funds allocated to the GPU. The more powerful and expensive the CPU, the less they can spend on the GPU. Developers are embracing GPGPU and moving more things onto the GPU, something which DirectX 11 will greatly improve. Average CPU + monster GPU > strong CPU + strong GPU in my book. Then again, I am extremely biased since I work with GPUs :D

Sony made the same mistake Microsoft made years before: they got a GPU, but they didn't get the IP for it. Right now, Microsoft owns the IP for both its CPU and GPU. They can go to any company and have them build their chips for them. Heck, they can build their own factory and make their own CPUs and GPUs. Sony is forced to go to Nvidia and is at Nvidia's mercy. Since they were in a hurry to get a GPU, they didn't have much to negotiate with. The RSX will probably be the most expensive component in the PS3 by the end of the generation. Like Nvidia, Intel doesn't like to share ownership of its IP. If anyone goes x86, they will probably go with AMD.



Good news Everyone!

I've invented a device which makes you read this in your head, in my voice!

@jetrii

About TPS vs FPS:

Yes, I'm sure many other TPS swap models à la Uncharted. I only brought it up as an example because it is a game that I find beautiful to look at and that still was able to use such an obvious optimization. (Incidentally, I read that Naughty Dog is going to use the hyper-detailed Drake model for both cutscenes and gameplay in Uncharted 2... I wonder if it will make an obvious difference visually.)

That was the whole core of my TPS/FPS idea anyway: a TPS can decide more easily on texture/geometry vs. performance tradeoffs because the view scene is much more "controlled". Controlled as in having a big character right in the middle, in a known and limited range of distances, and in having a much smaller logarithmic spread for the distances of other objects as well.

It seems to me that FPS don't have this luxury because by definition the camera is much more free, and the object right in the middle of the scene, i.e. the one the player is concentrating his/her attention on, is potentially the worst-looking one because of scaling.

About processors:

You seem to be knowledgeable on the subject, but I can't quite follow the whole ongoing Cell vs. GPU issue. From what I've read, my impression was that we're moving towards blurring the distinction between CPUs and GPUs, or at least that's what I got from my reading about Larrabee, about GPGPU, and about the idea that the Cell itself could be responsible for the rendering.

If this generation of consoles ends up with a longer lifespan, couldn't the whole CPU/GPU issue be moot at that point, with Sony sticking with an updated Cell and asking IBM to develop a Larrabee-like rendering subsystem based on a number of dedicated SPUs?

Of course additional SPUs could "help in" if required, a bit like Guerrilla did with their deferred rendering techniques, using SPUs to have thousands of real light sources and complex post-processing.
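
(For readers who haven't run into the term: deferred rendering first writes each pixel's position, normal, and albedo into a G-buffer, then does the lighting as a second, per-pixel pass over that buffer. That second pass is the part Guerrilla reportedly moved onto SPUs. Below is a simplified CPU-side sketch of the general idea in generic C++, my own illustration rather than anything resembling Killzone 2's actual code.)

    #include <cmath>
    #include <cstddef>
    #include <vector>

    struct Vec3 { float x, y, z; };

    static Vec3  sub(Vec3 a, Vec3 b)  { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
    static Vec3  add(Vec3 a, Vec3 b)  { return {a.x + b.x, a.y + b.y, a.z + b.z}; }
    static Vec3  mul(Vec3 a, float s) { return {a.x * s, a.y * s, a.z * s}; }
    static float dot(Vec3 a, Vec3 b)  { return a.x * b.x + a.y * b.y + a.z * b.z; }

    // One G-buffer texel: the geometry pass has already written these out.
    struct GBufferTexel { Vec3 position; Vec3 normal; Vec3 albedo; };

    struct PointLight { Vec3 position; Vec3 color; float radius; };

    // Lighting pass: for every pixel, accumulate every light within range.
    void shade(const std::vector<GBufferTexel>& gbuffer,
               const std::vector<PointLight>& lights,
               std::vector<Vec3>& out_color) {
        out_color.assign(gbuffer.size(), {0.0f, 0.0f, 0.0f});
        for (std::size_t i = 0; i < gbuffer.size(); ++i) {
            const GBufferTexel& px = gbuffer[i];
            for (const PointLight& light : lights) {
                Vec3 to_light = sub(light.position, px.position);
                float dist = std::sqrt(dot(to_light, to_light));
                if (dist <= 0.0f || dist >= light.radius) continue;   // out of range
                Vec3 dir = mul(to_light, 1.0f / dist);
                float lambert = std::fmax(0.0f, dot(px.normal, dir)); // N dot L term
                float falloff = 1.0f - dist / light.radius;           // crude attenuation
                Vec3 lit = mul(light.color, lambert * falloff);
                out_color[i] = add(out_color[i], {lit.x * px.albedo.x,
                                                  lit.y * px.albedo.y,
                                                  lit.z * px.albedo.z});
            }
        }
    }

Every pixel is independent, so the buffer splits naturally into tiles that separate SPUs (or GPU threads) can chew through, and the cost scales with pixels times nearby lights rather than with scene geometry, which is what makes "thousands of real light sources" feasible.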

If this is the trend, Nvidia is basically the weakest player, because their GPGPUs won't be able to match the flexibility of "real" CPUs (Intel has Larrabee, AMD+ATI could very much move in the same direction, and IBM/Sony could as well with the Cell).

 

 

 



"All you need in life is ignorance and confidence; then success is sure." - Mark Twain

"..." - Gordon Freeman

WereKitten said:

@jetrii

About TPS vs FPS:

Yes, I'm sure many other TPS swap models à la Uncharted. I only brought it up as an example because it is a game that I find beautiful to look at and that still was able to use such an obvious optimization. (Incidentally, I read that Naughty Dog is going to use the hyper-detailed Drake model for both cutscenes and gameplay in Uncharted 2... I wonder if it will make an obvious difference visually.)

That was the whole core of my TPS/FPS idea anyway: a TPS can decide more easily on texture/geometry vs. performance tradeoffs because the view scene is much more "controlled". Controlled as in having a big character right in the middle, in a known and limited range of distances, and in having a much smaller logarithmic spread for the distances of other objects as well.

It seems to me that FPS don't have this luxury because by definition the camera is much more free, and the object right in the middle of the scene, i.e. the one the player is concentrating his/her attention on, is potentially the worst-looking one because of scaling.

About processors:

You seem to be knowledgeable on the subject, but I can't quite follow the whole ongoing Cell vs. GPU issue. From what I've read, my impression was that we're moving towards blurring the distinction between CPUs and GPUs, or at least that's what I got from my reading about Larrabee, about GPGPU, and about the idea that the Cell itself could be responsible for the rendering.

If this generation of consoles ends up with a longer lifespan, couldn't the whole CPU/GPU issue be moot at that point, with Sony sticking with an updated Cell and asking IBM to develop a Larrabee-like rendering subsystem based on a number of dedicated SPUs?

Of course additional SPUs could "help in" if required, a bit like Guerrilla did with their deferred rendering techniques, using SPUs to have thousands of real light sources and complex post-processing.

If this is the trend, Nvidia is basically the weakest player, because their GPGPUs won't be able to match the flexibility of "real" CPUs (Intel has Larrabee, AMD+ATI could very much move in the same direction, and IBM/Sony could as well with the Cell).

 

 

 

I'll jump straight down to the CPU/GPU section since that is the part that interests me the most. Don't get me wrong, I think you have a very valid point, but I think it's a rather subjective discussion. I'm a little pressed for time, so I'd rather jump down to the part I like and reply to the first part when I get home.

Although the lines between what GPUs and CPUs can do are blurring, they are still two very different chips. They are essentially going to accomplish the same thing, but they do it in completely different ways. Modern GPUs use hundreds (1,000+ now) of stream processors with programmable shaders to achieve their goal, while Larrabee has a few dozen x86-based cores stuck together, each with a 512-bit SIMD unit, and plans to achieve the same goal with software rendering.

The problem I have with using Larrabee or the Cell processor for graphics is that although they *can* render, and I am sure Larrabee will do it effectively, all the numbers I've seen are very underwhelming.

Last I heard, Intel said Larrabee could handle up to 16 flops per clock per core.

2 GHz * 16 flops * 48 cores = 1,536 gigaflops for the highest-end Larrabee under the best possible conditions.
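
For what it's worth, the 16 presumably comes from the vector width (my reading of the published material, so treat it as an assumption): a 512-bit SIMD unit holds 512 / 32 = 16 single-precision floats, i.e. 16 lanes per core per clock.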

Right now a year-old Radeon HD 4870 reaches around 1,200 gigaflops and ATI's best card passes the 2.4-teraflop barrier. A single 4870 can also handle double-precision floating-point operations 16X faster than the Cell processor in the PS3 and over 2X faster than the latest PowerXCell 8i, which was released 6 months ago. It just seems to me that GPUs are advancing much more quickly than CPUs and are becoming more and more flexible thanks to OpenCL and DirectX 11.
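
Roughly where those figures come from, using published peak numbers (ballpark math on my part, not benchmarks):

    HD 4870, single precision: 800 stream processors * 2 flops (multiply-add) * 0.75 GHz = 1,200 gigaflops
    HD 4870, double precision: about 1/5 of that, so ~240 gigaflops
    PS3 Cell, double precision: ~14.6 gigaflops, and 240 / 14.6 = ~16X
    PowerXCell 8i, double precision: 102.4 gigaflops, and 240 / 102.4 = ~2.3X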

Also, keep in mind that Nvidia has been hiring x86 engineers like there's no tomorrow and AMD already has a bunch of them. Things are only improving for the GPGPU fanboys!


I know this post is a mess, but I got lost in my own daydreams. Some people dream of driving Ferraris, others dream of having millions of dollars... Me, I dream of a GPU with 2,000 stream processors, GDDR6 memory, and the ability to cure cancer (which it would obviously also cause, due to overexposure to pure awesomeness).



Good news Everyone!

I've invented a device which makes you read this in your head, in my voice!