
Forums - Gaming - A Factual Article everyone should read.

WereKitten said:
A word of warning: that article dates back to November 2006.

While I could only find a few problems with the "factual" parts (e.g. I don't think it's true that the EDRAM on the Xenos is enough to have HDR+4xAA for free, unless you go sub-HD like in Halo 3), the speculation about how much of the CPUs' and GPUs' potential can be tapped is naturally quite outdated, as is the info about the memory footprint of the PS3 OS, which has since been reduced.


It is possible that both consoles' operating systems have become less resource-hungry. But all the factual info about the CPUs, memory and GPUs will NEVER change unless they actually physically update the components. So yes, you're right about the OS, but everything else will NOT change, including both companies' misleading GFLOPS claims. Even though Sony and M$ said "theoretical", they both knew they worded it so the average Joe Public would believe them. Truth is, no game will go over 80 GFLOPS on either console.



selnor said:
WereKitten said:
A word of warning: that article dates back to November 2006.

While I could only find a few problems with the "factual" parts (e.g. I don't think it's true that the EDRAM on the Xenos is enough to have HDR+4xAA for free, unless you go sub-HD like in Halo 3), the speculation about how much of the CPUs' and GPUs' potential can be tapped is naturally quite outdated, as is the info about the memory footprint of the PS3 OS, which has since been reduced.


It is possible that both consoles' operating systems have become less resource-hungry. But all the factual info about the CPUs, memory and GPUs will NEVER change unless they actually physically update the components. So yes, you're right about the OS, but everything else will NOT change, including both companies' misleading GFLOPS claims. Even though Sony and M$ said "theoretical", they both knew they worded it so the average Joe Public would believe them. Truth is, no game will go over 80 GFLOPS on either console.

That's not a lie; the theoretical numbers they gave are right. The fact that you usually wouldn't come close to those theoretical limits was well known to anyone who read a more in-depth analysis. I don't care much if it proved deceptive for a "Joe Public" who learns about gigaflops to taunt others during pissing contests but never learns what the number actually means in practice. The same thing happened with the gigahertz race between Intel and AMD during the Netburst era.

But my point was a different one: much of the "speculative" part of the article was simply written before developers were really put to the test at implementing their work efficiently on the hardware. And remember that the article mostly comes from the perspective of PC coders, used to standard out-of-order processors and little SMP.

For example, the claim that the SPEs are useless at AI rests on the standard algorithms that benefit heavily from branch prediction. The same has been said about collision-detection code. But different algorithms have been found that the SPEs can run efficiently (Uncharted offloaded AI calculations to the SPEs, and I'm pretty sure there are other cases).

So I'm just saying: the technical-specs part is good enough, though not perfect, and it makes a good read. The speculation about how the CPUs and GPUs would cope with real game programming was a bit academic at the time and should be taken with many pinches of salt, given what has actually happened in practice since 2006. For example, the writer was impressed with Heavenly Sword, and I think we can all agree we've gone way beyond that initial benchmark.



"All you need in life is ignorance and confidence; then success is sure." - Mark Twain

"..." - Gordon Freeman

WereKitten said:
selnor said:
WereKitten said:
A word of warning: that article dates back to November 2006.

While I could only find a few problems with the "factual" parts (e.g. I don't think it's true that the EDRAM on the Xenos is enough to have HDR+4xAA for free, unless you go sub-HD like in Halo 3), the speculation about how much of the CPUs' and GPUs' potential can be tapped is naturally quite outdated, as is the info about the memory footprint of the PS3 OS, which has since been reduced.


It is possible that both consoles' operating systems have become less resource-hungry. But all the factual info about the CPUs, memory and GPUs will NEVER change unless they actually physically update the components. So yes, you're right about the OS, but everything else will NOT change, including both companies' misleading GFLOPS claims. Even though Sony and M$ said "theoretical", they both knew they worded it so the average Joe Public would believe them. Truth is, no game will go over 80 GFLOPS on either console.

That's not a lie; the theoretical numbers they gave are right. The fact that you usually wouldn't come close to those theoretical limits was well known to anyone who read a more in-depth analysis. I don't care much if it proved deceptive for a "Joe Public" who learns about gigaflops to taunt others during pissing contests but never learns what the number actually means in practice. The same thing happened with the gigahertz race between Intel and AMD during the Netburst era.

But my point was a different one: much of the "speculative" part of the article was simply written before developers were really put to the test at implementing their work efficiently on the hardware. And remember that the article mostly comes from the perspective of PC coders, used to standard out-of-order processors and little SMP.

For example, the claim that the SPEs are useless at AI rests on the standard algorithms that benefit heavily from branch prediction. The same has been said about collision-detection code. But different algorithms have been found that the SPEs can run efficiently (Uncharted offloaded AI calculations to the SPEs, and I'm pretty sure there are other cases).

So I'm just saying: the technical-specs part is good enough, though not perfect, and it makes a good read. The speculation about how the CPUs and GPUs would cope with real game programming was a bit academic at the time and should be taken with many pinches of salt, given what has actually happened in practice since 2006. For example, the writer was impressed with Heavenly Sword, and I think we can all agree we've gone way beyond that initial benchmark.

Multithreading has been around for 20 years, and the SPEs on Cell are not the first time PC coders have seen and tested such designs. Many manufacturers researched this in the 90s.

But it's interesting to point out to everyone as well that the PS2 actually had a better CPU on the whole than the Xbox 1, and the gap between Cell and Xenon is actually smaller than the gap between the PS2's and Xbox 1's CPUs. That is something that will shock many people.

You have to remember he does state that, with time and effective programming, developers can work out how to use the SPEs. But at the end of the day it still comes back to spreading 512 KB of L2 cache between 6 SPEs and 1 PPE, as well as having to execute instructions in order rather than reordering them as a PC CPU can.

There is actually very little speculation in the article, and where he does speculate, he tells you.



selnor said:

Multithreading has been around for 20 years, and the SPEs on Cell are not the first time PC coders have seen and tested such designs. Many manufacturers researched this in the 90s.

But it's interesting to point out to everyone as well that the PS2 actually had a better CPU on the whole than the Xbox 1, and the gap between Cell and Xenon is actually smaller than the gap between the PS2's and Xbox 1's CPUs. That is something that will shock many people.

You have to remember he does state that, with time and effective programming, developers can work out how to use the SPEs. But at the end of the day it still comes back to spreading 512 KB of L2 cache between 6 SPEs and 1 PPE, as well as having to execute instructions in order rather than reordering them as a PC CPU can.

There is actually very little speculation in the article, and where he does speculate, he tells you.

Research is one thing, but day-to-day coding on PCs is entirely another. Unreal Engine 3 and the multithreaded version of Source date from around 2003-2004, and to this day Valve doesn't seem keen on anything but symmetric multiprocessing.

As for the bolded part: the author may have said "with time" in 2006, but by 2009 most of that time has come and gone :)

Talking about what the L2 cache or in-order execution could imply for the coding effort is all fine and dandy speculation before you start, but the proof is in the pudding. Good developers came to terms with symmetric multiprocessing and hyperthreading on the 360, and came to terms with PPE+SPE multiprocessing on the PS3. All those CPUs being in-order doesn't seem to keep them from delivering better and better results each time.



"All you need in life is ignorance and confidence; then success is sure." - Mark Twain

"..." - Gordon Freeman

MikeB is gonna have an orgasm !!!



Time to Work !


Most of this article appears to be factually correct. I agree with WereKitten's comments about outdated projections.

Please don't let MikeB see this. It's an article he's attempted to discredit before, which resulted in a massive circular argument consisting of every reasonable technologically minded poster on this website versus Mike.



starcraft - Playing Games = FUN, Talking about Games = SERIOUS

WereKitten said:
selnor said:

Multithreading has been around for 20 years, and the SPEs on Cell are not the first time PC coders have seen and tested such designs. Many manufacturers researched this in the 90s.

But it's interesting to point out to everyone as well that the PS2 actually had a better CPU on the whole than the Xbox 1, and the gap between Cell and Xenon is actually smaller than the gap between the PS2's and Xbox 1's CPUs. That is something that will shock many people.

You have to remember he does state that, with time and effective programming, developers can work out how to use the SPEs. But at the end of the day it still comes back to spreading 512 KB of L2 cache between 6 SPEs and 1 PPE, as well as having to execute instructions in order rather than reordering them as a PC CPU can.

There is actually very little speculation in the article, and where he does speculate, he tells you.

Research is one thing, but day-to-day coding on PCs is entirely another. Unreal Engine 3 and the multithreaded version of Source date from around 2003-2004, and to this day Valve doesn't seem keen on anything but symmetric multiprocessing.

As for the bolded part: the author may have said "with time" in 2006, but by 2009 most of that time has come and gone :)

Talking about what the L2 cache or in-order execution could imply for the coding effort is all fine and dandy speculation before you start, but the proof is in the pudding. Good developers came to terms with symmetric multiprocessing and hyperthreading on the 360, and came to terms with PPE+SPE multiprocessing on the PS3. All those CPUs being in-order doesn't seem to keep them from delivering better and better results each time.

I agree. He even stated that good devs would work around the small cache.

There's no denying, though, that if both the PS3 and 360 had out-of-order processors, they would be much more efficient and faster, as the quote from Romero in his article states.

Maybe next gen we will see out-of-order processors.



It's outdated.



...not much time to post anymore, used to be awesome on here really good fond memories from VGchartz...

PSN: Skeeuk - XBL: SkeeUK - PC: Skeeuk

really miss the VGCHARTZ of 2008 - 2013...

Well, yeah, this site has some pretty obnoxious fanboys lurking around. I'd even gotten sick of the PS3 fanboys, so I stopped being one myself. But I, however, had a 360 two years before a PS3 (and now I have two working 360s), so I could say specifically what I did and didn't like about the 360, because I had one.

It wasn't like I'd always owned a PS3 and always hated the 360 for the simple fact that it wasn't a Sony product. But thanks to nameless idiots (I'll let you decide who the idiots are) and Microsoft's decent E3 showing, I dislike the 360 a little less today. (I thought Sony's was awesome, and Microsoft's wasn't too bad, but Nintendo's was just a borefest.) I watched all 3 showings live from start to finish, and Nintendo's was the only one I couldn't wait to see end.

At least Wii owners won't try to say that any game they have is better looking than something like Killzone 2. They know their place. The Wii mostly sold because of massive hype, the PS3's price, word of mouth, and its appeal to a new audience, not because of how powerful it is.

I haven't read anything yet, but I will after this post. But can the 360 read up to 7 controllers? Does it let you change controller ports with the push of a button, or do you have to turn them off and on in a specific order? Does it scratch your discs? Does it have the highest failure rate of all 3 systems? (The PS3 and Wii combined don't come anywhere close to the 360's failures.) The directional pad is also terrible; I used to hate accidentally jumping in SCIV because of that thing. And I'm not paying a penny for online; I'd rather buy something from the PlayStation Store. Xbox Live points mean that things are a bit more expensive than on the PlayStation Network.

And those are the specific reasons I don't like the 360. See? Nothing to do with the actual games at all, though the first one was a joke, lol. I don't care if graphics are slightly better on the PS3 or 360. I still have no plans to get SCIV on PS3 (not yet), even though it apparently looks better on PS3 (I'm kinda picky and want the collector's edition, just for the sake of having a bigger case).



@selnor
"But at the end of the day it still comes back to spreading 512kb L2 Cache between 6 SPE's and 1PPE. As well as having to deal with each instruction in order rather than like a PC CPU which can deal with any code it needs to."

I read this article back in 2006.

Your statement only proves how outdated this article is. For one, the SPEs do not just rely on the PPE for instructions: the SPEs have their own instruction set separate from the PPE, and they can do DMA directly to and from other SPEs without involving the PPE, because each SPE has its own local store.

Here are some facts for people:

users.ece.gatech.edu/~lanterma/mpg/ece4893_xbox360_vs_ps3.pdf

Real-world tests:

http://www.ibm.com/developerworks/power/library/pa-cellperf/

@selnor

To me this shows his take has many faults in what you describe as "facts" about the PS3. Though I respect his opinion, I do not agree with it, just as others may not agree with mine.



I AM BOLO

100% lover "nothing else matter's" after that...

ps:

Proud psOne/2/3/p owner.  I survived Aplcalyps3 and all I got was this lousy Signature.