
Why is there so much negativity in any 360-related thread?

I think the PS3 fanboys have been better behaved since the Slim....

That, or their "best game ever" rants shifted to more deserving games than before, so it was less annoying....





After 2009, the PS3 was supposed to reign supreme. In 2010, the 360 has been putting up a hell of a fight and laughing at that notion. The PS3 library has been rising in quality this whole gen, but this is the first time it's receding. The SDF (Sony Defense Force) has to come out stronger than ever to combat this.



http://www.destructoid.com/fact-ps3-fanboys-are-the-worst-149259.phtml

This article may shed some light on the battle between PS3 fans and 360 fans: the $599 PS3 launch price tag, FFXIII no longer being a PS3 exclusive, Infamous versus Prototype, the Killzone 2 reaction from PS3 fans.

360 fans are ignorant, but the PS3 fans are the worst. PS3 fans outnumber the 360 fans on this site and are much more vocal.



PS3 fans will start to get boring when the PS3 takes over the Xbox 360...



Hmmm... funny, I swear this time last year you could have switched 360 with PS3 in this thread and it would have been more accurate. It seems to me that when the rival console is doing well, fans will make threads to defend their preferred console in outrageous ways, which then become argument central, with fans on both sides making ignorant comments.

If the Slim's popularity decreases and the 360's increases, I expect you'll see the same thing happen, with PS3 fans making outrageous threads and 360 fans coming in to point out ignorant PS3 fan comments.



numonex said:
http://www.destructoid.com/fact-ps3-fanboys-are-the-worst-149259.phtml

This article may shed some light on the battle between PS3 fans and 360 fans: the $599 PS3 launch price tag, FFXIII no longer being a PS3 exclusive, Infamous versus Prototype, the Killzone 2 reaction from PS3 fans.

360 fans are ignorant, but the PS3 fans are the worst. PS3 fans outnumber the 360 fans on this site and are much more vocal.

No, all it did was gather more hits for their site...

so Jim's OPINION is just that: an OPINION.

[Editor's Note: We're not just a (rad) news site -- we also publish opinions/editorials from our community & employees like this one, though be aware it may not jive with the opinions of Destructoid as a whole, or how our moms raised us. Want to post your own article in response? Publish it now on our community blogs.]

This article may shed some light on ONE person's OPINION.

And it was written to:

goad

An agent or means of prodding or urging

Exactly what you're doing right now...




joeorc said:
selnor said:
joeorc said:

And for people who think I'm just pro-PlayStation 3 and trying to tear down the Xbox 360:

I'm not. What I am trying to do is tear down the misconceptions both sides have about each other:

The 360’s GPU can produce up to 500 million triangles per second. Given that a triangle has three edges, for practical purposes this means the GPU can produce approximately 1.5 billion vertices per second. (In comparison, the ATI X1900 XTX processes only 1.3 billion vertices per second and runs at nearly double the clock speed.) For antialiasing, the 360 GPU pounds out a pixel fillrate of 16 gigasamples per second, using 4X MSAA (Multi-Sampling Anti-Aliasing). Of course, the big claim to fame of the 360’s GPU is the stunning 48 billion shader operations per second, thanks to its innovative use of Unified Shader Architecture.
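For reference, here is the arithmetic behind that vertex figure, a rough Python sketch using only the numbers quoted above (treating every triangle as three unshared vertices is the article's own simplification, since real meshes share vertices):

# Back-of-the-envelope check of the figures quoted above:
# 500 million triangles/sec, 3 vertices per triangle -> ~1.5 billion vertices/sec.
triangles_per_sec = 500e6
vertices_per_triangle = 3
vertices_per_sec = triangles_per_sec * vertices_per_triangle
print(vertices_per_sec / 1e9, "billion vertices per second")   # prints 1.5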

Make no mistake, that is one bad mofo of a graphics card.

Why is that figure so impressive? For the uninitiated, shader operations are the core of what makes a rendered graphic look the way it does. There are two separate types of shaders that are used in gaming graphics: vertex shaders and pixel shaders. Vertex shaders impact the values of the lines that make up a polygon. They are what determine how realistic animation of polygons and wireframe models will look: the swagger of a walking character, for instance, or the rolling tread of a tank as it crushes an android skull laid to waste on a charred battleground.

Pixel shaders, on the other hand, are what determine how realistic that charred battlefield will look or the color of the dents in the tank. They alter the pixel’s color and brightness, altering the overall tone, texture, and shape of a “skin” once it’s applied to the wireframe. These shaders allow developers to create materials and surfaces that no longer look like, say, the main characters in Dire Straits’ “Money For Nothing” video. That is, they enable developers to create games with textures and environments that much more closely resemble reality.

Each of these graphics processing functions is called and executed on a per-pixel or per-vertex basis as it passes through the pipeline. Until recently, graphics processors handled each type of shader individually with dedicated units for each. Developers used low-level assembly languages to talk directly to the chip for instructions on how to handle the shaders, or they used APIs such as OpenGL or DirectX. Unified Shader Architecture changes all that by handling both shader types at the hardware level in the same instruction process. This means that the GPU can make use of the common pieces of each type of shader while making direct calls and relaying specific instructions to the shader itself. This decreases the actual size of the instruction sets and combines common instructions for two shader types into one when applicable. This is how the 360’s GPU quickly and efficiently handles shader operations. 48 billion shader operations per second, in fact.

which, by the way, is something one function of the Cell processor can do for the RSX.

Now, the Cell is by no means faster at that than the Xbox 360's GPU, but it doesn't need to be.
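To make the unified-shader idea from a couple of paragraphs up concrete, here is a minimal software sketch in Python (my own illustration, not real GPU code and not anything from the quoted article): one pool of generic units drains a mixed queue of vertex and pixel jobs, instead of dedicated vertex and pixel units sitting idle whenever the workload is unbalanced.

from collections import deque

def vertex_shade(v):
    # toy "vertex shader": nudge a vertex position
    x, y, z = v
    return (x + 1.0, y, z)

def pixel_shade(p):
    # toy "pixel shader": darken a colour
    r, g, b = p
    return (r * 0.5, g * 0.5, b * 0.5)

# a mixed stream of work, as a unified architecture would see it
work = deque([
    ("vertex", (0.0, 1.0, 2.0)),
    ("pixel",  (1.0, 1.0, 1.0)),
    ("vertex", (3.0, 4.0, 5.0)),
])

# the same generic "unit" handles whichever job type comes next
while work:
    kind, data = work.popleft()
    result = vertex_shade(data) if kind == "vertex" else pixel_shade(data)
    print(kind, result)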

How Does It Stack Up?

It’s tempting to compare the GPU inside the Xbox 360 to today’s high-dollar, high-performance video cards, and some who do might scoff a little. The latest graphic cards from Nvidia and ATI, such as Nvidia’s GeForce 7800 GTX and ATI’s Radeon X1900 series, are, on paper, superior GPUs. They tout processor speeds of 550 to 625MHz and memory clock speeds of 1,500MHz and above. In terms of raw horsepower, these cards are indeed brutes. Of course, if there’s one thing we’ve all learned about clock speeds in the great processor wars between Intel and AMD, it’s that raw speed hardly translates into a real measure of processing power.

It’s not hyperbole to say that video memory bandwidth is one of the most important (if not the most important) parts of processing and rendering graphic elements. This is simply because bandwidth and speed determine how rapidly instructions can be transferred, processed, and returned to the system. Thus it’s in direct control of overall graphics performance for a system.

To improve video memory bandwidth, graphics card manufacturers have resorted to the typical methods of boosting speed, such as creating wider bit paths and larger memories (512MB nowadays) or boosting core clock speed. These techniques have placed performance in the range of 40 to 50GBps at peak, which is respectable when compared with other graphics processors. However, these figures still fall short of the Xbox 360’s 256GBps.

Yes, you read that right: 256GBps memory bandwidth. It’s utterly stunning, and it’s thanks to the chip’s embedded 10MB of eDRAM.
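For context on how bandwidth figures like these are usually derived: bandwidth = bus width in bytes times the effective transfer rate. A small Python sketch follows; the 128-bit / 700MHz GDDR3 numbers for the 360's main memory pool are the commonly cited specs rather than anything from the quoted article, the PC-card line is purely illustrative, and the 256GBps eDRAM figure above is an on-die number that does not fall out of this simple formula.

# bandwidth (GB/s) = bus width in bytes * clock * transfers per clock
def bandwidth_gbps(bus_width_bits, clock_mhz, transfers_per_clock):
    return (bus_width_bits / 8) * clock_mhz * 1e6 * transfers_per_clock / 1e9

print(bandwidth_gbps(128, 700, 2))   # 360 main GDDR3 pool: ~22.4 GB/s (commonly cited spec)
print(bandwidth_gbps(256, 750, 2))   # a 256-bit PC card of the era: ~48 GB/s (illustrative)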

No currently available video card makes use of embedded DRAM. And even if one were available, it'll be at least the end of 2006 before it would be of any use: that's when Windows Vista comes out, and until then the operating systems they're gaming on can't make use of Vista's WGF (Windows Graphics Foundation) 2.0 features. This speed of instruction handling, combined with Unified Shading Architecture, not only makes the GPU inside the Xbox 360 the current graphics powerhouse, it also means it'll stay that way for a number of years.

And even when current PC-based GPUs start catching up, it’s going to be extremely expensive to match the performance of this dedicated gaming platform. At the time of this writing, the top-level cards by ATI and Nvidia described in this article are retailing for around $560 apiece, and that’s without Unified Shading Architecture support or eDRAM. And of course, there are other aspects of the system to consider, such as the fact that the CPU and memory were custom-built for dedicated gaming performance.

ATI and Microsoft have truly built something special in the Xbox 360’s GPU. It’s astounding to see a chip with such power run at such an efficient clock speed and generate as little heat as it does, while at the same time making use of never-before-seen technology that will surely be replicated in graphics cards and consoles for years to come. It’s comforting to know that the Xbox 360 will continue to produce visually stunning and smooth graphics well into the foreseeable future.

http://www.smartcomputing.com/editorial/article.asp?article=articles/archive/r1003/77r03/77r03.asp

Once again, the RSX is not just a 7800 GTX.

Jen-Hsun Huang already stated that the 7800 GTX will be slower than the RSX, but that by the time the PS3 launches there will be a faster desktop chip than the RSX.

BOTH THE XBOX 360 AND THE PS3 have custom-designed graphics systems in their machines.

 

A triangle is only one form of polygon; many current games use squares and hexagons, among others. There's plenty of official info for Cell and Xenos from IBM themselves, which backs up the 114.2 GFLOPS figure for Cell (instead of the theoretical 200 GFLOPS). Also, several posters have posted quotes from Crytek, id, etc. saying the PS3's GPU is underpowered. Crytek, in fact, has said it 3 times; I posted a video link in here from 6 months ago where Crytek say the PS3 is the weakest of the 3. Others posted newer articles from devs like the Rage team and id who say the 360 can do x, y and z better. So what makes PS3 fans' info more credible than the info 360 owners post? That's the point here. In many cases the newer articles and lead-programmer interviews side with the 360, so why does that get pushed aside?

Nope. The people making that claim are the very same people who are making a guess.

Here's IBM's take:

According to IBM, the theoretical peak performance of a single SPE is 25.6 GFLOPS. The seven SPEs in the PS3, in addition to the VMX unit in the PPE, would yield a total combined single-precision floating-point performance of 218 GFLOPS (the same figure quoted by Sony). It should be noted that this figure is an estimate based on ideal, 100% efficient operation of the processor. The floating-point capacity of the PS3 will significantly exceed that of the Xbox 360, although it should be noted that Microsoft's console, due to its 3 symmetric, fully featured processor cores (which are very similar to the Cell's PPE), may fare better on dynamically branching code, like that used for artificial intelligence.

It should also be noted that floating-point performance is a single-dimensional metric for comparing computers, and that many other considerations (including integer performance, memory size and bandwidth, etc.) determine the "overall" performance of a computer system. Floating point calculations are very important for graphics, multimedia, and game physics, but considerably less important for other tasks like artificial intelligence.

Finally, whether the PS3's advantage in floating-point performance will be readily apparent in games depends entirely on whether developers are able to effectively make use of the system's unique architecture.
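For what it's worth, the per-SPE peak quoted above falls out of simple arithmetic. The 4-wide-SIMD, fused-multiply-add assumption below is mine (it is consistent with the 25.6 GFLOPS per-SPE figure in the quote, but is not spelled out there):

# Peak single-precision math for one SPE: 3.2GHz * 4 SIMD lanes * 2 flops (multiply + add).
clock_hz = 3.2e9
simd_lanes = 4
flops_per_lane = 2                       # one fused multiply-add per lane per cycle (assumption)
spe_peak = clock_hz * simd_lanes * flops_per_lane
print(spe_peak / 1e9)                    # 25.6 GFLOPS per SPE, as quoted

spes_in_ps3 = 7                          # 8 fabricated, 1 disabled for yield
print(spes_in_ps3 * spe_peak / 1e9)      # 179.2 GFLOPS from the SPEs alone; the remainder of
                                         # the quoted 218 is attributed to the PPE in the quote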

 

To which you stated this:

Also, several posters have posted quotes from Crytek, id, etc. saying the PS3's GPU is underpowered. Crytek, in fact, has said it 3 times; I posted a video link in here from 6 months ago where Crytek say the PS3 is the weakest of the 3. Others posted newer articles from devs like the Rage team and id who say the 360 can do x, y and z better. So what makes PS3 fans' info more credible than the info 360 owners post? That's the point here. In many cases the newer articles and lead-programmer interviews side with the 360, so why does that get pushed aside?

 

What a FREAKIN' SURPRISE:

Multi-platform developers...

John Carmack:

"...the only real advantage that the PS3 has over the 360, from our point of view, is the extra space."

On development for Rage: "...and we're trying to say pretty plainly that this is going to be the one thing that the PS3 version is gonna be better at, and in fact it's almost the worst sort of thing for Microsoft there, because we are having to work twice as hard on the PS3 to bring it up to spec. But in the end it's going to be a 60fps game, and it's going to wind up looking just like the 360, we just had to sweat a lot more for it. And if it winds up getting a benefit because of the Blu-ray and having the better compression on there, then it's going to wind up looking like the PS3 was the better machine, even though it really wasn't..." So John, how do you really feel about the PS3?

"...the only real advantage that the PS3 has over the 360, from our point of view, is the extra space."

So his point of view = the entire industry?

While I respect John, there are other developers who happen to disagree with him.

 

Stop. You're so far off the mark. OK, those figures are from IBM, yes, but they are not actual usage data. Efficiency is where the figures are at, and the Cell in the PS3 gets nowhere near its claims. The following is from IBM's actual tests on their own website, something Sony fans never use and always push aside hoping no one will ever see it.

Table 9. Comparison of Linpack performance between Cell BE and other processors

Linpack 1Kx1K (DP)          Peak GFLOPS   Actual GFLOPS   Efficiency
SPU, 3.2GHz                     1.83           1.45         79.23%
8 SPUs, 3.2GHz                 14.63           9.46         64.66%
Pentium 4, 3.2GHz               6.4            3.1          48.44%
Pentium 4 + SSE3, 3.6GHz       14.4            7.2          50.00%
Itanium, 1.6GHz                 6.4            5.95         92.97%

Now, at this point I will point out that the SPUs, also according to IBM, CANNOT run an operating system. They are only designed for floating-point work; IBM's own words. The Pentium 4 + SSE3 at 3.6GHz is a single-core processor. If that were a quad core it would destroy 8 SPUs put together, even though the SPUs are designed for the work this test provides and the Pentium isn't. The Pentium is a general-purpose core, like the PPE. It does 7.2 actual GFLOPS in a test the Cell was designed to be good at, and the 8 SPUs' actual figure is only about 2 GFLOPS above that single core, back in 2006.

Also bear in mind you have to take some off: the PS3 has a dormant SPU and one dedicated to the OS, so only 6 SPUs are available for making games. This is where the figures come from. When it is all added up, under actual performance the Cell does 114.2 GFLOPS and Xenon 115.4. But remember, games that require more general-purpose work have considerably more power available on Xenon, as the SPUs CANNOT do that work at all. The PS3 has just 1 PPE to do those jobs.

Table 7. Performance of parallelized double-precision Linpack on eight SPUs

Matrix size   Cycles    # of Insts.   CPI   Dual Issue   Channel Stalls   Other Stalls   Used Regs   SPEsim GFLOPS   Measured GFLOPS   Model Accuracy   Efficiency
1Kx1K         236.7M       69.1M      3.42     2.9%          6.7%            68.5%          128           9.704            9.46           97.49%          64.66%
2Kx2K          1.64G       44.9M      3.65     2.2%          3.3%            72.5%          128          11.184           11.05           98.80%          75.53%

Table 4. Performance of parallelized Linpack on eight SPUs

Matrix size   Cycles    # of Insts.   CPI   Single Issue   Dual Issue   Channel Stalls   Other Stalls   Used Regs   SPEsim   Measured   Model Accuracy   Efficiency
1024x1024     27.6M        2.92M      0.95      27.9%         32.6%          26.9%           12.6%         126       83.12      73.04       87.87%         35.7%
4096x4096    918.0M        1.51G      0.61      29.0%         56.7%          10.8%            3.4%         126      160        155.5        97.2%          75.9%

http://www.ibm.com/developerworks/power/library/pa-cellperf/

Now, IBM also state the following:

The PPE was designed specifically for the Cell processor, but during development Microsoft approached IBM wanting a high-performance processor core for its Xbox 360. IBM complied and made the tri-core Xenon processor, based on a slightly modified version of the PPE.[33][34]

Anything the PS3 needs the PPE for, the 360 is more than 3 times as powerful at, because the SPUs cannot do it at all.
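Just to make the Efficiency column concrete: it is simply the measured figure divided by the peak figure. Here is that arithmetic run over the Table 9 numbers quoted above (the values are IBM's; the division is the only thing added):

# efficiency = actual / peak, using the Table 9 figures above
results = {
    "SPU, 3.2GHz":              (1.83, 1.45),
    "8 SPUs, 3.2GHz":           (14.63, 9.46),
    "Pentium 4, 3.2GHz":        (6.4, 3.1),
    "Pentium 4 + SSE3, 3.6GHz": (14.4, 7.2),
    "Itanium, 1.6GHz":          (6.4, 5.95),
}
for name, (peak, actual) in results.items():
    print(f"{name:26s}{actual / peak:8.2%}")    # matches the Efficiency column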




I think this negativity comes from the old saying "when someone falls, everyone piles on": people just like to blame others' mistakes without looking at their own, and in most cases they are the same mistakes.

I'm not sure how to translate the idea very well, but I think you can understand it :D

 



themanwithnoname said:
I dunno what's funnier, that this thread is a microcosm of what the OP's talking about or that libellule's argument boils down to "lots of people agree with me, so I must be right!"

"lots of people agree with me, so I must be right!"

keep your shit out of my mouth plz

When I said the "whole internet", I was referring to the main review websites.
Considering how the "whole internet" is over-dominated by American propaganda and has bashed the PS3 to hell (just check the rating of Uncharted 1 compared to Halo 3...), I do believe that when the same internet people change their opinion, it means something has changed.

To put it clearly: despite all the anti-PS3 bias, it is common knowledge that the most technically impressive games are on the PS3.

(See my signature, even Jesus kung-fu magic says it!)




selnor said:
Stop. You're so far off the mark. OK, those figures are from IBM, yes, but they are not actual usage data. [...] Anything the PS3 needs the PPE for, the 360 is more than 3 times as powerful at, because the SPUs cannot do it at all.


Do we know what developers actually use the PPEs for, and how efficiently the 360's three cores are utilised? As far as I'm aware, most of the power-hungry work is shifted to the GPUs and, in the PS3's case, the SPEs. AI and physics are mainly handled on the CPUs, but physics can also be done on the GPU, and more efficiently, and I didn't think AI was particularly power-hungry.

On PC I'm running an AMD 3800 X2 (very old) and games run fine while looking better than most console games, since I'm also running an 8800 GTS GPU.