
Forums - Gaming - PC graphics vs Console graphics

haxxiy said:
FrostyTop said:
The_vagabond7 said:

And just to quote a small bit of Rocketpig's post

Flat out untrue. If you spent $800 on a PC when the PS3 released, you'd still be able to play almost any game released (Mass Effect, CoD4, etc) on high settings and you'd probably be able to squeak out low/mid on Crysis, which still looks better than anything found on either HD console.

I spent about $700 on a PC at Christmas 2006, and I can play Mass Effect, CoD4, and BioShock fine and dandy, and they definitely have the graphical edge over their console counterparts on my PC, which isn't very good at all by today's standards.

 

I'd also like to comment on that along a similar line of thought.

When the PlayStation 1 and the Sega Saturn, heck even the Megadrive/SNES, were released...

They WERE a good couple of years ahead of the latest PC hardware.

The PS3/Xbox 360 were NEVER ahead of the latest PC hardware; they were based on the latest PC hardware of the time, but using stripped-down versions of it.

Xbox 360 = ATI X1800 graphics card with a tosh CPU

PS3 = 7800GTX (which had been out on PC for well over a year) + an overly complex CPU that couldn't stand up to a quad-core Q6600 in real terms, and still can't.

 

Any RTS will show the console CPUs aren't up to much. And although Crysis ran like a dog even on the latest hardware, that's only relative to other PC games and to PC gamers who EXPECT 60 FPS+.

Compared to how a lot of games run on consoles, Crysis ran like a DREAM even on medium hardware.

 

 

So... in summary, the OP doesn't know what he's talking about. I don't like reading people making sweeping statements that haven't been thought through properly.

The Cell is 218 GFLOPS single precision and 25 GFLOPS double precision, counting all 7 SPEs plus the PPU.

The Q6600 is 96 GFLOPS single precision and 48 GFLOPS double precision.

For game AI, physics, etc., the Q6600 runs rings around the Cell by a wide margin. In media encoding and graphics rendering, the Cell wins by a lot more.

The Cell works roughly halfway between a GPGPU and a common CPU, just like the Emotion Engine, Xenos or the PS1/Dreamcast processors. Console producers seem to have got a taste for such designs.
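
For anyone wondering where peak figures like those come from, they're just cores x clock x FLOPs issued per cycle. Here's a rough sketch in Python; the clocks and per-cycle widths are my own assumptions rather than numbers from this thread, so the results won't line up exactly with the figures quoted above:

    # Rough sketch of how peak-GFLOPS figures are derived:
    # cores x clock (GHz) x FLOPs issued per cycle per core.
    def peak_gflops(cores, clock_ghz, flops_per_cycle):
        # Theoretical peak only; says nothing about sustained, real-world performance.
        return cores * clock_ghz * flops_per_cycle

    # Cell: 7 usable SPEs at 3.2 GHz, assuming 8 single-precision FLOPs per cycle each.
    cell_sp = peak_gflops(7, 3.2, 8)    # ~179 GFLOPS from the SPEs alone
    # Q6600: 4 cores at 2.4 GHz, assuming 8 single-precision FLOPs per cycle (SSE mul + add).
    q6600_sp = peak_gflops(4, 2.4, 8)   # ~77 GFLOPS

    print("Cell SPEs ~%.0f GFLOPS SP, Q6600 ~%.0f GFLOPS SP" % (cell_sp, q6600_sp))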

 

When games start doing media encoding (???!!!), and PS3 games start genuinely using the Cell for graphics rendering, give me a call.

I'll be waiting, for a long time!

You're looking at it far too theoretically. The Q6600 is an all-rounder beast. The Cell doesn't even belong in a games console as weak as the PS3; its potential is completely unharnessed and mostly unusable without cost-prohibitive optimisation.

In any case, it's not as powerful as a Q6600 in the real world; I don't care what numbers you have. I can show you some numbers for how fast a Q6600 can encode using x264; what real-world numbers can you show me?

 




The GPUs of the 360 and PS3 are on par with PC video cards from two generations ago. Current PC cards, even mid-range ones like the ATI Radeon 4850 and Nvidia GTX 260, completely cream the console GPUs (a modified ATI X1900 core with unified pixel/vertex pipelines, and a modified 7800 with programmable main system memory read/write functions).

The performance isn't remotely comparable. Games will run much faster on a modern $800-1200 PC than on a console. However...

The graphical differences between PC and console versions in terms of texture resolution are completely up to the developer. Most developers will most likely just target consoles and reuse the same lower-detail assets in the PC version, meaning PC players only get the added bonus of running at resolutions above 1080p (I run all of my PC games at 1920x1200, i.e. 1200p, if the game allows it) and having insane AA and AF settings. I can't blame the developers for doing this, but it's something people writing these flame-bait posts should consider before bashing either consoles or PCs on the graphical front.



 

Slimebeast said:

Console versions most often have the same texture quality. Look at Tomb Raider: Underworld, for example:

PS3 version (originally a 1280x720 pic that I upscaled to 1080p, then cropped a portion out of to be able to do a comparison):

PC version:
(All settings maxed, 4xAA. Screen taken with FRAPS. Res 1920x1200, portion cut out to be able to compare with the PS3 version.)

As you can see, the textures are exactly the same. The blurriness on the PS3 is mostly caused by the upscale from the original 1280x720 res to 1920x1080 (and perhaps somewhat by higher AF on the PC version, not sure).

And this applies to most multiplatform games - the PC version has the same textures, 3D models, mapping, lighting effects, etc. Only the resolution and AA/AF are better on PC, which is not as big a thing as PC elitists make it out to be.

 

EDIT: don't be fooled by the blurriness of the first (PS3) pic. It's caused by the "not best possible" quality of the original screenshot at IGN, plus the extra loss in image quality because I used MS Paint to upscale it from 720p to 1080p (it isn't a good program for converting screenshots).
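
If anyone wants to redo the upscale without the MS Paint blur, a proper resampling filter makes a big difference. A minimal sketch assuming the Python Pillow library (filenames are placeholders):

    # Upscale a 720p screenshot to 1080p with a high-quality resampling filter;
    # this adds far less blur than a basic paint-program resize, so the textures
    # are easier to judge. Assumes Pillow is installed; filenames are placeholders.
    from PIL import Image

    ps3 = Image.open("ps3_720p.png")                    # 1280x720 source shot
    upscaled = ps3.resize((1920, 1080), Image.LANCZOS)  # Lanczos resampling
    upscaled.save("ps3_upscaled_1080p.png")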

Thanks for posting this example. Let me say up front that since you yourself aren't sure what settings the PC version was running at, the example could actually be an unfair one for the PC, but even then it's pretty clear that the PC textures are better. And just to quash the argument right here and now, I've made an animated gif for you using your images:

Notice how the PS3 textures are far more blurry and how details are far more obvious in the PC version. Also note that since we are looking at textures from a distance, these differences in detail should be even less noticeable, and yet they still are. The clear conclusion from your own example is thus that the PC version has noticeably better textures, which is at odds with your conclusion of no difference whatsoever.

Key differences:

  • Note how the plant under the right arm is much fuzzier in the PS3 version.
  • The cracks and wear and tear on the statue are far more defined and detailed in the PC version.
  • Vegetation behind the statue is much clearer on the PC and fairly fuzzy on the PS3.
  • Beads on the statue's necklace reveal higher-quality bump mapping at work in the PC version, in addition to the obvious texture detail.
  • The base of the statue is more defined and again has far more detail.
  • Note the complete lack of light shafts in the PS3 version.
  • A more subtle detail is that the lighting in the PC version is better as a result of the textures, particularly the bloom lighting.


To Each Man, Responsibility

lol, nice gif


Sqrl,

I already explained the blurriness of the PS3 version (IGN compression combined with my upscaling in MS Paint). The textures are the same.

And the PC screen was taken by me (1920x1200 res, 4xAA, trilinear filtering, everything on max).



FrostyTop said:
[...]

When games start doing media encoding (???!!!), and PS3 games start genuinely using the Cell for graphics rendering, give me a call.

I'll be waiting, for a long time!

You're looking at it far too theoretically. The Q6600 is an all-rounder beast. The Cell doesn't even belong in a games console as weak as the PS3; its potential is completely unharnessed and mostly unusable without cost-prohibitive optimisation.

In any case, it's not as powerful as a Q6600 in the real world; I don't care what numbers you have. I can show you some numbers for how fast a Q6600 can encode using x264; what real-world numbers can you show me?

 

Get your facts straight. The Cell isn't used only in PS3s. Among its uses are video processing cards, blade servers, home cinema equipment, supercomputing, cluster and distributed computing, mainframes, etc.

It has been proven that, using parallel matrix multiplication, the Cell reaches 98% of its theoretical peak performance. Plus, it has been demonstrated how a 3.2 GHz Cell with 8 SPEs delivers performance equal to 100 GFLOPS on an average double-precision Linpack run with a 4096x4096 matrix.

There is simply no way the millennium-old x86 architecture delivers more than a Cell equivalent. For instance, Fixstars Corporation released a PCI-E board in Japan some months ago using a 2.8 GHz Cell with 180 GFLOPS single precision and 90 GFLOPS double precision.
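
For what it's worth, the "percent of peak" claim is just measured throughput divided by theoretical peak. A rough sketch in Python; the peak figure below is my own assumption (4 double-precision FLOPs per cycle per SPE), not a number from this thread, so treat the result as illustrative only:

    # "Percent of peak" = measured GFLOPS / theoretical peak GFLOPS.
    # The measured figure (~100 GFLOPS DP Linpack on an 8-SPE, 3.2 GHz Cell) is
    # the one quoted above; the peak is assumed, so this is only illustrative.
    measured_dp_gflops = 100.0
    peak_dp_gflops = 8 * 3.2 * 4        # 102.4 GFLOPS, assuming 4 DP FLOPs/cycle per SPE

    efficiency = measured_dp_gflops / peak_dp_gflops * 100.0
    print("~%.0f%% of theoretical peak" % efficiency)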

Slimebeast said:

lol, nice gif


Sqrl,

I already explained the blurriness of the PS3 version (IGN compression combined with upscaling). The textures are the same.

And the PC screen was taken by me (1920x1200 res, 4xAA, trilinear filtering, everything on max).

Then why post it at all?

I mean you're basically saying:

"Here are two images showing the PS3 clearly inferior but thats only because of upscaling so you should just assume and infer that it looks better when the upscaling isn't present. "

This does nothing to help your argument, and I have a hard time buying that the blurriness is entirely due to upscaling to begin with.

The problem here is threefold though:

First, finding a single game with similar textures only shows that one dev was lazy and just reused the textures. Second, an example should be at fairly close range to the texture so detail can be seen; this statue is so far away that even with the upscaling the difference in detail is striking. Finally, you need to have screenshots from known settings and known capture techniques.

In any case, I went back and grabbed the original IGN image, and people can decide for themselves how much of it was upscaling from your image. This time I downscaled the PC version, which does put it at a disadvantage, but again the result is clear:

 

Just looking at the left half of the statue's mid-section it is blatantly obvious that these are not the same textures.

PS - No amount of down or upscaling will ever remove the light shafts =P Those just aren't in the console version.



To Each Man, Responsibility

Interesting to note that IGN uses the same image for the PS3 and PC. For whatever reason the files are different sizes (the PC one is larger by nearly 3x) and yet they are the same resolution. Here is an overlay gif with absolutely no modification to either the PS3 or PC images. It's pretty obvious they just used the same image, given the exactness of the angle and positioning.

Yes, this really is an animated gif.

If these were from different systems we wouldn't have Croft at exactly the same spot in her animation, the camera would probably be at a slightly different angle, and there would probably be some differences in plant locations as well.
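
If anyone wants to check the "same image" claim properly instead of eyeballing a gif, a pixel-wise diff settles it. A minimal sketch assuming Pillow (filenames are placeholders); two genuinely separate captures would never diff to near zero everywhere:

    # Compare two screenshots pixel by pixel. If the two "platforms" really are
    # the same image, the difference will be zero (or near zero once JPEG
    # recompression is involved). Assumes Pillow; filenames are placeholders.
    from PIL import Image, ImageChops

    a = Image.open("ign_ps3.jpg").convert("RGB")
    b = Image.open("ign_pc.jpg").convert("RGB")
    if a.size != b.size:
        b = b.resize(a.size, Image.LANCZOS)              # bring to a common size first

    diff = ImageChops.difference(a, b)
    print("bounding box of differences:", diff.getbbox())    # None means identical
    print("per-channel (min, max) differences:", diff.getextrema())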

 



To Each Man, Responsibility
Sqrl said:

In any case, I went back and grabbed the original IGN image, and people can decide for themselves how much of it was upscaling from your image. This time I downscaled the PC version, which does put it at a disadvantage, but again the result is clear:

 

Just looking at the left half of the statue's mid-section it is blatantly obvious that these are not the same textures.

PS - No amount of down or upscaling will ever remove the light shafts =P Those just aren't in the console version.

Wow, good job!

You are probably right about the light shafts (although perhaps the sun wasn't shining in this direction when the PS3 shot was taken?).

I'm now perhaps leaning towards the PS3/360 using slightly less detailed textures (some form of compression, I dunno how that thing works), but... why are the palm trees' leaves also blurry in the PS3 shot? The air-filled "spaces" between the leaves forming a grid-like pattern shouldn't look blurry like that no matter what quality textures were used in the PS3 version. This still suggests that the textures are the same and that the blurriness/lack of detail is IGN's fault.

Another thing - I'm not 100% sure of this, but I have a strong feeling that IGN (and GameSpot and many other sites) always tend to have kinda blurry screenshots, even their PC screenies. I suspect they compress the screens more than I did when converting the PC screen from a FRAPS .bmp to a .jpg.

 

 

 



^^Honestly, I think the IGN images are just untrustworthy, so I would call even my comparison into question from a strictly scientific standpoint.

I have severe doubts about their labeling of which images are from which platform, and the differences I'm seeing between IGN images and other images indicate that their capture technique degrades image quality. That isn't necessarily going to be uniform from platform to platform, meaning some platforms could be at a disadvantage (most likely the consoles, due to being played at lower resolutions in most cases). So even if the images are labeled properly by platform, they're probably unfairly altered by the capture technique.



To Each Man, Responsibility
Sqrl said:

Interesting to note that IGN uses the same image for the PS3 and PC. For whatever reason the files are different sizes (the PC one is larger by nearly 3x) and yet they are the same resolution. Here is an overlay gif with absolutely no modification to either the PS3 or PC images. It's pretty obvious they just used the same image, given the exactness of the angle and positioning.

Yes, this really is an animated gif.

If these were from different systems we wouldn't have Croft at exactly the same spot in her animation, the camera would probably be at a slightly different angle, and there would probably be some differences in plant locations as well.

 

lol

So they cheat, labeling the same screens as coming from 3 different platforms.