CoD4 is 600P?!?!?

@ Lord The Night Knight

My statement was that rendering resolution is indeed relevant to graphics quality. I never ran into confusion about this when discussing it with PC gamers; on a powerful enough PC you can change the rendering resolution on the fly and see the difference for yourself, so any claim that it's irrelevant is pretty much pointless.

With regard to jaggies, there are various ways to reduce them: you can increase the rendering resolution together with higher-res assets, you can hand-pick your colours so that neighbouring colours don't contrast too strongly (or use lighting techniques to the same effect), and then there are the various AA techniques.
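To illustrate the first approach, here's a minimal sketch of supersampling anti-aliasing in Python/NumPy. It's a sketch only: render_scene is a hypothetical stand-in for the renderer, and a real engine does this on the GPU.

import numpy as np

def ssaa_downsample(hires, factor=2):
    # Average each factor x factor block down to one output pixel;
    # edge pixels get blended, which is exactly what softens jaggies.
    h, w, c = hires.shape
    blocks = hires.reshape(h // factor, factor, w // factor, factor, c)
    return blocks.mean(axis=(1, 3))

# e.g. render internally at 2560x1200, display at 1280x600:
# frame = ssaa_downsample(render_scene(2560, 1200), factor=2)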



Naughty Dog: "At Naughty Dog, we're pretty sure we should be able to see leaps between games on the PS3 that are even bigger than they were on the PS2."

PS3 vs 360 sales

@ rocketpig

You look at the systems, see ~9 MB/s data transfer speeds, 512 megs of RAM (including GPU RAM), and yet you think that the main limitation of these consoles is disc size?


Exactly. That's far more memory and a far faster transfer rate than the PS2 had, and the PS2 already reached the limits of DVDs.

9 MB/s is faster than the Xbox 360's average read speed for dual-layer DVDs. Single-layer DVDs can be read faster, but games that fit on a single layer are small enough that quite a few of them could be installed completely on the PS3's hard drive, and IMO could just as well be distributed over PSN.

Regarding 50 GB: note that if a game continuously streams 4.5 MB of data per second on average (sound and graphics), it takes only about three hours to stream through all of it!
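The arithmetic behind that figure, in a few lines of Python (decimal units assumed, i.e. 50 GB = 50,000 MB):

disc_size_mb = 50 * 1000        # 50 GB Blu-ray disc
stream_rate_mbps = 4.5          # average streaming rate in MB/s
hours = disc_size_mb / stream_rate_mbps / 3600
print(round(hours, 1))          # -> 3.1 hours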




*shakes head in disbelief*


Maybe your opinion is clouded by running a bloated OS like Windows Vista at home, which requires gigabytes of system RAM plus virtual memory just to run the operating system comfortably. I have a Windows XP box at home too, but I come from a background of using many different, efficient operating systems, and I've followed the demoscene quite closely over the last decade: 512 MB is a huge amount of memory to work with on a games console. Mac OS X, Windows and full-blown Linux distros are all very inefficient (at least the Linux ones are free, so no ultra-high expectations there).




@ TheBigFatJ

Of course, if you want good graphics you'll be playing it on a computer where you can /easily/ render 1080p.


That depends entirely on the game. I can easily imagine current PCs choking on the things some current, and future, SPE-tapping PS3 games are achieving and will achieve. (For cross-platform games, top PCs will probably always offer the highest specs; obviously such games are not designed specifically with the PS3 in mind.)

And personally, I only have a 17-inch LCD for my PC and don't want a bigger screen on my desk. I prefer playing from my couch anyway, and for many games the PS3 controller is, IMO, much more suitable. It's not just about wanting good graphics.




MikeB said:
@ Lord The Night Knight

My statement was that rendering resolution is indeed relevant to graphics quality. I never ran into confusion about this when discussing it with PC gamers; on a powerful enough PC you can change the rendering resolution on the fly and see the difference for yourself, so any claim that it's irrelevant is pretty much pointless.

With regard to jaggies, there are various ways to reduce them: you can increase the rendering resolution together with higher-res assets, you can hand-pick your colours so that neighbouring colours don't contrast too strongly (or use lighting techniques to the same effect), and then there are the various AA techniques.

I did not write that rendering resolution is not relevant to the gameplay. I wrote, "Rendering resolution is part of the texture buffer, not the frame buffer."

And your jaggies comment shows you've missed my point entirely. I clearly stated that more pixels mean more work for the frame buffer, and one of your suggestions for reducing jaggies is to add more pixels.



A flashy-first game is awesome when it comes out. A great-first game is awesome forever.

Plus, just for the hell of it: Kelly Brook at the 2008 BAFTAs

@ Lord The Night Knight

I did not write that rendering resolution is not relevant


You highlighted this part of my message that you didn't seem to agree with:

Of course the rendering resolution counts significantly as well.


Anyway, higher resolutions increase the possible detail (and make jaggies less noticeable). Merely reducing jaggies does not add detail; it reduces the contrast between neighbouring pixels, making the graphics look smoother.




MikeB said:
@ Lord The Night Knight

I did not write that rendering resolution is not relevant


You highlighted this part of my message that you didn't seem to agree with:

Of course the rendering resolution counts significantly as well.


Anyway, higher resolutions increase the possible detail (and make jaggies less noticeable). Merely reducing jaggies does not add detail; it reduces the contrast between neighbouring pixels, making the graphics look smoother.

You still don't get my point. Those things you mention can be achieved by upscaling, while keeping the native resolution low (even below "true HD"), to save bandwidth on the frame buffer.




@ Lord The Night Knight

You still don't get my point. Those things you mention can be achieved by upscaling


Well, at least we agree on one thing: I don't get your point at all.

Upscaling adds no detail, just as upscaling a DVD to 720p or 1080p does not give you Blu-ray/HD DVD detail (neither at native 1080p nor downscaled to 720p).
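A minimal sketch of why, in Python/NumPy: with nearest-neighbour scaling (the simplest upscaler), every output pixel is a copy of some input pixel, so no new information ever enters the image.

import numpy as np

def upscale_nearest(img, factor=2):
    # Repeat each pixel factor times in both directions.
    return img.repeat(factor, axis=0).repeat(factor, axis=1)

lowres = np.random.rand(576, 720, 3)   # stand-in for a 576p frame
big = upscale_nearest(lowres, 2)       # four times the pixels, same detail
# Downsampling recovers the original exactly: nothing was added.
assert np.allclose(big[::2, ::2], lowres)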




MikeB said:
@ Lord The Night Knight

You still don't get my point. Those things you mention can be achieved by upscaling


Well, at least we agree on one thing: I don't get your point at all.

Upscaling adds no detail, just as upscaling a DVD to 720p or 1080p does not give you Blu-ray/HD DVD detail (neither at native 1080p nor downscaled to 720p).

That's because you are confusing raster images with vector images. Videos just display several raster images per second. Raster images are static, so what you see is what you get, and of course resolution is important. 3D graphics don't work that way: they are vector images, and they work like the human eye and the world it sees.

If you run past an enemy in an FPS, it's not in frame, and therefore not read by the frame-buffer part of the VRAM. Yet it still has to be in memory, or it wouldn't be there when you turn around. It's not reloaded from the disc, which would take longer than the split second it takes to turn around (no loading medium is that fast). So when the enemy is in a level, it's in the texture-buffer part of the VRAM, whether or not it's in frame.

So again, the frame buffer is like the human eye; the texture buffer is like the world it sees. The world doesn't lose detail just because the eye takes in less of it; the detail is still there. The same goes for 3D graphics: you still see the detail, even if the screen resolution is lower.*

However, unlike the human eye, the frame buffer can work with the texture buffer to save bandwidth and add detail. Yet this, as I've stated, requires each and every pixel to be worked on. That is not true for video files, since the images are static and always the same (save for display settings, but those would affect games as well). The RAM is just there to read the video, not render it.

This means that since the frame buffer has only so much room (just 128 MB in an optimal situation on the PS3), every extra pixel adds to the work it does. If it has fewer pixels to work with, it can put more effort into each one. And that work also saves bandwidth for the texture buffer, which puts more detail there.
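Some back-of-the-envelope numbers for that trade-off, under one plausible assumed setup (32-bit colour, a 32-bit depth buffer, double buffering, all surfaces written once per frame); real engines vary:

def framebuffer_mb(width, height, bytes_per_pixel=4, surfaces=3):
    # surfaces = front colour + back colour + depth, as one assumed layout
    return width * height * bytes_per_pixel * surfaces / (1024 ** 2)

for w, h in [(1024, 600), (1280, 720), (1920, 1080)]:
    mb = framebuffer_mb(w, h)
    # Naive lower bound on bandwidth at 60 fps:
    print(f"{w}x{h}: {mb:4.1f} MB, >= {mb * 60 / 1024:.2f} GB/s at 60 fps")

# 1024x600 touches roughly 30% of the pixels of 1920x1080, so the
# saved memory and bandwidth can be spent on detail instead.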

In short, CoD4 uses a lower native resolution to help get the most detail out of the levels and character models. If it were a movie, it would be native 1080p, because there would only be frames to display, not graphics to create.

*I do admit screen resolution can be too low, but that would have to be something like 240p, as in 16-bit-era resolutions, not 600p.




@ Lord The Night Knight

Vector graphics have been with us since at least the early '80s.

Using a scaler does not necessarily make the picture clearer or more detailed. Scalers in their simplest form only increase the number of sample points for the original signal, producing more data points from the same original information. You can, however, add filtering and detail-enhancement algorithms on top, turning the scaler into a video processor.
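To illustrate the "more sample points, same information" part, here's one scanline upscaled 2x with linear interpolation in Python/NumPy; every new sample is just a weighted mix of its two original neighbours:

import numpy as np

line = np.array([0.0, 1.0, 0.0, 1.0])         # 4 original samples
xs_new = np.linspace(0, 3, 8)                  # 8 sample positions
upscaled = np.interp(xs_new, np.arange(4), line)
print(upscaled)  # 8 values, all derived from the original 4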

Although such modern chips can enhance image quality significantly (and not only by scaling), the results are no match for natively rendered graphics. Why else would you buy a PC with more processing power if adding a cheap scaler chip were sufficient? I haven't tried it myself, but compare Gears of War upscaled on a 1080p display with the PC version rendering natively at a similar resolution and see whether the latter looks significantly better. I expect it will; how much better will depend on the graphics assets.

God of War 2 looks great at 576p and will look good upscaled on a 1080p HDTV (HDTVs have their own scalers), but God of War 3 would certainly offer much better image quality on that same HDTV (probably at a native 1080p resolution).


