the 1080p people...


OdinHades said:
If resolution doesn't matter, we can keep playing at 256 x 240 pixels on our NES systems for all eternity. After all, it's gameplay that matters, right?


Somebody says something similar in every thread about resolution. Of course resolution matters! And of course 1080p is definitely better than 720p! The argument is that both 720p and 1080p are HD; 1080p is better, but to many people it's not as big a deal.

I was happy to have my PS3 and Xbox 360 upscale PS2 and Xbox games.  I was disappointed when the Wii U didn't upscale Wii games.  Sharper is better than the alternative. There comes a point where it doesn't matter, though.  My PC has a billion different resolution settings and I don't even know which one is best.  I just chose one that looked sharp enough for my TV and left it at that.  

1080p looks better, but when the game is actually in motion, does it matter? My TV is a 42-inch 1080p LED, active-3D capable, with picture-in-picture, smart apps, 5 HDMI ports (one on the side I just found), 4 USB ports, and 240 Hz, btw. I can't give you the refresh rate since I'm at work.



Twitter: @d21lewis  --I'll add you if you add me!!

drake4 said:
SandyVGina said:
I didn't expect 1080p from Nintendo, but I did from the Xbox One and PS4. Look, PC gamers have been bragging about 1080p for years. I expected Sony and Microsoft to put enough power into their consoles so most games could be 1080p. It looks like Sony did, and it looks like MS got cheap and did not.

Microsoft didn't go cheap, though; their console costs more to make than the PS4. They just made horrible decisions by focusing on making an all-in-one entertainment device. They decided early that they wanted 8 gigs of DDR3 no matter how badly it would affect the console's performance. Sony was originally just going to have 2-4 gigs of fast RAM, because of how expensive it was, and their main focus was on making the best gaming console; later they decided to just put in 8, because the price had dropped significantly by the time the console was in its final planning stages.


Neither Microsoft, Sony, nor Nintendo will release exact manufacturing costs, and they are not obligated to do so. Always take what they state with a pinch of salt; statistics are easily manipulated, especially when no actual figures are given out.

Yes, Microsoft went cheap. They ripped out some GPU compute units so they could fit the ESRAM onto the same silicon. They could easily have done two memory pools, with 8GB of main memory and perhaps 256MB/512MB of GDDR5, but instead they went for a low-cost approach. Most of the Kinect processing is done by the main CPU/GPU.

There is nothing really there to justify the retail price, but like any manufacturer, it's in their interest to sell at the maximum price. No point selling at $400 if it still sells well at $500.

All the spec information for the Xbox One shows very weak performance. We aren't talking Wii U weak, but it's still disappointing. Many don't see the Xbox One price as fair for the performance on offer. Like most consoles, though, the real sales come once it's gone through a few price reductions. I'm looking to buy at £200-250, which I think is a fair price for what is on offer. I think there is a strong chance this will be possible next year, as sales in mainland Europe have been quite poor, and without a price reduction I think Microsoft will simply hand Europe to Sony on a plate, with the possible exception of the UK. Even though I'm in the UK, it's easy to import from mainland Europe if consoles are cheaper there.



d21lewis said:
OdinHades said:
If resolution doesn't matter, we can keep playing at 256 x 240 pixels on our NES systems for all eternity. After all, it's gameplay that matters, right?


Somebody says something similar in every thread about resolution. Of course resolution matters! And of course 1080p is definitely better than 720p! The argument is that both 720p and 1080p are HD; 1080p is better, but to many people it's not as big a deal.

I was happy to have my PS3 and Xbox 360 upscale PS2 and Xbox games.  I was disappointed when the Wii U didn't upscale Wii games.  Sharper is better than the alternative. There comes a point where it doesn't matter, though.  My PC has a billion different resolution settings and I don't even know which one is best.  I just chose one that looked sharp enough for my TV and left it at that.  

1080p looks better, but when the game is actually in motion, does it matter? My TV is a 42-inch 1080p LED, active-3D capable, with picture-in-picture, smart apps, 5 HDMI ports (one on the side I just found), 4 USB ports, and 240 Hz, btw. I can't give you the refresh rate since I'm at work.

You just did.

Also, if you can tell the difference between PS2 and Xbox games before and after they were upscaled, the difference between 720p and 1080p should be like night and day to you.



d21lewis said:
OdinHades said:
If resolution doesn't matter, we can keep playing at 256 x 240 pixels on our NES systems for all eternity. After all, it's gameplay that matters, right?


Somebody says something similar in every thread about resolution. Of course resolution matters! And of course 1080p is definitely better than 720p! The argument is that both 720p and 1080p are HD; 1080p is better, but to many people it's not as big a deal.

I was happy to have my PS3 and Xbox 360 upscale PS2 and Xbox games.  I was disappointed when the Wii U didn't upscale Wii games.  Sharper is better than the alternative. There comes a point where it doesn't matter, though.  My PC has a billion different resolution settings and I don't even know which one is best.  I just chose one that looked sharp enough for my TV and left it at that.  

1080p looks better, but when the game is actually in motion, does it matter? My TV is a 42-inch 1080p LED, active-3D capable, with picture-in-picture, smart apps, 5 HDMI ports (one on the side I just found), 4 USB ports, and 240 Hz, btw. I can't give you the refresh rate since I'm at work.

Really? You don't notice a clear difference between choosing the native display resolution and anything else? If not, then there is something wrong with the TV (no RGB, native, or dot-by-dot mode, perhaps?). The input refresh rate is 60 Hz; TVs don't accept more. 240 Hz is motion interpolation, which adds lag and softness to moving objects.

The whole "720p is HD" line is an empty argument. HD doesn't mean much; it's a moving goalpost, just a marketing term. The first high-definition TVs had 405 scan lines...

The Wii U does upscale Wii games, the same way the backwards-compatible PS3 upscales PS2 games. It scales slightly differently than my HD TV does with the Wii; I get a different amount of overscan through the Wii U. It doesn't render at a higher resolution, and neither does a BC PS3 playing PS2 discs.

Resolution matters more when in motion. Sub-pixel detail and aliasing artifacts become less of a problem the higher the render resolution. Resolution limits the amount of detail a game can have. We're still stuck with glowing items to make them stand out in low-res, heavily anti-aliased 3D worlds.



I can't see the difference on my 42" TV.
But from 480p to 720p the difference is huge!
I don't plan to change this TV for a long time, though, so 720p is good enough for me for this gen.



MohammadBadir said:

[image: 1080° Snowboarding N64 box art]

even N64 had 1080p, get it together Xbone!


That clearly says 1080o.



HeisenbergRules said:
Not sure the point of this thread... No one says 1080p is more important than gameplay, but why can't we have both? It's 2013; 1080p is pretty standard. Given a choice between the same game at 720p and at 1080p, any objective person would rather have 1080p; it is a very noticeable upgrade. It's not like 1080p is going to change the gameplay lol.

yeah?

And a lot of people would also say... Wii U games suck because they don't run at 1080p... Where were you when they announced that most games on Xbox One do not run at 1080p? Yeah, it is standard? If it is standard, then how come most games being released do not run at 1080p...

 

stand·ard /ˈstandərd/
adjective: used or accepted as normal or average

It is getting there, but it is still not a standard... maybe to the PS4, and the PS4 is only a part of the gaming world...


 

elazz said:
When I put the settings on my PC to 900p or 1080p, I already see a huge difference, let alone 720p to 1080p. Even on a big television you clearly feel the difference. If you play CoD: Ghosts for a while on Xbox One and afterwards play on PS4, the difference is immense, and I actually think this factor alone can define your purchase.

Your PC monitor doesn't have a scaler, and 900p does not upscale to 1080p properly, hence why it looks like ass. A good scaler is needed to make 900p or 720p map cleanly onto the pixels of a TV that has 1080 lines.

The truth of the matter is that 720p vs 1080p is just one part of the story, especially when the image is upscaled with a good scaler like the ones these consoles have at their disposal. Most PC gamers know that 720p with copious amounts of AA and a higher frame rate kicks the crap out of standalone "1080p" without AA and with a lower frame rate, when it comes to visual fidelity.
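
To put numbers on why the scaler matters: here's a quick Python sketch (my own illustration, not from the post above) of the scale factors involved. Neither 720 nor 900 divides evenly into 1080 lines, so the scaler has to interpolate, and a bad one makes that interpolation obvious:

```python
# Scale factors from common render resolutions up to a 1080-line panel.
# A non-integer factor means source pixels can't map 1:1 onto panel
# pixels, so the scaler has to interpolate between them.
for lines in (720, 900, 1080):
    factor = 1080 / lines
    mapping = "1:1" if factor.is_integer() else "interpolated"
    print(f"{lines}p -> 1080 lines: x{factor:.2f} ({mapping})")
```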



It's not just about 1080p vs 720p, but about the number of "samples per pixel". Without enough samples you get aliasing, which makes things like thin geometry, fences, and certain texture details look really bad. When things pop in and out it looks bad at any distance. With 1080p vs 720p you're getting 2.25x the sampling rate.
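
The arithmetic behind that ratio, as a trivial sketch (just to make the numbers concrete):

```python
# Samples per frame at one sample per pixel.
px_1080p = 1920 * 1080  # 2,073,600 pixels
px_720p = 1280 * 720    #   921,600 pixels
print(px_1080p / px_720p)  # 2.25
```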

The problem with the Xbox One is that this isn't going to get much better, and things could get worse.  Some guys already mentioned that for more advanced games you're going to have the PS4 @ 720p and the XB1 below 720p, and I agree.

Here are some technical details to show what I mean.

To support high-res graphics and multisampling, high-end PC GPUs and the PS4 use GDDR5 memory. The Xbox One uses DDR3, like low-end PC GPUs. You may know that the XB1 has 32 MB of fast ESRAM to compensate for its slow main memory.

32MB is limiting for advanced high res graphics.

For example, to render 1080p with 4xMSAA you need 1920x1080x4 samples, with at least 8 bytes per sample (4-byte color, 4-byte depth), so 1920*1080*4*8 = 63.3 MB. That's double what the XB1 can fit in its ESRAM! So developers have to reduce resolution, reduce sampling quality to 2xMSAA, or do some "tiling", but tiling reduces performance. This is one reason you don't see much 1080p in any but the simplest XB1 games.
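
Here's that calculation as a small Python helper (a sketch of my own, using the minimal 4-byte color + 4-byte depth assumption above):

```python
MIB = 1024 * 1024

def render_target_mib(width, height, samples, bytes_per_sample):
    """Memory footprint of a multisampled render target, in MiB."""
    return width * height * samples * bytes_per_sample / MIB

# 1080p with 4xMSAA, 8 bytes per sample (4 B color + 4 B depth):
print(render_target_mib(1920, 1080, 4, 8))  # ~63.3 -- about 2x the 32 MiB ESRAM
```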

But wait, 8 bytes per sample is the minimum, for basic graphics. Advanced renderers like the one in Killzone: Shadow Fall use a 1080p G-buffer with 5 MRTs, RGBA16f + Z (see the Killzone postmortem), which works out to about 44 bytes per sample. That's 1920*1080*44 = 87 MB with no MSAA, or 174 MB with 2xMSAA, and... you get the idea. This doesn't even come close to fitting in the XB1's fast ESRAM.

The XB1's ESRAM can't fit Killzone: Shadow Fall's graphics even at 720p with no MSAA. (1280*720*44 = 38.7 MB > 32 MB)
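
Plugging the Killzone-style G-buffer numbers into the same helper (44 bytes per sample = 5 RGBA16f targets at 8 bytes each plus a 4-byte depth buffer, per the post's reading of the postmortem):

```python
# Reusing render_target_mib() from the sketch above.
print(render_target_mib(1920, 1080, 1, 44))  # ~87.0  -- 1080p, no MSAA
print(render_target_mib(1920, 1080, 2, 44))  # ~174.0 -- 1080p, 2xMSAA
print(render_target_mib(1280, 720, 1, 44))   # ~38.7  -- even 720p exceeds 32 MiB
```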

KZ:SF, a launch title on the PS4, is already using techniques the XB1 can't handle well even at 720p. Things will get worse as PC games adopt more advanced techniques: the PS4 will be able to do them, and the XB1 will not.




My 8th gen collection


At a certain point a higher resolution is required to see the extra detail. I found I couldn't tell the difference between Metro 2033 at High vs Ultra settings at 900p, but at 1080p the difference is obvious. 1080p at Medium > 720p at Ultra.

The PS4 is EASILY strong enough to run all games effectively in 1080p. If the game isn't in 1080p, then the company didn't put forth next-gen effort and I ain't buyin'!