
Forums - Gaming Discussion - These resolution wars...

I appreciate graphics. Fuck me right?




I love seeing graphics and technology continue to improve; the better the graphics, the more engaged I feel in the game world... Not necessarily talking about resolution... But shader quality, tessellation, etc... It's my field of study as well.. :)



1080p made some of the stuff that looked good in Tomb Raider for PS3/360 look fake on PS4. Some things are left unseen. It's like looking at a girl when you're drunk and thinking she's beautiful and then seeing her again when you're sober and seeing that she has fuzz on her lip.



Immersion is a big thing. Some of us play games to escape from day to day life and problems, so we would rather be fully immersed in the product.

I'd rather play a game at 1080p60 than a game at 720p30, because it's simply a better experience. The world feels more alive.
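For what it's worth, the raw numbers back this up. A quick sketch (the resolutions and frame rates are the standard ones implied by "1080p60" and "720p30") of how many pixels per second each mode actually pushes:

```python
# Rough pixel-throughput comparison: 1080p60 vs 720p30.
# Assumes full native renders at the stated resolutions and frame rates.

def pixels_per_second(width: int, height: int, fps: int) -> int:
    """Pixels rendered per second at a given resolution and frame rate."""
    return width * height * fps

p1080_60 = pixels_per_second(1920, 1080, 60)  # 124,416,000 pixels/sec
p720_30 = pixels_per_second(1280, 720, 30)    #  27,648,000 pixels/sec

print(p1080_60 / p720_30)  # 1080p60 pushes 4.5x the pixels of 720p30
```

So the GPU is doing 4.5 times the per-frame fill work, which is why the jump is so visible.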



                            

Jizz_Beard_thePirate said:

The thing with the resolution war is that the X1 costs $100 more than the PS4 but it's underperforming in the same games, so the resolution and frame rates are important because people that buy the X1 are paying $100 more and it is not being justified... Other than that, yeah, it's pretty useless overall

That would only be true if people had actually purchased the X1 because of resolution, but has anyone?

Ultimately people make purchases because of different games and an enhanced feature set (Kinect etc).



starcraft - Playing Games = FUN, Talking about Games = SERIOUS


The term "native resolution" has replaced "polygons per second", which replaced "bits / blast processing" in the console wars. New weapons, same shit.




It still matters. Resolution still matters.

That said, it's not the only thing that matters, or even the most important thing, but the cold, hard reality is that as processing power increases, as technology improves, resolutions increase accordingly.

That is a law of technology, not an opinion or even a theory.

Now I'm going to point out that focusing solely on technology is like focusing solely upon horsepower in, say, automobiles (although raw processing performance would be a better analogy). Better application of available power, or more efficient use of it, can result in better real-world performance. This is proven, not opinion. By the same token, poor or inefficient use of more raw power can result in mediocre real-world performance relative to that power.

That said, regardless of what you're working with power-wise, those real world performance numbers are a reflection of how efficiently available hardware resources were utilized.

If you're working with really low power, then sure; lower res or lower frame rates, possibly both are just a reality of the limitations of the hardware. Nothing to complain about; there's a performance ceiling for everything.

Do higher resolution games look sharper than lower resolution games? Yes. Try not to sit 100 feet back from a display or squint to make everything look equally low quality.
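There is real math behind the "sit 100 feet back" jab. A rough sketch, assuming the common rule-of-thumb figure of about 1 arcminute for normal visual acuity and an illustrative 50-inch 16:9 panel (both numbers are assumptions, not anything from this thread): past a certain viewing distance the eye can no longer resolve individual pixels, so extra resolution stops being visible.

```python
import math

# Estimate the distance beyond which individual pixels blur together,
# assuming ~1 arcminute of visual acuity (a common rule-of-thumb figure).
ACUITY_RAD = math.radians(1 / 60)  # 1 arcminute in radians

def max_resolvable_distance_m(diagonal_in: float, h_pixels: int,
                              aspect: float = 16 / 9) -> float:
    """Distance (meters) at which one pixel subtends ~1 arcminute."""
    width_in = diagonal_in * aspect / math.hypot(aspect, 1)  # panel width
    pitch_m = (width_in / h_pixels) * 0.0254                 # pixel pitch in meters
    return pitch_m / math.tan(ACUITY_RAD)

# For a 50-inch 1080p display this lands around 2 meters (~6.5 feet);
# a 720p panel of the same size stops resolving further back.
print(round(max_resolvable_distance_m(50, 1920), 2))
```

In other words: sit within a couple of meters of a 50-inch set and the 720p vs. 1080p difference is plainly visible; sit far enough back and the two really do look "equally low quality."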

Is it the only measure for performance? No. But really at this stage in video game entertainment history, certain resolutions should be a given, not a marketing bullet or point of sales.

1920x1080 native render at 60fps should not be a tall order, so this should be more of a mark against a given game that falls short, rather than a thumbs up every time we see it on a console.



Pointless post. It's like telling any real gamer "sales don't matter" when people are disappointed about game sales. Resolution matters to those who it matters to.

/thread



NightDragon83 said:
The term "native resolution" has replaced "polygons per second", which replaced "bits / blast processing" in the console wars. New weapons, same shit.

It's what specialized consumers (core audience) have latched onto as a point of focus for the last 2 console generations in terms of measuring hardware performance.

8bit vs. 16bit. 16bit vs. 32bit. After 64bit processing, it was no longer even a marketable "feature" by that point because it still boiled down to the games a given platform had, regardless of how it processed data.

Optical vs. cartridge. "Storage/format wars" where games were growing larger in terms of the amount of storage required to distribute them. Higher capacity, yet cheaper to produce (primarily matters to game publishers, not the gamers initially). Stopped being a point of issue by the 8th gen since all three consoles are using a high density violet wavelength optical format (BD) to distribute games. 

And of course Sony used their unorthodox, custom processors as a marketing tool. Emotion Engine, Reality Synthesizer, Cell Broadband Engine, etc., and focused on the potential of each to sell consumers on the future.

Now the focus is on performance. Which platform does the game the consumer wants to buy run better on? It's a much simpler question, really.

Of course, PC gamers have been obsessing over resolutions for decades, ever since VGA stopped being the standard. Frame rates and resolutions became the main reason to upgrade video cards beyond the basic "can I play it?" minimum requirement.



totally agreed.