NightDragon83 said:
The term "native resolution" has replaced "polygons per second", which replaced "bits / blast processing" in the console wars. New weapons, same shit.

It's what specialized consumers (the core audience) have latched onto as the point of focus for measuring hardware performance over the last two console generations.

8-bit vs. 16-bit. 16-bit vs. 32-bit. After 64-bit processing, bit count was no longer even a marketable "feature," because it still boiled down to the games a given platform had, regardless of how it processed data.

Optical vs. cartridge. The "storage/format wars" came as games grew larger in terms of the storage required to distribute them. Higher capacity, yet cheaper to produce (which primarily mattered to game publishers, not gamers, at least initially). It stopped being a point of contention by the 8th gen, since all three consoles distribute games on a high-density, violet-wavelength optical format (BD).

And of course Sony used their unorthodox, custom processors as a marketing tool. Emotion Engine, Reality Synthesizer, Cell Broadband Engine, etc., and focused on the potential of each to sell consumers on the future.

Now the focus is on performance. Which platform does the game the consumer wants to buy run better on? It's a much simpler question, really.

Of course, PC gamers have been obsessing over resolution for decades, ever since VGA stopped being the standard. Frame rates and resolutions became the main reasons to upgrade video cards beyond the basic "can I even play it?" minimum requirement.