
Consoles would benefit from running games at 720p instead of 1080p.

Imaginedvl said:
Raziel123 said:
Lower resolution would only have a very minor benefit in frame rate
30fps games would still be 30fps games
Being stuck at 720p last gen was enough. Consoles need to keep up with the times. It's bad enough that the Wii U doesn't do it; let the PS4 champion it.

You've got some solid numbers here, backed up by a lot of data! The flow of knowledge coming out of your post is simply crushing my mind... I feel like a little ignorant dwarf in the middle of super-tech-giants now!

Always good to read some very credible and knowledgeable people on the Internet, thank you for sharing.

Do you think that if you changed a 30fps game's rez from 1080p to 720p you'd get double the performance? The dude is right; you'd get like 10fps more, max.



Wright said:

FF13 at 480p? Does that mean that Wii could run FF13?


Why yes!



JazzB1987 said:


Night and day? rofl.


ET on Atari vs Killzone Shadowfall is night and day.


This is equivalent to 480p with AA. Looks good to me.

If a game has no AA at all then it definitely needs a better resolution, but AA helps a lot, and most games would be perfectly playable at, let's say, PAL resolution, 576p (Final Fantasy XIII on 360); just add AA and it's fine. But I prefer high resolution with a lower polycount and fewer effects, etc. (e.g. 3DS/Vita-level assets @ 1080p + AA).

But tbh the problem is not graphics, it's games. The games these days are so generic, unimaginative, and boring, or pseudo-artistic, and so lacking in soul and polish, that the best graphics in the world can't help. 90% of all games are simply bad these days.

I'd rather play a Secret of Mana/TimeSplitters etc. than FF13/Crysis 3.

This is generally a very misleading post. There's just so much misinformation here that I can't get into all of it.

AA isn't just a button press away from fixing everything. The more geometry in a scene, the more jaggies there will be. The lower the resolution, the more noticeable those jaggies will be. To get a 576p Crysis game looking as clean as it does running at 1080p, the amount of resources you would spend on just the AA would be more than you would have needed to simply raise the resolution.

Let's not forget that lower rez also means lower overall clarity. It's easy to look at 480p on a 4.5-5" screen and say it's OK, but just try blowing that up to 46" or 50" and it will become a horrible mess.

Besides, the whole point of AA is to eliminate jaggies, thus cleaning up the image. The funny thing is, the absolute best type of AA is something called supersampling. What it basically does (for a 1080p image, though you can apply this to any resolution) is have the GPU render the game internally at 4K (four times the pixels of 1080p) or even 8K, then downsample that render and output the downsampled image at 1080p. The whole point of this is that the higher you go with rez, the less you need AA. The problem is that it's also the most expensive type of AA.
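To make the supersampling idea concrete, here is a rough sketch in Python of just the downsampling step (not how any console actually does it): render at twice the width and height, then average each 2x2 block of high-res pixels down to one output pixel. The function name and the plain list-of-tuples image format are placeholders for illustration only.

def downsample_2x(hi_res, out_w, out_h):
    """Box-filter a (2*out_h) x (2*out_w) grid of (r, g, b) tuples down to out_w x out_h."""
    out = []
    for y in range(out_h):
        row = []
        for x in range(out_w):
            # Average the 2x2 block of supersampled pixels that maps to this output pixel.
            block = [hi_res[2 * y + dy][2 * x + dx] for dy in (0, 1) for dx in (0, 1)]
            row.append(tuple(sum(p[i] for p in block) // 4 for i in range(3)))
        out.append(row)
    return out

# Tiny example: a 2x2 "render" collapsing to a single output pixel.
print(downsample_2x([[(255, 0, 0), (0, 0, 0)], [(255, 0, 0), (0, 0, 0)]], 1, 1))
# [[(127, 0, 0)]]

The expensive part isn't this averaging; it's that the GPU had to shade four times as many pixels to produce the high-res buffer in the first place.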

TL;DR? Use a lower rez and you spend more on AA. Use a higher rez and you spend less on AA and improve the overall clarity of the image.



The idea of dropping resolution is horrible.

Image quality is the most important part of graphics IMO. High resolutions and good AA, nothing tops a clean image.



Intrinsic said:
freedquaker said:
The thing is, some people have 1080p screens while some only have 720p. Sometimes you have both but have to pick one... So a much better path would be to leave it up to us. We should be able to choose to run the game either at 1080p with 30fps and/or lower detail, or at 720p with 60fps and/or higher detail.

That won't be possible, because devs have to optimize their engines to run on the consoles or on their benchmark PC setup. Even on PC, while PC gamers can tinker with rez, framerate, and details all they want, those games were still optimized to run on a specific kind of hardware. For consoles, devs know that everyone "has" the same specific hardware, so they just focus on getting their game running the best it possibly can on that hardware. All "pre-balanced".

Having said that, running at a lower resolution has very little effect on framerate. And framerate really is a big deal. A really, really big deal. I'll put it this way: picture any game you can think of that you believe looks really, really good. Now imagine that, as good as that game looks, it is running at 30fps. To run the same game at 60fps, everything the CPU/GPU did to output that image at 30fps has to happen twice as fast. So if we assume that the bottleneck is the CPU/GPU and not bandwidth, you would literally need a CPU/GPU twice as powerful as whatever was running the game at 30fps.

To run a game at 30fps, every frame has to be completed in ~33ms. And for 60fps, every frame in ~16ms. So no, dropping resolution doesn't mean you can tell the entire CPU/GPU to do everything else twice as fast.
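To put numbers on that budget (simple arithmetic, nothing platform-specific):

# Frame-time budget: each frame must finish within 1/fps seconds.
for fps in (30, 60):
    print(f"{fps} fps -> {1000 / fps:.1f} ms per frame")
# 30 fps -> 33.3 ms per frame
# 60 fps -> 16.7 ms per frame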

Basic explanation to avoid it getting too long.


I understand, and technically there isn't much to disagree with here, but it still is possible to have some simple configuration options. We have seen many examples of this before. The thing is, it just takes more time and effort to optimize for more than one setting. It is almost like optimizing the game for two different platforms.





You heard it here, folks. Games that run at a lower resolution can't possibly hit higher framerates than if the same game were developed at 1080p. Nope. PC gamers can't change their resolution in-game or remove graphical effects to increase framerate. If the game runs at 1080p@10fps, nothing you can do will increase that fps, and according to one person, reducing the resolution will actually lower it.

For consoles, I'd assume the game is built to run at one setting and the resolution depends on your television or what you set it to, and that's why reducing resolution won't increase framerate. If the same dev were to build the game for 720p instead of 1080p, I'm sure they'd be able to get 60fps out of it. Or maybe all the devs who have stated that they'd have to decrease resolution to keep a stable framerate are just talking out of their asses? What do they know about developing games anyway?



kupomogli said:
You heard it here, folks. Games that run at a lower resolution can't possibly hit higher framerates than if the same game were developed at 1080p. Nope.


Cause that's totally what was said?

Again, the difference is not big enough. We're talking an average of 5-15 frames gained, depending on the game, which is why a 30fps game would remain a 30fps game, unless it was already running at ~50fps with a 30fps cap.

Infamous SS at 720p would not run at 60fps. Without the 30fps lock it averages about 35fps, and 720p wouldn't bring it above 45 or so. They'd have to further downgrade other things to bump it to 60.
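For what it's worth, here's a back-of-envelope way to see why the gain is limited. Assume, purely for illustration, that only some fraction of the frame time scales with pixel count and the rest (CPU work, fixed-cost passes, etc.) does not; the 40% figure below is made up to line up with the ~35fps/~45fps numbers above, not a measured value.

# Hypothetical estimate: frame time when only part of it scales with pixel count.
PIXELS_1080P = 1920 * 1080
PIXELS_720P = 1280 * 720

frame_ms_1080p = 1000 / 35        # ~35fps unlocked, per the post above
resolution_bound = 0.40           # assumed fraction of frame time tied to resolution

frame_ms_720p = frame_ms_1080p * (
    (1 - resolution_bound) + resolution_bound * PIXELS_720P / PIXELS_1080P
)
print(f"~{1000 / frame_ms_720p:.0f} fps at 720p")  # ~45 fps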



freedquaker said:


I understand, and technically there isn't much to disagree with here, but it still is possible to have some simple configuration options. We have seen many examples of this before. The thing is, it just takes more time and effort to optimize for more than one setting. It is almost like optimizing the game for two different platforms.

Well, you are actually right. And maybe I was a bit excessive when I said it's not possible. It's more that it's not a necessary or efficient use of time. Because you're also right that it's almost like optimizing the game for two different platforms, and the man-hours spent doing that could be better spent just optimizing the game some more to get the most out of the one platform.

On consoles, the only real way this can be done is by offering things like 2xAA, 4xAA, locked or unlocked framerates, and maybe a rez toggle. Everything I listed there, with the exception of framerate and rez, is a post-processing effect, and playing with the rez/framerate can give you gains on the AA side. That's basically all they can really do on consoles. On PCs, devs still optimize for a certain minimum spec they have in mind, but they unlock everything and let gamers set the game to run however well or badly they want.
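Something like this is roughly the whole surface a console options screen could expose; the names and values below are made up for illustration, not taken from any actual game:

from dataclasses import dataclass

@dataclass
class GraphicsOptions:
    msaa_samples: int = 2     # 2x or 4x AA
    framerate_cap: int = 30   # 30, 60, or 0 for unlocked
    use_720p: bool = False    # resolution toggle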

Can't do most of that on consoles because, well, if someone increased the texture rez on a console from, say, 2K to 4K, you would instantly need four times the amount of memory for those textures. That would just break the game.
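Quick math on the texture point, assuming a square uncompressed RGBA8 texture (4 bytes per texel) and ignoring mipmaps; real games use compressed formats, but the scaling is the same:

# Memory for a square RGBA8 texture, ignoring mipmaps and compression.
def texture_mib(side):
    return side * side * 4 / (1024 * 1024)

print(f"2048x2048: {texture_mib(2048):.0f} MiB")  # 16 MiB
print(f"4096x4096: {texture_mib(4096):.0f} MiB")  # 64 MiB -> 4x the memory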



Raziel123 said:
kupomogli said:
You heard it here, folks. Games that run at a lower resolution can't possibly hit higher framerates than if the same game were developed at 1080p. Nope.


Cause that's totally what was said?

Again, the difference is not big enough. We're talking an average of 5-15 frames gained, depending on the game, which is why a 30fps game would remain a 30fps game, unless it was already running at ~50fps with a 30fps cap.

Infamous SS at 720p would not run at 60fps. Without the 30fps lock it averages about 35fps, and 720p wouldn't bring it above 45 or so. They'd have to further downgrade other things to bump it to 60.

You cut out the portion where I said games on consoles aren't developed that way, just to make your case? Awesome.

Anyways, even if I don't know anything about game development, I know a thing or two about math. If you have 1080 pieces of something and only use 720 of them, you still have 360 pieces left available for use elsewhere. Add to that the many devs who have stated that they need to reduce resolution to get games running at a good framerate on PS4 and Xbox One, or take a look at games like Rage and Wipeout HD, which drop resolution mid-game in order to keep the framerate at 60fps.
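The actual pixel counts make the gap even bigger than the 1080-vs-720 line-count framing, since both width and height shrink:

pixels_1080p = 1920 * 1080   # 2,073,600
pixels_720p = 1280 * 720     #   921,600
print(f"1080p has {pixels_1080p / pixels_720p:.2f}x the pixels of 720p")  # 2.25x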

Here.  Read Digital Foundry's Rage analysis.

http://www.eurogamer.net/articles/digitalfoundry-rage-face-off



I am gonna say it... 80% of people, if not more, will not notice the difference between 720p and 1080p unless their TV is at least 36 inches. Yes, you guys can argue with me, telling me you see a difference because you think you see one, but you really don't. I'd be willing to bet that if you took something at 1080p and something at 720p and looked at them (without knowing which one was which), you'd be guessing which one was the higher resolution.