
Forums - Gaming - Would you be okay with graphics staying at PS4/XBO level if it meant cheaper games and shorter dev times?

 

Would you accept such a tradeoff?

Yes: 50 (79.37%)
No: 13 (20.63%)
Total: 63

I'd totally accept that. Games would not only be cheaper and faster to make, but a lot more performant as well. Especially with engines like Frostbite, that thing is a MONSTER.

Just look at Battlefield 1 or Star Wars Battlefront 2: amazing graphics, rivaling many of today's games... but they ran on 8GB of RAM and HDDs.




If it meant games having 3-4 year dev cycles and being $10+ cheaper... yeah, probably. PS4-era games still look fine and can approximate almost any kind of environment convincingly if you have a good art department.

There just isn't a game I've played this gen yet that makes me go "wow, I can't go back to a game that only looks like Horizon Forbidden West" or something, and I've tried all the best-looking games this gen: Senua's Saga, Indiana Jones, Avatar, Alan Wake II, Assassin's Creed Shadows, Star Wars Outlaws, FF7 Rebirth... I mean, yeah, it's nice, but it's not like "OMG! I can never go back now!". There's no game that's given me that feeling.



The complexity of writing games isn't just related to graphics, though; it's all the logical decisions made by the game engine itself, which depend on CPU performance. On PC you have games that require high CPU resources but low GPU resources at the minimum graphics level, and the reverse, where CPU demands are lower but GPU demands are higher in comparison.

I've been happy with graphics performance for quite a while, but I really like games where the world feels more alive, with more stuff going on. If you look up into the sky in Skyrim, you can see the moon slowly moving across it rather than being fixed in place; I think that adds a lot to the world. It's a tiny thing, I guess, but it makes the game more immersive.

If you asked me how I would like to see gaming develop, I'd say more CPU resources and more memory, with maybe less push for graphical fidelity, though that could still increase the cost of game development. I play some PC games at 720p and 4K on different computers, and yes, 4K is definitely nicer, but I don't care that much about the difference. However, I'm not as keen when a game engine is simplified to work on a platform with lower CPU resources and these finer details are lost.

You could have a game with amazing graphics that was easy to develop purely because the game engine was relatively simple; a classic 2D scrolling shoot 'em up, say, could have incredible 8K visuals and still be cheap to make. You could also have a game with 720p visuals and poor textures that was extremely expensive to develop because of a very complex 3D world with many things going on. The more complex the game world, the longer it takes to iron out the bugs, which makes it much more expensive to develop.

On PC you can run games at hugely varying graphics levels, from 640x480 with basic textures and missing graphical features all the way up to incredible 8K visuals, but if you don't have the CPU or memory to run the game engine, you ain't playing that game. You often get games where you can alter some files to get "potato" settings, which really lower the graphics but enable weaker PCs to run modern games. There's no option to change the game engine itself, though. Yes, some graphical features require CPU resources too, so disabling them simplifies the workload, but there's a core amount of CPU processing required to run the game at all, and you can't get around that.



PAOerfulone said:

Yes.
You don't need any higher than 4k in terms of resolution. I'm not going to lose sleep just because I can't see each individual pore on the character's face in a game and you shouldn't either.

I'd rather have a more solid, consistent frame rate. 90-120fps would be fantastic, but 30-60fps is still perfectly fine as long as it is rock solid and consistent with no drops whatsoever.

This would let them focus on improving OTHER areas of their consoles, like reducing load times for games and the online stores to the point where they're almost non-existent, and adding more internal storage. I would GLADLY take a less powerful PS5 or Switch 2 with 2-4 TB of internal storage and lightning-quick loading times. And in the case of Switch 2, longer, more efficient battery life.

Not sure I follow. You're okay with graphics not moving forward, but you want 4K? The PS5 rarely even gets close to 4K resolution unless it's running at 30 fps. Spider-Man 2, as an example, is 1080p to 1440p in performance mode. 4K/60 fps will require a PS6 with way more power, and even a PS6 won't do 4K/120 fps. My 4090 can't do that in most cases; I can get 1440p/120 fps in most games.



“Consoles are great… if you like paying extra for features PCs had in 2005.”

PS4/XBO level is a gigantic spectrum. I mean, it goes from simple-looking things like "most Switch 2 games" to gorgeous things like The Last of Us Part II or Red Dead Redemption 2.



I would love it, because games taking 7 years to make is way too long, and having a game run at a rock-solid 60fps at 4K would be better than trying to squeeze more detail into the 3D models and textures just to have it run at 45 fps and 1440p or whatever.
Pemalite said:

I am okay with older games having aging visuals.
I am not okay for new games having older visuals.

Xbox One/Playstation 4 was a terrible generation dominated by sub-1080P, sub-60fps games.

I don't think framerate and resolution are what Curl is talking about here. Basically, imagine if PS5/Series only ran games at PS4/XBOne levels of polycounts, lighting, and texture sizes, but actually hit 60 fps and 4k with everything easily. So basically we would be going back to how things were in the 16 bit generation where resolution and 60fps were standardized. Pretty much every 16 bit game back then ran at 60 fps and 240p. 

Edit: Also here's the biggest disconnect for me. Modern games are often made so that only somebody with two massive GPUs can actually run the game at max settings, resolution, and FPS. But that's such a teeny tiny portion of the videogame market that it's just not worth it. Why waste so much dev time just to impress the few that have a $5000 rig? It makes even less sense for console games. Why would you make a game for a console that can't run at 60 fps and 4K? It's not like the PS5 Pro suddenly takes all PS4 games and runs them at 60 fps and 4K. It's not like PS6 Pro will do the same for PS5 games. 

Last edited by Cerebralbore101 - on 16 January 2026

Cerebralbore101 said:

I don't think framerate and resolution are what Curl is talking about here. Basically, imagine if PS5/Series only ran games at PS4/XBOne levels of polycounts, lighting, and texture sizes, but actually hit 60 fps and 4k with everything easily. So basically we would be going back to how things were in the 16 bit generation where resolution and 60fps were standardized. Pretty much every 16 bit game back then ran at 60 fps and 240p. 

Framerate and resolution make up part of the visual presentation of games.




www.youtube.com/@Pemalite

Manlytears said:

PS4/XBO level is a gigantic spectrum. I mean, it goes from simple-looking things like "most Switch 2 games" to gorgeous things like The Last of Us Part II or Red Dead Redemption 2.

Not the OP, but in my head I imagine those as the upper limit, with lower quality than that being more common.

And I'm fine with just about every game looking like Last of Us Part II or worse. I just want pretty much every game to be 4K60FPS HDR and such. I don't need the craziest number of polygons, ray tracing, and biggest crowds. 

120 FPS should be available too for those with supported displays, but I don't always expect that in 4K.



Lifetime Sales Predictions 

Switch: 161 million (was 73 million, then 96 million, then 113 million, then 125 million, then 144 million, then 151 million, then 156 million)

PS5: 122 million (was 105 million, then 115 million) | Xbox Series X/S: 38 million (was 60 million, then 67 million, then 57 million, then 48 million, then 40 million)

Switch 2: 120 million (was 116 million)

PS4: 120 million (was 100 million, then 130 million, then 122 million) | Xbox One: 51 million (was 50 million, then 55 million)

3DS: 75.5 million (was 73 million, then 77 million)

"Let go your earthly tether, enter the void, empty and become wind." - Guru Laghima

Pemalite said:
Cerebralbore101 said:

I don't think framerate and resolution are what Curl is talking about here. Basically, imagine if PS5/Series only ran games at PS4/XBOne levels of polycounts, lighting, and texture sizes, but actually hit 60 fps and 4k with everything easily. So basically we would be going back to how things were in the 16 bit generation where resolution and 60fps were standardized. Pretty much every 16 bit game back then ran at 60 fps and 240p. 

Framerate and resolution make up part of the visual presentation of games.

So if a new game came out and ran at 120 fps and 1080p, but with polycounts from the 360 era, would you call that a modern graphical presentation?



Cerebralbore101 said: 

I don't think framerate and resolution are what Curl is talking about here.

Yeah, when I made the thread I was thinking more of things like model/texture quality, detail, effects, etc.