
Do you think developers should give console players more options akin to PC?

Not necessarily like PC with all those settings for player choice, but at least two or three pre-set, already-tested modes, like letting the player choose between better visuals at 30fps and lower graphical settings at 60fps. I know some games already give those options, but playing mostly on PC lately, I kinda find it wrong that on consoles people don't get to choose.

And for the argument that consoles are meant to be simpler, and therefore not having options is part of being simple: console gaming already got more complicated last decade, with patches, DLCs, enhanced consoles and mandatory installations.
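Roughly what I mean, as a sketch of the kind of pre-tested modes a game could ship with (the mode names and numbers here are made up for illustration, not from any real game):

    struct GraphicsMode { const char* name; int targetFps; int verticalRes; };

    // Two pre-tested modes the player can pick from in the menu.
    constexpr GraphicsMode kModes[] = {
        {"Quality",     30, 2160},  // better visuals, 30fps
        {"Performance", 60, 1440},  // lower settings, 60fps
    };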



invetedlotus123 said:

Not necessarily like PC with all those settings for player choice, but at least two or three pre-set, already-tested modes, like letting the player choose between better visuals at 30fps and lower graphical settings at 60fps. I know some games already give those options, but playing mostly on PC lately, I kinda find it wrong that on consoles people don't get to choose.

And for the argument that consoles are meant to be simpler, and therefore not having options is part of being simple: console gaming already got more complicated last decade, with patches, DLCs, enhanced consoles and mandatory installations.

So, what would you say speaks against a single button that reveals all the advanced options games usually have when they're not on consoles?



If you demand respect or gratitude for your volunteer work, you're doing volunteering wrong.

These settings are fairly meaningless when everyone has the same hardware. Developers can program the game to switch these modes on the fly during gameplay for the best experience.

On PC they mostly exist because automatic hardware detection traditionally would get things wrong with some drivers/cards.
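As a rough sketch, that on-the-fly switching can be as simple as scaling render resolution against a frame-time budget (this assumes a hypothetical engine hook that reports the last frame's GPU time; none of this is from a real engine):

    #include <algorithm>

    struct DynamicResScaler {
        double targetFrameMs = 16.6;  // frame budget for 60fps
        double renderScale   = 1.0;   // fraction of native resolution

        // Call once per frame with the measured GPU time of the last frame.
        void update(double gpuFrameMs) {
            if (gpuFrameMs > targetFrameMs * 1.05)       // over budget: drop resolution
                renderScale = std::max(0.5, renderScale - 0.05);
            else if (gpuFrameMs < targetFrameMs * 0.85)  // clear headroom: raise it
                renderScale = std::min(1.0, renderScale + 0.05);
        }
    };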



OneTime said:
These settings are fairly meaningless when everyone has the same hardware. Developers can program the game to switch these modes on the fly during gameplay for the best experience.

On PC they mostly exist because automatic hardware detection traditionally would get things wrong with some drivers/cards.

What bothers some people on console is having to stay with a lower framerate in order to have graphical fidelity, which I do think should be the player's choice.



If it fits within the vision of their game, then of course.

If a game is pushing the CPU, GPU, and I/O to the max to deliver the vision of the game, then I would prefer they stick to that vision.

For me it comes down to this: if the game can play the same at 1080p/120 or 4K/30, then that option should be available. If the game would require a reduction in core gameplay elements to reach a higher FPS, then absolutely not.

At the end of the day, consoles are a set platform that developers can count on to achieve the same performance across tens or hundreds of millions of devices. I want developers to deliver their best vision within the capabilities of that set spec. Whether that means 1080p, 1440p, 4K, or 8K, or 30, 60, or 120FPS, that is their call. I don't want them to compromise their game to meet specific resolutions or frame rates.



Stop hate, let others live the life they were given. Everyone has their problems, and no one should have to feel ashamed for the way they were born. Be proud of who you are, encourage others to be proud of themselves. Learn, research, absorb everything around you. Nothing is meaningless, a purpose is placed on everything no matter how you perceive it. Discover how to love, and share that love with everything that you encounter. Help make existence a beautiful thing.

Kevyn B Grams
10/03/2010 

KBG29 on PSN&XBL


Graphical effects like film grain and motion blur should be toggleable. Other options would be nice but not necessary.

For games using RT in the future, I'd like to see an option to toggle it on/off if that allows for more performance.

Last edited by hinch - on 20 July 2020

No. I don't expect it in movies either; why can't games reflect the artistic vision the director is going for? Concentrate on optimizing the intended vision.



OneTime said:
These settings are fairly meaningless when everyone has the same hardware. Developers can program the game to switch these modes on the fly during gameplay for the best experience.

On PC they mostly exist because automatic hardware detection traditionally would get things wrong with some drivers/cards.

Not quite.

For instance, I don't need 4K or ray tracing, but 60FPS is a must. For some other person 30FPS may be enough, but for them it has to be 4K. And there are dozens of such possible scenarios.

It doesn't have to be nearly as detailed as it is on PC since, like you said, the hardware is the same. But you should be able to set what matters most to you and what's way down on your priorities in the settings, and the game then handles the rest from there.
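As a sketch, that priority idea could be as simple as mapping one stated preference to a pre-tested preset (the priority names and preset numbers are made up for illustration):

    enum class PlayerPriority { FrameRate, Resolution, RayTracing };

    struct Preset { int targetFps; int verticalRes; bool rayTracing; };

    // Map the player's single stated priority to a pre-tested preset.
    Preset resolvePreset(PlayerPriority p) {
        switch (p) {
            case PlayerPriority::FrameRate:  return {60, 1440, false};
            case PlayerPriority::Resolution: return {30, 2160, false};
            case PlayerPriority::RayTracing: return {30, 1440, true};
        }
        return {30, 2160, false};  // fallback, not reached
    }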



It would be cool if they did, as I prefer 60fps over some graphical effects.




PC Specs: CPU: 7800X3D || GPU: Strix 4090 || RAM: 32GB DDR5 6000 || Main SSD: WD 2TB SN850

More freedom when it comes to resolution, framerate and other graphical settings sounds good to me. There have been occasions I can vaguely recall where I was playing a console game and would have liked to disable a visual enhancement I disliked in favor of a performance boost, but was unable to do so.