sc94597 said:
Game developers know that most console players prefer prettier visuals over higher frame rates, so they push image quality and set a 30fps target to hit it, only adding performance modes later. I am almost certain that somebody will get Avowed to run at 60fps on a Ryzen 5 3600 (a CPU weaker than the Series X's) without over-utilizing that CPU (pegging any core at 100%), simply by pairing it with the right GPU and dialing in reasonable graphics settings. When Starfield originally released on the Series X it was limited to 30fps, and people argued that was because the Series X's CPU was bottlenecking the title. Now it has a 60fps performance mode. If it really were a CPU bottleneck, we wouldn't expect the game's performance to scale so well with internal resolution changes.
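To make that resolution-scaling point concrete, here is a toy sketch (all numbers invented for illustration, not measurements): a frame can't finish faster than its slowest stage, and lowering internal resolution only shrinks the GPU stage, so resolution drops help a lot when the GPU is the bottleneck and barely at all when the CPU is.

```python
# Toy model of the CPU- vs. GPU-bottleneck argument above.
# All timings are made up for illustration.

def frame_rate(cpu_ms, gpu_ms_at_native, resolution_scale):
    """FPS given per-frame CPU time (roughly fixed regardless of resolution)
    and GPU time (scales with the fraction of native pixels rendered)."""
    gpu_ms = gpu_ms_at_native * resolution_scale  # fewer pixels -> less GPU work
    frame_ms = max(cpu_ms, gpu_ms)                # the slower stage sets the pace
    return 1000.0 / frame_ms

# GPU-bound case: dropping to ~44% of native pixels (4K -> ~1440p) helps a lot.
print(frame_rate(cpu_ms=12, gpu_ms_at_native=33, resolution_scale=1.0))   # ~30 fps
print(frame_rate(cpu_ms=12, gpu_ms_at_native=33, resolution_scale=0.44))  # ~69 fps

# CPU-bound case: the same resolution drop barely moves the needle.
print(frame_rate(cpu_ms=30, gpu_ms_at_native=33, resolution_scale=1.0))   # ~30 fps
print(frame_rate(cpu_ms=30, gpu_ms_at_native=33, resolution_scale=0.44))  # ~33 fps
```

So the fact that Starfield's performance mode works by cutting internal resolution is itself evidence the original 30fps cap was a GPU budget choice, not a hard CPU limit.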
Makes sense. I do know my 3050, my first attempt at PC gaming, was an absolute dog. I was not impressed, despite some of the metrics looking good on paper. I think it was the 6 GB version with 16 GB of system RAM. It ran Halo Infinite shockingly poorly.
Thanks for taking the time to post this information. I'm always looking to learn.