
Starfield will be 30 fps on Xbox Series X and S

Otter said:

The game launched in an awful state; what use are all these performance modes if not one of them is well optimised? The monthly patches didn't get the game into a decent state until about a year after launch, so instead of worrying about increasing the number of modes and optimising for all of them, prioritising one makes more sense if performance is a concern. For that reason, using Cyberpunk as your example is counterproductive to your argument.

1. Given the performance profile of the Xbox version is currently 30fps with drops, they likely do not have a full config for how it could maintain a solid 45fps throughout the experience on said hardware; 45 has likely never been a target because it is not a desirable frame rate. They're not going to whack a low-end PC profile onto Xbox Series X & S and call it a day without a thorough round of retesting.

We've already seen with plenty of games that require work to reach higher FPS that these are not next-day patches. A Plague Tale: Requiem launched with a 40fps mode but took an extra six months to get up to 60fps. The logic you're working with is that the game is easily scalable for low-end CPUs, when its being capped at 30 with drops below that suggests otherwise.

2. Early post-launch content can be approaching finalisation before a game even ships, but even in scenarios where it is not, testing is continuous throughout development, and post-launch fixes for this kind of game will be a year-long endeavour. Again, it feels like you're not getting the point: it's not that it's impossible to do, it's simply not something they will put high up on the list. It took Guerrilla Games a whole year to get around to fixing the AA in the 60fps mode in Horizon. There are often vital patches that still haven't been addressed months after launch.

3. The length and complexity of game interactions absolutely impact the amount of QA needed, which is what additional performance modes compete against for resources.

Especially for 1st party titles, console performance profiles are often bespoke, and there are not always 1:1 matches with available PC settings.

Lastly, a single modder unlocking a frame rate and calling it a day is not the same due diligence a developer will take before putting a patch out on millions of systems. It harkens back to Sony originally underplaying PS5's BC: for QA purposes they take a formalised route, which is where the man-hours come in. And again, configs for PC aren't going to just get thrown onto Xbox and ticked off as a new mode.

I would bet money on it not happening anytime soon after launch, and the reason will be exactly as I'm saying: lack of priority.

About a year for the game to be playable is much better than a few years after release when the next edition comes out (the typical scenario for Bethesda games). CD Projekt handled this much better than Bethesda has with any of their games.

Note: I never said a VRR update would be soon, all I said was that they should release a performance update to add a VRR mode. John Linneman (from Digital Foundry) also expressed this sentiment.

Your idea that I implied it should happen sooner rather than later was totally imagined. 

1. Every game developer does performance scaling tests of their game throughout the development process. They might not have a profile that can achieve 45 fps 99% of the time at the current moment, but they'd know what it would take to get to that point. And of course they would've tested the PC version. 

This game is both CPU and GPU intensive, from what we've seen. Current-gen consoles also have far better CPUs (in relative terms) than the 8th generation did.

Again, I never said this would be a next-day patch or even something that comes very soon. I expect it to take months, because it isn't a priority. That's not the same thing as thousands of hours of FTE work.

2. Yes, I'd expect such a patch to be 6-12 months out. Where did I ever say otherwise? 

3. Right, I never meant to imply that they would just throw a PC config in. Obviously there is more due diligence required than that. BUT: if the target is a variable one (45-60fps), and you've already done a lot of performance testing (which you would have before release), then it isn't something that requires "thousands of hours" of FTE work. You don't have to test the whole scope of the game either. You can apply an 80-20 or 90-10 rule, where before the first release you test the areas of the game where players are most likely to be, and then iteratively move to the other locations with future patches.



haxxiy said:

I honestly think the deal here is that MS wants graphics to be a selling point of this game and knows it will not look anywhere near as nice at 60 fps (maybe it already has issues at 30 fps). And since most people aren't using HDMI 2.1 or 120 Hz displays for VRR at lower framerates, the mode doesn't even exist because MS wants no bitching about stuttering from clueless people enabling it when they shouldn't.

That makes sense. Typically my null hypothesis would be that MS wants as many games as possible to support VRR on their platform, so it's a selling point they can use to compete with Sony when selling the console and getting players to purchase games on their platform rather than the other. But profits come from game sales, and they can get sales from PC players just as easily as from Xbox owners, making feature-based competition at this level less of a priority.



For a game like this, 30fps makes perfect sense.

What people often forget is that running at 60fps means sacrificing more than just graphics; it also halves your budget for physics, AI, animation, simulation, all that stuff running under the hood that you don't necessarily see but which facilitates the gameplay experience.
Starfield is a massive and complex open world game that tracks object permanence and dynamic systems across a world of interstellar scale.

No matter how powerful hardware gets, you will always be able to process twice as much per frame at 30fps as you can at 60fps, and that goes for gameplay systems as well as visuals. Expecting every single game to be 60 simply isn't realistic.
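
To put rough numbers on that (a back-of-the-envelope sketch, not figures from any developer): doubling the target frame rate halves the wall-clock window that every per-frame system has to share.

```cpp
#include <cstdio>

int main() {
    // Per-frame time budget shared by rendering, physics, AI, scripts, etc.
    const double budget30 = 1000.0 / 30.0; // ~33.3 ms per frame at 30fps
    const double budget60 = 1000.0 / 60.0; // ~16.7 ms per frame at 60fps
    std::printf("30fps: %.1f ms/frame, 60fps: %.1f ms/frame\n",
                budget30, budget60);
    return 0;
}
```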



curl-6 said:

For a game like this, 30fps makes perfect sense.

What people often forget is that running at 60fps means sacrificing more than just graphics; it also halves your budget for physics, AI, animation, simulation, all that stuff running under the hood that you don't necessarily see but which facilitates the gameplay experience.
Starfield is a massive and complex open world game that tracks object permanence and dynamic systems across a world of interstellar scale.

No matter how powerful hardware gets, you will always be able to process twice as much per frame at 30fps as you can at 60fps, and that goes for gameplay systems as well as visuals. Expecting every single game to be 60 simply isn't realistic.

It's Bethesda; they've never been known for releasing 60fps games, or even locked 30fps games, from what I remember. I think the majority of games not running at 60fps will be because of graphics. Ray tracing is so heavy on current-gen consoles that only a few games can run at 60fps with it on. Look at Elden Ring for example: the game dips a lot in performance mode, but play the PS4 version on PS5 and it's a locked 60fps.



zeldaring said:
curl-6 said:

For a game like this, 30fps makes perfect sense.

What people often forget is that running at 60fps means sacrificing more than just graphics; it also halves your budget for physics, AI, animation, simulation, all that stuff running under the hood that you don't necessarily see but which facilitates the gameplay experience.
Starfield is a massive and complex open world game that tracks object permanence and dynamic systems across a world of interstellar scale.

No matter how powerful hardware gets, you will always be able to process twice as much per frame at 30fps as you can at 60fps, and that goes for gameplay systems as well as visuals. Expecting every single game to be 60 simply isn't realistic.

It's Bethesda; they've never been known for releasing 60fps games, or even locked 30fps games, from what I remember. I think the majority of games not running at 60fps will be because of graphics. Ray tracing is so heavy on current-gen consoles that only a few games can run at 60fps with it on. Look at Elden Ring for example: the game dips a lot in performance mode, but play the PS4 version on PS5 and it's a locked 60fps.

Different games will have different bottlenecks; some will be CPU bound, others GPU bound. There's a hard limit to what you can do at 60fps on console, and some devs will want to do more than that.



curl-6 said:
zeldaring said:

It's Bethesda; they've never been known for releasing 60fps games, or even locked 30fps games, from what I remember. I think the majority of games not running at 60fps will be because of graphics. Ray tracing is so heavy on current-gen consoles that only a few games can run at 60fps with it on. Look at Elden Ring for example: the game dips a lot in performance mode, but play the PS4 version on PS5 and it's a locked 60fps.

Different games will have different bottlenecks; some will be CPU bound, others GPU bound. There's a hard limit to what you can do at 60fps on console, and some devs will want to do more than that.

Yea, we'll see. I still haven't seen a world more advanced than Red Dead 2. It will be interesting to see how they use the massive jump in CPU power.

Last edited by zeldaring - on 16 June 2023

zeldaring said:
curl-6 said:

Different games will have different bottlenecks; some will be CPU bound, others GPU bound. There's a hard limit to what you can do at 60fps on console, and some devs will want to do more than that.

Yea, we'll see. I haven't seen a world more advanced than Red Dead 2. It will be interesting to see how they use the massive jump in CPU power.

Stuff like this is often down to developer effort more than raw power; Red Dead 2, after all, runs on the quite slow Jaguar cores of PS4/Xbone, the same way BOTW/TOTK run on the slow Wii U/Switch CPUs, but both go well beyond what you'd expect due to a lot of hard work by the devs.

But yeah, it will be cool to see devs push the limits of the way more powerful Zen 2 cores in PS5/XS.



curl-6 said:
zeldaring said:

Yea, we'll see. I haven't seen a world more advanced than Red Dead 2. It will be interesting to see how they use the massive jump in CPU power.

Stuff like this is often down to developer effort more than raw power; Red Dead 2, after all, runs on the quite slow Jaguar cores of PS4/Xbone, the same way BOTW/TOTK run on the slow Wii U/Switch CPUs, but both go well beyond what you'd expect due to a lot of hard work by the devs.

But yeah, it will be cool to see devs push the limits of the way more powerful Zen 2 cores in PS5/XS.

That's true. I just feel like if Red Dead 2 can still be one of the best looking games at 4K, and do all that on PS4 with a crappy CPU, then most games should offer a 60fps mode considering the jump in power. This game is probably the most ambitious game of all time, so it gets a pass from me.



curl-6 said:

No matter how powerful hardware gets, you will always be able to process twice as much per frame at 30fps as you can at 60fps, and that goes for gameplay systems as well as visuals. Expecting every single game to be 60 simply isn't realistic.

This isn't necessarily true. It really depends on how these simulations are implemented and how they must scale with respect to rendered frames.

Sometimes, for example, in order to maintain precision, a physics simulation might internally update more often than the frames rendered per second: even if the game is only rendering 30fps to the screen, you could have a simulation that updates 60 or 120 times per second, or even less often than the rendered frame rate if it doesn't need to be very precise. In that case you don't need to spend extra compute on said simulations just because more frames are rendered.
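
As a concrete illustration of that decoupling (a minimal sketch of the well-known fixed-timestep pattern, not Bethesda's actual loop; simulate() and render() are placeholder stubs):

```cpp
#include <chrono>

using Clock = std::chrono::steady_clock;

// Placeholder stubs: advance the simulation by a fixed dt, then draw a frame,
// optionally interpolating between the last two simulation states.
void simulate(double /*dt*/) {}
void render(double /*alpha*/) {}

int main() {
    const double dt = 1.0 / 120.0; // simulation ticks 120 times per second
    double accumulator = 0.0;
    auto previous = Clock::now();

    for (int frame = 0; frame < 1000; ++frame) { // render loop, bounded for the sketch
        auto now = Clock::now();
        accumulator += std::chrono::duration<double>(now - previous).count();
        previous = now;

        // Run however many fixed-size simulation steps the elapsed time covers;
        // the step count tracks wall-clock time, not the render frame rate.
        while (accumulator >= dt) {
            simulate(dt);
            accumulator -= dt;
        }

        render(accumulator / dt); // draw at whatever rate vsync/the GPU allows
    }
}
```

Whether the renderer hits 30 or 60fps, simulate() runs the same number of times per second, so the simulation cost doesn't automatically double with frame rate.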

There is also the matter that sometimes the platform is bottlenecked in other ways and core utilization isn't at 100% because of that. Even if it is a matter of the calculations scaling linearly with rendered frames, you might be able to utilize the cores better to make up for it. 

Where I'd imagine CPU bottlenecking occurring on current-gen platforms is in highly serialized workloads that need high single-core CPU performance: any piece of code or game process that depends on a bunch of branching if-else statements. Things like game AI fall into that category, but so does alternating between scripts and game modes.
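
For instance (a hypothetical sketch, nothing from Starfield), per-actor decision code like the following is dominated by data-dependent branches, so it leans on single-core throughput rather than on wide parallel hardware:

```cpp
enum class State { Idle, Patrol, Chase, Flee };

struct Npc {
    State state = State::Idle;
    float health = 100.0f;
    float distToPlayer = 50.0f;
};

// Branch-heavy, serialized decision logic: which work runs next depends on
// the outcome of earlier comparisons, leaving little for SIMD or extra
// cores to exploit within a single actor's update.
void updateAi(Npc& npc) {
    if (npc.health < 20.0f) {
        npc.state = State::Flee;          // survival overrides everything
    } else if (npc.distToPlayer < 10.0f) {
        npc.state = State::Chase;         // close enough to engage
    } else if (npc.state == State::Idle && npc.distToPlayer < 30.0f) {
        npc.state = State::Patrol;        // player nearby, start searching
    }
    // ...script callbacks, pathfinding queries, dialogue checks, etc.
}
```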

Physics simulations (which tend to be highly parallelizable and mostly limited by memory bandwidth), object permanence (which has existed since Morrowind), etc., shouldn't be big issues on an 8-core, 16-thread, relatively modern CPU -- unless the engine and company aren't doing very well at utilizing the cores and multi-threading, or at leveraging the GPU where it makes sense (e.g. for matrix multiplication) for compute rather than render loads. This is where I think Bethesda is probably struggling: not because their game is complex, but because their hands are tied by the technology they use for business reasons, and they likely aren't leveraging the hardware to best advantage.
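
To illustrate the parallelizable side (a minimal sketch assuming independent rigid bodies, written against standard C++17, not Creation Engine code): with one body per element, the integration loop below can be fanned out across every available core.

```cpp
#include <algorithm>
#include <execution>
#include <vector>

struct Body { float px, py, pz, vx, vy, vz; };

// Semi-implicit Euler step for every body. Each element is independent,
// so std::execution::par_unseq lets the runtime spread the loop across cores.
void stepPhysics(std::vector<Body>& bodies, float dt) {
    std::for_each(std::execution::par_unseq, bodies.begin(), bodies.end(),
                  [dt](Body& b) {
                      b.vy -= 9.81f * dt; // gravity
                      b.px += b.vx * dt;
                      b.py += b.vy * dt;
                      b.pz += b.vz * dt;
                  });
}
```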

A lot of this is speculation until we actually get to test the game. But my guess is that we'll see far less CPU-core scaling than we do with other titles released late last gen and this gen. And if that is the case, then it isn't a matter of them not being able to achieve 60fps but rather them having chosen not to.

Last edited by sc94597 - on 17 June 2023

sc94597 said:
curl-6 said:

No matter how powerful hardware gets, you will always be able to process twice as much per frame at 30fps as you can at 60fps, and that goes for gameplay systems as well as visuals. Expecting every single game to be 60 simply isn't realistic.

This isn't necessarily true. It really depends on how these simulations are implemented and how they must scale with respect to rendered frames.

Sometimes, for example, in order to maintain precision, a physics simulation might internally update more often than the frames rendered per second: even if the game is only rendering 30fps to the screen, you could have a simulation that updates 60 or 120 times per second, or even less often than the rendered frame rate if it doesn't need to be very precise. In that case you don't need to spend extra compute on said simulations just because more frames are rendered.

There is also the matter that sometimes the platform is bottlenecked in other ways and core utilization isn't at 100% because of that. Even if it is a matter of the calculations scaling linearly with rendered frames, you might be able to utilize the cores better to make up for it. 

Where I'd imagine CPU bottlenecking occurring on current-gen platforms is in highly serialized workloads that need high single-core CPU performance: any piece of code or game process that depends on a bunch of branching if-else statements. Things like game AI fall into that category, but so does alternating between scripts and game modes.

Physics simulations (which tend to be highly parallelizable and mostly limited by memory bandwidth), object permanence (which has existed since Morrowind), etc., shouldn't be big issues on an 8-core, 16-thread, relatively modern CPU -- unless the engine and company aren't doing very well at utilizing the cores and multi-threading, or at leveraging the GPU where it makes sense (e.g. for matrix multiplication) for compute rather than render loads. This is where I think Bethesda is probably struggling: not because their game is complex, but because their hands are tied by the technology they use for business reasons, and they likely aren't leveraging the hardware to best advantage.

A lot of this is speculation until we actually get to test the game. But my guess is that we'll see far less CPU-core scaling than we do with other titles released late last gen and this gen. And if that is the case, then it isn't a matter of them not being able to achieve 60fps but rather them choosing not to.

Things like tracking a massive number of dynamic objects don't have a fixed cost; comparing it to Morrowind would be like saying reflections can't be demanding in modern games because some N64 games had them. These things scale.

PS5/Xbox Series are not infinitely powerful, their CPU/GPUs are not made of magic pixie dust, they offer a limited amount of power and some devs will want to do more with their games than current consoles can handle at 60fps.

Last edited by curl-6 - on 17 June 2023