
Forums - Microsoft Discussion - Hellblade II is 30fps on Xbox Series consoles

EpicRandy said:
Hardstuck-Platinum said:

No, it's definitely CPU-related. Since the GPU in the XBSX is roughly 3x more powerful than the one in the XBSS, any game that's GPU-limited to 30 on the S would hit 60 no problem on the X. The problem is, they have essentially the same CPU, so any game that's CPU-limited to 30 on the S will also be CPU-limited to 30 on the X. MS should have designed the X with a more powerful CPU to avoid this 30fps problem.

I agree this is most likely a CPU bottleneck, but Microsoft simply could not have made the CPU on the Series X much stronger than it is.

First, because this would have created far more issues than it solved. Keeping the two CPUs very close ensures that games are identical feature-wise and that the divergence happens almost exclusively in presentation. There is some leeway, though: when rendering scenes at lower resolution with simplified assets, the CPU is slightly less loaded (fewer draw calls would be expected, for example), and that is exactly what we see between the Series S and X, where the X's CPU is about 8% faster.
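The CPU-bottleneck point in this exchange can be sketched with a toy frame-time model. All numbers below are hypothetical, not measured Hellblade II timings; the idea is just that the frame is gated by whichever unit finishes last, so a 3x GPU cannot lift a CPU-limited frame rate.

```python
# Toy frame-time model: a frame ships when both CPU and GPU work are done,
# so the effective rate is gated by the slower of the two per frame.
# All millisecond figures are illustrative, not real console timings.

def fps(cpu_ms: float, gpu_ms: float) -> float:
    """Frames per second when CPU and GPU work overlap each frame."""
    return 1000.0 / max(cpu_ms, gpu_ms)

# Hypothetical CPU-limited scene: 33 ms of CPU work per frame.
series_s = fps(cpu_ms=33.0, gpu_ms=30.0)  # GPU nearly saturated too
series_x = fps(cpu_ms=33.0, gpu_ms=10.0)  # ~3x the GPU headroom

# Both consoles land at ~30 fps: the extra GPU power changes nothing
# while the CPU is the long pole.
print(round(series_s), round(series_x))
```

If the scene were GPU-limited instead (say 10 ms CPU, 30 ms GPU), the same model gives ~33 fps on the weaker GPU and ~100 fps on the stronger one, which is the asymmetry the quoted post describes.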

Secondly, the CPU was the best AMD could offer in 2020 for an 8-core part when designing a ~200W max system while avoiding chip-yield issues. Going beyond it would have meant either:

a) Going to 12 or 16 cores.

This would only have given mixed results, especially with 3rd-party titles, since a game must be written to take advantage of extra cores, and not every workload is easy to parallelize. It would also have increased die size significantly, reduced yields, and raised power requirements, leading to a costlier, louder, even bulkier system.

b) Increasing Frequency

There is some headroom, but only marginal: roughly +20% in the best case. And it would either require a significant increase in power consumption, leading to a louder and/or bulkier system, or a more aggressive chip-binning approach, reducing yields and significantly increasing cost.
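The back-of-envelope arithmetic on option b) is worth spelling out: even granting the best-case +20% clock and (unrealistically) perfect scaling of frame rate with frequency, a CPU-bound 30fps game gets nowhere near 60.

```python
# Best-case frequency bump from the post above: +20%, assuming frame rate
# scales perfectly with clock speed (real games scale worse than this).
base_fps = 30
clock_gain = 1.20

boosted_fps = base_fps * clock_gain
print(boosted_fps)  # 36.0 fps: still far short of a 60 fps target
```

Hitting 60 from a CPU-bound 30 would require the frequency (or per-clock throughput) to roughly double, which is exactly the "seemingly impossible" ask discussed below.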

Yeah, it's a seemingly impossible technical challenge to give the XBSX a CPU twice as capable as the one in the S, I know. But if that wasn't possible, they shouldn't have made an XBSX at all. XBSXs are very expensive to make, and MS might still be losing money on every one sold, especially when they're sold at a discount like last Christmas. If it can't run games at a better frame rate than the budget S, its perceived value is going to plummet toward a price closer to the S's, and that is a major problem because it is A LOT more expensive to make than the S.




They will likely have a 60 fps mode down the line when it releases on PS5.



I'm loving all the 40FPS sentiment.

It's a great middle ground between performance and resolution. It's much easier for my eyes to adjust going from 60 to 40, lol. 60 to 30 feels like going from watching a movie to watching a slideshow.



Aside from the drama of some people trotting out the "30 fps is cinematic" take once again, gaming is more than feasible and watchable at 30 fps.
More so since the game is probably not aiming to be a deep, frenetic action romp that demands top-notch reflexes.




30 fps is barely tolerable for me. I'd much rather have 60 fps at a lower resolution. After PC gaming at 80 to 120 fps, going back to 30 is very jarring, regardless of genre.



PotentHerbs said:

I'm loving all the 40FPS sentiment.

It's a great middle ground between performance and resolution. It's much easier for my eyes to adjust going from 60 to 40, lol. 60 to 30 feels like going from watching a movie to watching a slideshow.

Seriously, 120 Hz TVs were a big part of the "next gen" console marketing in 2020, and on those TVs it makes sense for demanding games to target 40 fps instead of 30, which is indeed much better for responsiveness and easier on the eyes.



Chrkeller said:

30 fps is barely tolerable for me. I'd much rather have 60 fps at a lower resolution. After PC gaming at 80 to 120 fps, going back to 30 is very jarring, regardless of genre.

It generally is. Remember going from 100+ Hz CRT monitors to 60 Hz LCDs back in the day? It was almost as bad.

Yet for generations, devs have opted for 30fps to push as much visual bling as possible in some games, and for some games that's OK; Hellblade is an example. Of course, there's barely any game that wouldn't benefit from a higher frame rate, but this was hardly unexpected after generations of devs doing exactly that on consoles.




HoloDust said:
Chrkeller said:

30 fps is barely tolerable for me. I'd much rather have 60 fps at a lower resolution. After PC gaming at 80 to 120 fps, going back to 30 is very jarring, regardless of genre.

It generally is. Remember going from 100+ Hz CRT monitors to 60 Hz LCDs back in the day? It was almost as bad.

Yet for generations, devs have opted for 30fps to push as much visual bling as possible in some games, and for some games that's OK; Hellblade is an example. Of course, there's barely any game that wouldn't benefit from a higher frame rate, but this was hardly unexpected after generations of devs doing exactly that on consoles.


I think that is why PC is appealing. I drop resolution before fps. Even slower games benefit from 60 fps when panning the camera. 40 fps at 120 Hz is a superb compromise.



Chrkeller said:

30 fps is barely tolerable for me. I'd much rather have 60 fps at a lower resolution. After PC gaming at 80 to 120 fps, going back to 30 is very jarring, regardless of genre.

If you're gaming on PC you won't have a problem. 



Radek said:
PotentHerbs said:

I'm loving all the 40FPS sentiment.

It's a great middle ground between performance and resolution. It's much easier for my eyes to adjust going from 60 to 40, lol. 60 to 30 feels like going from watching a movie to watching a slideshow.

Seriously, 120 Hz TVs were a big part of the "next gen" console marketing in 2020, and on those TVs it makes sense for demanding games to target 40 fps instead of 30, which is indeed much better for responsiveness and easier on the eyes.

I don't think the general public is aware of this. Even as a core gamer, when I bought my TV in 2021, 120 Hz was not on my feature list; no AAA single-player game was ever going to reach that, and 40fps only came to my attention when Ratchet and Clank received its 40fps update. I think that was the first time it had been done for a console game, if I recall correctly. But yeah, I think only a very small portion of console gamers actually have a 120 Hz display. Hopefully next gen it can become the norm.

"Why Ratchet and Clank: Rift Apart's 40fps support is a potential game-changer"
https://www.eurogamer.net/digitalfoundry-2021-why-ratchet-and-clank-rift-aparts-40fps-fidelity-mode-is-a-potential-game-changer