
Hellblade II is 30fps on Xbox Series consoles

curl-6 said:
Radek said:

Yeah it is above 1080p, but ever so slightly. 1920x1080 = about 2.07 million pixels; 2304x964 = about 2.22 million pixels.

In terms of pixel count we could still call it a 1080p 30 fps game

2304x964 was the lowest pixel count they found; using a game's minimum res as if that's how it runs all the time is misleading.

They also found 2560x1070, which is well above 1080p in pixel count.

As for who wants amazing graphics if the resolution is PS4-level: I for one would take that any day over PS4 assets at a higher resolution. Lighting, effects, textures, detail, etc. matter more than pixel count in my opinion. Hellblade II, for instance, is easily the best-looking game I've seen on consoles.
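For reference, a quick back-of-the-envelope check of the pixel counts being quoted above (the resolutions are the figures from the DF analysis; this is just illustrative arithmetic, not anything from the video itself):

```python
# Pixel-count comparison of the resolutions discussed above.
resolutions = {
    "1080p (1920x1080)": (1920, 1080),
    "DF minimum (2304x964)": (2304, 964),
    "DF higher sample (2560x1070)": (2560, 1070),
}

baseline = 1920 * 1080  # ~2.07 million pixels

for label, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{label}: {pixels / 1e6:.2f}M pixels "
          f"({pixels / baseline:.0%} of 1080p)")

# 1080p (1920x1080): 2.07M pixels (100% of 1080p)
# DF minimum (2304x964): 2.22M pixels (107% of 1080p)
# DF higher sample (2560x1070): 2.74M pixels (132% of 1080p)
```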

Of course I'm going to use the lowest pixel count, since the game is using dynamic resolution. Reminder again that RDR2 is native (not dynamic) 3840x2160 on the 2017 Xbox One X.

1080p with TAA will always look soft on a 4K TV; it's just a fact, and Ollie from DF admits it in the analysis. If you played it on a 1080p TV it would look considerably sharper.

If they wanted to avoid controversy over the resolution and black bars, they would offer a reduced-settings 16:9 mode running at 1440p.



Chrkeller said:

Weren't these next-gen machines marketed as 4K? Kind of funny that we're doing calculations to see if the big new release is better than 1080p...

Did anyone really believe that? I mean, simple calculations based on what was available on the market around that time gave a pretty clear picture that they are not, not by a long shot.

If you look at Alan Wake 2 with path tracing, even a 4090 is barely above 30fps at native 4K, so even a hypothetical console with something similar inside it could barely be called a 4K console.



HoloDust said:
Chrkeller said:

Weren't these next-gen machines marketed as 4K? Kind of funny that we're doing calculations to see if the big new release is better than 1080p...

Did anyone really believe that? I mean, simple calculations based on what was available on the market around that time gave a pretty clear picture that they are not, not by a long shot.

If you look at Alan Wake 2 with path tracing, even a 4090 is barely above 30fps at native 4K, so even a hypothetical console with something similar inside it could barely be called a 4K console.

Wholly agreed. Which is why I giggle when new consoles are announced; the hype is always amusing. Rumors are even better.

I was hoping for most games to be native 1440p at 60 fps, or at least 40 fps on 120 Hz displays. Perhaps I'm too fps-driven, but 30 fps is terrible and needs to die.



Chrkeller said:
HoloDust said:

Did anyone really believe that? I mean, simple calculations based on what was available on the market around that time gave a pretty clear picture that they are not, not by a long shot.

If you look at Alan Wake 2 with path tracing, even a 4090 is barely above 30fps at native 4K, so even a hypothetical console with something similar inside it could barely be called a 4K console.

Wholly agreed. Which is why I giggle when new consoles are announced; the hype is always amusing. Rumors are even better.

I was hoping for most games to be native 1440p at 60 fps, or at least 40 fps on 120 Hz displays. Perhaps I'm too fps-driven, but 30 fps is terrible and needs to die.

While 30 fps will always be with us, an optional 40 fps mode for 120 Hz TVs should always be offered. Why is it that only Sony seems to care about 40 fps support on consoles?
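For anyone wondering why the 40 fps option depends on a 120 Hz display: a framerate only paces cleanly when each frame can be held for a whole number of refreshes, and in frame-time terms 40 fps sits exactly halfway between 30 and 60. A minimal sketch of that arithmetic (purely illustrative):

```python
# Frame times for common console targets.
for fps in (30, 40, 60):
    print(f"{fps} fps -> {1000 / fps:.1f} ms per frame")
# 30 fps -> 33.3 ms, 40 fps -> 25.0 ms, 60 fps -> 16.7 ms
# 25.0 ms is exactly halfway between 33.3 ms and 16.7 ms, so 40 fps
# is a bigger step up from 30 fps than the raw number suggests.

# Refresh pacing: a framerate only divides evenly into certain refresh rates.
for refresh in (60, 120):
    for fps in (30, 40, 60):
        repeats = refresh / fps
        pacing = "even" if repeats.is_integer() else "uneven (judder)"
        print(f"{fps} fps on {refresh} Hz: {repeats:g} refreshes per frame, {pacing}")
# 40 fps only paces evenly on 120 Hz (3 refreshes per frame), which is why
# the mode is gated behind 120 Hz displays.
```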



Radek said:
curl-6 said:

2304x964 was the lowest pixel count they found; using a game's minimum res as if that's how it runs all the time is misleading.

They also found 2560x1070, which is well above 1080p in pixel count.

As for who wants amazing graphics if the resolution is PS4-level: I for one would take that any day over PS4 assets at a higher resolution. Lighting, effects, textures, detail, etc. matter more than pixel count in my opinion. Hellblade II, for instance, is easily the best-looking game I've seen on consoles.

Of course I'm going to use the lowest pixel count, since the game is using dynamic resolution. Reminder again that RDR2 is native (not dynamic) 3840x2160 on the 2017 Xbox One X.

1080p with TAA will always look soft on a 4K TV; it's just a fact, and Ollie from DF admits it in the analysis. If you played it on a 1080p TV it would look considerably sharper.

If they wanted to avoid controversy over the resolution and black bars, they would offer a reduced-settings 16:9 mode running at 1440p.

Games with DRS do not run at their lowest resolution all the time; most only drop that low a minority of the time. And Hellblade II is a generation beyond RDR2 in rendering technology.

Depending on the kind of game they want to make, developers will make different choices as far as how to allocate a console's resources.

Ninja Theory clearly wanted to create the most richly detailed visual experience they could, hence their choice to prioritise stuff like character rendering and lighting over a high framerate or pixel count. 

This is not a failure on their part, it is a design choice. Reducing settings as you suggest may well have compromised their vision.
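As a side note on how dynamic resolution scaling generally behaves (a generic, hypothetical sketch; not Ninja Theory's or Unreal Engine's actual implementation): the renderer compares GPU frame time against its budget and nudges the render scale up or down within a fixed window, so the floor resolution only shows up in the heaviest scenes.

```python
# Hypothetical DRS controller, for illustration only. The 2560x1070 target
# and ~0.9 minimum scale loosely mirror the figures DF reported, but the
# control logic here is invented for the example.
TARGET_FRAME_MS = 33.3          # 30 fps budget
MIN_SCALE, MAX_SCALE = 0.90, 1.00
STEP = 0.02

def adjust_scale(scale, gpu_frame_ms):
    """Drop the render scale when over budget, recover it when comfortably under."""
    if gpu_frame_ms > TARGET_FRAME_MS:
        scale -= STEP
    elif gpu_frame_ms < TARGET_FRAME_MS * 0.9:
        scale += STEP
    return max(MIN_SCALE, min(MAX_SCALE, scale))

# Simulated GPU load over a few frames: only the heavy stretch pulls the
# resolution down, and it climbs back as soon as the load eases.
scale = 1.0
for gpu_ms in (30, 31, 36, 38, 34, 29, 28):
    scale = adjust_scale(scale, gpu_ms)
    print(f"GPU {gpu_ms} ms -> scale {scale:.2f} (~{int(2560 * scale)}x{int(1070 * scale)})")
```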



Chrkeller said:

Weren't these next-gen machines marketed as 4K? Kind of funny that we're doing calculations to see if the big new release is better than 1080p...

The hardware being 4K capable doesn't mean developers will always choose to target 4K over better lighting, details, textures, effects, etc.

PS3 was 1080p capable, and marketed as such, yet most games were 720p or less. 



curl-6 said:
Radek said:

Of course I'm going to use the lowest pixel count, since the game is using dynamic resolution. Reminder again that RDR2 is native (not dynamic) 3840x2160 on the 2017 Xbox One X.

1080p with TAA will always look soft on a 4K TV; it's just a fact, and Ollie from DF admits it in the analysis. If you played it on a 1080p TV it would look considerably sharper.

If they wanted to avoid controversy over the resolution and black bars, they would offer a reduced-settings 16:9 mode running at 1440p.

Games with DRS do not run at their lowest resolution all the time; most only drop that low a minority of the time. And Hellblade II is a generation beyond RDR2 in rendering technology.

Depending on the kind of game they want to make, developers will make different choices as far as how to allocate a console's resources.

Ninja Theory clearly wanted to create the most richly detailed visual experience they could, hence their choice to prioritise stuff like character rendering and lighting over a high framerate or pixel count. 

This is not a failure on their part, it is a design choice. Reducing settings as you suggest may well have compromised their vision.

You are entitled to your opinion, but so are others.  I personally think 1080p/30fps on modern hardware is laughable.  



Chrkeller said:
curl-6 said:

Games with DRS do not run at their lowest resolution all the time; most only drop that low a minority of the time. And Hellblade II is a generation beyond RDR2 in rendering technology.

Depending on the kind of game they want to make, developers will make different choices as far as how to allocate a console's resources.

Ninja Theory clearly wanted to create the most richly detailed visual experience they could, hence their choice to prioritise stuff like character rendering and lighting over a high framerate or pixel count. 

This is not a failure on their part, it is a design choice. Reducing settings as you suggest may well have compromised their vision.

You are entitled to your opinion, but so are others.  I personally think 1080p/30fps on modern hardware is laughable.  

Technically it's not 1080p. It's a dynamic 1440p with letterboxing.

You have every right to dislike it, but clearly many developers have decided that chasing high resolutions isn't their priority; tons of games on PS5 and Xbox Series run at, around, or below 1080p.

This isn't because the hardware isn't capable, it's because developers made a conscious decision to devote more resources towards things like lighting or details than towards raw pixel count.

Even if the consoles were twice as powerful as they are in reality, many devs would still choose to target lower resolutions and pour that extra power into things like Nanite, raytracing, etc.



curl-6 said:
Chrkeller said:

You are entitled to your opinion, but so are others.  I personally think 1080p/30fps on modern hardware is laughable.  

Technically it's not 1080p. It's a dynamic 1440p with letterboxing.

You have every right to dislike it, but clearly many developers have decided that chasing high resolutions isn't their priority; tons of games on PS5 and Xbox Series run at, around, or below 1080p.

This isn't because the hardware isn't capable, it's because developers made a conscious decision to devote more resources towards things like lighting or details than towards raw pixel count.

Even if the consoles were twice as powerful as they are in reality, many devs would still choose to target lower resolutions and pour that extra power into things like Nanite, raytracing, etc.

I fully understand all that. I just disagree with the direction some developers are taking. A higher framerate not only makes a game smoother but also reduces latency.

And overall I do believe consoles fell behind game engines very quickly this generation, more so than in previous generations.

RT on consoles is hot garbage, for example. UE5 is another.



Chrkeller said:
curl-6 said:

Technically it's not 1080p. It's a dynamic 1440p with letterboxing.

You have every right to dislike it, but clearly many developers have decided that chasing high resolutions isn't their priority; tons of games on PS5 and Xbox Series run at, around, or below 1080p.

This isn't because the hardware isn't capable, it's because developers made a conscious decision to devote more resources towards things like lighting or details than towards raw pixel count.

Even if the consoles were twice as powerful as they are in reality, many devs would still choose to target lower resolutions and pour that extra power into things like Nanite, raytracing, etc.

I fully understand all that. I just disagree with the direction some developers are taking. A higher framerate not only makes a game smoother but also reduces latency.

And overall I do believe consoles fell behind game engines very quickly this generation, more so than in previous generations.

RT on consoles is hot garbage, for example. UE5 is another.

Yeah raytracing on current consoles definitely feels undercooked and not quite there.

Hellblade II is honestly the first UE5 game that really blossoms on console in my opinion, and I suspect the reason is that 30fps gave them more breathing room. Attempts to mix Lumen/Nanite and 60fps on consoles thus far have resulted in pretty severe compromises, usually in resolution.