
Forums - Microsoft - Rumor: Xbox "Lockhart" specs leaked, is $300

goopy20 said:
DonFerrari said:

Actually since CPU would be the same we could even see something strange like 720p60fps on Series S for a 1440p30fps Series X title right?

Going from 720p to 1440p will use around twice the resources, whereas Series X is 3 times more powerful. So if you have a game running at the same fps and graphics settings on Series S and X, you would have to drop below 720p (probably 540p) if it's running at 1440p on Series X.

Going from 1280x720 to 2560x1440 will use around 4x the resources, not 2x the resources!

1280x720 x 30 fps = 27.6 million pixels per second

1920x1080 x 30 fps = 62.2 million pixels per second

2560x1440 x 30 fps = 110.6 million pixels per second

So if you have a game running at the same fps and graphics settings on Series S and X, you would have to drop below 1080p but stay above 720p if it's running at 1440p on Series X.

2560x1440 x 30 fps = 110.6 million pixels per second; one third of that = 36.9 million pixels per second

1600x900 x 30 fps = 43.2 million pixels per second

1422x800 x 30 fps = 34.1 million pixels per second

1280x720 x 40 fps = 36.9 million pixels per second

So dynamic 800p - 900p could be theoretically possible in this scenario. Or constant 900p with a bit of tweaking.

Or 720p with 40 fps on a 120 Hz display. Or 800p with 30 - 50 fps VRR...
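The throughput figures above are just width × height × fps; here is a quick Python check (the divide-by-3 is the rumored Series X / Series S power gap, not a confirmed spec):

```python
# Pixels per second for each of the modes discussed above.
modes = [(1280, 720, 30), (1920, 1080, 30), (2560, 1440, 30),
         (1600, 900, 30), (1422, 800, 30), (1280, 720, 40)]

for w, h, fps in modes:
    print(f"{w}x{h} x {fps} fps = {w * h * fps / 1e6:.1f} million pixels per second")

# One third of the 1440p/30 budget (rumored ~3x power gap):
print(f"one third of 1440p/30 = {2560 * 1440 * 30 / 3 / 1e6:.1f} million pixels per second")
```

Running it reproduces the numbers in the post: 27.6, 62.2, 110.6, 43.2, 34.1 and 36.9 million pixels per second, and 36.9 million for one third of the 1440p/30 budget.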

Last edited by Conina - on 21 March 2020

goopy20 said:
DonFerrari said:

Actually since CPU would be the same we could even see something strange like 720p60fps on Series S for a 1440p30fps Series X title right?

Going from 720p to 1440p will use around twice the resources, whereas Series X is 3 times more powerful. So if you have a game running at the same fps and graphics settings on Series S and X, you would have to drop below 720p (probably 540p) if it's running at 1440p on Series X.

Like I said, if Lockhart is real, all of MS's exclusives will be designed from the ground up so they can run at 1080p on Series S, and Series X will have the exact same games, only running at native 4K/60fps (and 120fps for the X1 cross-gen games). It's going to be very interesting to see how a 4K/60fps Xbox exclusive will compare to a PS5 exclusive that's running at 1440p/30fps. PS5 games will simply have more than double the resources left over by not going native 4K, all of which they can spend on physics, AI, world simulation and overall fidelity, while still running at a resolution that will look noticeably sharper than what most people are used to on the base current-gen consoles.

Again, a game aiming for 1440p and 30 fps would likely have exceptional visuals. These are scenarios where graphics settings should probably be lowered. Hence, I have nothing new to say in response.

Even if Series S doesn't exist, I don't think 1440p/30 fps is going to be a common target for MS. Nor will it be for Sony. You keep trying to create extreme scenarios to make Series S seem like a bad idea.

The PS5 can turn God of War into a 4K/60 fps game with maybe half of its GPU power. So God of War 2 doesn't need to be 1440p/30 fps to get a massive visual boost.

Ultimately, if the Series S can cost about $150-200 less and plays the same games, then its existence would be justified. A visual downgrade would be expected for the price disparity.

You seem to feel "physics, AI, world simulation" are important. The Series S could keep all that; it's just visual fidelity taking a hit.

I have an X1X, so I already play 4K content. A 1080p game can objectively look more impressive than 4K content. Hence, the quality of the pixels can matter more than the number.

So even if Series S is generally doing 720p-1080p in AAA games, that's fine, because it will still come with the advancements of 9th-gen game design. There will also be improved image reconstruction techniques, there can still be 4K UIs, and the games will still be a noticeable upgrade over 8th gen.



Recently Completed
River City: Rival Showdown
for 3DS (3/5) - River City: Tokyo Rumble for 3DS (4/5) - Zelda: BotW for Wii U (5/5) - Zelda: BotW for Switch (5/5) - Zelda: Link's Awakening for Switch (4/5) - Rage 2 for X1X (4/5) - Rage for 360 (3/5) - Streets of Rage 4 for X1/PC (4/5) - Gears 5 for X1X (5/5) - Mortal Kombat 11 for X1X (5/5) - Doom 64 for N64 (emulator) (3/5) - Crackdown 3 for X1S/X1X (4/5) - Infinity Blade III - for iPad 4 (3/5) - Infinity Blade II - for iPad 4 (4/5) - Infinity Blade - for iPad 4 (4/5) - Wolfenstein: The Old Blood for X1 (3/5) - Assassin's Creed: Origins for X1 (3/5) - Uncharted: Lost Legacy for PS4 (4/5) - EA UFC 3 for X1 (4/5) - Doom for X1 (4/5) - Titanfall 2 for X1 (4/5) - Super Mario 3D World for Wii U (4/5) - South Park: The Stick of Truth for X1 BC (4/5) - Call of Duty: WWII for X1 (4/5) -Wolfenstein II for X1 - (4/5) - Dead or Alive: Dimensions for 3DS (4/5) - Marvel vs Capcom: Infinite for X1 (3/5) - Halo Wars 2 for X1/PC (4/5) - Halo Wars: DE for X1 (4/5) - Tekken 7 for X1 (4/5) - Injustice 2 for X1 (4/5) - Yakuza 5 for PS3 (3/5) - Battlefield 1 (Campaign) for X1 (3/5) - Assassin's Creed: Syndicate for X1 (4/5) - Call of Duty: Infinite Warfare for X1 (4/5) - Call of Duty: MW Remastered for X1 (4/5) - Donkey Kong Country Returns for 3DS (4/5) - Forza Horizon 3 for X1 (5/5)

Mr Puggsly said:
goopy20 said:

Going from 720p to 1440p will use around twice the resources, whereas Series X is 3 times more powerful. So if you have a game running at the same fps and graphics settings on Series S and X, you would have to drop below 720p (probably 540p) if it's running at 1440p on Series X.

Like I said, if Lockhart is real, all of MS's exclusives will be designed from the ground up so they can run at 1080p on Series S, and Series X will have the exact same games, only running at native 4K/60fps (and 120fps for the X1 cross-gen games). It's going to be very interesting to see how a 4K/60fps Xbox exclusive will compare to a PS5 exclusive that's running at 1440p/30fps. PS5 games will simply have more than double the resources left over by not going native 4K, all of which they can spend on physics, AI, world simulation and overall fidelity, while still running at a resolution that will look noticeably sharper than what most people are used to on the base current-gen consoles.

Again, a game aiming for 1440p and 30 fps would likely have exceptional visuals. These are scenarios where graphics settings should probably be lowered. Hence, I have nothing new to say in response.

Even if Series S doesn't exist, I don't think 1440p/30 fps is going to be a common target for MS. Nor will it be for Sony. You keep trying to create extreme scenarios to make Series S seem like a bad idea.

The PS5 can turn God of War into a 4K/60 fps game with maybe half of its GPU power. So God of War 2 doesn't need to be 1440p/30 fps to get a massive visual boost.

Ultimately, if the Series S can cost about $150-200 less and plays the same games, then its existence would be justified. A visual downgrade would be expected for the price disparity.

You seem to feel "physics, AI, world simulation" are important. The Series S could keep all that; it's just visual fidelity taking a hit.

I have an X1X, so I already play 4K content. A 1080p game can objectively look more impressive than 4K content. Hence, the quality of the pixels can matter more than the number.

So even if Series S is generally doing 720p-1080p in AAA games, that's fine, because it will still come with the advancements of 9th-gen game design. There will also be improved image reconstruction techniques, there can still be 4K UIs, and the games will still be a noticeable upgrade over 8th gen.

The bolded part is the only caveat for the Pro and Scorpio. Since they didn't give options (usually just higher res or fps), you couldn't opt for prettier 1080p (because you only have a full-HD TV, or simply prefer it) instead of the downsampling, for example.



duduspace11 "Well, since we are estimating costs, Pokemon Red/Blue did cost Nintendo about $50m to make back in 1996"

http://gamrconnect.vgchartz.com/post.php?id=8808363

Mr Puggsly: "Hehe, I said good profit. You said big profit. Frankly, not losing money is what I meant by good. Don't get hung up on semantics"

http://gamrconnect.vgchartz.com/post.php?id=9008994

Azzanation: "PS5 wouldn't sold out at launch without scalpers."

Conina said:
goopy20 said:

Going from 720p to 1440p will use around twice the resources, whereas Series X is 3 times more powerful. So if you have a game running at the same fps and graphics settings on Series S and X, you would have to drop below 720p (probably 540p) if it's running at 1440p on Series X.

Going from 1280x720 to 2560x1440 will use around 4x the resources, not 2x the resources!

1280x720 x 30 fps = 27.6 million pixels per second

1920x1080 x 30 fps = 62.2 million pixels per second

2560x1440 x 30 fps = 110.6 million pixels per second

So if you have a game running at the same fps and graphics settings on Series S and X, you would have to drop below 1080p but stay above 720p if it's running at 1440p on Series X.

2560x1440 x 30 fps = 110.6 million pixels per second; one third of that = 36.9 million pixels per second

1600x900 x 30 fps = 43.2 million pixels per second

1422x800 x 30 fps = 34.1 million pixels per second

1280x720 x 40 fps = 36.9 million pixels per second

So dynamic 800p - 900p could be theoretically possible in this scenario. Or constant 900p with a bit of tweaking.

Or 720p with 40 fps on a 120 Hz display. Or 800p with 30 - 50 fps VRR...

If it were 4x the resources, that would only make my point more obvious. However, for most games you'll see about half the fps when going from 720p to 1440p, and about half again when going from 1440p to native 4K: https://www.youtube.com/watch?v=AKUqQhSz210



Mr Puggsly said:
goopy20 said:

Going from 720p to 1440p will use around twice the resources, whereas Series X is 3 times more powerful. So if you have a game running at the same fps and graphics settings on Series S and X, you would have to drop below 720p (probably 540p) if it's running at 1440p on Series X.

Like I said, if Lockhart is real, all of MS's exclusives will be designed from the ground up so they can run at 1080p on Series S, and Series X will have the exact same games, only running at native 4K/60fps (and 120fps for the X1 cross-gen games). It's going to be very interesting to see how a 4K/60fps Xbox exclusive will compare to a PS5 exclusive that's running at 1440p/30fps. PS5 games will simply have more than double the resources left over by not going native 4K, all of which they can spend on physics, AI, world simulation and overall fidelity, while still running at a resolution that will look noticeably sharper than what most people are used to on the base current-gen consoles.

Again, a game aiming for 1440p and 30 fps would likely have exceptional visuals. These are scenarios where graphics settings should probably be lowered. Hence, I have nothing new to say in response.

Even if Series S doesn't exist, I don't think 1440p/30 fps is going to be a common target for MS. Nor will it be for Sony. You keep trying to create extreme scenarios to make Series S seem like a bad idea.

The PS5 can turn God of War into a 4K/60 fps game with maybe half of its GPU power. So God of War 2 doesn't need to be 1440p/30 fps to get a massive visual boost.

Ultimately, if the Series S can cost about $150-200 less and plays the same games, then its existence would be justified. A visual downgrade would be expected for the price disparity.

You seem to feel "physics, AI, world simulation" are important. The Series S could keep all that; it's just visual fidelity taking a hit.

I have an X1X, so I already play 4K content. A 1080p game can objectively look more impressive than 4K content. Hence, the quality of the pixels can matter more than the number.

So even if Series S is generally doing 720p-1080p in AAA games, that's fine, because it will still come with the advancements of 9th-gen game design. There will also be improved image reconstruction techniques, there can still be 4K UIs, and the games will still be a noticeable upgrade over 8th gen.

I don't think it's an extreme scenario that next-gen consoles will take a giant leap in fidelity; isn't that pretty much what we should be expecting from next-gen games lol. But I don't see that happening if they're all running at native 4K/60fps, especially when there are already some current-gen games that can't hit 60 fps at native 4K on an RTX 2080 Ti. And no, GOW isn't native 4K/60fps; it's checkerboard 4K at 30fps or 1080p/60fps.

Ultimately, while the debate will continue to rage about the effectiveness of the technique, checkerboarding makes a lot of sense here - it would be impossible to render a game like God of War at native 4K on a PS4 Pro while maintaining a smooth frame-rate, and the visual payback compared to some of the 1800p and 1620p games we've seen is self-evident.

https://www.eurogamer.net/articles/digitalfoundry-2018-god-of-war-tech-analysis

Series X's specs are great, but MS is really creating a problem for themselves by focusing so much on compatibility with PC and their current-gen consoles. I mean, think about Halo Infinite. We will have the base Xbox One running the game at 1080p/30fps, an X1X version probably running at 4K/30fps, a Lockhart version running at 4K/60fps, and a Series X version running at 4K/120fps with a bump in (inefficient) graphics settings. In the end, no matter which console you're playing on, you will still be getting the exact same game that was designed for the lowest common denominator. They'll probably drop X1 support after a year or so, but what about the X1X, and the fact that all their exclusives will still have to run on a 4 Tflops Lockhart for the remainder of the console generation?

It's going to be weird as hell, and Sony will be laughing their asses off when their PS5 exclusives start pouring out, completely unhindered by compatibility with weaker hardware, with those extra resources free for a ton of stuff that's more exciting than just native 4K and extra fps. Especially if their SSD tech is the game changer Sony believes it is and isn't even available on PC.

Last edited by goopy20 - on 21 March 2020

goopy20 said:

If it were 4x the resources, that would only make my point more obvious. However, for most games you'll see about half the fps when going from 720p to 1440p, and about half again when going from 1440p to native 4K: https://www.youtube.com/watch?v=AKUqQhSz210

No, it wouldn't make your point more obvious, it is totally contrary to your point.

"Going from 720p to 1440p will use around twice the resources, whereas SeriesX is 3 times more powerful."

Your point was that 1440p only needs 2x the resources of 720p and that 1/3 the performance of the Series X wouldn't be enough since 3 > 2.

But a perfect scaling engine would need 4x the resources for 4x the resolution. With a perfect scaling engine 1/3 the performance of the Series X would be enough since 3 < 4.

We all know that engines don't scale down perfectly, but for most games you need around 1/3 the performance for 1/4 the resolution, not 1/2 the performance for 1/4 the resolution.
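The trade-off described here can be sketched numerically. Assume rendering cost grows as pixels^k, where k = 1 is perfect scaling and k ≈ 0.79 reproduces the "1/3 the performance for 1/4 the resolution" rule of thumb (the function name and the exponent model are purely illustrative, not taken from any real engine):

```python
import math

def affordable_pixel_fraction(power_fraction, cost_exponent):
    """Invert cost = pixels ** cost_exponent: the pixel budget a weaker
    GPU can afford at the same settings and frame rate."""
    return power_fraction ** (1 / cost_exponent)

# Exponent chosen so that 1/4 the pixels costs exactly 1/3 the performance:
k = math.log(3) / math.log(4)  # ~0.79

src_pixels = 2560 * 1440  # Series X target resolution
for label, exponent in [("perfect scaling", 1.0), ("rule of thumb", k)]:
    frac = affordable_pixel_fraction(1 / 3, exponent)
    height = math.sqrt(src_pixels * frac * 9 / 16)  # assume a 16:9 output
    print(f"{label}: {frac:.2f} of the pixels, ~{round(height * 16 / 9)}x{round(height)}")
```

Perfect scaling lands around 1478x831, between 800p and 900p (matching the dynamic-res estimate earlier in the thread); the rule-of-thumb exponent lands exactly on 1280x720.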

Last edited by Conina - on 21 March 2020

Conina said:
goopy20 said:

If it were 4x the resources, that would only make my point more obvious. However, for most games you'll see about half the fps when going from 720p to 1440p, and about half again when going from 1440p to native 4K: https://www.youtube.com/watch?v=AKUqQhSz210

No, it wouldn't make your point more obvious, it is totally contrary to your point.

"Going from 720p to 1440p will use around twice the resources, whereas SeriesX is 3 times more powerful."

Your point was that 1440p only needs 2x the resources of 720p and that 1/3 the performance of the Series X wouldn't be enough since 3 > 2.

But a perfect scaling engine would need 4x the resources for 4x the resolution. With a perfect scaling engine 1/3 the performance of the Series X would be enough since 3 < 4.

We all know that engines don't scale down perfectly, but for most games you need around 1/3 the performance for 1/4 the resolution, not 1/2 the performance for 1/4 the resolution.

Okay maybe you can help me out here. If a game is pushing Series X to its limits at 1440p/30fps, what resolution would they need to scale down to on Series S (with the same graphics settings) to hit the same framerate? 



goopy20 said:
Conina said:

No, it wouldn't make your point more obvious, it is totally contrary to your point.

"Going from 720p to 1440p will use around twice the resources, whereas SeriesX is 3 times more powerful."

Your point was that 1440p only needs 2x the resources of 720p and that 1/3 the performance of the Series X wouldn't be enough since 3 > 2.

But a perfect scaling engine would need 4x the resources for 4x the resolution. With a perfect scaling engine 1/3 the performance of the Series X would be enough since 3 < 4.

We all know that engines don't scale down perfectly, but for most games you need around 1/3 the performance for 1/4 the resolution, not 1/2 the performance for 1/4 the resolution.

Okay maybe you can help me out here. If a game is pushing Series X to its limits at 1440p/30fps, what resolution would they need to scale down to on Series S (with the same graphics settings) to hit the same framerate? 

Probably 720p/30 fps if they are lazy, or 800p to 900p with additional tweaks.

It also depends on how much headroom the 1440p/30fps target has. If 35-40 fps are possible on Series X unlocked and they lock it down to 30 fps for consistency, 800p/30 fps with the same settings, or 900p-1080p with additional tweaks, could be possible.
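A rough sketch of this headroom scenario, assuming perfect linear scaling with pixel count, a 16:9 output, and the rumored ~3x power gap (all assumptions, not confirmed specs):

```python
import math

# If Series X could really run the game at 1440p/40fps before a 30 fps lock,
# a console with ~1/3 the GPU gets this per-frame pixel budget at 30 fps:
series_x_throughput = 2560 * 1440 * 40       # pixels per second actually achievable
series_s_frame_budget = series_x_throughput / 3 / 30

height = math.sqrt(series_s_frame_budget * 9 / 16)   # 16:9 aspect ratio
print(f"~{round(height * 16 / 9)}x{round(height)}")  # ~1707x960
```

A 960-line image sits between 900p and 1080p, consistent with the estimate above.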



Conina said:
goopy20 said:

Okay maybe you can help me out here. If a game is pushing Series X to its limits at 1440p/30fps, what resolution would they need to scale down to on Series S (with the same graphics settings) to hit the same framerate? 

Probably 720p/30 fps if they are lazy, or 800p to 900p with additional tweaks.

It also depends on how much headroom the 1440p/30fps target has. If 35-40 fps are possible on Series X unlocked and they lock it down to 30 fps for consistency, 800p/30 fps with the same settings, or 900p-1080p with additional tweaks, could be possible.

But don't you need roughly twice the processing power to go from 720p to 1440p and hit the same 30fps? Also, for argument's sake, let's say that game is 100% optimized on Series X and there's no headroom to play with. It's a visual spectacle that pushes Series X to its limits and even has some drops below 30fps here and there, just like we're seeing with most console games today.



Conina said:
goopy20 said:

Okay maybe you can help me out here. If a game is pushing Series X to its limits at 1440p/30fps, what resolution would they need to scale down to on Series S (with the same graphics settings) to hit the same framerate? 

Probably 720p/30 fps if they are lazy, or 800p to 900p with additional tweaks.

It also depends on how much headroom the 1440p/30fps target has. If 35-40 fps are possible on Series X unlocked and they lock it down to 30 fps for consistency, 800p/30 fps with the same settings, or 900p-1080p with additional tweaks, could be possible.

And with variable res (since the game could run above 1440p but pick that as the standard), it could even be 900p-1080p as you said. Also, no idea why they would use the same resolution for everything else, since you won't need the same res for textures and some other aspects.


