
Should Halo Infinite drop Xbox One and go Scarlet exclusive?

Should it?

Yes, dump Xbone, next gen exclusive: 35 votes (42.68%)
No, keep it cross gen with Xbone: 47 votes (57.32%)
Total: 82
Mr Puggsly said:

Go buy Fable 3 (or better yet, 2) on Xbox if the prospect of stealing upsets you. Either way, that version on PC is dead.

Do you not ever read? I own a copy. I have stated this multiple times already.

Mr Puggsly said:

Simplifying the development process with better specs isn't relevant. Also, your examples were more about visual aspects. I'm talking more about gameplay mechanics and gameplay-related design in general. But I'm sure you will continue to drag on the argument instead of admitting you get my point.

Simplifying the development process with better specs is certainly relevant.
Many visual aspects play into gameplay mechanics... which are made achievable thanks to increases in hardware capability. They are all part of the same construct.
For example... remember when we were all blown away when physics became a thing in games such as Half-Life 2, making the gravity gun a possible gameplay mechanic? That was only feasible thanks to increases in CPU capability.

Mr Puggsly said:

Halo 5 on the X1X was basically just a resolution upgrade, you can't say that was an attempt to enhance or fix the visual quirks of that game.

The Xbox One X is just wasted potential for the most part, same with the Playstation 4 Pro. PC is still where it is at.

Mr Puggsly said:

Generally speaking, just increasing things like draw distance is not a huge task; it would certainly be expected in a PC port. The only excuse I could come up with is that increasing those settings somehow impacts the game, thus it could only be addressed in a remaster. I don't think that's the case for all visual aspects though.

Not always black and white.

curl-6 said:

Interesting stuff. Not sure if it's going to happen, it'll depend on devs, but I'd love to see a lot more simulation and interactivity in next gen games, that's kinda what I wanted from this gen but didn't really get.

Indeed. One thing that irked me about the 7th gen is the lack of simulation quality... We saw a slight improvement in the 8th gen with ants crawling on a tree in Horizon: Zero Dawn, but that was hardly the norm... And they did have to make cutbacks to other parts of the game's simulation quality to get those "little things" in. (I.e. water.)

DonFerrari said:

Well, considering the CPU was basically skipped this gen (as far as being 4-8x better than the previous gen), then it should really be the biggest jump, basically 2 gens' worth. So when people were talking about only a 4x gain compared to the base consoles of the 8th gen, I thought it was too little.

That is likely to be the case!
The jump from the PlayStation 2 to the PlayStation 3 on the CPU side was pretty monolithic, but only in floating-point tasks. For things like integers and more modern instructions, the jump from Jaguar to Zen is probably just as big if not bigger (haven't checked); Zen's wider, smarter cores really lend themselves well.

For the Xbox though, which went from a P6-based chip to PowerPC to Jaguar to Zen... this is likely to be the largest jump for any Microsoft console.

HoloDust said:
Pemalite said:

Probably a bit difficult to get a Jaguar to Zen 2 comparison. - But years ago I did a comparison and calculated that 8x Jaguar cores are roughly equivalent to a dual-core Core i3 of the time, operating at around 3GHz.
So taking any Sandy/Ivy-Bridge Dual Core and pitting it against Zen 2 is as accurate as you are probably going to get it at this stage unless I can source some hardware. (Working on it!)

Hm...done that, using Passmark - all normalized for single core (normalized additionally @1GHz by me):

- around 380 for Jaguar (Athlon 5150, E2-3000 and Opteron X2170)
- 510-530 for i3 (i3 2100, i3 3210)
- 765-810 for Ryzens (3600-3900X)
- 805 for i9 9900K

That's what surprised me the first time - I'm not sure what Passmark actually benchmarks, but I don't know of any other benchmark that has such a comprehensive database and that has a single-threaded rating as well.

It benchmarks a bit of everything... But it doesn't use data sets that really take advantage of the more modern instruction sets that Ryzen is really, really good at.
https://www.cpubenchmark.net/cpu_test_info.html

We can safely assume developers will leverage Ryzen to its strengths when it comes to gaming on next gen, to eke the most they can out of the fixed hardware.
I'm actually excited for the gains on the CPU front.
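
The normalization being described is just single-thread score divided by clock speed. Here is a minimal sketch of that arithmetic in Python; the scores and clocks are illustrative placeholders, not real Passmark database entries.

```python
# Normalize single-thread benchmark scores to a per-GHz ("@1GHz") figure,
# the same kind of normalization described above. The scores and clocks
# below are illustrative placeholders, not real Passmark entries.

cpus = {
    # name: (single_thread_score, clock_ghz) -- placeholder values
    "Jaguar-class (e.g. Athlon 5150)": (610, 1.6),
    "Sandy Bridge i3 (e.g. i3-2100)": (1580, 3.1),
    "Zen 2 (e.g. Ryzen 5 3600)": (3100, 4.0),
}

for name, (score, clock_ghz) in cpus.items():
    per_ghz = score / clock_ghz
    print(f"{name}: ~{per_ghz:.0f} points per GHz")
```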




Pemalite said:
Mr Puggsly said:

Go buy Fable 3 (or better yet, 2) on Xbox if the prospect of stealing upsets you. Either way, that version on PC is dead.

Do you not ever read? I own a copy. I have stated this multiple times already.

Mr Puggsly said:

Simplifying the development process with better specs isn't relevant. Also, your examples were more about visual aspects. I'm talking more about gameplay mechanics and gameplay-related design in general. But I'm sure you will continue to drag on the argument instead of admitting you get my point.

Simplifying the development process with better specs is certainly relevant.
Many visual aspects play into gameplay mechanics... which are made achievable thanks to increases in hardware capability. They are all part of the same construct.
For example... remember when we were all blown away when physics became a thing in games such as Half-Life 2, making the gravity gun a possible gameplay mechanic? That was only feasible thanks to increases in CPU capability.

Mr Puggsly said:

Halo 5 on the X1X was basically just a resolution upgrade, you can't say that was an attempt to enhance or fix the visual quirks of that game.

The Xbox One X is just wasted potential for the most part, same with the Playstation 4 Pro. PC is still where it is at.

Mr Puggsly said:

Generally speaking, just increasing things like draw distance is not a huge task; it would certainly be expected in a PC port. The only excuse I could come up with is that increasing those settings somehow impacts the game, thus it could only be addressed in a remaster. I don't think that's the case for all visual aspects though.

Not always black and white.

curl-6 said:

Interesting stuff. Not sure if it's going to happen, it'll depend on devs, but I'd love to see a lot more simulation and interactivity in next gen games, that's kinda what I wanted from this gen but didn't really get.

Indeed. One thing that irked me about the 7th gen is the lack of simulation quality... We saw a slight improvement in the 8th gen with ants crawling on a tree in Horizon: Zero Dawn, but that was hardly the norm... And they did have to make cutbacks to other parts of the game's simulation quality to get those "little things" in. (I.e. water.)

DonFerrari said:

Well, considering the CPU was basically skipped this gen (as far as being 4-8x better than the previous gen), then it should really be the biggest jump, basically 2 gens' worth. So when people were talking about only a 4x gain compared to the base consoles of the 8th gen, I thought it was too little.

That is likely to be the case!
The jump from the PlayStation 2 to the PlayStation 3 on the CPU side was pretty monolithic, but only in floating-point tasks. For things like integers and more modern instructions, the jump from Jaguar to Zen is probably just as big if not bigger (haven't checked); Zen's wider, smarter cores really lend themselves well.

For the Xbox though, which went from a P6-based chip to PowerPC to Jaguar to Zen... this is likely to be the largest jump for any Microsoft console.

HoloDust said:

Hm...done that, using Passmark - all normalized for single core (normalized additionally @1GHz by me):

- around 380 for Jaguar (Athlon 5150, E2-3000 and Opteron X2170)
- 510-530 for i3 (i3 2100, i3 3210)
- 765-810 for Ryzens (3600-3900X)
- 805 for i9 9900K

That's what surprised me the first time - I'm not sure what Passmark actually benchmarks, but I don't know of any other benchmark that has such a comprehensive database and that has a single-threaded rating as well.

It benchmarks a bit of everything... But it doesn't use data sets that really take advantage of the more modern instruction sets that Ryzen is really, really good at.
https://www.cpubenchmark.net/cpu_test_info.html

We can safely assume developers will leverage Ryzen to its strengths when it comes to gaming on next gen, to eke the most they can out of the fixed hardware.
I'm actually excited for the gains on the CPU front.

I didn't mean YOU should buy Fable 3. Also, since YOU already own it, MS doesn't need to sell it anymore.

The cool physics in Half-Life 2 isn't what made that a great game though; it was more like a neat feature. In case you forgot, by the way, I'm simply arguing you don't need a vast improvement in specs (like Scarlett) to make interesting new games. Hence, Halo Infinite wouldn't suddenly become a much more ambitious project if it were simply moved to Scarlett exclusively. Many 8th gen games feel derivative or smaller in scope than many 7th gen games.

X1X has untapped potential, but I wouldn't say it's mostly wasted. Even if we argue it's not properly utilized, it still delivers a better way to play X1 games. It takes a lot of GPU power to increase them to 1800p, 4K, or whatever.




DonFerrari said: 
Mr Puggsly said:

Witcher 3 is gonna be a functional product on Switch. No cut content, it will probably be around 30 fps with dips, it will ultimately be the same game.

I agree, better graphics is great especially after ~7 years into a gen. The mid gen upgrades were also great for a visual boost. I also agree the specs can be used in other places. I'm saying games don't necessarily take advantage of specs to make games more ambitious, that's often the case.

You're saying "all games." I'm saying most games won't really utilize the new specs to make games more ambitious or increase scope, etc. I was careful about my word choice.

The 8th gen had a huge spec boost, there could have been surprises there as well. I'm simply arguing they were few in regard to game design.

I argue the 8th gen mostly felt like 7th gen with extra polish. But as the games were designed, much of it could have worked fine on 7th gen from a game design perspective.

They are cutting a lot for Witcher 3, there is no other way around it. I guess what you mean is that the gameplay elements will be kept.

Mid gen upgrades I can agree weren't needed, and the games didn't really improve outside of graphics due to them, but the fault would be that they had to keep support for the baseline versions and also kept the same architecture due to compatibility. So in this case a very bad CPU held down the GPU more than it should have.

Nope, the "all games" you say I'm claiming, which I don't remember saying, would be that any game could benefit from better HW to improve its scope. Still, I wouldn't say all games, because there is plenty of shovelware and indies that would run "exactly" the same if released on PS3.

Plenty of games would have a much lower NPC count and less physics if the CPU were worse (that is a game design element), but the bad CPU of both consoles also limited the ambition one could go for on PS4/X1, so there was more juice for graphics on the GPU than gameplay on the CPU.

If one were willing, most of today's games could, with some heavy tweaking, run on PS2, perhaps even PS1. But you can be sure a Halo designed exclusively for X4 would have the potential to be better than one that has to launch on the base X1. Whether the game will be better than the original Halo we can only know when it releases, and it won't be the fault of the better HW if it turns out to be a worse Halo or a worse game.

Well yeah... the point is visual compromises will be made but they claim the gameplay will be the same.

Were mid gen upgrades needed? No. But there is evidently enough demand for them, and they were also needed to take advantage of 4K TVs, which have become popular and decreased in price faster than I expected.

People tend to mock the CPU of 8th gen consoles, but GPUs aren't amazing either. I don't feel the 8th gen was held back by the CPUs, they were pretty much equal. Better CPUs wouldn't have changed much in the 8th gen because graphics was the focus.

Some games took advantage of the superior capabilities of 8th gen consoles to increase the scope of games, but most didn't. AAA games included, it was mostly visual.

I feel GTAV even on last gen was pretty impressive in regard to physics and NPC count, arguably more so than a lot of modern games. Which supports my original argument, better specs doesn't mean most games become more ambitious or increase in scope. And again getting back to point of thread, I don't feel a game like Halo Infinite would really benefit from being Scarlett exclusive, certainly not this far into development.

Arguably the best use of 8th gen specs that the 7th gen couldn't quite duplicate is large MP games like Fortnite and MMOs. However, that may have been more of a RAM limitation. MAG on PS3 was an exception I suppose, but I never played it so I don't have thoughts on it.

Last edited by Mr Puggsly - on 15 July 2019


curl-6 said:
DonFerrari said:

RDR1 and GTA V are though. And you understood the point. Most if not all technical limitations are surpassable if you cut down enough on the game. And as Pemalite said, the biggest plus Switch has that the Wii U didn't for the ports is that the architecture is more modern and closer to PS4/X1, so the cuts are less heavy than if it were ported to last gen.

Agreed, all I meant was that both the more modern tech and the much larger RAM help, rather than it being primarily one or the other.

Pemalite said:

Absolutely massive, probably the single largest leap in CPU capability in generations. - If there is one criticism I can give... it is that Sony/Microsoft didn't go for more than 1 CCX grouping of cores... (But I had been saying that was the path they would take for years anyway, due to cost reasons.)

Would have been nice to have 12-16 CPU cores for platform longevity, we just aren't there yet.

It does mean that the 10th gen consoles aren't likely to see the same kind of leap in CPU capability as well though, so the 9th gen is gearing up to be rather interesting.

Probably a bit difficult to get a Jaguar to Zen 2 comparison. - But years ago I did a comparison and calculated that 8x Jaguar cores are roughly equivalent to a dual-core Core i3 of the time, operating at around 3GHz.
So taking any Sandy/Ivy-Bridge Dual Core and pitting it against Zen 2 is as accurate as you are probably going to get it at this stage unless I can source some hardware. (Working on it!)

Interesting stuff. Not sure if it's going to happen, it'll depend on devs, but I'd love to see a lot more simulation and interactivity in next gen games, that's kinda what I wanted from this gen but didn't really get.

Not an expert, but from what I can tell the Switch is pretty balanced HW for what Nintendo needs. And third parties have a good enough package that they can port most of the relevant games to it.

Pemalite said:
Mr Puggsly said:

Go buy Fable 3 (or better yet, 2) on Xbox if the prospect of stealing upsets you. Either way, that version on PC is dead.

Do you not ever read? I own a copy. I have stated this multiple times already.

Mr Puggsly said:

Simplifying the development process with better specs isn't relevant. Also, your examples were more about visual aspects. I'm talking more about gameplay mechanics and gameplay-related design in general. But I'm sure you will continue to drag on the argument instead of admitting you get my point.

Simplifying the development process with better specs is certainly relevant.
Many visual aspects play into gameplay mechanics... which are made achievable thanks to increases in hardware capability. They are all part of the same construct.
For example... remember when we were all blown away when physics became a thing in games such as Half-Life 2, making the gravity gun a possible gameplay mechanic? That was only feasible thanks to increases in CPU capability.

Mr Puggsly said:

Halo 5 on the X1X was basically just a resolution upgrade, you can't say that was an attempt to enhance or fix the visual quirks of that game.

The Xbox One X is just wasted potential for the most part, same with the Playstation 4 Pro. PC is still where it is at.

Mr Puggsly said:

Generally speaking, just increasing things like draw distance is not a huge task; it would certainly be expected in a PC port. The only excuse I could come up with is that increasing those settings somehow impacts the game, thus it could only be addressed in a remaster. I don't think that's the case for all visual aspects though.

Not always black and white.

curl-6 said:

Interesting stuff. Not sure if it's going to happen, it'll depend on devs, but I'd love to see a lot more simulation and interactivity in next gen games, that's kinda what I wanted from this gen but didn't really get.

Indeed. One thing that irked me about the 7th gen is the lack of simulation quality... We saw a slight improvement in the 8th gen with ants crawling on a tree in Horizon: Zero Dawn, but that was hardly the norm... And they did have to make cutbacks to other parts of the game's simulation quality to get those "little things" in. (I.e. water.)

DonFerrari said:

Well, considering the CPU was basically skipped this gen (as far as being 4-8x better than the previous gen), then it should really be the biggest jump, basically 2 gens' worth. So when people were talking about only a 4x gain compared to the base consoles of the 8th gen, I thought it was too little.

That is likely to be the case!
The jump from the PlayStation 2 to the PlayStation 3 on the CPU side was pretty monolithic, but only in floating-point tasks. For things like integers and more modern instructions, the jump from Jaguar to Zen is probably just as big if not bigger (haven't checked); Zen's wider, smarter cores really lend themselves well.

For the Xbox though, which went from a P6-based chip to PowerPC to Jaguar to Zen... this is likely to be the largest jump for any Microsoft console.

HoloDust said:

Hm...done that, using Passmark - all normalized for single core (normalized additionally @1GHz by me):

- around 380 for Jaguar (Athlon 5150, E2-3000 and Opteron X2170)
- 510-530 for i3 (i3 2100, i3 3210)
- 765-810 for Ryzens (3600-3900X)
- 805 for i9 9900K

That's what surprised me the first time - I'm not sure what Passmark actually benchmarks, but I don't know of any other benchmark that has such a comprehensive database and that has a single-threaded rating as well.

It benchmarks a bit of everything... But it doesn't use data sets that really take advantage of the more modern instruction sets that Ryzen is really, really good at.
https://www.cpubenchmark.net/cpu_test_info.html

We can safely assume developers will leverage Ryzen to its strengths when it comes to gaming on next gen, to eke the most they can out of the fixed hardware.
I'm actually excited for the gains on the CPU front.

I'm preparing myself to be surprised and happy once again, as every gen has made me.

Mr Puggsly said:
DonFerrari said: 

They are cutting a lot for Witcher 3, there is no other way around it. I guess what you mean is that the gameplay elements will be kept.

Mid gen upgrades I can agree weren't needed, and the games didn't really improve outside of graphics due to them, but the fault would be that they had to keep support for the baseline versions and also kept the same architecture due to compatibility. So in this case a very bad CPU held down the GPU more than it should have.

Nope, the "all games" you say I'm claiming, which I don't remember saying, would be that any game could benefit from better HW to improve its scope. Still, I wouldn't say all games, because there is plenty of shovelware and indies that would run "exactly" the same if released on PS3.

Plenty of games would have a much lower NPC count and less physics if the CPU were worse (that is a game design element), but the bad CPU of both consoles also limited the ambition one could go for on PS4/X1, so there was more juice for graphics on the GPU than gameplay on the CPU.

If one were willing, most of today's games could, with some heavy tweaking, run on PS2, perhaps even PS1. But you can be sure a Halo designed exclusively for X4 would have the potential to be better than one that has to launch on the base X1. Whether the game will be better than the original Halo we can only know when it releases, and it won't be the fault of the better HW if it turns out to be a worse Halo or a worse game.

Well yeah... the point is visual compromises will be made but they claim the gameplay will be the same.

Were mid gen upgrades needed? No. But there is evidently enough demand for them, and they were also needed to take advantage of 4K TVs, which have become popular and decreased in price faster than I expected.

People tend to mock the CPU of 8th gen consoles, but GPUs aren't amazing either. I don't feel the 8th gen was held back by the CPUs, they were pretty much equal. Better CPUs wouldn't have changed much in the 8th gen because graphics was the focus.

Some games took advantage of the superior capabilities of 8th gen consoles to increase the scope of games, but most didn't. AAA games included, it was mostly visual.

I feel GTAV even on last gen was pretty impressive in regard to physics and NPC count, arguably more so than a lot of modern games. Which supports my original argument, better specs doesn't mean most games become more ambitious or increase in scope. And again getting back to point of thread, I don't feel a game like Halo Infinite would really benefit from being Scarlett exclusive, certainly not this far into development.

Arguably the best use of 8th gen specs that the 7th gen couldn't quite duplicate is large MP games like Fortnite and MMOs. However, that may have been more of a RAM limitation. MAG on PS3 was an exception I suppose, but I never played it so I don't have thoughts on it.

Sure there was demand; I bought a Pro. I just didn't buy the better HW that the X1X is because I'm mostly interested in Sony exclusives, and the few 3rd party games I play wouldn't justify the extra cost. Unfortunately I'm not much into FPS or shooters in general, where GoW and Halo would certainly be reason enough to buy it.

Of course the GPUs weren't impressive, but the CPU was an even lower tier (though balanced for what they wanted from the gen; console games are much more GPU heavy). But needing to keep the CPU the same for the X1X and Pro (with just a small improvement to keep up with the increased pixel count) made most if not all games very limited.

Games are going to improve graphically, that is a marketing tool. So if the HW doesn't improve, they will have to cut in other areas to keep the graphics improving.




DonFerrari said: 
Mr Puggsly said:

Well yeah... the point is visual compromises will be made but they claim the gameplay will be the same.

Were mid gen upgrades needed? No. But there is evidently enough demand for them, and they were also needed to take advantage of 4K TVs, which have become popular and decreased in price faster than I expected.

People tend to mock the CPU of 8th gen consoles, but GPUs aren't amazing either. I don't feel the 8th gen was held back by the CPUs, they were pretty much equal. Better CPUs wouldn't have changed much in the 8th gen because graphics was the focus.

Some games took advantage of the superior capabilities of 8th gen consoles to increase the scope of games, but most didn't. AAA games included, it was mostly visual.

I feel GTAV even on last gen was pretty impressive in regard to physics and NPC count, arguably more so than a lot of modern games. Which supports my original argument, better specs doesn't mean most games become more ambitious or increase in scope. And again getting back to point of thread, I don't feel a game like Halo Infinite would really benefit from being Scarlett exclusive, certainly not this far into development.

Arguably the best use of 8th gen specs that the 7th gen couldn't quite duplicate is large MP games like Fortnite and MMOs. However, that may have been more of a RAM limitation. MAG on PS3 was an exception I suppose, but I never played it so I don't have thoughts on it.

Sure there was demand; I bought a Pro. I just didn't buy the better HW that the X1X is because I'm mostly interested in Sony exclusives, and the few 3rd party games I play wouldn't justify the extra cost. Unfortunately I'm not much into FPS or shooters in general, where GoW and Halo would certainly be reason enough to buy it.

Of course the GPUs weren't impressive, but the CPU was an even lower tier (though balanced for what they wanted from the gen; console games are much more GPU heavy). But needing to keep the CPU the same for the X1X and Pro (with just a small improvement to keep up with the increased pixel count) made most if not all games very limited.

Games are going to improve graphically, that is a marketing tool. So if the HW doesn't improve, they will have to cut in other areas to keep the graphics improving.

Well the mid gen upgrades came when it was too soon to start a new gen but people wanted better graphics. I'm happy they came and my X1X will likely be a great way to play Halo Infinite.

I disagree, the CPU and GPU are fairly balanced in the base hardware. Frankly, much of the bottleneck has been on the GPU, which is why resolutions have varied and dynamic resolutions became common. People keep saying the 8th gen CPUs are too limited, but I don't think developers are even pushing their limits. I mean, Just Cause 4 shouldn't exist if the CPUs were so limited.

In practice, it seems to me the CPUs in 8th gen consoles had enough power for what developers were generally looking to do. You have a theory and I don't feel evidence supports it.

I don't think graphics are just a marketing tool; it's something many gamers care about. The 9th gen consoles will have vastly superior CPUs, but generally that will simply mean more 60 fps games.
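
The dynamic resolution mentioned above is essentially a feedback loop on GPU frame time: drop the internal render resolution when the GPU misses its budget, raise it again when there is headroom. A minimal sketch, assuming a 30 fps target and made-up step sizes and thresholds:

```python
# Minimal sketch of a dynamic resolution controller: if the GPU misses its
# frame-time budget, render the next frame at a lower internal resolution;
# if it has headroom, scale back up. Target, step and thresholds are made up.

TARGET_FRAME_MS = 33.3           # 30 fps budget
MIN_SCALE, MAX_SCALE = 0.6, 1.0  # clamp the render-resolution scale
STEP = 0.05

def next_resolution_scale(scale: float, last_gpu_frame_ms: float) -> float:
    if last_gpu_frame_ms > TARGET_FRAME_MS:          # GPU-bound: drop resolution
        scale -= STEP
    elif last_gpu_frame_ms < 0.9 * TARGET_FRAME_MS:  # headroom: claw it back
        scale += STEP
    return max(MIN_SCALE, min(MAX_SCALE, scale))

# Example: a heavy scene pushes the GPU to 38 ms, so the scale drops below 1.0
scale = next_resolution_scale(1.0, 38.0)
print(f"render at {int(1920 * scale)}x{int(1080 * scale)}")  # -> 1824x1026
```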




Mr Puggsly said:
DonFerrari said: 

Sure there was demand; I bought a Pro. I just didn't buy the better HW that the X1X is because I'm mostly interested in Sony exclusives, and the few 3rd party games I play wouldn't justify the extra cost. Unfortunately I'm not much into FPS or shooters in general, where GoW and Halo would certainly be reason enough to buy it.

Of course the GPUs weren't impressive, but the CPU was an even lower tier (though balanced for what they wanted from the gen; console games are much more GPU heavy). But needing to keep the CPU the same for the X1X and Pro (with just a small improvement to keep up with the increased pixel count) made most if not all games very limited.

Games are going to improve graphically, that is a marketing tool. So if the HW doesn't improve, they will have to cut in other areas to keep the graphics improving.

Well the mid gen upgrades came when it was too soon to start a new gen but people wanted better graphics. I'm happy they came and my X1X will likely be a great way to play Halo Infinite.

I disagree, the CPU and GPU are fairly balanced in the base hardware. Frankly, much of the bottleneck has been on the GPU, which is why resolutions have varied and dynamic resolutions became common. People keep saying the 8th gen CPUs are too limited, but I don't think developers are even pushing their limits. I mean, Just Cause 4 shouldn't exist if the CPUs were so limited.

In practice, it seems to me the CPUs in 8th gen consoles had enough power for what developers were generally looking to do. You have a theory and I don't feel evidence supports it.

I don't think graphics are just a marketing tool; it's something many gamers care about. The 9th gen consoles will have vastly superior CPUs, but generally that will simply mean more 60 fps games.

Considering most games had bad drops in fps, and some even spent long stretches below 30, while dynamic res and sub-Full-HD weren't that problematic on PS4. The GPU and CPU were balanced against each other, but compared to PC the CPU was a lower tier than the GPU.




DonFerrari said:
Mr Puggsly said:

Well the mid gen upgrades came when it was too soon to start a new gen but people wanted better graphics. I'm happy they came and my X1X will likely be a great way to play Halo Infinite.

I disagree, the CPU and GPU are fairly balanced in the base hardware. Frankly, much of the bottleneck has been on the GPU, which is why resolutions have varied and dynamic resolutions became common. People keep saying the 8th gen CPUs are too limited, but I don't think developers are even pushing their limits. I mean, Just Cause 4 shouldn't exist if the CPUs were so limited.

In practice, it seems to me the CPUs in 8th gen consoles had enough power for what developers were generally looking to do. You have a theory and I don't feel evidence supports it.

I don't think graphics are just a marketing tool; it's something many gamers care about. The 9th gen consoles will have vastly superior CPUs, but generally that will simply mean more 60 fps games.

Considering most games had bad drops in fps, and some even spent long stretches below 30, while dynamic res and sub-Full-HD weren't that problematic on PS4. The GPU and CPU were balanced against each other, but compared to PC the CPU was a lower tier than the GPU.

We would have to look at games individually to assess why there were frame drops below 30 fps, but it was often a bottleneck on the GPU. Sometimes it might just be poor optimization, especially if it's a relatively linear game, yet seemingly more complex games can hit 60 fps.

It's worth noting Just Cause 3 really struggled with the console CPUs, while Just Cause 4 was a huge improvement. Look at Mass Effect 1 or Oblivion on 7th gen; they gave me the impression they were already fully utilizing the hardware. Then better optimized games came not long after.

There are also games that perform better on the mid gen upgrades because much of the bottleneck was primarily on the GPU.




Mr Puggsly said:

I didn't mean YOU should buy Fable 3. Also, since YOU already own it, MS doesn't need to sell it anymore.

You referenced me in that sentence. Here I will provide the appropriate quotation in bold:

Mr Puggsly said:

Go buy Fable 3 (or better yet, 2) on Xbox if the prospect of stealing upsets you.

So obviously I am going to ascertain that you were referencing me from the get go.
But sure... Backpedal and all that.

*****************

Mr Puggsly said:

The cool physics in Half-Life 2 isn't what made that a great game though; it was more like a neat feature. In case you forgot, by the way, I'm simply arguing you don't need a vast improvement in specs (like Scarlett) to make interesting new games. Hence, Halo Infinite wouldn't suddenly become a much more ambitious project if it were simply moved to Scarlett exclusively. Many 8th gen games feel derivative or smaller in scope than many 7th gen games.

The cool physics in Half-Life 2 certainly helped make it a great game; take away all those small "touches" that were simply awesome... and you end up with a drab game like all the other clones of the time. It's the little things that set it apart from all the other shooters.

Halo Infinite specifically has already spent years in development with the anemic base Xbox One hardware in mind, so even if Microsoft/343i were to remove base Xbox One support, the game would already be limited by that hardware unless years more of development work were spent to fully take advantage of the newer hardware.

Mr Puggsly said:

X1X has untapped potential, but I wouldn't say it's mostly wasted. Even if we argue it's not properly utilized, it still delivers a better way to play X1 games. It takes a lot of GPU power to increase them to 1800p, 4K, or whatever.

If it's not properly utilized, then it's wasted.

That doesn't mean there aren't benefits to having an Xbox One X... But the bulk of benefits are resolution and framerates, even then... Many games that were 720P locked on the base Xbox One are locked to 720P on the Xbox One X unless there was a specific patch... It's less of an issue on the Sony side of the equation as 720P titles were an extreme rarity even during the early years.

I mean, let's take Dragon Age: Inquisition. It's a few years old at this point, but it was a 1600x900 game on the base Xbox One and a full 1920x1080 on the base PlayStation 4. - But because there isn't an enhanced patch for that title, the PlayStation 4 variant of the game looks better than the Xbox One X version. - It's wasted potential... A large part of that blame certainly lies with the developers, though, rather than the hardware itself.

It will be interesting to see if Microsoft will "enhance" titles from the Xbox One family of consoles with Scarlett by bumping up resolutions like they did with some Original Xbox and Xbox 360 titles.

Mr Puggsly said:

I disagree, the CPU and GPU are fairly balanced in the base hardware. Frankly, much of the bottleneck has been on the GPU, which is why resolutions have varied and dynamic resolutions became common. People keep saying the 8th gen CPUs are too limited, but I don't think developers are even pushing their limits. I mean, Just Cause 4 shouldn't exist if the CPUs were so limited.

In practice, it seems to me the CPUs in 8th gen consoles had enough power for what developers were generally looking to do. You have a theory and I don't feel evidence supports it.

I don't think graphics are just a marketing tool; it's something many gamers care about. The 9th gen consoles will have vastly superior CPUs, but generally that will simply mean more 60 fps games.

There are plenty of cases where performance tanks when physics and A.I. calculations increase, which is why a certain Assassin's Creed game on the base Xbox One actually had the edge over the PlayStation 4 version... because the higher CPU clock and lower-latency eSRAM and DDR3 RAM gave the Xbox One an advantage in those scenarios.

As for the GPUs, they are clearly the shining star of the 8th gen devices.

But one thing we need to keep in mind is that consoles are a closed environment, so developers will work with whatever limited resources and bottlenecks they have in order to achieve various targets... So if we have anemic CPUs in the consoles, games will be developed with those in mind.

But when games end up ported to the PC, then developers are able to relax as they have orders-of-magnitude more hardware capability at their disposal, so we do get better Physics, Particles, A.I characters and so on than the console releases... And that is all thanks to the CPU.

Mr Puggsly said:
DonFerrari said:

Considering most games had bad drops in fps, and some even spent long stretches below 30, while dynamic res and sub-Full-HD weren't that problematic on PS4. The GPU and CPU were balanced against each other, but compared to PC the CPU was a lower tier than the GPU.

We would have to look at games individually to assess why there were frame drops below 30 fps, but it was often a bottleneck on the GPU. Sometimes it might just be poor optimization, especially if it's a relatively linear game, yet seemingly more complex games can hit 60 fps.

It's worth noting Just Cause 3 really struggled with the console CPUs, while Just Cause 4 was a huge improvement. Look at Mass Effect 1 or Oblivion on 7th gen; they gave me the impression they were already fully utilizing the hardware. Then better optimized games came not long after.

There are also games that perform better on the mid gen upgrades because much of the bottleneck was primarily on the GPU.

Just Cause 4 took advantage of a lot of more modern CPU instructions and became more parallel in its CPU workloads, hence why it was a step up over Just Cause 3; there is still significant room for improvement though.

Oblivion though wasn't fully utilizing the Xbox 360/PlayStation 3 hardware... The bulk of the work was done on one CPU core... I spent a lot of time working with Oblivion to run it on Original Xbox-equivalent PC hardware (Pentium 3 + GeForce 3). It looked like a dog's breakfast, as I had to strip/reduce the shader effects and even went to the extent of polygon reductions in models...

It wasn't until Bethesda severely reworked large swathes of the NetImmerse-turned-Gamebryo-turned-Creation Engine with Skyrim that we saw better CPU utilization across all platforms, hence the relatively large leap in visuals and general simulation quality... But even on that front there is still substantial room for improvement; it doesn't scale well across more than a few threads... and the 7th gen had 6/7 threads to optimize for.
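
To put rough numbers on why an engine that keeps much of its frame on a single core "doesn't scale well across more than a few threads", here is a quick Amdahl's-law sketch; the 40% serial fraction is an illustrative guess, not a measured figure for any of the engines mentioned.

```python
# Amdahl's law: speedup = 1 / (serial_fraction + parallel_fraction / threads).
# A quick illustration of why an engine that keeps much of its frame on one
# core gains little from extra threads. The 40% serial fraction is a made-up
# example, not a measured figure for Gamebryo/Creation Engine.

def speedup(serial_fraction: float, threads: int) -> float:
    parallel_fraction = 1.0 - serial_fraction
    return 1.0 / (serial_fraction + parallel_fraction / threads)

for threads in (1, 2, 4, 6, 8):
    print(f"{threads} threads: {speedup(0.40, threads):.2f}x")
# With 40% of the frame stuck on one core, even 8 threads only yields ~2.1x.
```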

Last edited by Pemalite - on 15 July 2019


Pemalite said:
Mr Puggsly said:

I didn't mean YOU should buy Fable 3. Also, since YOU already own it, MS doesn't need to sell it anymore.

You referenced me in that sentence. Here I will provide the appropriate quotation in bold:

Mr Puggsly said:

Go buy Fable 3 (or better yet, 2) on Xbox if the prospect of stealing upsets you.

So obviously I am going to ascertain that you were referencing me from the get go.
But sure... Backpedal and all that.

*****************

Mr Puggsly said:

The cool physics in Half-Life 2 isn't what made that a great game though; it was more like a neat feature. In case you forgot, by the way, I'm simply arguing you don't need a vast improvement in specs (like Scarlett) to make interesting new games. Hence, Halo Infinite wouldn't suddenly become a much more ambitious project if it were simply moved to Scarlett exclusively. Many 8th gen games feel derivative or smaller in scope than many 7th gen games.

The cool physics in Half-Life 2 certainly helped make it a great game; take away all those small "touches" that were simply awesome... and you end up with a drab game like all the other clones of the time. It's the little things that set it apart from all the other shooters.

Halo Infinite specifically has already spent years in development with the anemic base Xbox One hardware in mind, so even if Microsoft/343i were to remove base Xbox One support, the game would already be limited by that hardware unless years more of development work were spent to fully take advantage of the newer hardware.

Mr Puggsly said:

X1X has untapped potential, but I wouldn't say it's mostly wasted. Even if we argue it's not properly utilized, it still delivers a better way to play X1 games. It takes a lot of GPU power to increase them to 1800p, 4K, or whatever.

If it's not properly utilized, then it's wasted.

That doesn't mean there aren't benefits to having an Xbox One X... But the bulk of benefits are resolution and framerates, even then... Many games that were 720P locked on the base Xbox One are locked to 720P on the Xbox One X unless there was a specific patch... It's less of an issue on the Sony side of the equation as 720P titles were an extreme rarity even during the early years.

I mean, let's take Dragon Age: Inquisition. It's a few years old at this point, but it was a 1600x900 game on the base Xbox One and a full 1920x1080 on the base PlayStation 4. - But because there isn't an enhanced patch for that title, the PlayStation 4 variant of the game looks better than the Xbox One X version. - It's wasted potential... A large part of that blame certainly lies with the developers, though, rather than the hardware itself.

It will be interesting to see if Microsoft will "enhance" titles from the Xbox One family of consoles with Scarlett by bumping up resolutions like they did with some Original Xbox and Xbox 360 titles.

Mr Puggsly said:

I disagree, the CPU and GPU are fairly balanced in the base hardware. Frankly, much of the bottleneck has been on the GPU, which is why resolutions have varied and dynamic resolutions became common. People keep saying the 8th gen CPUs are too limited, but I don't think developers are even pushing their limits. I mean, Just Cause 4 shouldn't exist if the CPUs were so limited.

In practice, it seems to me the CPUs in 8th gen consoles had enough power for what developers were generally looking to do. You have a theory and I don't feel evidence supports it.

I don't think graphics are just a marketing tool; it's something many gamers care about. The 9th gen consoles will have vastly superior CPUs, but generally that will simply mean more 60 fps games.

There are plenty of cases where performance tanks when physics and A.I. calculations increase, which is why a certain Assassin's Creed game on the base Xbox One actually had the edge over the PlayStation 4 version... because the higher CPU clock and lower-latency eSRAM and DDR3 RAM gave the Xbox One an advantage in those scenarios.

As for the GPUs, they are clearly the shining star of the 8th gen devices.

But one thing we need to keep in mind is that consoles are a closed environment, so developers will work with whatever limited resources and bottlenecks they have in order to achieve various targets... So if we have anemic CPUs in the consoles, games will be developed with those in mind.

But when games end up ported to the PC, then developers are able to relax as they have orders-of-magnitude more hardware capability at their disposal, so we do get better Physics, Particles, A.I characters and so on than the console releases... And that is all thanks to the CPU.

Mr Puggsly said:

We would have to look at games individually to assess why there were frame drops below 30 fps, but it was often a bottleneck on the GPU. Sometimes it might just be poor optimization, especially if it's a relatively linear game, yet seemingly more complex games can hit 60 fps.

It's worth noting Just Cause 3 really struggled with the console CPUs, while Just Cause 4 was a huge improvement. Look at Mass Effect 1 or Oblivion on 7th gen; they gave me the impression they were already fully utilizing the hardware. Then better optimized games came not long after.

There are also games that perform better on the mid gen upgrades because much of the bottleneck was primarily on the GPU.

Just Cause 4 took advantage of a lot of more modern CPU instructions and became more parallel in its CPU workloads, hence why it was a step up over Just Cause 3; there is still significant room for improvement though.

Oblivion though wasn't fully utilizing the Xbox 360/PlayStation 3 hardware... The bulk of the work was done on one CPU core... I spent a lot of time working with Oblivion to run it on Original Xbox-equivalent PC hardware (Pentium 3 + GeForce 3). It looked like a dog's breakfast, as I had to strip/reduce the shader effects and even went to the extent of polygon reductions in models...

It wasn't until Bethesda severely reworked large swathes of the NetImmerse-turned-Gamebryo-turned-Creation Engine with Skyrim that we saw better CPU utilization across all platforms, hence the relatively large leap in visuals and general simulation quality... But even on that front there is still substantial room for improvement; it doesn't scale well across more than a few threads... and the 7th gen had 6/7 threads to optimize for.

I'm clarifying: I meant anybody uncomfortable with stealing products no longer for sale on PC should give money to MS via Xbox instead. I didn't mean just you, because I know you own the product. Wanna keep talking about that? Feel free.

Those cool effects were achieved on the Xbox port of Half-Life 2, albeit not as well as on a gaming PC, but that's how I played it. 7th gen took what 6th gen was doing to the next level, while 8th gen was more like added polish. Some limitations of the 7th gen may have been overcome with additional RAM; there was an evident bottleneck there.

I'll just keep it simple. Is the X1X an upgrade worth getting? I believe many who have it say yes. Therefore I don't think it's being wasted. Personally, I also like having access to more 60 fps content or more stable performance.

Again, your examples of what comes from better specs are mostly visual. There was also impressive use of physics in 7th gen content. It just depends on what developers are attempting to do.

My point about games like Just Cause 3, Mass Effect 1 and Oblivion was that optimization matters. Many say the CPUs in 8th gen consoles are trash, but particularly on the X1X, where GPU bottlenecks are less of an issue, it seems like a very capable CPU.

Last edited by Mr Puggsly - on 15 July 2019


Should they ditch low settings on PC in favor of ultra settings?
> No!

Should they ditch a Xbox One version of Halo Infinite in favor of a Xbox Scarlet version?
> No!


Honestly I see no benefit in ditching the Xbox One version from a technical standpoint.
New console generations aren't as much more powerful than their predecessors as they used to be in the early days. You could never downgrade a PlayStation 1 3D game to the SNES in a playable fashion. But you can very much downgrade current gen games like Doom or Wolfenstein to the Nintendo Switch without losing too much of their vision. Or think about Rise of the Tomb Raider: a stunning looking game when it was released, and yet it was possible to run it on an Xbox 360.
Architectures of modern machines are very similar, so you don't have to rewrite huge parts of your engine to appease the specific hardware, and modern engines are very scalable too.
If developers target 4K resolutions and 60 fps on the new console generation, it will be even easier to downgrade games to current gen by simply halving the framerate and quartering the resolution, or going even lower, aside from lowering the overall details, because a lot of the extra power will be eaten up by resolution and framerate anyway.
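
As a rough sanity check on that claim, the pixel-throughput arithmetic looks like this (pure back-of-the-envelope; real GPU cost doesn't scale perfectly linearly with resolution and framerate):

```python
# Back-of-the-envelope pixel throughput: 4K @ 60 fps versus 1080p @ 30 fps.
# Quartering the resolution and halving the framerate cuts the raw pixel rate
# by 8x; GPU cost doesn't scale perfectly linearly with pixels, but it shows
# how much headroom such a downgrade frees up.

def pixels_per_second(width: int, height: int, fps: int) -> int:
    return width * height * fps

next_gen = pixels_per_second(3840, 2160, 60)  # ~498 million pixels/s
current = pixels_per_second(1920, 1080, 30)   # ~62 million pixels/s
print(f"ratio: {next_gen / current:.0f}x")    # -> 8x
```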


So no, ditching the Xbox One version would be an absolutely dumb move because it would make no sense from a technical standpoint and would only limit the audience and the possible sales of the game.