Forums - Microsoft Discussion - Should Halo Infinite drop Xbox One and go Scarlet exclusive?

Should it?

Yes, dump Xbone, next gen exclusive: 35 votes (42.68%)
No, keep it cross gen with Xbone: 47 votes (57.32%)
Total: 82
DonFerrari said:
Mr Puggsly said:

Many games this gen didn't really need the power of 8th gen consoles per se; most of that power just goes into better graphics. However, the biggest limiting factor in large scale games was probably RAM. During the 7th gen, RAM requirements in PC games grew significantly, to multiples of the RAM found in 7th gen consoles.

In the 8th gen, however, RAM requirements kinda stayed the same. 8GB was fairly common when the 8th gen started and 8GB is still plenty for most games. VRAM has become important primarily for textures.

Look at Switch: in practice much of its content looks like 7th gen games, yet it's doing fine running Witcher 3, and I credit that to Switch having plenty of RAM compared to 7th gen consoles. I get the impression bringing Witcher 3 to Switch was easier than Witcher 2 to 360. The advent of dynamic resolution also helps with performance, I'm sure.

Increased specs can certainly allow for games of greater scope or whatever. I'm arguing much of that power simply goes to presentation in many cases. I've looked at many games this gen and thought, "did we really need new consoles for this?"

I believe we'll feel that to a greater extent in the 9th gen. It's gonna be like the 8th gen but even prettier. Don't expect the scope of games to change too much. Also, it takes a lot of power just to achieve 60 fps at 4K.

GTA V is huge, RDR also, TLOU the same, and a lot of other open world games. So devs could get by with limited RAM through their tricks.

Does fine running Witcher 3? Well, that is a little reaching. It does run Witcher 3, but it isn't really fine. Witcher 2 ran on X360 with a lot less RAM. Almost any game can run on almost any HW if you cut it down enough, but what you see will be severely impacted.

Most gamers want better presentation, so certainly a lot of the performance will be put into the look. No problem in that. And the more power the HW has, the more is free to use for other stuff besides looks.

I'll wait for gen 9 to happen before saying all games could be done as well without new HW.

From what CGI-Quality puts around in his thread, I'm pretty confident gen 9 will bring a lot of good surprises.

And good evidence of the link between HW and scope is that most of the very demanding games of this gen won't show up on Switch because the compromise is too big to make them work.

Mr Puggsly said:

Windows Store has improved; many of the complaints were addressed around the time Gears 4 launched. It's not perfect but functional.

Killer Instinct had crossplay between all versions, including Steam. Hopefully MCC will be the same.

Let me clarify again, Fable 3 on PC is shit as long as it has GFWL. That version should stay dead or be fixed.

I believe PS3 and 360 have the GPU and CPU potential to run Witcher 3. The RAM though? Nope, couldn't happen. Not unless they made massive changes. That's my point.

The thing is, many games this gen could have worked on last gen. Maybe not with the same engine or visual fidelity, but the scope of the games could have worked on last gen specs. For example, God of War and Uncharted 4 are considered amazing technical achievements, but outside of visuals, last gen had more impressive and ambitious games. AC games impressed me more. This is why I argue new hardware doesn't necessarily mean more ambitious design, larger scale, etc. I don't think I can clarify further if you still miss the point.

I really question what happened with Halo 5 technically. The game has great looking assets and opted for high quality lighting and shadows, which evidently were not a good fit for 60 fps given the quirks.

I can't help but think Halo 5 may have originally been planned as a 30 fps game. It could have potentially been a great looking ~1080p/30 fps game. Instead, Halo 5 looks like a game that 60 fps was forced into, not one built around it.

They certainly made massive changes to make the game run on Switch though.

God of War was more grounded and UC4 tries to go the realistic route, so they certainly wouldn't show you the stuff that impressed you in AC.

Witcher 3 is gonna be a functional product on Switch. No cut content, it will probably be around 30 fps with dips, it will ultimately be the same game.

I agree, better graphics are great, especially ~7 years into a gen. The mid gen upgrades were also great for a visual boost. I also agree the specs can be used in other places. I'm saying games don't necessarily take advantage of specs to become more ambitious; that's often the case.

You're saying "all games." I'm saying most games won't really utilize the new specs to make games more ambitious or increase scope, etc. I was careful about my word choice.

The 8th gen had a huge spec boost, there could have been surprises there as well. I'm simply arguing they were few in regard to game design.

I argue the 8th gen mostly felt like 7th gen with extra polish. But as the games were designed, much of it could have worked fine on 7th gen from a game design perspective.



Recently Completed
River City: Rival Showdown
for 3DS (3/5) - River City: Tokyo Rumble for 3DS (4/5) - Zelda: BotW for Wii U (5/5) - Zelda: BotW for Switch (5/5) - Zelda: Link's Awakening for Switch (4/5) - Rage 2 for X1X (4/5) - Rage for 360 (3/5) - Streets of Rage 4 for X1/PC (4/5) - Gears 5 for X1X (5/5) - Mortal Kombat 11 for X1X (5/5) - Doom 64 for N64 (emulator) (3/5) - Crackdown 3 for X1S/X1X (4/5) - Infinity Blade III - for iPad 4 (3/5) - Infinity Blade II - for iPad 4 (4/5) - Infinity Blade - for iPad 4 (4/5) - Wolfenstein: The Old Blood for X1 (3/5) - Assassin's Creed: Origins for X1 (3/5) - Uncharted: Lost Legacy for PS4 (4/5) - EA UFC 3 for X1 (4/5) - Doom for X1 (4/5) - Titanfall 2 for X1 (4/5) - Super Mario 3D World for Wii U (4/5) - South Park: The Stick of Truth for X1 BC (4/5) - Call of Duty: WWII for X1 (4/5) -Wolfenstein II for X1 - (4/5) - Dead or Alive: Dimensions for 3DS (4/5) - Marvel vs Capcom: Infinite for X1 (3/5) - Halo Wars 2 for X1/PC (4/5) - Halo Wars: DE for X1 (4/5) - Tekken 7 for X1 (4/5) - Injustice 2 for X1 (4/5) - Yakuza 5 for PS3 (3/5) - Battlefield 1 (Campaign) for X1 (3/5) - Assassin's Creed: Syndicate for X1 (4/5) - Call of Duty: Infinite Warfare for X1 (4/5) - Call of Duty: MW Remastered for X1 (4/5) - Donkey Kong Country Returns for 3DS (4/5) - Forza Horizon 3 for X1 (5/5)

Pemalite said:
curl-6 said:

Speaking of hardware, on the CPU side what kind of leap are we most likely looking at going from the Jags in the Xbone to the Zen 2 in Scarlet? 4 times the performance? 5 times? 10 times?

It depends on the instructions being used. - But an 8-12x increase is more than possible in an ideal scenario. (I.E. AVX)
Otherwise 5-6x increase in more conventional workloads is probably a good guesstimate...

In saying that, we have absolutely zero idea on clockrates, so it could be substantially higher if Microsoft/Sony dial those clocks home. Or lower.
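Pemalite's 5-12x range is easy to sanity-check with back-of-envelope arithmetic. The sketch below is illustrative only: the IPC ratio, clock speeds, and SIMD factor are assumptions picked for the example, not confirmed Scarlett/PS5 specs.

```python
# Back-of-envelope CPU uplift estimate: Jaguar (8th gen) vs Zen 2 (9th gen).
# All figures are assumptions for illustration, not confirmed console specs.

def uplift(ipc_ratio, old_clock_ghz, new_clock_ghz, simd_ratio=1.0):
    """Estimated per-core speedup: IPC gain x clock gain x SIMD width gain."""
    return ipc_ratio * (new_clock_ghz / old_clock_ghz) * simd_ratio

# Conventional (scalar/integer-heavy) workload: ~2.5x IPC, 1.6 -> 3.2 GHz (assumed).
conventional = uplift(2.5, 1.6, 3.2)

# AVX-heavy workload: Zen 2 has 256-bit FP units vs Jaguar's 128-bit,
# so assume a further ~2x for vector throughput.
vectorised = uplift(2.5, 1.6, 3.2, simd_ratio=2.0)

print(f"conventional ~{conventional:.1f}x, AVX-heavy ~{vectorised:.1f}x")
```

With those assumptions the arithmetic lands at roughly 5x for conventional workloads and 10x for AVX-heavy ones, consistent with the 5-6x and 8-12x bands above; different clock assumptions shift the result accordingly.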

Mr Puggsly said:

Windows Store has improved; many of the complaints were addressed around the time Gears 4 launched. It's not perfect but functional.

Many of the complaints are still relevant even today. Shall I list them?

Mr Puggsly said:

Killer Instinct had crossplay between all versions, including Steam. Hopefully MCC will be the same.

It should be the case for all games that run on Windows, irrespective of the store front.

Mr Puggsly said:

Let me clarify again, Fable 3 on PC is shit as long as it has GFWL. That version should stay dead or be fixed.

Let me clarify again, GFWL doesn't matter. Just put the game up for sale again.

Mr Puggsly said:

I believe PS3 and 360 have the GPU and CPU potential to run Witcher 3. The RAM though? Nope, couldn't happen. Not unless they made massive changes. That's my point.

RAM is a massive limiter for the 7th gen.
But the CPUs and GPUs of that console generation leave a lot to be desired; having only SM3.0 support on the Xbox 360, poor geometry performance, and lacking many of the modern features we take for granted today would make a port of Witcher 3... well. Difficult.

Mr Puggsly said:

The thing is, many games this gen could have worked on last gen. Maybe not with the same engine or visual fidelity, but the scope of the games could have worked on last gen specs. For example, God of War and Uncharted 4 are considered amazing technical achievements, but outside of visuals, last gen had more impressive and ambitious games. AC games impressed me more. This is why I argue new hardware doesn't necessarily mean more ambitious design, larger scale, etc. I don't think I can clarify further if you still miss the point.

The scope of the games on 7th gen could have been done on 6th gen.
Morrowind on the Original Xbox for example is just as expansive as Oblivion or Skyrim on the Xbox 360.

It is what you do with your limited hardware resources that matters.

New hardware hopefully brings with it technologies that speed up development... The 8th gen for instance can leverage dynamic lights far more readily than the 7th gen, hopefully reducing the workload on texture artists who try to bake that kind of detailing into the texture work.

Mr Puggsly said:

I really question what happened with Halo 5 technically. The game has great looking assets and opted for high quality lighting and shadows, which evidently were not a good fit for 60 fps given the quirks.

I can't help but think Halo 5 may have originally been planned as a 30 fps game. It could have potentially been a great looking ~1080p/30 fps game. Instead, Halo 5 looks like a game that 60 fps was forced into, not one built around it.

Halo 5 does have some good looking assets, the dynamic lighting and shadowing was a big step up over Halo 4.

But parts of the engine do run at 10-15fps, which looks extremely jarring. - At 30fps it probably would not have looked as "off" as it does at 60fps.
But in saying that, the movement system probably wouldn't have been as fluid at 30fps, and that system is probably Halo 5's largest strength over its predecessors.

I feel like the game was far too rushed and could have done with more development time before prime-time.

Either way... Infinite is running on the Slipspace engine, which looks to throw all those niggles out the window.

Oh... let's just drop the Windows Store talk. As a basic store, it's fine. I download a game and it generally works fine.

Let me clarify, people who want to play Fable 3 on PC should just steal it or go play it on Xbox.

You make interesting points about how better hardware helps development, but it's irrelevant.

The potential of 6th gen hardware was much more limited, but I agree. Especially if we're talking about OG Xbox as the baseline of the 6th gen.

If Halo 5 ran at 30 fps, there probably would have been less compromise in the pop-in and shadow draw distance, and maybe even the animations that drop frames could have stayed at 30 fps. I don't think more development time would have fixed these choices made for a 60 fps game; nothing was really addressed in patches, and nothing was really fixed with the X1X. I'm hoping for some sort of remaster or patches on Scarlett. That could coincide with the inevitable PC port.

I'm gonna suggest we stop using the word niggle. I loved saying tar baby, but you gotta let these words go in the name of progress.

Last edited by Mr Puggsly - on 14 July 2019


So much is riding on this game; if it comes out with the great quality fun the Halo games were known for, then Gen 9 will be off to a great start. I do think it should be gen 9 only; imagine playing it on a launch console from 2013, it would be horrible.



curl-6 said:
Mr Puggsly said:

I agree, other aspects of the hardware help simplify port work. However, developers have redesigned effects or removed them to boost performance on Switch. It was also common in cross gen (6th/7th or 7th/8th gen) or PC to console ports to use completely different effects or omit them.

It's the RAM of Switch that allows ports to happen without having to make significant changes to how the game essentially works.

I think Shadow of Mordor is a good example of porting a game without enough RAM. Huge changes were made to the actual game and it looks like there was no room left for textures.

Switch ports from PS4/Xbone do tend to retain most of the 8th gen rendering techniques though, whereas PS3/360 versions of crossgen games almost universally axed all that stuff.

But yeah, RAM followed by GPU were the biggest differences from PS3/360 to PS4/Xbone/Switch, whereas it's looking like the biggest gain going to Scarlet and PS5 could be CPU.

DonFerrari said:

Witcher 2 ran on X360 with a lot less RAM.

That's not quite an apples-to-apples comparison though; Witcher 2 wasn't open world for one thing.

Pemalite said:

It depends on the instructions being used. - But an 8-12x increase is more than possible in an ideal scenario. (I.E. AVX)
Otherwise 5-6x increase in more conventional workloads is probably a good guesstimate...

In saying that, we have absolutely zero idea on clockrates, so it could be substantially higher if Microsoft/Sony dial those clocks home. Or lower.

So a big leap then. Thanks, I was curious as while it interests me I'm not an expert on technical stuff.

RDR1 and GTA V are though. And you understood the point. Most if not all technical limitations are surpassable if you cut down enough on the game. And as Pemalite said, the biggest plus Switch has that Wii U didn't for the ports is that the architecture is more modern and closer to PS4/X1, so the cuts are less heavy than if it were ported to last gen.

Mr Puggsly said:
DonFerrari said:

GTA V is huge, RDR also, TLOU the same, and a lot of other open world games. So devs could get by with limited RAM through their tricks.

Does fine running Witcher 3? Well, that is a little reaching. It does run Witcher 3, but it isn't really fine. Witcher 2 ran on X360 with a lot less RAM. Almost any game can run on almost any HW if you cut it down enough, but what you see will be severely impacted.

Most gamers want better presentation, so certainly a lot of the performance will be put into the look. No problem in that. And the more power the HW has, the more is free to use for other stuff besides looks.

I'll wait for gen 9 to happen before saying all games could be done as well without new HW.

From what CGI-Quality puts around in his thread, I'm pretty confident gen 9 will bring a lot of good surprises.

And good evidence of the link between HW and scope is that most of the very demanding games of this gen won't show up on Switch because the compromise is too big to make them work.

They certainly made massive changes to make the game run on Switch though.

God of War was more grounded and UC4 tries to go the realistic route, so they certainly wouldn't show you the stuff that impressed you in AC.

Witcher 3 is gonna be a functional product on Switch. No cut content, it will probably be around 30 fps with dips, it will ultimately be the same game.

I agree, better graphics are great, especially ~7 years into a gen. The mid gen upgrades were also great for a visual boost. I also agree the specs can be used in other places. I'm saying games don't necessarily take advantage of specs to become more ambitious; that's often the case.

You're saying "all games." I'm saying most games won't really utilize the new specs to make games more ambitious or increase scope, etc. I was careful about my word choice.

The 8th gen had a huge spec boost, there could have been surprises there as well. I'm simply arguing they were few in regard to game design.

I argue the 8th gen mostly felt like 7th gen with extra polish. But as the games were designed, much of it could have worked fine on 7th gen from a game design perspective.

They are cutting a lot for Witcher 3, there is no way around it. I guess what you mean is that the gameplay elements will be kept.

On mid gen upgrades I can agree they weren't needed and that the games didn't really improve outside of graphics due to them, but the fault there is that they had to keep supporting the baseline versions and also kept the same architecture for compatibility. So in this case a very bad CPU held down the GPU more than it should have.

Nope, the "all games" you say I said, which I don't remember saying, would mean that any game could benefit from better HW to improve scope. Still, I wouldn't say all games, because there is plenty of shovelware and indies that would run "exactly" the same if released on PS3.

Plenty of games would have a much lower NPC count and less physics if the CPU were worse (that is a game design element), but the bad CPU of both consoles also limited the ambition one could go for on PS4/X1, so there was more juice for graphics on the GPU than gameplay on the CPU.

If one were willing, most of today's games could, with some heavy tweaking, run on PS2, perhaps even PS1. But you can be sure a Halo designed exclusively for X4 would have the potential to be better than one that has to launch on base X1. Whether the game will be better than OG Halo we can only know when it releases, and it won't be the fault of the better HW if it turns out a worse Halo or a worse game.



duduspace11 "Well, since we are estimating costs, Pokemon Red/Blue did cost Nintendo about $50m to make back in 1996"

http://gamrconnect.vgchartz.com/post.php?id=8808363

Mr Puggsly: "Hehe, I said good profit. You said big profit. Frankly, not losing money is what I meant by good. Don't get hung up on semantics"

http://gamrconnect.vgchartz.com/post.php?id=9008994

Azzanation: "PS5 wouldn't sold out at launch without scalpers."

Pemalite said:
curl-6 said:

Speaking of hardware, on the CPU side what kind of leap are we most likely looking at going from the Jags in the Xbone to the Zen 2 in Scarlet? 4 times the performance? 5 times? 10 times?

It depends on the instructions being used. - But an 8-12x increase is more than possible in an ideal scenario. (I.E. AVX)
Otherwise 5-6x increase in more conventional workloads is probably a good guesstimate...

In saying that, we have absolutely zero idea on clockrates, so it could be substantially higher if Microsoft/Sony dial those clocks home. Or lower.

Do you know where it's possible to do some actual numbers comparison? I only know of Passmark, which has a massive database, and there the clock for clock, core to core comparison is not very flattering.



Mr Puggsly said:

Let me clarify, people who want to play Fable 3 on PC should just steal it or go play it on Xbox.

Or not. Never condone the act of stealing.

Mr Puggsly said:

You make interesting points about how better hardware helps development, but it's irrelevant.

Not really... It's on topic, besides... You then continued the discussion on this point anyway:

Mr Puggsly said:

If Halo 5 ran at 30 fps, there probably would have been less compromise in the pop-in and shadow draw distance, and maybe even the animations that drop frames could have stayed at 30 fps. I don't think more development time would have fixed these choices made for a 60 fps game; nothing was really addressed in patches, and nothing was really fixed with the X1X. I'm hoping for some sort of remaster or patches on Scarlett. That could coincide with the inevitable PC port.

There was an effort to enhance Halo 5 for the Xbox One X... Problem was, it wasn't really that much of an improvement. - There is an improvement, sure... But the base assets are still geared towards optimal Xbox One performance and visuals rather than showcasing what the Xbox One X can truly do.

But that is a common theme for most Xbox One X enhanced titles: only a marginal improvement, where the bulk of games are just notched a little higher in visual settings, with most of the hardware pushing higher framerates/resolution rather than the more intricate effects that PC gamers get to use.

Mr Puggsly said:

I'm gonna suggest we stop using the word niggle. I loved saying tar baby, but you gotta let these words go in the name of progress.

Nah. I'll continue to use it when I deem it as appropriate.

curl-6 said:

So a big leap then. Thanks, I was curious as while it interests me I'm not an expert on technical stuff.

Absolutely massive, probably the single largest leap in CPU capability in generations. - If there is only one criticism I can give... it's that Sony/Microsoft didn't go for more than 1 CCX grouping of cores... (But I had been saying that was the path they would take for years anyway, due to cost reasons.)

Would have been nice to have 12-16 CPU cores for platform longevity, we just aren't there yet.

It does mean that the 10th gen consoles aren't likely to see the same kind of leap in CPU capability as well though, so the 9th gen is gearing up to be rather interesting.

HoloDust said:

Do you know where it's possible to do some actual numbers comparison? I only know of Passmark, which has a massive database, and there the clock for clock, core to core comparison is not very flattering.

Probably a bit difficult to get a Jaguar to Zen 2 comparison. - But years ago I did a comparison and calculated that 8x Jaguar cores are roughly equivalent to a dual-core Core i3 of the time, operating at around 3GHz.
So taking any Sandy/Ivy Bridge dual core and pitting it against Zen 2 is as accurate as you are probably going to get at this stage, unless I can source some hardware. (Working on it!)
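That "8 Jaguar cores ~ one 3GHz dual-core i3" equivalence can be expressed as rough aggregate-throughput arithmetic. The per-core performance figures and Hyper-Threading bonus below are assumptions chosen for illustration, not measured numbers:

```python
# Rough check of "8 Jaguar cores ~ a 3 GHz dual-core i3 (with Hyper-Threading)".
# Aggregate throughput modelled as cores x clock x relative per-core-per-GHz perf.
# All relative performance figures are illustrative assumptions.

jaguar_cores, jaguar_ghz, jaguar_perf = 8, 1.6, 1.0   # Jaguar as the baseline unit
i3_cores, i3_ghz, i3_perf = 2, 3.0, 1.6               # ~1.6x Jaguar per-clock perf (assumed)
ht_bonus = 1.25                                       # ~25% extra from Hyper-Threading (assumed)

jaguar_total = jaguar_cores * jaguar_ghz * jaguar_perf
i3_total = i3_cores * i3_ghz * i3_perf * ht_bonus

print(f"Jaguar aggregate: {jaguar_total:.1f}, i3 aggregate: {i3_total:.1f}")
```

Under those assumptions the two totals come out within a few percent of each other (12.8 vs 12.0 in baseline units), which is roughly the equivalence being claimed; the point is the shape of the estimate, not the exact constants.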



--::{PC Gaming Master Race}::--

Pemalite said:
Mr Puggsly said:

Let me clarify, people who want to play Fable 3 on PC should just steal it or go play it on Xbox.

Or not. Never condone the act of stealing.

Mr Puggsly said:

You make interesting points about how better hardware helps development, but it's irrelevant.

Not really... It's on topic, besides... You then continued the discussion on this point anyway:

Mr Puggsly said:

If Halo 5 ran at 30 fps, there probably would have been less compromise in the pop-in and shadow draw distance, and maybe even the animations that drop frames could have stayed at 30 fps. I don't think more development time would have fixed these choices made for a 60 fps game; nothing was really addressed in patches, and nothing was really fixed with the X1X. I'm hoping for some sort of remaster or patches on Scarlett. That could coincide with the inevitable PC port.

There was an effort to enhance Halo 5 for the Xbox One X... Problem was, it wasn't really that much of an improvement. - There is an improvement, sure... But the base assets are still geared towards optimal Xbox One performance and visuals rather than showcasing what the Xbox One X can truly do.

But that is a common theme for most Xbox One X enhanced titles: only a marginal improvement, where the bulk of games are just notched a little higher in visual settings, with most of the hardware pushing higher framerates/resolution rather than the more intricate effects that PC gamers get to use.

Mr Puggsly said:

I'm gonna suggest we stop using the word niggle. I loved saying tar baby, but you gotta let these words go in the name of progress.

Nah. I'll continue to use it when I deem it as appropriate.

curl-6 said:

So a big leap then. Thanks, I was curious as while it interests me I'm not an expert on technical stuff.

Absolutely massive, probably the single largest leap in CPU capability in generations. - If there is only one criticism I can give... it's that Sony/Microsoft didn't go for more than 1 CCX grouping of cores... (But I had been saying that was the path they would take for years anyway, due to cost reasons.)

Would have been nice to have 12-16 CPU cores for platform longevity, we just aren't there yet.

It does mean that the 10th gen consoles aren't likely to see the same kind of leap in CPU capability as well though, so the 9th gen is gearing up to be rather interesting.

HoloDust said:

Do you know where it's possible to do some actual numbers comparison? I only know of Passmark, which has a massive database, and there the clock for clock, core to core comparison is not very flattering.

Probably a bit difficult to get a Jaguar to Zen 2 comparison. - But years ago I did a comparison and calculated that 8x Jaguar cores are roughly equivalent to a dual-core Core i3 of the time, operating at around 3GHz.
So taking any Sandy/Ivy Bridge dual core and pitting it against Zen 2 is as accurate as you are probably going to get at this stage, unless I can source some hardware. (Working on it!)

Well, considering the CPU was basically skipped this gen (as far as being 4-8x better than the previous gen goes), it really should be the biggest jump, basically two gens' worth. So when people were talking about only a 4x gain compared to the base consoles of the 8th gen, I thought it was too little.




Pemalite said:
Mr Puggsly said:

Let me clarify, people who want to play Fable 3 on PC should just steal it or go play it on Xbox.

Or not. Never condone the act of stealing.

Mr Puggsly said:

You make interesting points about how better hardware helps development, but it's irrelevant.

Not really... It's on topic, besides... You then continued the discussion on this point anyway:

Mr Puggsly said:

If Halo 5 ran at 30 fps, there probably would have been less compromise in the pop-in and shadow draw distance, and maybe even the animations that drop frames could have stayed at 30 fps. I don't think more development time would have fixed these choices made for a 60 fps game; nothing was really addressed in patches, and nothing was really fixed with the X1X. I'm hoping for some sort of remaster or patches on Scarlett. That could coincide with the inevitable PC port.

There was an effort to enhance Halo 5 for the Xbox One X... Problem was, it wasn't really that much of an improvement. - There is an improvement, sure... But the base assets are still geared towards optimal Xbox One performance and visuals rather than showcasing what the Xbox One X can truly do.

But that is a common theme for most Xbox One X enhanced titles: only a marginal improvement, where the bulk of games are just notched a little higher in visual settings, with most of the hardware pushing higher framerates/resolution rather than the more intricate effects that PC gamers get to use.

Mr Puggsly said:

I'm gonna suggest we stop using the word niggle. I loved saying tar baby, but you gotta let these words go in the name of progress.

Nah. I'll continue to use it when I deem it as appropriate.

curl-6 said:

So a big leap then. Thanks, I was curious as while it interests me I'm not an expert on technical stuff.

Absolutely massive, probably the single largest leap in CPU capability in generations. - If there is only one criticism I can give... it's that Sony/Microsoft didn't go for more than 1 CCX grouping of cores... (But I had been saying that was the path they would take for years anyway, due to cost reasons.)

Would have been nice to have 12-16 CPU cores for platform longevity, we just aren't there yet.

It does mean that the 10th gen consoles aren't likely to see the same kind of leap in CPU capability as well though, so the 9th gen is gearing up to be rather interesting.

HoloDust said:

Do you know where it's possible to do some actual numbers comparison? I only know of Passmark, which has a massive database, and there the clock for clock, core to core comparison is not very flattering.

Probably a bit difficult to get a Jaguar to Zen 2 comparison. - But years ago I did a comparison and calculated that 8x Jaguar cores are roughly equivalent to a dual-core Core i3 of the time, operating at around 3GHz.
So taking any Sandy/Ivy Bridge dual core and pitting it against Zen 2 is as accurate as you are probably going to get at this stage, unless I can source some hardware. (Working on it!)

Go buy Fable 3 (or better yet, 2) on Xbox if the prospect of stealing upsets you. Either way, that version on PC is dead.

Simplifying the development process with better specs isn't relevant. Also, your examples were more about visual aspects. I'm talking more about gameplay mechanics and gameplay related design in general. But I'm sure you will continue to drag on the argument instead of admitting you get my point.

Halo 5 on the X1X was basically just a resolution upgrade; you can't say that was an attempt to enhance or fix the visual quirks of that game.

Generally speaking, just increasing things like draw distance is not a huge task, and it would certainly be expected in a PC port. The only excuse I could come up with is that increasing those settings somehow impacts the gameplay, like the game being built around some visual compromises, so it could only be addressed in a remaster. I doubt that's the case with all visual aspects; maybe the poor draw distance of shadows and poor anisotropic filtering could have been addressed.

Last edited by Mr Puggsly - on 15 July 2019


DonFerrari said:
curl-6 said:

Switch ports from PS4/Xbone do tend to retain most of the 8th gen rendering techniques though, whereas PS3/360 versions of crossgen games almost universally axed all that stuff.

But yeah, RAM followed by GPU were the biggest differences from PS3/360 to PS4/Xbone/Switch, whereas it's looking like the biggest gain going to Scarlet and PS5 could be CPU.

That's not quite an apples-to-apples comparison though; Witcher 2 wasn't open world for one thing.

RDR1 and GTA V are though. And you understood the point. Most if not all technical limitations are surpassable if you cut down enough on the game. And as Pemalite said, the biggest plus Switch has that Wii U didn't for the ports is that the architecture is more modern and closer to PS4/X1, so the cuts are less heavy than if it were ported to last gen.

Agreed, all I meant was that both the more modern tech and the much larger RAM help, rather than it being primarily one or the other.

Pemalite said:

Absolutely massive, probably the single largest leap in CPU capability in generations. - If there is only one criticism that I can give... Is that Sony/Microsoft doesn't go for more than 1 CCX grouping of cores... (But I had been saying that was the path they would take for years anyway due to cost reasons.)

Would have been nice to have 12-16 CPU cores for platform longevity, we just aren't there yet.

It does mean that the 10th gen consoles aren't likely to see the same kind of leap in CPU capability as well though, so the 9th gen is gearing up to be rather interesting.

Probably a bit difficult to get a Jaguar to Zen 2 comparison. - But years ago I did a comparison and calculated that 8x Jaguar cores are roughly equivalent to a dual-core Core i3 of the time, operating at around 3GHz.
So taking any Sandy/Ivy Bridge dual core and pitting it against Zen 2 is as accurate as you are probably going to get at this stage, unless I can source some hardware. (Working on it!)

Interesting stuff. Not sure if it's going to happen, it'll depend on devs, but I'd love to see a lot more simulation and interactivity in next gen games, that's kinda what I wanted from this gen but didn't really get.



Pemalite said:
HoloDust said:

Do you know where it's possible to do some actual numbers comparison? I only know of Passmark, which has a massive database, and there the clock for clock, core to core comparison is not very flattering.

Probably a bit difficult to get a Jaguar to Zen2 comparison. - But years ago I did a comparison and calculated that 8x Jaguar cores is roughly equivalent to a dual-core Core i3 at the time, operating at around 3ghz.
So taking any Sandy/Ivy-Bridge Dual Core and pitting it against Zen 2 is as accurate as you are probably going to get it at this stage unless I can source some hardware. (Working on it!)

Hm... done that, using Passmark - all normalized for single core (additionally normalized @1GHz by me):

- around 380 for Jaguar (Athlon 5150, E2-3000 and Opteron X2170)
- 510-530 for i3 (i3 2100, i3 3210)
- 765-810 for Ryzens (3600-3900X)
- 805 for i9 9900K

That's what surprised me the first time - I'm not sure what Passmark actually benchmarks, but I don't know of any other benchmark that has such a comprehensive database and a single threaded rating as well.
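The normalisation described above is just single-thread score divided by clock. The sketch below shows the method; the raw scores and clocks are placeholders chosen to land in the quoted ranges, not actual Passmark database entries:

```python
# Per-GHz normalisation of single-thread benchmark scores, as described above.
# The (score, clock) pairs are hypothetical stand-ins, not real Passmark data.

cpus = {
    "Jaguar (Athlon 5150)": (608, 1.6),    # (single-thread score, clock in GHz)
    "i3-2100":              (1600, 3.1),
    "Ryzen 3600":           (3200, 4.2),
}

for name, (score, clock_ghz) in cpus.items():
    per_ghz = score / clock_ghz
    print(f"{name}: {per_ghz:.0f} points per GHz")
```

With these placeholder inputs the normalised values come out around 380, 516, and 762 points per GHz, i.e. the same ordering and roughly the same per-clock gap (about 2x from Jaguar to Zen 2) as the figures quoted above.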