
Forums - Microsoft Discussion - Should Halo Infinite drop Xbox One and go Scarlet exclusive?


Should it?
Yes, dump Xbone, next gen exclusive: 35 votes (42.68%)
No, keep it cross gen with Xbone: 47 votes (57.32%)
Total: 82 votes
Pemalite said:
Mr Puggsly said:

For a moderator you sure like pushing an argument, very mature. I'm trying to end the Fable 3 talk but you keep going. I won't backpedal, I'll just admit you're a great person with a thick cock.

Me being a moderator has absolutely nothing to do with it, don't delve into logical fallacies to try and win your argument.

Mr Puggsly said:

Half Life 2 ran pretty crappy on the OG Xbox from my memory. A nice mix of CPU and GPU bottleneck with soupy textures given the lack of RAM, I'm sure. It's a shame we don't see developers making slide shows to push physics! Might have been different if it were really built for Xbox.

It ran fine. It could have been better, but it still ran fine.
The fact they retained the physics engine and the gravity gun was exactly my point though...

Mr Puggsly said:

As I mentioned before, we saw the 6th and 7th gen really struggling with limited RAM. PCs were taking advantage of significantly more RAM than consoles had during those generations. However, the 8th gen is the first time it felt like consoles have plenty, or at least enough, RAM. Even on PC, 8GB is generally fine.

Of course they were struggling with RAM. Even the 8th gen struggles with RAM... 5-6GB isn't a lot of memory when a few gigabytes of that is taken by the GPU as well.

Lack of memory is one of the Achilles heels of any fixed device that cannot be upgraded.

Mr Puggsly said:

Even if the X1X were supported as a lead platform, I don't think that would mean a base X1 port would be impossible. The primary difference between X1 and X1X is GPU power, and that's seemingly the easiest aspect of games to scale back given it's mostly effects. So yeah, the biggest benefit of X1X is resolution and frame rate... but sometimes textures improve given the RAM increase, and some games also use the extra GPU power to enable or increase effects.

A base Xbox One port wouldn't be impossible, but how many cutbacks do you make to fit a game onto an inferior platform? And when do you reach a point when you probably shouldn't bother?

Sometimes a downgraded port isn't possible without seriously re-engineering large swathes of a game... Even some games that get ported to Switch get some extra changes in order for the game to be a better experience on that hardware. (Wolfenstein for example, with some extra objects to block the views and remove the need to render distant landscapes.)
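The Wolfenstein-on-Switch trick mentioned above boils down to: drop extra occluding geometry into a level so a much shorter draw distance becomes acceptable, and distant landscape never has to be rendered at all. A toy sketch of the idea (all names and numbers are made up, not from any real engine):

```python
from dataclasses import dataclass

@dataclass
class SceneObject:
    name: str
    distance: float  # metres from the camera

def build_render_list(objects, max_draw_distance):
    # Anything past the draw distance never gets submitted to the GPU.
    return [o for o in objects if o.distance <= max_draw_distance]

scene = [
    SceneObject("player_weapon", 1.0),
    SceneObject("street_block", 45.0),
    SceneObject("distant_cityscape", 900.0),
]

# On the weaker port, an added occluder wall justifies a shorter draw
# distance, so the distant cityscape is simply never rendered.
handheld_list = build_render_list(scene, max_draw_distance=100.0)
print([o.name for o in handheld_list])  # distant_cityscape is culled
```

Real engines do this with frustum/occlusion culling per frame; the point is the same: the cheapest pixel is the one you never draw.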

Mr Puggsly said:

So X1X gets people more immersed? I can agree on that but I don't feel high visual fidelity is necessarily that important.

Then that just removes one of the single largest selling points of the Xbox One X... Good thing you don't speak for all gamers and what they need/want/desire.

Mr Puggsly said:

The 8th gen CPUs are the most capable trash I've ever seen. I mean a game like Horizon 4 at 60 fps!? What a pile of shit. Either way, I'm glad more capable CPUs will be in the next gen consoles for practical reasons.

I actually have an understanding of how Horizon achieved what it did on the processor it did. - It doesn't make Jaguar any less crippling.
A lot of simulation-level effects were absent in that game, water being one of the larger ones... But in return those extra CPU cycles were spent elsewhere, like on ants crawling up a tree.

Yes, Jaguar is a piece of crap... It's AMD's worst CPU from a time when they had their worst CPUs... Keep that in perspective.

Yes, Ryzen for next gen is going to be amazing.

Mr Puggsly said:

The funny thing about games in MCC: they're old, but their scope seems bigger than many modern games that are more linear in comparison. Which gets back to a point I made many times already, better specs don't always mean increased scope.

Halo games tend to be linear in the way you traverse the campaign, but with wider vistas thanks to the sandbox... That allows data streaming to be fairly effective.

Open World games have certainly become more common today... And that is thanks to the increase in hardware capabilities enabling such scope.

Mr Puggsly said:

Ashes of Singularity simply wouldn't work on console CPUs? Also, is there an optimization issue, or would scaling back its CPU needs really change the experience of the game? I've seen the game and it seems to be a fairly standard RTS; maybe it would just run like shit on consoles during heavy action? For the record, I was already aware of this game because it's often looked at for its CPU demands.

Ashes of the Singularity is running a degree of simulation that would cripple Jaguar.

Something like Supreme Commander runs well on consoles because the level of A.I simulation is kept relatively simple... That isn't the case for Ashes of the Singularity.

Mr Puggsly said:

The 8th gen consoles are too limited to achieve 60 fps in many games. There would be too much resolution/graphics compromise and there is already too much CPU bottleneck to achieve that. In the next gen though, if the CPUs are as capable as we hope, CPU bottleneck is going to be less of an issue. Meanwhile 1440p to 4K will likely become pretty standard. Taking all that into consideration, it should be easier to give 60 fps options in the next gen versus current gen.

But you said that Jaguar was "capable". - If the hardware was capable, there would be more 60fps games, you need to stop contradicting yourself, especially in the same post.

The CPU bottleneck should be non-existent next-gen and a GPU/Ram bottleneck will become more pronounced... But just because the CPU bottleneck has been alleviated doesn't mean we are going to have 60fps games coming out the wazoo.

Apologies for the late reply, on Holiday.

DonFerrari said:

A lot of work that could be better done on the CPU has been done on the GPU because of the CPU's lower performance relative to it. So for games like competitive online, fighting and racing, 60fps is usually the first target, then resolution second. And for that they may have to simplify effects and other IQ elements to hit the performance budget.

Also I don't see anything wrong with pemalite's posts. He isn't breaking any rules nor using his position as moderator to demand you shut up or accept his argument. So I don't see what a moderator has to do differently than you; I also find it strange when people demand better behavior from others because of their position/job, behavior they don't uphold themselves.

My behavior is exactly the same before and after I was a moderator... So using that point to have a moan isn't really getting him anywhere.

Every console generation we have gotten more powerful CPUs... And yet in the history of consoles, we still haven't gotten guaranteed 60fps in any generation; next gen is not going to be any different.

I clarified any confusion during our Fable 3 discussion, but you kept pressing. Give it a rest, I generally expect better from the mods. It wasn't even a discussion to win.

Half Life 2 on Xbox didn't run fine, but it certainly ran. Performance was the worst aspect of that port and makes it a difficult version to revisit.

You missed the point in regard to RAM. While 5GB certainly is not a ton of RAM for games, the RAM requirements for PC gaming have stayed relatively stagnant. Hence, modern games don't seem to have hit a wall from RAM limitations like previous gens did. Even the Switch is doing impressive games like Witcher 3 with even less RAM, albeit struggling with textures.

I don't feel the disparity in specs between base X1 and X1X is significant enough. Therefore anything that could be developed to take full advantage of the X1X at 1080p/30 fps should be able to scale back relatively easily for a base X1 if they mostly scale back GPU-heavy effects. It seems like almost all 8th gen games can work on Switch because the specs disparity just isn't big enough, even if there are minor compromises. The example you gave for Wolfenstein 2 on Switch is mostly aesthetic and was likely done to boost performance. Anyhow, we all know the X1X's primary focus was making X1 games look and play better, which at the very least it certainly does. Sometimes the disparity is so big it seems like the games were developed for X1X specs; Soul Calibur VI, for example, looks bad and loads horribly on base hardware.

Open world games were pretty common last gen as well. The big difference this gen is more online open world stuff. I imagine RAM was helpful for that but they still existed on last gen.

Well, there isn't much of a debate to have on Ashes of Singularity; maybe its complex AI is incredibly demanding, maybe it's an optimization issue. I do see video of an FX-6300 running the game relatively poorly, but it runs. I mention that because that CPU in practice seems to give similar performance to consoles.

Oh lord... let me elaborate. I feel the Jaguar CPUs in the current consoles have shown great potential. For example, I'm playing Gears 4 (Gears 5 soon), Forza Horizon 4 and other titles that stick relatively close or stay at 60 fps. There are also games that did a good job hitting 60 fps on base hardware like Forza, GT, MGS, Halo, BF, CoD (some better than others), etc. In my mind, that's pretty good for CPUs people call trash. Either way, I can't deny there is CPU bottleneck in many games that make hitting 60 fps impossible. However, GPU was also limited for high quality visuals/effects, high resolutions (900p-1080p) and 60 fps at the same time.

People often say it was the CPU that was too limited in the 8th gen, but GPU was also a culprit. Because even when CPU bottleneck wasn't a primary issue for 60 fps, it still takes a lot of GPU power to achieve 60 fps with high visual fidelity. Limited GPU power is why dynamic resolution is common in 60 fps games.
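Dynamic resolution in a nutshell: measure how long the GPU took on recent frames, then scale the render target up or down so GPU time stays inside the 60 fps budget. A hypothetical, heavily simplified controller (real engines filter over many frames and scale per-axis):

```python
TARGET_FRAME_MS = 16.7  # the 60 fps frame budget

def adjust_render_scale(scale, gpu_frame_ms, lo=0.6, hi=1.0):
    """Nudge the resolution scale so GPU time tracks the frame budget."""
    if gpu_frame_ms > TARGET_FRAME_MS:
        scale *= 0.95   # GPU over budget: render fewer pixels
    elif gpu_frame_ms < TARGET_FRAME_MS * 0.85:
        scale *= 1.05   # comfortably under budget: sharpen back up
    return max(lo, min(hi, scale))  # clamp to sane bounds

scale = 1.0
for gpu_ms in [20.0, 19.0, 18.0, 14.0]:  # a heavy stretch, then a light frame
    scale = adjust_render_scale(scale, gpu_ms)
    print(f"gpu {gpu_ms:4.1f} ms -> render scale {scale:.2f}")
```

This is why a 60 fps mode can hold frame rate through explosions: it quietly trades pixels for time.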

In the next gen however, we seem to agree bottleneck on CPU shouldn't be an issue for 60 fps. Also, resolution at 1440p-4K will become even more common. Essentially the compromises needed for 60 fps become less work. For example, the X1X offers more 60 fps content because it has a little extra CPU power and they can drop the resolution (and effects) to reduce GPU bottleneck. Hence, less work to hit 60 fps means more games should (WILL) offer it.

Last edited by Mr Puggsly - on 23 July 2019

Recently Completed
River City: Rival Showdown
for 3DS (3/5) - River City: Tokyo Rumble for 3DS (4/5) - Zelda: BotW for Wii U (5/5) - Zelda: BotW for Switch (5/5) - Zelda: Link's Awakening for Switch (4/5) - Rage 2 for X1X (4/5) - Rage for 360 (3/5) - Streets of Rage 4 for X1/PC (4/5) - Gears 5 for X1X (5/5) - Mortal Kombat 11 for X1X (5/5) - Doom 64 for N64 (emulator) (3/5) - Crackdown 3 for X1S/X1X (4/5) - Infinity Blade III - for iPad 4 (3/5) - Infinity Blade II - for iPad 4 (4/5) - Infinity Blade - for iPad 4 (4/5) - Wolfenstein: The Old Blood for X1 (3/5) - Assassin's Creed: Origins for X1 (3/5) - Uncharted: Lost Legacy for PS4 (4/5) - EA UFC 3 for X1 (4/5) - Doom for X1 (4/5) - Titanfall 2 for X1 (4/5) - Super Mario 3D World for Wii U (4/5) - South Park: The Stick of Truth for X1 BC (4/5) - Call of Duty: WWII for X1 (4/5) -Wolfenstein II for X1 - (4/5) - Dead or Alive: Dimensions for 3DS (4/5) - Marvel vs Capcom: Infinite for X1 (3/5) - Halo Wars 2 for X1/PC (4/5) - Halo Wars: DE for X1 (4/5) - Tekken 7 for X1 (4/5) - Injustice 2 for X1 (4/5) - Yakuza 5 for PS3 (3/5) - Battlefield 1 (Campaign) for X1 (3/5) - Assassin's Creed: Syndicate for X1 (4/5) - Call of Duty: Infinite Warfare for X1 (4/5) - Call of Duty: MW Remastered for X1 (4/5) - Donkey Kong Country Returns for 3DS (4/5) - Forza Horizon 3 for X1 (5/5)


Personally, I was happy with a lot of the experiences that the Espresso CPU in the Wii U managed to deliver, but I still won't dispute that it was still, objectively speaking, a terribly weak CPU for a home console releasing in 2012.



curl-6 said:

Personally, I was happy with a lot of the experiences that the Espresso CPU in the Wii U managed to deliver, but I still won't dispute that it was still, objectively speaking, a terribly weak CPU for a home console releasing in 2012.

The Wii U in general was not impressive hardware for 2012. I bought a Wii U for exclusives and they weren't technically much better than 360 or PS3.

I'm not defending the Jaguar CPUs because they're running games I enjoy. I'm just pointing out that in practice Jaguar CPUs were used for impressive content and achieved 60 fps more often than they're given credit for.

People also say 8th gen consoles had good GPUs but very dated CPUs. Essentially they were unbalanced, but I disagree with that as well. The best looking games on that hardware were generally 30 fps, primarily because that's the frame rate those GPUs could handle with that visual fidelity, and the Jaguar CPUs were perfectly suited for that. However, those same CPUs could produce 60 fps games if that was the focus.

Last edited by Mr Puggsly - on 24 July 2019


Mr Puggsly said:
curl-6 said:

Personally, I was happy with a lot of the experiences that the Espresso CPU in the Wii U managed to deliver, but I still won't dispute that it was still, objectively speaking, a terribly weak CPU for a home console releasing in 2012.

The Wii U in general was not impressive hardware for 2012. I bought a Wii U for exclusives and they weren't technically much better than 360 or PS3.

I'm not defending the Jaguar CPUs because they're running games I enjoy. I'm just pointing out that in practice Jaguar CPUs were used for impressive content and achieved 60 fps more often than they're given credit for.

People also say 8th gen consoles had good GPUs but very dated CPUs. Essentially they were unbalanced, but I disagree with that as well. The best looking games on that hardware were generally 30 fps, primarily because that's the frame rate those GPUs could handle with that visual fidelity, and the Jaguar CPUs were perfectly suited for that. However, those same CPUs could produce 60 fps games if that was the focus.

A lot of Wii U games were 60fps as well, because those games were tailored to the capabilities of Espresso, just as games made on PS4 and Xbone were tailored to the capabilities of the Jags. Doesn't change the fact that the CPUs in all 3 consoles are objectively weak.



Definitely not. Even Nintendo put BOTW on the Wii U, and there were only about 15m of those things sold. If MS skipped the X1 with Halo 6, that would be 43 million of the hardcore MS Xbox fans feeling somewhat let down by the move.

There really is no reason not to put some form of a port onto the X1 family of systems, especially with how many titles now use things like dynamic resolution or checkerboarding; that should allow the game to run easily on the X1X at the very least. Hardware wise, a 6TF machine should be fine running anything from next gen with a few cutbacks.



Why not check me out on youtube and help me on the way to 2k subs over at www.youtube.com/stormcloudlive

Mr Puggsly said:

I clarified any confusion during our Fable 3 discussion, but you kept pressing. Give it a rest, I generally expect better from the mods. It wasn't even a discussion to win.

Now you are changing the tack of the discussion to step away from your prior fallacies...

Mr Puggsly said:

Half Life 2 on Xbox didn't run fine, but it certainly ran. Performance was the worst aspect of that port and makes it a difficult version to revisit.

It ran fine, not amazing; I have the game on the Original Xbox, so I can see first hand.
It ran better than a lot of other titles of the era, like Morrowind, that's for sure.

Back then it was okay to push titles out with framerates under 30... Heck, it wasn't exactly uncommon during the 7th generation either.

The Xbox 360 port fixed up a lot of the deficiencies, as expected.

Mr Puggsly said:

You missed the point in regard to RAM. While 5GB certainly is not a ton of RAM for games, the RAM requirements for PC gaming have stayed relatively stagnant. Hence, modern games don't seem to have hit a wall from RAM limitations like previous gens did. Even the Switch is doing impressive games like Witcher 3 with even less RAM, albeit struggling with textures.

RAM requirements have remained stagnant because hardware on that front has generally remained stagnant.

The sweet spot for RAM on PC is around 16GB currently because of price/performance issues earlier in the generation, which meant people weren't upgrading their RAM as prolifically as they did before... That will likely change as we transition to DDR5 memory over the next year or two.

That doesn't mean games can't use more RAM; it just doesn't generally happen.

GPU RAM requirements, on the other hand, have increased massively since the 8th gen consoles launched; we went from 1-2GB GPUs to 8-16GB... For some reason a lot of people ignore the jumps on that front.

Mr Puggsly said:

I don't feel the disparity in specs between base X1 and X1X is significant enough. Therefore anything that could be developed to take full advantage of the X1X at 1080p/30 fps should be able to scale back relatively easily for a base X1 if they mostly scale back GPU-heavy effects. It seems like almost all 8th gen games can work on Switch because the specs disparity just isn't big enough, even if there are minor compromises. The example you gave for Wolfenstein 2 on Switch is mostly aesthetic and was likely done to boost performance. Anyhow, we all know the X1X's primary focus was making X1 games look and play better, which at the very least it certainly does. Sometimes the disparity is so big it seems like the games were developed for X1X specs; Soul Calibur VI, for example, looks bad and loads horribly on base hardware.

We don't know whether the gap in specifications between the Xbox One and Xbox One X is significant enough; how you feel is ultimately irrelevant to this point...

We have absolutely no information on the hardware demands that Halo: Infinite will bring or what cutbacks Microsoft had to make in order to make the game viable on inferior platforms.

We do have some information on some of the rendering effects being employed thanks to the trailers showcasing them, but that's about it.

I am not saying a port is impossible, clearly it is possible as it's actually happening. - My point of contention is what did they have to give up to get there?
Is it just resolution and framerate? Unlikely. Will they have to go back and re-engineer parts of the game to reduce the rendering load? Cutback of effects?

Mr Puggsly said:

Open world games were pretty common last gen as well. The big difference this gen is more online open world stuff. I imagine RAM was helpful for that but they still existed on last gen.

Player counts, thanks to the CPU jump, were one of the big enablers this gen; Battlefield, for instance, saw an increase in player counts per map.

Plus Physics.
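Part of why the CPU jump matters for player counts and physics: interaction-heavy systems scale worse than linearly with entity count. A naive illustration (brute-force pairwise checks; real engines mitigate this with broad-phase structures, but the underlying growth is the same):

```python
def pairwise_checks(n_entities):
    # Brute-force interaction tests: every unordered pair checked once.
    return n_entities * (n_entities - 1) // 2

# Illustrative player counts only; doubling players roughly quadruples
# the pairwise work, which lands squarely on the CPU.
for n in [24, 32, 64]:
    print(f"{n} players -> {pairwise_checks(n)} pair checks per tick")
```

So a modest bump in player count is a much larger bump in CPU-side simulation work, which is why weak console CPUs capped it.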

Open world is more common because less engineering needs to be dumped into making it possible... Last gen, developers were getting clever with various approaches, like imposters and compressed texture and mesh streaming from disk which the CPU would unpack... all just to fit everything into less than 512MB of memory.

Mr Puggsly said:

Well, there isn't much of a debate to have on Ashes of Singularity; maybe its complex AI is incredibly demanding, maybe it's an optimization issue. I do see video of an FX-6300 running the game relatively poorly, but it runs. I mention that because that CPU in practice seems to give similar performance to consoles.

The FX-6300 is faster than Jaguar anyway... And all AMD FX processors were generally crap. - It's not an optimization issue... You really need to play the game and see the detail they added to the complex A.I. - It's probably the best game to showcase what you can do on the A.I front when you have oodles of CPU time to play with.

Mr Puggsly said:

Oh lord... let me elaborate. I feel the Jaguar CPUs in the current consoles have shown great potential. For example, I'm playing Gears 4 (Gears 5 soon), Forza Horizon 4 and other titles that stick relatively close or stay at 60 fps. There are also games that did a good job hitting 60 fps on base hardware like Forza, GT, MGS, Halo, BF, CoD (some better than others), etc. In my mind, that's pretty good for CPUs people call trash. Either way, I can't deny there is CPU bottleneck in many games that make hitting 60 fps impossible. However, GPU was also limited for high quality visuals/effects, high resolutions (900p-1080p) and 60 fps at the same time.

Those titles achieve 60fps because they are fairly light on the CPU effects like Physics.
The games on PC are simply doing more... And as a result tend to use more CPU cycles.

But just because a game isn't hitting 60fps doesn't mean it's a CPU limitation, if you are GPU limited you won't hit 60fps either.

Now, because I own Forza, Gears, Halo, Battlefield and Call of Duty, many of which I played on base hardware and on my Xbox One X, I can say those games weren't doing much in the way of heavy CPU utilization... So Jaguar is of course not going to be a hindrance.

But what if we had 10x more CPU power at our disposal? Those same games would still be 60fps, but we would have far better A.I, more physics, better scripting, more characters on screen, better positional audio... We wouldn't have needed Microsoft's original push to leverage the cloud for destruction in Crackdown 3, for example.

The CPUs are certainly trash. Jaguar was garbage even on its original release on the PC, and even its predecessor Brazos was pretty average at release... Sprinkle half a decade on top of that and it hasn't done it any favors.

Scarlett is set to change all that in a big way of course... And for once I am actually excited that console manufacturers are taking CPU performance seriously for the first time in generations.

Mr Puggsly said:

People often say it was the CPU that was too limited in the 8th gen, but GPU was also a culprit. Because even when CPU bottleneck wasn't a primary issue for 60 fps, it still takes a lot of GPU power to achieve 60 fps with high visual fidelity. Limited GPU power is why dynamic resolution is common in 60 fps games.

For the base Xbox One, GPU and memory bandwidth were limiters from launch day; many games ended up at 720p because of it.

Dynamic resolution is there to make full use of the limited GPU resources, but being GPU limited doesn't mean you aren't also CPU limited; it's disingenuous to assert that only one can happen at a time, or that one doesn't exist just because you have 1080p/60fps.
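The coexisting-bottlenecks point falls straight out of frame timings: CPU and GPU work largely in parallel, so a frame takes roughly as long as the slower of the two, and both can be over budget at once. A toy classifier over hypothetical per-frame timings:

```python
def classify_frame(cpu_ms, gpu_ms, budget_ms=16.7):
    """A frame misses 60 fps if either side blows the budget."""
    bound = []
    if cpu_ms > budget_ms:
        bound.append("CPU")
    if gpu_ms > budget_ms:
        bound.append("GPU")
    # Frame time is roughly set by whichever side finishes last.
    fps = 1000.0 / max(cpu_ms, gpu_ms)
    return bound, fps

for cpu, gpu in [(12.0, 15.0), (12.0, 22.0), (20.0, 25.0)]:
    bound, fps = classify_frame(cpu, gpu)
    label = " + ".join(bound) if bound else "neither"
    print(f"cpu {cpu} ms, gpu {gpu} ms -> {label} limited, ~{fps:.0f} fps")
```

Note the last case: fixing only the CPU there still leaves you under 60 fps, which is the whole argument in one line.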

Games generally just feel like prettier versions of 7th gen games, and one of the reasons for that is the CPU side of the equation: things didn't take a massive leap there, it was a more conservative jump on the performance scale... More so if you were coming from the PlayStation 3.

Mr Puggsly said:

In the next gen however, we seem to agree bottleneck on CPU shouldn't be an issue for 60 fps. Also, resolution at 1440p-4K will become even more common. Essentially the compromises needed for 60 fps become less work. For example, the X1X offers more 60 fps content because it has a little extra CPU power and they can drop the resolution (and effects) to reduce GPU bottleneck. Hence, less work to hit 60 fps means more games should (WILL) offer it.

CPU bottlenecks can exist even if you are at 60fps.

Console developers generally work within the constraints of what they have... So developers make significant cutbacks in various areas. On PC those limitations tend to be removed, and we can see what developers originally envisioned in some aspects... And the difference a CPU makes can be rather large in a few key areas, some of which I alluded to earlier in my post.

The Xbox One X can achieve 60fps more often because it has more bandwidth, more memory, more GPU performance... And yes, a slightly faster CPU... But you can't give all the thanks to the CPU, it's still a limitation.

curl-6 said:

Personally, I was happy with a lot of the experiences that the Espresso CPU in the Wii U managed to deliver, but I still won't dispute that it was still, objectively speaking, a terribly weak CPU for a home console releasing in 2012.

Espresso was a more capable CPU than the Xbox 360's, though less so than the PlayStation 3's, mostly thanks to the fact it wasn't an in-order design... But it was held back by clock rates.

But overall, it fit nicely in the 7th gen in terms of CPU capability. Nintendo typically emphasizes 60fps in its titles anyway, so they work with what they have really well.

Mr Puggsly said:

The Wii U in general was not impressive hardware for 2012. I bought a Wii U for exclusives and they weren't technically much better than 360 or PS3.

I'm not defending the Jaguar CPUs because they're running games I enjoy. I'm just pointing out that in practice Jaguar CPUs were used for impressive content and achieved 60 fps more often than they're given credit for.

Indeed. The Wii U's hardware wasn't super impressive. - Many games engineered with the Wii U's limitations in mind did shine on the hardware though... Mostly Nintendo exclusives.
But overall, if the console had had more memory bandwidth, it could almost have been an Xbox 360 Pro from a hardware perspective.


Last edited by Pemalite - on 25 July 2019

--::{PC Gaming Master Race}::--

Pemalite said:
Mr Puggsly said:

I clarified any confusion during our Fable 3 discussion, but you kept pressing. Give it a rest, I generally expect better from the mods. It wasn't even a discussion to win.

Now you are changing the tact of the discussion to step away from your prior fallacies...

Mr Puggsly said:

Half Life 2 on Xbox didn't run fine, but it certainly ran. Performance was the worst aspect of that port and makes it a difficult version to revisit.

It ran fine, not amazing, I have the game on the Original Xbox so I can see first hand.
It ran better than allot of other titles of the era like Morrowind that's for sure.

Back then it was okay to push titles out with framerates under 30... Heck, it wasn't exactly uncommon during the 7th generation either.

The Xbox 360 port fixed up allot of the deficiencies as expected.

Mr Puggsly said:

You missed the point in regard to RAM. While 5GB certainly is not a ton of RAM for games, but the RAM requirements for PC gaming stayed relatively stagnant. Hence, modern games haven't seem to hit a wall due to struggling with RAM limitations like previous gens did. Even the Switch is doing impressive games like Witcher 3 with even less RAM, albeit struggling with textures.

Ram requirements have remained stagnant because hardware on that front has generally remained stagnant.

The sweet spot for Ram on PC is around 16Gb currently because of price/performance issues earlier on in the generation on PC which meant people weren't  upgrading their Ram as prolifically as they did prior... That will likely change as we transition into DDR5 memory over the next year or two.

That doesn't mean games can't use more Ram, it just doesn't generally happen.

GPU Ram requirements on the other hand have increased massively since the 8th gen consoles launched, we went from 1-2GB GPU's to 8GB-16GB... For some reason allot of people ignore those jumps on that front.

Mr Puggsly said:

I don't feel the disparity in specs between base X1 and X1X are significant enough. Therefore anything that could be developed to take full advantage of the X1X at 1080p/30 fps, should be able to scale back relatively easily for a base X1 if they mostly scale back GPU heavy effects. It seems like almost most 8th gen games can work on Switch because the specs disparity just isn't big enough, even if there are minor compromises. The example you gave for Wolfenstein 2 on Switch is mostly aesthetic and was likely done to boost performance. Anyhow, we all know the X1X's primary focus was making X1 games look and play better, which at the very least it certainly does that. Sometimes the disparity is so big it seems like the games were developed for X1X specs, Soul Calibur VI for example looks bad and loads horribly on base hardware.

We don't know if the specifications of the Xbox One and Xbox One X are significant enough, how you feel is ultimately irrelevant to this point...

We have absolutely no information on the hardware demands that Halo: Infinite will bring or what cutbacks Microsoft had to make in order to make the game viable on inferior platforms.

We do have some information on some of the rendering effects being employed thanks to the trailers showcasing them, but that's about it.

I am not saying a port is impossible, clearly it is possible as it's actually happening. - My point of contention is what did they have to give up to get there?
Is it just resolution and framerate? Unlikely. Will they have to go back and re-engineer parts of the game to reduce the rendering load? Cutback of effects?

Mr Puggsly said:

Open world games were pretty common last gen as well. The big difference this gen is more online open world stuff. I imagine RAM was helpful for that but they still existed on last gen.

Player counts thanks to the CPU jump was one of the big enablers this gen, Battlefield for instance saw an increase in multiplayer counts per map.

Plus Physics.

Openworld is more common because less engineering needs to be dumped into making it possible... Last gen developers were getting cleaver with various approaches like implementing impostering, compressed texture and mesh streaming from disk, which the CPU would unpack... And so on just to fit everything into less than 512MB of memory.

Mr Puggsly said:

Well there isn't much a debate to have on Ashes of Singularity, maybe its complex AI is incredibly demanding, maybe its an optimization issue. I do see video of a FX-6300 running the game relatively poorly, but it runs. I mention that because that CPU in practice seems to give similar performance to consoles.

The FX-6300 is faster than Jaguar anyway... And all AMD FX processors were generally crap anyway. - It's not an optimization issue... You really need to play the game and see the details they added into the complex A.I. - It's probably the best game to showcase what you can do on the A.I front when you have oodles of CPU time to play with.

Mr Puggsly said:

Oh lord... let me elaborate. I feel the Jaguar CPUs in the current consoles have shown great potential. For example, I'm playing Gears 4 (Gears 5 soon), Forza Horizon 4 and other titles that stick relatively close or stay at 60 fps. There are also games that did a good job hitting 60 fps on base hardware like Forza, GT, MGS, Halo, BF, CoD (some better than others), etc. In my mind, that's pretty good for CPUs people call trash. Either way, I can't deny there is CPU bottleneck in many games that make hitting 60 fps impossible. However, GPU was also limited for high quality visuals/effects, high resolutions (900p-1080p) and 60 fps at the same time.

Those titles achieve 60fps because they are fairly light on CPU-side work like physics.
The same games on PC are simply doing more... And as a result tend to use more CPU cycles.

But just because a game isn't hitting 60fps doesn't mean it's a CPU limitation; if you are GPU limited you won't hit 60fps either.
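That point can be shown with a toy model: CPU and GPU work largely overlap, so the slower side sets the frame time, and a 60fps readout alone doesn't tell you which side is near its ceiling. This is an idealized sketch, not how any real engine measures it:

```python
def fps(cpu_ms, gpu_ms):
    """Idealized frame rate: CPU and GPU work overlap, so the slower
    side sets the frame time (real pipelines are messier than this)."""
    frame_ms = max(cpu_ms, gpu_ms)
    return 1000.0 / frame_ms

# A game can miss 60fps while the CPU is half idle...
print(fps(cpu_ms=8.0, gpu_ms=25.0))   # GPU limited: 40fps
# ...and hit 60fps while sitting right on the CPU ceiling.
print(fps(cpu_ms=16.0, gpu_ms=10.0))  # CPU limited: 62.5fps
```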

Now because I own Forza, Gears, Halo, Battlefield and Call of Duty... Many of which I played on base hardware and on my Xbox One X, I can say those games weren't doing much in the way of utilizing the CPUs heavily... So Jaguar is of course not going to be a hindrance.

But what if we had 10x more CPU power at our disposal? Those same games would still be 60fps, but we would have far better A.I., more physics, better scripting, more characters on screen, better positional audio... We wouldn't have needed Microsoft's original push to leverage the cloud for destruction in Crackdown 3, for example.

The CPUs are certainly trash. Jaguar was garbage even on its original release on the PC, and even its predecessor, Brazos, was pretty average at launch... Sprinkle half a decade on top of that and time hasn't done it any favors.

Scarlett is set to change all that in a big way of course... And for once I am actually excited that console manufacturers are taking CPU performance seriously for the first time in generations.

Mr Puggsly said:

People often say it was the CPU that was too limited in the 8th gen, but GPU was also a culprit. Because even when CPU bottleneck wasn't a primary issue for 60 fps, it still takes a lot of GPU power to achieve 60 fps with high visual fidelity. Limited GPU power is why dynamic resolution is common in 60 fps games.

For the base Xbox One, the GPU and memory bandwidth were limiters from launch day; many games ended up at 720p because of it.

Dynamic resolution is there to make full use of limited GPU resources, but being GPU limited doesn't mean you aren't also CPU limited; it's disingenuous to assert that only one can happen at a time, or that one doesn't exist because you have 1080p/60fps.
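For what it's worth, a dynamic resolution controller is conceptually just a feedback loop on GPU frame time: scale the render target down when over budget, back up when there's headroom. A hypothetical sketch (the square-root step assumes GPU cost scales with pixel count, i.e. with scale squared):

```python
TARGET_MS = 16.7            # frame budget for 60fps
MIN_SCALE, MAX_SCALE = 0.5, 1.0

def next_resolution_scale(scale, gpu_ms, target_ms=TARGET_MS):
    """Nudge the render scale so measured GPU time tracks the budget.
    error > 1 means headroom, error < 1 means over budget."""
    error = target_ms / gpu_ms
    scale *= error ** 0.5   # square root because GPU cost ~ scale^2
    return max(MIN_SCALE, min(MAX_SCALE, scale))
```

Real implementations smooth the measurement over several frames and quantize the scale steps; this only shows the core feedback idea.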

Games generally just feel like a prettier version of the 7th gen. One of the reasons for that is the CPU side of the equation: things didn't take a massive leap, it was a more conservative jump on the performance scale... More so if you were coming from the Playstation 3.

Mr Puggsly said:

In the next gen however, we seem to agree bottleneck on CPU shouldn't be an issue for 60 fps. Also, resolution at 1440p-4K will become even more common. Essentially the compromises needed for 60 fps become less work. For example, the X1X offers more 60 fps content because it has a little extra CPU power and they can drop the resolution (and effects) to reduce GPU bottleneck. Hence, less work to hit 60 fps means more games should (WILL) offer it.

CPU bottlenecks can exist even if you are at 60fps.

Console developers generally work within the constraints of what they have on a console... So developers make significant cutbacks in various areas... On PC those limitations tend to be removed and we can see what developers originally envisioned in some aspects... And the difference a CPU makes can be rather large in a few key areas, some of which I alluded to earlier in my post.

The Xbox One X can achieve 60fps more often because it has more bandwidth, more memory, more GPU performance... And yes, a slightly faster CPU... But you can't give all the thanks to the CPU; it's still a limitation.

curl-6 said:

Personally, I was happy with a lot of the experiences that the Espresso CPU in the Wii U managed to deliver, but I still won't dispute that it was still, objectively speaking, a terribly weak CPU for a home console releasing in 2012.

Espresso was a more capable CPU than the Xbox 360's, though less capable than the Playstation 3's, mostly thanks to the fact it wasn't an in-order design... But it was held back by clock rates.

But overall, it fit nicely in the 7th gen in terms of CPU capability. Nintendo typically emphasizes 60fps in its titles anyway, so they work really well with what they have.

Mr Puggsly said:

The Wii U in general was not impressive hardware for 2012. I bought a Wii U for exclusives and they weren't technically much better than 360 or PS3.

I'm not defending the Jaguar CPUs because its running games I enjoy. I'm just pointing out in practice Jaguar CPUs were used for impressive content and achieved 60 fps more often than given credit for.

Indeed, the Wii U's hardware wasn't super impressive. - Many games engineered with the Wii U's limitations in mind did shine on the hardware... Mostly Nintendo exclusives.
But overall, if the console had had more memory bandwidth, it could almost have been an Xbox 360 Pro from a hardware perspective.


No, Fable 3 on PC is trash because it has the GFW crap, and that was my original point. I stand by that; fairly certain that's why it was delisted as well. You feel MS doesn't want to make keys, I feel they don't want to sell products with GFW. Not really a winner and loser debate.

Abysmal frame rates were more tolerated at the time, but that doesn't mean it ran fine. DF looked at that version in their HL2 retrospective; it hung in the teens and nose dived during heavy physics. I played it, enjoyed it at the time, but it was rough, and that doesn't mean it ran fine. We just had lower performance expectations for technical marvels, I guess.

In previous gens, RAM on PC was generally significantly higher than on consoles. I remember needing 256MB of RAM to play a game that's virtually the same on Xbox. PC has generally been less efficient and has to run an OS like Windows.

I looked at RAM usage of AAA releases in 2013-14; 4GB was fairly common. Meanwhile, many modern games can function fine with 8GB. And again, PC is just less efficient.

Again, 7th gen had a lot of open world games. Red Faction Guerrilla was even a mix of great physics and an open world at the same time. BF lowering the player count was likely for performance reasons.

In practice, the console CPUs have outperformed the FX-6300. Again, just an example of superior console optimization. I'm pointing out that CPU can run the game, and we can only speculate what the console CPUs could do with good optimization.

Again, I don't feel 10x CPU necessarily changes how a game would be designed in most cases. They may splurge on CPU-heavy effects that are easy to add, but I feel something like AI is generally design related more than a spec limitation.

I suspect Crackdown 3's destruction ambitions were scaled back simply because it was difficult to actually create. At some point they just threw something together.

I feel the Jaguar CPUs were fine for what developers were looking to do this gen. Frankly, they had more CPU power this gen and didn't do much I consider ambitious compared to the previous one.

Maybe CPU is being taken seriously because game design in general has been kinda stagnant; more demand for 60 fps, and it will help loading, split screen, practical stuff. Generally speaking, I don't expect more CPU to make many fresh experiences.

On a side note, I think it's time MS plays its Windows card and allows Xbox to run PC games, maybe in a curated fashion, kinda like backwards compatibility. A great CPU would help with that. It would also mean the Xbox library could grow easily in cases where developers don't want to make an Xbox port.



Recently Completed
River City: Rival Showdown
for 3DS (3/5) - River City: Tokyo Rumble for 3DS (4/5) - Zelda: BotW for Wii U (5/5) - Zelda: BotW for Switch (5/5) - Zelda: Link's Awakening for Switch (4/5) - Rage 2 for X1X (4/5) - Rage for 360 (3/5) - Streets of Rage 4 for X1/PC (4/5) - Gears 5 for X1X (5/5) - Mortal Kombat 11 for X1X (5/5) - Doom 64 for N64 (emulator) (3/5) - Crackdown 3 for X1S/X1X (4/5) - Infinity Blade III - for iPad 4 (3/5) - Infinity Blade II - for iPad 4 (4/5) - Infinity Blade - for iPad 4 (4/5) - Wolfenstein: The Old Blood for X1 (3/5) - Assassin's Creed: Origins for X1 (3/5) - Uncharted: Lost Legacy for PS4 (4/5) - EA UFC 3 for X1 (4/5) - Doom for X1 (4/5) - Titanfall 2 for X1 (4/5) - Super Mario 3D World for Wii U (4/5) - South Park: The Stick of Truth for X1 BC (4/5) - Call of Duty: WWII for X1 (4/5) -Wolfenstein II for X1 - (4/5) - Dead or Alive: Dimensions for 3DS (4/5) - Marvel vs Capcom: Infinite for X1 (3/5) - Halo Wars 2 for X1/PC (4/5) - Halo Wars: DE for X1 (4/5) - Tekken 7 for X1 (4/5) - Injustice 2 for X1 (4/5) - Yakuza 5 for PS3 (3/5) - Battlefield 1 (Campaign) for X1 (3/5) - Assassin's Creed: Syndicate for X1 (4/5) - Call of Duty: Infinite Warfare for X1 (4/5) - Call of Duty: MW Remastered for X1 (4/5) - Donkey Kong Country Returns for 3DS (4/5) - Forza Horizon 3 for X1 (5/5)

Mr Puggsly said:

No, Fable 3 on PC is trash because it has the GFW crap, and that was my original point. I stand by that; fairly certain that's why it was delisted as well. You feel MS doesn't want to make keys, I feel they don't want to sell products with GFW. Not really a winner and loser debate.

Either way... Games for Windows Live isn't a legitimate excuse. Just remove it (and patch all the older GFWL titles too) and sell more keys on Steam. Microsoft is just being lazy; they deserve the criticism on this front either way.

Mr Puggsly said:

Abysmal frame rates were more tolerated at the time, but that doesn't mean it ran fine. DF looked at that version in their HL2 retrospective; it hung in the teens and nose dived during heavy physics. I played it, enjoyed it at the time, but it was rough, and that doesn't mean it ran fine. We just had lower performance expectations for technical marvels, I guess.

I fired it up not long ago; performance wise it's nothing unexpected, and it had semi-decent frame pacing so it still controlled okay... I mean, Halo 3 had terrible framerates and poor frame pacing, which made the controls feel extremely floaty, but the game still played fine for that era. Obviously it's a far better experience on the Xbox One, just like Half Life 2 is a far better experience on the Xbox 360.

Either way, I feel like this is side stepping my actual point... That improvements in hardware capability tend to open up the possibility of new gameplay mechanics... The original Xbox was a showcase for this... Shadowing and lighting took big strides thanks to programmable pixel shaders... And a few games actually leveraged this, can you guess which?

The CPU was also the best of that generation, which enabled games like The Elder Scrolls III: Morrowind and Half Life 2 on console.

Mr Puggsly said:

In previous gens, RAM on PC was generally significantly higher than on consoles. I remember needing 256MB of RAM to play a game that's virtually the same on Xbox. PC has generally been less efficient and has to run an OS like Windows.

The PC tends to run with better, more expensive visuals; that's not free.
It's like comparing the SSAO many console games used, which is lower quality and less memory and GPU intensive, to HBAO... The PC's hardware does get used extensively, you know.

Oblivion uses ~512MB of memory on the Xbox 360... When I re-engineered the shaders, reduced the polygon counts of a lot of the assets and dropped the texture resolution, I was running the game on only 128MB of memory on PC... Does that mean the Xbox 360 is less efficient? No. It's just pushing better visuals.

At one point the PC was running a more memory intensive operating system, especially with the advent of Vista, and even when we shifted from the 9x to NT kernels... But since the 8th gen that is no longer a distinguishing factor, as the Xbox One and Playstation 4 also reserve a few gigabytes for OS/background duties.

Plus we have more and larger caches. It wasn't unheard of for graphics drivers at one point to duplicate the GPU's VRAM in system memory; Intel was notorious for this, which became a significant issue when Aero came about... But over time the PC gets more efficient, and that is no longer the case.

Mr Puggsly said:

I looked at RAM usage of AAA releases in 2013-14; 4GB was fairly common. Meanwhile, many modern games can function fine with 8GB. And again, PC is just less efficient.

The PC is slightly less efficient, but not by a generational difference. Again... Read above.

Keep in mind that PC games using 4GB in 2013 are being compared to games that would use 5-6GB on the 8th gen... But were probably still pushing higher visuals. - That doesn't make consoles less efficient; it's just resources being used differently.

Mr Puggsly said:

Again, 7th gen had a lot of open world games. Red Faction Guerrilla was even a mix of great physics and an open world at the same time. BF lowering the player count was likely for performance reasons.

I am aware that the 7th gen had a lot of open world games; I never said it didn't. - Heck, the 6th gen had a heap of open world games, but it wasn't the norm like it was in the 8th gen, especially from publishers like Ubisoft.

Mr Puggsly said:

In practice, the console CPUs have outperformed the FX-6300. Again, just an example of superior console optimization. I'm pointing out that CPU can run the game, and we can only speculate what the console CPUs could do with good optimization.

Not really.

Just because Ashes of the Singularity... A PC exclusive that leverages all the CPU time you can throw at it... tanks on a low-end, last-generation CPU doesn't make the PC significantly less efficient; it's a different use of resources.

The FX-6300 is more than capable of playing the majority of multiplats just fine... Shit, I can run the vast majority of multiplats on a Core 2 Quad from 12 years ago... That's a 7th gen equivalent CPU. - Jaguar hasn't pushed the CPU bar up all that much in the last decade on the gaming front... It's simply a shit CPU.

Even Digital Foundry recognizes the limitations of AMD's Jaguar... So why don't you?

Mr Puggsly said:

Again, I don't feel 10x CPU necessarily changes how a game would be designed in most cases. They may splurge on CPU-heavy effects that are easy to add, but I feel something like AI is generally design related more than a spec limitation.

We will wait and see.

The gaming industry is extremely mature at this point... And big publishers don't like to take a lot of gambles, preferring to push out yearly releases from reliable franchises...

However, I would imagine there will be some experimentation on the physics, particle and A.I. side of the equation next gen; maybe something like Supreme Commander on console with actually decent A.I.?

On the graphics side, because we don't know much about the ray-tracing implementation in next-gen hardware, the CPU might be employed to assist in culling or some such. - Can only speculate though.

Mr Puggsly said:

I suspect Crackdown 3's destruction ambitions were scaled back simply because it was difficult to actually create. At some point they just threw something together.

Crackdown 3 was a colossal failure; the Xbox One just didn't have the hardware resources to achieve what their originally advertised vision entailed.

Destruction isn't a new concept, Red Faction has been doing it for ages, Battlefield has been doing it for ages, it's a known quantity.

However, the processing that goes into such a scheme is significant... And the scope of the original idea for Crackdown 3's showcase meant the cloud was a necessity, because Jaguar was simply not up to the task.

Mr Puggsly said:

I feel the Jaguar CPUs were fine for what developers were looking to do this gen. Frankly, they had more CPU power this gen and didn't do much I consider ambitious compared to the previous one.

Jaguar was the only option this generation. AMD was not in a good place on the CPU side of the fence... Which is a reversal of where they are now, where the GPU side is where they are dropping the proverbial ball.

But what improvements have we seen this generation over last on the CPU side? We are seeing larger multiplayer maps with more people, and more extensive use of physics based particle effects. - But considering how marginal a CPU improvement Jaguar brought to the gaming table over the 7th gen, I think developers have done well with the hand they were dealt.

When consoles rely on PC technology and use only low-end and mid-range components, there are only so many options available, I guess.

Mr Puggsly said:

Maybe CPU is being taken seriously because game design in general has been kinda stagnant; more demand for 60 fps, and it will help loading, split screen, practical stuff. Generally speaking, I don't expect more CPU to make many fresh experiences.

It will only assist loading if there is heavy scripting, unpacking, decompression, procedural generation and so on going on during the initial load phase.
The main limitation for load times is last-century optical and mechanical disks.
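A back-of-the-envelope model of why: with compressed assets, load time is roughly disk read time plus CPU decompression time, so a faster CPU only shrinks the second term while a slow optical or mechanical disk dominates the first. The numbers below are illustrative, not measurements:

```python
def load_seconds(level_mb, ratio, disk_mbps, decomp_mbps):
    """Sequential model: read level_mb/ratio of compressed data from
    disk, then have the CPU expand it back to level_mb in memory
    (overlap between reading and unpacking is ignored)."""
    read = (level_mb / ratio) / disk_mbps
    unpack = level_mb / decomp_mbps
    return read + unpack

# 4GB level, 2:1 compression: same HDD, then the same HDD with a
# 4x faster decompressor. The HDD's ~20s read dominates either way.
slow_cpu = load_seconds(4096, 2.0, disk_mbps=100, decomp_mbps=400)
fast_cpu = load_seconds(4096, 2.0, disk_mbps=100, decomp_mbps=1600)
print(round(slow_cpu, 1), round(fast_cpu, 1))  # 30.7 23.0
```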

Mr Puggsly said:

On a side note, I think it's time MS plays its Windows card and allows Xbox to run PC games, maybe in a curated fashion, kinda like backwards compatibility. A great CPU would help with that. It would also mean the Xbox library could grow easily in cases where developers don't want to make an Xbox port.

I absolutely agree... It seems to be the path they are heading towards anyway... And considering I have every Xbox console and 500+ games, I wouldn't be against the idea; but I refuse to use the Windows Store... So I would prefer it to be a standalone application.

The reverse could also happen, where PC games operate on Xbox... We saw some hints towards that with the app side of the equation this generation, with even some emulators popping up on Xbox and some games implementing a store-front for mods and so on.

But that is also a double edged sword... It will mean there is less incentive to pick up an Xbox console... One of the reasons I even bothered with Xbox was a couple of games like Halo and Fable... They were day 1 purchases which made me jump on the Original Xbox... They did get PC releases later, but by that point I was invested.
That meant the Xbox 360 and Xbox One were must-have purchases on day 1.

At this point though I have gaming devices for every area of the home, the Xbox One X in the lounge room for couch-gaming, Nintendo consoles in the bedroom for gaming in bed, PC and Playstations in the games room.

And a Ryzen notebook for gaming when away from home/traveling long distance... My platform of choice is of course the PC.



--::{PC Gaming Master Race}::--

Pemalite said:

The PC tends to run with better, more expensive visuals, that's not free. Either way, I feel like this is side stepping my actual point... That improvements in hardware capability tend to open up the possibility of new gameplay mechanics... The original Xbox was a showcase for this... Shadowing and Lighting took big strides thanks to programmable pixel shaders... And a few games actually leveraged this, can you guess which?

Side note: holy shit it is annoying as fuck trying to isolate just one passage in VGC's text editor and delete everything else; it should take me a matter of seconds, but instead it takes minutes cos every time I try to delete just one paragraph the stupid thing automatically highlights everything above it and tries to make me delete it all.

Anywho, yeah, it's quite ironic that the last time a Halo game launched an Xbox console, with the original game, it was very much a showcase for capabilities of next gen hardware that simply couldn't have been replicated on the hardware of the prior gen. And I can't help but feel that a Combat Evolved that was multiplat with N64/PS1 tier hardware would not have become the classic it did, as its signature sophisticated AI and huge levels just wouldn't have been possible.



Would it though?

I just don't see the incentive for people to buy Xbox anymore. The OG Xbox was out for four years and sold only 24 million; the Xbox One is pushing into six years now and has sold only around 45 million, just under double that, so percentage wise it's almost on par. Especially considering nearly all of their exclusives are available on PC now... why would anyone really spend the money on a new Xbox when they can just play all of those games on a PC? The reason I say this is that all my friends who are into gaming are either playing on a PS4 and Switch, just a PS4, or have sold their consoles and built gaming rigs.

Halo is a massive franchise, but it's been in decline since Bungie handed the reins over to 343. I'm stoked for the new Halo and intrigued as always, but I really don't think it would be in anyone's interest at Microsoft to make Halo a Scarlett exclusive. It would hurt their pockets way too much given how dismal their sales were this generation, and it would be a huge disservice to all of the people who bought Xboxes and have been waiting three years for this game.

I really don't think Scarlett will be big either. I see it as a stepping stone for their transition into game streaming. I truly believe Scarlett will be the last physical console that Microsoft puts out. They've proven unable to topple Sony, and they have a new competitor with a giant load of money and developer support behind them (Google).