Mr Puggsly said:

I didn't mean YOU should buy Fable 3. Also, Since YOU already own it, MS doesn't need to sell it anymore.

You referenced me in that sentence. Here I will provide the appropriate quotation in bold:

Mr Puggsly said:

Go buy Fable 3 (or better yet, 2) on Xbox if the prospect of stealing upsets you.

So obviously I took it that you were referencing me from the get-go.
But sure... Backpedal and all that.

*****************

Mr Puggsly said:

The cool physics in Half Life 2 isn't what made that a great game though, it was more like a neat feature. In case you forgot by the way, I'm simply arguing you don't need a vast improvement in specs (like Scarlett) to make interesting new games. Hence, Halo Infinite wouldn't suddenly become a much more ambitious project if it were simply moved to Scarlett exclusively. Many 8th gen games feel derivative or smaller in scope than many 7th gen games.

The cool physics in Half-Life 2 certainly helped make it a great game. Take away all those small "touches" that were simply awesome... And you end up with a dull game like all the other clones at the time; it's the little things that set it apart from all the other shooters.

Halo Infinite specifically has already spent years in development with the anemic base Xbox One hardware in mind, so even if Microsoft/343i were to remove base Xbox One support, the game would still be limited by that hardware unless years more of development work were spent fully taking advantage of the newer hardware.

Mr Puggsly said:

X1X has untapped potential, but I wouldn't say its mostly wasted. Even if we argue it not properly utilized, it still delivers a better way to play X1 games. It takes a lot of GPU power to increase them to 1800p, 4K, or whatever.

If it's not properly utilized, then it's wasted.

That doesn't mean there aren't benefits to having an Xbox One X... But the bulk of the benefits are resolution and framerate, and even then... Many games that were locked to 720p on the base Xbox One remain locked to 720p on the Xbox One X unless they received a specific patch... It's less of an issue on the Sony side of the equation, as 720p titles were an extreme rarity even during the early years.

I mean, let's take Dragon Age: Inquisition. It's a few years old at this point, but it ran at 1600x900 on the base Xbox One and a full 1920x1080 on the base PlayStation 4. And because there isn't an enhanced patch for that title, the PlayStation 4 variant of the game looks better than the Xbox One X version. It's wasted potential... A large part of that blame certainly lies with the developers, though, rather than the hardware itself.
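To put rough numbers on that resolution gap, here's a quick back-of-the-envelope sketch. It only counts framebuffer pixels; real GPU cost also depends on shading complexity, so treat the ratios as illustrative, not measured:

```python
# Raw pixel-count arithmetic behind the resolution comparison.
# Pixel counts only; actual GPU cost scales with shading work too,
# so these ratios are a back-of-the-envelope estimate.

def pixels(width, height):
    return width * height

xb1 = pixels(1600, 900)    # Dragon Age: Inquisition on base Xbox One
ps4 = pixels(1920, 1080)   # same game on base PlayStation 4

print(ps4 / xb1)   # → 1.44 (44% more pixels per frame on PS4)

# And the kind of jump Mr Puggsly mentions, 720p up to native 4K:
print(pixels(3840, 2160) / pixels(1280, 720))   # → 9.0
```

That 9x pixel jump is why "it takes a lot of GPU power" to push old 720p-locked titles up to 4K.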

It will be interesting to see if Microsoft will "enhance" titles from the Xbox One family of consoles with Scarlett by bumping up resolutions like they did with some Original Xbox and Xbox 360 titles.

Mr Puggsly said:

I disagree, the CPU and GPU are fairly balanced in base hardware. Frankly, much of the bottle neck has been on the GPU which is why resolutions have varied and dynamic resolutions became common. People keep saying the 8th gen CPUs are too limited, but I don't think developers are even pushing its limits. I mean Just Cause 4 shouldn't exist if the CPUs were so limited.

In practice, it seems to me the CPUs in 8th gen consoles had enough power for what developers were generally looking to do. You have a theory and I don't feel evidence supports it.

I don't think graphics are just a marketing tool, its something many gamers care about. The 9th gen consoles will have vastly superior CPUs, but generally that will simply mean more 60 fps games.

There are plenty of cases where performance tanks when physics and AI calculations ramp up, which is why a certain Assassin's Creed game on the base Xbox One actually had the edge over the PlayStation 4 version... The higher CPU clock and the lower-latency eSRAM and DDR3 RAM gave the Xbox One the advantage in those scenarios.

As for the GPUs, they are clearly the shining star of the 8th gen devices.

But one thing we need to keep in mind is that consoles are a closed environment, so developers will work with whatever limited resources and bottlenecks they have in order to hit various targets... So if we have anemic CPUs in the consoles, games will be developed with those in mind.

But when games end up ported to the PC, developers are able to relax, as they have orders of magnitude more hardware capability at their disposal, so we do get better physics, particles, AI and so on than the console releases... And that is all thanks to the CPU.

Mr Puggsly said:
DonFerrari said:

Considering most games had bad drops in fps and some even passed long time below 30 while dynamic res and sub fullhd wasn't that problematic on ps4. gpu cpu were balanced, but when comparing to pc the cpu were lower tier to gpu.

We would have to look at games individually to assess why there were frame drops below 30 fps, but it was often bottleneck on the GPU. Sometimes it might just be poor optimization especially if its a relatively linear game, yet seemingly more complex games can hit 60 fps.

Its worth noting Just Cause 3 really struggled with console CPUs, while Just Cause 4 was a huge improvement. Look at Mass Effect 1 or Oblivion on 7th gen, that gave me the impression they were already fully utilizing the hardware. Then better optimized games came not long after that.

There are also games that perform better on mid-gen upgrades because much of the bottleneck was primarily on the GPU.

Just Cause 4 took advantage of a lot of more modern CPU instructions and became more parallel in its CPU workloads, hence why it was a step up over Just Cause 3. There is still significant room for improvement, though.

Oblivion, though, wasn't fully utilizing the Xbox 360/PlayStation 3 hardware... The bulk of the work was done on one CPU core... I spent a lot of time working with Oblivion to get it running on Original Xbox-equivalent PC hardware (Pentium 3 + GeForce 3). It looked like a dog's breakfast, as I had to strip or reduce the shader effects and even went to the extent of reducing polygons in models...

It wasn't until Bethesda substantially reworked large swathes of the NetImmerse-turned-Gamebryo-turned-Creation Engine for Skyrim that we saw better CPU utilization across all platforms, hence the relatively large leap in visuals and general simulation quality... But even on that front there is still substantial room for improvement; it doesn't scale well across more than a few threads... And the 7th gen had 6/7 threads to optimize for.
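The "doesn't scale past a few threads" point is basically Amdahl's law. Here's a minimal sketch; the 60% parallel fraction is an illustrative assumption, not a measured figure for any Bethesda engine:

```python
# Amdahl's-law sketch of why an engine whose frame work is only
# partly parallel stops scaling past a few threads. The 0.6
# parallel fraction below is an assumption for illustration only.

def speedup(parallel_fraction, threads):
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / threads)

for threads in (1, 2, 4, 6):
    print(threads, round(speedup(0.6, threads), 2))
# 1 thread  -> 1.0x
# 2 threads -> 1.43x
# 4 threads -> 1.82x
# 6 threads -> 2.0x
```

Even with six or seven hardware threads available, a mostly-serial main loop caps the whole frame at around 2x, which is why reworking the engine itself mattered more than the thread count.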

Last edited by Pemalite - on 15 July 2019

--::{PC Gaming Master Race}::--