Pemalite said:
Mr Puggsly said:
I clarified any confusion during our Fable 3 discussion, but you kept pressing. Give it a rest, I generally expect better from the mods. It wasn't even a discussion to win.
|
Now you are changing the tack of the discussion to step away from your prior fallacies...
Mr Puggsly said:
Half Life 2 on Xbox didn't run fine, but it certainly ran. Performance was the worst aspect of that port and makes it a difficult version to revisit.
|
It ran fine, not amazingly; I have the game on the original Xbox, so I can see firsthand. It ran better than a lot of other titles of the era, like Morrowind, that's for sure.
Back then it was okay to push titles out with framerates under 30... Heck, it wasn't exactly uncommon during the 7th generation either.
The Xbox 360 port fixed up a lot of the deficiencies, as expected.
Mr Puggsly said:
You missed the point in regard to RAM. While 5GB certainly is not a ton of RAM for games, the RAM requirements for PC gaming stayed relatively stagnant. Hence, modern games don't seem to have hit a wall due to struggling with RAM limitations like previous gens did. Even the Switch is doing impressive games like Witcher 3 with even less RAM, albeit struggling with textures.
|
RAM requirements have remained stagnant because hardware on that front has generally remained stagnant.
The sweet spot for RAM on PC is currently around 16GB because of price/performance issues earlier in the generation, which meant people weren't upgrading their RAM as prolifically as they did before... That will likely change as we transition to DDR5 memory over the next year or two.
That doesn't mean games can't use more RAM, it just doesn't generally happen.
GPU RAM requirements, on the other hand, have increased massively since the 8th-gen consoles launched; we went from 1-2GB GPUs to 8-16GB... For some reason a lot of people ignore the jumps on that front.
Mr Puggsly said:
I don't feel the disparity in specs between the base X1 and X1X is significant enough. Therefore, anything that could be developed to take full advantage of the X1X at 1080p/30 fps should be able to scale back relatively easily for a base X1 if they mostly scale back GPU-heavy effects. It seems like almost all 8th gen games can work on Switch because the specs disparity just isn't big enough, even if there are minor compromises. The example you gave for Wolfenstein 2 on Switch is mostly aesthetic and was likely done to boost performance. Anyhow, we all know the X1X's primary focus was making X1 games look and play better, and at the very least it certainly does that. Sometimes the disparity is so big it seems like the games were developed for X1X specs; Soul Calibur VI, for example, looks bad and loads horribly on base hardware.
|
We don't know whether the gap in specifications between the Xbox One and Xbox One X is significant enough; how you feel is ultimately irrelevant to this point...
We have absolutely no information on the hardware demands that Halo: Infinite will bring or what cutbacks Microsoft had to make in order to make the game viable on inferior platforms.
We do have some information on some of the rendering effects being employed thanks to the trailers showcasing them, but that's about it.
I am not saying a port is impossible; clearly it is possible, as it's actually happening. - My point of contention is: what did they have to give up to get there? Is it just resolution and framerate? Unlikely. Will they have to go back and re-engineer parts of the game to reduce the rendering load? Cut back effects?
Mr Puggsly said:
Open world games were pretty common last gen as well. The big difference this gen is more online open world stuff. I imagine RAM was helpful for that but they still existed on last gen.
|
Higher player counts, thanks to the CPU jump, were one of the big enablers this gen; Battlefield, for instance, saw an increase in multiplayer counts per map.
Plus Physics.
Open world is more common because less engineering needs to be dumped into making it possible... Last gen, developers were getting clever with various approaches, like imposters, and compressed texture and mesh streaming from disk, which the CPU would unpack... and so on, just to fit everything into less than 512MB of memory.
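That streaming trick can be reduced to a toy sketch (names and numbers invented for illustration, nothing like a real engine's asset pipeline): assets sit compressed "on disk", the CPU decompresses them on demand, and least-recently-used assets get evicted to stay under a hard memory budget.

```python
import zlib
from collections import OrderedDict

# Hypothetical sketch: compressed assets live "on disk"; the CPU
# unpacks them on demand and evicts least-recently-used entries to
# stay under a hard budget, the way 7th-gen engines squeezed worlds
# into <512MB of RAM.
class AssetStreamer:
    def __init__(self, budget_bytes):
        self.budget = budget_bytes
        self.disk = {}                  # name -> compressed bytes
        self.resident = OrderedDict()   # name -> decompressed bytes, LRU order

    def store(self, name, data):
        self.disk[name] = zlib.compress(data)

    def fetch(self, name):
        if name in self.resident:
            self.resident.move_to_end(name)      # mark as recently used
            return self.resident[name]
        data = zlib.decompress(self.disk[name])  # CPU unpacks from "disk"
        # Evict least-recently-used assets until the new one fits.
        while self.resident and self._used() + len(data) > self.budget:
            self.resident.popitem(last=False)
        self.resident[name] = data
        return data

    def _used(self):
        return sum(len(d) for d in self.resident.values())

# Toy usage: two 1KB "meshes" under a 1.5KB budget forces eviction.
streamer = AssetStreamer(budget_bytes=1500)
streamer.store("mesh_a", b"a" * 1000)
streamer.store("mesh_b", b"b" * 1000)
streamer.fetch("mesh_a")
streamer.fetch("mesh_b")    # over budget, so mesh_a is evicted
assert list(streamer.resident) == ["mesh_b"]
```

The CPU cost the post mentions is exactly that `decompress` call happening every time an evicted asset is needed again.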
Mr Puggsly said:
Well there isn't much of a debate to have on Ashes of the Singularity; maybe its complex AI is incredibly demanding, maybe it's an optimization issue. I do see video of an FX-6300 running the game relatively poorly, but it runs. I mention that because that CPU in practice seems to give similar performance to consoles.
|
The FX-6300 is faster than Jaguar anyway... And all AMD FX processors were generally crap. - It's not an optimization issue... You really need to play the game and see the details they added to the complex A.I. - It's probably the best game to showcase what you can do on the A.I. front when you have oodles of CPU time to play with.
Mr Puggsly said:
Oh lord... let me elaborate. I feel the Jaguar CPUs in the current consoles have shown great potential. For example, I'm playing Gears 4 (Gears 5 soon), Forza Horizon 4 and other titles that stick relatively close to or stay at 60 fps. There are also games that did a good job hitting 60 fps on base hardware like Forza, GT, MGS, Halo, BF, CoD (some better than others), etc. In my mind, that's pretty good for CPUs people call trash. Either way, I can't deny there is a CPU bottleneck in many games that makes hitting 60 fps impossible. However, the GPU was also limited for high quality visuals/effects, high resolutions (900p-1080p) and 60 fps at the same time.
|
Those titles achieve 60fps because they are fairly light on CPU-heavy effects like physics. The games on PC are simply doing more... And as a result tend to use more CPU cycles.
But just because a game isn't hitting 60fps doesn't mean it's a CPU limitation; if you are GPU limited, you won't hit 60fps either.
Now, because I own Forza, Gears, Halo, Battlefield and Call of Duty... many of which I played on base hardware and on my Xbox One X... I can say those games weren't doing much in the way of heavy CPU utilization... So Jaguar is of course not going to be a hindrance.
But what if we had 10x more CPU power at our disposal? Those same games would still be 60fps, but we would have far better A.I., more physics, better scripting, more characters on screen, better positional audio... We wouldn't have needed Microsoft's original push to leverage the cloud for destruction in Crackdown 3, for example.
The CPUs are certainly trash; Jaguar was garbage even on its original release on the PC, and even its predecessor, Brazos, was pretty average at release... Sprinkle half a decade on top of that and it hasn't done it any favors.
Scarlett is set to change all that in a big way of course... And for once I am actually excited that console manufacturers are taking CPU performance seriously for the first time in generations.
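The argument above boils down to a back-of-the-envelope model (the millisecond figures here are made up purely for illustration): with the CPU and GPU working in parallel, frame time is roughly the longer of the two sides, so a CPU-light game can hold 60fps with headroom to spare, and 10x the CPU power buys richer simulation rather than a higher framerate.

```python
# Toy model (illustrative numbers only): in a pipelined renderer the
# frame time is dominated by whichever side, CPU or GPU, is slower.
def frame_ms(cpu_ms, gpu_ms):
    return max(cpu_ms, gpu_ms)

def fps(cpu_ms, gpu_ms):
    return 1000.0 / frame_ms(cpu_ms, gpu_ms)

# A "CPU-light" 60fps game: the CPU has headroom, the GPU sets the pace.
assert fps(cpu_ms=8.0, gpu_ms=16.0) == 62.5

# Spending ~2x more CPU time on AI/physics still fits inside the same
# GPU-bound frame: identical framerate, richer simulation.
assert fps(cpu_ms=15.0, gpu_ms=16.0) == 62.5

# But a GPU-limited game misses 60fps no matter how fast the CPU is.
assert fps(cpu_ms=2.0, gpu_ms=33.3) < 60.0
```

Which is exactly why hitting 60fps is not, by itself, evidence that Jaguar wasn't holding anything back.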
Mr Puggsly said:
People often say it was the CPU that was too limited in the 8th gen, but GPU was also a culprit. Because even when CPU bottleneck wasn't a primary issue for 60 fps, it still takes a lot of GPU power to achieve 60 fps with high visual fidelity. Limited GPU power is why dynamic resolution is common in 60 fps games.
|
For the base Xbox One, GPU and memory bandwidth were limiters from launch day; many games ended up at 720p because of it.
Dynamic resolution is there to make full use of the limited GPU resources, but being GPU limited doesn't mean you aren't also CPU limited; it's disingenuous to assert that only one can happen at a time, or that one doesn't exist because you have 1080p/60fps.
Games generally just feel like prettier versions of 7th-gen titles, and one of the reasons for that is the CPU side of the equation: things didn't take a massive leap, it was a more conservative jump on the performance scale... More so if you were coming from the PlayStation 3.
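The dynamic-resolution idea amounts to a simple feedback loop; a minimal sketch, with the thresholds, step size, and GPU cost model all invented for illustration: watch GPU frame time against the 60fps budget, nudge the render scale down when over budget, and recover it when there's headroom.

```python
# Hypothetical dynamic resolution controller: drop the render scale
# when the GPU is over the 60fps budget, recover it when there is
# comfortable headroom. All constants are made up for the sketch.
BUDGET_MS = 1000.0 / 60.0   # ~16.7ms per frame for 60fps

def adjust_scale(scale, gpu_ms, step=0.05, lo=0.5, hi=1.0):
    if gpu_ms > BUDGET_MS:            # over budget: shed GPU load
        scale -= step
    elif gpu_ms < BUDGET_MS * 0.85:   # plenty of headroom: sharpen up
        scale += step
    return min(hi, max(lo, scale))

# Simulated heavy scene: GPU time scales with pixel count (scale^2),
# and this scene costs 22ms at native resolution, i.e. too slow for 60fps.
scale = 1.0
for _ in range(20):
    gpu_ms = 22.0 * scale * scale
    scale = adjust_scale(scale, gpu_ms)

# The controller settles near the scale where 22 * scale^2 fits the budget.
assert 0.8 <= scale <= 0.9
```

Note that nothing in this loop looks at the CPU at all, which is the point: a game can sit at a rock-solid dynamic 60fps and still be CPU limited in what its simulation is allowed to do.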
Mr Puggsly said:
In the next gen however, we seem to agree bottleneck on CPU shouldn't be an issue for 60 fps. Also, resolution at 1440p-4K will become even more common. Essentially the compromises needed for 60 fps become less work. For example, the X1X offers more 60 fps content because it has a little extra CPU power and they can drop the resolution (and effects) to reduce GPU bottleneck. Hence, less work to hit 60 fps means more games should (WILL) offer it.
|
CPU bottlenecks can exist even if you are at 60fps.
Console developers generally work within the constraints of what they have on a console... So developers make significant cutbacks in various areas... On PC those limitations tend to be removed and we can see what developers originally envisioned, in some respects... And the difference a CPU makes can be rather large in a few key areas, some of which I alluded to earlier in my post.
The Xbox One X can achieve 60fps more often because it has more bandwidth, more memory, more GPU performance... And yes, a slightly faster CPU... But you can't give all the thanks to the CPU, it's still a limitation.
curl-6 said:
Personally, I was happy with a lot of the experiences that the Espresso CPU in the Wii U managed to deliver, but I still won't dispute that it was still, objectively speaking, a terribly weak CPU for a home console releasing in 2012.
|
Espresso was a more capable CPU than the Xbox 360's, though less capable than the PlayStation 3's, mostly thanks to the fact it wasn't an in-order design... But it was held back by clock rates.
But overall, it fit nicely in the 7th gen in terms of CPU capability. Nintendo typically emphasizes 60fps in its titles anyway, so they work with what they have really well.
Mr Puggsly said:
The Wii U in general was not impressive hardware for 2012. I bought a Wii U for exclusives and they weren't technically much better than 360 or PS3.
I'm not defending the Jaguar CPUs because its running games I enjoy. I'm just pointing out in practice Jaguar CPUs were used for impressive content and achieved 60 fps more often than given credit for.
|
Indeed. The Wii U's hardware wasn't super impressive. - Many games engineered with the Wii U's limitations in mind did shine on the hardware, though... mostly Nintendo exclusives. But overall, if the console had had more memory bandwidth, it could almost have been an Xbox 360 Pro from a hardware perspective.
No, Fable 3 on PC is trash because it has the GFW crap, and that was my original point. I stand by that; fairly certain that's why it was delisted as well. You feel MS doesn't want to make keys; I feel they don't want to sell products with GFW. Not really a winner and loser debate.
Abysmal frame rates were more tolerated at the time, but that doesn't mean it ran fine. DF looked at that version in their HL2 retrospective; it hung in the teens and nosedived during heavy physics. I played it and enjoyed it at the time, but it was rough, and that doesn't mean it ran fine. We just had lower performance expectations for technical marvels, I guess.
In previous gens, RAM on PC was generally significantly higher than on consoles. I remember needing 256MB of RAM to play a game that's virtually the same on Xbox. PC has generally been less efficient and has to run an OS like Windows.
I looked at RAM usage of AAA releases in 2013-14; 4GB was fairly common. Meanwhile, many modern games can function fine with 8GB. And again, PC is just less efficient.
Again, the 7th gen had a lot of open world games. Red Faction Guerrilla was even a mix of great physics and an open world at the same time. BF lowering the player count was likely for performance reasons.
In practice, the console CPUs have outperformed the FX-6300. Again, just an example of superior console optimization. I'm pointing out that CPU can run the game, and we can only speculate what the console CPUs could do with good optimization.
Again, I don't feel 10x CPU necessarily changes how a game would be designed in most cases. They may splurge on CPU-heavy effects that are easy to add, but I feel something like AI is generally more design related than spec limited.
I suspect Crackdown 3's destruction ambitions were scaled back simply because it was difficult to actually create. At some point they just threw something together.
I feel the Jaguar CPUs were fine for what developers were looking to do this gen. Frankly, they had more CPU power this gen and didn't do much I consider ambitious compared to the previous one.
Maybe CPU is being taken seriously because game design in general has been kind of stagnant; there's more demand for 60 fps, and it will help loading, split screen, practical stuff. Generally speaking, I don't expect more CPU to make many fresh experiences.
On a side note, I think it's time MS played its Windows card and allowed Xbox to run PC games, maybe in a curated fashion, kind of like backwards compatibility. A great CPU would help with that. It would also mean the Xbox library could be easily expanded in cases where developers don't want to make an Xbox port.
|