Pemalite said:
AkimboCurly said:

So yes, empirically you're absolutely right: most weren't the full 720p vs. 1080p split. I didn't intend that to be read as the norm, only that in extremis you got a resolution hit of around 50% or more.

The reason I say it has to do with the RAM (and by extension the eSRAM) is the way it was partitioned. I'm not a developer, obviously, but to my limited understanding, unless you could fit your render target into the 32MB buffer of extra-speedy RAM, you were forced to relegate it to the DDR3. This meant that, especially in multiplats, which seldom used the buffer, the Xbox One constructed its frame out of DDR3 (which is supposed to be system RAM) while the PS4 used its GDDR5. The bandwidth difference then becomes serious: roughly 68GB/s of DDR3 against 176GB/s of GDDR5.

The new Series S has a similar tiered memory architecture, and people suspect the slower 2GB will be reserved for the OS. But even the faster pool (8GB of GDDR6) has less bandwidth than the slowest tier of memory in the Series X. So to my mind, unless developers scale their render targets sensibly, both for the GPU but ALSO for the memory bandwidth, the Series S will get the short end of the stick. In practice that means cutting texture resolution AS WELL as internal resolution is basically non-negotiable. Watch Dogs got it right and AC Valhalla got it wrong.
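To put rough numbers on that "does it fit" question, here's a back-of-envelope Python sketch. The G-buffer layout is purely my illustrative assumption (a hypothetical deferred setup with four 32-bit colour targets plus a 32-bit depth buffer, no MSAA, no compression), not any particular engine's:

```python
# Does a deferred-style G-buffer fit in the Xbox One's 32 MB of eSRAM?
# Layout below is a made-up illustrative example, not a real engine's.

ESRAM_BYTES = 32 * 1024 * 1024  # 32 MB

def gbuffer_bytes(width, height, colour_targets=4, bytes_per_target=4,
                  depth_bytes=4):
    """Total bytes for the colour targets plus one depth buffer."""
    return width * height * (colour_targets * bytes_per_target + depth_bytes)

for name, (w, h) in {"720p": (1280, 720), "900p": (1600, 900),
                     "1080p": (1920, 1080)}.items():
    size = gbuffer_bytes(w, h)
    verdict = "fits" if size <= ESRAM_BYTES else "spills to DDR3"
    print(f"{name}: {size / 2**20:5.1f} MB -> {verdict}")
```

With those assumptions, 720p (~17.6MB) and 900p (~27.5MB) fit, while 1080p (~39.6MB) spills to DDR3, which lines up with the sub-1080p multiplats the generation actually produced.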

How the eSRAM was used was basically up to developer choice.
Some developers could even have used it as an additional level of cache for the CPU to bolster CPU performance.

Basically, that 32MB was a massive limiter. It was necessary to leverage it to get the most out of the machine, and developers learned to work around its size by taking a tiled approach over multiple passes to make the absolute most of it.
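A minimal sketch of that tiling idea, reusing the same hypothetical 20-bytes-per-pixel G-buffer from the sketch above: split a target that's too big for eSRAM into horizontal strips that each fit, then render strip by strip over multiple passes:

```python
# Sketch only: plan horizontal strips of a render target so each strip
# fits in the 32 MB eSRAM budget, then count the passes needed.
import math

ESRAM_BYTES = 32 * 1024 * 1024
BYTES_PER_PIXEL = 20  # hypothetical fat G-buffer: 4 colour targets + depth

def plan_tiles(width, height, budget=ESRAM_BYTES, bpp=BYTES_PER_PIXEL):
    bytes_per_row = width * bpp
    rows_per_tile = budget // bytes_per_row
    return rows_per_tile, math.ceil(height / rows_per_tile)

rows, passes = plan_tiles(1920, 1080)
print(f"1080p: strips of {rows} rows, rendered in {passes} passes")
```

Under those assumptions a 1080p G-buffer splits into two strips, i.e. two passes, at the cost of the extra scheduling and redundant work tiling brings.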

DDR3 and GDDR5 each served as both system memory and graphics memory in the 8th-gen consoles. DDR3 definitely has the latency advantage (remember, DRAM latency is a result of clockrate), which meant CPU tasks had an edge on the Xbox One (plus its clockrate advantage), while GDDR5 had the bandwidth advantage, which meant graphics duties were simply superior on the PlayStation 4... Plus the PlayStation 4 just had the GPU compute to make up for the CPU deficiency. It all comes down to the developer and the engine.
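For reference, the bandwidth gap falls straight out of the standard peak-bandwidth formula, using the published bus widths and transfer rates of both consoles:

```python
# Peak theoretical bandwidth = (bus width in bits / 8) * transfers per second.
# Both 8th-gen consoles used a 256-bit bus; the transfer rates are the
# published figures (DDR3-2133 vs GDDR5 at 5500 MT/s effective).

def peak_bandwidth_gbs(bus_bits, mega_transfers):
    return bus_bits / 8 * mega_transfers / 1000

print(f"Xbox One, 256-bit DDR3-2133:     {peak_bandwidth_gbs(256, 2133):.1f} GB/s")
print(f"PS4, 256-bit GDDR5 @ 5500 MT/s: {peak_bandwidth_gbs(256, 5500):.1f} GB/s")
```

That gives roughly 68.3 GB/s vs 176 GB/s, about a 2.6x gap before the eSRAM enters the picture.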

Sadly, the Xbox One was GPU-limited more often than not, but when it wasn't and the CPU was the limitation, it definitely held a slight edge in gaming... A certain Assassin's Creed title comes to mind, with lots of actors on-screen.

Things like alpha effects will be scaled back on the Series S; resolution will be the first cutback, which will save massively on bandwidth/fillrate. Around 256GB/s of bandwidth is a good number for 1080p, and the Series S fits into that ballpark fairly well, especially once you account for delta colour compression, primitive shaders, and the draw-stream binning rasterizer.
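One crude way to sanity-check that ballpark is to work out the bandwidth budget per output pixel. This is a rough heuristic only (fast-pool figures, ignoring CPU/OS traffic and compression), but it shows why 1080p sits comfortably on the Series S:

```python
# Bandwidth budget per output pixel: peak bandwidth / (fps * pixel count).
# Fast-pool figures: Series S 8GB @ 224 GB/s, Series X 10GB @ 560 GB/s.

def bytes_per_pixel_budget(bandwidth_gbs, width, height, fps):
    return bandwidth_gbs * 1e9 / (fps * width * height)

print(f"Series S, 224 GB/s @ 1080p60: {bytes_per_pixel_budget(224, 1920, 1080, 60):.0f} B/pixel")
print(f"Series X, 560 GB/s @ 2160p60: {bytes_per_pixel_budget(560, 3840, 2160, 60):.0f} B/pixel")
```

By that measure the Series S at 1080p60 actually has more bandwidth per output pixel (~1800 bytes) than the Series X at 4K60 (~1125 bytes), provided games genuinely target 1080p rather than 1440p.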

The Series S will possibly always draw the short straw on game optimizations; it's likely not going to be a massive developer priority unless it sells extremely well all generation long.

AkimboCurly said:

Yeah, absolutely. Cut internal resolution to save on the GPU and cut texture resolution to save on bandwidth. Since the CPU won't be a bottleneck, I really hope this philosophy is embraced and that developers aren't under pressure to hit 1440p.
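Rough arithmetic on why the texture cut pays off so well: texture memory, and the bandwidth spent streaming and sampling it, scales with the square of the edge length, and a full mip chain only adds about a third on top of the base level. A quick sketch (4 bytes per texel assumed, i.e. uncompressed RGBA8):

```python
# Texture footprint vs edge length; full mip chain ~ 4/3 of the base level.
# Assumes uncompressed 4-byte texels for simplicity.

def texture_mb(edge, bytes_per_texel=4, with_mips=True):
    base = edge * edge * bytes_per_texel
    return base * (4 / 3 if with_mips else 1) / 2**20

for edge in (4096, 2048, 1024):
    print(f"{edge} x {edge}: {texture_mb(edge):6.1f} MB with full mip chain")
```

Each halving of texture resolution is a 4x saving, so shipping 2K instead of 4K textures on the Series S cuts that footprint from ~85MB to ~21MB per texture.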

Off the top of my head, I remember The Golf Club (the one that was supposed to be PGA Tour 15) was 720p on the One and 1080p on PS4. Metal Gear Solid V: Ground Zeroes was also 720p/1080p. You also had compromises like Tomb Raider: Definitive Edition, which ran at 60fps on PS4 but 30fps on the One: in effect 50% fewer pixels temporally.
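That "temporal pixels" point is just throughput arithmetic, resolution times frame rate:

```python
# Pixel throughput = resolution * frame rate. 1080p30 pushes half the
# pixels per second of 1080p60, and is comparable to 720p60.

def mpix_per_second(width, height, fps):
    return width * height * fps / 1e6

for label, (w, h, fps) in {"1080p60": (1920, 1080, 60),
                           "1080p30": (1920, 1080, 30),
                           "720p60":  (1280, 720, 60)}.items():
    print(f"{label}: {mpix_per_second(w, h, fps):6.1f} Mpix/s")
```

So 1080p30 (~62 Mpix/s) really is half of 1080p60 (~124 Mpix/s), and only a little above 720p60 (~55 Mpix/s).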

Honestly, I would rather see developers aim for 900p-1080p on the Series S and just push for higher fidelity.

Well, MS expects the Series S to sell much better than the Series X. That could actually mean that, in the end, the Series X version usually won't outperform the PS5 version, while the Series S version will be competent enough, because that is where devs/pubs will sell most of their software on Xbox.


