Eddie_Raja said:
Pemalite said:


No. It doesn't.
You have this thing called "prediction", where you predict the data you are going to require ahead of time. It's a technique that has been used to varying degrees for decades, and it's extremely effective on fixed-hardware devices such as consoles, for obvious reasons.
Conversely, both devices will be streaming a significant amount of data not from RAM, but from mechanical and optical disc storage, which is stupidly slow. We saw that put to great use last generation.
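
To put some code behind the "prediction" point, here is a rough sketch of the idea - a made-up example of mine with a hypothetical load_from_disc helper, not any real engine's code - where slow disc reads are kicked off well before the data is actually needed, so the frame that uses it never stalls on storage:

// Minimal prefetch sketch: issue slow disc reads ahead of time so the
// frame that needs an asset never waits on storage. Illustrative only.
#include <future>
#include <map>
#include <string>
#include <vector>
#include <cstdio>

using AssetData = std::vector<char>;

// Stand-in for a slow optical/HDD read (hypothetical helper).
AssetData load_from_disc(const std::string& path) {
    std::printf("loading %s from disc...\n", path.c_str());
    return AssetData(1024); // pretend we read 1 KiB
}

class AssetStreamer {
    std::map<std::string, std::future<AssetData>> in_flight_;
public:
    // Called when the game *predicts* an asset will be needed soon
    // (e.g. the player is approaching the next area).
    void prefetch(const std::string& path) {
        if (in_flight_.count(path)) return; // already requested
        in_flight_[path] = std::async(std::launch::async, load_from_disc, path);
    }

    // Called the frame the asset is actually required. If the prediction
    // worked, the future is already ready and this returns immediately.
    AssetData acquire(const std::string& path) {
        auto it = in_flight_.find(path);
        if (it == in_flight_.end()) {      // misprediction: pay the full cost now
            return load_from_disc(path);
        }
        AssetData data = it->second.get();
        in_flight_.erase(it);
        return data;
    }
};

int main() {
    AssetStreamer streamer;
    streamer.prefetch("level2/textures.pak"); // predicted well in advance
    // ... render frames of level 1 while the read completes in the background ...
    AssetData tex = streamer.acquire("level2/textures.pak"); // ideally no stall
    std::printf("got %zu bytes\n", tex.size());
}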

The real limitation to the Xbox One is not Bandwidth, it's actually the reduced GPU resources used to draw all the pretty things on your screen.
Look at other GPU designs in the PC space as an example: AMD's Fury has an abundance of bandwidth, more than any other graphics card to ever exist, more than several GeForce cards combined... yet it sees minimal benefit from it. Why? Because there isn't enough hardware to make use of such a wide and fast highway.

Haha, that extra bandwidth IS going to be used at 4K and will make a massive difference in a year or two once it is actually utilized. Just look at the 7970 vs the 680 as an example. At first everyone acted like the extra bandwidth in the 7970 was wasted, but once games actually started needing to feed that much data, the 680 fell behind badly - so badly that a 7970 pretty much matches a 780 now.

Not to mention that the ESRAM isn't even much faster than the GDDR5 the PS4 is using. Usually this kind of special RAM is 2-4x faster (or more), but it's not even 50% faster. In fact it is slower than the 10MB of eDRAM in the Xbox 360.
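
For anyone who wants the rough numbers: these are the theoretical peak figures quoted at the time, so treat them as ballpark, and the little calc below is just my own illustration.

#include <cstdio>

int main() {
    // Commonly quoted theoretical peak figures (ballpark, not measured):
    const double ps4_gddr5  = 176.0; // GB/s, PS4's unified GDDR5 pool
    const double xb1_esram  = 204.0; // GB/s, Xbox One ESRAM peak (simultaneous read+write)
    const double x360_edram = 256.0; // GB/s, Xbox 360 eDRAM <-> ROP logic

    std::printf("ESRAM over GDDR5:   +%.0f%%\n",
                (xb1_esram / ps4_gddr5 - 1.0) * 100.0);   // roughly +16%, nowhere near 2-4x
    std::printf("ESRAM vs 360 eDRAM:  %.0f%% of the old part's figure\n",
                (xb1_esram / x360_edram) * 100.0);        // roughly 80%
}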


I already game at resolutions higher than 4K (7680x1440 Eyefinity, to be exact).

And I have four Radeon R9 290s in Crossfire. Before that, triple Radeon 7970s. Before that... Radeon 6950s unlocked into 6970s.
And I will likely buy into AMD's Fury successor on the 1x node.
Guess where I see the largest performance increase? It's not from overclocking my GPU RAM, it's from the core clocks. I am compute- and ROP-bound before I am bandwidth-bound.

If you are thinking the PlayStation 4 will be playing Crysis 4 at 4K, with everything dialled up to 11... you are dreaming. No console has the power to do that.

Also...

Check out the 4K benchmarks of nVidia's Titan X (336GB/s of bandwidth) vs AMD's Fury (512GB/s of bandwidth).
Despite the Titan X having 176GB/s less bandwidth than the Fury... it still wins the vast majority of benchmarks.
But don't take my word for it...
http://anandtech.com/bench/product/1513?vs=1447

Again. You reach a point where more bandwidth does nothing, because the hardware cannot make use of it.
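
If you want a back-of-the-envelope way to see it, divide each card's bandwidth by its compute throughput. This uses the bandwidth figures above plus the commonly quoted peak compute numbers at roughly base clocks - my own rough calc, approximate figures, not something pulled from that benchmark page:

#include <cstdio>

int main() {
    struct Gpu { const char* name; double gbps; double tflops; };
    // Peak bandwidth (GB/s) and approximate single-precision compute (TFLOPS).
    const Gpu cards[] = {
        {"AMD Fury",       512.0, 7.2}, // ~3584 shaders @ ~1.0 GHz
        {"nVidia Titan X", 336.0, 6.1}, // ~3072 cores @ ~1.0 GHz base
    };
    for (const Gpu& g : cards) {
        // How many bytes of memory traffic are available per FLOP of compute.
        const double bytes_per_flop = g.gbps / (g.tflops * 1000.0);
        std::printf("%-16s %.3f bytes per FLOP\n", g.name, bytes_per_flop);
    }
    // Fury has roughly 30% more bandwidth per unit of compute, yet still loses
    // most 4K benchmarks: the shaders and ROPs run out before the memory does.
}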

www.youtube.com/@Pemalite