czecherychestnut said:
I'll only make two points. 1) Regarding GDDR5 vs DDR3 latency: yes, GDDR5 has higher latency in clock cycles than DDR3, but what this analysis fails to take into account is the difference in clock speed between the two. The PS4 uses GDDR5 running at 5.5 GT/s, which gives it a command clock of 1.375 GHz (GDDR5 does 4 transfers per cycle). DDR3-1600 runs at 800 MHz (DDR3 does 2 transfers per cycle). If latency is measured in ns (which matters more than clock cycles, as it's the actual time the CPU will be waiting for data), then GDDR5 latency can be about 70% higher when measured in clock cycles and still be faster than DDR3-1600: a cycle is 1.25 ns at 800 MHz but only ~0.73 ns at 1.375 GHz, a ratio of roughly 1.72x. I'm currently on my tablet, which sucks, but later I'll link a Samsung GDDR5 datasheet that shows the latency is less than 70% higher in clock cycles.
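
To make that concrete, here's a quick back-of-envelope sketch in Python. The CAS latency figures are assumed for illustration only (the real numbers are whatever the datasheet says); the clock math is straight from the figures above:

# Back-of-envelope latency comparison. CAS figures here are assumed
# for illustration; real parts vary (check a GDDR5 datasheet).

def latency_ns(cas_cycles, clock_ghz):
    # Convert a latency given in clock cycles to nanoseconds.
    return cas_cycles / clock_ghz

ddr3_clock_ghz = 0.8      # DDR3-1600: 800 MHz command clock, 2 transfers/cycle
gddr5_clock_ghz = 1.375   # GDDR5 @ 5.5 GT/s: 1.375 GHz, 4 transfers/cycle

ddr3_cas = 11             # assumed CL11 for DDR3-1600
gddr5_cas = 15            # assumed; the datasheet value is what matters

print("DDR3-1600 CL%d: %.2f ns" % (ddr3_cas, latency_ns(ddr3_cas, ddr3_clock_ghz)))
print("GDDR5 CL%d:     %.2f ns" % (gddr5_cas, latency_ns(gddr5_cas, gddr5_clock_ghz)))

# Break-even: how many more cycles GDDR5 can spend and still tie DDR3-1600.
ratio = gddr5_clock_ghz / ddr3_clock_ghz
print("Break-even: %.2fx, i.e. ~%d%% more cycles" % (ratio, round((ratio - 1) * 100)))

With those assumed numbers GDDR5 lands around 10.9 ns against 13.75 ns for DDR3-1600, and the break-even ratio works out to ~72% more cycles, which is where the 70% figure comes from.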

The other point is that few people take into account the size of their PS4-rivalling PCs when they claim they can build a PC for cheaper. The PS4 is tiny, much smaller than micro-ATX. Are you really happy to have a large, 400-watt-sucking PC under your TV? People pay extra for smaller PCs, and if you don't take that into account then you haven't done a valid comparison.

Besides, though I have both a gaming PC and a console, I play the console more: I don't have to stuff around with driver updates every time a new game comes out, and there are the console exclusives.

400 watts is kind of an exaggeration. Even an HD 7970-equipped system won't reach 400 W at full load. There are only a handful of cards on the market that will make a PC consume more than 400 watts, and they all cost $500+. But regardless, if you want processing power, you have to pay for it. The PS4 is a nice design, although I would have liked to see Pitcairn binned more aggressively: 2 CUs disabled and only 800 MHz is too conservative for me.
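
For anyone who wants to sanity-check that 400 W claim, here's a rough power-budget sketch. Every figure in it is an assumption (typical gaming-load draw, not worst-case TDP), so treat it as ballpark only:

# Rough power budget. All figures are assumed typical gaming-load
# numbers, not rated TDPs; measured systems will differ.
typical_gaming_w = {
    "HD 7970 (typical gaming draw, assumed)": 210,
    "quad-core CPU under gaming load (assumed)": 70,
    "board/RAM/storage/fans (assumed)": 40,
}

dc_load = sum(typical_gaming_w.values())
psu_efficiency = 0.85  # assumed 80 Plus Bronze-class supply

print("Estimated DC load:   %d W" % dc_load)                       # 320 W
print("Estimated wall draw: %.0f W" % (dc_load / psu_efficiency))  # ~376 W

So under those assumptions a single-GPU HD 7970 box sits in the mid-300s at the wall while gaming; you generally need dual-GPU setups or heavy overclocking to push past 400 W.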

On the memory front you are spot on. Using DDR3 for rendering is a joke.