superchunk said:
IBM Power >>> AMD in terms of CPUs. Basically, any AMD (or Intel) CPU would have to have much higher specs to match an IBM CPU.

I beg to differ. Any modern quad-core AMD CPU would level an IBM-based console CPU into the ground for games. An A10-5800K is about 22% slower than a 1st-generation quad-core i7 870 at 2.93GHz (Lynnfield, the same architecture as the Nehalem i7s). Even the FX-8350 "8-core" is still slower in games than a 1st-generation i7, but only by about 5%.

http://www.computerbase.de/artikel/prozessoren/2012/test-amd-fx-8350-vishera/6/
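
To be clear on what "22% slower" means here, a quick sketch of the arithmetic (the fps numbers below are made up for illustration, not taken from the ComputerBase test):

```python
# How "X% slower" is derived from benchmark scores.
# Both numbers are hypothetical, purely to show the calculation.
i7_870_fps = 100.0   # hypothetical baseline score for the i7 870
a10_fps    = 78.0    # hypothetical A10-5800K score

percent_slower = (1 - a10_fps / i7_870_fps) * 100
print(f"A10-5800K is {percent_slower:.0f}% slower")  # -> 22% slower
```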

For PC gamers this is a big deal, since they might spend $100 more to go from a GTX 670 to a 680 for just 10% more performance. With a slower CPU bottlenecking the system, that would be $100 wasted. That's why Intel CPUs are more popular for gaming: they alleviate as much of the CPU bottleneck as possible for high-end gaming GPUs. Intel CPUs also use less power at both stock and overclocked speeds, which makes them preferable. PC gamers want maximum performance if they can afford it, which puts AMD out of the running unless your budget can't fit an i5/i7. That's why AMD CPUs are not popular on the PC when paired with high-end GPUs. But let's not project from that a notion that AMD CPUs are slow compared to everything else. If Intel is the Bugatti Veyron Super Sport or Ferrari F70, AMD is still a Porsche 911 compared to everyone else making gaming CPUs.
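
The bottleneck logic is easy to model. A minimal sketch, with all fps numbers hypothetical:

```python
# Effective frame rate is capped by whichever component is slower.
# All numbers below are hypothetical, purely to illustrate the point.

def effective_fps(cpu_cap: float, gpu_fps: float) -> float:
    """The game runs at the lower of the CPU's and the GPU's limits."""
    return min(cpu_cap, gpu_fps)

cpu_cap = 70.0   # frames/s a slower CPU can prepare
gtx670  = 75.0   # frames/s a GTX 670 could render
gtx680  = 82.5   # ~10% faster GPU for ~$100 more

print(effective_fps(cpu_cap, gtx670))  # 70.0
print(effective_fps(cpu_cap, gtx680))  # 70.0 -> the extra $100 bought nothing
```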

The Metro 2033 developer estimated that just one of the dual-threaded cores in the i7 870 (Nehalem/Lynnfield, 1st-generation i7) is more powerful than the entire 3-core, 6-thread IBM Xenon CPU in the Xbox 360:

http://www.eurogamer.net/articles/digitalfoundry-tech-interview-metro-2033?page=4

It's more complicated than that: according to the interview, the 360's CPU could exceed a PC CPU on a per-thread, per-clock basis if developers used properly vectorized instructions. But that sounds more theoretical than practical. Very few developers will go out of their way to fully tap the potential of PowerPC/Cell CPUs (and we know very few did). If the CPU is x86-based, optimization and performance come a lot easier right away. That minimizes development costs and lets developers focus on GPU-side optimizations, which are more important imo.
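
For a rough feel of what "properly vectorized" buys you, here's a toy comparison. NumPy's compiled, SIMD-backed kernels stand in for hand-written VMX/AltiVec code; most of the gap below is interpreter overhead rather than SIMD alone, but the direction of the point holds: code written to the vector units can be dramatically faster than naive per-element code.

```python
# Toy illustration of scalar vs. vectorized throughput.
import time
import numpy as np

a = np.random.rand(1_000_000).astype(np.float32)
b = np.random.rand(1_000_000).astype(np.float32)

t0 = time.perf_counter()
scalar = 0.0
for x, y in zip(a, b):            # one multiply-add at a time
    scalar += x * y
t1 = time.perf_counter()

vectorized = float(np.dot(a, b))  # whole array per call, SIMD under the hood
t2 = time.perf_counter()

print(f"scalar:     {t1 - t0:.3f}s")
print(f"vectorized: {t2 - t1:.3f}s")  # typically orders of magnitude faster
```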

I would say a quad-core AMD CPU would easily be as fast as an 8-core IBM PowerPC one in most circumstances. AMD's budget APUs may be 30-40% behind Intel's $225 i5-3570K or $325 i7-3770K, but look at the big picture: those two brands are still relatively close compared to everyone else in the CPU business.

Here is a good chart of what happens if you pair an A10-5800K CPU + HD7950 3GB vs. i7 3770K + HD7950 3GB:

http://techreport.com/review/23662/amd-a10-5800k-and-a8-5600k-trinity-apus-reviewed/16

That chart shows that the A10-5800K, paired with a $280, 1792-shader HD7950, has a 99th-percentile frame rate of about 40 fps. In other words, in 99% of cases the A10-5800K + HD7950 combo renders at 40 fps or better in the games they tested (BF3, Skyrim, and Batman: AC), and the average frame rates are still way higher than what the PS3/360 deliver. All of this is at 1920x1080.
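
If the 99th-percentile metric is unfamiliar, this is roughly how it's computed from raw frame times (the sample data below is fabricated, not TechReport's):

```python
# Computing the 99th-percentile frame rate from per-frame render times.
# The frame times here are fabricated sample data, not TechReport's.
import numpy as np

frame_times_ms = np.random.normal(loc=18.0, scale=4.0, size=5000).clip(min=5.0)

p99_time_ms = np.percentile(frame_times_ms, 99)  # 99% of frames are faster
p99_fps = 1000.0 / p99_time_ms

print(f"99th percentile: {p99_time_ms:.1f} ms -> {p99_fps:.0f} fps floor")
```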

Sure, the performance looks disappointing compared to Intel CPUs, but keep in mind how much those Intel CPUs cost. Is it realistic to expect an i5 or i7 in a console if you're aiming to keep the price reasonable in this economy? Probably not. The next logical trade-off is an AMD CPU, since it offers roughly 70% of the gaming performance for $100-200 less. For a next-gen console I think that's a good compromise, since most people play on HDTVs where rendering above 60 fps is not particularly beneficial. You'd have a far faster gaming console with an A10 CPU + HD7950 than with a Core i5 + HD7770.

Even though the A10-5800K is slower than Intel's CPUs for games, I don't see that as critical in the console environment, where MS/Sony are not really targeting 70-100 fps. Given the popularity of the PS3/360, console gamers don't seem too concerned about a 60 fps average being a requirement in all games. Just going from 30 fps in most console games to 40-50 fps will be huge, and an A10 quad-core will get you there, assuming your GPU is capable. If I were allocating the CPU vs. GPU budget, I would be MUCH more concerned with having a potent GPU that can actually render 30-40 fps in next-gen 2017-2019 titles than with a CPU that can get me above 60 fps in 2013-2015 games. Sony's decision to go with an x86 CPU (even if only quad-core) is the best thing they've done: it dramatically drops the price of the console vs. an Intel CPU, with only a reasonable reduction in performance.
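
In frame-time terms, that jump is bigger than it sounds:

```python
# Frame-time budgets for common console targets.
for fps in (30, 40, 50, 60):
    print(f"{fps} fps -> {1000 / fps:.1f} ms per frame")
# 30 fps -> 33.3 ms, 40 fps -> 25.0 ms,
# 50 fps -> 20.0 ms, 60 fps -> 16.7 ms
```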

Think about it: if one core in a 1st-generation i7 was more powerful than the entire 3-core, 6-thread IBM CPU in the Xbox 360, then even if IBM doubled the instructions per clock and doubled the core count, releasing a 3.2GHz 6-core, 12-thread CPU, it would only match a 1st-generation Core i7. Are we seriously expecting IBM to double the instructions per clock and ship a 6-core, 12-thread 3.2GHz out-of-order CPU for the Xbox 720? I think Sony made the better call on the CPU side in terms of the price/performance compromise. The key missing variable: is there a dedicated GPU alongside that APU, or not?
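
The back-of-the-envelope math behind that, taking the Metro 2033 developer's estimate at face value:

```python
# Rough throughput math, using the Metro 2033 dev's estimate as the premise:
# one i7 core ~= the whole 3-core Xenon, so a quad i7 ~= 4x Xenon.
xenon_units = 1.0                        # Xenon's total throughput as the unit

i7_870 = 4 * xenon_units                 # four cores, each ~1 full Xenon

hypothetical_ibm = xenon_units * 2 * 2   # 2x IPC and 2x cores (3 -> 6)
print(hypothetical_ibm >= i7_870)        # True, but it only just matches
```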