AMD's next-gen CPU architecture.
How is it different than what they use now? Also it would be more badass if they named it "Killdozer"
Sharky54 said: How is it different than what they use now? Also it would be more badass if they named it "Killdozer"
Their first completely new chip since 2003. It's an unknown quantity; we can't speculate based on current performance or architecture the way we can with everything that's just an upgrade of an older design.
What we do know is that:
- 8 cores for desktop, 16 cores for server, on 32nm. Backwards-compatible with current AM3 boards. The top bin will not exceed 140W for 16 cores, and the most efficient bin will not exceed 35W for 8 cores.
- It will have something a bit like Intel's Hyperthreading, where each "module" contains two integer cores but shares an FPU, cache and a decode stage. In practice, you get almost as much performance as two cores (about +80% over one core) for only about 5% more die area. That's a much bigger gain than Intel's HT, which also costs about 5% more space but typically adds only ~20% over one core, and under certain workloads actually decreases performance.
- As a conservative estimate (this is an official figure), the 16-core version will be 60% faster in highly threaded workloads than the upcoming 12-core Magny-Cours product (the quick arithmetic below shows what that implies per core).
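To put that 60% figure in perspective, here's my own back-of-the-envelope arithmetic, assuming roughly similar clock speeds. This is just my illustration, not anything AMD has published:

```python
# My own arithmetic based on the "60% faster" estimate above,
# assuming roughly similar clock speeds. Not an AMD-published comparison.
magny_cours_cores = 12
bulldozer_cores = 16
threaded_speedup = 1.60   # 16-core Bulldozer vs 12-core Magny-Cours

per_core_ratio = threaded_speedup * magny_cours_cores / bulldozer_cores
print(f"Implied throughput per core vs Magny-Cours: {per_core_ratio:.2f}x")  # ~1.20x
```

So even by the conservative estimate, each Bulldozer core would be pushing roughly 20% more throughput than a Magny-Cours core.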
JWS said: I wouldn't bother upgrading the video card until the next-gen consoles are out. PC games are becoming console ports with console graphics. The card you have is more powerful than the PS3's or the 360's graphics hardware, so it will play almost all PC games on high settings (yes, I know there are exceptions).
Honestly this is my take on it as well. There is no point spending $200-$300+ on a video card when a much cheaper video card will get you basically the same results. I would keep your video card (which frankly will play most games on high settings anyways) and wait at least a year to upgrade.
Wait, I'm confused lol. It's going to be 8 cores, but smaller cores? Explain it in simpler terms. I am so bad with all this computer mumbo jumbo. What I get is 8 cores, something like hyper-threading, so basically 16 thread-like things for the 8-core and 32 for the 16? How is this different than my Intel quad-core with 8 threads?
Sharky54 said: Wait, I'm confused lol. It's going to be 8 cores, but smaller cores? Explain it in simpler terms. I am so bad with all this computer mumbo jumbo. What I get is 8 cores, something like hyper-threading, so basically 16 thread-like things for the 8-core and 32 for the 16? How is this different than my Intel quad-core with 8 threads?
I wasn't sure if you were interested, so I didn't go into a lot of detail. I'll do a longer explanation. Some of the terms I use follow AMD's definitions rather than Intel's.
Definition of core: integer code execution unit. Basic workhorse for CPU tasks.
Definition of FPU: Floating point execution unit. Less commonly used by desktop CPU tasks. More like a Cell SPU in terms of what it can do.
Intel's hyperthreading is one core with the ability to run two threads to fill up gaps in the execution pipeline. Best case (encoding) this can improve performance; worst case (some games) it can actually perform slower than running only one thread, because the two threads choke the single integer core. Performance is 95% to 120% of a single non-HT core.
AMD's version (not officially called CMT, but I'll use that name) is two integer cores with some parts shared (like the bit that reads and decodes instructions, and the L2 cache) and a single FPU between them. This is a "module". Since the shared parts aren't bottlenecked, this only incurs a slight performance penalty compared to two completely separate cores. The FPU is shared because you need fewer of them for normal CPU tasks. But there are still two full integer cores there, executing one thread each. Performance is consistently 180% of a single core.
Both methods add about 5% to the die size of a single core without HT/CMT.
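If you turn those figures into throughput per unit of die area (again, my own rough math using the ranges in this post, not official numbers), the difference is stark:

```python
# Throughput per unit of die area, using the rough figures from this post
# (95%..120% of a single core for HT, ~180% for a CMT module, ~5% extra area for both).
# These are the poster's numbers, not official ones.
ht_perf_range = (0.95, 1.20)
cmt_perf_range = (1.80, 1.80)
extra_area = 1.05  # die area relative to a plain single core (1.00)

for name, (lo, hi) in (("HT ", ht_perf_range), ("CMT", cmt_perf_range)):
    print(f"{name}: {lo / extra_area:.2f}x .. {hi / extra_area:.2f}x throughput per unit area")
# HT : 0.90x .. 1.14x
# CMT: 1.71x .. 1.71x
```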
8 core Bulldozer die = 8 integer cores, 4 modules, 4 FPUs
16 core Bulldozer die = 16 integer cores, 8 modules, 8 FPUs
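Or, if it helps, here's the module layout as a trivial bit of Python (just my illustration of the naming, not anything from AMD):

```python
# Minimal sketch of the module layout described above (my own illustration):
# one module = 2 integer cores + 1 shared FPU + shared front end and L2.
def bulldozer_layout(modules):
    return {
        "modules": modules,
        "integer_cores": modules * 2,
        "fpus": modules,
        "threads": modules * 2,   # one thread per integer core, no SMT on top
    }

print(bulldozer_layout(4))   # desktop part:  8 cores, 4 FPUs, 8 threads
print(bulldozer_layout(8))   # server part:  16 cores, 8 FPUs, 16 threads
```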
Single-thread performance for Bulldozer will be much higher than current AMD chips. And as I said, a 16-core, 16-thread Bulldozer will perform 60% better than a 12-core, 12-thread current Opteron. From current server performance data, clock for clock, AMD's 6-core, 6-thread Opteron performs similarly to Intel's 4-core, 8-thread Xeon, because physical cores (AMD's current ones and the future CMT ones) are better than virtual cores (HT). So we can expect Bulldozer's performance with 8 cores to be MUCH greater than your 8-thread quad-core.
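Here's a very rough way to relate that to your quad-core (my own numbers, assuming similar clocks; real results will vary a lot by workload):

```python
# Very rough sketch of the comparison above (my own illustration, assuming
# similar clock speeds; real performance depends heavily on the workload).
# Per the server data: 6 AMD physical cores ~= 4 Intel cores with HT (8 threads).
amd_core_in_ht_threads = 8 / 6          # one physical AMD core ~ 1.33 HT threads

bulldozer_8core_equiv = 8 * amd_core_in_ht_threads
print(f"8 Bulldozer cores ~ {bulldozer_8core_equiv:.1f} HT threads of throughput")  # ~10.7
# ...and that's before counting Bulldozer's improved per-core performance.
```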
AMD believe their method is better than Intel's in terms of performance and consistency of performance. We will see when the chip comes out whether they are right.
Ahh, that sounds like some intense stuff. What do you think of it? Think it will be as good as they say?
Sharky54 said: Ahh, that sounds like some intense stuff. What do you think of it? Think it will be as good as they say?
It is ambitious, more so than the Athlon 64, more so than Core 2 or Nehalem were. It will either be a huge success or a complete failure, and it is impossible to tell which beforehand.
If it fails, they will go bankrupt. They are on the edge of profitability, and losing any more marketshare in any segment would put them permanently underwater. So, I hope it works.
Intel's Sandy Bridge looks conservative and cost-saving. They're bringing the GPU on die and the first version will only have 4 cores according to rumours; i.e. it is a low-power, mobile-focused product. I'm sure it will perform better than Nehalem but not like Nehalem was to Core 2. That's an opportunity for AMD.
It looks like around June I will come into an extra 4-6 grand, depending on how some stuff goes. What's the best of the best as far as cards? Those new cards people talked about. How much will they be? What was it called again?
Right now, the best is the ATi 5970, but by that time, Nvidia should have new cards out, and ATi will have a refresh of their lineup as well. So, you'll really just have to wait and see for now.