NJ5 said:
Soleron said:
NJ5 said:
What about power consumption and size? Certainly each core won't use 50 times less power in 6 years (especially if you keep increasing its performance by 1.5x each year). If you put 50 cores there, you've got a big, hot, power-sucking beast. Is that realistic?
I don't think so.
|
It's not reasonable that each shader will increase in performance by 1.5x. The most I've ever seen for a GPU is 10-15%. Most of the performance increase is due to the die shrink. What do you mean by "cores"? GPUs don't have cores, and shaders are only a small part (30-40%) of the die area. If you mean X2 chips, well, the scaling is poor: less than 2.5x for 4 chips. What do you think 50 would be?
NJ5, you are correct. Most power reduction comes from die shrinks, and you're lucky if you get much more than a 30% reduction from a shrink; the 65nm 9800GTX used 20% less power than the 8800Ultra while being similar in design. Six years means three die shrinks, and compounding 30% per shrink only gets you to roughly a third of the power, so even 1/5 is optimistic. 1/50 is ridiculous, especially if you also want to increase performance.
|
The usage of the word cores is a bit weird, so I assumed he meant multiplying the number of shader units by 50.
|
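For what it's worth, compounding the per-shrink figures quoted above bears Soleron out. A quick back-of-the-envelope sketch (the 20% and 30% per-shrink reductions are the numbers from the quote; the three-shrink horizon is the one assumed above):

Code:
# Power remaining after 3 full-node shrinks, assuming a fixed
# reduction per shrink (the 20% and 30% figures quoted above).
for per_shrink_reduction in (0.20, 0.30):
    remaining = (1.0 - per_shrink_reduction) ** 3
    print(f"{per_shrink_reduction:.0%} per shrink -> {remaining:.2f}x power after 3 shrinks")
# Output:
# 20% per shrink -> 0.51x power after 3 shrinks
# 30% per shrink -> 0.34x power after 3 shrinks

So three shrinks buy you roughly 1/2 to 1/3 of the power per core, nowhere near 1/50.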
Do you expect the general architecture of GPUs as we know them today to scale much longer? That's not the direction we are heading in: unified shader units are de facto becoming more and more similar to CPU cores.
I used the term cores because that's what you will have: x86 cores in Larrabee, SPUs in Cell, and so on. Again, you have to see this from the perspective of a future in which a GPU is basically an array of general-purpose stream processors, with very little specialized silicon compared to the programmable cores.
And yes, power is the main trouble with this direction: 32-core Larrabee prototypes (65nm) are said to require about 300W. Flexibility is obviously not always reconcilable with efficiency. And yet, I'm writing this on a quad-core, 3-something GHz desktop computer, not on a 10MHz one.
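To put rough numbers on the 6-year scenario (purely a sketch: the 300W / 32-core figure is the rumoured Larrabee one above, the 1.5x-per-year gain and ~30% power cut per shrink come from this thread, and the rest is my own assumption):

Code:
# Rough sketch of the 6-year, 50-core scenario under this thread's assumptions.
years, shrinks = 6, 3                       # roughly one full node every two years
per_core_perf = 1.5 ** years                # 1.5x per year, compounded -> ~11x
per_core_power = 0.7 ** shrinks             # ~30% less power per shrink -> ~0.34x

cores = 50
baseline_cores, baseline_watts = 32, 300.0  # rumoured 65nm Larrabee prototype
watts_per_core = baseline_watts / baseline_cores

chip_watts = cores * watts_per_core * per_core_power
print(f"per-core performance: ~{per_core_perf:.0f}x")
print(f"per-core power:       ~{per_core_power:.2f}x")
print(f"{cores} such cores: ~{chip_watts:.0f} W (vs ~{baseline_watts:.0f} W for {baseline_cores} today)")
# Note: this credits the shrinks with all their power savings while ignoring
# the power cost of the ~11x per-core performance increase, which is exactly
# where NJ5's objection bites.

Even with that generous accounting, per-core power only drops to about a third, not a fiftieth, so something has to give: clocks, per-core complexity, or the performance target.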
PS: I'm not saying his numbers are accurate, just trying to read them in the light of foreseeable hardware trends so that they make at least some sense. Since I don't have access to his actual words I can't really know what _he_ meant.