alpha_dk said:
I agree with what you are saying, but keep in mind that personal computing has historically trailed server technology ('big iron', as some call it): what happens on the big machines tends to be a precursor for what happens in personal computers. Decades ago there was the mainframe and dumb terminals. Then the technology got small enough that everyone could have their own mini-mainframe, and the personal computer was born. Mainframes continued to get faster and more streamlined, with new advancements to speed up execution (hyperthreading, multiple processors, larger caches, branch prediction, etc.), and those eventually made their way into personal computers as prices came down. Today we are already seeing the benefits of the move to multiple cores/processors.

We are also at an interesting junction in the split of server technology. On the one hand, you have Intel moving toward more and more general-purpose cores. On the other, you have AMD buying ATI and folding some of its tech into AMD's processors as a specialized processing unit, and IBM and Sony developing the Cell as another take on a general CPU plus multiple specialized processing units. It's an interesting time to be following processor tech, in that no one is quite sure yet which will be the better solution: a small number of general cores plus specialized cores that can be used as needed, or a larger number of general cores. Each approach has its own strengths and weaknesses, and each could easily be viewed as a 'waste' of die space by proponents of the other view (why have a separate core for task XXX if XXX will never be needed? Why have a whole 8 general cores when you will only ever need 4?). To say that the Cell is not useful for normal (read: games) programming is only true because it is still remarkably new tech.
Compilers will keep getting better at SIMD-izing code, and new programming toolkits are being built to make SIMD-izing easier (IBM has a really cool one under development, 'octopus' if you are interested in looking it up, and I assume other companies have them as well). These kinds of tools will only improve with time, whereas we are reaching the point of diminishing returns with standard CPU compilers; it has yet to be shown how far we can go with specialized processors. I agree that the Cell is not as good for use in games yet; Sony definitely jumped the gun on when it would be ready for mass acceptance. Next generation, the compiler technology for the Cell and processors like it will likely be so much better that it could be used with very little extra work; right now it is just too early in the Cell's lifecycle.
Today, most server applications are built to take advantage of multiple processors/cores; nothing new there. Multi-threaded gaming in general is much newer. IBM was in a position to put the Cell right to work, and they knew this. What Sony did with it wasn't their problem. As far as gaming applications are concerned, it remains to be seen if the Cell can deliver. As it stands today, the Cell is clearly not the Swiss Army Processor. As I am fond of saying whenever people start to put it on too high a pedestal, the proof is in the pudding. When it lives up to the potential that many Sony fanboys think it has, then I'll believe it. Until that point, they are just parroting Sony's marketing.
As far as technology goes, it is interesting to see processor tech fragment in this way. Since engineers have had to find ways other than increasing clock speed to get performance, processor tech has gone in different directions. Some solutions, however, may not be great for all applications. In all likelihood, there will not be one technology that serves everyone best.
I give that post a 9.8.
Thank god for the disable signatures option.