
@ alpha_dk said:

"The number of Linux kernel developers has risen from 483 to 1,057 in 3 years"

response:

Various companies have been pouring money in to get their architectures supported as Linux has become the standard Unix. That doesn't say anything about the OS industry as a whole. Just think about how many OSes, experimental and commercial, there were in the '80s. And anyhow, it's not just AIX; I got the same impression from MS (not that I would ever work for them, or on AIX for that matter, I just went to the interviews for the free vacations). And if you think there are tons of people with Mach experience, HAH.

 And most commercial OSes these days are more in maintenance mode than they were in previous years. A lot of the changes people think are part of the OS are really some userspace app.

 Plus, have you ever met any of these developers? They tend to be older than the average developer, in my experience, with the exception of the Darwin (Mach) guys.

@ alpha_dk said:

I can't argue with this, but I fail to see how it applies to the argument; this has been ongoing for years, and development of low-level stuff has only increased recently (see LLVM, the rate of Linux development, the rate of BSD development, etc. They are all growing rapidly).

response:

Rate of BSD development compared to when? Once upon a time they invented the TCP stack (which MS stole w/o giving anything back, yay BSD license). And the rate of increase is going to be pretty high when you go from 5 to 10.

But the main point was that there was a time when you needed to know how things really worked in order to do any kind of development, so the transition was easier. Now it's impossible: you cannot train a VB programmer to do kernel development in 6 months.

Additionally, people were more open-minded about different architectures. If you think Cell is a complex architecture, just look at the old Cray machines.

This is of course all anecdotal; if I'm going to be doing any research, it's going to be on my thesis. But I can tell you that our CS department is under extreme pressure to change the curriculum and take out C, as many of the big schools have. Maybe it's all in their heads, I dunno. But Fortran (which in many situations is more efficient than C) and asm are no longer required or taught (outside of a general programming-languages or architecture class). Was this the case 15 years ago?

@ alpha_dk said:

There is more that is hard about programming the Cell than the lack of cache coherency. That is one thing that is hard about it, and programmers will have to get used to it, but people are not used to SIMD'izing general-purpose code; we have a lot of experience with SIMDizing certain types of algorithms, but not much for others, because there has not been much need to do it yet, as there have really only been special-purpose SIMD processors. With the advent of general-purpose SIMD processors, more research is going into this, and Cell (and other similar) compilers are only going to get better as time goes on. Which is the entire point of my saying that Sony decided to use it too early; IBM has had great success with its Cell compilers, but it is focusing on its own needs and not Sony's (as it should). In a few years, this research will likely turn into more generalized speedups.

response:

SIMD is not THAT radical. Seymour Cray would like a word with you. gcc, suncc, and icc have had loop vectorization for a while; you just need to know what you are doing a little bit. x86 programmers have been using it for a while now. Do you never use the SSE flags when you compile your Linux apps (you come across as either a Linux or BSD guy)? And anyhow, isn't SIMD exactly what everyone was doing in the vector-processor heyday?

The lack of cache coherency, OTOH, breaks the threading model people are used to. They only "just" ported pthreads to Cell on Linux, and it's not your normal pthreads; it's all cracked out with contexts and blah, blah. I don't even bother with it.