haxxiy said:
Even if you are right, that's an irrelevant argument for addressing the issue in question. Laughing at economic doomsayers back in 1999 because they missed their schedule didn't stop the global recession from happening when it actually did. As for computing, a lot of money keeps going into it; you should know that, having studied the subject. Two decades ago Intel, for instance, spent 10 times less money on R&D to get the same results, proportionally speaking. Ever heard of diminishing returns? It's just a matter of time before the issue GPUs face nowadays grows and spreads to the rest of the market. This isn't a theoretical prediction - we are actually seeing it happen. Moore's law wouldn't have lasted forever anyway. It's a bogus pseudo-law like Bode's law: it works until it stops working.
Not sure what your point is with all of this useless info. Moore's Law isn't in any more danger now than it was 30 years ago.
"At any rate, it seems that the laws of physics present no barrier to reducing the size of computers until bits are the size of atoms, and quantum behavior holds dominant sway." - Richard Feynman
http://www.cs.princeton.edu/courses/archive/fall05/frs119/papers/feynman85_optics_letters.pdf