Khuutra said:

Does that mean we're potentially coming up against a physical limitation concerning the power and scale of classic computing? Like.... within the next few decades?

Indeed. We will have to fight hard to keep classical computation working at scales of a few nanometers, perhaps by using entirely different hardware techniques that exploit quantum effects to achieve the same (classical) results.

To give you a timeline: in 2011 we are talking about 22nm chips (22nm being roughly the size of a single memory element). Ten years ago we were working at scales of 150nm, which means almost a 7x reduction in feature size over the decade.
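As a back-of-envelope sketch of why that shrink rate is hard to sustain, here is the arithmetic from the numbers above, with the silicon lattice constant (~0.543 nm) added as a reference point for how few atoms would be left at the next step:

```python
# Back-of-envelope check of the scaling numbers in the post.
# The silicon lattice constant is an added reference point, not from the post.
node_2001_nm = 150.0  # feature size circa 2001
node_2011_nm = 22.0   # feature size circa 2011

factor = node_2001_nm / node_2011_nm  # roughly 6.8x shrink over the decade
print(f"2001 -> 2011 shrink factor: {factor:.1f}x")

# If the same factor held for the next decade:
node_2021_nm = node_2011_nm / factor
si_lattice_nm = 0.543  # silicon lattice constant
print(f"Naive 2021 node: {node_2021_nm:.1f} nm "
      f"(about {node_2021_nm / si_lattice_nm:.0f} silicon lattice constants)")
```

A feature only a handful of lattice constants wide is the kind of scale where quantum effects dominate, which is the point being made above.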

Thus, it's unlikely that we'll see the same factor of raw miniaturization in the next ten years that we've seen in the last decade, unless a radical breakthrough comes along. More probably, the race to smaller scales will level off, a bit like the race to many-GHz clocks has.

There are many other ways, of course, in which classical computing devices will improve: cores can be stacked in 3D like a sandwich, the recent proof of concept of the memristor could lead the way to more efficient, non-volatile memory chips, and so on.

"All you need in life is ignorance and confidence; then success is sure." - Mark Twain

"..." - Gordon Freeman