
Forums - PC Discussion - When will we see PC technology reach its limit?

I think what you are looking for is the Singularity. It's not so much when it stops getting better, but when 1) we are no longer able to notice the difference, and 2) AI is self-replicating and self-enhancing.

There is a point where the curve of Moore's Law swings upward so drastically that it climbs almost infinitely fast.

(Which is predicted to be 2020.)



I would cite regulation, but I know you will simply ignore it.


I should point out that Moore's Law is based on a somewhat outdated view of matter in general. We used to think the atom was the fundamental particle, hence the name: "atom" is a Greek word for something so basic that it is not made of anything smaller.

We've already moved past atoms, though, and while most people will cite quarks as the new fundamental particle, in reality, under the current reigning theory, a quark is one of many fundamental particles, which can be split into two groups: fermions and bosons (a quark is a fermion). Essentially, fermions make up the matter we see and bosons are responsible for the fundamental forces (i.e. electromagnetism, the strong and weak forces, and technically gravity, but that's a whole other can of worms).

What's more, the prevailing theory right now is that even these particles are not actually fundamental, and that there exists a single underlying fundamental entity below them, which has been dubbed the "string" (because they would look like strings if they exist). Suffice it to say their existence hasn't been confirmed, but there is mounting evidence, as well as several aspects of the theory that have turned up surprisingly elegant solutions to problems physicists had been working on for years. One in particular got a lot of people interested, but I doubt anyone wants me to explain what a Calabi-Yau space is =P

In short, we keep finding smaller and smaller fundamental particles, and as long as we do, the sky is the limit. Then again, things like the Heisenberg uncertainty principle and the "weirdness" that results from its implications give me a headache when I think about how we could even utilize these particles effectively.



To Each Man, Responsibility

@Sqrl: As far as I know, Moore's law says nothing about a limit in growth. It only quantifies the exponential improvement rate of chip technology.



My Mario Kart Wii friend code: 2707-1866-0957

*remembers floppy discs*



NJ5 said:
@Sqrl: As far as I know, Moore's law says nothing about a limit in growth. It only quantifies the exponential improvement rate of chip technology.

Agreed, but if we hit a fundamental-particle barrier, the only way to exponentially increase power is through an exponential increase in the number of existing components... and it won't take long for that to hit a ceiling of practicality in terms of size.
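To see why that ceiling arrives so fast, here's a rough sketch (all figures illustrative, not real die sizes): if feature size stops shrinking, doubling the transistor count means doubling the silicon area, so the chip grows exponentially.

```python
# Sketch: with a fixed feature size, each doubling of component count
# doubles the area needed. Starting die size is a made-up figure.
die_area_mm2 = 100.0  # hypothetical starting die area
for doubling in range(1, 11):
    die_area_mm2 *= 2
    print(f"after doubling {doubling}: {die_area_mm2:.0f} mm^2")
```

After ten doublings (about 15 years at Moore's 18-month pace) the die is 1024 times larger, which is exactly the practicality ceiling being described.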

 



To Each Man, Responsibility
Sqrl said:
NJ5 said:
@Sqrl: As far as I know, Moore's law says nothing about a limit in growth. It only quantifies the exponential improvement rate of chip technology.

Agreed, but if we hit a fundamental-particle barrier, the only way to exponentially increase power is through an exponential increase in the number of existing components... and it won't take long for that to hit a ceiling of practicality in terms of size.

 

Moore's "law" (I'd call it a prediction) is precisely that: the number of transistors on CPUs doubles every 18 months. Performance improvements are just a corollary. It is sometimes stated as being about performance, but that's an altered version of his original statement.

When Moore's law can't hold anymore due to impracticality of shoving more and more transistors in the same planar integrated circuit, 3D circuits or another paradigm will take over (potentially using knowledge about smaller particles as you said).
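The doubling arithmetic is easy to play with. A minimal sketch (the starting figures are illustrative; the 4004's ~2,300 transistors in 1971 is the usual anchor point):

```python
def projected_transistors(start_count, start_year, target_year,
                          months_per_doubling=18):
    """Project a transistor count forward under a fixed doubling period."""
    months = (target_year - start_year) * 12
    doublings = months / months_per_doubling
    return start_count * 2 ** doublings

# e.g. projecting the Intel 4004's ~2,300 transistors forward from 1971
print(projected_transistors(2300, 1971, 2008))
```

Run it and the projection overshoots real 2008 chips considerably, which is part of why the "law" gets restated with longer doubling periods (24 months is also commonly quoted).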

 



My Mario Kart Wii friend code: 2707-1866-0957

NJ5 said:
Sqrl said:
NJ5 said:
@Sqrl: As far as I know, Moore's law says nothing about a limit in growth. It only quantifies the exponential improvement rate of chip technology.

Agreed, but if we hit a fundamental-particle barrier, the only way to exponentially increase power is through an exponential increase in the number of existing components... and it won't take long for that to hit a ceiling of practicality in terms of size.

 

Moore's "law" (I'd call it a prediction) is precisely that: the number of transistors on CPUs doubles every 18 months. Performance improvements are just a corollary. It is sometimes stated as being about performance, but that's an altered version of his original statement.

When Moore's law can't hold anymore due to impracticality of shoving more and more transistors in the same planar integrated circuit, 3D circuits or another paradigm will take over (potentially using knowledge about smaller particles as you said).

 

Trust me, as a computer electronics engineering student I've run across it dozens of times, but there really isn't any use in fighting the minor semantic error, especially when it's used in a more mundane conversation and still gets the idea across just as well.

Also, multi-level circuits are already in use, although not in widespread use. There have also been several attempts to rethink the binary approach to computing, and I'm sure most people are aware of the efforts to get a quantum computer working, which have already had some surprising results.

 



To Each Man, Responsibility
Sqrl said:

Trust me, as a computer electronics engineering student I've run across it dozens of times, but there really isn't any use in fighting the minor semantic error, especially when it's used in a more mundane conversation and still gets the idea across just as well.

Also, multi-level circuits are already in use, although not in widespread use. There have also been several attempts to rethink the binary approach to computing, and I'm sure most people are aware of the efforts to get a quantum computer working, which have already had some surprising results.

 

 

Agreed.

Regarding quantum computing, its importance tends to be overstated. As far as current knowledge goes, QC is just a way to get fast solutions to some particular problems, like searching or inverting a function, or hard problems like factoring. It is not a general way to get exponential speedups on every computational problem. A quantum computer could serve as a co-processor for some applications, not as a replacement for traditional computing.
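The point about particular problems is concrete: Grover's algorithm, for instance, searches an unstructured list of N items in roughly (pi/4)*sqrt(N) quantum queries versus about N/2 expected classical lookups, a quadratic (not exponential) gain. A quick comparison of the query counts:

```python
import math

def classical_queries(n):
    """Expected lookups to find one marked item among n, classically."""
    return n / 2

def grover_queries(n):
    """Optimal quantum query count for unstructured search (Grover)."""
    return (math.pi / 4) * math.sqrt(n)

for n in (10**6, 10**12):
    print(n, classical_queries(n), round(grover_queries(n)))
```

A quadratic speedup is very real for search, but nothing here helps, say, running ordinary branchy application code faster, which is why "co-processor" is the right mental model.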

 



My Mario Kart Wii friend code: 2707-1866-0957

When hell freezes over.



Probably never...there's still quantum computing anyway.