
AMD sees the era of Moore's law coming to a close

dsgrue3 said:
They've been predicting the end of Moore's Law every decade since it began. Shit's tired. We're always finding new ways.

Even if you're right, that argument is irrelevant to the issue in question. Laughing at economic doomsayers back in 1999 because they missed their schedule didn't stop the global recession from happening when it actually did.

As for computing, a lot of money keeps going into it; you should know that, having studied the subject. Two decades ago, Intel, for instance, spent a tenth as much on R&D, proportionally speaking, to get the same results. Ever heard of diminishing returns?

It's just a matter of time before the problem GPUs face today grows and spreads to the rest of the market. This isn't a theoretical prediction: we're actually watching it happen. Moore's law wouldn't have lasted forever anyway. It's a bogus pseudo-law, like Bode's law: it works until it doesn't.
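To put rough numbers on the diminishing-returns point, here's a minimal back-of-the-envelope sketch in Python. Every figure in it is hypothetical, chosen only to illustrate the mechanism: once wafer cost per unit area grows faster than transistor density, shrinking stops making transistors cheaper.

# All numbers hypothetical, for illustration only.
# Moore's law gives roughly 2x transistor density per node shrink;
# if wafer cost per mm^2 grows faster than 2x per node,
# cost per transistor flattens or rises.

nodes = ["28nm", "20nm", "16nm", "11nm"]  # assumed node sequence
density = [1.0, 2.0, 4.0, 8.0]            # relative transistors per mm^2
area_cost = [1.0, 2.1, 4.6, 10.0]         # relative wafer cost per mm^2 (assumed)

for node, d, c in zip(nodes, density, area_cost):
    print(f"{node}: relative cost per transistor = {c / d:.2f}")

# Prints 1.00, 1.05, 1.15, 1.25 - each shrink now makes transistors
# slightly more expensive, not cheaper.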

haxxiy said:
dsgrue3 said:
They've been predicting the end of Moore's Law every decade since it began. Shit's tired. We're always finding new ways.

Even if you're right, that argument is irrelevant to the issue in question. Laughing at economic doomsayers back in 1999 because they missed their schedule didn't stop the global recession from happening when it actually did.

As for computing, a lot of money keeps going into it; you should know that, having studied the subject. Two decades ago, Intel, for instance, spent a tenth as much on R&D, proportionally speaking, to get the same results. Ever heard of diminishing returns?

It's just a matter of time before the problem GPUs face today grows and spreads to the rest of the market. This isn't a theoretical prediction: we're actually watching it happen. Moore's law wouldn't have lasted forever anyway. It's a bogus pseudo-law, like Bode's law: it works until it doesn't.

Not sure what your point is with all of this useless info. Moore's Law isn't in any more danger now than it was 30 years ago. 

"At any rate, it seems that the laws of physics present no barrier to reducing the size of computers until bits are the size of atoms, and quantum behavior holds dominant sway."Richard Feynman

http://www.cs.princeton.edu/courses/archive/fall05/frs119/papers/feynman85_optics_letters.pdf



dsgrue3 said:
They've been predicting the end of Moore's Law every decade since it began. Shit's tired. We're always finding new ways.

Predicting, yes. But the end is NOW. This year. No company other than Intel can afford chips half the size of the ones they had two years ago.

It's not about "new ways"; we already know what those are for 16nm and 11nm. It's about cost.

AMD and Nvidia have just stopped doing yearly graphics card releases, which had been the norm since PC graphics cards became an actual thing around 1998.

There's no physics barrier, but that's irrelevant.



Wouldn't that mean that tablets and smartphones will catch up really quickly? So we'd have PS3/360 quality on our phones in a year or two?


NiKKoM said:
Wouldn't that mean that tablets and smartphones will catch up really quickly? So we'd have PS3/360 quality on our phones in a year or two?

No, process node scaling affects phone chips too. Expect slower progress there, especially on battery life.

But, uh, the iPad 4 already has a PS3-class GPU: http://www.anandtech.com/show/6877/the-great-equalizer-part-3/3