AMD sees the era of Moore's law coming to a close

Summary: Moore's law has remained relevant for over half a century, but evidence is mounting to suggest that it is coming to an end. What will the end of Moore's law mean to the tech industry?

 

Back in 1965, Intel co-founder Gordon Moore noted that the number of transistors in integrated circuits had been doubling approximately every two years since 1958, and he predicted that this trend would continue "for at least 10 years". It continued a lot longer than that: until 2010, in fact, when the International Technology Roadmap for Semiconductors began reporting evidence that the pace was slowing, to the point that by the end of 2013 it would take three years for transistor counts to double.
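The difference between a two-year and a three-year doubling period compounds quickly. As a rough illustration (an idealized growth model with a hypothetical starting count, not actual shipment data):

```python
# Idealized Moore's-law growth: the count doubles once every `period` years.
def transistors(start_count, years, period):
    return start_count * 2 ** (years / period)

base = 1_000_000  # hypothetical starting transistor count

# Over a decade, a 2-year doubling period yields 32x growth...
assert round(transistors(base, 10, 2) / base) == 32
# ...while a 3-year doubling period yields only ~10x.
assert round(transistors(base, 10, 3) / base) == 10
```

In other words, stretching the doubling period by just one year cuts a decade's growth to less than a third of what the classic cadence would deliver.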

Now an industry insider is seeing the same thing. Speaking to The Inquirer, John Gustafson, chief graphics product architect at AMD, claimed that Moore's law is hitting the buffers because the law was always about the economics.

"You can see how Moore's law is slowing down," said Gustafson. "The original statement of Moore's law is the number of transistors that is more economical to produce will double every two years. It has become warped into all these other forms, but that is what he originally said."

And Gustafson should know what Moore said, because the phrase "Moore's law" was first coined by professor Carver Mead at Caltech, and Gustafson was a student of Mead's.

"We [AMD] want to also look for the sweet spot," said Gustafson, "because if you print too few transistors your chip will cost too much per transistor and if you put too many it will cost too much per transistor. We've been waiting for that transition from 28nm to 20nm to happen, and it's taking longer than Moore's law would have predicted.

"I'm saying you are seeing the beginning of the end of Moore's law."

Does the end of Moore's law mean the sky is going to come crashing down on us? Well, yes and no. The PC industry is in terrible shape, and anything that puts the brakes on progress is unwelcome. A slower hardware upgrade cycle means fewer new CPUs and GPUs to tempt buyers.

The end of Moore's law is not good for the PC.

That said, CPU and GPU performance is already at the point where it offers more power than most users know what to do with. Silicon already comes with more cores and more threads than most applications can use. What people want more than a faster CPU or GPU is a cheaper one.

Consumers are also turning their backs on traditional desktop and notebook PCs in favor of mobile devices such as smartphones and tablets. There is a drive here to make silicon smaller and more compact, but it is primarily driven by the need to cut power consumption and extend battery life. A breakdown of Moore's law might slow progress here somewhat, but users are far less likely to notice, because the sales pitch for post-PC devices has rarely focused on the speed of the silicon.

Moore's law has served us well, and rather than be surprised that its era is drawing to a close, I for one am surprised that it reigned for as long as it did.

http://www.zdnet.com/amd-sees-the-era-of-moores-law-coming-to-a-close-7000013413/



I present to you, graphene. http://www.gizmag.com/graphene-interconnects-integrated-circuits/11934/

Not only that, it could be used not just in the processor but, theoretically, in every component that currently requires silicon. Graphene offers almost zero resistance, so speeds are astronomical. Graphene antennas are already capable (in this early prototype form) of wireless data transfer rates of 100 Tb/s. For comparison, the Xbox 360 has a memory bandwidth of ~22 GB/s through silicon. Graphene will take a while to completely replace silicon in components, but in the meantime it will go a long way toward keeping Moore's law in effect.
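Worth noting that the two figures being compared use different units: the antenna rate is in terabits, the console bandwidth in gigabytes. Taking the quoted numbers at face value, the gap works out like this (a sketch assuming the 100 Tb/s and ~22 GB/s figures from the post):

```python
# Convert both quoted rates to a common unit (bytes per second).
graphene_Bps = 100e12 / 8   # 100 terabits/s -> 12.5 terabytes/s
xbox360_Bps = 22e9          # ~22 gigabytes/s memory bandwidth

ratio = graphene_Bps / xbox360_Bps
assert round(ratio) == 568  # roughly 570x the Xbox 360's bandwidth
```

So the prototype-antenna claim works out to a factor of several hundred, not several thousand, once bits and bytes are reconciled.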

Graphene is also only one atom thick, and producing it doesn't cost a lot. Scientists recently took a consumer-grade LightScribe DVD burner and placed over 100 graphene micro-supercapacitors on a single disc in under 30 minutes. That helps with the whole economics problem as well as the size constraints. 14nm fabrication? Please. Carbon atoms are about 0.22 nm in diameter... try to wrap your head around that as a processor. o.O

Once science gets cracking on implementing graphene regularly, Moore's law should be just fine.

RicardJulianti said:
I present to you, graphene. http://www.gizmag.com/graphene-interconnects-integrated-circuits/11934/

Not only that, it could be used not just in the processor but, theoretically, in every component that currently requires silicon. Graphene offers almost zero resistance, so speeds are astronomical. Graphene antennas are already capable (in this early prototype form) of wireless data transfer rates of 100 Tb/s. For comparison, the Xbox 360 has a memory bandwidth of ~22 GB/s through silicon. Graphene will take a while to completely replace silicon in components, but in the meantime it will go a long way toward keeping Moore's law in effect.

Graphene is also only one atom thick, and producing it doesn't cost a lot. Scientists recently took a consumer-grade LightScribe DVD burner and placed over 100 graphene micro-supercapacitors on a single disc in under 30 minutes. That helps with the whole economics problem as well as the size constraints. 14nm fabrication? Please. Carbon atoms are about 0.22 nm in diameter... try to wrap your head around that as a processor. o.O

Once science gets cracking on implementing graphene regularly, Moore's law should be just fine.

If anything is right about this world, the person who patented this technology will officially name it Blast Processing.



I don't remember what I was going to put here.

Well, it's not the end of the world, and it could even turn out to be a positive.

I mean, 64-bit processors have been around for many years, yet most of the programs we use are still written for 32-bit environments. If all our software ran taking full advantage of what today's CPUs can actually do, we would see faster PCs even if the CPUs themselves stopped advancing at the old pace.
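The headroom being described here is real; for example, pointer width alone caps how much memory a process can address. A quick back-of-the-envelope check (pure arithmetic, no platform specifics assumed):

```python
# Maximum directly addressable memory for a flat address space.
GIB = 2 ** 30

addr_32 = 2 ** 32  # bytes addressable with 32-bit pointers
addr_64 = 2 ** 64  # bytes addressable with 64-bit pointers

assert addr_32 // GIB == 4            # a 32-bit process tops out at 4 GiB
assert addr_64 // addr_32 == 2 ** 32  # 64-bit widens that by ~4 billion times
```

That 4 GiB ceiling is one concrete reason a 32-bit binary can't exploit the RAM that has long been standard in desktop machines.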

 

Please excuse my bad English.

Currently gaming on:

+Console : Wii U, Xbox360

+PC: i5-4670k @ stock (for now), 16 GB RAM 1600 MHz, HD5850 1 GB

Steam / Live / NNID : jonxiquet    Add me if you want, but I'm more a single player gamer.

Well, when they artificially slow down the evolution of technology (let's just make small improvements, because people will still buy our stuff) and games have already slowed down graphics evolution thanks to old consoles, no one is going to buy computers.

I mean, gamers are probably the biggest group of PC buyers. Sure, there are companies and so on, but they usually buy a cheap office PC and that's it for 5 years, whereas gamers used to buy PCs every 2 years.

Seriously, when you bought a high-end computer in 2008 you had no real reason to buy another one. Playing the newest game at 35 FPS instead of 50 FPS doesn't make "much" difference, and neither does changing the settings from ultra high to just high.

Sure, there are people who always want the best possible experience, but nobody I know has bought a computer since 2009. My friends and I used to buy a new computer every 2 years because we had a reason to.

There is the occasional game like GTA IV that runs terribly on most old PCs because of CPU limitations, but that's it. My 2008 Nvidia 8800GT (overclocked) can play Borderlands 2 at the highest settings at around 40 FPS in 1080p.

The problem is that Nvidia, AMD and the rest thought they could milk customers with slowly evolving products, but they more or less killed the market on their own. Compare the average 150-dollar card from 2002 through 2009 and you always see massive performance boosts: the 6600GT vs the 7600GT vs the 8800GT. Then, with the arrival of the 9800GT and the GTX 200 series, it drastically slowed down. You either had to pay three times the money to get the same performance boost, or you got a 20% boost for the same money.

I wanted to buy a new computer for Skyrim because my PC occasionally drops to around 26 FPS, but then I thought: wow, when I think about it, buying new hardware is totally stupid. You waste hundreds of dollars and get what, +14 FPS? No thanks. I will wait for the PS4 and the next Xbox to come out; a year later everything will be affordable and super fast again, like it was before the consoles slowed everything down.



I am starting a job at Intel in two months. I was talking to an employee a few days ago, and he said the technology they are hoping to start rolling out by the end of this year will eclipse anything out there. So I don't think Moore's law is coming to a close any time soon.

For everyone except Intel, it is, as he says, "no longer economical to produce" twice the transistor density every two years.

Intel had 22nm in 2012
TSMC will have 22nm in 2013
GlobalFoundries/Samsung will have 22nm in 2014

Everyone else has given up. Five years ago, more than 20 companies were on the latest process node.

So with NO COMPETITION, Intel may find it economical to slow down the pace of new nodes. Despite having the technology, it may be more profitable to wait. Moore's law isn't about what CAN be done.

Also, AMD is as good as dead: its CPU tech is no longer competitive, and even if it were, its manufacturing via GlobalFoundries will be two years behind Intel's.



Soleron said:

For everyone except Intel, it is, as he says, "no longer economical to produce" twice the transistor density every two years.

Intel had 22nm in 2012
TSMC will have 22nm in 2013
GlobalFoundries/Samsung will have 22nm in 2014

Everyone else has given up. Five years ago, more than 20 companies were on the latest process node.

So with NO COMPETITION, Intel may find it economical to slow down the pace of new nodes. Despite having the technology, it may be more profitable to wait. Moore's law isn't about what CAN be done.

Also, AMD is as good as dead: its CPU tech is no longer competitive, and even if it were, its manufacturing via GlobalFoundries will be two years behind Intel's.

Samsung will probably catch up with Intel's current technologies, but yes, I agree: for the other companies there is no point in even trying to compete.

Off topic: Poor AMD



I'm actually happy that Moore's law is coming to an end, because it will ensure that the next-gen consoles remain relevant for a long time. PCs simply won't outmatch the consoles as fast as they used to.

Even when I was a PC gamer (I'm currently not), I found it stressful that there was always new and better hardware around the corner, making my current PC obsolete ahead of time.

They've been predicting the end of Moore's Law every decade since it began. Shit's tired. We're always finding new ways.