
IBM develops 'instantaneous' memory, 100x faster than flash

zarx said:
Rainbird said:
Suck it, von Neumann bottleneck! You're no longer as bad as you used to be!

Well, until graphene-based 100GHz CPUs come along

http://www.engadget.com/2010/02/07/ibm-demonstrates-100ghz-graphene-transistor

Well, that and hard drives are still slow as fuck.



Rainbird said:
zarx said:
Rainbird said:
Suck it, von Neumann bottleneck! You're no longer as bad as you used to be!

Well, until graphene-based 100GHz CPUs come along

http://www.engadget.com/2010/02/07/ibm-demonstrates-100ghz-graphene-transistor

Well, that and hard drives are still slow as fuck.


Um, the "instantaneous" memory could be used in SSDs, so HDDs shouldn't be an issue.



@TheVoxelman on twitter

Check out my hype threads: Cyberpunk, and The Witcher 3!
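zarx's SSD point is easy to sanity-check with back-of-envelope numbers. The Python sketch below is purely illustrative: the HDD and SSD throughputs are era-typical guesses of mine, the 2 GB level size is made up, and the last entry simply applies the article's "100x faster than flash" claim to sequential throughput, which the claim may not literally describe.

```python
# Rough load-time estimate for streaming a game level from storage.
# All figures are illustrative assumptions, not measurements.

LEVEL_SIZE_MB = 2 * 1024  # a hypothetical 2 GB level

# Assumed sustained sequential read speeds, in MB/s.
storage_speeds = {
    "hard drive (7200 rpm)": 100,         # era-typical HDD (assumed)
    "flash SSD": 250,                     # era-typical SATA SSD (assumed)
    "'instantaneous' memory": 250 * 100,  # 100x faster than flash, per the title
}

for name, mb_per_s in storage_speeds.items():
    seconds = LEVEL_SIZE_MB / mb_per_s
    print(f"{name:>23}: {seconds:7.2f} s")
```

On those assumptions the level streams in roughly 20 s from the hard drive, 8 s from flash, and well under a second from the new memory, which is where the "no more loading screens" reaction below comes from.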

manuel said:
zarx said:

Yay no more loading screens


Somebody WILL find a way to develop something so complex/big that the loading times will be as slow as they are today. :P


Well, that seems to be the rule these days... I still remember laughing my ass off at the idea of ever running out of HD space when I bought my 2 GB disk ten years ago, or dumbly staring at the 2008 E3 videos of The Force Unleashed on the HD twins after 9 years away from gaming, wondering how they could possibly make a game look better and why any game would even need to.



pariz said:
manuel said:
zarx said:

Yay no more loading screens


Somebody WILL find a way to develop something so complex/big that the loading times will be as slow as they are today. :P


Well, that seems to be the rule these days... I still remember laughing my ass off at the idea of ever running out of HD space when I bought my 2 GB disk ten years ago, or dumbly staring at the 2008 E3 videos of The Force Unleashed on the HD twins after 9 years away from gaming, wondering how they could possibly make a game look better and why any game would even need to.

Yeah, I also had my fair share of WTF moments.

I remember the internet at a friend's place (via ISDN) being very fast in 1997. All sites loaded fast. Then broadband internet came, the sites got more complicated and loaded with Flash, and although the connection speed was a lot faster, the sites actually loaded slower than before. :/

That's why I believe that every time something better/faster is developed, content producers will find a way to negate those advancements with ease. :P



Need something off Play-Asia? http://www.play-asia.com/

IBM comes out with a new experimental memory type every five minutes, but none of them make it into actual products.

@zarx

If the transistors can only be made at a scale ~50x larger than silicon's, that's its entire advantage negated, and I think that's likely.
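Soleron's size objection can be put into rough numbers. A minimal sketch, assuming a ~3 GHz silicon baseline (my figure, not the thread's): a ~50x linear size penalty means ~2500x the area per transistor, which dwarfs the ~33x clock gain.

```python
# Crude per-area throughput comparison: fast-but-large graphene transistors
# vs. slower-but-denser silicon CMOS. Ignores power, architecture, and
# everything else that actually matters; it only prices the size penalty.

silicon_clock_ghz = 3.0     # assumed mainstream silicon clock (my assumption)
graphene_clock_ghz = 100.0  # per the Engadget article linked in the thread
linear_size_ratio = 50.0    # graphene vs. silicon transistor size, per Soleron

speed_gain = graphene_clock_ghz / silicon_clock_ghz  # ~33x faster switching
density_loss = linear_size_ratio ** 2                # ~2500x fewer transistors per die
net = speed_gain / density_loss

print(f"clock gain: ~{speed_gain:.0f}x, density loss: ~{density_loss:.0f}x")
print(f"net throughput per unit of die area: ~{net:.3f}x of silicon")
```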



zarx said:
Rainbird said:
zarx said:
Rainbird said:
Suck it, von Neumann bottleneck! You're no longer as bad as you used to be!

Well, until graphene-based 100GHz CPUs come along

http://www.engadget.com/2010/02/07/ibm-demonstrates-100ghz-graphene-transistor

Well, that and hard drives are still slow as fuck.

Um, the "instantaneous" memory could be used in SSDs, so HDDs shouldn't be an issue.

Hot damn, you're right; I was only thinking of it as appropriate for RAM.



Soleron said:
IBM comes out with a new experimental memory type every five minutes, but none of them make it into actual products.

@zarx

If the transistors can only be made at a scale ~50x larger than silicon's, that's its entire advantage negated, and I think that's likely.


Currently, yes, but that is why we don't have graphene chips now. In 5-10 years we should have solved that problem, if not moved on to something else entirely.



@TheVoxelman on twitter

Check out my hype threads: Cyberpunk, and The Witcher 3!

zarx said:
Soleron said:
IBM comes out with a new experimental memory type every five minutes, but none of them make it into actual products.

@zarx

If the transistors can only be made at a scale ~50x larger than silicon's, that's its entire advantage negated, and I think that's likely.


Currently, yes, but that is why we don't have graphene chips now. In 5-10 years we should have solved that problem, if not moved on to something else entirely.

Well, CMOS is good down to 11nm, according to Intel.

By the time we need something better than 11nm, process R&D costs will be so high that no one will be able to afford the next transition for a while. If you need to spend more than 1.3 billion dollars every two years, I'm not sure any company other than Intel will want to do it. And if there's no competition, there is no advancement.
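Soleron's two figures combine into a quick toy projection. Assuming a 32 nm node in 2010 (my assumption) and the standard ~0.7x linear shrink per generation, 11 nm is only about three two-year transitions away; at a flat $1.3B per transition (Soleron's number, held constant for simplicity even though the post argues costs keep rising), the cumulative bill is roughly $4B by mid-decade.

```python
# Count ~0.7x process shrinks from 32 nm (circa 2010, my assumption) down to
# the 11 nm limit Soleron cites, on Intel's classic two-year cadence, and
# tally a flat $1.3B per transition (Soleron's figure, not sourced).

node_nm, year, total_busd = 32.0, 2010, 0.0

while node_nm > 11.05:  # stop once we reach roughly 11 nm
    node_nm *= 0.7      # standard ~0.7x linear shrink per generation
    year += 2           # assumed two-year cadence
    total_busd += 1.3   # flat cost per transition, per the post
    print(f"{year}: ~{node_nm:.0f} nm, cumulative transition cost ~${total_busd:.1f}B")
```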



Companies like IBM and Samsung have been promising us alien tech for the last 15 years, and little of it ever makes it to market.

I honestly expect the computer industry to end up like guns or cars in the medium term: you developed it to be faster, okay, but where do you go now, besides gimmickier and more efficient designs? Costs are too high, simple physics doesn't allow it, and no developers are willing to do anything new with your product...
Soleron said:
zarx said:
Soleron said:
IBM comes out with a new experimental memory type every five minutes, but none of them make it into actual products.

@zarx

If the transistors can only be made at a scale ~50x larger than silicon's, that's its entire advantage negated, and I think that's likely.


Currently, yes, but that is why we don't have graphene chips now. In 5-10 years we should have solved that problem, if not moved on to something else entirely.

Well, CMOS is good down to 11nm, according to Intel.

By the time we need something better than 11nm, process R&D costs will be so high that no one will be able to afford the next transition for a while. If you need to spend more than 1.3 billion dollars every two years, I'm not sure any company other than Intel will want to do it. And if there's no competition, there is no advancement.


The semiconductor industry is currently worth over 100 billion a year and is forecast to continue its trend of growing by ~15% (last year was up 17.9%). Even if growth slows down, it's not likely to stop, even as research moves away from CMOS shrinking and towards alternative materials (graphene) or even 3D processors. Even if Intel's competitors fall behind, Intel will still be compelled to develop better tech, because their business model relies on customers buying a new chip every few years; they may slow down a tiny bit, and prices will certainly increase, but advancement won't stop until no advancements can be made.

Besides, IBM is still up there in R&D, AMD is barely hanging in there, Nvidia could do well once they move into the SoC space in a bigger way, and Samsung and Toshiba are both rising fast, Samsung especially with all the phone and tablet chips they are making now.



@TheVoxelman on twitter

Check out my hype threads: Cyberpunk, and The Witcher 3!
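For scale, the growth figures in zarx's post compound quickly. A minimal sketch using only the numbers quoted there (>$100B base, ~15% a year); real semiconductor revenue is strongly cyclical, so this is an illustration, not a forecast.

```python
# Compound-growth projection of semiconductor industry revenue, using the
# figures from zarx's post (>$100B/year, ~15% annual growth). Purely
# illustrative: real industry revenue is cyclical, not smoothly compounding.

base_busd = 100.0  # industry revenue today, per the post
growth = 0.15      # ~15% per year, per the post

for years in (5, 10):
    projected = base_busd * (1 + growth) ** years
    print(f"after {years:2d} years at 15%/yr: ~${projected:.0f}B")
```

At that rate the industry roughly doubles in five years and quadruples in ten, which is the kind of revenue base that keeps process R&D funded.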