
Wii 2 may get Blu-ray drive

TomaTito said:
gumby_trucker said:
TomaTito said:

For the next generation, I'd like to hear the fan instead of the optical drive.

LOL

Maybe some clever engineers could build a fan that, when rotating, produces tones similar to those of an optical drive, and then program it to run half a phase out of sync with whatever speed the drive is spinning at that moment.

Free Noise Cancellation FTW!!

That would be great! Although you'd have to find a very noisy fan for that to work xD

This is of course just a joke. It would be much less of a pipe dream to invest in some regular active noise-cancelling technology to do the trick. It shouldn't be too expensive either to try adding such a system of your own to a home console or PC... I'm sure somebody somewhere on the web has already tried this lol
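Just to illustrate the idea rather than propose an actual design, here's a toy Python sketch of what "half a phase out of sync" would buy you, assuming the drive noise were a single pure tone at a made-up 200 Hz:

```python
import numpy as np

# Toy sketch of the anti-phase idea: a pure tone plus the same tone shifted
# by half a period cancels out. The frequency and sample rate are made up.
fs = 48_000                                        # samples per second
t = np.arange(fs) / fs                             # one second of timestamps
drive_noise = np.sin(2 * np.pi * 200 * t)          # pretend 200 Hz drive whine
anti_noise = np.sin(2 * np.pi * 200 * t + np.pi)   # same tone, half a cycle out of phase

residual = drive_noise + anti_noise
print(f"peak residual: {np.max(np.abs(residual)):.1e}")   # ~0, i.e. silence
```

A real drive isn't a single pure tone, of course, which is why practical active noise cancellation measures the noise with a microphone and adapts continuously instead of assuming one fixed sine wave.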



Until you've played it, every game is a system seller!

the original trolls

Wii FC: 4810 9420 3131 7558
MHTri: name=BOo BoO/ID=BZBLEX/region=US

mini-games on consoles, cinematic games on handhelds, what's next? GameBoy IMAX?

Official Member of the Pikmin Fan Club


"True, though you'd still see greater expense that would fall entirely to the publishers, and we know how bratty they can be about this sort of thing."

Well, developers have been bratty enough that I'll give you that point.

"Fair enough. I've explained my side of the argument, though. What is your reasoning to believe they would go with flash and have a built-in optical drive for b/c?

Other than, because it's cool."

Because BC is a good thing? Because it helps make customers from previous systems more comfortable? I'm not stating BC is just a neat feature. It adds to the system library.



A flashy-first game is awesome when it comes out. A great-first game is awesome forever.

Plus, just for the hell of it: Kelly Brook at the 2008 BAFTAs

Dr.Grass said:
LordTheNightKnight said:
Dr.Grass said:
zarx said:
Viper1 said:

  PS3 wouldn't match the fill rate of 3 high end cards from 2002-2003.  Not even close.


PS3 RSX GPU fill rate: 4000 MP/s pixel, 12000 MT/s texture. Nvidia's most powerful GPU as of 1 January 2004: 3800 MP/s, 3800 MT/s.


Viper, I'm now forced not to take anything you say seriously again. You just don't know what you're talking about, yet you somehow show off as an expert.


Viper1 was discussing a graphical leap, not a raw numbers leap. And Moore's law hit a brick wall years ago, which is why multi-core processors are the major advancement now instead of raw processor speed.


Then he shouldn't start using numbers in his arguments. And the raw numbers go hand in hand with the graphical leap, so what's your point?

Moore's law hasn't hit a brick wall. The fact that multiple processors are used doesn't take away from the fact that the processing speeds of PCs have gone way up in accordance with the law.

This is simply not true. Transistors have indeed continued to shrink more or less as Moore predicted, but that in no way means performance gains at the same pace. I'm no expert, but I think there is a problem with increasing the switching speed of a transistor too much at these tiny scales, as they become prone to "leaking" current simply due to Brownian motion. This, I think, is one of the reasons why so-called "processing speeds" (MHz/GHz figures) have barely gone up in the last decade. Check Wikipedia and you'll see Pentium 4 processors at speeds above 3 GHz have been on the market as far back as 2002.

With the move to multi-threaded and later multi-core CPUs, it becomes possible to perform more tasks in parallel, so as far as the consumer is concerned they are getting more "bang" per MHz thanks to parallelization. But this scales much worse than you might think: twice the cores is, in most practical cases, much less than twice the performance on any given application. Some of this can be improved by fundamentally rewriting programs that were meant to run serially, but even that process takes many years, and it's not always clear how much it benefits performance.
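To put a rough number on the "twice the cores is much less than twice the performance" point, here's a minimal Python sketch of Amdahl's law, assuming a hypothetical program in which 70% of the work can run in parallel:

```python
def amdahl_speedup(parallel_fraction: float, cores: int) -> float:
    """Amdahl's law: overall speedup is capped by the serial part of the work."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / cores)

# Hypothetical program: 70% of its work parallelizes perfectly, 30% stays serial.
for cores in (1, 2, 4, 8):
    print(f"{cores} cores -> {amdahl_speedup(0.7, cores):.2f}x speedup")
# 1 -> 1.00x, 2 -> 1.54x, 4 -> 2.11x, 8 -> 2.58x: doubling cores never doubles speed here.
```

The 70% figure is only for illustration; the exact numbers depend entirely on how much of a given game or application can actually be parallelized.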




LordTheNightKnight said:

"True, though you'd still see greater expense that would fall entirely to the publishers, and we know how bratty they can be about this sort of thing."

Well, developers have been bratty enough that I'll give you that point.

"Fair enough. I've explained my side of the argument, though. What is your reasoning to believe they would go with flash and have a built-in optical drive for b/c?

Other than, because it's cool."

Because BC is a good thing? Because it helps make customers from previous systems more comfortable? I'm not stating BC is just a neat feature. It adds to the system library.

BC is good, but everything has its price, and what you are suggesting seems prohibitive in terms of cost. Remember that even Sony removed what many argued was their most compelling feature in the early days of the PS3 (hardware emulation) in order to save a few bucks, and the hardware they removed for that was much less costly than a physical drive.

You probably missed my edit so I'm copying it here:

edit: btw, if you're interested I'd be willing to make a bet, just for fun. Loser treats the winner to 800 points at the Nintendo shop, or something like that. We should know enough details by the end of E3.

*Just to be clear on this: you are saying Nintendo will announce Cafe with flash storage as the primary physical medium, and that the system will also have a built-in DVD drive for older games. I am saying they won't have flash as the primary medium (support for SD cards like the Wii's, or something similar, doesn't count, as it's a given imo) and will instead have some form of optical drive which will allow playing old as well as new games.




"BC is good but everything has it's price and what you are suggesting seems prohibitive in terms of cost. Remember that even Sony removed what many argued was their most compelling feature in the early days of the PS3 (hardware emulation) in order to save a few bucks and the hardware they removed for that was much less costly than a physical drive."

The PS3 was sold at a loss, so any extra cost had to be removed. The BC was not prohibitively expensive. The entire system was prohibitively expensive. This would not apply to the next Nintendo system.

And I'm not claiming that Nintendo will do it, just that they might, and I'm arguing against your claim that it's not practical in terms of cost.




LordTheNightKnight said:

"BC is good but everything has it's price and what you are suggesting seems prohibitive in terms of cost. Remember that even Sony removed what many argued was their most compelling feature in the early days of the PS3 (hardware emulation) in order to save a few bucks and the hardware they removed for that was much less costly than a physical drive."

The PS3 was sold at a loss, so any extra cost had to be removed. The BC was not prohibitively expensive. The entire system was prohibitively expensive. This would not apply to the next Nintendo system.

And I'm not claiming that Nintendo will do it, just that they might, and I'm arguing against your claim that it's not practical in terms of cost.

If it's practical in terms of cost, why wouldn't they do it?




gumby_trucker said:
Dr.Grass said:
LordTheNightKnight said:
Dr.Grass said:
zarx said:
Viper1 said:

  PS3 wouldn't match the fill rate of 3 high end cards from 2002-2003.  Not even close.


PS3 RSX GPU fill rate: 4000 MP/s pixel, 12000 MT/s texture. Nvidia's most powerful GPU as of 1 January 2004: 3800 MP/s, 3800 MT/s.


Viper, I'm now forced not to take anything you say seriously again. You just don't know what you're talking about, yet you somehow show off as an expert.


Viper1 was discussing a graphical leap, not a raw numbers leap. And Moore's law hit a brick wall years ago, which is why multi-core processors are the major advancement now instead of raw processor speed.


Then he shouldn't start using numbers in his arguments. And the raw numbers go hand in hand with the graphical leap, so what's your point?

Moore's law hasn't hit a brick wall. The fact that multiple processors are used doesn't take away from the fact that the processing speeds of PCs have gone way up in accordance with the law.

This is simply not true. Transistors have indeed continued to shrink more or less as Moore predicted, but that in no way means performance gains at the same pace. I'm no expert, but I think there is a problem with increasing the switching speed of a transistor too much at these tiny scales, as they become prone to "leaking" current simply due to Brownian motion. This, I think, is one of the reasons why so-called "processing speeds" (MHz/GHz figures) have barely gone up in the last decade. Check Wikipedia and you'll see Pentium 4 processors at speeds above 3 GHz have been on the market as far back as 2002.

With the move to multi-threaded and later multi-core CPUs, it becomes possible to perform more tasks in parallel, so as far as the consumer is concerned they are getting more "bang" per MHz thanks to parallelization. But this scales much worse than you might think: twice the cores is, in most practical cases, much less than twice the performance on any given application. Some of this can be improved by fundamentally rewriting programs that were meant to run serially, but even that process takes many years, and it's not always clear how much it benefits performance.

 

http://www.intel.com/technology/mooreslaw/

 

That's Moore's law still applying right there. Here's more:

 

"Moore’s law isn’t tracking exactly, but the spirit of the law is still alive in that the dies are still shrinking, and CPUs become more and more capable every 12-18 months or so," said Joel Santo Domingo, lead analyst, desktops at PCMag.com. His former boss agrees. 

"I did the math, and while it’s not exactly doubling every two years, it’s pretty close," agreed Michael Miller, the award-winning math geek and former editor in chief of PCMag.com.  



Read more: http://www.foxnews.com/scitech/2011/01/04/years-later-does-moores-law-hold-true/#ixzz1Mq8tUOy2

EDIT: So if that graph up there is straight, then the growth is exponential and the law pretty much holds. It looks close enough to me.
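For what it's worth, the "straight line means exponential" reading only works because charts like that plot transistor counts on a logarithmic axis. A tiny Python sketch, with a made-up starting count and a fixed two-year doubling period purely for illustration, shows why doubling at a steady rate comes out as a straight line on a log scale:

```python
import numpy as np

# Hypothetical Moore's-law series: counts double every 2 years for 20 years.
years = np.arange(0, 21, 2)
transistors = 1e6 * 2 ** (years / 2)   # made-up 1M-transistor starting point

# On a logarithmic axis, exponential growth plots as a straight line:
# log2(count) increases by the same amount every period.
steps = np.diff(np.log2(transistors))
print(steps)   # every step is exactly 1.0 -> constant slope -> straight line
```

Clock speeds, by contrast, have stayed roughly flat over the same period, which is the distinction the previous post was making.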



gumby_trucker said:
TomaTito said:
gumby_trucker said:
TomaTito said:

For the next generation, I'd like to hear the fan instead of the optical drive.

LOL

Maybe some clever engineers could build a fan that, when rotating, produces tones similar to those of an optical drive, and then program it to run half a phase out of sync with whatever speed the drive is spinning at that moment.

Free Noise Cancellation FTW!!

That would be great! Although you'd have to find a very noisy fan for that to work xD

This is of course just a joke. It would be much less of a pipe dream to invest in some regular active noise-cancelling technology to do the trick. It shouldn't be too expensive either to try adding such a system of your own to a home console or PC... I'm sure somebody somewhere on the web has already tried this lol

I know :P Buying some noise-cancelling headphones can do the trick just as well; you just need a long cord.



@Twitter | Switch | Steam

You say tomato, I say tomato 

"¡Viva la Ñ!"

Dr.Grass said:
gumby_trucker said:
Dr.Grass said:
LordTheNightKnight said:
Dr.Grass said:
zarx said:
Viper1 said:

  PS3 wouldn't match the fill rate of 3 high end cards from 2002-2003.  Not even close.


PS3 RSX GPU fill rate: 4000 MP/s pixel, 12000 MT/s texture. Nvidia's most powerful GPU as of 1 January 2004: 3800 MP/s, 3800 MT/s.


Viper, I'm now forced not to take anything you say seriously again. You just don't know what you're talking about, yet you somehow show off as an expert.


Viper1 was discussing a graphical leap, not a raw numbers leap. And Moore's law hit a brick wall years ago, which is why multi-core processors are the major advancement now instead of raw processor speed.


Then he shouldn't start using numbers in his arguments. And the raw numbers go hand in hand with the graphical leap, so what's your point?

Moore's law hasn't hit a brick wall. The fact that multiple processors are used doesn't take away from the fact that the processing speeds of PCs have gone way up in accordance with the law.

This is simply not true. Transistors have indeed continued to shrink more or less as Moore predicted, but that in no way means performance gains at the same pace. I'm no expert, but I think there is a problem with increasing the switching speed of a transistor too much at these tiny scales, as they become prone to "leaking" current simply due to Brownian motion. This, I think, is one of the reasons why so-called "processing speeds" (MHz/GHz figures) have barely gone up in the last decade. Check Wikipedia and you'll see Pentium 4 processors at speeds above 3 GHz have been on the market as far back as 2002.

With the move to multi-threaded and later multi-core CPUs, it becomes possible to perform more tasks in parallel, so as far as the consumer is concerned they are getting more "bang" per MHz thanks to parallelization. But this scales much worse than you might think: twice the cores is, in most practical cases, much less than twice the performance on any given application. Some of this can be improved by fundamentally rewriting programs that were meant to run serially, but even that process takes many years, and it's not always clear how much it benefits performance.

 

http://www.intel.com/technology/mooreslaw/

 

That's Moore's law still applying right there. Here's more:

 

"Moore’s law isn’t tracking exactly, but the spirit of the law is still alive in that the dies are still shrinking, and CPUs become more and more capable every 12-18 months or so," said Joel Santo Domingo, lead analyst, desktops at PCMag.com. His former boss agrees. 

"I did the math, and while it’s not exactly doubling every two years, it’s pretty close," agreed Michael Miller, the award-winning math geek and former editor in chief of PCMag.com.  



Read more: http://www.foxnews.com/scitech/2011/01/04/years-later-does-moores-law-hold-true/#ixzz1Mq8tUOy2

EDIT: So if that graph up there is straight, then the growth is exponential and the law pretty much holds. It looks close enough to me.

Moore's law says that transistor counts keep doubling (i.e., transistors keep shrinking), not that switching gets faster. That was precisely my point, and it agrees with what your graph shows. Notice I said as much myself in my first sentence, and I didn't bold the part of your statement that referenced Moore's law.
 




gumby_trucker said:
Dr.Grass said:
gumby_trucker said:
Dr.Grass said:
LordTheNightKnight said:
Dr.Grass said:
zarx said:
Viper1 said:

  PS3 wouldn't match the fill rate of 3 high end cards from 2002-2003.  Not even close.


PS3 RSX GPU fill rate: 4000 MP/s pixel, 12000 MT/s texture. Nvidia's most powerful GPU as of 1 January 2004: 3800 MP/s, 3800 MT/s.


Viper, I'm now forced not to take anything you say seriously again. You just don't know what you're talking about, yet you somehow show off as an expert.


Viper1 was discussing a graphical leap, not a raw numbers leap. And Moore's law hit a brick wall years ago, which is why multi-core processors are the major advancement now instead of raw processor speed.


Then he shouldn't start using numbers in his arguments. And the raw numbers go hand in hand with the graphical leap, so what's your point?

Moore's law hasn't hit a brick wall. The fact that multiple processors are used doesn't take away from the fact that the processing speeds of PCs have gone way up in accordance with the law.

This is simply not true. Transistors have indeed continued to shrink more or less as Moore predicted, but that in no way means performance gains at the same pace. I'm no expert, but I think there is a problem with increasing the switching speed of a transistor too much at these tiny scales, as they become prone to "leaking" current simply due to Brownian motion. This, I think, is one of the reasons why so-called "processing speeds" (MHz/GHz figures) have barely gone up in the last decade. Check Wikipedia and you'll see Pentium 4 processors at speeds above 3 GHz have been on the market as far back as 2002.

With the move to multi-threaded and later multi-core CPUs, it becomes possible to perform more tasks in parallel, so as far as the consumer is concerned they are getting more "bang" per MHz thanks to parallelization. But this scales much worse than you might think: twice the cores is, in most practical cases, much less than twice the performance on any given application. Some of this can be improved by fundamentally rewriting programs that were meant to run serially, but even that process takes many years, and it's not always clear how much it benefits performance.

 

http://www.intel.com/technology/mooreslaw/

 

That's Moore's law still applying right there. Here's more:

 

"Moore’s law isn’t tracking exactly, but the spirit of the law is still alive in that the dies are still shrinking, and CPUs become more and more capable every 12-18 months or so," said Joel Santo Domingo, lead analyst, desktops at PCMag.com. His former boss agrees. 

"I did the math, and while it’s not exactly doubling every two years, it’s pretty close," agreed Michael Miller, the award-winning math geek and former editor in chief of PCMag.com.  



Read more: http://www.foxnews.com/scitech/2011/01/04/years-later-does-moores-law-hold-true/#ixzz1Mq8tUOy2

EDIT: So if that graph up there is straight, then the growth is exponential and the law pretty much holds. It looks close enough to me.

Moore's law says that transistor counts keep doubling (i.e., transistors keep shrinking), not that switching gets faster. That was precisely my point, and it agrees with what your graph shows. Notice I said as much myself in my first sentence, and I didn't bold the part of your statement that referenced Moore's law.
 


The original point was that processing speed increases significantly all the time. That's what you had a problem with, and that's why you say the next generation won't offer a significant leap over this one.