
Forums - Nintendo - Wii 2 may get Blu-ray drive

Dr.Grass said:
gumby_trucker said:

Moore's law is that transistors shrink, not that switching gets faster. That was precisely my point and is in agreement with what your graph shows. Notice I said as much myself in my first sentence, and I didn't bold the part of your statement that referenced Moore's law.
 


The original point was that processing speed increases significantly all the time. That's what you had a problem with. That's why you say the next generation won't offer a significant leap over this one.


Don't confuse me with Viper1 or anyone else that was debating this topic with you. I simply "butted in" to say that what you refer to as processing speed has not increased significantly. The part of your post that I bolded specifically said that regardless of the addition of cores/threads, speeds have also gone up dramatically. This is not true. It also isn't Moore's law.



Until you've played it, every game is a system seller!

the original trolls

Wii FC: 4810 9420 3131 7558
MHTri: name=BOo BoO/ID=BZBLEX/region=US

mini-games on consoles, cinematic games on handhelds, what's next? GameBoy IMAX?

Official Member of the Pikmin Fan Club

Dr.Grass said:
gumby_trucker said:
Dr.Grass said:
gumby_trucker said:
Dr.Grass said:
LordTheNightKnight said:
Dr.Grass said:
zarx said:
Viper1 said:

  PS3 wouldn't match the fill rate of 3 high end cards from 2002-2003.  Not even close.


PS3 RSX GPU fillrate 4000 MP/s pixel 12000 MT/s texture Nvidia's most powerful GPU as of 1st of jan 2004 3800 MP/s 3800 MT/s
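As an aside, zarx's figures pass a simple sanity check: fill rate is just core clock multiplied by the number of pixel or texture units. The sketch below assumes the commonly cited RSX configuration (500 MHz, 8 ROPs, 24 texture units); the unit counts are my assumption for illustration, not something stated in the post.

```python
# Fill rate = core clock (MHz) * number of parallel units.
# RSX figures below (500 MHz, 8 ROPs, 24 texture units) are the
# commonly cited specs, used here only as an illustration.

def fill_rate_mps(clock_mhz: float, units: int) -> float:
    """Fill rate in mega-operations/sec: clock in MHz times unit count."""
    return clock_mhz * units

pixel = fill_rate_mps(500, 8)     # 4000 MP/s pixel fill rate
texture = fill_rate_mps(500, 24)  # 12000 MT/s texture fill rate
print(pixel, texture)
```

Both results line up with the 4000 MP/s / 12000 MT/s numbers quoted above.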


Viper, I'm forced now to not take anything you say seriously again. You just don't know what you're talking about, yet somehow pass yourself off as an expert.


Viper1 was discussing a graphical leap, not a raw numbers leap. And Moore's law hit a brick wall years ago, which is why multi-core processors are the major advancements instead of processor speed.


Then he shouldn't start using numbers in his arguments. And the raw numbers go hand in hand with the graphical leap, so what's your point?

Moore's law hasn't hit a brick wall. The fact that multiple processors are used doesn't take away from the fact that the processing speeds of PCs have gone way up in accordance with the law.

This is simply not true. Transistors have indeed continued to shrink more or less as Moore predicted, but that in no way means performance has improved at the same pace. I'm no expert, but I understand there is a problem with pushing a transistor's switching speed much higher at these tiny scales, as the transistors become prone to "leaking" current through effects like quantum tunnelling. This is one of the reasons why so-called "processing speeds" (clock rates in MHz/GHz) have barely gone up in the last decade. Check Wikipedia and you'll see Pentium 4 processors at speeds above 3 GHz have been on the market as far back as 2002.

With the move to multi-threaded and later multi-core CPUs, it became possible to perform more tasks in parallel, so as far as the consumer is concerned they get more "bang" per MHz through parallelization. But this scales much worse than you might think: twice the cores is, in most practical cases, much less than twice the performance on any given application. Some of this can be recovered by fundamentally rewriting programs that were written to run serially, but even that process takes many years, and it's not always clear how much it benefits performance.
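The scaling point above is usually formalized as Amdahl's law: if only a fraction p of a program's work can run in parallel, n cores give an overall speedup of 1 / ((1 − p) + p/n). A minimal sketch (the 90%-parallel figure is an illustrative assumption, not a measurement of any real program):

```python
# Amdahl's law: speedup from n cores when only a fraction p of the
# work can be parallelized. Even p = 0.9 gives well under 2x on 2 cores.

def amdahl_speedup(p: float, n: int) -> float:
    """Overall speedup with n cores and parallel fraction p (0..1)."""
    return 1.0 / ((1.0 - p) + p / n)

print(round(amdahl_speedup(0.9, 2), 2))     # ~1.82x, not 2x
print(round(amdahl_speedup(0.9, 1000), 2))  # stays below 10x, no matter the cores
```

The serial 10% dominates: with p = 0.9 the speedup can never exceed 10x even with unlimited cores, which is exactly the "twice the cores is less than twice the performance" effect.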

 

http://www.intel.com/technology/mooreslaw/

 

That's Moore's law still applying right there. Here's more:

 

"Moore’s law isn’t tracking exactly, but the spirit of the law is still alive in that the dies are still shrinking, and CPUs become more and more capable every 12-18 months or so," said Joel Santo Domingo, lead analyst, desktops at PCMag.com. His former boss agrees. 

"I did the math, and while it’s not exactly doubling every two years, it’s pretty close," agreed Michael Miller, the award-winning math geek and former editor in chief of PCMag.com.  



Read more: http://www.foxnews.com/scitech/2011/01/04/years-later-does-moores-law-hold-true/#ixzz1Mq8tUOy2

EDIT: So if that graph up there is a straight line (on a log scale), then the growth is exponential and the law pretty much holds. It looks close enough to me.
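For anyone unsure why a straight line on that kind of graph implies exponential growth: transistor-count charts use a logarithmic vertical axis, so a constant doubling period shows up as a constant step in log space, i.e. a straight line. A quick sketch with a hypothetical series that doubles every two years (the 2300 starting count roughly matches the Intel 4004, but the series itself is illustrative):

```python
import math

# A straight line on a log-scale plot means the value is multiplied by a
# constant factor each period -- i.e. exponential growth. Hypothetical
# transistor counts doubling every 2 years, starting near the 4004's ~2300:
years = [1970, 1972, 1974, 1976, 1978]
counts = [2300 * 2**i for i in range(len(years))]

logs = [math.log2(c) for c in counts]
steps = [b - a for a, b in zip(logs, logs[1:])]
print(steps)  # each step is ~1.0 -> a straight line on a log axis
```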

Moore's law is that transistors shrink, not that switching gets faster. That was precisely my point and is in agreement with what your graph shows. Notice I said as much myself in my first sentence, and I didn't bold the part of your statement that referenced Moore's law.
 


The original point was that processing speed increases significantly all the time. That's what you had a problem with. That's why you say the next generation won't offer a significant leap over this one.


And you said yourself that Moore's law has hit a brick wall. 



gumby_trucker said:
*snip*

Don't confuse me with Viper1 or anyone else that was debating this topic with you. I simply "butted in" to say that what you refer to as processing speed has not increased significantly. The part of your post that I bolded specifically said that regardless of the addition of cores/threads, speeds have also gone up dramatically. This is not true. It also isn't Moore's law.

I don't always check who writes what. soz



Dr.Grass said:
Dr.Grass said:
*snip*

And you said yourself that Moore's law has hit a brick wall.

what the hell??

Are you trying to pull a "Duck Season/Rabbit Season" switch on me by quoting yourself instead of my last reply?

Now you're just trolling as I never said such a thing. If I did, show me exactly where I did.




Dr.Grass said:
gumby_trucker said:
*snip*

I don't always check who writes what. soz

Great way to have a conversation, pal...





"And you said yourself that Moore's law has hit a brick wall."

That was me. So you are confusing us.

And the brick wall I meant was about heat and other factors that prevent cramming in transistors at the pace the law predicted.
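The heat wall described above is usually explained with the dynamic-power relation P ≈ C·V²·f: once supply voltage stopped dropping with each process shrink, raising the clock meant raising power roughly in proportion, so vendors spent the transistor budget on more cores instead. A rough sketch, with purely illustrative numbers (not measurements of any real chip):

```python
# Dynamic CPU power roughly follows P = C * V^2 * f. When voltage (V)
# stopped shrinking with each process node, the only way to raise the
# clock (f) was to burn proportionally more power -- the "heat wall".
# All numbers below are illustrative assumptions.

def dynamic_power(c_farads: float, volts: float, freq_hz: float) -> float:
    """Dynamic switching power in watts: capacitance * voltage^2 * frequency."""
    return c_farads * volts**2 * freq_hz

base = dynamic_power(1e-9, 1.2, 3.0e9)  # a notional 3 GHz part
hot = dynamic_power(1e-9, 1.2, 6.0e9)   # same chip pushed to 6 GHz
print(hot / base)  # 2.0 -- doubling the clock doubles the power draw
```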



A flashy-first game is awesome when it comes out. A great-first game is awesome forever.

Plus, just for the hell of it: Kelly Brook at the 2008 BAFTAs

LordTheNightKnight said:
Squilliam said:

It definitely makes sense for them to use Blu-ray. It's probably the cheapest way to get dense, higher-speed media without resorting to flash.


What if they do resort to it?

Hmm, a return to flash cartridges? They'd probably still need to spend money on a DVD drive for backwards compatibility, unless they intend to have the Wii plug in as an accessory to the NES 6. Assuming that's the case, they could possibly use flash to forgo many of the software royalties from third parties and simply make money on the system. Then the price of flash wouldn't matter, because third parties would be free to choose what they want to pay for flash and what to charge for games.



Tease.

Squilliam said:
*snip*

Hmm, a return to flash cartridges? They'd probably still need to spend money on a DVD drive for backwards compatibility, unless they intend to have the Wii plug in as an accessory to the NES 6. Assuming that's the case, they could possibly use flash to forgo many of the software royalties from third parties and simply make money on the system. Then the price of flash wouldn't matter, because third parties would be free to choose what they want to pay for flash and what to charge for games.


They wouldn't really return as a) the N64 used a different kind of cart, and b) they've been sort of using flash carts with the DS systems already.

And I agree they would have to have a legacy optical drive (guess you missed me discussing it with others earlier in the thread).




LordTheNightKnight said:
Squilliam said:
*snip*

They wouldn't really return as a) the N64 used a different kind of cart, and b) they've been sort of using flash carts with the DS systems already.

And I agree they would have to have a legacy optical drive (guess you missed me discussing it with others earlier in the thread).

Well, the advantages of flash aren't as pronounced with stationary consoles. They'd have to change the business structure for it to make sense, and to do that they'd probably need to give up most of the physical-media royalties paid to the console manufacturer. You have to think about it in terms of an operating structure that allows it to work. So where they were charging $7 or whatever per game, they may only be able to charge $3 or $4 at most if they use flash.
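That royalty arithmetic can be sketched as a back-of-envelope: if a flash cartridge costs the publisher more per unit than a pressed disc, the platform royalty has to fall by the difference to keep the publisher's per-unit cost flat. The media costs below are assumptions for illustration; only the $7 and $3-4 royalty figures come from the posts above.

```python
# Back-of-envelope for the flash-vs-disc royalty argument. The media
# costs are assumed for illustration; only the $7 and $3-4 royalty
# figures come from the thread.

DISC_COST = 1.0     # assumed pressing cost per disc
FLASH_COST = 4.0    # assumed flash cartridge cost per unit
DISC_ROYALTY = 7.0  # per-game platform royalty from the post

# Royalty the platform could charge on flash while keeping the
# publisher's combined media-plus-royalty cost per unit unchanged:
flash_royalty = DISC_ROYALTY + DISC_COST - FLASH_COST
print(flash_royalty)  # 4.0 -- inside the $3-4 range suggested above
```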




snfr said:
cr00mz said:

does leaving out movie playback make it cheaper compared to if it could play movies?

As far as I know it made the Wii cheaper, but I'm not really sure...


The Wii doesn't use actual DVDs.  Nintendo developed a slightly different disc technology with Matsushita (Panasonic) that is very similar to DVD, because Nintendo didn't want to license DVD technology.  The idea is that it would cost more for the Wii to play DVDs (movies), though probably not a lot more.

Frankly, I don't think Nintendo will use Blu-ray.  If anything, it'll work with Matsushita again to develop its own Blu-ray analog, so it can have better piracy controls and doesn't have to pay Sony to license the Blu-ray medium.

For that matter, even Sony won't use Blu-ray next time.  Games like MGS4 and L.A. Noire have already maxed out the medium.  And Sony has never used the same media twice across its machines: CD, DVD, Blu-ray, UMD, and now the NGP will use a card format like the DS/3DS.