Gballzack said:
Wee, I get to be the whore of Babylon. |
Biblical references FTW.

Thanks to Blacksaber for the sig!
| Quantum-Tarantino said: Whoa there Dio, the BOX had a 766MHz CELERON processor. The CUBE had a 485MHz POWER PC processor. When comparing speeds between the 2, you need to AT LEAST double the MHz speed of the Power PC (that's for comparing to Pentiums, and Celerons are FAR less powerful). In short, the CPU on the CUBE was FAR better. |
Actually, to clear some things up: the Xbox does not have a 766MHz Celeron, it has a modified Pentium 3 running at 733MHz. Its main difference from a Celeron is an 8-way set-associative L2 cache instead of the Celeron's 4-way set-associative cache, which makes it roughly 10% faster than a Celeron at the same clock.
The Gamecube's CPU is based on a PowerPC 750, and a regular Celeron running at 733MHz beats a PowerPC 750 running at 500MHz. Since the Gamecube's PowerPC runs at only 485MHz (slower still than that 500MHz part) and the Xbox's CPU is about 10% faster than a Celeron, it's *extremely* likely that the Xbox's CPU is more powerful than the Gamecube's.
What I'm trying to say is that maybe you should give up this argument, because the Xbox has a more powerful GPU and CPU, more RAM, superior sound, built-in Ethernet, and a built-in hard drive. You don't have to take my word for it; here's a link to Anandtech to confirm it.
source: http://www.anandtech.com/showdoc.aspx?i=1566&p=14
Legend11 said:
Actually, to clear some things up: the Xbox does not have a 766MHz Celeron, it has a modified Pentium 3 running at 733MHz. Its main difference from a Celeron is an 8-way set-associative L2 cache instead of the Celeron's 4-way set-associative cache, which makes it roughly 10% faster than a Celeron at the same clock. The Gamecube's CPU is based on a PowerPC 750, and a regular Celeron running at 733MHz beats a PowerPC 750 running at 500MHz. Since the Gamecube's PowerPC runs at only 485MHz and the Xbox's CPU is about 10% faster than a Celeron, it's *extremely* likely that the Xbox's CPU is more powerful than the Gamecube's. What I'm trying to say is that maybe you should give up this argument, because the Xbox has a more powerful GPU and CPU, more RAM, superior sound, built-in Ethernet, and a built-in hard drive. You don't have to take my word for it; here's a link to Anandtech to confirm it. |
The Gekko is a custom processor based on the PowerPC G3 (750) architecture, but with an extended instruction set (about 50 added instructions), a larger L2 cache, and an enhanced bus to handle the fast 1T-SRAM in the Gamecube. If you're treating the Gekko as a stock 750, you're underselling it. At the time, the G3 was considered (approximately) 1.5 times as powerful per clock cycle as a Pentium 3, and the G4 was considered twice as powerful as a Pentium 4.
If you didn't realize it, this is the same PowerPC lineage that both the PS3 and Xbox 360 CPUs are based on, and by your logic my 3-year-old 3GHz Pentium 4 is more powerful than either the PS3's or Xbox 360's CPU; trust me, it isn't.
Entroper said:
The RAMDAC converts the contents of the framebuffer to an analog signal that can be carried over composite/component/whatever. For a digital 720p output, the RAMDAC is irrelevant, but it matters for component. The details of the Wii's RAMDAC have not been disclosed. Also, the Wii's graphics chip has an embedded framebuffer. If it's the same size as the Gamecube's embedded framebuffer, then it's 2 MB. That's enough space to double buffer a 640x480x24 display, just. It isn't enough for 720p, even single-buffered. But of course, the details of the Wii's GPU have not been disclosed, either. I think it's likely that the texture cache has been increased, and not the framebuffer, but that's pure speculation on my part. |
Interesting... did some more reading here:
http://wiinside.blogspot.com/2007/04/inside-wii.html
...and it mentions this:
"3 Megs embedded Ram Same ( Wii able to use A-Ram as Additional GPU/CPU Ram )"
The basic (embedded) frame buffer is the same size - but the GPU may be able to use external memory as well? Probably not for the frame buffer, though. It also mentions that the Wii is "locked" at 32-bit colour (the GC was 18-bit colour).
So I think you are right: not enough frame buffer memory for 720p. It may also not be fast enough(?) to read the frame buffer out at proper 720p.
When you consider that the Wii only has to push around 300k pixels per frame (versus over 2m for full 1080p!), you can see that a reduced fill rate still goes a long way - and can mean a lot more effects can be implemented.
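Rough numbers, for anyone checking: 640 x 480 = 307,200 pixels (~300k) per frame, while 1920 x 1080 = 2,073,600 (~2.07m) - roughly 6.75 times as many. At 60 frames per second that's about 18.4m pixels/sec of output at 480p versus about 124m at 1080p (before any overdraw), so the same fill rate stretches a lot further at 480p.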
Gesta Non Verba
Nocturnal is helping companies get cheaper game ratings in Australia:
Wii code: 2263 4706 2910 1099
Gballzack said:
Exactly, I also think that any of the big three abandoning their guns now would be a loss of face to the consumer that I don't think any of them could recover from in the next gen. |
I think everyone STILL underestimates Nintendo.
All companies have seen how well the Wii has done, and it will no doubt be a factor in the planning of future consoles.
But Nintendo will know this - and will use it as an opportunity to move in a different direction. Miyamoto was hinting at something pretty significant (at the last E3) but would not say more, except for one hint - "I want to move away from TV displays completely" (re: consoles).
...
I would love to see a Wii II in a few years (2009?), which would give Nintendo time to move in a completely different direction for the future.
What could the future hold? Holographic/3D displays? What about a projection system, to turn a room into something like a holodeck? With cameras completely monitoring all movements of the person?
Can games evolve beyond full VR?
| shams said: Interesting... did some more reading here: http://wiinside.blogspot.com/2007/04/inside-wii.html ...and it mentions this: "3 Megs embedded Ram Same ( Wii able to use A-Ram as Additional GPU/CPU Ram )" The basic (embedded) frame buffer is the same - but the GPU may be able to use external memory as well? Probably not for the frame buffer though. Also mentions that the Wii is "locked" at 32-bit colour (GC was 18bit colour). So I think you are right, not enough frame buffer memory for 720p. May also not be fast enough(?) to sample the frame buffer to achieve proper 720p? When you consider that the Wii only has to push around 300k pixels / frame (versus 2m for full 1080p!), you can see that a reduced fill rate is still very effective - and can mean a lot more effects can be implemented.
|
It should be fast enough to do 720p with similar effects to the Gamecube: 720p (1280x720) has about 2.67 times the pixels of 480p (720x480), and the Wii has 3x the fillrate of the Gamecube.
I don't have a Wii SDK, so I don't know the relationship between the RAMDAC and the embedded framebuffer. It may be that you render a frame into the embedded RAM, do any post-processing effects there, then copy it out to graphics memory for the RAMDAC (double buffering). There still isn't enough embedded framebuffer for one 720p frame (unless you use 16-bit color), but it may be possible to, for example, draw the top half of the frame, copy it out, then draw the bottom half. This may all be moot if the RAMDAC simply isn't up to the task. :)
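To put numbers on that (assuming the Wii keeps the Gamecube's roughly 2MB of embedded framebuffer memory, which hasn't been confirmed): a 24-bit 640x480 buffer is 640 x 480 x 3 ≈ 0.88MB, so double buffering (~1.76MB) just fits; a 24-bit 1280x720 buffer is 1280 x 720 x 3 ≈ 2.64MB, which doesn't fit even single-buffered; at 16-bit color, 1280 x 720 x 2 ≈ 1.76MB squeaks back in.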
Full stereoscopic 3D is something I've spoken about in the past, even having written my own stereoscopic graphics demo. It's absolutely bonkers, and I hope one of the three companies is able to do it and be successful with it. That's a console I'd pay $500 for -- Sony, are you listening?
| Entroper said: Full stereoscopic 3D is something I've spoken about in the past, even having written my own stereoscopic graphics demo. It's absolutely bonkers, and I hope one of the three companies is able to do it and be successful with it. That's a console I'd pay $500 for -- Sony, are you listening? |
I did red/blue using the OpenGL accumulation buffer -- unfortunately this means it only works on nVidia graphics cards, since nVidia seems to be the only company willing to implement the accumulation buffer. But you could do it on other platforms using offscreen render targets.
From a software perspective, it couldn't be simpler to implement stereoscopic 3D: you set up the camera for the left eye and draw the scene, then set up the camera for the right eye and draw the scene. My demo was extremely simple, just some Gouraud-shaded spaceships flying around with a skybox in the background, but it looked fantastic - the things floated right out of the screen and into your face. :) Unfortunately, it requires red/blue glasses, and it works best if you have a widescreen monitor and don't render to the edges of the screen.
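Roughly, the whole trick looks like this (a sketch in C with old fixed-function OpenGL; setupCamera() and drawScene() are just placeholders for whatever the app already has, and this version uses glColorMask for the red/blue split rather than the accumulation-buffer route I took):

    #include <GL/gl.h>

    /* Placeholders for the application's own camera and scene code. */
    void setupCamera(float eyeOffset);
    void drawScene(void);

    /* Anaglyph stereo sketch: render the scene once per eye,
       writing each eye into a different color channel. */
    void renderStereoFrame(float eyeSeparation)
    {
        glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);

        /* Left eye -> red channel only. */
        glColorMask(GL_TRUE, GL_FALSE, GL_FALSE, GL_TRUE);
        setupCamera(-eyeSeparation * 0.5f);   /* camera shifted left */
        drawScene();

        /* Right eye -> blue channel (add green for red/cyan glasses).
           Clear depth only, so the second pass doesn't wipe the first. */
        glClear(GL_DEPTH_BUFFER_BIT);
        glColorMask(GL_FALSE, GL_FALSE, GL_TRUE, GL_TRUE);
        setupCamera(+eyeSeparation * 0.5f);   /* camera shifted right */
        drawScene();

        glColorMask(GL_TRUE, GL_TRUE, GL_TRUE, GL_TRUE);   /* restore writes */
    }

All the tuning lives in that per-eye camera shift (and, if you want to be fancy, an asymmetric frustum instead of a simple offset).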
A console implementation would require dedicated hardware. Not shutter glasses, as these induce eye strain and headaches. You need a visor with two small LCD screens. Small, high-resolution LCD screens are getting cheap -- the DS sells for $130. The barrier to entry is that every player needs one, and they all need fillrate. Fortunately, these won't be HD-resolution screens. 640x480 is probably the limit, so a GPU not much more powerful than the Wii could handle up to 4 players. I think you could actually launch a system like this with one visor in the box in the $299-399 range in 2011.
HappySqurriel said:
The Gekko is a custom processor based on the PowerPC G3 (750) architecture, but with an extended instruction set (about 50 added instructions), a larger L2 cache, and an enhanced bus to handle the fast 1T-SRAM in the Gamecube. If you're treating the Gekko as a stock 750, you're underselling it. At the time, the G3 was considered (approximately) 1.5 times as powerful per clock cycle as a Pentium 3, and the G4 was considered twice as powerful as a Pentium 4. If you didn't realize it, this is the same PowerPC lineage that both the PS3 and Xbox 360 CPUs are based on, and by your logic my 3-year-old 3GHz Pentium 4 is more powerful than either the PS3's or Xbox 360's CPU; trust me, it isn't.
|
You should take a look at the link I posted.
Thanks Entroper.
Agree on the shutter glasses - they were always problematic. Interesting to hear your take on a console implementation with LCD visors. I always imagined it using a projection-type system: the left- and right-eye images are projected with light polarised in different directions (i.e. horizontal for the left, vertical for the right), and with glasses carrying correspondingly polarised filters each eye receives only its intended image. I think the method you described is much more likely, though, as it would be far cheaper.
Whatever they go with, let's hope it really launches - I've been hooked on 3D imagery since the first time I actually perceived depth looking at a friend's wedding photos.