
Wii vs. PS3/Xbox 360 [Technically]

makingmusic476 said:
sc94597 said:
makingmusic476 said:
fazz said:
makingmusic476 said:
fazz said:
makingmusic476 said:
 

I'm not saying that the Wii isn't more powerful than the PS2. It's much more powerful than the ps2. I even said so in the first line of the post you quoted. What I'm saying is that the difference between the ps3/360 and Wii is greater than the difference between the Wii and ps2.


Explain why my friend.

And while we're at it, let's post a first-gen PS2 game:

Done by none other than the graphics pioneers at Squaresoft


You chose that craptastic game to show off the ps2? Yet you use 1st party Nintendo titles to show off the Wii?

GT3, released less than a year after the ps2 came out in Japan:

Compared to MK Wii, a Nintendo title, released a year and a half after the Wii came out:

Compared to GT5P, released a year after the ps3 came out in Japan:

That is why, my friend.


I've always seen Sony fans praise Square for their ambition in the graphics department, that's why I chose that one...

And how can we compare the cartooniest racing game EVER with a realistic one? What's there to compare? I can see more detail, a higher polygon count and better texture resolution in the Kart one compared to the GT3 one.

And GT3 was released 13 months after PS2's launch... just saying.


First, I don't own a single Square game, and I've never played a single Final Fantasy. *puts up flame shield*

Second, you can notice things like polygons and texture resolution in any shot, regardless of artstyle, and you have done just that.

Third, yes, the MK shot looks better than the GT3 shot. I've said it time and time again, I'm not saying that the Wii isn't more powerful than the ps2. It is. Easily. I'm saying that the gap between the Wii and ps3/360 is larger than the gap between the ps2 and Wii. MKWii looks a lot better than GT3 (and GT4), but GT5P looks WAY better than MKWii.

And finally, yeah, I kinda switched April and March in my mind.


Um, realistic games are always meant to push the hardware, while cartoony games aren't. That's what fazz was getting at.


So you're saying games like SMG don't push the Wii's hardware? Or games like R&C for the ps3?


Notice the bolded part. This implies that they aren't always about graphics, while realistic games are. This doesn't mean that some cartoonish-looking games couldn't push the hardware.




 
sc94597 said:

selnor said:
fazz said:
makingmusic476 said:
Guys, the Wii is quite a bit more powerful than the ps2. However, you can't say that the difference between these two:





Equates to the difference between these two:





And let's not forget that GoW2 had much larger environments than SSBB, and could be played in 720p.


Yeah, 720p at a much lower framerate

Also, I have much better examples:

 

vs.

 

and

 

If someone can't see the difference in geometry, shaders and textures, they're really blind.

Now, let's get technical. The PlayStation 2 and the GameCube (we'll start with the GC) had very different architectures that required deep analysis to understand how they compared to each other. The PS2 had to resort to its rather powerful CPU to overcome its primitive GPU. Seeing that the GPU lacked MANY features that the Xbox and GC's GPUs had, the CPU had to brute-force all of those tasks, and therefore a lot of its processing power was wasted, nullifying its advantage in processing power.

First of all, let's go with RAM. The PS2 had 36MB to work with for graphics, while the GameCube had just 27MB (yes, the other 16MB could only be used for audio/DVD buffering). If the GC had less RAM, how could it have had better textures, models and effects? Answer: the GameCube had 6:1 S3 Texture Compression. This means that textures that would take 24MB on the PS2 would take just 4MB on the GC. So, effectively, the GC had around 4X more memory for textures. Now the Wii, which has a Unified Memory Architecture and can use all of its 88MB of RAM for anything the developer wants, would have around 12X more memory available for textures... and I'm not even taking into account that the Wii's memory is much faster than the Cube's; if I did, the difference would be even higher.
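(To make that texture-memory arithmetic concrete, here's a rough back-of-the-envelope sketch. The 36MB/27MB/88MB pools and the 6:1 compression ratio are the figures from the post above, not official specs; the rest is plain arithmetic.)

```python
# Back-of-the-envelope "effective texture memory" comparison, using the
# figures quoted above (36MB PS2, 27MB GC, 88MB Wii, 6:1 S3TC) -- these are
# the post's assumptions, not official specifications.

S3TC_RATIO = 6  # claimed 6:1 S3 Texture Compression on GC/Wii

ps2_mb, gc_mb, wii_mb = 36, 27, 88

gc_effective  = gc_mb  * S3TC_RATIO   # 162 MB-equivalent of uncompressed textures
wii_effective = wii_mb * S3TC_RATIO   # 528 MB-equivalent

print(f"GC  vs PS2: ~{gc_effective / ps2_mb:.1f}x")   # ~4.5x
print(f"Wii vs PS2: ~{wii_effective / ps2_mb:.1f}x")  # ~14.7x
```

Those ratios land in the same ballpark as the ~4X and ~12X quoted above.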

Oh, now that we're talking about RAM, and seeing that some people love Wikipedia facts so much: the Wii uses the very same GDDR3 RAM as the Xbox 360, on a (according to your beloved Wikipedia) 128-bit bus. The Wii's MEM2 (that's the name of the GDDR3 pool in a game development environment) can have as much bandwidth as the Xbox 360's. That's 22GB per second. Are we being too generous with that piece of crap called the Wii? OK, let's cut the speed in half, i.e. 700MHz effective. Now we have 11GB per second. That's over 3X the bandwidth of the Cube.
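(For reference, the 22GB/s and 11GB/s figures fall out of the usual peak-bandwidth formula, bus width times effective transfer rate. A quick sketch, assuming the 128-bit bus and the clock rates used in the post, which are not confirmed Wii specs:)

```python
# Peak memory bandwidth = (bus width in bytes) x (effective transfer rate).
# The 128-bit bus and the clock figures are the post's assumptions, not
# confirmed Wii specifications.

def peak_bandwidth_gb_s(bus_width_bits: int, effective_mhz: float) -> float:
    bytes_per_transfer = bus_width_bits / 8           # 128-bit -> 16 bytes
    return bytes_per_transfer * effective_mhz * 1e6 / 1e9

print(peak_bandwidth_gb_s(128, 1400))  # ~22.4 GB/s (full 360-class GDDR3 rate)
print(peak_bandwidth_gb_s(128, 700))   # ~11.2 GB/s (the "cut it in half" case)
```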

Want more? Everyone says "the Wii's GPU is just an overclocked GC GPU!"... but wrong, my little droogies. The Wii's GPU is radically different from the one in the Cube, so much so that it has now been split into two chips :shock: If Nintendo had just used a process-shrunk Flipper in the Wii, the chip would have taken up less than 30% of the physical space of one of the chips in the package. Way to waste money on unused chip space, Nintendo!... All sarcasm aside, we can assume that ATi has doubled the pixel pipelines, texture units, TEV units and/or embedded frame/texture buffers. Not to mention the TEV units have been improved to make them much more flexible and programmable, in order to achieve advanced shader effects... of which, BTW, the PS2 had none.
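(For anyone wondering what those TEV units actually do: each TEV stage is a small fixed-function color combiner, and chaining more, and more configurable, stages is what lets Flipper/Hollywood approximate shader-style effects. Below is a simplified model of a single combine stage as it's commonly described for the GX API; treat it as an illustration, not Nintendo documentation.)

```python
# Simplified model of one GameCube/Wii TEV (Texture EnVironment) combine
# stage, as commonly described for the GX API. Real hardware works on
# fixed-point per-channel values and can chain up to 16 such stages.

def tev_stage(a, b, c, d, bias=0.0, scale=1.0, subtract=False):
    """out = (d +/- ((1 - c)*a + c*b) + bias) * scale, clamped to [0, 1]."""
    lerp = (1.0 - c) * a + c * b
    out = (d - lerp) if subtract else (d + lerp)
    return max(0.0, min(1.0, (out + bias) * scale))

# Classic "modulate" setup: multiply a texture sample by the rasterized
# vertex color (a=0, b=texture, c=color, d=0 gives color * texture).
texture_sample, vertex_color = 0.8, 0.5
print(tev_stage(a=0.0, b=texture_sample, c=vertex_color, d=0.0))  # 0.4
```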

The bottom line: the Wii is as far from the PS2 as the Xbox 360 is from the Wii... but if you don't agree with me, you'll just ignore everything I said above ;>


So, for those of you who don't know what he's on about: the GPU in the Wii looks after the motion-sensing controls, so it does need to have more power.

Hollywood is the name of the Graphics Processing Unit (GPU) used in Nintendo's Wii video game console. It was designed by ATI Technologies and is manufactured using the same 90 nm CMOS process[1] as the "Broadway" processor. Very few official details have been released to the public by Nintendo, ATI, or IBM. Unofficial reports claim that it is an enhanced revision of the 162 MHz Nintendo GameCube LSI "Flipper" and that it is clocked 50% faster at 243 MHz. None of the clock rates have been confirmed by Nintendo, IBM, or ATI. It is not known how many pixel pipelines or shader units Hollywood possesses.[2]

The Hollywood is a Multi-Chip Module package composed of two dies within the cover. One of the two chips, codenamed Napa, controls the I/O functions, RAM access, and the actual GPU with its 3 MB of embedded DRAM (1MB texture cache, 2MB framebuffer), and measures 8 × 9 mm. The other, codenamed Vegas, holds the Audio DSP and the 24 MB of "internal" 1T-SRAM and measures 13.5 × 7 mm. Hollywood has a 128-bit memory interface between the VRAM and GPU.[3]
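(As a quick sanity check on those numbers: a 640x480 frame with 24-bit color plus a 24-bit Z-buffer fits comfortably inside that 2MB embedded framebuffer. Ballpark arithmetic only; the actual on-chip formats are more involved.)

```python
# Rough check that a 480p frame fits in the ~2MB embedded framebuffer
# described above, assuming 24-bit color + 24-bit depth per pixel
# (ballpark only; the real embedded formats are more involved).

width, height = 640, 480
bytes_per_pixel = 3 + 3               # 24-bit color + 24-bit Z
frame_bytes = width * height * bytes_per_pixel

print(frame_bytes / 2**20)            # ~1.76 MiB -- inside the 2MB budget
```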

The Hollywood also contains an ARM926 core, which has been unofficially nicknamed the Starlet[4]. This embedded microprocessor performs many of the I/O functions, including controlling the wireless functionality, USB, the disc drive, and other miscellaneous functions. It also acts as the security controller of the system, performing encryption and authentication functions. The Hollywood includes hardware implementations of AES and SHA-1, to speed up these functions. Communication with the main CPU is accomplished via an IPC mechanism. The Starlet performs the WiiConnect24 functions while the Wii console is in standby mode.

He's talking about those, not the extra ARM processor.


 

My point was that the Hollywood has more power than the Flipper. I know what he was talking about. My post was so people could see how the Hollywood is set up.



selnor said:


sc94597 said:

selnor said:
fazz said:
makingmusic476 said:




Yeah, 720p at a much lower framerate

Also, I have much better examples:

 

vs.

 

and

 

If someone can't see the difference in geometry, shaders and textures, they're really blind.

Now, let's get technical. The PlayStation 2 and the GameCube (we'll start with the GC) had very different architectures that required deep analysis to understand how they compared to each other. The PS2 had to resort to its rather powerful CPU to overcome its primitive GPU. Seeing that the GPU lacked MANY features that the Xbox and GC's GPUs had, the CPU had to brute-force all of those tasks, and therefore a lot of its processing power was wasted, nullifying its advantage in processing power.

First of all, let's go with RAM. The PS2 had 36MB to work with for graphics, while the GameCube had just 27MB (yes, the other 16MB could only be used for audio/DVD buffering). If the GC had less RAM, how could it have had better textures, models and effects? Answer: the GameCube had 6:1 S3 Texture Compression. This means that textures that would take 24MB on the PS2 would take just 4MB on the GC. So, effectively, the GC had around 4X more memory for textures. Now the Wii, which has a Unified Memory Architecture and can use all of its 88MB of RAM for anything the developer wants, would have around 12X more memory available for textures... and I'm not even taking into account that the Wii's memory is much faster than the Cube's; if I did, the difference would be even higher.

Oh, now that we're talking about RAM, and seeing that some people love Wikipedia facts so much: the Wii uses the very same GDDR3 RAM as the Xbox 360, on a (according to your beloved Wikipedia) 128-bit bus. The Wii's MEM2 (that's the name of the GDDR3 pool in a game development environment) can have as much bandwidth as the Xbox 360's. That's 22GB per second. Are we being too generous with that piece of crap called the Wii? OK, let's cut the speed in half, i.e. 700MHz effective. Now we have 11GB per second. That's over 3X the bandwidth of the Cube.

Want more? Everyone says "the Wii's GPU is just an overclocked GC GPU!"... but wrong, my little droogies. The Wii's GPU is radically different from the one in the Cube, so much so that it has now been split into two chips :shock: If Nintendo had just used a process-shrunk Flipper in the Wii, the chip would have taken up less than 30% of the physical space of one of the chips in the package. Way to waste money on unused chip space, Nintendo!... All sarcasm aside, we can assume that ATi has doubled the pixel pipelines, texture units, TEV units and/or embedded frame/texture buffers. Not to mention the TEV units have been improved to make them much more flexible and programmable, in order to achieve advanced shader effects... of which, BTW, the PS2 had none.

The bottom line: the Wii is as far from the PS2 as the Xbox 360 is from the Wii... but if you don't agree with me, you'll just ignore everything I said above ;>


So, for those of you who don't know what he's on about: the GPU in the Wii looks after the motion-sensing controls, so it does need to have more power.

Hollywood is the name of the Graphics Processing Unit (GPU) used in Nintendo's Wii video game console. It was designed by ATI Technologies and is manufactured using the same 90 nm CMOS process[1] as the "Broadway" processor. Very few official details have been released to the public by Nintendo, ATI, or IBM. Unofficial reports claim that it is an enhanced revision of the 162 MHz Nintendo GameCube LSI "Flipper" and that it is clocked 50% faster at 243 MHz. None of the clock rates have been confirmed by Nintendo, IBM, or ATI. It is not known how many pixel pipelines or shader units Hollywood possesses.[2]

The Hollywood is a Multi-Chip Module package composed of two dies within the cover. One of the two chips, codenamed Napa, controls the I/O functions, RAM access, and the actual GPU with its 3 MB of embedded DRAM (1MB texture cache, 2MB framebuffer), and measures 8 × 9 mm. The other, codenamed Vegas, holds the Audio DSP and the 24 MB of "internal" 1T-SRAM and measures 13.5 × 7 mm. Hollywood has a 128-bit memory interface between the VRAM and GPU.[3]

The Hollywood also contains an ARM926 core, which has been unofficially nicknamed the Starlet[4]. This embedded microprocessor performs many of the I/O functions, including controlling the wireless functionality, USB, the disc drive, and other miscellaneous functions. It also acts as the security controller of the system, performing encryption and authentication functions. The Hollywood includes hardware implementations of AES and SHA-1, to speed up these functions. Communication with the main CPU is accomplished via an IPC mechanism. The Starlet performs the WiiConnect24 functions while the Wii console is in standby mode.

He's talking about those, not the extra ARM processor.


 

My point was that the Hollywood has more power than the Flipper. I know what he was talking about. My post was so people could see how the Hollywood is set up.

I don't understand why YOU would want to help prove the case that the Wii is significantly more powerful than the PS2, and maybe to the extent where the difference is similar to the difference between the Wii and the HD consoles.

Btw, I thought you were just trying to disprove it by saying that the extra ARM processor was only for motion controls, USB processes, etc. Sorry for misinterpreting what you were trying to do.

 



makingmusic476 said:
fazz said:
makingmusic476 said:
fazz said:
makingmusic476 said:
fazz said:
makingmusic476 said:
Guys, the Wii is quite a bit more powerful than the ps2. However, you can't say that the difference between these two:

And let's not forget that GoW2 had much larger environments than SSBB, and could be played in 720p.


Yeah, 720p at a much lower framerate

Also, I have much better examples:

vs.

and

If someone can't see the difference in geometry, shaders and textures, they're really blind.

Now, let's get technical. The PlayStation 2 and the GameCube (we'll start with the GC) had very different architectures that required deep analysis to understand how they compared to each other. The PS2 had to resort to its rather powerful CPU to overcome its primitive GPU. Seeing that the GPU lacked MANY features that the Xbox and GC's GPUs had, the CPU had to brute-force all of those tasks, and therefore a lot of its processing power was wasted.

First of all, let's go with RAM. The PS2 had 36MB to work with for graphics, while the GameCube had just 27MB (yes, the other 16MB could only be used for audio/DVD buffering). If the GC had less RAM, how could it have had better textures, models and effects? Answer: the GameCube had 6:1 S3 Texture Compression. This means that textures that would take 24MB on the PS2 would take just 4MB on the GC. So, effectively, the GC had around 4X more memory for textures. Now the Wii, which has a Unified Memory Architecture and can use all of its 88MB of RAM for anything the developer wants, would have around 12X more memory available for textures... and I'm not even taking into account that the Wii's memory is much faster than the Cube's; if I did, the difference would be even higher.

Oh, now that we're talking about RAM, and seeing that some people love Wikipedia data so much: the Wii uses the very same GDDR3 RAM as the Xbox 360, on a (according to your beloved Wikipedia) 128-bit bus. The Wii's MEM2 (that's the name of the GDDR3 pool in a game development environment) can have as much bandwidth as the Xbox 360's. That's 22GB per second. Are we being too generous with that piece of crap called the Wii? OK, let's cut the speed in half, i.e. 700MHz effective. Now we have 11GB per second. That's over 3X the bandwidth of the Cube.

Want more? Everyone says "the Wii's GPU is just an overclocked GC GPU!"... but wrong, my little droogies. The Wii's GPU is radically different from the one in the Cube, so much so that it has now been split into two chips :shock: If Nintendo had just used a process-shrunk Flipper in the Wii, the chip would have taken up less than 30% of the physical space of one of the chips in the package. Way to waste money on unused chip space, Nintendo!... All sarcasm aside, we can assume that ATi has doubled the pixel pipelines, texture units, TEV units and/or embedded frame/texture buffers. Not to mention the TEV units have been improved to make them much more flexible and programmable, in order to achieve advanced shader effects... of which, BTW, the PS2 had none.


I'm not saying that the Wii isn't more powerful than the PS2. It's much more powerful than the ps2. I even said so in the first line of the post you quoted. What I'm saying is that the difference between the ps3/360 and Wii is greater than the difference between the Wii and ps2.


Explain why my friend.

And while we're at it, let's post a first-gen PS2 game:

Done by none other than the graphics pioneers at Squaresoft


You chose that craptastic game to show off the ps2? Yet you use 1st party Nintendo titles to show off the Wii?

GT3, released less than a year after the ps2 came out in Japan:

Compared to MK Wii, a Nintendo title, released a year and a half after the Wii came out:

Compared to GT5P, released a year after the ps3 came out in Japan:

That is why, my friend.


I've always seen Sony fans praise Square for their ambition in the graphics department, that's why I chose that one...

And how can we compare the cartooniest racing game EVER with a realistic one? What's there to compare? I can see more detail, a higher polygon count and better texture resolution in the Kart one compared to the GT3 one.

And GT3 was released 13 months after PS2's launch... just saying.


First, I don't own a single Square game, and I've never played a single Final Fantasy. *puts up flame shield*

Second, you can notice things like polygons and texture resolution in any shot, regardless of artstyle, and you have done just that.

Third, yes, the MK shot looks better than the GT3 shot. I've said it time and time again, I'm not saying that the Wii isn't more powerful than the ps2. It is. Easily. I'm saying that the gap between the Wii and ps3/360 is larger than the gap between the ps2 and Wii. MKWii looks a lot better than GT3 (and GT4), but GT5P looks WAY better than MKWii.

And finally, yeah, I kinda switched April and March in my mind.


I still see the difference between Kart and GT3 as a big one.

Easier comparison:

@selnor

The Starlet processor is the one handling the motion controllers, and it's separate from the TWO other big chips in the GPU package. The Starlet is a really small chip in a corner; open up your Wii and check it. It DOESN'T handle anything related to graphics.

EDIT: Changed all three pictures by request from makingmusic476 



Metroid Prime 3 versus two launch titles? And cutscenes versus gameplay?

Why not use Call of Duty 4 for a better comparison:



sc94597 said:
selnor said:


sc94597 said:

selnor said:
fazz said:
makingmusic476 said:




Yeah, 720p at a much lower framerate

Also, I have much better examples:

 

vs.

 

and

 

If someone can't see the difference in geometry, shaders and textures, they're really blind.

Now, let's get technical. The PlayStation 2 and the GameCube (we'll start with the GC) had very different architectures that required deep analysis to understand how they compared to each other. The PS2 had to resort to its rather powerful CPU to overcome its primitive GPU. Seeing that the GPU lacked MANY features that the Xbox and GC's GPUs had, the CPU had to brute-force all of those tasks, and therefore a lot of its processing power was wasted, nullifying its advantage in processing power.

First of all, let's go with RAM. The PS2 had 36MB to work with for graphics, while the GameCube had just 27MB (yes, the other 16MB could only be used for audio/DVD buffering). If the GC had less RAM, how could it have had better textures, models and effects? Answer: the GameCube had 6:1 S3 Texture Compression. This means that textures that would take 24MB on the PS2 would take just 4MB on the GC. So, effectively, the GC had around 4X more memory for textures. Now the Wii, which has a Unified Memory Architecture and can use all of its 88MB of RAM for anything the developer wants, would have around 12X more memory available for textures... and I'm not even taking into account that the Wii's memory is much faster than the Cube's; if I did, the difference would be even higher.

Oh, now that we're talking about RAM, and seeing that some people love Wikipedia facts so much: the Wii uses the very same GDDR3 RAM as the Xbox 360, on a (according to your beloved Wikipedia) 128-bit bus. The Wii's MEM2 (that's the name of the GDDR3 pool in a game development environment) can have as much bandwidth as the Xbox 360's. That's 22GB per second. Are we being too generous with that piece of crap called the Wii? OK, let's cut the speed in half, i.e. 700MHz effective. Now we have 11GB per second. That's over 3X the bandwidth of the Cube.

Want more? Everyone says "the Wii's GPU is just an overclocked GC GPU!"... but wrong, my little droogies. The Wii's GPU is radically different from the one in the Cube, so much so that it has now been split into two chips :shock: If Nintendo had just used a process-shrunk Flipper in the Wii, the chip would have taken up less than 30% of the physical space of one of the chips in the package. Way to waste money on unused chip space, Nintendo!... All sarcasm aside, we can assume that ATi has doubled the pixel pipelines, texture units, TEV units and/or embedded frame/texture buffers. Not to mention the TEV units have been improved to make them much more flexible and programmable, in order to achieve advanced shader effects... of which, BTW, the PS2 had none.

The bottom line: the Wii is as far from the PS2 as the Xbox 360 is from the Wii... but if you don't agree with me, you'll just ignore everything I said above ;>


So, for those of you who don't know what he's on about: the GPU in the Wii looks after the motion-sensing controls, so it does need to have more power.

Hollywood is the name of the Graphics Processing Unit (GPU) used in Nintendo's Wii video game console. It was designed by ATI Technologies and is manufactured using the same 90 nm CMOS process[1] as the "Broadway" processor. Very few official details have been released to the public by Nintendo, ATI, or IBM. Unofficial reports claim that it is an enhanced revision of the 162 MHz Nintendo GameCube LSI "Flipper" and that it is clocked 50% faster at 243 MHz. None of the clock rates have been confirmed by Nintendo, IBM, or ATI. It is not known how many pixel pipelines or shader units Hollywood possesses.[2]

The Hollywood is a Multi-Chip Module package composed of two dies within the cover. One of the two chips, codenamed Napa, controls the I/O functions, RAM access, and the actual GPU with its 3 MB of embedded DRAM (1MB texture cache, 2MB framebuffer), and measures 8 × 9 mm. The other, codenamed Vegas, holds the Audio DSP and the 24 MB of "internal" 1T-SRAM and measures 13.5 × 7 mm. Hollywood has a 128-bit memory interface between the VRAM and GPU.[3]

The Hollywood also contains an ARM926 core, which has been unofficially nicknamed the Starlet[4]. This embedded microprocessor performs many of the I/O functions, including controlling the wireless functionality, USB, the disc drive, and other miscellaneous functions. It also acts as the security controller of the system, performing encryption and authentication functions. The Hollywood includes hardware implementations of AES and SHA-1, to speed up these functions. Communication with the main CPU is accomplished via an IPC mechanism. The Starlet performs the WiiConnect24 functions while the Wii console is in standby mode.

He's talking about those, not the extra ARM processor.


 

My point was that the Hollywood has more power than the Flipper. I know what he was talking about. My post was so people could see how the Hollywood is set up.

I don't understand why YOU would want to help prove the case that the Wii is significantly more powerful than the PS2, and maybe to the extent where the difference is similar to the difference between the Wii and the HD consoles.

Btw, I thought you were just trying to disprove it by saying that the extra ARM processor was only for motion controls, USB processes, etc. Sorry for misinterpreting what you were trying to do.

 

First of all, I love the Wii. It's a breath of fresh air. All along I've said that the Wii isn't about graphics. Yes, it's more powerful than last gen, but on screen there isn't a major jump. I know that as time goes on the games will look better, but I don't think we will see much better than Galaxy or SSB. But I don't play the Wii for the graphics. I play for Shigsy magic. Would I like to see Mario in HD? Yeah, why not. I think it would only enhance the experience, but it doesn't bother me that it's not. Same with loads and loads of detail. It would enhance Mario, but it doesn't bother me. I'm just saying, don't make the Wii something it's not.

 



makingmusic476 said:


Even with CoD4, the difference is very similar to the difference between the PS2 FPS and Prime 3.



fazz said:
makingmusic476 said:
fazz said:
makingmusic476 said:
fazz said:
makingmusic476 said:
fazz said:
makingmusic476 said:
Guys, the Wii is quite a bit more powerful than the ps2. However, you can't say that the difference between these two:

And let's not forget that GoW2 had much larger environments than SSBB, and could be played in 720p.


Yeah, 720p at a much lower framerate

Also, I have much better examples:

vs.

and

If someone can't see the difference in geometry, shaders and textures, they're really blind.

Now, let's get technical. The PlayStation 2 and the GameCube (we'll start with the GC) had very different architectures that required deep analysis to understand how they compared to each other. The PS2 had to resort to its rather powerful CPU to overcome its primitive GPU. Seeing that the GPU lacked MANY features that the Xbox and GC's GPUs had, the CPU had to brute-force all of those tasks, and therefore a lot of its processing power was wasted.

First of all, let's go with RAM. The PS2 had 36MB to work with for graphics, while the GameCube had just 27MB (yes, the other 16MB could only be used for audio/DVD buffering). If the GC had less RAM, how could it have had better textures, models and effects? Answer: the GameCube had 6:1 S3 Texture Compression. This means that textures that would take 24MB on the PS2 would take just 4MB on the GC. So, effectively, the GC had around 4X more memory for textures. Now the Wii, which has a Unified Memory Architecture and can use all of its 88MB of RAM for anything the developer wants, would have around 12X more memory available for textures... and I'm not even taking into account that the Wii's memory is much faster than the Cube's; if I did, the difference would be even higher.

Oh, now that we're talking about RAM, and seeing that some people love Wikipedia data so much: the Wii uses the very same GDDR3 RAM as the Xbox 360, on a (according to your beloved Wikipedia) 128-bit bus. The Wii's MEM2 (that's the name of the GDDR3 pool in a game development environment) can have as much bandwidth as the Xbox 360's. That's 22GB per second. Are we being too generous with that piece of crap called the Wii? OK, let's cut the speed in half, i.e. 700MHz effective. Now we have 11GB per second. That's over 3X the bandwidth of the Cube.

Want more? Everyone says "the Wii's GPU is just an overclocked GC GPU!"... but wrong, my little droogies. The Wii's GPU is radically different from the one in the Cube, so much so that it has now been split into two chips :shock: If Nintendo had just used a process-shrunk Flipper in the Wii, the chip would have taken up less than 30% of the physical space of one of the chips in the package. Way to waste money on unused chip space, Nintendo!... All sarcasm aside, we can assume that ATi has doubled the pixel pipelines, texture units, TEV units and/or embedded frame/texture buffers. Not to mention the TEV units have been improved to make them much more flexible and programmable, in order to achieve advanced shader effects... of which, BTW, the PS2 had none.


I'm not saying that the Wii isn't more powerful than the PS2. It's much more powerful than the ps2. I even said so in the first line of the post you quoted. What I'm saying is that the difference between the ps3/360 and Wii is greater than the difference between the Wii and ps2.


Explain why my friend.

And while we're at it, let's post a first-gen PS2 game:

Done by none other than the graphics pioneers at Squaresoft


You chose that craptastic game to show off the ps2? Yet you use 1st party Nintendo titles to show off the Wii?

GT3, released less than a year after the ps2 came out in Japan:

Compared to MK Wii, a Nintendo title, released a year and a half after the Wii came out:

Compared to GT5P, released a year after the ps3 came out in Japan:

That is why, my friend.


I've always seen Sony fans praise Square for their ambition in the graphics department, that's why I chose that one...

And how can we compare the cartooniest racing game EVER with a realistic one? What's there to compare? I can see more detail, a higher polygon count and better texture resolution in the Kart one compared to the GT3 one.

And GT3 was released 13 months after PS2's launch... just saying.


First, I don't own a single Square game, and I've never played a single Final Fantasy. *puts up flame shield*

Second, you can notice things like polygons and texture resolution in any shot, regardless of artstyle, and you have done just that.

Third, yes, the MK shot looks better than the GT3 shot. I've said it time and time again, I'm not saying that the Wii isn't more powerful than the ps2. It is. Easily. I'm saying that the gap between the Wii and ps3/360 is larger than the gap between the ps2 and Wii. MKWii looks a lot better than GT3 (and GT4), but GT5P looks WAY better than MKWii.

And finally, yeah, I kinda switched April and March in my mind.


I still see the difference between Kart and GT3 as a big one.

Easier comparison:

@selnor

The Starlet processor is the one handling the motion controllers, and it's separate from the TWO other big chips in the GPU package. The Starlet is a really small chip in a corner; open up your Wii and check it. It DOESN'T handle anything related to graphics.

I know that. Why are you telling me? That's why I posted it. It clearly states that there are three cores: one for graphics, one for audio, one for sensors.

 



sc94597 said:
makingmusic476 said:


Even with CoD4, the difference is very similar to the difference between the PS2 and Prime 3.


Again, launch title! And the average scene in MP3 does not look like that cutscene.

My comparison was more valid, with all the games being first-party and coming out 1-1.5 years after launch. I can't think of any shooters that hit a year after the PS2 launched to fit the comparison above.



makingmusic476 said:

Metroid Prime 3 versus two launch titles? And cutscenes versus gameplay?

Why not use Call of Duty 4 for a better comparison:


I'm sorry, but wasn't MP3 supposed to be a launch title that had to be delayed because Retro needed more time with the controls? So take out the controls... and you have MP3 as a launch title, no?