
Wii's graphically superior to anything last generation!

I still don't see what the point of this whole discussion is... you don't need an HDTV to see the difference in graphics.



THe oNLY TRue STuPiDiTY iS THe aCCePTaNCe oF iGNoRaNCe 

PSNTAG K_I_N_G__COKE

  The King Of The Iron Fist tournament

windbane said:

Joelcool7, you are delusional.

Gamecube: 485MHz CPU, 24MB RAM, 162MHz GPU, 3MB GPU RAM, 1.5GB per disc

PS2: 294MHz CPU, 32MB RAM, 147MHz GPU, 4MB GPU RAM, 4.7GB per disc

Dreamcast: 200MHz CPU, 16MB RAM, GPU speed not listed on Wikipedia, 8MB GPU RAM, 1GB per disc

Xbox: 733MHz CPU, 64MB RAM split between CPU and GPU, 233MHz GPU, 4.7GB per disc

Wii: 729MHz CPU, 24MB RAM, 243MHz GPU, 64MB GPU RAM, 4.7GB per disc

PS3: 3.2GHz CPU and six 3.2GHz SPEs, 256MB RAM, 550MHz GPU, 256MB GPU RAM, 50GB per disc

Xbox360: three 3.2GHz CPU cores, 512MB RAM split between CPU and GPU, 500MHz GPU, 9.4GB per disc

 


At some point I was happy that arguing about system power was over for the current gen, or at least for the Wii. But what do I see... OK, so it's Wikipedia again. The GC had some RAM shared between Gekko and Flipper so the two could work together, and I think it works that way in the Wii also. The Wii can use dual-layer discs, so the storage capacity of its media is equal to the 360's.

It's pretty shitty to compare specs between different architectures; it's not just about raw performance but also about how it is used. A sports car with 100kW of power has better performance than a truck with 400kW. But try to pull a 20-ton load with a 400kW sports car, when it's easy for a 100kW truck.

Not the Chinese way.

Nor the Japanese way.

But we'll kick it off in style, the Korean way.

 

Nintendo games sell only on Nintendo systems.

johnlucas said:
The problem with people talking about games "looking good" is one of disconnect from reality. Outside of that Spiderman 3 Wii version someone put up in some other thread, there ARE no "bad looking games" anymore. In the 5th gen you had some games that weren't the best-looking or best-playing (usually cheap market tie-ins). But in the 6th gen even Mario Party 4, though one of the first on the system, looked good. Luigi's Mansion looked good for a first-edition Gamecube title. It was clear, distinct, and had personality. They got better with the scanning lines in later titles and made the color strips look more seamless. I talked about the PS2, the weakest of the 6th gen systems, looking wonderful to me while visiting a friend's house playing Tekken 5.

It reminds me of Howard Stern and how he and those jackoffs judge the women on there for not having an absolutely perfect 45-degree angle on this or that body part. "Oops, that's 45.2 degrees. Not good enough." Delusional standards is what I'm getting at.

Graphic power won't REALLY matter like it has in the past until virtual reality is achieved. Not until gamelife is nearly inseparable from real life will it have the same true wow factor as in the past. Which is probably not a good thing, if you think about it, blurring those lines. Final Fantasy: The Spirits Within showed the folly of trying to match realism pound for pound. Actually, the closer you get to realism the more you notice the errors. Until they get the eyes looking "alive" it all falls apart. Too perfect. Too shiny. Too idealized. It bombed at the box office because it was a "fake realism", ironically, and that move sent Square reeling, having to merge with Enix. The same argument goes for breast implants. LOL!

Graphics have a long way to go before attempting a second reality, so the graphics lust needs to be kept in check, ego-wise. They are games, and they can be stylized depictions of realism. Keyword: stylized. Artistic license to make abstract or less abstract illustrations of reality. Art. Even the paintings of Michelangelo LOOK like painted depictions of people, not actual people. Even the most realistic drawing artists miss elements of full reality, as close to realism as their depictions come. They still look like stylized depictions, not the actual thing. Photography comes closest to artistically reproducing reality, but even it has its limits, and many times the picture is touched up to work out the less-than-desirable features.

Graphics matter, yes, but unless the developer is a lazy "arse" you won't find any REALLY "bad" graphics anymore. This ain't Atari & E.T., for goodness sake.

John Lucas

There are still bad looking games because of the decisions the people making them make; the art direction can make a game look crappy no matter what resolution it's in. That playground game by EA looks pretty awful because of the way they chose to make it look...



Thanks to Blacksaber for the sig!

bdbdbd said:

It's pretty shitty to compare specs between different architectures; it's not just about raw performance but also about how it is used.
A sports car with 100kW of power has better performance than a truck with 400kW. But try to pull a 20-ton load with a 400kW sports car, when it's easy for a 100kW truck.

Back in the '80s and early '90s, "bits" was the meaningless stat everyone used to gauge processing power. Intel pushed "MHz" because their processors ran at higher clock speeds and were faster overall, even though AMD and Cyrix processors were generally faster per MHz. Now it seems like everyone is pushing "cores" as the meaningless measure of processing power...

The truth is that "processing power" is far more complicated than that and cannot be easily measured across architectures. Hypothetically speaking, a processor designed for game workloads, where the size (and cost) of the chip were not issues, could have an extra-wide bus (1024 or 2048 bits), 48 64-bit registers for holding three matrices, and built-in matrix multiplication/addition/subtraction instructions.

Matrix-matrix and vector-matrix multiplication are done far more often than anything else in a modern 3D game. On an old architecture (like the N64), a 4x4 matrix multiplication requires 32 memory moves, 64 floating-point multiplications, and 48 floating-point additions. With this hypothetical processor that could be reduced to 1 or 2 memory moves, 1 floating-point multiply instruction (64 multiplications at the exact same time), and 2 floating-point additions. Comparing these two very different architectures, the "old" one would need to run far faster, and have far more cores, to match the theoretical processor I described.
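To make those operation counts concrete, here is a minimal C sketch of the naive scalar version (generic illustrative code, not N64 or console-specific):

```c
#include <stddef.h>

/* Naive 4x4 matrix multiply: each of the 16 output elements needs
 * 4 multiplications and 3 additions, i.e. 64 multiplies and 48 adds
 * total, plus the memory traffic for all three matrices. */
void mat4_mul(const float a[4][4], const float b[4][4], float out[4][4])
{
    for (size_t i = 0; i < 4; i++) {
        for (size_t j = 0; j < 4; j++) {
            float sum = a[i][0] * b[0][j];   /* first of 4 multiplies */
            for (size_t k = 1; k < 4; k++)
                sum += a[i][k] * b[k][j];    /* 3 more multiply-adds  */
            out[i][j] = sum;
        }
    }
}
```

The hypothetical wide processor above would fold that whole triple loop into a handful of instructions, which is exactly why raw MHz comparisons across architectures mislead.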



windbane said:

This is getting very ridiculous. Apple switched to Intel because they were slower for years. Much more expensive? Are you serious? The one good part of Macs was that they were affordable, because their CPU was so outdated. I've built computers for years, I've looked at benchmarks for years, and I can assure you that Macs were MUCH SLOWER than Intel CPUs at any point in time for at least the last decade. Why do you think gamers didn't buy them? And if you want to talk supercomputers, the Cell is used for those now. Supercomputers 10 years ago aren't that fast and do not indicate consumer-level prices.

The PowerPC CPU did not undergo any sort of revolution like the Core Duo. So the architecture may have improved, but not that much. I'm not sure why you guys are equating PowerPC CPUs with the architecture improvements of, first, AMD, which started the 1500+, 1600+, etc. line of model numbers that estimated clock-speed equivalents, and then Intel's Core Duo, which increased performance by 30%. If PowerPCs really got that much faster, show me some benchmarks.

Please don't twist my statements around. I said that the PPC was faster than a Pentium 3 at the same clockspeed. Show me a benchmark that shows a Pentium 3 or Pentium 4 outperforming a PPC at the same clockspeed. The Pentium architecture was optimized for higher clockspeeds, so those chips can reach much higher clockspeeds, but at the same clockspeed they suck. And back to the original statement: I said that the Wii must be faster than the Xbox if they have nearly the same clockspeed, because I know the Xbox had a Pentium 3 architecture and the Wii has a PPC.

Even if the PowerPC was as efficient as the Intel Core Duo (the best processor in the world right now), it would only be about 50% better at the same clock speed. Yeah, that's a great improvement, but if you are still below 800MHz there's only so much better it can be.

The Core is another beast. Intel had had the architecture around for many years; they used it in the Pentium M. As they could not further increase the clockspeed of the NetBurst architecture (Pentium 4) because of the heat produced, they switched to the architecture of the Pentium M, now calling it Core.

Also, I LOVE how you guys are ignoring the other benchmarks. Look at the freaking GPU numbers on the Wii! It's barely better than the previous generation as well. The RAM is an even sadder situation. So we can continue this architecture argument until someone shows some benchmarks, but you can't change the other facts.

Oh wait, next you'll be telling me that the GPU and RAM are SUPER-MEGA-OPTIMIZED LIKE NOTHING BEFORE, because for some reason Nintendo holds the secrets of computing and is selling it at a loss. Oh wait, their Wii is worth around $150. That's right, I forgot.


Don't get the impression that I said the Wii outperforms the 360 or PS3. I never said that. I said that, with the posted clockspeeds, the Wii CPU must be faster than the Xbox CPU. If you ever try to twist my statements around again, I think I will report your posts to a mod. For now I hope you simply got my point wrong.

Edit: About Apple: even now with the Core CPU (the best CPU in the world, as you say), the Apple will not become a gamer station. Apple doesn't focus on gamers. Apple doesn't have the fastest PCs, but the performance of an Apple was always around that of a standard PC of the same time.



3DS-FC: 4511-1768-7903 (Mii-Name: Mnementh), Nintendo-Network-ID: Mnementh, Switch: SW-7706-3819-9381 (Mnementh)

my greatest games: 2017, 2018, 2019, 2020, 2021, 2022, 2023

10 years greatest game event!

bets: [peak year] [+], [1], [2], [3], [4]

KruzeS said:
ALKO said: (...)

You may be right that no one will make a game optimized for 480p on either the 360 or the PS3, but the fact is, if they did, the game could potentially look a lot better than a downscaled 720p game. Actually, I think it's arguable whether, sitting at a distance from a smallish HDTV, an upscaled version of a game optimized for 480p couldn't look better either. It's a simple truth that if you have less than half the pixels you can do twice the per-pixel work (meaning twice the pixel/fragment/texture shader work). That's also why a slow-paced game can look better at 30fps than at 60fps, and why I don't think we'll see many 1080p games this generation.
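To put rough numbers on that pixel budget (a quick illustrative calculation; the fixed fill-rate framing is an assumption, not something from the thread):

```c
#include <stdio.h>

/* Back-of-the-envelope pixel budget: at a fixed fill rate and frame
 * rate, per-pixel shader budget scales inversely with pixel count. */
int main(void)
{
    const long sd = 640L * 480;   /* 480p: 307,200 pixels */
    const long hd = 1280L * 720;  /* 720p: 921,600 pixels */
    printf("720p draws %.1fx the pixels of 480p\n", (double)hd / sd);
    /* Prints 3.0x: a 480p-optimized game could spend roughly three
     * times as much shader work on every pixel it does draw. */
    return 0;
}
```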


 exactly!!!!!

 

this is the situation!!

WE'LL NEVER SEE A 480P-OPTIMIZED GAME ON X360/PS3!!!

WE CAN MAKE SOME PREDICTIONS....

 

BUT WE HAVE TO CONSIDER THAT THE X360 AND PS3 WERE NOT DESIGNED FROM SCRATCH TO WORK AT 480P.

 

 

you're right.

 

so making a power comparison simply doesn't give us anything to choose a console by...

 

choosing a console ONLY on nominal specs is really stupid for a real gamer!!!!!!

 

 

 



HappySqurriel said:
windbane said:

This is getting very ridiculous. Apple switched to Intel because they were slower for years. Much more expensive? Are you serious? The one good part of Macs was that they were affordable, because their CPU was so outdated. I've built computers for years, I've looked at benchmarks for years, and I can assure you that Macs were MUCH SLOWER than Intel CPUs at any point in time for at least the last decade. Why do you think gamers didn't buy them? And if you want to talk supercomputers, the Cell is used for those now. Supercomputers 10 years ago aren't that fast and do not indicate consumer-level prices.


Apple switched to Intel in 2005 because the Core Duo was far more powerful than a PowerPC 970MP (G5) for the price. Back in 1999, when the G3 and Pentium III were in direct competition, Apple always bragged that IBM's G3 was twice as powerful as the Pentium III at the same clockspeed (which was somewhat true); soon afterwards Apple was bragging that the G4 was over twice (2.6 times) as powerful as the Pentium 4 at the same clockspeed. Intel's response was to increase the clockspeed so that the 400MHz G3 was in direct competition with the 1GHz Pentium 3, and the 600MHz-800MHz G4s were in direct competition with 2GHz to 2.53GHz Pentium 4 processors.

IBM (and Nintendo) took the G3 processor and heavily modified it for the Gamecube; there were (approximately) 50 vector instructions added to (dramatically) improve its performance with 3D calculations. The overall result was that the Gekko was far more powerful than a standard G3 processor for 3D game applications. Whether the Gekko was more powerful than the modified Celeron in the Xbox has been the center of debate since they were released, but the general consensus is that they're very similar in performance.
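For a flavour of what added vector instructions buy you, here is a plain-C sketch of the paired-single idea (illustrative only; this is not Gekko assembly or a real compiler intrinsic):

```c
/* Gekko-style "paired singles" pack two 32-bit floats into one
 * register, so one instruction can do two lanes of work. */
typedef struct { float x, y; } paired_single;

/* Scalar path: two separate multiply-adds, issued one at a time. */
void scalar_madd(const float a[2], const float b[2],
                 const float c[2], float d[2])
{
    d[0] = a[0] * b[0] + c[0];
    d[1] = a[1] * b[1] + c[1];
}

/* Paired path: conceptually a single instruction covers both lanes,
 * roughly halving the instruction count for vertex-transform math. */
paired_single ps_madd(paired_single a, paired_single b, paired_single c)
{
    paired_single r = { a.x * b.x + c.x, a.y * b.y + c.y };
    return r;
}
```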

windbane said:

Also, I LOVE how you guys are ignoring the other benchmarks. Look at the freaking GPU numbers on the Wii! It's barely better than the previous generation as well. The RAM is an even sadder situation. So we can continue this architecture argument until someone shows some benchmarks, but you can't change the other facts.

Oh wait, next you'll be telling me that the GPU and RAM are SUPER-MEGA-OPTIMIZED LIKE NOTHING BEFORE, because for some reason Nintendo holds the secrets of computing and is selling it at a loss. Oh wait, their Wii is worth around $150. That's right, I forgot.

The Flipper (Gamecube's GPU) was a very different beast from what was being developed for the PC at the time (or what is currently available in the PS3 or Xbox 360). In 2000/2001 both ATI and Nvidia moved toward producing GPUs that abandoned fixed-functionality pipelines (with lots of built-in graphical features) in favour of programmable pipelines, because the incompatibility of features across video cards prevented developers from taking advantage of them. The Flipper, by contrast, was designed around built-in features and was (much) faster than the PC GPUs (like the Xbox's GPU) when both systems only used graphical techniques built into the Flipper, but the Flipper had difficulty emulating many of the pixel and vertex shaders that were possible on the PC GPUs.
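A rough sketch of that contrast in C (purely illustrative; this is not Flipper's actual TEV interface or any real graphics API):

```c
/* Fixed-function pipeline: the hardware offers a menu of wired-in
 * blend modes, with fast silicon dedicated to exactly those. */
typedef enum { COMBINE_MODULATE, COMBINE_ADD, COMBINE_INTERPOLATE } combine_mode;

float fixed_function_stage(combine_mode m, float tex, float color, float t)
{
    switch (m) {
    case COMBINE_MODULATE:    return tex * color;
    case COMBINE_ADD:         return tex + color;
    case COMBINE_INTERPOLATE: return tex * (1.0f - t) + color * t;
    }
    return 0.0f;
}

/* Programmable pipeline: the developer ships arbitrary per-pixel
 * code, which fixed-function hardware can only approximate by
 * chaining its preset stages -- the "emulation" difficulty above. */
float programmable_stage(float tex, float color, float t)
{
    float glow = tex * tex;             /* any math the shader wants */
    return glow * (1.0f - t) + color * t;
}
```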

 

Overall we know very little about either the Hollywood or Broadway processors in the Wii, other than that they're based on the Gamecube's Flipper and Gekko. If they are only overclocked versions of those processors, the Wii would be (roughly) 1.5 to 2 times as powerful as the Xbox, and we should see games that look somewhat better than anything in the previous generation; on the other hand, if the instruction set was extended, we should see games that look quite a bit better than the previous generation.
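The overclock-only scenario is easy to sanity-check against the clocks quoted earlier in the thread (a minimal sketch; the pure-overclock premise is the assumption above, not a confirmed spec):

```c
#include <stdio.h>

/* If Broadway/Hollywood are just higher-clocked Gekko/Flipper parts,
 * the speedup over the Gamecube is simply the clock ratio. */
int main(void)
{
    printf("CPU: 729MHz / 485MHz = %.2fx\n", 729.0 / 485.0);  /* ~1.50x */
    printf("GPU: 243MHz / 162MHz = %.2fx\n", 243.0 / 162.0);  /* ~1.50x */
    /* Since the Gekko and the Xbox's Celeron were roughly comparable,
     * ~1.5x the Gamecube lands at the low end of the 1.5-2x-Xbox
     * estimate above. */
    return 0;
}
```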


MEGA QUOTE.

YOU'RE RIGHT ABOUT EVERYTHING.

 



windbane said:
Punisher said:
windbane said:

Even if the PowerPC was as efficient as the Intel Core Duo (the best processor in the world right now), it would only be about 50% better at the same clock speed. Yeah, that's a great improvement, but if you are still below 800MHz there's only so much better it can be.


And the most expensive one too.

This thread is already heading the way of doom/flamewars. :p


They are priced the same as CPUs have been for years, if not a little cheaper due to cheaper designs. We're not even arguing price at this point, though. Shouldn't the best cost more? It's 30% better than anything before it.


 you're wrong.

 



windbane said:

Apple can brag all they want, but the fact remains that every single benchmark for over a decade showed that the fastest Intel processors were faster than the fastest Apple processors, except for some video-editing and Photoshop applications. I'll give you that.

Anyway, I never said I believed the Xbox was more powerful than the Wii. The Xbox didn't blow away the last generation, so if the Wii is only slightly better (1.5 to 2x is slight to me), it's not much of an improvement. All reports thus far point to the Wii's CPU and GPU being modified Gamecube parts, so it's not much better.

I think we basically agree on the power here. My point is that it's nowhere near the PS3 or 360, and the Wii is much closer to last gen than any previous console generation was to its predecessor. No matter how you argue the architecture and special features, the Wii has low specs.


NO, YOU'RE WRONG!

 

READ THIS:

http://en.wikipedia.org/wiki/RISC#RISC_and_x86

 

AND THEN:

http://cse.stanford.edu/class/sophomore-college/projects-00/risc/risccisc/

 

I can't believe that there are still people who don't know the difference between CISC and RISC!!!!!

 

Study also how the Core 2 Duo really works.

But this isn't a topic on CPUs...

If you compare CPUs only on MHz... sorry, you're wrong...

 

I don't have time to explain why you're wrong, so go on Wikipedia to find answers!!!!!
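The one-line version of the point above is that effective throughput is roughly instructions-per-clock times clock speed. A toy C sketch, with the IPC figures invented purely for illustration:

```c
#include <stdio.h>

/* Toy model: throughput = IPC * clock. The IPC values below are made
 * up for illustration; they are not measured figures for either chip. */
int main(void)
{
    double xbox = 0.9 * 733e6;  /* hypothetical IPC, Xbox Celeron */
    double wii  = 1.4 * 729e6;  /* hypothetical IPC, Broadway PPC */
    printf("Xbox: %.0fM inst/s  Wii: %.0fM inst/s\n", xbox / 1e6, wii / 1e6);
    /* Nearly identical clocks can hide very different throughput when
     * per-clock efficiency differs -- the RISC-vs-CISC point above. */
    return 0;
}
```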

 



Cobretti said:
Someone said there is no chance the Wii could ever do 720p. However, didn't the original Xbox have 720p capabilities? So shouldn't the Wii in theory be able to do it? Granted, the shading and lighting won't be anywhere in the league of the PS3/Xbox 360, but surely it can look decent enough at 720p.

sorry, that was me.

maybe the Wii GPU can handle 720p...

 

but you cannot output a digital 720p signal...