First of all... how the f*** does that site know all the specs of the Hollywood? Did they actually dissect it? Get blueprints? Or do they just suppose it's a smaller, faster Flipper?
| leo-j said: Doesn't the new PSP have 64MB of RAM? RAM doesn't matter for graphics though, all it affects is loading speed, and it makes the internet go somewhat faster the more RAM you have. The Wii is basically a great console spec-wise. |
Utterly ridiculous comment of the day.
| shams said: Does it actually have a DVD-9 drive? The GC used a proprietary drive system (developed by Ninty & Panasonic) to make copying hard. The Wii might be able to read DVD-9 discs, but I thought the system used an extension of the GC disc format (physically bigger discs, more capacity). Not sure if they hold 4Gig or 9Gig. ??? |
Same question here. Doesn't the disc spin the other way? My brother has a Wii, and that's what he told me.

| Kwaad said: Good god, the damn thing runs on 20W. How fast can it be?! The PS3 runs on over 10x more power than the Wii. Counting the HDD, I'd say the PS3 runs at least 175W, or 8x more power usage. Now, let's do a rough power usage/power comparison. 4x would be GeForce 6, 2x would be GeForce 5, 1x would be GeForce 4. Once again, it nails the Wii at ~GeForce 4 graphics chip performance. |
Utterly baseless, useless, and preposterous comment of the day.
AMD Athlon 64 X2 3800+ EE SFF: 35W
Intel Pentium 4 "E" 570 (3800MHz): 115W
Of course, based on your "logic", the Pentium 570 is leaps and bounds more powerful than the Athlon X2. Right. I guess the chip that controls my 1800-watt heater absolutely destroys the PS3 with its 10x "power usage/power comparison" advantage...
| HappySqurriel said: Kwaad, why do you even pretend to know what you're talking about? ArtX was a company formed by former SGI employees, and it was the company Nintendo contracted to produce the GPU for the Gamecube. The Flipper was largely based on the GPU ArtX designed and sold to the US military for flight simulators, and it has very little similarity to any nVidia GPU; there is a reason the military is willing to spend $10,000+ per GPU on those flight simulators rather than buy an over-the-counter graphics card like the GeForce 2. ATI bought ArtX and integrated a large portion of their technology into its main GPU line, and this is one of the reasons the Radeon 9800 series was so successful for ATI. |
The Gamecube's GPU was very close in specs to the original ATI 7500/8500 cards. The Wii's GPU is very close to what we see in the 9600 cards. I played Warcraft 3 on an 8500 for years, and that game always looked great. From what I've seen, the Wii is easily able to pull off 8500-quality visuals. Mario Galaxy, Metroid Prime 3 and Super Smash Brothers Brawl all have a very 9600-ish look (same card as in my HTPC). So I'm not going to complain one bit. I happen to like the things the 9600 can pull off. The X850 XT in my main PC renders frames faster, but still has the same or close to the same image quality.
Prepare for termination! It is the only logical thing to do, for I am only loyal to Megatron.
| Biggerboat said: Well, Factor 5 said that the Wii was roughly 2x the Xbox, so who are you gonna believe: EA (I think that's who made your quote) or a dev that actually knows console hardware? |
There's some speculation that Factor 5 is actually working on a Wii game. Wouldn't that be interesting ...
Threads of Interest:
The Movie Thread: http://www.vgchartz.com/forum/thread.php?id=6880
The Crow Eating Thread: http://www.vgchartz.com/forum/thread.php?start=0&id=3886
The Betting Thread: http://www.vgchartz.com/forum/thread.php?start=0&id=7104
Custom GIFs Thread: http://www.vgchartz.com/forum/thread.php?id=18963
The Greatest Game Ever Conceived On Any Platform
Tag: "I have tasted Obi-Wan's bitter tears"
I'd like to say flat out that I didn't like the article one bit. Far too much positive spin.
That said, there's already enough evidence showing the Wii is not just a GC speed bump, as Squrriel mentioned. I can't find the original source, but these numbers have been roughly verified by multiple sources.
CPU:
GC Gekko - 180nm process, 43mm² die size.
Wii Broadway - 90nm process, 19mm² die size.
A 180nm → 90nm shrink halves feature size in both dimensions, so an unchanged design should come out at roughly a quarter of its old area; if no transistors had been added, the expected size of Gekko at 90nm would be ~11mm². A clock bump does not justify a ~75% increase in size. This is speculated to be added L2 cache.
GPU:
GC Flipper - 180nm process, 110mm² die size.
Wii Hollywood - 90nm process, 72mm² die size.
By the same quarter-area logic, if no transistors had been added, the expected size of the Flipper at 90nm would be about 26mm². A clock bump does not justify a ~175% increase in size. Possibilities? Extra pipelines and texture units, more complex T&L, etc.
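For anyone who wants to sanity-check the arithmetic, here's a quick back-of-the-envelope Python sketch (shrink_check is just a throwaway helper I made up, not from any source). It assumes a perfect shrink, i.e. area scales by exactly (90/180)² = 0.25, which real processes only approximate, so it prints slightly different round numbers than the ones quoted above, but the conclusion is the same either way: both Wii dies are far bigger than a straight port of the old chips would be.

# Back-of-the-envelope die-shrink check. Assumes ideal scaling: a
# 180nm -> 90nm shrink halves feature size in both dimensions, so area
# drops to (90/180)^2 = 0.25 of the original. Real processes don't
# scale perfectly, so treat the output as a ballpark figure only.
def shrink_check(name, old_area_mm2, new_area_mm2, old_nm=180, new_nm=90):
    scale = (new_nm / old_nm) ** 2       # ideal area scaling factor (0.25)
    expected = old_area_mm2 * scale      # expected size if nothing had been added
    growth = (new_area_mm2 / expected - 1) * 100
    print(f"{name}: expected ~{expected:.0f}mm², actual {new_area_mm2}mm² "
          f"(~{growth:.0f}% bigger than a straight shrink)")

shrink_check("Gekko -> Broadway (CPU)", 43, 19)
shrink_check("Flipper -> Hollywood (GPU)", 110, 72)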
Besides, a point that is commonly brought up is programmable shaders. Programmable shaders are great in terms of flexibility, but most of the time they're used to produce fairly standard effects. The GC's non-programmable pipeline already supported some of these common effects, and the Wii has likely improved on that support. A simple example is bump mapping: most platforms use programmable pixel/fragment shaders for it; the Wii/GC TEV supports it out of the box; on the PS2, one of the EE's vector units was used. The lack of flexibility, though, may mean developers won't be able to just come up with new tricks later in the generation.
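To make the bump mapping point concrete, here's a tiny Python illustration of the per-texel math involved (this is just the underlying computation, not the GC/Wii TEV API or any real shader code): take the surface normal a normal map gives you for that texel, then do an ordinary diffuse dot product against the light direction. Whether a programmable pixel shader or a fixed-function stage does it, this is roughly the work being performed.

def bump_lit_texel(base_color, normal, light_dir):
    # normal and light_dir are 3-component tuples, assumed already normalized;
    # 'normal' is the per-texel normal fetched from the normal map
    ndotl = max(0.0, sum(n * l for n, l in zip(normal, light_dir)))
    return tuple(c * ndotl for c in base_color)

# A texel facing the light vs. one whose normal the map tilts away from it:
print(bump_lit_texel((0.8, 0.6, 0.4), (0.0, 0.0, 1.0), (0.0, 0.0, 1.0)))  # fully lit
print(bump_lit_texel((0.8, 0.6, 0.4), (0.6, 0.0, 0.8), (0.0, 0.0, 1.0)))  # dimmer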
^It WAS from a feature they do for NintendoWiifanboy. I am not tech savvy, so it all just sounds interesting.
Why do people even care? It seems so pointless to argue over such things.
The tech part was OK actually. The writing style though...
But I guess that's what makes it an interesting read on "Nintendo Wii Fanboy" (it just won't turn any Wii haters' hearts).