I want to post random screen shots also!
I think this looks better than anything posted yet!


I believe frame rate must be given a great deal of consideration. 60fps requires twice the polygons, pixels, effects, etc. per second compared to 30fps. My brother can run Crysis at highest detail at something like 1600x1200, and it produces fabulous-looking stills, but it runs at about 5 frames per second.
Comparing stills is a terribly ineffective way to compare things that aren't running at the same framerate. Furthermore, stills don't indicate what is going on behind the scenes, like AI, physics, and other information processing.
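To put rough numbers on the frame-budget side (a back-of-the-envelope sketch; the 10M-pixels-per-frame scene cost is a made-up illustration, not a measurement):

# Per-frame time budget at 30 vs 60 fps, and the fill rate an
# assumed 10M-pixels-per-frame scene would demand at each rate.
for fps in (30, 60):
    print(f"{fps} fps -> {1000.0 / fps:.1f} ms per frame")

pixels_per_frame = 10_000_000  # illustrative assumption, not a benchmark
for fps in (30, 60):
    print(f"{fps} fps -> {pixels_per_frame * fps / 1e6:.0f} Mpix/s fill required")

Same scene, double the framerate, exactly double the required throughput.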
I HAVE A DOUBLE DRAGON CAB IN MY KITCHEN!!!!!!
NOW A PUNISHER CAB!!!!!!!!!!!!!
From the framerate perspective, the Wii has plenty of horsepower. It lacks the programmable shaders of the original XBox GPU, but if you're willing to limit yourself to old school rendering methods, your Wii game should have a great framerate.
Once you get over the fact that it's in 480p, and realize that you don't need normal maps, multitexturing, self-shadowing, streaming textures off an HDD, etc. for a game to look decent and be fun, the Wii is a fine game system. The PS2 is still a fine game system too. That's why it still sells.
The blue ocean demographic has "spoken" about how they feel about HD and high-performance graphics. They don't care. They Wii. Comparisons trying to pretend that the Wii "sucks" or something, merely because its GPU is most definitely stuck in 2002, are pointless discussions. The Wii still owns as much market share of the current gen as the PS3 and X360 combined. Apparently, GPU (and CPU) doesn't matter as much as some might think.
As far as "defending" the Wii's architecture as being even remotely close to the X360 or PS3... lol. I still have no idea why people even bother arguing that the Wii can look good from a purely technical standpoint. It can't compare to the PS360 from that (very limited, I might add) point of view -- not by a longshot. I would argue that SM64 looks fine, in this day and age, at 480i. That has nothing to do with its technical merits.
Groucho, the Gamecube had superior multitexturing in comparison to the Xbox. LucasArts had ports of Rogue Squadron 2 and 3 in the works for the Xbox, but they had to can them because Factor 5 couldn't get the games running on the Xbox without serious concessions. Rogue Squadron 3 pushed the most polygons per second of any game last gen, by far. Rogue Squadron 2 topped most games from last gen as well. This was while running every available hardware effect, at 60fps, in 480p.

nintendo is telling developers to dumb down their graphics, insisting that they focus more on innovation than on showy graphics.
of course, in-house productions are the exception to the rule - showing that nintendo's products rule.
or they don't want to blow the minds of grandmothers with life-like graphics, which would only accelerate their poor minds into senility (yes, it's a made-up word).
| Darc Requiem said: Groucho, the Gamecube had superior multitexturing in comparison to the Xbox. LucasArts had ports of Rogue Squadron 2 and 3 in the works for the Xbox, but they had to can them because Factor 5 couldn't get the games running on the Xbox without serious concessions. Rogue Squadron 3 pushed the most polygons per second of any game last gen, by far. Rogue Squadron 2 topped most games from last gen as well. This was while running every available hardware effect, at 60fps, in 480p. |
Sorry pal, I worked on both the GC and XBox in the last gen. The XBox's GPU is most assuredly superior to the GameCube's, no matter what Factor 5 believes or blubbers on about. I guarantee you've seen some of the work I did on the XBox, if you even owned one, and I'm sure I know more than F5 does about its GPU, and I know the GC's GPU pretty damn well too. F5 was dead wrong if they thought the GC's GPU was better. They're about the only development studio that has ever stated such a thing... and now they're out of business.
Factor 5 also stated (it's not worth my time to find the article link) that the Wii's GPU was comparable to the PS3's. Lol. And then they went under. I think that speaks for itself.
The only reason the Wii's GPU is comparable to (IMO slightly better than) the XBox's NV2A is its clock.
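For reference, a quick sketch of the published clock figures (Flipper 162 MHz, Hollywood 243 MHz, NV2A 233 MHz -- raw clocks only, which say nothing about shaders or bandwidth):

# Published GPU core clocks; ratios relative to the GameCube's Flipper.
clocks_mhz = {"Flipper (GC)": 162, "Hollywood (Wii)": 243, "NV2A (Xbox)": 233}
base = clocks_mhz["Flipper (GC)"]
for gpu, mhz in clocks_mhz.items():
    print(f"{gpu}: {mhz} MHz ({mhz / base:.2f}x Flipper)")

Hollywood comes out at 1.5x Flipper and roughly clock-parity with the NV2A, which is the point being made above.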
| Groucho said: Sorry pal, I worked on both the GC and XBox in the last gen. The XBox's GPU is most assuredly superior to the GameCube's, no matter what Factor 5 believes or blubbers on about. I guarantee you've seen some of the work I did on the XBox, if you even owned one, and I'm sure I know more than F5 does about its GPU, and I know the GC's GPU pretty damn well too. |
Then you should know that the Gamecube's GPU had superior texturing and lighting capabilities. The NV2A had a higher fill rate and programmable shaders. The Flipper, the GC's GPU, could do 8 textures in a single pass with 8 simultaneous hardware lights. The NV2A could only do 4 textures in a single pass with 4 hardware lights.
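To make the per-pass limit concrete, a trivial sketch using the figures above (pass count only; it deliberately ignores clocks, cycles, and everything else in this argument):

from math import ceil

# Passes needed to apply a given number of texture layers when the
# hardware can only bind so many textures per pass (8 vs 4, per above).
def passes_needed(texture_layers, units_per_pass):
    return ceil(texture_layers / units_per_pass)

for gpu, units in (("Flipper", 8), ("NV2A", 4)):
    print(f"{gpu}: an 8-layer material takes {passes_needed(8, units)} pass(es)")

Each extra pass means re-submitting geometry and blending into the framebuffer again, which is why textures-per-pass mattered for multitexture-heavy games.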

| Darc Requiem said: Then you should know that the Gamecube's GPU had superior texturing and lighting capabilities. The NV2A had a higher fill rate and programmable shaders. The Flipper, the GC's GPU, could do 8 textures in a single pass with 8 simultaneous hardware lights. The NV2A could only do 4 textures in a single pass with 4 hardware lights. |
Sure... if you were rendering a slideshow the GC rocked. Then again, so does software rendering, doesn't it?
Single pass != single cycle, if you weren't aware. Also, of course, those 8 textures (tiny enough to fit in the Flipper's texture memory, at that) would be kinda overkill without programmable shaders, wouldn't they?
The NV2A was about 50% faster than the Flipper, for all practical purposes, and its shader functionality, stencil buffer, much larger texture memory, etc. provided utility that the Flipper just could not match. That's all that mattered in the end.
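Roughly where that "about 50%" ballpark comes from (a sketch from published peak figures -- clock times pixel pipelines; real throughput depends on bandwidth, state changes, and so on):

# Peak theoretical fill rate = core clock (MHz) * pixel pipelines.
gpus = {
    "Flipper": {"clock_mhz": 162, "pipes": 4},
    "NV2A":    {"clock_mhz": 233, "pipes": 4},
}
rates = {name: g["clock_mhz"] * g["pipes"] for name, g in gpus.items()}
for name, mpix in rates.items():
    print(f"{name}: {mpix} Mpixels/s peak")
print(f"NV2A / Flipper = {rates['NV2A'] / rates['Flipper']:.2f}x")  # ~1.44x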
| Groucho said: Umm... actually JPEG compression will often tend to have the effect of pseudo "anti-aliasing" a screenshot -- making it look smoother. The raw BMP images from the XBox would look worse/more pixelated, I think. |
Yes, blurring a picture is the same as pseudo AA ...
Or is it!? 
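In case anyone wants the difference spelled out: blur just averages whatever pixels are already there, smearing detail everywhere, while real AA samples the underlying edge at a higher rate. A toy 1D sketch (values are made-up; note how blur wipes out the 1-pixel detail that AA preserves):

# A 1D "scanline": hard edge at index 3, plus a 1-pixel dark detail at index 5.
src = [0, 0, 0, 255, 255, 0, 255, 255, 255, 255]

# Blur: average each pixel with its neighbors (what JPEG-ish smoothing does).
blur = [round(sum(src[max(0, i - 1):i + 2]) / len(src[max(0, i - 1):i + 2]))
        for i in range(len(src))]

# Supersampled AA: sample the underlying shape at 4x, then box-filter down.
# The true edge sits at x = 3.5; the detail pixel spans 5 <= x < 6 exactly.
hires = [0 if x / 4 < 3.5 else (0 if 5 <= x / 4 < 6 else 255) for x in range(40)]
aa = [round(sum(hires[i * 4:(i + 1) * 4]) / 4) for i in range(10)]

print("src :", src)   # hard edge plus 1-pixel detail
print("blur:", blur)  # detail at index 5 smeared to ~170
print("aa  :", aa)    # edge softened to 128, detail kept at 0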