Umos-Cmos said:

Someone asked earlier what we could expect graphics-wise from this baby.  If the rumors are true and it does indeed use a modified version of the Radeon HD 4850 with 1GB of VRAM, you can expect results similar to this:

The aforementioned video card is the recommended one for this game (The Witcher 2) from the publisher.  We'll likely see more refined versions of current-gen games: better framerates, better fill rates, true 1080p, better anti-aliasing, and modern shaders.  This doesn't take into account the amount of RAM they decide to use.  Nintendo has always used fast RAM in the past, but the quantity will determine how good the graphics can be.  If it has 2GB or so of RAM, that allows for greater draw distances and no pop-in (can't believe modern games still have pop-in).  The extra RAM would let more data be buffered at once, so the processor doesn't have to stream it in on the fly.
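To make the buffering point concrete, here's a rough sketch of why a bigger RAM pool means fewer mid-game loads (and so less pop-in). All the numbers here are hypothetical, purely for illustration, not actual figures for this or any console:

```python
# Illustrative only: hypothetical asset sizes and RAM budgets,
# not real console figures.

def resident_assets(ram_bytes, asset_bytes, reserved_fraction=0.5):
    """How many equally sized assets fit in the RAM left over
    after the OS and game engine take their (assumed) share."""
    available = ram_bytes * (1 - reserved_fraction)
    return int(available // asset_bytes)

GIB = 1024 ** 3
TEXTURE = 4 * 1024 * 1024  # assume 4 MiB per texture

print(resident_assets(1 * GIB, TEXTURE))  # 128 textures resident
print(resident_assets(2 * GIB, TEXTURE))  # 256 -- double RAM, double the cache
```

Anything that doesn't fit in that resident set has to be streamed in as the camera moves, which is exactly when you see pop-in.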

I'm no expert on hardware, but that much I do know.  Either way, I think Nintendo fans are in for a real treat.  Zelda HD or Metroid HD will be amazing!

Edit:  Can you pull that image up in your browser OK? If not, you can right-click and choose "view image".

While it is an impressive image at 1080p @ 60fps, I still think Nintendo is unlikely to worry about outputting games beyond 720p @ 30fps. The reason is that most people probably don't see a significant advantage of 1080p over 720p (or of 60fps over 30fps), and it takes a much more expensive GPU to output similar graphics at 1080p @ 60fps than at 720p @ 30fps. It's also unlikely that Sony and Microsoft could produce a system with noticeably better graphics at 720p or 1080p without releasing it far later at a much higher price.

The reason I make this claim is not that this hardware is so amazing it couldn't be surpassed; it's simply that (as I said before) we're hitting the limit of the raster-scanline approach at 720p. Effectively, we will be unable to display more geometry or textures that meaningfully add to image quality, and shaders will be unable to produce better lighting without moving to global illumination. Sony and Microsoft could potentially aim for a similar position at 1080p @ 60fps (a moderate perceived improvement), but that would require 4 to 8 times the processing power; or they could move toward a global-illumination system at 720p @ 30fps, which would also require 8 times the processing power or more.
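The "4 to 8 times" figure lines up with a simple back-of-envelope pixel-throughput comparison. Raw pixel rate is only a crude proxy for GPU load (it ignores per-pixel shader cost, bandwidth, and so on), but it shows the scale of the jump:

```python
# Back-of-envelope: raw pixels pushed per second at each target.
# A crude proxy for GPU cost, ignoring shader and bandwidth costs.

def pixels_per_second(width, height, fps):
    return width * height * fps

low  = pixels_per_second(1280, 720, 30)   # 720p @ 30fps
high = pixels_per_second(1920, 1080, 60)  # 1080p @ 60fps

print(high / low)  # 4.5 -- 4.5x the raw pixel rate
```

So 1080p @ 60fps means pushing 4.5 times the pixels of 720p @ 30fps before you even account for doing more work per pixel, which is where the upper end of that 4-to-8x range comes from.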