
Wii U GPU new info and more speculation

ninjablade said:
gronk-bonk said:

Anyway, we're going off topic and this is stupid. You think Mario looks better, but it doesn't change the fact that Resident Evil 4 and Rogue Leader are still some of the best looking games on Wii, and there's no need for the visual impairment remark, because it's silly. Your pics are not impressive at all; if we post pics of both games using emulators, I bet Resident Evil will look better.

Just joking with ya, bro.


That's cool, we're still friends. You might be getting some good news: it looks like a Wii U multiplat game coming out soon will look better on Wii U. Anyway, I owned an Xbox back then, and you know the Xbox had some very technically impressive games. I went to a friend's house and saw Resident Evil 4 on PS2 and was blown away by it; the game really is a work of art IMO, and it's why many people call it the best looking game of last gen, even next to games like Ninja Gaiden, Splinter Cell: Chaos Theory and Conker's Bad Fur Day, which were technically much more impressive graphically.

Resident Evil 4 was amazing graphically on Cube and PS2. I think it was the best looking game of that generation. Regardless, RE4 is one of the greatest games of all time.



Bet with ninjablade:

Ninjablade wins if the next 5 multiplats on the Wii U are inferior to the 360 versions.

I win if one of the 5 multiplats is on par or superior on the Wii U.

gronk-bonk said:
ninjablade said:
gronk-bonk said:

Anyway, we're going off topic and this is stupid. You think Mario looks better, but it doesn't change the fact that Resident Evil 4 and Rogue Leader are still some of the best looking games on Wii, and there's no need for the visual impairment remark, because it's silly. Your pics are not impressive at all; if we post pics of both games using emulators, I bet Resident Evil will look better.

Just joking with ya, bro.


That's cool, we're still friends. You might be getting some good news: it looks like a Wii U multiplat game coming out soon will look better on Wii U. Anyway, I owned an Xbox back then, and you know the Xbox had some very technically impressive games. I went to a friend's house and saw Resident Evil 4 on PS2 and was blown away by it; the game really is a work of art IMO, and it's why many people call it the best looking game of last gen, even next to games like Ninja Gaiden, Splinter Cell: Chaos Theory and Conker's Bad Fur Day, which were technically much more impressive graphically.

Resident Evil 4 was amazing graphically on Cube and PS2. I think it was the best looking game of that generation. Regardless, RE4 is one of the greatest games of all time.

 I agree.



One thing I think we can put to rest is the question of whether the Wii U GPU is from AMD's 4000 line or 5000 line. It would definitely be the 5000. If it's based on anything, it would have to be the 5550. The 4600 line has two 400 MHz RAMDACs and is fabbed at 55 nm, and it's probably closer to 70 watts under a full load. The 5550 runs at 550 MHz with 320 SPs, 16 TUs and 8 ROPs, is fabbed at 40 nm, and has a maximum board power of 39 watts. I think Nintendo used the 5550 as a base with some newer features added. Either that, or it was built straight from the ground up, which I think is unlikely.

http://www.amd.com/us/products/desktop/graphics/ati-radeon-hd-4000/hd-4600/Pages/ati-radeon-hd-4600-overview.aspx
http://www.amd.com/us/products/desktop/graphics/ati-radeon-hd-4000/hd-4600/Pages/ati-radeon-hd-4600-specifications.aspx
http://www.amd.com/us/products/desktop/graphics/ati-radeon-hd-5000/hd-5550/Pages/hd-5550-overview.aspx#2 (under specs)
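For a rough sense of the raw shader throughput those specs imply, here's a quick back-of-the-envelope calculation (Python). It uses the usual SPs x 2 ops x clock rule of thumb and the public desktop clocks, so it says nothing about whatever customisation Nintendo actually did:

def gflops(shader_units, clock_mhz):
    # rule of thumb: each SP does 2 ops (a MADD) per clock
    return shader_units * 2 * clock_mhz / 1000.0

# public desktop specs, not confirmed Wii U figures
print(gflops(320, 750))  # HD 4670: 320 SPs @ 750 MHz -> 480 GFLOPS
print(gflops(320, 550))  # HD 5550: 320 SPs @ 550 MHz -> 352 GFLOPS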



unaveragejoe said:

One thing I think we can put to rest is the question of whether the Wii U GPU is from AMD's 4000 line or 5000 line. It would definitely be the 5000. If it's based on anything, it would have to be the 5550. The 4600 line has two 400 MHz RAMDACs and is fabbed at 55 nm, and it's probably closer to 70 watts under a full load. The 5550 runs at 550 MHz with 320 SPs, 16 TUs and 8 ROPs, is fabbed at 40 nm, and has a maximum board power of 39 watts. I think Nintendo used the 5550 as a base with some newer features added. Either that, or it was built straight from the ground up, which I think is unlikely.

http://www.amd.com/us/products/desktop/graphics/ati-radeon-hd-4000/hd-4600/Pages/ati-radeon-hd-4600-overview.aspx
http://www.amd.com/us/products/desktop/graphics/ati-radeon-hd-4000/hd-4600/Pages/ati-radeon-hd-4600-specifications.aspx
http://www.amd.com/us/products/desktop/graphics/ati-radeon-hd-5000/hd-5550/Pages/hd-5550-overview.aspx#2 (under specs)

Looks like it lines up very well. One thing that's bothered me is the estimate of 12.8 GB/s data throughput to the RAM. This lists DDR3 with this particular card at 24.5–28.8 GB/s.

Here's the kicker: "Drive three displays simultaneously with independent resolutions, refresh rates, color controls, and video overlays." That capability is necessary for the Nintendo-stated ability to run a primary display as well as two GamePads at the same time.

To be clear: the 4000 series cards CANNOT run three displays simultaneously, while Nintendo has stated the Wii U is capable of driving three displays. It cannot be a 4000 series card.



One interesting note about the RAM is that 24.5–28.8 GB/s is almost exactly double 12.8 GB/s. Another note on the Wii U's RAM: Nintendo claims to actually hit its theoretical numbers, not just list them as potential. You hear developers complain about CPU issues, but I don't recall a single article complaining about RAM latency, even though on paper the RAM seems to be lacking. Makes you say hmm.
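That doubling falls straight out of the bandwidth formula: bus width in bytes times effective transfer rate. A minimal sketch in Python, assuming DDR3-1600, which is the speed grade people usually plug into these estimates rather than anything Nintendo has confirmed:

def peak_bandwidth_gbs(bus_bits, transfer_mts):
    # peak DDR bandwidth = bus width in bytes * effective transfer rate (MT/s)
    return (bus_bits / 8) * transfer_mts / 1000.0

print(peak_bandwidth_gbs(64, 1600))   # 12.8 GB/s, the 64-bit Wii U estimate
print(peak_bandwidth_gbs(128, 1600))  # 25.6 GB/s, a 128-bit bus like the desktop HD 5550 with DDR3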



timmah said:

Looks like it lines up very well. One thing that's bothered me is the estimate of 12.8 GB/s data throughput to the RAM. This lists DDR3 with this particular card at 24.5–28.8 GB/s.


Redwood LE (HD 5550, a 320:16:8 part) has been around for some time now as a candidate (my pick as well), with the exact same clock as the Wii U's GPU, a TDP of 25W (in the mobile version) and a die size that fits inside the initial measurements (104 mm^2 is for Redwood Pro (400:20:8), so my guesstimate is around 90 mm^2 for the LE version).

It comes in three flavours (all on a 128-bit bus): GDDR5 with 51.2 GB/s bandwidth (27 VP rating), DDR3 with 25.6 GB/s (21.8 VP) and DDR2 with 12.8 GB/s (16.5 VP). The last one is identical in throughput to the Wii U's suggested DDR3 bandwidth over a 64-bit bus: 12.8 GB/s. Notice how the rating changes with memory bandwidth: at the Wii U's suggested bandwidth, this GPU is terribly bottlenecked (it performs at some 60% of its full potential with GDDR5).

The HD 6450 is a 160:8:4 part with 12.8 GB/s memory bandwidth over a 64-bit bus. At its stock core clock of 750 MHz it has a VP rating of 16.1; compared to the 5550 over the same bandwidth, which has 16.5 VP, it achieves almost the same performance with half the SPs/TMUs/ROPs, albeit at a higher clock. At 550 MHz it would have around 13-14 VP, similar to or just slightly better than the 360.

If they really put in something equivalent to the 5550, I'm hoping they also put in some way to actually overcome that 64-bit bus bottleneck; otherwise, 320:16:8 really starts to look like overkill for such a constrained memory system.
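To put that bottleneck argument in rough numbers, here's a quick sketch comparing theoretical ALU throughput against memory bandwidth for the configurations above (Python). The GFLOPS figures use the SPs x 2 x clock rule of thumb, and the ratio is only meant to show scale, not to predict real-world performance:

def gflops(sps, clock_mhz):
    # theoretical throughput: SPs * 2 ops per clock * core clock
    return sps * 2 * clock_mhz / 1000.0

configs = {
    "HD 5550 + GDDR5":  (gflops(320, 550), 51.2),
    "HD 5550 + DDR3":   (gflops(320, 550), 25.6),
    "HD 5550 + DDR2":   (gflops(320, 550), 12.8),
    "HD 6450 @ 750MHz": (gflops(160, 750), 12.8),
}

for name, (gf, bw) in configs.items():
    # bytes of bandwidth per theoretical FLOP, a crude "how starved is it" metric
    print(f"{name}: {gf:.0f} GFLOPS, {bw / gf:.3f} GB/s per GFLOP")

The same 352 GFLOPS part keeps only about a quarter of its per-FLOP bandwidth when it drops from GDDR5 to the 12.8 GB/s case, which is exactly the starvation being described.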



ninjablade said:
dahuman said:
ninjablade said:
dahuman said:
ninjablade said:
dahuman said:
ninjablade said:
osed125 said:
ninjablade said:

You have no idea what you're talking about, do you? If your hardware is more powerful and based off an HD 5550, why shouldn't it run better, or at least on par, when it's in a closed box? Go get some knowledge, dude. And do you think every PC is the same hardware? It's all different as well, yet more powerful = runs better 90% of the time.

Yeah, because game development is so easy that you can change the resolution and textures just by clicking a button, like PC games...

"Taking the 360 version of a game and throwing it across to the PC on a card like the HD 5550 results in big performance increases. We've heard from developers on this very forum just how little hardware-specific optimisation PC ports get, and yet still games run faster and/or at higher resolutions when you plug in a significantly faster GPU than the 360 has."

 

I know this surprises you, but consoles are a lot like PCs in a closed box, and they get way more optimisation.

Hahahaha, you're taking it literally: take 360 code, throw it on PC, and it automatically runs better? Have you ever heard of a little game called Saints Row 2? My lord, are you clueless.

I said 90% of games. Please show me Saints Row 2 running on an HD 5550 and running much worse than the 360 version, so we can post it at Beyond3D. You think you're smart; why don't you go post on Beyond3D and prove their theory wrong? I will watch you get owned and laugh.


lol... I should have figured that you wouldn't even know what I was talking about. How about this: you get people from Beyond3D to read this dumb and pointless conversation, get your own face pushed in, and then we can call it a day?

You think I'm going to listen to anything you say? You looked at the GPU die and had no idea what you were looking at. You're no tech head, just somebody biased and pretending to know what they're talking about, so goodnight.

lol, I saw the memory and the shader blocks right away; it wasn't until Chipworks came out with more analysis that I knew where the controllers were. Beyond that there are a lot of areas people are still speculating about without facts. The point is that I knew from the get-go that there are too many unknown factors, and I'm not willing to spend the time trying to decipher it. Also, I always notice this with people: when they can't win a conversation, they just call me biased. Nothing is more satisfying. And you're afraid to ask people from other sites and see how clueless you are; I love it.


lol, you're full of it. I saw your OP about the GPU die; it read like somebody who has no idea what they're talking about. Everything I posted here is what I was told by reliable posters at Beyond3D who know what they're talking about; even razor dragon agreed and thought it was 160 SPs. Like I said, go on Beyond3D and own them, prove them wrong. I'll give you sig control for two weeks if you put up a good argument that ports, and not power, are the reason the Wii U is lacking. You have an account already, or are you probably too scared?



Wow, this runs so much worse and looks shittier, right? o_O; I don't want your sig control, I want you to take my other bet. Sig control means shit to me, and I still have no hard facts from you.



My word... the quote pyramids. You guys really need to stop feeding the troll. He doesn't know what he's talking about and literally parrots back what he reads without any comprehension. Just ignore him. Stop giving him attention. He'll go away.



VGKing said:
LemonSlice said:
So the Wii U could have the glitz and glamor of next-gen without the raw power? Games as good looking, but not as detailed?

You're being way too optimistic. With how custom the GPU is and how Nintendo doesn't play nice with third parties... we could be looking at Nintendo's next GameCube. That's not necessarily a bad thing though.

The Wii U was DESIGNED to please third parties. How on earth do you reach the OPPOSITE conclusion? :/



I LOVE ICELAND!

dahuman said:
ninjablade said:
dahuman said:
ninjablade said:

Wow, this runs so much worse and looks shittier, right? o_O; I don't want your sig control, I want you to take my other bet. Sig control means shit to me, and I still have no hard facts from you.

First off, this port is coming out almost 5 months later, so it should be enhanced. Second of all, you should wait for the DF face-off before calling a victory; it could run much worse for all we know.