
Forums - Gaming Discussion - Report: Wii U GPU uses R770

drkohler said:
ethomaz said:
ghost_of_fazz said:

And the RV770 (on 55nm) consumed between 110 W and 190 W of power, but that was determined only by the core speed (525 MHz being the lowest and 850 MHz the highest) and was unrelated to the number of available shader processors.

Power is not a problem, because that new GPU will be made on 32nm...

How do you know? Do you know how many 32nm fabs are actually operating for mass manufacturing (you would be surprised)? If they really had a 32nm line, wouldn't they shrink the CPU first?

Let's not forget the 4890 is an RV790 chip, not an RV770 chip. Even at 45nm we are looking at TDPs of >100 W, and adding the rumoured superfast CPU with a TDP of 125-150 W, that is a hell of a lot of heat to dissipate for such a small Wii U box as shown in the pics (I love the picture where a Ninty guy places the Wii U into an almost completely enclosed TV stand... that unit would melt within minutes).

There is no 45nm process for GPUs... if Nintendo doesn't use 32nm, it will be 40nm for sure (all Radeon HD 6xxx GPUs use 40nm).

And 2012 is ready for 32nm... AMD's Fusion (CPU+GPU) is 32nm and most Intel CPUs are 32nm too (Intel will show 22nm in 2012).

So I can't see Nintendo using the dated 55nm process... 40nm at the least.
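For a rough sense of why the process node matters so much in this argument, here is a naive back-of-the-envelope scaling sketch. The 55nm figures are the RV770 numbers quoted above; the linear scaling model is an assumption for illustration, and real shrinks save less than it suggests, since voltage and leakage don't scale down with feature size.

```python
# Naive sketch: dynamic power scales roughly with switched capacitance,
# which shrinks with feature size. This ignores leakage and voltage, so
# treat the output as an optimistic upper bound on the savings.

def scaled_power(power_w: float, old_nm: float, new_nm: float) -> float:
    """Idealized linear scaling of dynamic power with process node."""
    return power_w * (new_nm / old_nm)

# RV770 on 55nm: roughly 110-190 W depending on clocks (figures from above).
for node in (40, 32):
    lo = scaled_power(110, 55, node)
    hi = scaled_power(190, 55, node)
    print(f"{node}nm: ~{lo:.0f}-{hi:.0f} W (idealized)")
```

Even this optimistic model only gets a fully clocked RV770-class part down to roughly 64-111 W at 32nm, which is why a shrink (or a heavy underclock) matters so much for a small console case.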



Zlejedi said:
demonfox13 said:
Hmmmm, so that is the possible GPU for the Wii U? That's kinda low. I am using a Sapphire Radeon HD 5830, which kinda blows that out of the water. However, it is powerful compared to the 5+ year old consoles... go figure, lol.

It shouldn't have any problems running current multiplatform games at 1920x1080 at 30 fps. Maybe even more once we factor in optimisations for console hardware.

If this is the Wii U's GPU, that's a healthy, acceptable minimum even if Sony/MS come out with some kind of monster.

That's the thing though: MS and Sony would just have to use something similar to my Sapphire Radeon HD 5830 (256-bit) and it would pretty much crush the old card. When I bought it a few weeks back on Newegg it was only $110 US. I don't doubt that the Wii U's GPU is a beast compared to current consoles, but I do think Nintendo could have pushed a little higher; the price difference wouldn't have been that dramatic, while the performance difference would have been. I am also aware that a console GPU tends to work a bit better than its PC counterpart, because it can focus on the game itself nearly 100%, whereas a PC also has to run a demanding OS in the background such as Windows 7 and so on.



Make games, not war (that goes for ridiculous fanboys)

I may be the next Maelstorm or not, you be the judge http://videogamesgrow.blogspot.com/  hopefully I can be more of an asset than a fanboy to VGC hehe.

ethomaz said:

There is no 45nm process for GPUs... if Nintendo doesn't use 32nm, it will be 40nm for sure (all Radeon HD 6xxx GPUs use 40nm).

And 2012 is ready for 32nm... AMD's Fusion (CPU+GPU) is 32nm and most Intel CPUs are 32nm too (Intel will show 22nm in 2012).

So I can't see Nintendo using the dated 55nm process... 40nm at the least.

I think the point was that, today, Nintendo (probably) could not have their CPU and GPU manufactured on a 32nm process in time for the systems on display to represent actual hardware...

While the Wii U's GPU will be based on the RV770, heavily modified, and (probably) manufactured on a 32nm process, it is highly plausible that the GPUs in the systems on the show floor were stock RV770s manufactured on a 55nm process and underclocked to fit into the tiny case that was on display.



HappySqurriel said:

I think the point was that, today, Nintendo (probably) could not have their CPU and GPU manufactured on a 32nm process in time for the systems on display to represent actual hardware...

While the Wii U's GPU will be based on the RV770, heavily modified, and (probably) manufactured on a 32nm process, it is highly plausible that the GPUs in the systems on the show floor were stock RV770s manufactured on a 55nm process and underclocked to fit into the tiny case that was on display.

While using a 32nm process would be ideal, you must remember that this is Nintendo. They always play it safe with their hardware, so they will probably use the tested and now reliable 40nm process.



Please excuse my bad English.

Currently gaming on a PC with an i5-4670k@stock (for now), 16Gb RAM 1600 MHz and a GTX 1070

Steam / Live / NNID : jonxiquet    Add me if you want, but I'm a single player gamer.

People are easily forgetting the fact that the Wii U has the ability to render two different complex scenes at the same time, as shown in this version of the tech demo with the sparrow and the falcon (the second part of the demo, which wasn't shown at the press conference, looks really nice):

http://www.youtube.com/watch?v=i2Nsa06KRLo

The controller view of the scene is different from that on the TV.

My guesstimate is that the Wii U will be at least 2x graphically stronger than the X360 and PS3. It won't be a mind-blowingly substantial leap, but neither will PS4 vs. PS3 be, compared to PS2 vs. PS1 or PS3 vs. PS2.

Edit: I totally suck at embedding video, so you lazy people will be forced to click on the link.
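For a rough sense of the extra load from driving two scenes at once, here is a quick pixel-count sketch. It assumes a 1280x720 TV image and an 854x480 controller screen (both assumptions on my part); fill is only one part of rendering cost, so this is an illustration, not a measurement.

```python
# Pixels per frame when rendering one scene vs. TV + controller scenes.
tv = 1280 * 720   # 720p TV image (assumed)
pad = 854 * 480   # controller screen resolution (assumed)

both = tv + pad
print(f"TV only: {tv:,} px")
print(f"TV + controller: {both:,} px ({both / tv:.2f}x the fill)")
```

So a second, different scene on the controller adds well under half the TV's pixel load again, which makes "two complex scenes at once" plausible without doubling the hardware.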




demonfox13 said:

That's the thing though: MS and Sony would just have to use something similar to my Sapphire Radeon HD 5830 (256-bit) and it would pretty much crush the old card. When I bought it a few weeks back on Newegg it was only $110 US. I don't doubt that the Wii U's GPU is a beast compared to current consoles, but I do think Nintendo could have pushed a little higher; the price difference wouldn't have been that dramatic, while the performance difference would have been. I am also aware that a console GPU tends to work a bit better than its PC counterpart, because it can focus on the game itself nearly 100%, whereas a PC also has to run a demanding OS in the background such as Windows 7 and so on.


I'm sorry, but the HD 5830 doesn't 'crush' a 4850/4890.

The 4000 and 5000 series are, from everything I've heard, pretty close.

The 5000 series has better tessellation support and runs more efficiently, but it's not really far off from the 4000 series.

Something like a 6950, on the other hand...
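A rough way to put numbers on that: theoretical single-precision peak for these parts is 2 ops (multiply-add) x shader count x clock. The shader counts and clocks below are the commonly listed reference specs (my assumption, not from this thread), and theoretical peak says little about real-game performance:

```python
# Theoretical peak GFLOPS = 2 ops (MAD) * shader ALUs * clock in GHz.
def peak_gflops(shaders: int, clock_mhz: int) -> float:
    return 2 * shaders * clock_mhz / 1000

cards = {
    "HD 4850": (800, 625),   # reference specs, assumed
    "HD 4890": (800, 850),
    "HD 5830": (1120, 800),
}
for name, (shaders, mhz) in cards.items():
    print(f"{name}: {peak_gflops(shaders, mhz):.0f} GFLOPS peak")
```

On paper that puts the 5830 about 30% ahead of the 4890: faster, but hardly a 'crush'.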



ethomaz said:

The Zelda HD demo runs at 30 fps (with a lot of frame drops) without AA... and not even in 1080p... an HD 4890 could do that while running Uncharted 2 at the same time.

I don't know if you were at E3 or not, but from what I've widely heard from attendees: yes, it was only 30 fps, but without any frame drops and most certainly running in 1080p...



JEMC said:

While using a 32nm process would be ideal, you must remember that this is Nintendo. They always play it safe with their hardware, so they will probably use the tested and now reliable 40nm process.

The Wii's CPU and GPU used the same 90nm process as the PS3 and Xbox 360, which was fairly cutting edge at the time; and the GameCube used a 180nm process, which was cutting edge when it was released.

Given that a smaller manufacturing process (typically) helps keep production costs down, and also helps make smaller and more energy-efficient systems, it is completely within Nintendo's usual strategy to use the most current mass-market manufacturing process available; and for the Wii U that will (probably) be a 32nm or smaller process.



archbrix said:
ethomaz said:

The Zelda HD demo runs at 30 fps (with a lot of frame drops) without AA... and not even in 1080p... an HD 4890 could do that while running Uncharted 2 at the same time.

I don't know if you were at E3 or not, but from what I've widely heard from attendees: yes, it was only 30 fps, but without any frame drops and most certainly running in 1080p...

The frame drops could be down to the video I watched (or my connection)... but the game is running at 720p upscaled to 1080p... the released screens show that.
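The gap between those two claims is bigger than it sounds: a quick pixel-count comparison shows why native 1080p costs so much more than upscaled 720p (pure arithmetic, nothing Wii U specific):

```python
# Pixel counts per frame at the two resolutions in question.
p720 = 1280 * 720    # rendered at 720p, then upscaled to fill the display
p1080 = 1920 * 1080  # rendered natively at 1080p

print(f"Native 1080p pushes {p1080 / p720:.2f}x the pixels of 720p")
```

So whether the demo was native 1080p or upscaled 720p is a 2.25x difference in rendered pixels, not a minor detail.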



archbrix said:
ethomaz said:

The Zelda HD demo runs at 30 fps (with a lot of frame drops) without AA... and not even in 1080p... an HD 4890 could do that while running Uncharted 2 at the same time.

I don't know if you were at E3 or not, but from what I've widely heard from attendees: yes, it was only 30 fps, but without any frame drops and most certainly running in 1080p...


Wait a minute, did ethomaz claim that the Radeon HD 4890 ran Uncharted 2 at 1080p?

I would like to see evidence for that claim ...