
Report: Wii U GPU uses R770

ghost_of_fazz said:

And the RV770 (on 55 nm) consumed between 110 W and 190 W of power, but that was determined only by the core speed (525 MHz being the lowest and 850 MHz the highest), being unrelated to the number of available shader processors.

Power is not a problem, because that new GPU will be made on 32 nm... the RV740 is on 40 nm and consumes way less than the RV770.
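As a rough illustration of why the process node matters: dynamic power scales roughly with C·V²·f, so a shrink cuts both capacitance and voltage. Here is a back-of-envelope sketch; the 160 W starting point and the per-shrink scaling factors are illustrative assumptions, not measured RV740/RV770 data:

```c
/* Back-of-envelope: dynamic power ~ C * V^2 * f, clock held constant.
   All numbers below are illustrative assumptions, not measurements. */
#include <stdio.h>

int main(void) {
    double p_55nm    = 160.0;  /* W, a mid-range RV770 board under load */
    double cap_scale = 0.75;   /* assumed capacitance reduction per shrink */
    double v_scale   = 0.90;   /* assumed supply-voltage reduction per shrink */

    double p_40nm = p_55nm * cap_scale * v_scale * v_scale;  /* ~97 W */
    double p_32nm = p_40nm * cap_scale * v_scale * v_scale;  /* ~59 W */

    printf("~%.0f W at 40 nm, ~%.0f W at 32 nm\n", p_40nm, p_32nm);
    return 0;
}
```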




zarx said:

 I think your power estimations are kinda off...

| Model          | Fab (nm) | Memory (MiB) | Core clock (MHz) | Memory clock (MHz) | Core config | Pixel (GP/s) | Texture (GT/s) | Bandwidth (GB/s) | RAM type |
| Radeon HD 4890 | 55       | 1024 / 2048  | 850              | 975                | 800:40:16   | 13.6         | 34             | 124.8            | GDDR5    |

| Model            | Fab (nm)     | Core clock (MHz) | Core config | Pixel (GP/s) | Texture (GT/s) | Memory (MiB)              | Memory clock (MHz)        | Bandwidth (GB/s)                             | RAM type      |
| Xenos (Xbox 360) | 90 / 65 / 45 | 500              | 48:16:8     | 4            | 8              | 512 (shared) + 10 (eDRAM) | 700 (GDDR3) / 500 (eDRAM) | 22.4; 32 (GPU – Logic); 256 (Logic – Memory) | GDDR3 + eDRAM |

Why? The 48 shader cores of Xenos are equivalent to about 200 or more of the HD 4890's shader cores... so the HD 4890 has at least 4x more shader power... of course there are other variables like clock, memory, TMUs and ROPs... but in the worst case the HD 4890 is 4x more powerful than Xenos (and maybe 8x in the best case).
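For reference, a quick sketch of the theoretical shader throughput behind that 4x-8x range, using the public specs from the tables above and the usual convention of 2 FLOPs per ALU per clock (one multiply-add):

```c
/* Theoretical shader throughput of Xenos vs. the HD 4890,
   using publicly listed specs and 2 FLOPs (one MADD) per ALU per clock. */
#include <stdio.h>

int main(void) {
    /* Xenos: 48 unified shaders, each a 4-wide vector + 1 scalar ALU, 500 MHz. */
    double xenos_gflops  = 48 * (4 + 1) * 2 * 0.500;  /* = 240 GFLOPS */

    /* HD 4890 (RV790): 800 stream processors at 850 MHz. */
    double hd4890_gflops = 800 * 2 * 0.850;           /* = 1360 GFLOPS */

    printf("Xenos:   %4.0f GFLOPS\n", xenos_gflops);
    printf("HD 4890: %4.0f GFLOPS\n", hd4890_gflops);
    printf("ratio:   %.1fx\n", hd4890_gflops / xenos_gflops);  /* ~5.7x */
    return 0;
}
```

That ~5.7x ratio sits inside the 4x-8x range claimed above, before the other variables (memory, TMUs, ROPs) are considered.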



PullusPardus said:
O_o?

Why not? Every new PC now uses a Radeon HD 5xxx, and the slightly pricier ones use the Radeon HD 6xxx series. GPUs aren't that expensive.

Read my answer to @Play4Fun.

Play4Fun said:

So, the reason why there is no way it's using a 4890 is because that would make it 4x more powerful than the 360's GPU? I don't follow that logic.

Anyway, even before the Wii U was revealed, rumours were pointing to it using an R770, and IGN said it was something like the 4850. So I am optimistic about this rumour.

A customized 4850/4890 would be cheap, would not run too hot, and with a nicely customized POWER7 CPU plus 1 GB to 1.5 GB of RAM would give a very nice leap over the PS360.

Did you see the Zelda HD demo for Wii U? There is no way that demo is running on an HD 48xx... that's my point... what Nintendo has shown so far is not running on a powerful high-end HD 4890... it's more like an HD 46xx or less.

Another point is what developers have said about the Wii U's power... none of them said anything to support an RV770-based GPU.

But of course I could be wrong... like with that rumor from Engadget.



ethomaz said:
Did you see the Zelda HD demo for Wii U? There is no way that demo is running on an HD 48xx... that's my point... what Nintendo has shown so far is not running on a powerful high-end HD 4890... it's more like an HD 46xx or less.

Another point is what developers have said about the Wii U's power... none of them said anything to support an RV770-based GPU.

But of course I could be wrong... like with that rumor from Engadget.


1. Not only did Nintendo say that demo was something quickly put together for E3, but how can you tell what something is running on just by looking at the game/demo?

2. Yeah, no one said anything concrete. We get one rumour telling us something one day and another telling us the opposite the next.

3. Well, since you seem to think you have the ability to tell what non-finalised GPU the tech demo was running on just by looking at it, I'm going to go ahead and say you're probably wrong. =p



Play4Fun said:

1. Not only did Nintendo say that demo was something quickly put together for E3, but how can you tell what something is running on just by looking at the game/demo?

Because on an HD 4890, even a bad tech demo made by me would run better than Zelda HD does... today we have the HD 6970... two generations ahead... but the HD 4890 is still high-end and more powerful than the mid-range HD 6xxx cards.

The Zelda HD demo runs at 30 fps (with a lot of frame drops), without AA... and not even in 1080p... an HD 4890 could do that while running Uncharted 2 at the same time.
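A quick pixel-throughput check on those numbers (assuming the demo ran at 720p, which the post implies; note that raw fill rate is a theoretical ceiling, and real games are usually shader-bound rather than fill-bound):

```c
/* Pixel throughput of the demo vs. a 1080p60 target vs. the HD 4890's
   raw fill rate. The 720p figure for the demo is an assumption. */
#include <stdio.h>

int main(void) {
    double demo_pix    = 1280.0 * 720 * 30;   /* assumed 720p at 30 fps */
    double target_pix  = 1920.0 * 1080 * 60;  /* 1080p at 60 fps */
    double hd4890_fill = 13.6e9;              /* 13.6 GP/s raw pixel fill */

    printf("demo:    ~%.0f Mpix/s\n", demo_pix / 1e6);    /* ~28  */
    printf("1080p60: ~%.0f Mpix/s\n", target_pix / 1e6);  /* ~124 */
    printf("4890 raw fill vs 1080p60: ~%.0fx headroom\n",
           hd4890_fill / target_pix);                     /* ~109x */
    return 0;
}
```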




Isn't it tough to compare console GPUs with PC GPUs?

My impression is that console GPUs are far better utilized, since the hardware is standardized, allowing developers to optimize the software.



ethomaz said:
Play4Fun said:

1. Not only did Nintendo say that demo was something quickly put together for E3, but how can you tell what something is running on just by looking at the game/demo?

Because on an HD 4890, even a bad tech demo made by me would run better than Zelda HD does... today we have the HD 6970... two generations ahead... but the HD 4890 is still high-end and more powerful than the mid-range HD 6xxx cards.

The Zelda HD demo runs at 30 fps (with a lot of frame drops), without AA... and not even in 1080p... an HD 4890 could do that while running Uncharted 2 at the same time.

I can run Wolfenstein 3D on my Nvidia 9800GT PC. That doesn't mean it can't run Crysis as well.

EDIT: Why is this entire thread in italics?



ethomaz said:
Play4Fun said:

1. Not only did Nintendo say that demo was something quickly put together for E3, but how can you tell what something is running on just by looking at the game/demo?

Because on an HD 4890, even a bad tech demo made by me would run better than Zelda HD does... today we have the HD 6970... two generations ahead... but the HD 4890 is still high-end and more powerful than the mid-range HD 6xxx cards.


The tech demo ran at 30 fps and seemed to be mainly highlighting the lighting, cloth physics and similar effects.

You can't tell what is or will be in the console just by looking at that demo, especially when the hardware is not yet finalized.

You know just as much as the rest of us.

I'm just going by the rumours that seem to be more consistent.



silicon said:
Isn't it tough to compare console GPUs with PC GPUs?

My impression is that console GPUs are far better utilized, since the hardware is standardized, allowing developers to optimize the software.

Indeed.

Console games are coded to the hardware, meaning that the game code "talks" directly to the hardware, allowing developers to fully use its power. Developers can do this because every 360/PS3/Wii has the same components.

PC games can't do this, as there is no standard (number of CPU cores, RAM, brand and family of GPU, etc.), so they are forced to use "translators" (DirectX, OpenGL, AMD/Nvidia drivers) that tell the hardware what to do based on the game code. This prevents PC games from fully exploiting the hardware.

That's why games like Uncharted or Gears look as good as PC games running on new GPUs even though they run on "obsolete" hardware.
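A minimal sketch of that "translator" idea (the register, bit values and driver layer below are all hypothetical, just to show the extra indirection a PC game goes through):

```c
/* Hypothetical sketch: direct-to-hardware console code vs. API-mediated
   PC code. The register and its bit layout are made up for illustration. */
#include <stdint.h>
#include <stdio.h>

static uint32_t fake_gpu_register;  /* stands in for a real MMIO register */

/* Console-style: fixed hardware, so the game writes the register directly. */
static void console_enable_blending(void) {
    fake_gpu_register |= 0x1u;      /* one store, no middleman */
}

/* PC-style: the game calls a generic API; a driver validates the request
   and translates it into whatever this particular GPU needs. */
static void driver_translate(uint32_t generic_cap) {
    if (generic_cap == 1u)          /* validate and remap the request */
        fake_gpu_register |= 0x1u;
}

static void pc_enable_blending(void) {
    driver_translate(1u);           /* extra indirection on every call */
}

int main(void) {
    console_enable_blending();
    pc_enable_blending();
    printf("register = 0x%x\n", fake_gpu_register);
    return 0;
}
```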



Please excuse my bad English.

Currently gaming on a PC with an i5-4670k @ stock (for now), 16 GB RAM at 1600 MHz and a GTX 1070

Steam / Live / NNID: jonxiquet. Add me if you want, but I'm a single player gamer.

JEMC said:
Indeed.

Console games are coded to the hardware, meaning that the game code "talks" directly to the hardware, allowing developers to fully use its power. Developers can do this because every 360/PS3/Wii has the same components.

PC games can't do this, as there is no standard (number of CPU cores, RAM, brand and family of GPU, etc.), so they are forced to use "translators" (DirectX, OpenGL, AMD/Nvidia drivers) that tell the hardware what to do based on the game code. This prevents PC games from fully exploiting the hardware.

That's why games like Uncharted or Gears look as good as PC games running on new GPUs even though they run on "obsolete" hardware.

You're right, but the consoles use "translators" too, just like PCs... an OpenGL ES-based API is used on the PS3 and a DirectX 9-based API on the X360.