
Forums - Nintendo Discussion - Wii U GPU Type CONFIRMED! Custom AMD E6760!

Captain_Tom said:
lilbroex said:
Captain_Tom said:

So it essentially has a partially nerfed HD 6570 GDDR5 graphics card.  Just in case anybody is wondering, that's a joke lol!   It can probably play games that look better than the current consoles, but not much.  At most, it will run games at 1080p with 30 fps, same settings as they have now.  In 720p, 60 FPS on med-high might be possible...


It's already doing better than that, according to the Toki Tori, BO2, and ZombiU devs.


How so?  OMG BO2 running 1080p 60 fps?  Don't make me laugh!  My $300 Netbook can run COD:BO above 720p.  COD is NOT a modern game.


It really depends on the detail settings of the game ...

At the highest detail settings you'd need a fairly powerful gaming PC to play at 1080p@60fps, at the lowest detail settings a netbook could probably run it at 720p@30fps; and I suspect the Wii U version has its settings somewhere between the two.




I do not understand the point of bringing the PC into a console comparison. It's like a last-ditch effort to deny gains when all else fails.



ECM said:
Captain_Tom said:
errorpwns said:
Lafiel said:

this chip would be a good step up from Xenos/RSX, but by today's graphics card offerings it's low end

for comparison, the E6760 is almost the same as the GPU on the Radeon HD 6670: http://www.amd.com/us/products/desktop/graphics/amd-radeon-hd-6000/hd-6670/Pages/amd-radeon-hd-6670-overview.aspx#2

both have 480 stream processors, but that one is clocked at 800MHz, so it does 768 GFLOPS, while the E6760 (at 600MHz) does 576 GFLOPS

the HD 6670 card is about as good as the HD 4850 in the comparison I found (it largely depends on the games used), but as I said the E6760 clocks 25% lower, so expect it to perform a bit worse than that
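The throughput figures quoted above follow from simple arithmetic: each of these VLIW stream processors can issue one multiply-add (two FLOPs) per cycle. A minimal sketch of that calculation (the function name and structure are illustrative, not from any post):

```python
def gflops(stream_processors: int, clock_mhz: int) -> float:
    """Theoretical single-precision throughput, assuming one
    multiply-add (2 FLOPs) per stream processor per cycle."""
    return stream_processors * 2 * clock_mhz / 1000

hd_6670 = gflops(480, clock_mhz=800)  # 768.0 GFLOPS
e6760 = gflops(480, clock_mhz=600)    # 576.0 GFLOPS
print(hd_6670, e6760)
print(f"E6760 clock deficit vs HD 6670: {1 - 600 / 800:.0%}")  # 25%
```

Theoretical GFLOPS scale linearly with clock speed here, which is why the 25% clock deficit translates directly into 576 vs 768.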

Apparently you missed the part where it would be custom and perform better than the counterparts.  That's cool though.


Custom what?  Usually that means worse.  We are using cards like the 6570 and 6670 to compare.  However, the actual card it is based on is weaker than those.  So best case scenario, your little optimization will still get you a $40 card barely able to play modern games well.  HAVE FUN!  LMFAO!

Uh, no, that is not what "custom" typically means.

Let me spell this out for you:

*Nintendo asks for a part that fits into their budget/power profile.
*Nintendo asks for xyz additions to the GPU's microcode (and, if within budget, the silicon) to handle things like, say, DX11-style effects.
*If those additions fit into their budget, they get added.

This is why using an older GPU actually makes sense: you get a *much* cheaper part which you can then modify the crap out of.

It's no different, at all, than people who buy a Honda Civic (instead of, say, a Porsche Boxster) and trick it out so that it destroys the Boxster for a fraction of the cost.

In your world, however, that custom Civic is actually a weaker car than the Boxster by virtue of being custom despite murdering it in the 0-60, 0-100, and 100-0.

(But I'm sure the Boxster has a better stereo, so that makes it a better car, right?)


aftermarket stereo sounds better too bro, at the same cost.



Can anyone actually post a quote saying BO2 runs in 1080p? From what I can recall of the show, he said something like "for the first time ever on a Nintendo console it will run in true HD", and I doubt he meant full 1080p by that.



TheBardsSong said:
Can anyone actually post a quote saying BO2 runs in 1080p? From what I can recall of the show, he said something like "for the first time ever on a Nintendo console it will run in true HD", and I doubt he meant full 1080p by that.

well, and there is also the issue that 1080p output doesn't have to mean 1080p native resolution, but I don't really want to go deeply into that discussion - we will see all of that soon enough
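The output-vs-native distinction comes down to pixel counts: a game can render internally at a lower native resolution and still be upscaled to a 1080p output signal. A quick sketch using the standard 720p/1080p dimensions:

```python
def pixels(width: int, height: int) -> int:
    """Number of pixels the GPU actually has to shade per frame."""
    return width * height

native_720p = pixels(1280, 720)    # 921,600 pixels
full_1080p = pixels(1920, 1080)    # 2,073,600 pixels

# Rendering at a true 1080p native resolution is 2.25x the shading
# work of 720p, even if both end up output over HDMI as "1080p".
print(full_1080p / native_720p)    # 2.25
```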




If true, that would mean the GPU alone would be more powerful than the 360 and PS3 combined.
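On paper that claim can be sanity-checked against commonly cited theoretical shader-throughput figures; treat the numbers below as rough, often-quoted approximations (assumptions, not measurements):

```python
# Approximate, commonly quoted theoretical shader GFLOPS figures:
GFLOPS = {
    "E6760 (rumored Wii U GPU)": 576,
    "Xenos (Xbox 360)": 240,
    "RSX (PS3)": 192,
}

combined = GFLOPS["Xenos (Xbox 360)"] + GFLOPS["RSX (PS3)"]
print(combined)                                        # 432
print(GFLOPS["E6760 (rumored Wii U GPU)"] > combined)  # True, on paper
```

Of course, theoretical GFLOPS ignore architecture, memory bandwidth, and the rest of the system, so this is a paper comparison only.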



S.Peelman said:
Mazty said:
S.Peelman said:

Yes, well, if you'd want to take the term literally. The PS4/720 will be a supercomputer from '97. What's the point?

This was probably meant as an example only.

An example that was horribly, horribly wrong?

Why use the term "supercomputer" for any reason other than literally? It was wrong, and people talking about "supercomputers" have no idea what they are talking about. Do they mean gaming PCs?

I could agree that there was probably a better word for it. And yes, personally I took it as meaning high-end PCs.

Okay, the thing is though that the GPU is weaker than an 8800 Ultra, a top-end 2006 card, and it certainly won't have the same low-latency RAM, OC'd Q9950, or RAID 0 HDDs. So the Wii U is like an entry-level enthusiast gaming PC from '06, rather than a top-end one.

And if we are talking about gaming circa '06/07, the real question is: Can it play Crysis?



lilbroex said:
I do not understand the point of bringing the PC into a console comparison. It's like a last-ditch effort to deny gains when all else fails.


Or it's a very easy way of determining a console's technical capabilities, as the hardware is already out there in the PC market and usually has been for some time... In fact, when this wasn't the case with the PS3, its technical capabilities remained quite unknown for a very long while.

Your comment was merely a last-ditch effort to retain some excitement for an already dated console, I do believe?



I'm sure it could be used to determine gains but I have never seen it brought in for that purpose.

It's always brought in to downplay or dismiss console performance.



Nsanity said:
If true, that would mean the GPU alone would be more powerful than the 360 and PS3 combined.

2 HDDs... loads of cores, 2 GPUs... it really, really wouldn't.