
Forums - Nintendo Discussion - Wii U GPU Type CONFIRMED! Custom AMD E6760!

Yeah, I remember when the GC's full specs were revealed, but Nintendo didn't give them much importance; they only released them because it was the norm back then.



DieAppleDie said:
The SNES, Nintendo 64 and Nintendo GameCube were top-notch consoles, technology-wise, that could push the best graphics available at the moment of their launch.


That is not true for the SNES. There were a few other systems that weren't as popular in the U.S. but outperformed it in its time frame.

The N64 and GC were the most powerful overall in their eras, and that is also when Nintendo saw its lowest sales. Nintendo learned from its mistakes.

Their philosophy now is that specs are not one of the most important things.



haxxiy said:
Lafiel said:
Captain_Tom said:
TheSource said:

This is what Nintendo has always done really save for Wii.

NES was a super computer...by 1982 standards when it came out in 1985.

SNES was a super computer...by 1989 standards when it came out in 1991.

N64 was a super computer...by 1994 standards when it came out in 1996.

GC was a super computer...by 1999 standards when it came out in 2001.

Wii was a super computer...by 2000 standards when it came out in 2006.

Wii U is a super computer...by say, 2009 standards when it comes out in 2012.

Nintendo's model assumes profitable hardware + mass consumer adoption + strong internal games = massive profit. The Wii came at the most difficult time in Nintendo's history, so they spent less on hardware than usual.

Wrong!  The Wii U is a super computer by 2007 standards.  I could build you a PC now for $400 that easily performs better than the Wii U.

Hm, yeah, the G92 (8800 GTS 512) released in December 2007, and the HD 6670 is about on par with it in power.

I like how the definition of super computer seems to be whatever a measly $800 or so could buy.


I just assumed he meant a high-end gaming PC.  There's no way he could possibly mean a $2000, future-proof-for-5+-years mega PC... How could someone be that stupid?



lilbroex said:

There is a difference between numerical performance comparisons and graphical results. Everyone is factoring "out" the fact that it uses modern technologies that are simply impossible on the older hardware components that they are trying to retroactively compare it to.

The numerical on-paper performance and what the GPU can actually output on screen are two completely different things. It's like saying that because two boxes of tools (one filled with old wooden tools, the other with modern stainless steel tools) and supplies weigh the same, the potential of the wooden tools is the limit of the steel tools.

Showing numbers is one thing. What is done with those numbers is another. You can't do tessellation on a 7800 GT or 8800 GTS, or use Shader Model 5.0 features.

 

Yeah, lol, because a nerfed 6570 will tessellate like a beast!  LOL, you do realize that a $30 6450 can run all those things you said, right?  But is it playable?  NO!



errorpwns said:
Lafiel said:

This chip would be a good step up from Xenos/RSX, but by today's graphics card offerings it's low end.

for comparison, the E6760 is almost the same as the GPU on the Radeon HD 6670: http://www.amd.com/us/products/desktop/graphics/amd-radeon-hd-6000/hd-6670/Pages/amd-radeon-hd-6670-overview.aspx#2

Both have 480 stream processors, but that one is clocked at 800 MHz, so it does 768 GFLOPS, while the E6760 (at 600 MHz) does 576 GFLOPS.

The HD 6670 card is about as good as the HD 4850 in the comparisons I found (it largely depends on the games used), but as I said, the E6760 is clocked 25% lower, so expect it to perform a bit below that.
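
For anyone who wants to sanity-check those GFLOPS figures, here's a quick back-of-the-envelope sketch in Python. It assumes 2 FLOPs per stream processor per clock (one multiply-add), which is the usual way peak throughput is quoted for these AMD parts and is how the 768/576 numbers above fall out:

# Rough theoretical peak for these AMD GPUs.
# Assumption: 2 FLOPs (one multiply-add) per stream processor per clock.
def peak_gflops(stream_processors, clock_mhz, flops_per_sp_per_clock=2):
    # Peak = SPs x clock (MHz) x FLOPs per SP per clock; /1000 converts MFLOPS to GFLOPS.
    return stream_processors * clock_mhz * flops_per_sp_per_clock / 1000.0

print(peak_gflops(480, 800))  # Radeon HD 6670: 768.0 GFLOPS
print(peak_gflops(480, 600))  # E6760: 576.0 GFLOPS (600/800 = 75% of the clock, i.e. 25% lower)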

Apparently you missed the part where it would be custom and perform better than the counterparts.  That's cool though.


Custom what?  Usually that means worse.  We are using cards like the 6570 and 6670 to compare, but the actual card it is based on is weaker than those.  So best-case scenario, your little optimization will still get you a $40 card that's barely able to play modern games well.  HAVE FUN!  LMFAO!



Captain_Tom said:

Wrong!  The Wii U is a super computer by 2007 standards.  I could build you a PC now for $400 that easily performs better than the Wii U.


You're the wrong one here.  I bet the only thing you can think of is "lul, look at me, I'm so cool, I can find 8 GB of RAM for $40, that's 4x as much as the Wii U has, and I can find a better processor for under $100," yada yada yada.  You're an idiot if you think that a PC with the same specs as a console could out-perform it in gaming.



Captain_Tom said:
lilbroex said:

There is a difference between numerical performance comparisons and graphical results. Everyone is factoring "out" the fact that it uses modern technologies that are simply impossible on the older hardware components that they are trying to retroactively compare it to.

The numerical on-paper performance and what the GPU can actually output on screen are two completely different things. It's like saying that because two boxes of tools (one filled with old wooden tools, the other with modern stainless steel tools) and supplies weigh the same, the potential of the wooden tools is the limit of the steel tools.

Showing numbers is one thing. What is done with those numbers is another. You can't do tessellation on a 7800 GT or 8800 GTS, or use Shader Model 5.0 features.

 

Yeah, lol, because a nerfed 6570 will tessellate like a beast!  LOL, you do realize that a $30 6450 can run all those things you said, right?  But is it playable?  NO!


It sure couldn't run those things that well. It's a good thing we aren't talking about that GPU, but one that is stronger and "can" run all of those things.



Mazty said:

An example that was horribly, horribly wrong?

Why use the term "super computer" for any reason other than literally? It was wrong and people talking about "supercomputers" have no idea what they are talking about. Do they mean gaming PCs?? 


While it's obvious in context that TheSource did not mean a supercomputer in the sense most researchers think of it, since that's my field I thought the exact same thing as Mazty, so I'm going to agree with him.  It was a poor term to use unless you have an extremely low bar for what you'd call a supercomputer.  We have a five-year-old cluster at my department that I use to run theoretical chemistry calculations; I'd be quite impressed if any console could do anywhere near what it can do.



...

As long as it's 1080p and has nice sexy smooth textures, framerate, etc. I don't think anyone will complain.



Captain_Tom said:
errorpwns said:
Lafiel said:

This chip would be a good step up from Xenos/RSX, but by today's graphics card offerings it's low end.

for comparison, the E6760 is almost the same as the GPU on the Radeon HD 6670: http://www.amd.com/us/products/desktop/graphics/amd-radeon-hd-6000/hd-6670/Pages/amd-radeon-hd-6670-overview.aspx#2

Both have 480 stream processors, but that one is clocked at 800 MHz, so it does 768 GFLOPS, while the E6760 (at 600 MHz) does 576 GFLOPS.

The HD 6670 card is about as good as the HD 4850 in the comparisons I found (it largely depends on the games used), but as I said, the E6760 is clocked 25% lower, so expect it to perform a bit below that.

Apparently you missed the part where it would be custom and perform better than the counterparts.  That's cool though.


Custom what?  Usually that means worse.  We are using cards like the 6570 and 6670 to compare, but the actual card it is based on is weaker than those.  So best-case scenario, your little optimization will still get you a $40 card that's barely able to play modern games well.  HAVE FUN!  LMFAO!

Uh, no, that is not what "custom" typically means.

Let me spell this out for you:

*Nintendo asks for a part that fits into their budget/power profile.
*Nintendo asks for xyz additions to the GPU's microcode (and, if within budget, the silicon) to handle things like, say, DX11-style effects.
*If those additions fit into their budget, they get added.

This is why using an older GPU actually makes sense: you get a *much* cheaper part which you can then modify the crap out of.

It's no different, at all, from people who buy a Honda Civic (instead of, say, a Porsche Boxster) and trick it out so that it destroys the Boxster for a fraction of the cost.

In your world, however, that custom Civic is actually a weaker car than the Boxster by virtue of being custom despite murdering it in the 0-60, 0-100, and 100-0.

(But I'm sure the Boxster has a better stereo, so that makes it a better car, right?)