
Wii U GPU Type CONFIRMED! Custom AMD E6760!

Mazty said:
Kasz216 said:
Mazty said:

I've never heard the term supercomputer used "off-brand", because using it wrong is like calling a cottage a mansion. Plus, as I've stated, the consoles weren't even close to being decent gaming PCs with the dates used - the comment was completely wrong no matter how liberally you look at it.

In short, if you expect the power of a gaming PC from '09 from the Wii U, you will be very, very disappointed. It's going to be more like an OC'd 8800GT with an OC'd dual core circa '06 - not super in any way and certainly not suitable for the price tag.

I've heard people refer to houses as mansions all the time, even when they aren't classified as mansions.

Also, that's setting aside the fact that consoles run better than PCs with similar hardware, because there are fewer constraints on the system - it's "dedicated".

I'd point out that he said "Let's say 09", as in he wasn't exactly sure of the date... making your complaints even more pedantic in nature.

If you want to argue the year is off, sure... but the way in which you are doing it, trying to say he meant the kind of thing that's used to crunch the physics of a black hole, is, well... extremely pedantic... and if you can't see how, I don't know what to tell you.

Additionally, care to post the average high-end computer from 2009? I mean, an ad showing what actually sold in stores as high-end? So far you haven't backed anything up, beyond mentioning parts that may or may not have been standard in a higher-end computer at that time.

Seems like you're intentionally arguing a completely different point.

Other people and sites seem to peg the GPU at 2009 like the Source did.

http://www.cinemablend.com/games/Wii-U-Seems-Too-Expensive-Specs-46908.html


Cottage... not house, but moving on: 3 years is hardly being pedantic considering Moore's Law. If we were talking about, say, glacial movement, 3 years would be pedantic. If we were talking about where someone was on the night of a crime, minute-level accuracy would be critical. So, as long as time is relevant to whatever is in question, in this instance 3 years is significantly off the mark. By 2009 a high-end gaming machine (Nvidia build) would be using a GTX 295 and an OC'd i7 with 16GB of DDR3, maybe a small OS SSD as well. No one buys high-end gaming PCs - or no serious gamer would - as it's much better to build them.
That '2009' GPU was a $100 GPU, showing that it wasn't even an enthusiast GPU in 2009, and in fact it came out in '08. It was by no means a high-end gaming card - more like a budget card - and ATI were trailing behind Nvidia in that chip generation.

If the 6760 is being used, then the Wii U is going to be very much like a high-end gaming PC from '06. I imagine that if MS or Sony release anything next year, those consoles may be far better value for money in terms of hardware. The Wii U is considerably weaker than my 18-month-old GTX 560 Ti, so to pay upwards of $300 for a lot less performance? Ouch, I'll take a rain check and see what 2013 holds.
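
As a quick sanity check on the Moore's Law point above, the implied gap between '06 and '09 hardware can be put in numbers (a minimal sketch in Python, assuming the textbook 24-month doubling period rather than any measured GPU figures):

    # Rule-of-thumb Moore's Law growth: transistor budgets double
    # roughly every 24 months (an assumption, not a measurement).
    DOUBLING_PERIOD_MONTHS = 24

    def moores_law_factor(months: int) -> float:
        """Expected transistor-count growth factor over `months`."""
        return 2 ** (months / DOUBLING_PERIOD_MONTHS)

    # The gap being argued over: 2006 vs 2009 hardware.
    print(f"{moores_law_factor(36):.2f}x")  # ~2.83x

Under that rule of thumb, '06 and '09 hardware sit nearly 3x apart in transistor budget, which is why the three-year distinction is worth arguing about at all.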

And you're saying HE is talking in stupid hyperbole.

If you want to talk years: the latest 6760 was released in 2011... and Moore's law hasn't applied for some time now.

And he said it was a $100 GPU NOW... that wasn't the price in 2009. He's using the price to tally up what the Wii U's hardware costs to make today. So according to you on pricing, that makes it a budget/middle-of-the-road card today. Exactly Source's point.

That you could make such a reading mistake, and that a $100 price point in '09 seemed credible to you, REALLY puts your knowledge on the subject in question.

As for paying more for less performance? That's every console ever when it comes to tech. Consoles always release with behind-the-curve processors and graphics cards. That's why if you actually care about shit like this it's PC or nothing.

They rely on low system requirements for the OS and such. I mean hell, why do you think most consoles have about as much RAM as a computer from 2002?



Kasz216 said:

And you're saying HE is talking in stupid hyperbole.

If you want to talk years: the latest 6760 was released in 2011... and Moore's law hasn't applied for some time now.

And he said it was a $100 GPU NOW... that wasn't the price in 2009. He's using the price to tally up what the Wii U's hardware costs to make today. So according to you on pricing, that makes it a budget/middle-of-the-road card today. Exactly Source's point.

That you could make such a reading mistake, and that a $100 price point in '09 seemed credible to you, REALLY puts your knowledge on the subject in question.

As for paying more for less performance? That's every console ever when it comes to tech. Consoles always release with behind-the-curve processors and graphics cards. That's why if you actually care about shit like this it's PC or nothing.

They rely on low system requirements for the OS and such. I mean hell, why do you think most consoles have about as much RAM as a computer from 2002?

Is that opening comment not just as inflammatory as "HAVE FUN! LMFAO!"?
I didn't say stupid hyperbole, I said he was completely wrong no matter what way you look at it. Let's be mature about this and not warp words. 

Actually Moore's law is doing just fine:
http://en.wikipedia.org/wiki/File:Transistor_Count_and_Moore%27s_Law_-_2011.svg
And it's still going to be around for a while:
http://www.theregister.co.uk/2012/07/10/intel_asml_deal/

You're clearly not a PC gamer, nor do you keep up to date with PC hardware, and yet you're trying to argue the value of components? If you've ever seen a 4650 it's instantly clear that it's not a gaming card - it's that small - and certainly not worth $100, unless you are going to seriously argue that the HD4650, a low-end card from '08, is on par with modern-day $100 cards like the HD7750. He didn't say NOW, and people who write on websites aren't always right, as shown by him getting the card's year wrong. It's not exactly polite, nor pleasant, to argue about things so vehemently when you're guessing at most of it.
Consoles are generally behind computer hardware, but not by this much, and certainly not at this stage of the existing console generation. With news on this GPU and the next consoles slated for release in 2013, it does beg the question: why bother with the Wii U?



MDMAlliance said:
Mazty said:
Play4Fun said:

I'm sorry, I don't work at AMD.

It's a claim they put up on the features and specifications sheet of the GPU.

You say you know technology progresses, yet you seem to have difficulty believing a 2011 GPU could actually be more powerful than a 2008 GPU while consuming less power.


Link to that sheet then? 

 

Chandler said:
Seriously, who the hell cares, Mazty? Are you on a mission or something to convert all those dirty Nintendo peasants? If the Wii U has an E6760 GPGPU I am more than happy. You'll just have to deal with the fact that not everyone wants a fucking midi tower beside their TV. When I unboxed my PS3 slim earlier this year my eyes fell out, and I literally triple-checked the box to see if it really was the "slim" version.

There might very well have been GPUs in 2006 that could rival the E6760 IN POWER, but none as cost-efficient or with the same power consumption. So apples to oranges, I guess.

 

I'm guessing customers care about what they are buying, so they'd like to know whether they're getting something that's worth $300 and not, say, $100.

It's not really apples and oranges - it's a case of Nintendo selling a next-gen console with very old tech in it... so it's not really next gen, is it? If that's okay with you then fine, but just realise what you are actually buying (so no omg amazing graphics/value for money/etc threads plz).

 


Are you assuming people only care about buying the most up-to-date technology? The vast majority of stuff you buy is not "up-to-date." Consoles stay relevant for a good half decade or so, I would say, and according to you tech becomes "very old" within a year or two.

Exactly - half a decade per generation, and the Wii U is joining this generation when it has already been around for 5-6 years. Where did I say tech becomes "very old" within a year or two?



Mazty said:
MDMAlliance said:

Are you assuming people only care about buying the most up-to-date technology? The vast majority of stuff you buy is not "up-to-date." Consoles stay relevant for a good half decade or so, I would say, and according to you tech becomes "very old" within a year or two.

Exactly - half a decade per generation, and the Wii U is joining this generation when it has already been around for 5-6 years. Where did I say tech becomes "very old" within a year or two?

If we assume that this rumor is true - that it does use a custom AMD E6760 - and you consider that "old" tech, then you are implying it. The AMD E6760 is 2011 tech.



MDMAlliance said:
Mazty said:
MDMAlliance said:

Are you assuming people only care about buying the most up-to-date technology? The vast majority of stuff you buy is not "up-to-date." Consoles stay relevant for a good half decade or so, I would say, and according to you tech becomes "very old" within a year or two.

Exactly - half a decade per generation, and the Wii U is joining this generation when it has already been around for 5-6 years. Where did I say tech becomes "very old" within a year or two?

If we assume that this rumor is true - that it does use a custom AMD E6760 - and you consider that "old" tech, then you are implying it. The AMD E6760 is 2011 tech.


The last time the E6760 could have been considered a viable gaming card was back in '06, as it's like an OC'd 8800GT... in 2011 it's nothing more than a basic graphics card. It certainly would not be considered a gaming card in '11.
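
The "like an OC'd 8800GT" claim can at least be roughly bounded with spec-sheet peak-throughput numbers (a sketch only; the shader counts and clocks below are the commonly cited figures for each part, and peak GFLOPS across different shader architectures is only loosely comparable):

    # Peak single-precision throughput from commonly cited spec-sheet
    # figures (assumed, not benchmarked here).
    def peak_gflops(shaders: int, clock_mhz: float, flops_per_clock: int = 2) -> float:
        # flops_per_clock = 2 counts one multiply-add per shader per cycle.
        return shaders * clock_mhz * flops_per_clock / 1000.0

    gt8800 = peak_gflops(112, 1500)  # 8800GT: 112 SPs at 1500 MHz -> ~336 GFLOPS
    e6760 = peak_gflops(480, 600)    # E6760: 480 SPs at 600 MHz   -> ~576 GFLOPS
    print(f"E6760/8800GT on paper: {e6760 / gt8800:.1f}x")  # ~1.7x

On paper the E6760 carries roughly 1.7x the peak FLOPS, though VLIW5 utilization in real games runs lower, which is why the two parts land in the same rough performance class despite the gap.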



Mazty said:
MDMAlliance said:
Mazty said:
MDMAlliance said:

Are you assuming people only care about buying the most up-to-date technology? The vast majority of stuff you buy is not "up-to-date." Consoles stay relevant for a good half decade or so, I would say, and according to you tech becomes "very old" within a year or two.

Exactly - half a decade per generation, and the Wii U is joining this generation when it has already been around for 5-6 years. Where did I say tech becomes "very old" within a year or two?

If we assume that this rumor is true - that it does use a custom AMD E6760 - and you consider that "old" tech, then you are implying it. The AMD E6760 is 2011 tech.


The last time the E6760 could have been considered a viable gaming card was back in '06, as it's like an OC'd 8800GT... in 2011 it's nothing more than a basic graphics card. It certainly would not be considered a gaming card in '11.


It's an embedded graphics card, and it seems like you don't hold a realistic view of what the term "old" means. By your logic, any computer on the market is old technology no matter how good it is, because a real supercomputer has it beat in every way, every time. Your logic fails, ultimately. A gaming-dedicated console can do much more with much less, because its hardware is optimized for gaming and graphics. I don't know if you have any idea what you're talking about.



MDMAlliance said:


It's an embedded graphics card, and it seems like you don't hold a realistic view of what the term "old" means. By your logic, any computer on the market is old technology no matter how good it is, because a real supercomputer has it beat in every way, every time. Your logic fails, ultimately. A gaming-dedicated console can do much more with much less, because its hardware is optimized for gaming and graphics. I don't know if you have any idea what you're talking about.


What are you on about with supercomputers? It's a very, very weak graphics card compared to cards from '09, let alone nowadays, and yet Nintendo are asking $300 for the console. I know exactly what I am talking about; do you? If we consider the power of that GPU, there is no way you'll be able to play at 60fps in true 1080p with good graphics. Note I said good, not cutting-edge.
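
For a sense of scale on the "60fps true 1080p" claim, the pixel-throughput arithmetic looks like this (a minimal sketch; 576 GFLOPS is AMD's published peak for the E6760, and real shader utilization falls well short of peak):

    # Shader budget per pixel at 1080p60, peak-rate arithmetic only.
    WIDTH, HEIGHT, FPS = 1920, 1080, 60
    PEAK_GFLOPS = 576.0  # published E6760 peak (assumed here)

    pixels_per_second = WIDTH * HEIGHT * FPS               # 124,416,000
    flops_per_pixel = PEAK_GFLOPS * 1e9 / pixels_per_second

    print(f"{pixels_per_second / 1e6:.1f} Mpixels/s")      # ~124.4
    print(f"~{flops_per_pixel:.0f} peak FLOPs per pixel")  # ~4630

Roughly 4,600 peak FLOPs per pixel is enough for modest shading, but deferred lighting, post-processing and anti-aliasing eat into it quickly, which is the crux of the disagreement over what "good graphics" at 1080p60 would mean on this part.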



Mazty said:
MDMAlliance said:


It's an embedded graphics card, and it seems like you don't hold a realistic view of what the term "old" means. By your logic, any computer on the market is old technology no matter how good it is, because a real supercomputer has it beat in every way, every time. Your logic fails, ultimately. A gaming-dedicated console can do much more with much less, because its hardware is optimized for gaming and graphics. I don't know if you have any idea what you're talking about.


What are you on about with supercomputers? It's a very, very weak graphics card compared to cards from '09, let alone nowadays, and yet Nintendo are asking $300 for the console. I know exactly what I am talking about; do you? If we consider the power of that GPU, there is no way you'll be able to play at 60fps in true 1080p with good graphics. Note I said good, not cutting-edge.


You didn't provide any real information at all with this post. If you're going to try to refute my point, give real information rather than vague, subjective claims that you think are correct.



MDMAlliance said:
Mazty said:
MDMAlliance said:


It's an embedded graphics card, and it seems like you don't hold a realistic view of what the term "old" means. By your logic, any computer on the market is old technology no matter how good it is, because a real supercomputer has it beat in every way, every time. Your logic fails, ultimately. A gaming-dedicated console can do much more with much less, because its hardware is optimized for gaming and graphics. I don't know if you have any idea what you're talking about.


What are you on about with supercomputers? It's a very, very weak graphics card compared to cards from '09, let alone nowadays, and yet Nintendo are asking $300 for the console. I know exactly what I am talking about; do you? If we consider the power of that GPU, there is no way you'll be able to play at 60fps in true 1080p with good graphics. Note I said good, not cutting-edge.


You didn't provide any real information at all with this post. If you're going to try to refute my point, give real information rather than vague, subjective claims that you think are correct.

If you had any knowledge of GPUs I wouldn't be having this discussion... The 8800GT is from '06 and nowadays it sucks, so go figure.

To summarise: I know what I'm talking about because I'm experienced with GPUs; you clearly aren't, and I'm not about to become your mentor, so this discussion is as good as over.



HoloDust said:
Baron said:
HoloDust said:
Play4Fun said:

The e6760 is 35 watts on 40nm and performs slightly better than an HD 4850.


Actually, the 4850 is some 1.4x more powerful than the e6760 - you should not compare the e6760 with the 6670, but with a slightly downclocked 6570 (or a 6650M, but with GDDR5).


How did you arrive at 1.4 times more powerful? I can't find a 4850 vs e6760 comparison anywhere. All we have is the 3DMark Vantage score of the e6760 with an AMD Athlon II X4 620, which is 5870. There are no game benchmarks of the e6760 anywhere.

The best comparison I could find is the 4850's Vantage score of 6805. But that's with a more powerful CPU, the Intel Core 2 Extreme QX9650.

Anyway, that is nowhere near 1.4x 5870. And factoring in that the QX9650 is roughly 50% more powerful than the 620, I'd say the 4850 is actually weaker than the e6760.

Go to www.3Dmark.com and under Results/Advanced Search put Athlon X4 620 into the CPU field and 4850 into the GPU field (then, when results show up, select number of GPUs: 1) - you will see that P scores are 9500+ for that combo, which is actually some 1.6x the e6760.

EDIT: If you compare pure GPU scores (taking into account that the e6760 is a slightly downclocked 6570) it's even worse - the difference is 2x.
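
The ratios in this exchange can be reproduced directly from the scores the posters quote (a sketch; the scores are as cited above and unverified, and Vantage P-scores blend CPU and GPU tests, so GPU-only conclusions from them are shaky):

    # Vantage P-scores as quoted in the thread (not independently verified).
    E6760_P = 5870     # e6760 + Athlon II X4 620
    HD4850_QX = 6805   # HD 4850 + Core 2 Extreme QX9650 (stronger CPU)
    HD4850_X4 = 9500   # HD 4850 + Athlon X4 620, per the 3dmark.com search

    print(f"4850 (QX9650) vs e6760: {HD4850_QX / E6760_P:.2f}x")  # ~1.16x
    print(f"4850 (X4 620) vs e6760: {HD4850_X4 / E6760_P:.2f}x")  # ~1.62x

Matching the CPU (the X4 620 rows) is the fairer like-for-like comparison, which is where the ~1.6x figure comes from - though the reply below notes those 9500+ results include overclocked rigs, so even that set needs filtering.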


Those are overclocked results. The lowest score is 5587, lower than the 5870 for the standard e6760.