
NeoGAF: Wii U GPU 160 ALUs, 8 TMUs and 8 ROPs

Teriol said:
Daisuke72 said:
snowdog said:
Daisuke72 said:
snowdog said:
He was wrong. More than 3 times more cache for the GPU, 3 times more cache for the CPU, a DSP, an out-of-order execution CPU, a CPU with half the stages in its pipeline, a DX11-equivalent feature set and 4 times the RAM all say he was wrong.

And if that isn't enough, then Sonic & All-Stars Racing Transformed, Need for Speed: Most Wanted, the gimping of Rayman Legends (removing half of those black things from the swarm so that it would run on the PS3 and 360), Pikmin 3, The Wonderful 101, Super Mario 3D World, Bayonetta 2, X, Mario Kart 8 and SSBU all say he's wrong.


He always said the Wii U had the advantage in RAM and overall efficiency. So? At the end of the day the Wii U is closer to current gen than it is to next gen, and it's a lot closer to current, which was his initial point, which you tried arguing against and still are. 



He was wrong. It's as simple as that. Being somewhere between the 360 and PS4 in terms of power isn't 'on par' with the 360...unless you change the meaning of the words 'on par'. The Wii U is around 3-4 times more powerful than the 360, but that doesn't for a second mean that games will be 3-4 times better looking.

We've already seen the PS4 and One struggling to run games at a higher resolution than 720p, and the difference in power between the 360 and the PS4/One is a great deal bigger.
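
To put rough numbers on the resolution side of that, here is a back-of-the-envelope sketch in Python; the resolutions are just the common targets being argued about here, and real frame cost doesn't scale purely with pixel count, so treat this as illustrative only:

# Pixel counts for the resolutions under discussion: going from 720p to 1080p
# alone costs 2.25x the pixel work before anything else about the image improves.
RESOLUTIONS = {
    "720p":  (1280, 720),
    "900p":  (1600, 900),
    "1080p": (1920, 1080),
}

base = 1280 * 720
for name, (w, h) in RESOLUTIONS.items():
    print(f"{name}: {w * h:>9,} pixels ({w * h / base:.2f}x 720p)")

So a console "3-4 times more powerful" can spend most of that headroom on resolution and framerate before games ever look dramatically better.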

Four times as powerful? Not even close, but keep telling yourself that. I would go in-depth, but last time we had this discussion you pretty much speculated on everything. Hell, every reply started with something like:

 

"I think..."

 

"Nintendo has complex..."

 

But like I said, just putting it out there, Ninjablade was right. Ninjablade won. RIP my nigga Ninja. 

Are you Ninjablade?... I see you created a new profile!!! ;)

Why are you still here derailing the thread? 



Daisuke72 said:

Four times as powerful? Not even close, but keep telling yourself that. I would go in-depth, but last time we had this discussion you pretty much speculated on everything. Hell, every reply started with something like:

 

"I think..."

 

"Nintendo has complex..."

 

But like I said, just putting it out there, Ninjablade was right. Ninjablade won. RIP my nigga Ninja. 

Ninjablade was wrong time and time again, he was banned something like six times, he repeatedly tried to pass off rumour as fact, and he tried to talk tech while clearly not having a clue what he was talking about. The guy was a laughing stock and still is.



With any discussion about the Latte fabrication you have to bear in mind the huge amount of stuff included in Latte: not only is a huge part of it the eDRAM, but there is also the ARM CPU, the audio DSP, 1MB of Wii U texture memory, 2MB of Wii frame buffer, the Wii GPU, and sections designed for high-speed compression of the Wii U GPU frame buffer, downscaled to fit in the Wii GPU frame buffer. There is a huge amount of additional stuff.

The 176 GFLOPS figure is completely realistic: not only does it fit in with the power consumption figures, but it fits in with how the Wii U is actually performing. Let's not forget the generational difference in performance between the 360/PS3 GPUs and the later Radeons is quite significant.

If Latte really is 352 GFLOPS and the architecture is much improved, plus you have 32MB of high-speed eDRAM, then what the hell is going wrong that the Wii U is struggling to outperform the 360/PS3 graphically? The figure of 176 GFLOPS makes total sense; it simply works with all the information we have. If Latte really is 352 GFLOPS then something has gone horribly wrong in the design of the Wii U that is creating major issues. I don't believe this. I believe Nintendo have designed a console at the absolute minimum price to merely match current-gen performance overall. However, as a Wii U owner I'm more than happy to be proved wrong, but the evidence surely dictates that, of the possible range of 176-352 GFLOPS, where once we were clinging on to believing it was 352 GFLOPS, in fact the lower figure is much more realistic.
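
For reference, both figures fall straight out of the usual peak-FLOPS arithmetic. A minimal sketch in Python, assuming the commonly reported (not officially confirmed) 550 MHz Latte clock and 2 FLOPs per ALU per cycle:

# Peak single-precision GFLOPS = ALUs * FLOPs per ALU per clock * clock (GHz).
def peak_gflops(alus, clock_mhz, flops_per_alu=2):
    return alus * flops_per_alu * clock_mhz / 1000.0

print(peak_gflops(160, 550))                  # 176.0 -> the 160 ALU estimate
print(peak_gflops(320, 550))                  # 352.0 -> the 320 ALU estimate
# For comparison, the usual published peak for the 360's Xenos is ~240 GFLOPS
# (48 ALUs each issuing a vec4+scalar multiply-add, i.e. 10 FLOPs, at 500 MHz):
print(peak_gflops(48, 500, flops_per_alu=10)) # 240.0

Peak FLOPS says nothing about efficiency, of course, which is the whole argument in this thread.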

I guess an alternative view is that the CPU is so weak that the compute functionality of the Radeon GPU is being utilised for practically every game, compromising graphics output. I don't believe this myself. I also don't believe the Wii U console is hard to develop for; I believe the complete opposite is true, and I don't believe all developers are being lazy on Wii U either.

Ultimately I believe the Wii U is a low-performance console designed to a specification for good-quality cartoon graphics for Nintendo games and a huge profit for Nintendo and its shareholders (if it sold well). A continuation of their 'withered technology' philosophy that was so successful for the Game Boy, Wii etc.

http://en.wikipedia.org/wiki/Gunpei_Yokoi



Hynad said:

You must be joking about Crysis. The internet was crying back then about how demanding and badly optimised its engine was. As for the rest of your points, sure, consoles don't have to run as many things, which makes them more efficient. You talk about scaling things down on PC to achieve better performance. You have to scale them to a point where the games look just the same, if not worse, than they do on consoles (with the same specs). And then you imply mods and ini tweaks to make sure the game doesn't even run as intended. As I said, you're being disingenuous.

As for your bias, one has to be totally blind to not see you're a PC elitist.


I'm not joking about Crysis. I used to be a mod at the InCrysis forums, so I spent a lot of time with the original game, helping people get the most out of their hardware to run it via .ini and driver tweaks.
Not to mention I actually had a GeForce 7950 GX2 back then.
Granted, I wasn't exactly trying to push any higher than 1280x1024@30fps and didn't bother with anti-aliasing.

As for scaling, it works both ways on the PC. You can scale Oblivion from this:



To max graphics:


And even further beyond (I was also a mod at the Oldblivion forums):



Or let's use GTA IV as an example of not only being able to scale games down to run on weaker hardware, but in the opposite direction too?
Looks better than GTA 5.


Nothing like being part of the PC gaming master race, I can assure you. - Always have the best graphics.

As for throwing words like disingenuous and biased at me: if you're attempting to get a negative reaction, it's not going to happen.



--::{PC Gaming Master Race}::--

Pemalite said:
Hynad said:

You must be joking about Crysis. The internet was crying back then about how demanding and badly optimised its engine was. As for the rest of your points, sure, consoles don't have to run as many things, which makes them more efficient. You talk about scaling things down on PC to achieve better performance. You have to scale them to a point where the games look just the same, if not worse, than they do on consoles (with the same specs). And then you imply mods and ini tweaks to make sure the game doesn't even run as intended. As I said, you're being disingenuous.

As for your bias, one has to be totally blind to not see you're a PC elitist.


I'm not joking about Crysis. I used to be a mod at the InCrysis forums, so I spent a lot of time with the original game, helping people get the most out of their hardware to run it via .ini and driver tweaks.
Not to mention I actually had a GeForce 7950 GX2 back then.
Granted, I wasn't exactly trying to push any higher than 1280x1024@30fps and didn't bother with anti-aliasing.

As for scaling, it works both ways on the PC. You can scale Oblivion from this:



To max graphics:


And even further beyond (I was also a mod at the Oldblivion forums):



Or let's use GTA IV as an example of not only being able to scale games down to run on weaker hardware, but in the opposite direction too?
Looks better than GTA 5.


Nothing like being part of the PC gaming master race, I can assure you. - Always have the best graphics.

As for throwing words like disingenuous and biased at me: if you're attempting to get a negative reaction, it's not going to happen.


The topic is about efficiency. Not about scaling. I've already made my points. Which you didn't disprove.

Specs for specs, consoles are more efficient at running games. The OS footprint, among other things, is in part responsible for this. I gave you the point about Mantle. That it has the potential to turn this around. But we'll have to wait for when it comes out to really be the judge of this.

Build me a PC with 512GB of total RAM, with CPU and GPU comparable to those found in the PS3 and 360, and then show me a direct feed video of it. Otherwise, I don't see how you can make me believe you ran Crysis as well as on the 360 with such low PC specs. 




How many times better than the PS360 does a system have to be to show graphics like the two Wii U tech demos shown at E3 2011?

After seeing that, I just can't believe that the Wii U isn't at least 3 or 4 times more powerful than the PS360.



Hynad said:


The topic is about efficiency. Not about scaling. I've already made my points. Which you didn't disprove.

Specs for specs, consoles are more efficient. The OS footprint, among other things, is responsible for this. I gave you the point about Mantle. That it has the potential to turn this around. But we'll have to wait for when it comes out to really be the judge of this.


I did disprove it; the proof is in the history of prior game releases, and I did iterate upon this in my prior posts. If you need more information on what I have stated, feel free to use the world's largest repository of information known to mankind, which is available at www.google.com
The reason I brought up scaling is because you made it sound like you can only scale PC games downwards and not upwards; you can do both.

I'm not discounting that consoles are lighter, leaner machines later on in a generational cycle, but early on? Not so much. Oblivion, BioShock, Call of Duty 4 and Unreal Tournament 3 are prime examples of this, all of which ran perfectly happily on GPUs and CPUs an entire generation older than the consoles.
There is actually a good explanation for this: during a new generation, developers typically use high-level APIs while they shift their game engines to the new hardware and drop support for the old hardware, and that incurs overhead. (Hence why Oblivion performs worse on the Xbox 360 than on a PC with half the GPU horsepower.)
As time goes on, developers get bolder and start writing to a low-level API; the bolder developers will go even closer to the metal than that.

Later on in the generation, it's a different story, as PC games have higher image quality out of the box and no longer support older rendering paths such as DirectX 9, so they really can't be compared; Battlefield 3 is a prime example of this, with better textures, lighting, shadows, tessellation, larger maps and more players, amongst other things.

As for the OS, Windows XP, which was the OS around during the Xbox 360 and PlayStation 3's launch, is actually a very lean OS; it can run rather happily with 70-90MB of RAM, which is competitive enough with the twins.
Windows 7/8 is even leaner than the Xbox One and PlayStation 4's OS.

It's the APIs holding things back at the moment, which is soon to be rectified, at least for AMD users.
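
As a toy illustration of that API-overhead argument, here is a sketch in Python; every number below is a made-up assumption for illustration, not a measurement of any real driver or game:

# Toy model: per-draw-call CPU overhead caps how many draw calls fit in a frame.
def max_draw_calls(frame_budget_ms, cpu_cost_per_call_us):
    return int(frame_budget_ms * 1000 / cpu_cost_per_call_us)

frame_budget_ms = 1000 / 30                    # 30 fps -> ~33.3 ms per frame
print(max_draw_calls(frame_budget_ms, 40.0))   # hypothetical thick, high-level API
print(max_draw_calls(frame_budget_ms, 5.0))    # hypothetical thin, closer-to-metal API

Same GPU, same game: the thinner the API, the more of the frame budget is left for actual rendering work, which is the efficiency gap consoles (and Mantle) are claimed to close.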

You edited your post.

Hynad said:

Build me a PC with 512GB of total RAM, with CPU and GPU comparable to those found in the PS3 and 360, and then show me a direct feed video of it. Otherwise, I don't see how you can make me believe you ran Crysis as well as on the 360 with such low PC specs. 

You can't build a consumer-grade PC with 512GB of RAM, and why would you want to? You would have to do some severe stripping back of a Windows XP install to make that kind of RAM amount have any weight; you're forgetting a PC does more at any one time than a console.
Sitting at the desktop, mine is running my keyboard's screen with detailed information on core temperatures, voltages, clock speeds and fan speeds for both the CPU and GPU.
It's also showing me information on the weather, CPU, network and GPU usage, with some servers waiting to be "woken up" via another device to begin transcoding, just to name a few things. Once a console can do everything my PC can do, then it would be a viable comparison point.

And why should I find anything for you? Go do it yourself. I'm not about to do a favour for someone who ridicules others.

Also, a 7950 GX2 is faster than the GPU in the PlayStation 3 or Xbox 360; if you knew anything about hardware, you would know the obvious reason why.
Go ahead, do a YouTube search for someone running Crysis on a Radeon X1950 XT or a GeForce 7950 GT at 720p.

Edit: In fact, this has gone way off topic, so better to do it elsewhere.



--::{PC Gaming Master Race}::--

bonzobanana said:
With any discussion about the Latte fabrication you have to bear in mind the huge amount of stuff included in Latte: not only is a huge part of it the eDRAM, but there is also the ARM CPU, the audio DSP, 1MB of Wii U texture memory, 2MB of Wii frame buffer, the Wii GPU, and sections designed for high-speed compression of the Wii U GPU frame buffer, downscaled to fit in the Wii GPU frame buffer. There is a huge amount of additional stuff.

The 176 GFLOPS figure is completely realistic: not only does it fit in with the power consumption figures, but it fits in with how the Wii U is actually performing. Let's not forget the generational difference in performance between the 360/PS3 GPUs and the later Radeons is quite significant.

If Latte really is 352 GFLOPS and the architecture is much improved, plus you have 32MB of high-speed eDRAM, then what the hell is going wrong that the Wii U is struggling to outperform the 360/PS3 graphically? The figure of 176 GFLOPS makes total sense; it simply works with all the information we have. If Latte really is 352 GFLOPS then something has gone horribly wrong in the design of the Wii U that is creating major issues. I don't believe this. I believe Nintendo have designed a console at the absolute minimum price to merely match current-gen performance overall. However, as a Wii U owner I'm more than happy to be proved wrong, but the evidence surely dictates that, of the possible range of 176-352 GFLOPS, where once we were clinging on to believing it was 352 GFLOPS, in fact the lower figure is much more realistic.

I guess an alternative view is that the CPU is so weak that the compute functionality of the Radeon GPU is being utilised for practically every game, compromising graphics output. I don't believe this myself. I also don't believe the Wii U console is hard to develop for; I believe the complete opposite is true, and I don't believe all developers are being lazy on Wii U either.

Ultimately I believe the Wii U is a low-performance console designed to a specification for good-quality cartoon graphics for Nintendo games and a huge profit for Nintendo and its shareholders (if it sold well). A continuation of their 'withered technology' philosophy that was so successful for the Game Boy, Wii etc.

http://en.wikipedia.org/wiki/Gunpei_Yokoi

Current PS3/360 games are built on 8 years of optimization for their specific hardware. The Wii U hasn't even been out for a year yet; no console is maxed out that fast. You need exclusives designed around a console's specific hardware to really show off what it can do, and no Wii U exclusive so far has aimed for high-end graphics; they all adopt simpler styles that preclude technological boundary-pushing.

The Wii U has yet to have its Uncharted 2, its Gears of War, its Mario Galaxy, that one game that really puts its chipset through its paces and shatters its established graphical standards.



Daisuke72 said:
The Wii U is more efficient than the PS3 and Xbox 360, and has more RAM as well, so of course it'll have better-looking games, but honestly, just barely.

In short, Ninjablade was right: the Wii U is on par with current gen. Nintendo fans seriously owe him an apology.


He was, is and will be a dick as evidenced by his "LOL fukc you, Nintendo fanboy, WiiU $99 Christmas 2014!!!111!"

 

We don't owe him anything.



fatslob-:O said:

You're forgetting the fact that the eDRAM takes up a significant amount of die space; after all, cache doesn't cost a small amount of transistors. BTW, I don't literally mean "off the shelf", that was a slight hyperbole. By that I mean pretty damn similar. If it truly had around 700 million transistors of enabled logic, then why is it so hard for the Wii U to completely beat the PS360, and why does it consume 35 watts in total, not including the disc drives etc.? Right now 600 million transistors of logic makes sense because the actual graphics processing component is around 100mm^2, not the 156mm^2 you initially thought. It could easily compare to a 320 shader part that has some disabled shaders too, and BTW none of those 160 shader parts make sense because they only have 4 ROPs, so don't just assume that I am referring to 160 shader parts. An HD 5550 is looking pretty likely right now for what Nintendo has used as a base. 

Oh, as for your "Latte" having more logic, do you even know if all of that is ENABLED logic, i.e. the shaders that ACTUALLY WORK? It's very common to see a lot of GPU manufacturers disable a part of the die that is not working. The "Latte" probably has around 900 million transistors in total, and a third of it is probably reserved for things like the eDRAM. Half of it is probably used for things like GPU logic, and the rest is used to create extra eDRAM and GPU logic so that the chip doesn't end up having lower yields.

Do we even have a DF or LOT analysis to come to the conclusion that it runs worse on the PS4? (Doesn't matter anyway, since the PS4 is just 5 days away from analysis.) 

I can't answer that, because I simply don't know. I think the question that should be asked is: what functions are being used right now? It has a tessellator on fixed-function silicon that's likely not being used right now, it has extra GPRs compared to a conventional R700 series GPU, and there's all of that extra cache (Nintendo has mentioned that the Wii U is heavily reliant on memory); how much of that is actually being used? Unfortunately, with Wii U dev kits being badly documented, I wouldn't be surprised if a lot of GX2 functions of Latte are being left unused. Like how Two Tribes suddenly discovered a hardware feature that reduced memory usage and saved them 100MB. Was this feature not documented? How would they have just "discovered" something otherwise?

OK, then say it some other way. Saying "off-the-shelf part" makes it seem like Nintendo took, for example, a 5550, looked at it, and thought, "Hmm... this one's good, get rid of the useless stuff and put in eDRAM". They've worked on the console for over 3 years; that's a lot of time to have done all sorts of things to the GPU. 

The eDRAM is 40nm Renesas eDRAM; we know information about this DRAM because it comes directly from the Renesas website. It should take around 220 million transistors (+/- a few million), leaving 717 million for the GPU (depending on the transistor density per mm2). You're forgetting that off-the-shelf parts use die space for things they need as PC parts; Latte won't need those components, and that space can easily be used for something else. And no, Latte has a DX10.1 (plus extras) feature-set-like compatibility for GX2 (based on documentation); it's not based on a 5000 series GPU (full DX11 compatibility). It has an R700 base, and they modified it heavily from there. AMD does work with customers, and would allow modifications of things like shaders to work better on particular hardware.
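
To lay that budget arithmetic out explicitly, here is a rough sketch in Python using the figures quoted in this exchange; the total and eDRAM estimates are this thread's assumptions, and the comparison parts' transistor counts are approximate public figures quoted from memory:

# Back-of-the-envelope Latte transistor budget, using the numbers in this thread.
edram_estimate_m = 220                                # Renesas eDRAM estimate above
logic_budget_m   = 717                                # what that leaves for everything else
total_m          = edram_estimate_m + logic_budget_m  # ~937M implied total

# Approximate public transistor counts for plausible AMD comparison parts
# (rounded, from memory -- double-check before relying on them):
comparison_parts_m = {
    "RV710 (80 SPs, HD 4350/4550)":    242,
    "RV730 (320 SPs, HD 4650/4670)":   514,
    "Redwood (400 SPs, HD 5550/5670)": 627,
}

print(f"Implied Latte total: ~{total_m}M, logic budget after eDRAM: ~{logic_budget_m}M")
for part, count in comparison_parts_m.items():
    print(f"  {part}: ~{count}M total, including PC-oriented blocks Latte may not need")

The open question in this thread is how much of that ~717M is enabled shader logic versus Wii back-compat blocks, fixed-function extras and redundancy added for yield.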

And I never said that all of that 156.21mm2 was GPU logic; I already subtracted all of the eDRAM, and I apologise for the bad wording. I'm referring to "logic" as anything usable for gaming, everything that's on the die that can technically be used for gaming purposes. However, eDRAM is a factor for gaming too; conventional GPUs don't have it, and it can be very useful for increasing performance. You seem to be passing it off like it's disposable and not part of the entire GPU system. Heck, even the CPU benefits, since it also has direct access. ROPs, though, would not make a big difference in transistors, even if they added in 4 more. 

And I wasn't thinking about certain Wii-related functions in the GPU that are likely used for BC purposes, so I suppose it is possibly lower than what I have calculated. Though, going by Shiota's comment (from Iwata Asks), he makes it seem like they simply took Wii U parts and modified them so that they could be used for Wii BC as well. Of course, we don't know exactly what they're talking about, but since they're talking about BC on Wii U for Wii, that could be the case. And I believe Marcan said that Wii emulation wasn't being run by an implanted "Hollywood" GPU, so this could mean Wii GPU emulation is done by the same Wii U parts, meaning that not much would have been wasted on Wii GPU emulation at all, if any was. Though, if that's not true, that would put the transistor count at around 600 million (Wii BC logic + GamePad compression shouldn't take up much space at 40nm), which would still put it above some parts with more shaders. Plus, Latte is produced on a more mature 40nm process than any 5000 series GPU was....

You know, that "unusable logic" idea factors more into a commercial off-the-shelf GPU; a console GPU will not have as much, if there's any at all, otherwise it would be a terribly designed GPU with so much unused logic.

And no, we don't have a DF analysis, but so far, reviewers have complained about these framerate issues. If those framerate issues are there (unless the consoles playing those games are malfunctioning in some way), then I would blame the developers, not the hardware.