forethought14 said:
fatslob-:O said:

Sorry, but that optimization excuse don't work out, bra. The PS360 could out-perform their own predecessors, so why isn't the Wii U doing the same thing to EVERY game? Even the PS4 and X1 can easily shit on their current-gen counterparts, bra. If the Wii U had more brute-force power than the PS360, why ain't it performing better? All that's needed to make a game look and run better is a significantly more powerful GPU, and clearly the Wii U lacks this. This generation ain't exactly over yet. Consoles this generation started to use the GPU more and became more PC-like in their philosophies, and next generation's consoles are basically dressed-up PCs. Hell, it's thanks to the Wii U's different CPU that it ain't branded as a dressed-up PC yet, but that don't matter too much when much of its power comes from an off-the-shelf PC GPU, LMAO. If you're referring to even older consoles like the PS1 and the N64, they also relied a lot on the CPU, and the N64 didn't even have a graphics processor!

PS4 and XB1 have literally no excuse not to out-perform their predecessors (leaving aside the bad CoD PS4 port with its terrible framerate). We all know the Wii U is not a huge leap over the PS360, so I don't see why we need to compare it the same way...

Off-the-shelf PC GPU? Just to put this out there: no 40nm, 160-shader GPU on the market has as much GPU logic as Latte does. All of them (including cards like the lower-end 6000, 7000, 8000 and the newer R2 xxx cards) are around ~67 mm² in size, with around ~330 million transistors each. Latte is far larger than any 160-shader part at 156.21 mm², and it has well over 700 million transistors (excluding eDRAM). It doesn't make much sense for Nintendo to have taken only an off-the-shelf GPU, reduced the shaders, TMUs or ROPs, and left it at that, considering the size, process and transistor count. Latte doesn't compare to any 160-shader part, at all, period.
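As a back-of-the-envelope check, the figures quoted above can be turned into transistor densities. All numbers here are taken from the post itself and are purely illustrative, not confirmed specs:

```python
# Transistor-density comparison using the approximate figures from the post.

def density(transistors_millions, area_mm2):
    """Millions of transistors per mm^2."""
    return transistors_millions / area_mm2

# Typical 40nm 160-shader part: ~330M transistors in ~67 mm^2
small_part = density(330, 67)      # roughly 4.9 M transistors / mm^2

# Latte (logic only, eDRAM excluded): ~700M transistors in 156.21 mm^2
latte = density(700, 156.21)       # roughly 4.5 M transistors / mm^2

print(f"160-shader part: {small_part:.2f} M/mm^2")
print(f"Latte:           {latte:.2f} M/mm^2")
```

The densities come out comparable, which is what you would expect for two chips on the same 40nm process; the argument above is about the totals, not the density.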

Just to note, I'm not saying it has more shaders, more texture mapping units (if it even has any) or ROPs; I'm just saying that Latte has a ton of logic that you simply cannot explain without the aid of newer documentation. As for 3rd-party ports running badly on Wii U, you can easily blame documentation. Yes, it's an excuse, and a very relevant one. It was terrible before launch, and if you don't have the entire API well documented, you're stuck with about as much development effort put into the games as the PS4 version of Ghosts is demonstrating with its performance. For PS4, there is literally no excuse for a cheap-looking game like Ghosts to run with a terrible framerate on that hardware. I don't blame the hardware, though; I blame the developers. The same applies to the Wii U, only on a worse scale, considering I'm sure the PS4 is easier to develop for than the Wii U, and the PS4 documentation is a lot easier to follow than the Wii U's.

You're forgetting the fact that the eDRAM takes up a significant amount of die space; after all, cache doesn't cost a small number of transistors. BTW, I don't literally mean "off the shelf" (that was slight hyperbole); I mean pretty damn similar. If it truly had around 700 million transistors of enabled logic, then why is it so hard for the Wii U to completely beat the PS360, and why does it consume 35 watts in total, not including the disc drive etc.? Right now 600 million transistors of logic makes sense, because the actual graphics-processing component is around 100 mm², not the 156 mm² you initially thought. It could easily compare to a 320-shader part with some disabled shaders too, and BTW none of those 160-shader parts make sense because they only have 4 ROPs, so don't just assume I'm referring to 160-shader parts. An HD 5550 is looking pretty likely right now as the base for what Nintendo used.

Oh, and as for your "Latte" having more logic: do you even know if all of that is ENABLED logic, i.e. shaders that ACTUALLY WORK? It's very common to see GPU manufacturers disable a part of the die that's NOT WORKING. The "Latte" probably has around 900 million transistors in total, and a third of it is probably reserved for things like the eDRAM. Half of it is probably used for GPU logic, and the rest is used for redundant eDRAM and GPU logic so that the chip doesn't end up with lower yields.
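The rough budget being argued here can be sketched numerically. Note the assumptions: the ~900M total is the post's own ballpark, the 32 MiB eDRAM capacity is the commonly cited figure for the Wii U's on-die memory, and one transistor per bit (1T1C cells) is the usual eDRAM cell structure, not a confirmed Latte spec:

```python
# Rough transistor-budget sketch for the ~900M figure floated above.
# Assumptions (not confirmed specs): 32 MiB of eDRAM at 1 transistor
# per bit (1T1C cells); the remainder covers active GPU logic plus any
# redundant/disabled logic kept for yield.

EDRAM_MIB = 32
edram_transistors = EDRAM_MIB * 1024 * 1024 * 8  # 1T per bit -> ~268M

total = 900_000_000                        # post's ballpark total
logic_budget = total - edram_transistors   # ~632M left for everything else

print(f"eDRAM:           ~{edram_transistors / 1e6:.0f}M transistors")
print(f"Everything else: ~{logic_budget / 1e6:.0f}M transistors")
```

Under those assumptions the eDRAM alone would eat roughly 268M transistors, close to the "a third of 900M" split claimed above.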

Do we even have a DF or LOT analysis to come to the conclusion that it runs worse on the PS4? (Doesn't matter anyway, since the PS4 is just 5 days away from analysis.)