
Forums - Nintendo Discussion - Shin'en: If you can't make great looking games on Wii U, it's not the hardware's fault

The PS4 and One SKUs are being done by Ubisoft Montreal, and the Wii U SKU is being done by Ubisoft Bucharest, I think.

And as for comparing GTA V to X, you may as well compare apples to oranges, even before you start to think about the former probably having a budget 10 times the size lol.



snowdog said:
The PS4 and One SKUs are being done by Ubisoft Montreal, and the Wii U SKU is being done by Ubisoft Bucharest, I think.

And as for comparing GTA V to X, you may as well compare apples to oranges, even before you start to think about the former probably having a budget 10 times the size lol.


I don't understand why 3rd parties don't just put their main teams on their Wii U games?



fatslob-:O said:

It's just that the PS4/X1 will make it easier to do GPGPU compared to the Wii U because they have even more modernized shaders.

Probably, but you're making it seem like it will be virtually useless on Wii U, and that's just not the case at all. It may have been difficult to do on the 360, but it's much better on Wii U, and it's not as awkward as you think when you code only for the console, rather... code to the metal.

ninjablade said:

lol the best part is the DF article gives the Wii U double the number of shaders, when in reality the Wii U is most likely a 160 SP GPU.

Not that I disagree with that, but one thing that has bugged me all this time about this 160 hypothesis is the fact that the GPU (if the measurements are right) has over 700 million transistors, and that's excluding the eDRAM. A GPU with 3 times the transistors of the 360's Xenos only to be on par with it or just a tad better doesn't make sense AT ALL. Either it doesn't have that few, or it has 160 shaders that are heavily modified with very efficient components. You're making it seem like it's simply a 360 plus newer stuff and that's it. No, it's much more than that. If you talk this big about something, you had better have a good means of explanation, because you haven't been saying anything new to explain much.

Just so you know, we will likely never know the true number of shaders; that's not in the documentation at all.



forethought14 said:
fatslob-:O said:

It's just that the PS4/X1 will make it easier to do GPGPU compared to the Wii U because they have even more modernized shaders.

Probably, but you're making it seem like it will be virtually useless on Wii U, and that's just not the case at all. It may have been difficult to do on the 360, but it's much better on Wii U, and it's not as awkward as you think when you code only for the console, rather... code to the metal.

ninjablade said:

lol the best part is the DF article gives the Wii U double the number of shaders, when in reality the Wii U is most likely a 160 SP GPU.

Not that I disagree with that, but one thing that has bugged me all this time about this 160 hypothesis is the fact that the GPU (if the measurements are right) has over 700 million transistors, and that's excluding the eDRAM. A GPU with 3 times the transistors of the 360's Xenos only to be on par with it or just a tad better doesn't make sense AT ALL. Either it doesn't have that few, or it has 160 shaders that are heavily modified with very efficient components. You're making it seem like it's simply a 360 plus newer stuff and that's it. No, it's much more than that. If you talk this big about something, you had better have a good means of explanation, because you haven't been saying anything new to explain much.

Just so you know, we will likely never know the true number of shaders; that's not in the documentation at all.

I'm not saying it will be useless, it's just that VLIW is known to have a lot of shader stalls on compute tasks, and if I remember correctly the Xbox 360 did introduce unified shaders, but they weren't very programmable compared to the Wii U because fully programmable graphics pipelines were only introduced in the DirectX 10 days, so I don't think the 360 was capable of a lot of GPGPU workloads. BTW, GPGPU is still awkward even to a lot of game developers.

If I were you I would ignore ninjablade.
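
To give an idea of where the awkwardness comes from: even a trivial GPGPU job through a generic API like OpenCL means compiling a kernel at runtime, managing buffers by hand, and copying results back. A rough sketch (OpenCL is only for illustration here; the Wii U's actual graphics API is GX2, and the kernel name, sizes, and values below are made up for the example):

// Illustrative only: offloading a trivial per-particle update to the GPU
// through a generic API (OpenCL). Not the Wii U's real API (GX2); the kernel
// name, buffer sizes, and dt value are made up.
#include <CL/cl.hpp>
#include <vector>
#include <cstdio>

int main() {
    // Kernel source, compiled at runtime: one work-item per particle.
    const char* src =
        "__kernel void integrate(__global float* pos,"
        "                        __global const float* vel, float dt) {"
        "    int i = get_global_id(0);"
        "    pos[i] += vel[i] * dt;"
        "}";

    std::vector<float> pos(1024, 0.0f), vel(1024, 1.0f);

    cl::Context ctx(CL_DEVICE_TYPE_GPU);
    std::vector<cl::Device> devs = ctx.getInfo<CL_CONTEXT_DEVICES>();
    cl::CommandQueue queue(ctx, devs[0]);
    cl::Program prog(ctx, src, true);              // build the kernel at runtime
    cl::Kernel kernel(prog, "integrate");

    // Explicit buffer management is a big part of the awkwardness: data has to
    // be copied into GPU-visible memory and read back when the kernel is done.
    cl::Buffer dPos(ctx, CL_MEM_READ_WRITE | CL_MEM_COPY_HOST_PTR,
                    sizeof(float) * pos.size(), pos.data());
    cl::Buffer dVel(ctx, CL_MEM_READ_ONLY | CL_MEM_COPY_HOST_PTR,
                    sizeof(float) * vel.size(), vel.data());

    kernel.setArg(0, dPos);
    kernel.setArg(1, dVel);
    kernel.setArg(2, 0.016f);                      // ~one 60 fps frame
    queue.enqueueNDRangeKernel(kernel, cl::NullRange, cl::NDRange(pos.size()));
    queue.enqueueReadBuffer(dPos, CL_TRUE, 0, sizeof(float) * pos.size(), pos.data());

    std::printf("pos[0] after one step: %f\n", pos[0]);
    return 0;
}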



cheesecake said:
snowdog said:
The PS4 and One SKUs are being done by Ubisoft Montreal, and the Wii U SKU is being done by Ubisoft Bucharest, I think.

And as for comparing GTA V to X, you may as well compare apples to oranges, even before you start to think about the former probably having a budget 10 times the size lol.


I don't understand why 3rd parties don't just put their main teams on their Wii U games?

Because your star developers worked too long and too hard to be stuck working with Nintendo hardware.



Monster Hunter: pissing me off since 2010.

fatslob-:O said:

I'm not saying it will be useless, it's just that VLIW is known to have a lot of shader stalls on compute tasks, and if I remember correctly the Xbox 360 did introduce unified shaders, but they weren't very programmable compared to the Wii U because fully programmable graphics pipelines were only introduced in the DirectX 10 days, so I don't think the 360 was capable of a lot of GPGPU workloads. BTW, GPGPU is still awkward even to a lot of game developers.

If I were you I would ignore ninjablade.

Again, it's awkward because many don't have much experience using such things. I know VLIW4/5 GPGPU isn't as easy to implement as GCN, where they improved the operations with the architecture, but that doesn't matter. If developers want to improve their games, they will learn how to use it on Wii U and learn to make it not so awkward. It being "awkward" is no excuse to ignore it completely.

I should huh.... O_o...



forethought14 said:
fatslob-:O said:

I'm not saying it will be useless, it's just that VLIW is known to have a lot of shader stalls on compute tasks, and if I remember correctly the Xbox 360 did introduce unified shaders, but they weren't very programmable compared to the Wii U because fully programmable graphics pipelines were only introduced in the DirectX 10 days, so I don't think the 360 was capable of a lot of GPGPU workloads. BTW, GPGPU is still awkward even to a lot of game developers.

If I were you I would ignore ninjablade.

Again, it's awkward because many don't have much experience using such things. I know VLIW4/5 GPGPU isn't as easy to implement as GCN, where they improved the operations with the architecture, but that doesn't matter. If developers want to improve their games, they will learn how to use it on Wii U and learn to make it not so awkward. It being "awkward" is no excuse to ignore it completely.

I should huh.... O_o...

@Bold It's not just the operations that were improved on the architecture. They improved caching, scheduling, and the whole programmability of the GCN GPUs compared to the VLIW GPUs. Developers will have to do practically nothing on the PS4/Xbone to utilize GPGPU because they can just rely on AMD's HSA Bolt library to do most of the work for them, and this is allowed because GCN has expanded programmability.

Like I said I don't think ninjablade knows anything about hardware.
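
And to show what "do most of the work for them" means: Bolt is an STL-style C++ template library that generates and launches the OpenCL kernels itself. A rough sketch from memory, assuming Bolt's std::transform-style overload and the usual header names:

// Rough sketch of AMD Bolt usage, assuming the std::transform-style overload
// and header layout; Bolt handles kernel generation and dispatch itself and
// can fall back to the CPU if no suitable OpenCL device is found.
#include <bolt/cl/transform.h>
#include <bolt/cl/functional.h>
#include <vector>

int main() {
    std::vector<float> a(100000, 1.0f), b(100000, 2.0f), c(100000);

    // Element-wise c = a + b, dispatched to the GPU by the library; this is
    // the "developers do almost nothing" part.
    bolt::cl::transform(a.begin(), a.end(), b.begin(), c.begin(),
                        bolt::cl::plus<float>());
    return 0;
}

Compare that to the hand-written kernel and buffer juggling further up the thread; that difference is the programmability gap being talked about.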



cheesecake said:
snowdog said:
The PS4 and One SKUs are being done by Ubisoft Montreal, and the Wii U SKU is being done by Ubisoft Bucharest, I think.

And as for comparing GTA V to X, you may as well compare apples to oranges, even before you start to think about the former probably having a budget 10 times the size lol.


I don't understand why 3rd parties don't just put their main teams on their Wii U games?



It all comes down to developers and publishers having the same resources (for two console teams) this gen as they had last gen. As this gen continues, you'll probably see developers and publishers increasing the size of their teams to allow one developer to handle all 3 platforms.

This is all down to Nintendo insisting on backwards compatibility with the GameCube for the Wii last gen, and the Wii having a GPU with a nonstandard rendering pipeline compared to the PS3 and 360, which made ports next to impossible.



fatslob-:O said:

@Bold It's not just the operations that were improved on the architecture. They improved caching, scheduling, and the whole programmability of the GCN GPUs compared to the VLIW GPUs. Developers will have to do practically nothing on the PS4/Xbone to utilize GPGPU because they can just rely on AMD's HSA Bolt library to do most of the work for them, and this is allowed because GCN has expanded programmability.

Like I said I don't think ninjablade knows anything about hardware.

I know all of that, but you're still making it sound like Wii U's GPGPU capabilities aren't too useful, why? What does "being awkward" have to do with anything? Awkward, perhaps, to the people who haven't had much experience, but if you understand how to use it in the most efficient way possible, then it doesn't have to be so "awkward" (VLIW GPGPU isn't as bad as you're making it seem). And why are you comparing the Wii U to GCN at all? I just recited the fact that the PS4/X1 will use GPGPU; whether it's easier or not isn't too relevant. If it means offloading tasks from the CPU so that it has more leeway, then by all means, learn to make it less unusual. Of course, that's where budgets and business kick in, where some teams may not want to spend so much on trying to understand it, and yes, that's where Nintendo goofed up. But it's possible that NFSMWU implements some GPGPU usage, since the performance of that game is better than on PS360 (Espresso alone may not have performed that well if the code was very floating-pointy and SIMD-y and not integer). If a single-digit number of people were able to make it happen, then I don't see why a double-digit number of people wouldn't be able to figure it out.

Anyway, even though floating point is supposed to be Espresso's weak point, if used correctly it's not so bad at all. I recall that a NeoGAF member tested several CPUs in floating point / SIMD, and a PPC750CL (Broadway) didn't do too badly at all compared to other CPUs (if you haven't seen it before, I'll link you to the post if I can find it). And if Espresso is 3 Broadways with higher clocks and more cache, then it's not too hard to estimate how capable Espresso should be.
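
For reference, the kind of test I mean is basically a tight single-precision multiply-add loop, something like the sketch below (a generic illustration only, not the actual NeoGAF code; the iteration count and constants are arbitrary):

// Generic single-precision multiply-add throughput loop, the sort of thing
// such CPU micro-benchmarks time. Not the actual NeoGAF test; numbers are
// arbitrary and only the structure matters.
#include <chrono>
#include <cstdio>

int main() {
    const long iters = 100000000L;
    const float a = 1.0001f, b = 0.9999f;
    float acc0 = 0.0f, acc1 = 0.0f, acc2 = 0.0f, acc3 = 0.0f;

    auto t0 = std::chrono::steady_clock::now();
    for (long i = 0; i < iters; ++i) {
        // Four independent accumulators avoid one long dependency chain,
        // which is roughly what "used correctly" means for keeping a short
        // FPU pipeline like Broadway's/Espresso's busy.
        acc0 = acc0 * b + a;
        acc1 = acc1 * b + a;
        acc2 = acc2 * b + a;
        acc3 = acc3 * b + a;
    }
    auto t1 = std::chrono::steady_clock::now();

    double secs = std::chrono::duration<double>(t1 - t0).count();
    double flops = iters * 8.0;  // 4 accumulators x (1 mul + 1 add) per iteration
    std::printf("%.1f MFLOP/s (result %f)\n", flops / secs / 1e6,
                acc0 + acc1 + acc2 + acc3);
    return 0;
}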



forethought14 said:
fatslob-:O said:

@Bold It's not just the operations that were improved on the architecture. They improved caching, scheduling, and the whole programmability of the GCN GPUs compared to the VLIW GPUs. Developers will have to do practically nothing on the PS4/Xbone to utilize GPGPU because they can just rely on AMD's HSA Bolt library to do most of the work for them, and this is allowed because GCN has expanded programmability.

Like I said I don't think ninjablade knows anything about hardware.

I know all of that, but you're still making it sound like Wii U's GPGPU capabilities aren't too useful, why? What does "being awkward" have to do with anything? Awkward, perhaps, to the people who haven't had much experience, but if you understand how to use it in the most efficient way possible, then it doesn't have to be so "awkward" (VLIW GPGPU isn't as bad as you're making it seem). And why are you comparing the Wii U to GCN at all? I just recited the fact that the PS4/X1 will use GPGPU; whether it's easier or not isn't too relevant. If it means offloading tasks from the CPU so that it has more leeway, then by all means, learn to make it less unusual. Of course, that's where budgets and business kick in, where some teams may not want to spend so much on trying to understand it, and yes, that's where Nintendo goofed up. But it's possible that NFSMWU implements some GPGPU usage, since the performance of that game is better than on PS360 (Espresso alone may not have performed that well if the code was very floating-pointy and SIMD-y and not integer). If a single-digit number of people were able to make it happen, then I don't see why a double-digit number of people wouldn't be able to figure it out.

Anyway, even though floating point is supposed to be Espresso's weak point, if used correctly it's not so bad at all. I recall that a NeoGAF member tested several CPUs in floating point / SIMD, and a PPC750CL (Broadway) didn't do too badly at all compared to other CPUs (if you haven't seen it before, I'll link you to the post if I can find it). And if Espresso is 3 Broadways with higher clocks and more cache, then it's not too hard to estimate how capable Espresso should be.

@Bold You realize that GCN can abbreviate two things: GameCube (Nintendo) or Graphics Core Next, which is the newest GPU architecture from AMD.

"you're still making it sound like Wii U's GPGPU capabilities aren't too useful, why? "

You already know the reason, say it with me: VLIW for GPU compute is inefficient :P (BTW, VLIW is practically bad in almost all cases for compute.)

"Anyway, even though floating point is supposed to be Espresso's weak points, if used correctly it's not so bad at all. I recall  that a NeoGaf member tested several CPUs in floating point / SIMD, and a PPC750CL (Broadway) didn't do too bad at all compared to other CPUs. (if you havne't seen it before, I'll link you to the post, and if I could find it.) And if Espresso is 3 Broadways with higher clocks and more cache, then it's not to hard to estimate how capable Espresso should be."

Whoa there bro, I wasn't talking about the IBM Espresso in my last post. It's fine for what it does, but don't hope for mind-blowing in-game physics.