
Shin'en: If you can't make great looking games on Wii U, it's not the hardware's fault

fatslob-:O said:
dahuman said:
ninjablade said:
dahuman said:
ninjablade said:
 


If Bayo 2 is 1080p/60fps I will ban myself forever, and the Wii U has impressed me and is indeed 2x more powerful than current gen.

What I can tell you so far is that the E3/PAX Prime demo looked really good (online videos can't show it off due to the higher FPS; I was under the impression that the textures weren't that great, but I knew I was wrong when I saw the game running in real time), and performance was right there when I played; it feels even more amazing than the first game. I did see some graphical bugs (my eyes are really trained), but after I made some inquiries, apparently they're known issues that will be fixed. If they can get that kind of performance already in an early demo build at 720p, then 1080p isn't entirely impossible if they optimize the code more, though I'm leaning towards 720p final as well; I'd be surprised otherwise.


Yeah, jumping from 720p to 1080p only requires 2.25x the rendering power; should be a breeze.

That has more to do with the rendering pipeline's memory speed and fillrate than pure raw power, and since I'm not a Wii U dev, I don't have the information on any of that. It's not just a pixel-percentage count like many would believe. You need to spend less time on NeoGAF or Beyond3D and study the subject more if you are actually interested; out of every 100 users from those sites, only about 2-3 are actually really good and know their shit. The rest are either idiots or amateurs, and I'm talking about even the devs. I can't stress how many idiots there are in every field of work.

BTW, if you want to know about the pixel fillrate: the Wii U has 8 ROPs, the same amount as the PS3, and both are clocked at 550 MHz, which gives you 4.4 gigapixels per second.



We have no idea of any such thing. There's still at least a third of the silicon that's a complete mystery. You're stating that as fact whereas it's actually an opinion.
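(For reference, the raw arithmetic behind the two claims above; this is simple pixel counting, which, as dahuman notes, ignores memory bandwidth and fillrate:)

1280 x 720  =   921,600 pixels
1920 x 1080 = 2,073,600 pixels
2,073,600 / 921,600 = 2.25x the pixels per frame

8 ROPs x 550 MHz = 4,400 Mpixels/s = 4.4 Gpixels/s theoretical fillrate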



snowdog said:
We have no idea of any such thing. There's still at least a third of the silicon that's a complete mystery. You're stating that as fact whereas it's actually an opinion.

OK, then ignore the die shot analysis from Chipworks, lol. Oh, and why don't you take a guess as to what it is? And do you think it will top the PS4 and Xbone too?



the_dengle said:
JoeTheBro said:
the_dengle said:

Thanks Google.

That means they ported a DX11 game to Wii U, not that the Wii U supports a DX11-equivalent API.

I just Googled it; I neither know nor care what DX11 even means. Though I have some recollection of Square Enix saying it's the reason FFXV and Kingdom Hearts 3 aren't coming to Wii U.

Square Enix never said anything of the sort.

 

Mind you, the PS4 doesn't use DirectX 11 either; that's a Microsoft API.



Dr.EisDrachenJaeger said:

Square Enix never said anything of the sort.

 

Mind you, the PS4 doesn't use DirectX 11 either; that's a Microsoft API.

Ah, guess my memory is a little fuzzy. Somebody said something like that though.

If DX11 is owned by Microsoft and PS4 doesn't even use it, why would anyone expect Wii U to?



fatslob-:O said:
OK, then ignore the die shot analysis from Chipworks, lol. Oh, and why don't you take a guess as to what it is? And do you think it will top the PS4 and Xbone too?



I've been following the GAF thread since it started, mate. Could be 8, could be 16; we don't know anything for certain. The only thing we know for sure is that the GPU is completely custom. It has some similarities to Brazos in places, but the general consensus is that it's loosely based on the R700, with Nintendo and AMD probably adding in parts from different lines. We also know the clock speed, 550MHz, but that's all we know for a fact, and that's down to marcan hacking the thing.

Like I've said, we can see how advanced the hardware is in comparison to the PS3 and 360 by looking at the Bayonetta 2 demo.



the_dengle said:
Ah, guess my memory is a little fuzzy. Somebody said something like that though.

If DX11 is owned by Microsoft and PS4 doesn't even use it, why would anyone expect Wii U to?



It's all down to GPU feature sets. Saying that a GPU is 'DX11' or 'DX10.1' just means that the hardware is capable of producing the same effects and equivalent shaders. Technically, people should say 'DX11-equivalent feature set' rather than just 'DX11'.

We know from statements by Unity, from the Project C.A.R.S. changelogs, and from the fact that Latte has a usable tessellation unit that Latte has a DX11-equivalent feature set. The fact that it also has compute shaders points the same way.
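(Side note for anyone wondering what a "compute shader" actually is: it's a program that runs on the GPU to do general-purpose work outside the usual vertex/pixel pipeline. Below is a minimal sketch of the idea in CUDA, purely for illustration; it isn't Wii U or DX11 code, and the kernel name and sizes are made up:)

// Minimal CUDA sketch of the compute-shader idea: one tiny GPU program
// runs per data element, with no vertex/pixel pipeline involved.
// (DX11 compute shaders are written in HLSL; this is just the same concept.)
#include <cstdio>
#include <cuda_runtime.h>

__global__ void scale(float *data, int n, float k)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        data[i] *= k;  // each GPU thread updates one element
}

int main()
{
    const int n = 1024;
    float host[n];
    for (int i = 0; i < n; ++i) host[i] = (float)i;

    float *dev;
    cudaMalloc(&dev, n * sizeof(float));
    cudaMemcpy(dev, host, n * sizeof(float), cudaMemcpyHostToDevice);

    scale<<<(n + 255) / 256, 256>>>(dev, n, 2.0f);  // 4 blocks x 256 threads

    cudaMemcpy(host, dev, n * sizeof(float), cudaMemcpyDeviceToHost);
    cudaFree(dev);
    printf("host[10] = %f\n", host[10]);  // expect 20.0
    return 0;
}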



snowdog said:

It's all down to GPU feature sets. Saying that a GPU is 'DX11' or 'DX10.1' just means that the hardware is capable of producing the same effects and equivalent shaders. Technically, people should say 'DX11-equivalent feature set' rather than just 'DX11'.

We know from statements by Unity, from the Project C.A.R.S. changelogs, and from the fact that Latte has a usable tessellation unit that Latte has a DX11-equivalent feature set. The fact that it also has compute shaders points the same way.

Then I don't understand JoeTheBro's comment. "That means they ported a DX11 game to Wii U, not that the Wii U supports a DX11-equivalent API." What's the distinction there? How could they port a DX11 game to Wii U if it did NOT support an equivalent API?



snowdog said:

It's all down to GPU feature sets. Saying that a GPU is 'DX11' or 'DX10.1' just means that the hardware is capable of producing the same effects and equivalent shaders. Technically, people should say 'DX11-equivalent feature set' rather than just 'DX11'.

We know from statements by Unity, from the Project C.A.R.S. changelogs, and from the fact that Latte has a usable tessellation unit that Latte has a DX11-equivalent feature set. The fact that it also has compute shaders points the same way.

Can you link me to those Unity and Project CARS statements? That would be news to me.

the_dengle said:

Then I don't understand JoeTheBro's comment. "That means they ported a DX11 game to Wii U, not that the Wii U supports a DX11-equivalent API." What's the distinction there? How could they port a DX11 game to Wii U if it did NOT support an equivalent API?

It would be news to me if the Wii U supported a DX11-equivalent API.



snowdog said:

I've been following the GAF thread since it started, mate. Could be 8, could be 16; we don't know anything for certain. The only thing we know for sure is that the GPU is completely custom. It has some similarities to Brazos in places, but the general consensus is that it's loosely based on the R700, with Nintendo and AMD probably adding in parts from different lines. We also know the clock speed, 550MHz, but that's all we know for a fact, and that's down to marcan hacking the thing.

Like I've said, we can see how advanced the hardware is in comparison to the PS3 and 360 by looking at the Bayonetta 2 demo.

I couldn't care less if you followed the whole GAF thread. No need to get defensive there, mate; I ain't exactly attacking the Wii U. It's probably 8, bro; otherwise why would AMD think it was a good idea to do 16 on a tiny-ass 150mm^2 die?

@Bold: That's cool, bra. It's about the games that show off the hardware, not vice versa (or only somewhat vice versa).
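(Rough context for the die-area point, with figures from memory, so treat them as approximate: RV770, the full R700-series part with 16 ROPs, was around 256mm^2 at 55nm, while Chipworks measured Latte at roughly 147mm^2 at 40nm. At that size, 8 ROPs is the more plausible fit.)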



http://www.cinemablend.com/games/Interview-Why-Unity-Engine-Good-Fit-Wii-U-47173.html

There's an interview with the CEO of Unity where he mentions that DX11-equivalent effects are possible.

I can't provide a link to the changelog, though, because someone from Slightly Mad Studios on GAF requested that people not share it, unfortunately. It basically said that the PS4 and Wii U shared DX11-equivalent shaders, if I remember correctly.

Tbh, there isn't a great deal of difference between DX10.1 and DX11 anyway; just compute shaders and updated tessellation, I think.

You can see from the Latte die shot that the GPU is a bit of a Frankenstein creation. The general consensus is that Nintendo and AMD took the best and most useful parts of several GPU series and squeezed them all into one chip.

I'm still convinced that they've included an evolution of the TEV unit from Flipper/Hollywood, but made the changes necessary to give the Wii U a standard rendering pipeline. This would explain how they could manage something like Bayonetta 2 (and The Wonderful 101, for that matter) with so few programmable shaders.

That would give developers 'free' use of HDR, depth of field, and whatever else they've included, but it would have the disadvantage of not leaving a great deal of spare floppage if someone comes up with a new lighting system etc. for the PS4 and One. Probably worth doing, though; it would just mean developers would be 'stuck' using the fixed-function version.