
Shin’en Multimedia: Wii U Is Most Definitely A Next-Generation Console

Torillian said:

Seems a bit silly, the whole idea is that these developers have a lot of experience on the platforms so they know best, but then why should we trust them for comparisons with platforms they've never touched or released a game on?  I wouldn't ask Naughty Dog what the Xone is capable of, and I wouldn't ask these guys what anything non-Nintendo is capable of.  

I wouldn't ask Naughty Dog about the Xone either, because it is unknown new hardware. The PS3 and 360, however, are not.

Naughty Dog actually did comment on Uncharted 2 not being possible on 360. (http://gamer.blorge.com/2009/08/29/naughty-dog-uncharted-2-not-possible-on-xbox-360-only-ps3/) And you know what? I believe them. The 360 had been out for four years and thoroughly and publicly explored; it was a known quantity, even to devs who hadn't made a game on it personally.



curl-6 said:
Torillian said:
 

Seems a bit silly, the whole idea is that these developers have a lot of experience on the platforms so they know best, but then why should we trust them for comparisons with platforms they've never touched or released a game on?  I wouldn't ask Naughty Dog what the Xone is capable of, and I wouldn't ask these guys what anything non-Nintendo is capable of.  

I wouldn't ask Naughty Dog about the Xone either, because it is unknown new hardware. The PS3 and 360, however, are not.

Naughty Dog actually did comment on Uncharted 2 not being possible on 360. (http://gamer.blorge.com/2009/08/29/naughty-dog-uncharted-2-not-possible-on-xbox-360-only-ps3/) And you know what? I believe them. The 360 had been out for four years and thoroughly and publicly explored; it was a known quantity, even to devs who hadn't made a game on it personally.

That's just where we're going to have to disagree then.  I would never trust an exclusive developer when they talk about the capabilities of a platform they have no experience on.  At the very least you have to admit they have a vested interest in making the platform they exclusively develop on look as good as possible, and the same would go for Naughty Dog.  Based on that and the developers not having experience for themselves on the platform to know its capabilities I think any comparison they make is useless.  



...

jake_the_fake1 said:

My original statement stands: the developer hit a hard limit and compromised, which is fine. However, after talking about harnessing the power of the WiiU, they can hardly admit that they hit a limit, so they just prettied up the words, turning a possible negative into a positive. Typical PR spin. Honestly, if the WiiU had the power to harness as he put it, then 1080p with the same effects could have been possible, but that's not the case.

 

No matter what system you develop for, 1080p will always use more than twice as many pixels as 720p, which means less remaining fillrate for other things. Even on PS4, you could push more post-effects in Killzone Shadow Fall if it were 720p instead of 1080p.

In fact, speaking of Killzone, Guerrilla made it run at 30fps instead of 60fps for the same reason Shin'en chose 720p; not that the PS4 (Wii U) can't handle great graphics at 60fps (1080p), but that they would rather prioritise detail over a higher framerate (higher resolution) that many wouldn't notice.
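To put rough numbers on that (a back-of-the-envelope Python sketch, assuming fillrate/post-processing cost scales roughly linearly with pixels drawn, which is a simplification):

    pixels_720p  = 1280 * 720     # 921,600 pixels per frame
    pixels_1080p = 1920 * 1080    # 2,073,600 pixels per frame

    print(pixels_1080p / pixels_720p)    # 2.25 -> 1080p pushes 2.25x the pixels of 720p
    print(pixels_720p  * 60 / 1e6)       # ~55.3 million pixels/s at 720p60 (Nano Assault Neo)
    print(pixels_1080p * 30 / 1e6)       # ~62.2 million pixels/s at 1080p30 (Shadow Fall's target)
    print(pixels_1080p * 60 / 1e6)       # ~124.4 million pixels/s at 1080p60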



dahuman said:
Hynad said:
dahuman said:
TheJimbo1234 said:
dahuman said:
TheJimbo1234 said:
 

 



1) Silicon size is important. It allows you to estimate the power of the chip, and then what other features it will have, e.g. bandwidth, built-in AA features, resolution support, etc. Resolution is a deal breaker. Anyone running a gaming PC will tell you this from experience. It also means less power goes into AA and more can be placed on having prettier textures, shadows, more objects and so on.

2) The PS4 can run it, hence why it ran the more demanding U4 engine demo. No idea about the Xbone, and the WiiU certainly cannot. As for the analogy, that would be bandwidth. More traffic is not a problem if you have more lanes, something the PS4 has a brutal amount of.

 

 


Ok, well, a few things with that demo. First off, what is it running on? Then you have the issue of falling between 24-30fps for relatively small scenes, and then, most importantly, he never mentions the WiiU but specifically mentions the PS4 and Xbone for DX11 features, while mentioning how they can turn them off for current consoles. I think it will be a case of DX11 for PS4, PC, and Xbone, and DX9 for the WiiU.

Because I'm totally not a PC gamer and not in the field and don't know what I'm talking about, right? :P The size you listed is the process, not the actual size of the chip, and no, I don't think it will beat the PS4 or Xbone by a long shot. I just know that the feature set isn't too different, but there is a raw power difference; high->low settings if devs bother is about it, but I doubt they will.

The PS4 couldn't handle the real UE4 engine, nor can it run Samaritan at the level of that demo, which was more demanding than the UE4 demo; you got it reversed. And your analogy still doesn't work, as you can't dynamically change the number of lanes you drive on like in a sci-fi movie, which is what scalable engines or game creators can do. You make software makers sound like lazy idiots, which a lot of them actually are now that I think about it...

 

The demo was partly on PC and partly on Wii U; the PC footage is the one with the higher FPS and the Wii U one is the 24-30FPS one, you can see the Wii U gamepad button at around the 7 minute mark. You'd get better lighting and better tessellation with DX11-level feature sets. The Wii U doesn't have DX11; it's more DX10 level with a tessellator most likely, which would be just like DX11 but not as efficient. The difference would be settings, and they are most likely under NDA and can't really share tech info about the Wii U anyway; Nintendo is very asshole-istic about that.


The jury is still out on that Samaritan demo... You're just assuming here. With proper optimization and probably some compromises, I wouldn't be surprised if it could run on the PS4.

And the demo they showed during the reveal event wasn't fully optimized, and Epic said they could have made it better if they had more time with the hardware.


Nah, I'm not so unrealistic; the PS4 can't pump that amount of polygons to start with, they'd need to tone down a lot of stuff and lower the quality or disable some features. What Samaritan showed us was a lot of raw power, it was pretty ridiculous actually.

I also think they could have. I'm talking about the lighting engine Epic ended up not using due to the processing power required, that's why I said the "real" one. I think the post-process one can be improved and will look much better, but SVOGI is something the new consoles just can't handle.

You do know what card ran the Samaritan demo and how much power the PS4 has, right? That alone shows you are wrong, and this is without even looking into console optimisation, which makes an enormous difference.

In response to your other reply: the size is the transistor size, which determines the performance of the chip. Smaller transistors = more transistors = more power. High and low settings only go so far, and the power jump of the PS4 and PCs (not stating the Xbone as the figures are not detailed enough) will open up many options, e.g. vast landscapes, a huge number of objects rendered, advanced physics, etc. You can't change those settings. You either have them or don't.

Woah woah woah, slow down with the U4 demo bashing. Firstly, it was not optimised, and as I have said, this is massive on consoles. Actually, Samaritan and U4 use the same power, so I don't understand why you are saying the former requires more. Erm, how much do you know about GPU architecture? Because it looks like not much. You should realise that there is no need to dynamically change anything if you supply it with absurd bandwidth to start with. This ability opens up the PS4 to a number of options that are not possible with the WiiU.

The part with the "press X" does not necessarily mean it is on a WiiU, as that could be a default the devs have put into their game. Also, 24-30fps in relatively small scenes is poor and would be limiting to the game. No DX11 will really screw the WiiU over. This means no dynamic particles, limited fluid motion, no subsurface scattering, etc., which are all currently the next big things in terms of graphics and are being used in many new games. This will make the WiiU look like an old 28" CRT TV trying to compete with a 38" LED TV, aka pretty crap.



Torillian said:
curl-6 said:
Torillian said:
 

Seems a bit silly, the whole idea is that these developers have a lot of experience on the platforms so they know best, but then why should we trust them for comparisons with platforms they've never touched or released a game on?  I wouldn't ask Naughty Dog what the Xone is capable of, and I wouldn't ask these guys what anything non-Nintendo is capable of.  

I wouldn't ask Naughty Dog about the Xone either, because it is unknown new hardware. The PS3 and 360, however, are not.

Naughty Dog actually did comment on Uncharted 2 not being possible on 360. (http://gamer.blorge.com/2009/08/29/naughty-dog-uncharted-2-not-possible-on-xbox-360-only-ps3/) And you know what? I believe them. The 360 had been out for four years and thoroughly and publicly explored; it was a known quantity, even to devs who hadn't made a game on it personally.

That's just where we're going to have to disagree then.  I would never trust an exclusive developer when they talk about the capabilities of a platform they have no experience on.  At the very least you have to admit they have a vested interest in making the platform they exclusively develop on look as good as possible, and the same would go for Naughty Dog.  Based on that and the developers not having experience for themselves on the platform to know its capabilities I think any comparison they make is useless.  

Yes, we will have to disagree.

Of course these companies have a vested interest, but for me it's just too big a leap to then automatically assume a dev is lying when they do not have a history of dishonesty.



z101 said:
jake_the_fake1 said:
z101 said:
jake_the_fake1 said:

On the second part, you do realise that the PS4 has the same setup but it's integrated into a single chip; plus, both the CPU and GPU have access to 8GB of high-bandwidth RAM, making eDRAM a non-requirement.


The Wii U eDRAM bandwidth is much higher than that of the normal RAM the PS4 uses. Interesting statement from the lead system architect of the PS4:

For example, if we use eDRAM (on-chip DRAM) for the main memory in addition to the external memory, the memory bandwidth will be several terabytes per second. It will be a big advance in terms of performance.

He even explains why the PS4 doesn't use eDRAM:

However, in that case, we will make developers solve a puzzle, 'To realize the fastest operation, what data should be stored in which of the memories, the low-capacity eDRAM or the high-capacity external memory?' We wanted to avoid such a situation. We put the highest priority on allowing developers to spend their time creating values for their games.

Sony doesn't use eDRAM because they wanted to make a console that is very easy to handle even for dumb programmers, so they sacrificed performance for easy programming; the other reason is that eDRAM is very expensive.

Source: http://techon.nikkeibp.co.jp/english/NEWS_EN/20130401/274313/?P=2

...

Also, I'd like to point out that neither the high-end GPUs from Nvidia nor those from AMD use eDRAM on their graphics cards; in fact, the stock GTX 680 has a bandwidth of 192.2GB/s, while the Titan has 288.4GB/s. Graphical beasts of their time, just a little perspective.

..

I'd also like to emphasise that Cerny's approach was developer-centric; as he said, he wanted to remove any stumbling blocks, and split memory setups comprising small fast RAM and large slow RAM make developers' lives hard. They would rather have one large and fast pool of RAM, and Sony has done this, hence Cerny's decision to go with 8GB of GDDR5 RAM with 176GB/s of bandwidth. Best of both worlds.

 

 


There is no chitchat necessary about it: eDRAM gives a huge advantage in power for a console, and even the lead PS4 architect acknowledged that, but they decided not to use eDRAM in the PS4 because programming would be more complicated for the average programmer.

eDRAM is not so efficient on PCs, because to really utilize its possibilities it must actually be used by the program code, and no PC programmer will assume that there is eDRAM on a graphics card. But in the future some high-end graphics cards will feature eDRAM, and special logic will automatically use this eDRAM even if it is not in the program code, and so will give a performance boost. Of course this boost could be higher if the program (game) is coded to use eDRAM.

I agree with you, but this is why I questioned what the bandwidth of the eDRAM the WiiU is using actually is; so far we don't know, and we can't just assume the highest, as we don't even know how it works in the architecture. This is why I described how the 360 uses it, to give a little perspective, seeing as the WiiU and the 360 have such similarities in their design:

"Keep in mind that even the 360 eDRAM had a bandwidth of 256GB/s only to its FPU, allowing it to take the brunt of bandwidth-intensive operations like AA; however, to the rest of the system the bandwidth was just 32GB/s...Microsoft used some smart words for their console war cannons, which is why if you look at the Xbox One, Microsoft has an eSRAM of 102GB/s, which makes sense since it's a bandwidth for the whole system and not just for one aspect of a component. So with what I just mentioned, Nintendo could have easily taken an Xbox 360 approach in its hardware guts, but of course tinkered with it a little."

eDRAM is way too expensive to have at graphics-card levels like 6GB; no way in hell would they even consider a split pool, because developers already have to contend with 2 pools, and 3 pools would be stupidly hard for no reason. Also, GDDR5 has the bandwidth required to feed the shaders, and it's cheaper to get the required capacities.
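For a bit of perspective on those numbers: the headline bandwidth figures quoted above fall straight out of bus width x effective data rate. A rough Python sketch using the commonly quoted retail specs (the bus widths and clocks are my assumption, and none of this tells us the Wii U eDRAM figure, which is still unpublished):

    def bandwidth_gb_s(bus_width_bits, data_rate_gt_s):
        # bytes per second = (bus width in bytes) * (transfers per second)
        return bus_width_bits / 8 * data_rate_gt_s

    print(bandwidth_gb_s(256, 5.5))   # PS4 GDDR5:  ~176 GB/s, one unified pool
    print(bandwidth_gb_s(256, 6.0))   # GTX 680:    ~192 GB/s
    print(bandwidth_gb_s(384, 6.0))   # Titan:      ~288 GB/s
    # 360 eDRAM: ~256 GB/s, but only inside the daughter die; the link to the rest
    # of the GPU was ~32 GB/s, as described above. Xbox One eSRAM: ~102 GB/s.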



curl-6 said:
jake_the_fake1 said:

My original statement stands: the developer hit a hard limit and compromised, which is fine. However, after talking about harnessing the power of the WiiU, they can hardly admit that they hit a limit, so they just prettied up the words, turning a possible negative into a positive. Typical PR spin. Honestly, if the WiiU had the power to harness as he put it, then 1080p with the same effects could have been possible, but that's not the case.

 

No matter what system you develop for, 1080p will always use more than twice as many pixels as 720p, which means less remaining fillrate for other things. Even on PS4, you could push more post-effects in Killzone Shadow Fall if it were 720p instead of 1080p.

In fact, speaking of Killzone, Guerrilla made it run at 30fps instead of 60fps for the same reason Shin'en chose 720p; not that the PS4 (Wii U) can't handle great graphics at 60fps (1080p), but that they would rather prioritise detail over a higher framerate (higher resolution) that many wouldn't notice.

You are correct that 1080p is resource intensive; so is 60fps. In fact, even running 720p @ 60fps is resource intensive; surely they could have dropped it to 480p to put in more effects. But seriously, the thing in question here is the power of the WiiU: is the WiiU, or is it not, more powerful than the PS3/360?

1) If we assume that the WiiU is on par for the most part with the PS3/360, then Nano Assault Neo, as is, is perfectly fine running in 720p @ 60fps.

2) If we are to assume that the WiiU is, as some say, x1.5 - x3 more powerful than the PS3/360, then a 1080p @ 60fps Nano Assault Neo should exist, seeing as a 2008 game of the same genre called Stardust ran at 1080p 60fps; surely a powerful console of 2012 could run the same with even more detail without breaking a sweat, especially when the developer is claiming to harness the power of the WiiU.

Case in point: option 1 is the reality we live in, 2 is but a dream.
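Rough numbers on that, for what they're worth (treating it purely as a fillrate question and assuming cost scales linearly with pixel count, which it doesn't exactly):

    pixels_720p  = 1280 * 720     # 921,600
    pixels_1080p = 1920 * 1080    # 2,073,600

    print(pixels_1080p / pixels_720p)   # 2.25x the pixels per frame at the same framerate
    # So going from 720p60 to 1080p60 with identical effects needs roughly a 2.25x fillrate
    # advantage on its own, which sits inside the "x1.5 - x3" range quoted above rather than
    # comfortably below it.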

 

In regards to Killzone, it is a franchise which historically runs at 30fps; Killzone: Shadow Fall is no different in this respect, it just now runs in 1080p. Guerrilla was not targeting 60fps; they were targeting 1080p with cinematic visuals. The difference here is that Nano Assault Neo actually ran in 1080p before being downgraded to 720p. Why would they develop the game to run at 1080p in the first place? If you read between the lines, you will find that they wanted to target 1080p but couldn't achieve the look and feel of the game, so they compromised on resolution to free up resources, as they were not going to compromise on frame rate. Think about the excuse for one moment in what they said: "We had the game also running in 1080p but the difference was not distinguishable when playing." Why make such a statement if the game was designed to run in 720p?

Look at Call of Duty: they always target 60fps and always compromise resolution. Dropping resolution always reduces the image quality; just ask any PC gamer, who always wants to run games at a higher res to make them look better. Just think about PS2/Wii games running on PC emulators at crazy high resolutions and see how good they look. A lot of art is lost with low resolution, so for the developer to make such an excuse makes no sense, unless the real reason is, like I said, that they talked big about harnessing the power of the WiiU, hit a limit, compromised, and now had to do the PR dance to still hit their original claim of "harnessing the power of the WiiU", thus making option 1 the reality above, which answers the power question.

It is all my opinion so take it as you will :)

 



TheJimbo1234 said:
dahuman said:
Hynad said:
dahuman said:
TheJimbo1234 said:
dahuman said:
TheJimbo1234 said:


The jury is still out on that Samaritan demo... You're just assuming here. With proper optimization and probably some compromises, I wouldn't be surprised if it could run on the PS4.

And the demo they showed during the reveal event wasn't fully optimized, and Epic said they could have made it better if they had more time with the hardware.


Nah, I'm not so unrealistic; the PS4 can't pump that amount of polygons to start with, they'd need to tone down a lot of stuff and lower the quality or disable some features. What Samaritan showed us was a lot of raw power, it was pretty ridiculous actually.

I also think they could have. I'm talking about the lighting engine Epic ended up not using due to the processing power required, that's why I said the "real" one. I think the post-process one can be improved and will look much better, but SVOGI is something the new consoles just can't handle.

You do know what card ran the Samaritan demo and how much power the PS4 has, right? That alone shows you are wrong, and this is without even looking into console optimisation, which makes an enormous difference.

In response to your other reply: the size is the transistor size, which determines the performance of the chip. Smaller transistors = more transistors = more power. High and low settings only go so far, and the power jump of the PS4 and PCs (not stating the Xbone as the figures are not detailed enough) will open up many options, e.g. vast landscapes, a huge number of objects rendered, advanced physics, etc. You can't change those settings. You either have them or don't.

Woah woah woah, slow down with the U4 demo bashing. Firstly, it was not optimised, and as I have said, this is massive on consoles. Actually, Samaritan and U4 use the same power, so I don't understand why you are saying the former requires more. Erm, how much do you know about GPU architecture? Because it looks like not much. You should realise that there is no need to dynamically change anything if you supply it with absurd bandwidth to start with. This ability opens up the PS4 to a number of options that are not possible with the WiiU.

The part with the "press X" does not necessarily mean it is on a WiiU, as that could be a default the devs have put into their game. Also, 24-30fps in relatively small scenes is poor and would be limiting to the game. No DX11 will really screw the WiiU over. This means no dynamic particles, limited fluid motion, no subsurface scattering, etc., which are all currently the next big things in terms of graphics and are being used in many new games. This will make the WiiU look like an old 28" CRT TV trying to compete with a 38" LED TV, aka pretty crap.

The PS4 is not more powerful than a GTX 680 even if optimization is factored into the equation (the original demo ran on 2-3 580s). To be able to run that demo, you'd need to tune things down for the PS4; that thing is not a magical box, and we need to accept that at this point.

I'm talking about feature sets; you are going in another direction with it altogether for some reason. UE4 is a prime example of a lack of power ending in an important feature set being taken out as a result; it's a fact, not a bash. Game devs made sure newer engines are more scalable for a reason: to counter the exact problems you are talking about. I'm also not saying the Wii U will match the number of options on the graphical front compared to the PS4. GPU architecture has nothing to do with what we are even talking about; you are talking about raw power at this point. There is also no need to try to discredit me; I know what I'm talking about, I just don't try to use fancy words to impress people. :P

I was just clarifying which parts looked like Wii U gameplay; I didn't say it was good or bad, lol, I just answered you is all. I have no idea how it turned into Wii U bashing on your end.
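For raw-throughput context, peak single-precision FLOPS is roughly shader count x 2 ops per clock x clock speed. A quick Python sketch using the commonly quoted specs (my assumption; peak FLOPS is obviously not the whole story, especially once console optimisation is factored in):

    def peak_tflops(shader_count, shader_clock_ghz):
        # 2 floating-point ops (multiply-add) per shader per clock
        return shader_count * 2 * shader_clock_ghz / 1000

    print(peak_tflops(512,  1.544))   # GTX 580: ~1.58 TFLOPS (the Samaritan demo used 2-3 of these)
    print(peak_tflops(1536, 1.006))   # GTX 680: ~3.09 TFLOPS
    print(peak_tflops(1152, 0.800))   # PS4 GPU: ~1.84 TFLOPS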



pezus said:
dahuman said:

Dood, read what you bolded :P A lot of it is RAM-based, and I already said it'd be missing some stuff but it'd run; the game was running on the PS3 KZ engine, man, hence why I said it can run, you'd just tone down the poly count and the lighting wouldn't be as good. You literally said the same thing I did, lol.

It didn't look like that to me:

"TBH I think KZ4 can run on the PS3 lol, the lighting and draw distance wouldn't be as good though, and that's because the PS3 has shit for RAM is about it(of course you wouldn't have dx11 like effects, but average users can't see the difference so much.) The assets they were using weren't really that detailed and weren't massive in poly count. I think the final KZ4 will actually look quiet a lot better than the demo in detail TBH."

The poly count was way higher than in KZ3. So that's at least textures, polygons, lighting, draw distance, performance that would have to change for it to run on PS3. That does not mean "KZ4 can run on the PS3". That's like saying Crysis can run on Wii, if you reduce the quality of textures, polygons, lighting, draw distance, particle effects etc...


Yeah, but now we are in the territory of console Crysis 2 or 3 vs the PC versions. Sure, those games run on consoles, but I guess they are not really Crysis since they don't run at PC quality? Therefore they don't count? :P Come on now.



jake_the_fake1 said:
curl-6 said:
jake_the_fake1 said:

My original statement stands: the developer hit a hard limit and compromised, which is fine. However, after talking about harnessing the power of the WiiU, they can hardly admit that they hit a limit, so they just prettied up the words, turning a possible negative into a positive. Typical PR spin. Honestly, if the WiiU had the power to harness as he put it, then 1080p with the same effects could have been possible, but that's not the case.

 

No matter what system you develop for, 1080p will always use more than twice as many pixels as 720p, which means less remaining fillrate for other things. Even on PS4, you could push more post-effects in Killzone Shadow Fall if it were 720p instead of 1080p.

In fact, speaking of Killzone, Guerrilla made it run at 30fps instead of 60fps for the same reason Shin'en chose 720p; not that the PS4 (Wii U) can't handle great graphics at 60fps (1080p), but that they would rather prioritise detail over a higher framerate (higher resolution) that many wouldn't notice.

You are correct that 1080p is resource intensive; so is 60fps. In fact, even running 720p @ 60fps is resource intensive; surely they could have dropped it to 480p to put in more effects. But seriously, the thing in question here is the power of the WiiU: is the WiiU, or is it not, more powerful than the PS3/360?

1) If we assume that the WiiU is on par for the most part with the PS3/360, then Nano Assault Neo, as is, is perfectly fine running in 720p @ 60fps.

2) If we are to assume that the WiiU is, as some say, x1.5 - x3 more powerful than the PS3/360, then a 1080p @ 60fps Nano Assault Neo should exist, seeing as a 2008 game of the same genre called Stardust ran at 1080p 60fps; surely a powerful console of 2012 could run the same with even more detail without breaking a sweat, especially when the developer is claiming to harness the power of the WiiU.

Case in point: option 1 is the reality we live in, 2 is but a dream.

 

In regards to Killzone, it is a franchise which historically runs at 30fps; Killzone: Shadow Fall is no different in this respect, it just now runs in 1080p. Guerrilla was not targeting 60fps; they were targeting 1080p with cinematic visuals. The difference here is that Nano Assault Neo actually ran in 1080p before being downgraded to 720p. Why would they develop the game to run at 1080p in the first place? If you read between the lines, you will find that they wanted to target 1080p but couldn't achieve the look and feel of the game, so they compromised on resolution to free up resources, as they were not going to compromise on frame rate. Think about the excuse for one moment in what they said: "We had the game also running in 1080p but the difference was not distinguishable when playing." Why make such a statement if the game was designed to run in 720p?

Look at Call of Duty: they always target 60fps and always compromise resolution. Dropping resolution always reduces the image quality; just ask any PC gamer, who always wants to run games at a higher res to make them look better. Just think about PS2/Wii games running on PC emulators at crazy high resolutions and see how good they look. A lot of art is lost with low resolution, so for the developer to make such an excuse makes no sense, unless the real reason is, like I said, that they talked big about harnessing the power of the WiiU, hit a limit, compromised, and now had to do the PR dance to still hit their original claim of "harnessing the power of the WiiU", thus making option 1 the reality above, which answers the power question.

It is all my opinion so take it as you will :)

Well, at a glance (I have not played it), Super Stardust HD doesn't seem to do as much with shaders and fillrate-related effects as Nano Assault Neo does, so it looks like another case of priorities; SSHD pushed for 1080p above all (PS3 devs were still chasing the 1080p dream back then, Wipeout HD being another example), while NAN went for the best overall look, which Shin'en found was better served by spending their fillrate on extra post effects rather than extra pixels.

Also, the "harnessing the Wii U's power" tagline was not Shin'en's wording; the website chose that as an attention-grabbing headline. ;)

The reason Killzone historically runs at 30fps is that the series has always sacrificed framerate for detail. It's still a compromise. Every system has its resource cap, and there will always be compromises, because devs naturally want to push things as far as they can. Many of the best-looking PS3/360 games run at sub-HD resolution for the same reason Nano Assault runs at 720p; devs decide it looks better at a lower res with more effects versus a higher res with fewer effects.

There's much more to graphics than just resolution. Being in 720p doesn't inherently make a game technically unimpressive; for example, the Samaritan demo at 720p would still be nothing to scoff at, and Angry Birds at 1080p would be nothing to wow at.

Likewise, this is my opinion, to take for what you will. Thanks for keeping this an intelligent and mature discussion, those are hard to come by on the internet. :)