
Shin’en Multimedia: Wii U Is Most Definitely A Next-Generation Console

curl-6 said:
Hynad said:
curl-6 said:
Hynad said:
curl-6 said:
Hynad said:
Nem said:
TheJimbo1234 said:
Nem said:
Been saying this for a year now. No one listens.

Now that the Xbox One is revealed and all its stats are known, do people still think it's the leap they thought it was? It's only a difference of texture detail and frame rate. To the naked eye, the difference isn't much.


Not at all. Everyone who says such a thing simply has not seen what modern engines are capable of.


Of course. I'd like to see more examples and fewer leaps of faith.

You mean... Like seeing something Shin'en have done for the Wii U that actually supports their claims?

Nano Assault Neo.

That game certainly doesn't show the superiority of the Wii U compared to the HD twins. -__-

Nano Assault Neo does things PS3/360 cannot.
http://www.notenoughshaders.com/2012/11/03/shinen-mega-interview-harnessing-the-wii-u-power/

And what are those things, exactly? 

If you really think that this game couldn't have been made for the HD twins to look just the same, you're fooling yourself.

No offense, but I'll take the word of a developer with an impeccable technical track record and no history of lying over an anonymous forum user.

The word of a developer whose track record of PS3 and 360 experience amounts to... zero. Shin'en is as close as a third-party developer can get to being first-party, having developed games only for Nintendo platforms since its creation in 1999. I'd be really surprised if he ever got his hands on a PS3 or 360 dev kit.

So you'll excuse me if I don't take his comments as anything other than cheerleading PR.



pezus said:
Nem said:
pezus said:
Nem said:
What if I said it could run on the PS3 with less detailed textures and at 30fps? Does it sound possible now? Because I respectfully disagree with you. I didn't see anything inherently amazing in the Killzone demo. Granted, it was the best we have seen so far from the next-gen consoles.

No...

Does my post hint at that? I said KZ2 and 3 could probably run on Wii U, and they are two of the most graphically advanced PS3 games. It's quite obvious that Shadow Fall couldn't be made on PS3.

@dahuman: The lighting was far beyond what PS3 could do. So were the particle effects, shadows, and amount of stuff on-screen. All this happening with very little aliasing and already at a stable FPS.

You both should read through this: http://www.eurogamer.net/articles/digitalfoundry-inside-killzone-shadow-fall

"In terms of graphics, this is where the enhancements Guerrilla has made over Killzone 3 are perhaps best appreciated. For in-game characters, the PS3 game used three different LOD models (more polygons used the closer you are to the character in question), up to 10,000 polygons and 1024x1024 textures. Things have changed significantly for PlayStation 4 with seven LOD models and a maximum of 40,000 polygons and up to six 2048x2048 textures."

"Lighting looks simply phenomenal in Killzone: Shadow Fall (to the point where Guerrilla released an entirely separate presentation on how it works) with a full HDR, linear space system in place that's a clear evolution of the techniques used in Killzone 2 and its sequel. A key new feature is something similar to what we see in Kojima Productions' FOX engine and Unreal Engine 4 - a move to physical based lighting. In the past, in-game objects would have a certain degree of their lighting "baked" into the object itself. Now, the physical properties of the object itself - its composition, its smoothness/roughness etc - are variables defined by the artist, and the way they are lit depends on the actual light sources in any given scene."

"During this evolution all lights in any given scene became "area lights" - able to influence the world around them, and all light sources have actual volume to them too. Everything on-screen has a real-time reflection that considers all the appropriate light sources. A mixture techniques including ray-casting and image-based lighting produces some exceptional results."

 

PS3 couldn't run this. Not a chance. They'd have to change the entire game, not just the textures.

Dude, read what you bolded :P A lot of that is RAM-based, and I already said it'd be missing some stuff, but it'd run; the game was built on the PS3 Killzone engine, man, hence why I said it could run. You'd just tone down the poly count, and the lighting wouldn't be as good. You literally said the same thing I did lol.
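For anyone unfamiliar with the LOD models the Digital Foundry excerpt above describes, here's a minimal C++ sketch of the idea: the renderer keeps several versions of a mesh and picks one each frame based on how much of the screen the model covers. More LOD levels (seven on PS4 vs three on PS3) just means more rows in the table and a higher top-end poly count. All numbers and thresholds below are illustrative, not Guerrilla's actual values.

```cpp
// Illustrative LOD selection sketch; data and thresholds are made up.
#include <cstdio>
#include <initializer_list>
#include <vector>

struct LodLevel {
    int   polygons;     // triangle budget for this level
    float minCoverage;  // use this level while the model covers at least
                        // this fraction of the screen
};

// Pick the most detailed level whose coverage threshold is met.
int pickLod(const std::vector<LodLevel>& lods, float screenCoverage) {
    for (size_t i = 0; i < lods.size(); ++i)
        if (screenCoverage >= lods[i].minCoverage)
            return static_cast<int>(i);
    return static_cast<int>(lods.size()) - 1; // farthest/cheapest level
}

int main() {
    // A seven-level table in the spirit of the article; a three-level
    // (PS3-style) setup is the same idea with fewer, coarser steps.
    std::vector<LodLevel> lods = {
        {40000, 0.50f}, {20000, 0.25f}, {10000, 0.10f},
        { 5000, 0.05f}, { 2500, 0.02f}, { 1200, 0.01f}, {600, 0.0f},
    };
    for (float cov : {0.6f, 0.2f, 0.03f, 0.001f}) {
        int lod = pickLod(lods, cov);
        printf("coverage %.3f -> LOD %d (%d polys)\n",
               cov, lod, lods[lod].polygons);
    }
}
```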



curl-6 said:
Hynad said:
curl-6 said:
Hynad said:
Nem said:
TheJimbo1234 said:
Nem said:
Been saying this for a year now. No one listens.

Now that the Xbox One is revealed and all its stats are known, do people still think it's the leap they thought it was? It's only a difference of texture detail and frame rate. To the naked eye, the difference isn't much.


Not at all. Everyone who says such a thing simply has not seen what modern engines are capable of.


Of course. I'd like to see more examples and fewer leaps of faith.

You mean... Like seeing something Shin'en have done for the Wii U that actually supports their claims?

Nano Assault Neo.

That game certainly doesn't show the superiority of the Wii U compared to the HD twins. -__-

Nano Assault Neo does things PS3/360 cannot.
http://www.notenoughshaders.com/2012/11/03/shinen-mega-interview-harnessing-the-wii-u-power/

 

So Nano Assault Neo runs at 720p @ 30fps, and you're saying this game is an example of what can't be done on the 360/PS3, while Stardust HD, a 2008 game on the PS3, runs at full 1080p @ 60fps. FYI, Stardust has a ton of shit happening on screen in later stages, both in particles and geometry.

I dunno. The article claims to be about "harnessing the power of the Wii U", and you use the game as an example of what the Wii U can do that the 360/PS3 can't. I just figured that if the console had "power" to harness, the same game should have come out at 1080p @ 60fps without breaking a sweat... So either this developer did a bad job harnessing the Wii U's power, or they did their best, hit a limit, and had to cut resolution and frame rate to gain back performance and achieve the look and feel they wanted for their game. The latter seems the most convincing to me, as this is what all developers seem to do when faced with a compromise.

Just my opinion from a different perspective, take it as you will.

 



Hynad said:

The word of a developer whose track record of PS3 and 360 experience amounts to... zero. Shin'en is as close as a third-party developer can get to being first-party, having developed games only for Nintendo platforms since its creation in 1999. I'd be really surprised if he ever got his hands on a PS3 or 360 dev kit.

So you'll excuse me if I don't take his comments as anything other than cheerleading PR.

The PS3/360 are known inside out by the whole world by now; you don't need to have made games for them to know their capabilities.

Shin'en don't have a history of lying. They do, however, have a history of pulling off great technical feats across a range of different hardware.



jake_the_fake1 said:
So Nano Assault Neo runs at 720p @ 30fps, and you're saying this game is an example of what can't be done on the 360/PS3, while Stardust HD, a 2008 game on the PS3, runs at full 1080p @ 60fps. FYI, Stardust has a ton of shit happening on screen in later stages, both in particles and geometry.

I dunno. The article claims to be about "harnessing the power of the Wii U", and you use the game as an example of what the Wii U can do that the 360/PS3 can't. I just figured that if the console had "power" to harness, the same game should have come out at 1080p @ 60fps without breaking a sweat... So either this developer did a bad job harnessing the Wii U's power, or they did their best, hit a limit, and had to cut resolution and frame rate to gain back performance and achieve the look and feel they wanted for their game. The latter seems the most convincing to me, as this is what all developers seem to do when faced with a compromise.

Just my opinion from a different perspective, take it as you will.

 

Nano Assault Neo runs at 60fps, not 30fps.

As for 1080p, they said: "We had the game also running in 1080p but the difference was not distinguishable when playing. Therefore we used 720p and put the free GPU cycles into higher resolution post-Fx. This was much more visible. If we had a project with less quick motions we would have gone 1080p instead"
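To put rough numbers on the trade-off Shin'en describes: 1080p has 2.25x the pixels of 720p, so if per-pixel shading dominates, rendering at 720p frees over half the fill budget for post-effects. A back-of-the-envelope C++ sketch (illustrative arithmetic only, not Shin'en's actual profiling):

```cpp
// Pixel-budget arithmetic behind the "720p + richer post-FX" choice.
#include <cstdio>

int main() {
    const double px720  = 1280.0 * 720.0;   //   921,600 pixels
    const double px1080 = 1920.0 * 1080.0;  // 2,073,600 pixels

    // If per-pixel cost dominates, 1080p costs ~2.25x the fill work of 720p.
    printf("1080p/720p pixel ratio: %.2fx\n", px1080 / px720);

    // Suppose shading the scene at 1080p would consume the whole frame
    // budget; at 720p the same shaders use ~44% of it, leaving the rest
    // free for higher-resolution post-processing.
    double used = px720 / px1080;
    printf("720p uses %.0f%% of the 1080p fill budget; ~%.0f%% freed for post-FX\n",
           used * 100.0, (1.0 - used) * 100.0);
}
```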



curl-6 said:
Hynad said:

The word of a developer whose track record of PS3 and 360 experience amounts to... zero. Shin'en is as close as a third-party developer can get to being first-party, having developed games only for Nintendo platforms since its creation in 1999. I'd be really surprised if he ever got his hands on a PS3 or 360 dev kit.

So you'll excuse me if I don't take his comments as anything other than cheerleading PR.

The PS3/360 are known inside out by the whole world by now; you don't need to have made games for them to know their capabilities.

Shin'en don't have a history of lying. They do, however, have a history of pulling off great technical feats across a range of different hardware.

Oh, please. You will always take the word of people who say positive things about Nintendo, no matter what. No questions asked. They say what you want to hear, so you believe them. And when someone doesn't speak positively about them, you filter out what you don't want to believe.

That's your own track record here.



TheJimbo1234 said:
dahuman said:
TheJimbo1234 said:
1) Silicon size is important. It allows you to estimate the power of the chip, and then what other features it will have, e.g. bandwidth, built-in AA features, resolution support, etc. Resolution is a deal breaker; anyone running a gaming PC will tell you this from experience. It also means less power goes into AA and more can be spent on prettier textures, shadows, more objects, and so on.

2) The PS4 can run it... hence why it ran the more demanding UE4 engine demo. No idea about the Xbone, and the Wii U certainly cannot. As for the analogy, that would be bandwidth: more traffic is not a problem if you have more lanes, something the PS4 has a brutal amount of.

 

 


OK, well, a few things with that demo. First off, what is it running on? Then you have the issue of it falling between 24-30fps for relatively small scenes. And then, most importantly, he never mentions the Wii U, but specifically mentions the PS4 and Xbone for DX11 features, and mentions how they can turn them off for current consoles. I think it will be a case of DX11 for PS4, PC, and Xbone, and DX9 for the Wii U.

Because I'm totally not a PC gamer, not in the field, and don't know what I'm talking about, right? :P The size you listed is the process node, not the actual size of the chip. And no, I don't think it will beat the PS4 or Xbone by a long shot; I just know that the feature set isn't too different, but there is a raw power difference. High -> low settings, if devs bother, is about it, but I doubt they will.

The PS4 couldn't handle the full UE4 engine, nor can it run Samaritan at the level of that demo, which was more demanding than the UE4 demo; you got it reversed. And your analogy still doesn't work: you can't dynamically change the number of lanes you drive on like in a sci-fi movie, and that's exactly what scalable engines and game creators can do. You make software makers sound like lazy idiots, which a lot of them actually are, now that I think about it...

 

The demo was partly PC and partly Wii U; the PC part is the one with the higher FPS, and the Wii U part is the 24-30fps one. You can see the Wii U GamePad button at around the seven-minute mark. You'd get better lighting and better tessellation with DX11-level feature sets. The Wii U doesn't have DX11; it's more DX10-level with a tessellator, most likely, which would be just like DX11 but not as efficient. The difference would be settings, and devs are most likely under NDA and can't really share tech info about the Wii U anyway; Nintendo is very asshole-istic about that.
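As a rough illustration of the "same engine, lower settings" point, here's a hedged C++ sketch: probe the hardware's feature level and scale effects rather than forking the game. The caps struct, flags, and thresholds are hypothetical, not any real console SDK or confirmed Wii U specs.

```cpp
// Sketch of feature-level-driven settings; all capability data is assumed.
#include <cstdio>
#include <initializer_list>

struct GpuCaps {                 // hypothetical capability flags
    bool hardwareTessellation;   // DX11-style hull/domain shader support
    bool computeShaders;         // DX11 compute (absent at DX10 level)
    int  featureLevel;           // 10 or 11 in this toy model
};

struct RenderSettings {
    bool tessellation;
    bool gpuParticles;
    int  shadowMapSize;
};

RenderSettings chooseSettings(const GpuCaps& caps) {
    RenderSettings s;
    s.tessellation  = caps.hardwareTessellation;  // off -> simpler meshes
    s.gpuParticles  = caps.computeShaders;        // off -> CPU particles
    s.shadowMapSize = (caps.featureLevel >= 11) ? 2048 : 1024;
    return s;
}

int main() {
    GpuCaps dx11ish = {true, true,  11};  // PS4/Xbone-class assumption
    GpuCaps dx10ish = {true, false, 10};  // Wii-U-class assumption per the post
    for (const GpuCaps& c : {dx11ish, dx10ish}) {
        RenderSettings s = chooseSettings(c);
        printf("FL%d: tess=%d gpuParticles=%d shadows=%dpx\n",
               c.featureLevel, s.tessellation, s.gpuParticles, s.shadowMapSize);
    }
}
```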



jake_the_fake1 said:
z101 said:
jake_the_fake1 said:

On the second part, you do realise that the PS4 has the same setup, but it's integrated into a single chip; plus, both the CPU and GPU have access to 8GB of high-bandwidth RAM, making eDRAM a non-requirement,


The Wii U's eDRAM bandwidth is much higher than that of the normal RAM the PS4 uses. Interesting statement from the PS4's lead system architect:

For example, if we use eDRAM (on-chip DRAM) for the main memory in addition to the external memory, the memory bandwidth will be several terabytes per second. It will be a big advance in terms of performance.

He even explains why the PS4 doesn't use eDRAM:

However, in that case, we will make developers solve a puzzle, 'To realize the fastest operation, what data should be stored in which of the memories, the low-capacity eDRAM or the high-capacity external memory?' We wanted to avoid such a situation. We put the highest priority on allowing developers to spend their time creating values for their games.

Sony doesn't use eDRAM because they wanted to make a console that is very easy to handle even for dumb programmers, so they sacrificed performance for ease of programming; the other reason is that eDRAM is very expensive.

Source: http://techon.nikkeibp.co.jp/english/NEWS_EN/20130401/274313/?P=2

...

Also, I'd like to point out that neither the high-end GPUs from Nvidia nor those from AMD use eDRAM on their graphics cards; in fact, the stock GTX 680 has a bandwidth of 192.2GB/s, while the Titan has 288.4GB/s. Graphical beasts of their time; just a little perspective.

..

I'd also like to emphasise that Cerny's approach was developer-centric; as he said, he wanted to remove any stumbling blocks. Split memory setups, comprising small fast RAM and large slow RAM, make developers' lives hard; they would rather have one large, fast pool of RAM, and Sony has done exactly that, hence Cerny's decision to go with 8GB of GDDR5 RAM at 176GB/s of bandwidth. Best of both worlds.

 

 


There is no chitchat necessary about it: eDRAM gives a huge advantage in power for a console, and even the PS4's lead architect admitted that, but they decided not to use eDRAM in the PS4 because programming would be more complicated for the average programmer.

eDRAM is not so efficient on PCs, because to really exploit it, the program code has to use it explicitly, and no PC programmer will assume that there is eDRAM on a graphics card. But in the future, some high-end graphics cards may feature eDRAM, with special logic that automatically uses it even when the program code doesn't, which would give a performance boost. Of course, this boost could be higher if the program (the game) is coded to use eDRAM.
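To make Cerny's "puzzle" concrete, here's a toy C++ sketch of deciding what lives in a small fast pool (eDRAM) versus a large slow pool (main RAM). The asset list, sizes, and greedy heuristic are illustrative assumptions, not real console figures.

```cpp
// Toy placement "puzzle" for a split fast/slow memory setup.
#include <algorithm>
#include <cstdio>
#include <vector>

struct Asset { const char* name; int sizeMB; int accessesPerFrame; };

int main() {
    const int fastPoolMB = 32;  // assumed small on-chip pool
    std::vector<Asset> assets = {
        {"framebuffer",   16, 60}, {"shadow maps",    8, 20},
        {"particle data",  8, 30}, {"texture pool", 512,  5},
        {"audio banks",   64,  1},
    };
    // Greedy heuristic: hottest data (accesses per MB) claims eDRAM first.
    std::sort(assets.begin(), assets.end(), [](const Asset& a, const Asset& b) {
        return a.accessesPerFrame * 1.0 / a.sizeMB >
               b.accessesPerFrame * 1.0 / b.sizeMB;
    });
    int used = 0;
    for (const Asset& a : assets) {
        bool fits = used + a.sizeMB <= fastPoolMB;
        if (fits) used += a.sizeMB;
        printf("%-14s %4dMB -> %s\n", a.name, a.sizeMB,
               fits ? "eDRAM" : "main RAM");
    }
    // With one big unified pool (the PS4 approach), this step disappears.
}
```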



dahuman said:
TheJimbo1234 said:
dahuman said:
TheJimbo1234 said:
1) Silicon size is important. It allows you to estimate the power of the chip, and then what other features it will have, e.g. bandwidth, built-in AA features, resolution support, etc. Resolution is a deal breaker; anyone running a gaming PC will tell you this from experience. It also means less power goes into AA and more can be spent on prettier textures, shadows, more objects, and so on.

2) The PS4 can run it... hence why it ran the more demanding UE4 engine demo. No idea about the Xbone, and the Wii U certainly cannot. As for the analogy, that would be bandwidth: more traffic is not a problem if you have more lanes, something the PS4 has a brutal amount of.

 

 


OK, well, a few things with that demo. First off, what is it running on? Then you have the issue of it falling between 24-30fps for relatively small scenes. And then, most importantly, he never mentions the Wii U, but specifically mentions the PS4 and Xbone for DX11 features, and mentions how they can turn them off for current consoles. I think it will be a case of DX11 for PS4, PC, and Xbone, and DX9 for the Wii U.

Because I'm totally not a PC gamer, not in the field, and don't know what I'm talking about, right? :P The size you listed is the process node, not the actual size of the chip. And no, I don't think it will beat the PS4 or Xbone by a long shot; I just know that the feature set isn't too different, but there is a raw power difference. High -> low settings, if devs bother, is about it, but I doubt they will.

The PS4 couldn't handle the full UE4 engine, nor can it run Samaritan at the level of that demo, which was more demanding than the UE4 demo; you got it reversed. And your analogy still doesn't work: you can't dynamically change the number of lanes you drive on like in a sci-fi movie, and that's exactly what scalable engines and game creators can do. You make software makers sound like lazy idiots, which a lot of them actually are, now that I think about it...

 

The demo was partly PC and partly Wii U; the PC part is the one with the higher FPS, and the Wii U part is the 24-30fps one. You can see the Wii U GamePad button at around the seven-minute mark. You'd get better lighting and better tessellation with DX11-level feature sets. The Wii U doesn't have DX11; it's more DX10-level with a tessellator, most likely, which would be just like DX11 but not as efficient. The difference would be settings, and devs are most likely under NDA and can't really share tech info about the Wii U anyway; Nintendo is very asshole-istic about that.


The jury is still out on that Samaritan demo... You're just assuming here. With proper optimization, and probably some compromises, I wouldn't be surprised if it could run on the PS4.

And the demo they showed during the reveal event wasn't fully optimized; Epic said they could have made it better if they'd had more time with the hardware.



Anyone who isn't purely a fanboy or a massive troll knows that the Wii U is next-gen, and was the first "next-gen" console to release. Saying otherwise is outright lying. The other two consoles are going to be more powerful; that's basically a "no shit" kind of statement. But the Wii U itself is still hardly a weak machine; its GPGPU in particular has plenty of muscle, and it will be able to provide HD graphics on par with the competition.

And before "on par" gets a bunch of fanboy flaming, what I (obviously) mean, is that when considering the Wii vs. PS360 phenominon last gen, it was less of an enticing prospect for some 3rd party developers to port games to Wii, because they would have to practically redo the games from scratch. The gap was far more noticeable.

The gap THIS gen will be almost negligible. That doesn't mean there won't be some later-gen games that the Wii U simply might not be able to do. But it does mean the gap between the Wii U and the PS4/One will be much smaller this gen, and most modern game engines SHOULD be built to scale, meaning that making Wii U versions of games should be much easier. It's already been said that CryEngine 3 and Unreal Engine 4 can run on it just fine, and it's very likely that, even though EA are acting like babies, if it can run THOSE engines it could certainly run the new Frostbite. Regardless, the Wii U has the hardware to compete; whether it's weaker or not, it will still be able to "keep up" where it counts.
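One concrete form of the scaling described above is dynamic resolution: an engine drops its internal render resolution when frames run over budget and raises it when there's headroom, so the same game tracks whatever hardware it's on. A minimal C++ sketch, with made-up thresholds and no particular engine's API:

```cpp
// Illustrative dynamic-resolution loop; frame times and steps are invented.
#include <algorithm>
#include <cstdio>

int main() {
    const double targetMs = 33.3;  // 30fps frame budget
    double renderScale = 1.0;      // fraction of native 1080p
    const double frameTimes[] = {30.0, 36.0, 40.0, 34.0, 31.0, 28.0, 27.0};

    for (double ms : frameTimes) {
        if (ms > targetMs)              renderScale -= 0.05; // over budget: shrink
        else if (ms < targetMs - 3.0)   renderScale += 0.05; // headroom: grow
        renderScale = std::clamp(renderScale, 0.5, 1.0);
        printf("frame %.1fms -> render at %.0f%% of 1080p (%dx%d)\n",
               ms, renderScale * 100.0,
               (int)(1920 * renderScale), (int)(1080 * renderScale));
    }
}
```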

And everyone knows that, at the VERY least, Nintendo's own games, the BEST of them (3D Mario, Zelda, etc.), will absolutely be just as impressive, graphically and otherwise, as what the other consoles have to offer. A game like Mario Galaxy, despite the Wii's weaker hardware, was STILL one of the prettiest and most impressive/innovative games of its gen. The same could be said for games like Skyward Sword, Metroid Prime 3, etc. There will undoubtedly be games on Wii U where that is once again the case.

After all, tech specs are nice, but the games are what ultimately matter.