
Shin'en: If you can't make great looking games on Wii U, it's not the hardware's fault

Kane1389 said:
curl-6 said:
Kane1389 said:

But we know Kratos still looks better than any of Bayonetta's characters so far. Maybe if SSM (or a similarly talented studio) were working on Wii U, we'd have a better looking game.

Until we know the poly count on Bayo 2's characters, though, we can't be certain that Kratos is higher poly.

As for Sony Santa Monica, there are few devs on the planet as skilled at graphics as them, and Platinum is not one of them. That said, I find Platinum's games more fun than SSM's. I like my action games batshit insane. :)


I agree when it comes to sheer action gameplay. Metal Gear Rising was a much better hack 'n' slash game than any GOW, but it loses badly in pretty much all other aspects.

I actually haven't played MGS Rising yet. I will eventually, since it's by Platinum.

For me though, Bayonetta 1 blew the pants off God of War. So much faster, more fluid, and energetic.

Not that God of War is bad at all, Bayo just blew my mind.



forethought14 said:
fatslob-:O said:

Dude, the CPUs in the PS4 and Xbone are 5 times faster in floating point workloads, and I'm pretty sure Jaguar is more similar to Bulldozer than Phenom, so it's no slouch in the integer performance department either.

Oh, and there are 4 modules I think, which is completely different from a core. Each of those modules possesses two 128-bit SIMDs.

What do you mean by "blow"? If anything, core for core the Jaguar performs almost twice as fast as the Wii U does in terms of floating point operations.

Yeah this convo is quite a bit off topic.

The issue with Espresso is that paired singles are completely different from other forms of SIMD (it's not even "real" SIMD), and porting heavy FP code from a 128-bit SIMD unit to 32-bit x 2 paired singles isn't an easy operation. It's not that it can't do the work, it just won't do it in the same way, nor in the same amount. And the fact that there are only 3 cores is a limiting factor. The amount of stuff it can do is limited, but it's in no way a terrible architecture. Running several modern games on 32-bit x 2 paired singles is impressive, and it's surprising that it did that well in the matmul SIMD tests, something it shouldn't be able to excel at.
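To make the porting issue concrete, here's a rough sketch (plain C standing in for intrinsics; the function names are mine, not actual Espresso or Jaguar code): a 128-bit SIMD unit works on four single-precision floats per instruction, while paired singles only handle two, so the same vectorised loop has to be re-chunked and ends up issuing roughly twice as many FP instructions.

```c
#include <stddef.h>

/* Illustration only: adding two float arrays.
   A 128-bit SIMD unit (as on Jaguar) handles 4 floats per instruction;
   paired singles (as on Espresso) handle only 2. */

void add_4wide(float *dst, const float *a, const float *b, size_t n)
{
    for (size_t i = 0; i < n; i += 4) {   /* one 128-bit op per iteration */
        dst[i + 0] = a[i + 0] + b[i + 0];
        dst[i + 1] = a[i + 1] + b[i + 1];
        dst[i + 2] = a[i + 2] + b[i + 2];
        dst[i + 3] = a[i + 3] + b[i + 3];
    }
}

void add_paired(float *dst, const float *a, const float *b, size_t n)
{
    for (size_t i = 0; i < n; i += 2) {   /* one paired-single op per iteration */
        dst[i + 0] = a[i + 0] + b[i + 0];
        dst[i + 1] = a[i + 1] + b[i + 1];
    }
}
```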

That core surprises people like this:

http://www.radgametools.com/bnkhist.htm

  • Added Wii-U support for Bink 2 - play 30 Hz 1080p or 60 Hz 720p video! We didn't think this would be possible - the little non-SIMD CPU that could!

Something like Espresso can decode Bink 2 at 1080p/30 or 720p/60, a workload that leans heavily on SIMD, even though Espresso doesn't have real SIMD. I'm pretty sure a more modern CPU with proper SIMD instructions would do this comfortably, but Espresso being able to do it AT ALL is surprising. Who knows what sorts of things it will be able to do in the future.

No, 32-bit x 2 paired singles is not as good as having true 128-bit SIMD, but that doesn't mean we won't be seeing amazing physics, just not at the same levels as PS4/X1. Calling it "weak" or "terrible" is an exaggeration, though. With respect to multi-hundred dollar CPUs, yes, it's a joke; all 8th generation console CPUs are terribly under-powered compared to even some lower-end CPUs on the market, and that's a shame. Core-for-core, Jaguar isn't going to do stuff that Espresso couldn't handle one way or another (it could do it, just not at the same level), and that won't make downports from next gen impossible, just difficult. Much better than trying to port from Xenon to Broadway though!

Anyway, Shin'en had better show something good for their next game, because talking "big" like that has to be backed up by actually doing something nice. Nano Assault looks okay, and it's cool that it runs on only the main CPU core, but it's not going to impress many people. They had better use tessellation well in their next game, not just for small things.

If I were you I wouldn't depend on an AMD card for tessellation, because they're known for their not-so-stellar performance at higher tessellation factors.



Pemalite said:
fatslob-:O said:

Dude, the CPUs in the PS4 and Xbone are 5 times faster in floating point workloads, and I'm pretty sure Jaguar is more similar to Bulldozer than Phenom, so it's no slouch in the integer performance department either.

Oh, and there are 4 modules I think, which is completely different from a core. Each of those modules possesses two 128-bit SIMDs.

What do you mean by "blow"? If anything, core for core the Jaguar performs almost twice as fast as the Wii U does in terms of floating point operations.

Yeah this convo is quite a bit off topic.


Bulldozer/Vishera's IPC is actually lower than the Phenom II's. Both Stars and Bulldozer are slower in terms of IPC than the Core 2 series, which puts performance into perspective. Of course, the Phenom II will switch into another gear when you pump up the NB clock by roughly 33%, which can yield upwards of 15-20% in IPC increases.

Jaguar has more in common with Brazos than Bulldozer anyway; if anything, Brazos was an evolutionary reworking of the K10h architecture.
Comparing Jaguar to Brazos, it's the same 2-wide design and L1 cache, with the same execution blocks as Brazos; it's an evolutionary step, not a revolutionary one, which is standard with PC processors.
AMD went about improving performance by throwing in a loop buffer, improving the cache predictor, improving the instruction buffer and providing more instructions than you can poke a stick at.

One takeaway from it all is that Brazos had a very poor floating point unit; AMD fixed that with Jaguar by doubling up on essentially all of the FP execution blocks. However, what most people don't realise is that game engines don't only use floating point math and CPUs don't just deal with floating point, so using FP numbers as a way to determine a CPU's performance is completely and utterly pointless.
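A toy example of that split (my own illustration, nothing from the thread): a good chunk of a game frame is integer and branch heavy and never touches the FP units, so peak FLOPS says little about it.

```c
#include <stdint.h>

/* Integer/branch heavy: count occupied neighbours on a tile grid (AI/game logic). */
int live_neighbours(const uint8_t *grid, int w, int h, int x, int y)
{
    int count = 0;
    for (int dy = -1; dy <= 1; dy++)
        for (int dx = -1; dx <= 1; dx++) {
            if (dx == 0 && dy == 0) continue;
            int nx = x + dx, ny = y + dy;
            if (nx >= 0 && ny >= 0 && nx < w && ny < h)
                count += grid[ny * w + nx];
        }
    return count;
}

/* Floating-point heavy: one Euler integration step for a particle (physics). */
void integrate(float *px, float *py, float *vx, float *vy, float dt)
{
    const float gravity = -9.81f;
    *vy += gravity * dt;
    *px += *vx * dt;
    *py += *vy * dt;
}
```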

Just so you know, IPC isn't the whole story of performance. Oh, and you appear to be right about Jaguar (got to check things out more often on my part). I'm pretty sure floating point performance is a standard benchmark; there are others, like integer performance, but that is not as important as the former.



For all the talk about Wii U graphics looking better than current gen, all you have to do is post direct-feed gameplay pics of the best looking PS3/360 games vs the best looking Wii U games; it's easy to see that current gen games still look much better than the best of Wii U.



forethought14 said:
fatslob-:O said:

@Bold, here's where you're wrong: Jaguar features wider SIMD units, as evidenced by its AVX extension, while also having way more than 2 cores.

Can I also get some source on how those cores are based off of a G5, please?

Developers probably have to strip some workloads for the Wii U processor to get it running. How do you think developers got games running on older consoles? BTW, the CPU isn't all-important for rendering purposes today.

Can you show some evidence that the figures for JAGUAR AND NOT BOBCAT are close to the Espresso?

BTW, those were comparisons. Whether you like it or not, I can still accept the fact that it has a weak CPU, plus I'm seriously worried about the Wii U in the future unless of course Nintendo drops support easily.

Jaguar is a 4-core CPU with 2MB of cache, performance enhancements and new instructions added over Bobcat; that's it. Obviously a Jaguar CPU alone would best a Bobcat CPU since there are more cores, and they have 15% higher IPC, 128-bit SIMD instead of 64-bit, and new instruction sets, but this is NOT going to do as much to increase overall per-core performance as you think. You will not see anything near double the performance out of this, because it's only the SIMD width that doubled. Real-world performance (factoring everything in) will be a bit over 50% more than Bobcat. Again, I'm talking per core.

And an 8-core Jaguar comes to 102.4 GFLOPS: 8 FLOPS per cycle per core, so 8 x 1.6 GHz = 12.8 GFLOPS per core. 2 cores are reserved for the OS in PS4/X1, leaving 6 usable cores = 76.8 GFLOPS for the PS4 CPU and 84 GFLOPS for the X1 CPU (8 x 1.75 GHz x 6).
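A quick sketch of that arithmetic, using the poster's assumptions (8 single-precision FLOPS per cycle per core, PS4 at 1.6 GHz, X1 at 1.75 GHz, 6 usable cores each):

```c
#include <stdio.h>

/* Peak-FLOPS arithmetic from the post's assumptions:
   8 single-precision FLOPS per cycle per Jaguar core. */
static double gflops(double ghz, int cores, int flops_per_cycle)
{
    return ghz * flops_per_cycle * cores;
}

int main(void)
{
    printf("Jaguar core @ 1.6 GHz  : %.1f GFLOPS\n", gflops(1.60, 1, 8)); /* 12.8  */
    printf("8 cores     @ 1.6 GHz  : %.1f GFLOPS\n", gflops(1.60, 8, 8)); /* 102.4 */
    printf("PS4 (6 usable cores)   : %.1f GFLOPS\n", gflops(1.60, 6, 8)); /* 76.8  */
    printf("X1  (6 usable cores)   : %.1f GFLOPS\n", gflops(1.75, 6, 8)); /* 84.0  */
    return 0;
}
```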

Xenon is based on the PPE, which is a PowerPC 970, which is a G5. The PPE in Cell is a PowerPC 970, which is a G5. G5s were designed for high clocks rather than high IPC, with improved SIMD performance, but they are in-order processors. Here's a link with a good explanation for both:

http://lowendmac.com/ed/bashur/12db/g5-gaming.html

No I can't, because no one has tested Jaguar yet. But if we look at this:

http://semiaccurate.com/assets/uploads/2012/08/slide-1-728.jpg

We can take into account the enhancements over Bobcat, and easily make the calculations from the test blu made on NeoGAF. The numbers I gave you for Jaguar aren't going to change much (they are probably higher, but likely only about 15-20ish% higher due to the 128-bit SIMD, newer instructions and increased cache). You can also add a bit to the numbers on Espresso, considering there is more cache in it compared to Broadway and its eDRAM cache is lower latency (I based the Espresso numbers on a 256KB Broadway; remember that two Espresso cores have 512KB of cache and one has 2MB, as much cache as one Jaguar CPU, which will also add a bit to performance, probably around 5% +/-). But still, the fact that it's that close to Jaguar core-for-core says a lot about how efficient Espresso is. Of course, the advantage the PS4/X1 CPUs will have over Espresso is obviously core count, newer instructions and a 128-bit floating point unit, but core-for-core, it's not going to blow Espresso away.

But anyway, this isn't a thread to discuss the CPUs in PS4/X1, this is about Wii U, so I'm gonna stop here. 



Great post, but I thought I'd add two important things. Firstly, Espresso has a ridiculously short pipeline (4 stages!) while Jaguar has 17 stages, so over four times as many stages to go through before an instruction completes; and secondly, Espresso also has access to 32MB of eDRAM in addition to the 3MB of CPU cache.

The second point above is HUGE.




He has a point. We can ignore the CPU to some degree since it's part of how the system has been designed (virtually no CPU bottleneck) so it comes down to GPU and RAM.

The GPU alone is above the PS3 and X360, with two thirds of it identified, along with more modern API. The processors and shaders are more advanced, naturally.

The RAM amount is more than double PS3 and X360 when it comes to game development (~448MB vs. 1024MB). Additionally, if developers use the eDRAM (32MB as opposed to the X360's 10MB) to buffer as much as possible, then they can offset plenty of the bandwidth that looks lacking.
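Some back-of-the-envelope buffer sizes to show why the 32MB matters (my own illustrative numbers, 32-bit colour and 32-bit depth assumed, not official figures):

```c
#include <stdio.h>

/* Rough render-target sizes at 720p, assuming 4 bytes per pixel
   for colour and 4 bytes per pixel for depth/stencil. */
int main(void)
{
    const double MB = 1024.0 * 1024.0;
    double color_720p  = 1280.0 * 720.0 * 4;   /* ~3.5 MB  */
    double depth_720p  = 1280.0 * 720.0 * 4;   /* ~3.5 MB  */
    double msaa4x_pair = (color_720p + depth_720p) * 4; /* ~28.1 MB */

    printf("720p colour buffer           : %.1f MB\n", color_720p / MB);
    printf("720p colour + depth          : %.1f MB\n", (color_720p + depth_720p) / MB);
    printf("720p colour + depth, 4xMSAA  : %.1f MB\n", msaa4x_pair / MB);
    /* ~28 MB still fits in 32 MB of eDRAM, but overflows the X360's
       10 MB, which is why tiling was needed there for 720p + 4xMSAA. */
    return 0;
}
```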

 

Comparing to PS4 and X1 is a different story.




Solid-Stark said:

He has a point. We can ignore the CPU to some degree since it's part of how the system has been designed (virtually no CPU bottleneck) so it comes down to GPU and RAM.

The GPU alone is above the PS3 and X360, with two thirds of it identified, along with more modern API. The processors and shaders are more advanced, naturally.

The RAM amount is more than double PS3 and X360 when it comes to game development (~448MB vs. 1024MB). Additionally, if developers use the eDRAM (32MB as opposed to the X360's 10MB) to buffer as much as possible, then they can offset plenty of the bandwidth that looks lacking.

 

Comparing to PS4 and X1 is a different story.

something somethin Red Dead Redemption !!1!!

something something The Last of Us vs. Super Mario Bros. U !!1!11!one !!



orniletter said:
Solid-Stark said:

He has a point. We can ignore the CPU to some degree since it's part of how the system has been designed (virtually no CPU bottleneck) so it comes down to GPU and RAM.

The GPU alone is above the PS3 and X360, with two thirds of it identified, along with more modern API. The processors and shaders are more advanced, naturally.

The RAM amount is more than double PS3 and X360 when it comes to game development (~448MB vs. 1024MB). Additionally, if developers use the eDRAM (32MB as opposed to the X360's 10MB) to buffer as much as possible, then they can offset plenty of the bandwidth that looks lacking.

 

Comparing to PS4 and X1 is a different story.

something somethin Red Dead Redemption !!1!!

something something The Last of Us vs. Super Mario Bros. U !!1!11!one !!

I agree such comparisons can be very ignorant, since there aren't many multiplats (or they simply don't exist) that do the Wii U justice in comparison to PS3 and X360. Time will change that, not to mention the inevitable Nintendo first-party IPs such as Metroid, or even Mario Kart 8.




Solid-Stark said:

I agree such comparisons can be very ignorant, since there aren't many multiplats (or they simply don't exist) that do the Wii U justice in comparison to PS3 and X360. Time will change that, not to mention the inevitable Nintendo first-party IPs such as Metroid, or even Mario Kart 8.

Agreed on all points!

The best comparison we have right now is probably Bayonetta 1 (X360) vs Bayo 2.

Both are in the same genre, by the same developer, exclusively for one system (as everyone knows, Bayo 360 was made in-house by Platinum and the shitty PS3 port was outsourced to Nex Entertainment)

...and the sequel compares quite favourably to the first game (v-sync, superior framerate, more going on, better textures, etc.)

...but I guess RDR out-visuals everything ever created in the history of eyesight



ninjablade said:

The reason is Nintendo is embarrassed to show their specs. It's a 160 SP GPU, and Nintendo wants to keep it a secret. No other company hides their specs like Nintendo, and Nintendo only started hiding their specs with the Wii brand, and the reason is because the specs are embarrassing compared to the competition.

Pretty sure MS has done a great job of not providing the actual APU information, just a bland and purposefully confusing sum of processors. Why? Because they wanted to prevent direct-from-source comparisons.

Same as Nintendo.

There is a lot more to it than the GPU being HD 7xxx-based.

Wii U is certainly far more capable and we're just starting to see glimpses of what is in store.

X was likely shown very early on, and it looked phenomenal.
I'm betting Watch_Dogs will look and play better on Wii U than on PS360, and in reality not be all that far from PS4/Xbone.

There's a reason EA didn't put its sports games on Wii U: they would have been identical to PS4/Xbone. Instead EA built crap last year to create a situation where they could not release this year, to help MSony. Now in 2014 they'll come back with artificially gimped work... just like last year.

By then it won't matter, though, as Nintendo will have the titles it needs with what is coming now plus Smash/MK8.