
Forums - Nintendo Discussion - Wii U's eDRAM stronger than given credit?

Your accusation is unfounded and incorrect, and your argument is flawed.

Those PC benchmarks use only DDR3, since no CPUs or motherboards support GDDR5; that memory is used in GPUs, and the PlayStation 4 is the first to use it as its main and only system RAM.

PC and console environments are different: the latter can have far, far less overhead, since software can be fully optimized for the specific hardware. I read the article at redtechgaming and was slightly disappointed in its research because of the lack of examples, the oversimplified and vague answers that may not be answers at all, the absence of testing, and the use of a Tom's Hardware RAM benchmark on Battlefield 3/4, which is rather laughable.

I wonder if the Wii U's CPU benefits more from lower CAS latency than the Jaguar CPUs in the Xbox One and PlayStation 4 do. I remember seeing a benchmark where Battlefield 3 benefited noticeably from lower-CAS RAM at the same frequency...

I just keep being amazed sometimes at how people can be such simpletons.



DevilRising said:

People who think that showing pics of a game with a dark/gritty/"realistic" art style vs. pics of a game with a more simplistic/colorful/cartoon art style is somehow "proof" of one piece of hardware being better than another always have been, and always will be, laughable. As in funny as hell.

You may have missed this. The new trend from your camp is that Nintendo games (cartoony) don't push Nintendo hardware. I disagree, based on Galaxy (Wii) and Twilight Princess (GC), but hey.

But grass in Super Mario is different from grass in Xenoblade, which in turn is different from grass in Crysis 1. The same goes for dust, fire, and water. There is some truth to those statements.

All grass is equal, but some grass is more equal than others. Do you think this has no effect on rendering demand?

 

 



I predict that the Wii U will sell a total of 18 million units in its lifetime. 

The NX will be a 900p machine

hated_individual said:
Your accusation is unfounded and incorrect, and your argument is flawed.

Those PC benchmarks use only DDR3, since no CPUs or motherboards support GDDR5; that memory is used in GPUs, and the PlayStation 4 is the first to use it as its main and only system RAM.

PC and console environments are different: the latter can have far, far less overhead, since software can be fully optimized for the specific hardware. I read the article at redtechgaming and was slightly disappointed in its research because of the lack of examples, the oversimplified and vague answers that may not be answers at all, the absence of testing, and the use of a Tom's Hardware RAM benchmark on Battlefield 3/4, which is rather laughable.

I wonder if the Wii U's CPU benefits more from lower CAS latency than the Jaguar CPUs in the Xbox One and PlayStation 4 do. I remember seeing a benchmark where Battlefield 3 benefited noticeably from lower-CAS RAM at the same frequency...

I just keep being amazed sometimes at how people can be such simpletons.

Something you remember is much more believable if you can provide a link.

Here's one: http://www.anandtech.com/show/4503/sandy-bridge-memory-scaling-choosing-the-best-ddr3/6

"The results weren't very stimulating, were they? Just as expected, gaming with faster memory just doesn't make any notable difference. I could have potentially lowered the resolution and settings in an attempt to produce some sort of difference, but I felt that testing these games at the settings they're most likely to be played at was far more enlightening. If you want better gaming performance, the GPU is the best component to upgrade—no news there."

Of course I know that GDDR5 will have higher latency, but the overall effect on the CPU, in the balance of cache size, memory throughput, and GPU/CPU contention, is not well understood.

Of course, the heaviest user of this unified memory is the GPU, where there are clear benchmarks showing the benefits of higher-bandwidth GDDR5 over budget and laptop GPUs paired with DDR3.
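For anyone who wants to see where those bandwidth gaps come from, here's a rough Python sketch of the peak-bandwidth arithmetic. The bus widths and data rates below are just the commonly cited specs for these consoles, used as illustrative inputs rather than gospel.

# Theoretical peak bandwidth: bytes per transfer x transfers per second.
# Inputs are commonly cited specs, used here purely as illustrative assumptions.
def peak_bandwidth_gbs(bus_width_bits, data_rate_mts):
    bytes_per_transfer = bus_width_bits / 8
    return bytes_per_transfer * data_rate_mts / 1000  # MB/s -> GB/s

print(peak_bandwidth_gbs(256, 5500))  # PS4 GDDR5: ~176 GB/s
print(peak_bandwidth_gbs(256, 2133))  # Xbox One DDR3: ~68 GB/s
print(peak_bandwidth_gbs(64, 1600))   # Wii U DDR3 (main RAM pool only): ~12.8 GB/s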

Given the difference in framerate in games like Tomb Raider on PS4 & XB1, I'd say GDDR5 looks good on balance, higher latency or not.



My 8th gen collection

ICStats said:
curl-6 said:

Like PS4/Xbone, Wii U focuses on (GP)GPU and memory over CPU, while PS3/360 are leftovers of an era where CPUs were more prioritised.

Agreed, though devs have yet to really offload work to the GPU on any of the PS4/Xbone/Wii U in their first years, and while the Wii U is mostly in a commercial position to receive ports from PS3/360 (if that), it's unfortunate that it's lightweight on CPU compared to those. Things would have been so different if devs were porting games with ease and reviewers were consistently giving the Wii U the thumbs up over PS3/360.

Same article:

"There was even some discussion on trying to utilise the GPU via compute shaders (GPGPU) to offload work from the CPU - exactly the approach I expect to see gain traction on the next-gen consoles - but with very limited development time and no examples or guidance from Nintendo, we didn't feel that we could risk attempting this work.

If we had a larger development team or a longer timeframe, maybe we would have attempted it, but in hindsight we would have been limited as to what we could have done before we maxed out the GPU again. The GPU is better than on PS3 or Xbox 360, but leagues away from the graphics hardware in the PS4 or Xbox One."

In fact, I think you know that, aside from the line saying they couldn't take advantage of everything, this article was extremely damning of the Wii U, including "it is unlikely that we would ever release another Wii U title."

Anyway thanks for the debate.

I am quite happy with the quality of Nintendo games on the platform, but I don't expect miracles like some fans do.

They are mostly damning of Nintendo's handling of the situation: shitty dev tools, insufficient communication, no help with development, etc.

They praise the GPU and eDRAM.

And I expect no miracles. I simply expect Wii U graphics to improve over what we have seen in its first 16 months, just like any console.



hated_individual said:
Your accusation is unfounded and incorrect, and your argument is flawed.

Those PC benchmarks use only DDR3, since no CPUs or motherboards support GDDR5; that memory is used in GPUs, and the PlayStation 4 is the first to use it as its main and only system RAM.

PC and console environments are different: the latter can have far, far less overhead, since software can be fully optimized for the specific hardware. I read the article at redtechgaming and was slightly disappointed in its research because of the lack of examples, the oversimplified and vague answers that may not be answers at all, the absence of testing, and the use of a Tom's Hardware RAM benchmark on Battlefield 3/4, which is rather laughable.

I wonder if the Wii U's CPU benefits more from lower CAS latency than the Jaguar CPUs in the Xbox One and PlayStation 4 do. I remember seeing a benchmark where Battlefield 3 benefited noticeably from lower-CAS RAM at the same frequency...

I just keep being amazed sometimes at how people can be such simpletons.

CAS latency is rated in clock cycles, so the real-world delay it represents depends on the memory's clock rate.
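To make that concrete, here's a minimal Python sketch of the cycles-to-nanoseconds conversion; the module timings are typical retail DDR3 examples picked for illustration, not console specs.

# CAS latency is rated in I/O clock cycles; for double-data-rate memory the
# I/O clock runs at half the data rate, so the absolute delay in ns is:
#   cas_cycles / (data_rate_mts / 2) * 1000
def cas_latency_ns(cas_cycles, data_rate_mts):
    io_clock_mhz = data_rate_mts / 2
    return cas_cycles / io_clock_mhz * 1000

print(cas_latency_ns(11, 1600))  # DDR3-1600 CL11 -> 13.75 ns
print(cas_latency_ns(9, 1600))   # DDR3-1600 CL9  -> 11.25 ns

Same frequency, different CAS rating, different real-world latency, which is why "lower CAS at the same frequency" benchmarks can show a gap at all.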

CPUs, especially x86 architectures, have been designed around avoiding cache misses that force them to retrieve data from system RAM, because regardless of what RAM you use, system RAM is inferior to the L1/L2/L3/L4 caches and their derivatives.
Modern x86 processors are many times faster than what system RAM can provide in terms of latency; a trip to main memory can cost 500+ cycles in some cases. Intel and AMD have spent billions of dollars in R&D and thousands of man-hours getting around that bottleneck over the past few decades, and millions of transistors are spent working around it.

The main technologies used to avoid the CPU having to travel all the way down to system RAM are shared and larger caches, cache prediction, intelligent look-ahead, intelligent prefetchers, and better buses (i.e. the move from the FSB to an on-die memory controller), etc.
The main thing Intel and AMD try to do is "predict" the data that will be required ahead of time and get it into the local caches as early as possible, ready for processing.

If there is a cache miss, then hundreds of cycles are wasted; regardless of the RAM you have, there will *always* be a worst-case scenario where there is a miss and the CPU has to waste cycles.
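To put some rough numbers on that, here's a back-of-the-envelope average memory access time (AMAT) sketch in Python; the hit cost, miss rate, and miss penalty are illustrative assumptions, not measurements of any of these console CPUs.

# AMAT = hit_time + miss_rate * miss_penalty (all in CPU cycles here).
# The figures are made up but plausible, purely to show how sensitive the
# average is to the miss rate when a trip to system RAM costs hundreds of cycles.
def amat_cycles(hit_time, miss_rate, miss_penalty):
    return hit_time + miss_rate * miss_penalty

print(amat_cycles(hit_time=4, miss_rate=0.01, miss_penalty=300))  # 7.0 cycles
print(amat_cycles(hit_time=4, miss_rate=0.05, miss_penalty=300))  # 19.0 cycles

A few percent more misses and the average cost of a memory access nearly triples, which is exactly why all that prediction and prefetching hardware exists.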

The PlayStation 4's memory, though, in terms of nanoseconds (not latency per clock like CAS) is probably only roughly 20% higher in latency than the Xbox One's main RAM. (The Xbox One's eSRAM can also act like an L4 cache, tilting things in the Xbox's favour, but that's an entirely different branch of the topic.)
Whether that will have an impact on a console designed mostly for displaying pretty pictures remains to be seen; consoles have never taken CPU performance seriously.
And why should they? The masses only care about graphics. :P It's what sells copies of Call of Battlefield: Generic War Fighter 2075. :P



--::{PC Gaming Master Race}::--

SubiyaCryolite said:
DevilRising said:

People who think that showing pics of a game with a dark/gritty/"realistic" art style vs. pics of a game with a more simplistic/colorful/cartoon art style is somehow "proof" of one piece of hardware being better than another always have been, and always will be, laughable. As in funny as hell.

You may have missed this. The new trend from your camp is that Nintendo games (cartoony) don't push Nintendo hardware. I disagree, based on Galaxy (Wii) and Twilight Princess (GC), but hey.

But grass in Super Mario is different from grass in Xenoblade, which in turn is different from grass in Crysis 1. The same goes for dust, fire, and water. There is some truth to those statements.

All grass is equal, but some grass is more equal than others. Do you think this has no effect on rendering demand?

"Your camp"? Would you listen to yourself?

For Pete's sake, stop taking this so seriously and being such a passive-aggressive fountain of negativity.



Pemalite...

Since the Wii U's CPU and GPU can access each other directly, either the CPU or the GPU could process the needed data first and then shuffle it back and forth between them for as long as possible/needed by the game, right? That is, assuming it's possible to forward data from system RAM through each of them after processing it...

The eDRAM in the Wii U's GPU is basically an L3 cache for the CPU, in a way.



curl-6 said:
SubiyaCryolite said:
DevilRising said:

People who think that showing pics of a game with a dark/gritty/"realistic" art style vs. pics of a game with a more simplistic/colorful/cartoon art style is somehow "proof" of one piece of hardware being better than another always have been, and always will be, laughable. As in funny as hell.

You may have missed this. The new trend from your camp is that Nintendo games (cartoony) don't push Nintendo hardware. I disagree, based on Galaxy (Wii) and Twilight Princess (GC), but hey.

But grass in Super Mario is different from grass in Xenoblade, which in turn is different from grass in Crysis 1. The same goes for dust, fire, and water. There is some truth to those statements.

All grass is equal, but some grass is more equal than others. Do you think this has no effect on rendering demand?

"Your camp"? Would you listen to yourself?

For Pete's sake, stop taking this so seriously and being such a passive-aggressive fountain of negativity.

 

No, it's a new trend; such assertions have been brushed aside numerous times before. DevilRising did the exact same thing. A lot of fans are disagreeing with you in that thread you created.

I remember you quoting every "non-anonymous" 2D game creator when the Wii U secret developer article first came out, as a counter to his largely negative opinions. You were still firmly in lazy-dev land at that point. Now you agree with him.

You argue that Trine 2 pushes the system hardest, yet its enhancements aren't a generational leap above the 360 version of the same game.

You argue that all of Ninty's 60fps titles aren't pushing the system. So are you basically saying the system's best-looking games will all be 720p30, and that all the 1080p secret-sauce theories are a pipe dream?

Basically 720p30, like last gen's best games. Yet you get offended when anyone states that the machine's power is barely a step up from last gen?

Where do you stand on the U? No flip-flopping. In black and white, plain and simple. It's reasonable to expect 1080p60/30 from a PS4, and reasonable to expect 900p30/60 from an X1. What is reasonable for a U?



I predict that the Wii U will sell a total of 18 million units in its lifetime. 

The NX will be a 900p machine

I'd say it's reasonable to expect 720p at 60fps with v-sync enabled, considering that's what we've seen in the majority of first-party titles so far.

You're all going around in circles, people. Just wait until Project CARS is released.



I love how this thread re-emerges every week.



I predict NX launches in 2017 - not 2016