fatslob-:O said:
The real question here is: is the game going to show all 4K-8K textures at once? I presume they won't, because the Wii U is severely lacking in memory bandwidth, memory size, and TMUs to even attempt showing every detail of the textures. The engine they have is probably modified to support John Carmack's MegaTexture technology to dynamically stream textures at different resolutions.
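(For context on that kind of MegaTexture-style streaming: the engine keeps only the tiles and mip levels the camera actually needs resident in memory. A minimal sketch of the mip-selection idea, with my own illustrative function and numbers, not id Software's or any shipping engine's actual code:

import math

# Pick a mip level from how many texels of the full-res texture map to one
# screen pixel. A ratio of 8 means the full texture is 8x oversampled, so a
# 1/8-resolution page can be streamed in instead of the full 4K-8K asset.
def select_mip(texels_per_pixel, max_mip):
    if texels_per_pixel <= 1.0:
        return 0                  # surface fills the screen: need full res
    return min(max_mip, int(math.log2(texels_per_pixel)))

print(select_mip(8.0, 10))        # -> 3: stream the 1/8-resolution page

That way memory only ever holds what is visible, which is exactly how an engine can show huge textures on limited RAM.)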

Are you suggesting that the eDRAM is just 512 bits wide? Just by checking Renesas eDRAM at 40 nm you can clearly see that 512-bit interfaces are no longer offered; the options are 1024, 4096 and 8192 bits. Besides, the Xbox 360 has a 4096-bit eDRAM on a separate die, with 8 ROPs having full access to the internal eDRAM bandwidth of 256 GB/s; the drawback is that, since it is on a separate die, the resolve from eDRAM to the GPU has to pass through an external 32 GB/s bus.
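As a sanity check on those Xbox 360 figures, here is the basic arithmetic (500 MHz is the commonly published eDRAM clock; treat the 512-bit pairing for the external bus as my assumption, since only the 32 GB/s figure itself is the published spec):

# Peak bus bandwidth: width (bits) / 8 * clock (Hz) -> bytes per second.
def peak_gbs(bus_width_bits, clock_mhz):
    return bus_width_bits / 8 * clock_mhz * 1e6 / 1e9

print(peak_gbs(4096, 500))  # 256.0 -- internal eDRAM<->ROP bandwidth in GB/s
print(peak_gbs(512, 500))   # 32.0  -- one width/clock combination matching the external resolve figure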

Sum that with the fact that the Wii U's main RAM doesn't have huge bandwidth, and it's obvious that no matter how good a port of an Xbox 360 game is, it won't run any better on the Wii U, and lazy ports fare even worse. The Xbox 360 has just 10 MB of eDRAM and only 1 MB of cache, while the Wii U has triple both amounts. But when you do a direct port, the source code tells the Wii U to use just 10 MB of eDRAM and only 1 MB of cache, and everything that could have fit in the extra space goes to the slower main RAM instead, simply because the source code tells the Wii U to do so.
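To make that concrete, here is a minimal sketch; the constant names and the behavior are hypothetical, just to illustrate how a hard-coded budget ignores the extra memory:

# Hypothetical budgets in MB; a straight port keeps the Xbox 360's limits.
XBOX360_EDRAM_MB, XBOX360_CACHE_MB = 10, 1
WIIU_EDRAM_MB, WIIU_CACHE_MB = 32, 3  # triple both, as noted above

edram_budget = XBOX360_EDRAM_MB       # the ported code still requests 10 MB
unused = WIIU_EDRAM_MB - edram_budget
print(f"{unused} MB of eDRAM sit idle; the overflow lands in slow main RAM")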

Of course developers can change this, but it takes time to reallocate resources and determine what fits in the eDRAM and cache and what does not.
And do you think they would do it? Pfff, in most cases they don't, since they are in a hurry. Just seeing that Call of Duty: Ghosts runs at the same resolution as the Xbox 360 version it was ported from, even though the Wii U has more eDRAM to handle it, tells you it was a quick port.
The Xbox 360 uses one core for sound, so when you port directly you also waste one of the Wii U CPU's three cores on sound instead of using the DSP. That extra core could have been used for something else; if audio were done on the DSP and all three cores were free for the game, the Wii U would effectively have an extra core compared to the Xbox 360. But of course, since it's a port, the DSP in most cases goes to waste, and even worse, the developers burn one of the Wii U's CPU cores on sound.

Why would Nintendo use a narrower eDRAM interface than the Xbox 360's anyway? Renesas (NEC, the previous producer of the Xbox 360's eDRAM, is now part of Renesas) already said that the eDRAM in the Wii U uses the latest technologies, and Shin'en says the Wii U's eDRAM has high bandwidth. And keep in mind that the Wii U's eDRAM is embedded into the GPU itself, not on a separate die like on the Xbox 360.

"Nintendo could try to contract another company to produce the component, but there are circumstances that make it difficult. According to a Renesas executive the production of that semiconductor was the result of the “secret sauce” and state-of-the-art know-how part of the NEC heritage of the Tsuruoka plant, making production elsewhere difficult. In order to restart mass production in a different factory redesigning the component may be necessary."

"Wii U GPU and its API are straightforward. Has plenty of high bandwidth memory. Really easy."

"Especially easy when compared with the tricks you need to do on current gen consoles."

“What surprises me with Wii U is that we don’t have many technical problems. It’s really running very well, in fact. We’re not obliged to constantly optimize things. Even on the PS3 and Xbox 360 versions [of Origins], we had some fill-rate issues and things like that. So it’s partly us – we improved the engine – but I think the console is quite powerful. Surprisingly powerful. And there’s a lot of memory. You can really have huge textures, and it’s crazy because sometimes the graphic artists – we built our textures in very high definition. They could be used in a movie. Then we compress them, but sometimes they forget to do the compression and it still works! [Laughs] So yeah, it’s quite powerful. It’s hard sometimes when you’re one of the first developers because it’s up to you to come up with solutions to certain problems. But the core elements of the console are surprisingly powerful.”

I know you will deny it and try to disprove it. What you are suggesting is not proven; it is based on a rumor that you state as a fact. The creator of the thread that started that rumor himself said it is a rumor and to treat it as one, yet you don't treat it as a rumor, but rather as fact/evidence.

As Renesas said in their statement/response, it uses state-of-the-art eDRAM, meaning the most up-to-date and best possible...

You will say that Shin'en is not credible because they make games only for Nintendo platforms, yet they also do soundtracks for games (they have done them for 200 games) and they make audio middleware that is used on various platforms. Another thing you will use to discredit Shin'en is that they have a positive opinion of the Wii U, unlike other large/mainstream developers that are/were negative about it. You will also discredit them by saying that they are a small team and not a large developer, and question their capabilities, knowledge, and experience, even though they have over two decades of experience and were part of the demo scene in the 1990s, which gives them more experience than most in compression and optimization.

I know that you will try to disprove the statement of Michel Ancel, who has worked for Ubisoft for almost 20 years! He is a game designer as well as a programmer and graphic artist, so he knows what he is talking about and can firmly back up his own statements.

Tell me, how is it that the old Nintendo GameCube, with just 4 megabytes of embedded memory, can surpass those 35 GB/s you are claiming for the whole 32 megabytes of the Wii U's eDRAM?

 

How did you calculate the bandwidth anyway?
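My guess, for reference: the usual back-of-the-envelope formula is bus width in bits, divided by 8, times clock. A 512-bit bus at the Wii U GPU's roughly 550 MHz clock gives almost exactly 35 GB/s, which is presumably where that figure comes from; the widths and the GameCube pairing below are my assumptions, not confirmed specs:

# Peak bus bandwidth in GB/s from width (bits) and clock (MHz).
def gbs(bits, mhz):
    return bits / 8 * mhz * 1e6 / 1e9

print(gbs(512, 550))   # 35.2 -- a 512-bit bus at the Wii U's ~550 MHz GPU clock
print(gbs(1024, 550))  # 70.4 -- the same clock with a 1024-bit interface
print(gbs(512, 162))   # 10.4 -- GameCube's texture cache at Flipper's 162 MHz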