fatslob-:O said:
curl-6 said:
fatslob-:O said:
The real question here is whether the game is going to show all 4K-8K textures at once. I presume they won't, because the Wii U is severely lacking in memory bandwidth, memory size, and TMUs to show every detail of those textures. The engine they have is probably modified to support John Carmack's MegaTexture technology, to dynamically stream textures at different resolutions.

Shin'en have said on Twitter that memory size was not a problem, as they could compress a 4K texture down to 10MB.
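As a rough sanity check on that 10MB figure (my own back-of-envelope arithmetic, not anything Shin'en stated, assuming BC1/DXT1-style block compression at 0.5 bytes per texel, which was standard on GPUs of that era):

```python
# Estimate the in-memory size of a block-compressed 4K texture.
# Assumption: BC1/DXT1-class compression (0.5 bytes per texel);
# a full mip chain adds roughly one third on top of the base level.
def compressed_size_mb(width, height, bytes_per_texel=0.5, mips=True):
    size = width * height * bytes_per_texel
    if mips:
        size *= 4 / 3  # geometric series of mip levels: 1 + 1/4 + 1/16 + ...
    return size / (1024 * 1024)

print(round(compressed_size_mb(4096, 4096), 1))  # ~10.7 MB with mips
print(compressed_size_mb(4096, 4096, mips=False))  # 8.0 MB base level only
```

Which lands right around Shin'en's claimed 10MB, so the number is at least plausible for a compressed format.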

As for memory bandwidth they previously said that:

"Theoretical RAM bandwidth in a system doesn’t tell you too much because GPU caching will hide a lot of this latency."

Just how many 4K-8K textures are they going to use, then? Once you get a lot of objects on screen, the complexity increases.

As for Shin'en's overstatement about memory bandwidth, there's only so much you can do with 32MB. The reason bandwidth was important in the first place was to feed the GPU; otherwise the functional units start to get underutilized. TMUs, shaders, ROPs, and everything else in the GPU are extremely dependent on it. The reason the X1 wasn't able to achieve 1080p in some multiplatform titles or exclusives is that main memory bandwidth was a bottleneck (aside from the lower ROP count, of course). How else does the GPU get fed with all the other data? You cannot keep constantly relying on the eDRAM to feed the GPU, much like how the X1 relies on its eSRAM! It eventually has to access main memory, because that is where most of the data resides. The purpose of caching is to SAVE BANDWIDTH by storing frequently accessed data. It was not meant to COMPLETELY FEED THE GPU.
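To put some (purely illustrative, made-up) numbers on the "caching saves bandwidth, it doesn't replace it" point:

```python
# Back-of-envelope: texel read bandwidth for one frame of texturing,
# and how much of it a texture cache deflects from main memory.
# All figures below are illustrative assumptions, NOT measured Wii U specs.
FRAME_W, FRAME_H, FPS = 1280, 720, 60
BYTES_PER_TEXEL = 0.5   # block-compressed textures
TEXTURE_LAYERS = 4      # e.g. diffuse + normal + specular + lightmap (assumed)
OVERDRAW = 1.5          # average shaded samples per pixel (assumed)
TEXEL_FETCHES = 8       # trilinear filtering reads 8 texels per sample

texture_bw = (FRAME_W * FRAME_H * FPS * BYTES_PER_TEXEL
              * TEXTURE_LAYERS * OVERDRAW * TEXEL_FETCHES)
print(f"{texture_bw / 1e9:.2f} GB/s of raw texel reads")
# A cache with a 90% hit rate cuts the main-memory share to a tenth,
# but that remaining tenth still has to come over the main RAM bus:
print(f"{texture_bw * 0.1 / 1e9:.2f} GB/s still hitting main RAM")
```

The cache absorbs the repeats; the misses still have to be paid for out of main memory bandwidth, which is the whole point.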

Now don't get me wrong! I'm not saying the Wii U isn't capable of handling 4K-8K textures, but it shouldn't be able to handle them on a regular basis, given its likely lack of TMUs and everything else I stated before. Do not fret about this issue; there are other ways of solving it, like I said before. The MegaTexture technology John Carmack introduced in RAGE resolves a lot of the issues around the Wii U's lack of bandwidth and TMUs by streaming only the highest-resolution texture data actually required for the assets of a scene. That conserves a lot of texture fillrate and bandwidth: the level of detail gets scaled back when objects are far away, and scaled up so that objects near the camera get the highest level of detail.
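The distance-based streaming idea above can be sketched like this. This is just a toy illustration of virtual texturing's mip selection, not id's actual algorithm; every name and threshold here is made up for the example:

```python
import math

# Toy sketch of MegaTexture-style streaming: pick a mip level per object
# from its camera distance, so only tiles at that resolution are streamed.
# base_distance and max_mip are illustrative, not real engine values.
def mip_for_distance(distance, base_distance=4.0, max_mip=12):
    """Each doubling of distance drops one mip level (halves texel density)."""
    if distance <= base_distance:
        return 0  # nearest objects get full-resolution tiles
    mip = int(math.log2(distance / base_distance))
    return min(mip, max_mip)

def tiles_to_stream(objects):
    """(name, distance) pairs -> (name, mip) pairs; only these hit memory."""
    return [(name, mip_for_distance(d)) for name, d in objects]

scene = [("wall", 2.0), ("tree", 9.0), ("mountain", 500.0)]
print(tiles_to_stream(scene))  # [('wall', 0), ('tree', 1), ('mountain', 6)]
```

The mountain 500 units away only ever asks for mip 6 data, so its 8K source texture never touches memory or the TMUs at full resolution.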


You sure of this? Are you really, really sure of this? Then could you explain this to me, please...

QUOTE:
"The easiest way that I can explain this is that when you take each unit of time that the Wii U eDRAM can do work with separate tasks as compared to the 1 Gigabyte of slower RAM, the amount of actual Megabytes of RAM that exist during the same time frame is superior with the eDRAM, regardless of the fact that the size and number applied makes the 1 Gigabyte of DDR3 RAM seem larger. These are units of both time and space. Fast eDRAM that can be used at a speed more useful to the CPU and GPU have certain advantages, that when exploited, give the console great gains in performance.
The eDRAM of the Wii U is embedded right onto the chip logic, which for most intent and purposes negates the classic In/Out bottleneck that developers have faced in the past as well. Reading and writing directly in regard to all of the chips on the Multi Chip Module as instructed."

ARTICLE