
FAST Racing NEO powered by 2nd generation engine for Wii U supports and uses 4k-8k textures

fatslob-:O said:
megafenix said:

Nope, I am actually the guy who makes posts at IGN under the same name.

Wanna see?

The world is wider than you think; don't think you have just one rival.

If you allow me, I can give you the formula to do it, but since you are the expert here, I am just waiting.

You probably won't give me the formula, seeing as it will probably require calculus 3 shit that I don't have available in my toolbox until next year, but at the same time I doubt that you have it either.


Seriously?

You don't even know how to get the Xbox 360 eDRAM's internal bandwidth?

Pff, sorry, sorry, I thought you really were a tech expert.

Come on, try. Here are some clues:

GPU at 500 MHz, bandwidth of 256 GB/s.

That should do; just use some logic to work out how many bits wide the eDRAM is.



megafenix said:
Come on, try. Here are some clues:

GPU at 500 MHz, bandwidth of 256 GB/s.

That should do; just use some logic to work out how many bits wide the eDRAM is.

You're really going to use the bandwidth between the ROPs and the eDRAM as a representation of the Xbox 360's system bandwidth? Wow, you really don't know anything about tech! BTW, don't assume that the bus width is the same for each embedded memory.



fatslob-:O said:
You're really going to use the bandwidth between the ROPs and the eDRAM as a representation of the Xbox 360's system bandwidth? Wow, you really don't know anything about tech! BTW, don't assume that the bus width is the same for each embedded memory.


Thought you were smarter than that; or maybe you are, but you just decided to play the fool when it's not convenient.

Very well. Here:

4096 bits * 500 MHz / (8 bits * 1000) = 256 GB/s

That's how you get the internal bandwidth of the Xbox 360's eDRAM, to which the ROPs have full access; when the ROPs finish the job, they pass the result to the GPU frame buffer through an external bus of 32 GB/s.

So clearly no port can work with the bandwidth you are claiming, and it's also very strange that, with a brand-new eDRAM, you fail to provide more bandwidth than what I can get with just 4 megabytes of the old GameCube embedded memory on the GPU.

So tell me, how did you get the Wii U eDRAM bandwidth? Don't forget it's embedded directly on the GPU: no external buses, please.

Let's see. eyeofcore suggested that you are claiming a 512-bit-wide eDRAM for the Wii U, right?

512 bits eDRAM * 550 MHz GPU / (8 bits * 1000) = 35.2 GB/s

Coincidence?

Yeah, right. Two strikes, dude; one more and you are out.
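For reference, both figures above drop out of the same width-times-clock arithmetic. A quick check in Python, using the posters' claimed widths and clocks rather than any confirmed spec:

```python
# Quick check of the bus-width arithmetic in this thread. The widths and
# clocks are the posters' claims, not confirmed hardware specs.

def bandwidth_gb_s(bus_width_bits: int, clock_mhz: float) -> float:
    # bits per cycle -> bytes per cycle, times cycles per second, in GB/s
    return bus_width_bits / 8 * clock_mhz * 1e6 / 1e9

print(bandwidth_gb_s(4096, 500))  # Xbox 360 eDRAM claim: 256.0 GB/s
print(bandwidth_gb_s(512, 550))   # alleged 512-bit Wii U bus: 35.2 GB/s
```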

Renesas doesn't support 512 bits; go to their website, dude:

Memory Size | Configuration (words x bits) | Random Access Freq. | DVCC (VDD)    | Tj
64 Mb       | 256 Kw x 256 b               | 133 MHz             | 1.1 V ± 0.1 V | 0°C to 105°C / -40°C to 125°C
8 Mb        | 32 Kw x 256 b                | 220 MHz             | 1.1 V ± 0.1 V | 0°C to 105°C / -40°C to 125°C
8 Mb        | 64 Kw x 128 b                | 220 MHz             | 1.1 V ± 0.1 V | 0°C to 105°C / -40°C to 125°C


 

Just to let you know, Mb are megabits, just in case you didn't know. So:

8-megabyte macro, 256 bits wide

1-megabyte macro, 256 bits wide

1-megabyte macro, 128 bits wide

Can you please do the honors?

The Wii U's eDRAM is 32 megabytes, so...?
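To make the challenge concrete, here is a sketch that builds 32 MB out of the macro configurations in the table above and reports the aggregate bus width and peak bandwidth at a 550 MHz GPU clock. Which combination, if any, the Wii U actually uses is speculation:

```python
# Build 32 MB out of the Renesas macros in the table above and see what
# aggregate bus width (and peak bandwidth at a 550 MHz GPU clock) falls out.
# Which combination, if any, the Wii U uses is speculation, not a spec.

MACROS = [
    ("64 Mb (8 MB), 256-bit", 8, 256),  # (label, size in MB, width in bits)
    ("8 Mb (1 MB), 256-bit", 1, 256),
    ("8 Mb (1 MB), 128-bit", 1, 128),
]

TARGET_MB = 32
GPU_CLOCK_MHZ = 550

for label, size_mb, width_bits in MACROS:
    count = TARGET_MB // size_mb
    total_bits = count * width_bits
    bw = total_bits / 8 * GPU_CLOCK_MHZ * 1e6 / 1e9
    print(f"{count:2d} x {label}: {total_bits} bits -> {bw:.1f} GB/s")

# 4 x 64 Mb macros           -> 1024 bits -> 70.4 GB/s
# 32 x 8 Mb (256-bit) macros -> 8192 bits -> 563.2 GB/s
# 32 x 8 Mb (128-bit) macros -> 4096 bits -> 281.6 GB/s
```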



megafenix said:

512 bits eDRAM * 550 MHz GPU / (8 bits * 1000) = 35.2 GB/s

Coincidence?

...it's also very strange that, with a brand-new eDRAM, you fail to provide more bandwidth than what I can get with just 4 megabytes of the old GameCube embedded memory on the GPU.

The Wii U's eDRAM is 32 megabytes, so...?

Again, when did I ever claim the bandwidth to be 35.2 GB/s? You clearly made a strawman argument there.

The GameCube had 3MB of embedded RAM! My god, how can an apologist like yourself not know the hardware of the company you're so obviously biased toward?

BTW, the Wii U's reported embedded RAM bandwidth is 70.4 GB/s, which gives it a bus width of 1024 bits.

The one with strikes here is you, clown, because you're clearly some stranger who hasn't proven his own grounds yet on this site.

How about solving the volume integral of 2x + 2y where the boundaries for both x and y are -1 to 1? (This is something any engineer could do, prospective ones included! You gonna cop out on this one or what?)

You sound like some fraud getting overly defensive about a PC master race's evaluation of your beloved Wii U.
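For what it's worth, the integral posed above has a short closed-form answer; the integrand is odd under (x, y) -> (-x, -y), so it vanishes:

```latex
\int_{-1}^{1}\!\int_{-1}^{1} (2x + 2y)\,dx\,dy
  = \int_{-1}^{1} \Big[ x^{2} + 2xy \Big]_{x=-1}^{x=1} dy
  = \int_{-1}^{1} 4y\,dy
  = \Big[ 2y^{2} \Big]_{-1}^{1}
  = 0.
```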



fatslob-:O said:
BTW, the Wii U's reported embedded RAM bandwidth is 70.4 GB/s, which gives it a bus width of 1024 bits.


And you say you are not a troll, claiming 1024 bits when the old Xbox had 4096 bits?

70GB/s, yeah right.

Renesas said that the Wii U uses their latest technology.

From 1024, 4096 and 8192, which is latest to you, dude?

They even say it is so advanced that it would be difficult to produce elsewhere. So do you think that, after 7 years, nobody besides Renesas (NEC now forms part of them) can produce an eDRAM of 4096 bits?

Shin'en also says that the Wii U's eDRAM bandwidth is huge.

Come on, dude: even by today's PC standards, like DDR3's 50GB/s, 70GB/s falls short considering we are talking about eDRAM embedded directly in the GPU die, not main RAM like the DDR3.

Also, Shin'en commented that with just 7 megabytes of the Wii U's eDRAM you can do 720p with double buffering, while on the Xbox you need the full 10 megas for that, and even Microsoft admits it. (Doesn't this kind of suggest that you get the same bandwidth with 7 megabytes of eDRAM as you would with the Xbox's 10 megabytes? Remember that the ROPs' actual 256GB/s bandwidth is later limited by a 32GB/s channel, and the Wii U GPU doesn't have this bottleneck, so it sounds about right.)

So no, anything below 4096 bits is bullshit and trolling.

Obviously the right choice, and the most logical, is 8192 bits, giving something like 563GB/s.



fatslob-:O said:
Again, when did I ever claim the bandwidth to be 35.2 GB/s? You clearly made a strawman argument there.

BTW, the Wii U's reported embedded RAM bandwidth is 70.4 GB/s, which gives it a bus width of 1024 bits.

@bold You take Fourth Storm's speculation/theory as a fact? It has not been confirmed, you fraud!

@underlined Your ignorance is remarkable: he did not say that you claimed 35.2GB/s, and I haven't read anything like that. He was explaining the Xbox 360's eDRAM and the bandwidth of the ROPs that are embedded with the eDRAM, which is not embedded into the GPU and thus has to go through a narrow bus. He is explaining the difference between the Wii U and the Xbox 360; he also assumes that you think the Wii U's bandwidth is low, since you said it suffers from low bandwidth, yet he and I can disprove that anytime.

Most ports, if not all, from major 3rd-party developers are cheap ports of the Xbox 360 builds of their games, so those games use just 10MB of eDRAM. If the full 32MB of eDRAM gives 70.4GB/s, the 10MB actually used would mean about 22GB/s, compared to the Xbox 360's external bus of 32GB/s, so it would be seriously bottlenecked; games would not run, and ports like Need for Speed: Most Wanted U would not run, let alone be better at all.

Renesas has 1024 bits available, yet using that would cause trouble and would be a waste, given that Nintendo was aiming at 1080p for their own games and thinking of the future. 4096 is the next option available and would mean 281.6GB/s for all 32MB of eDRAM, so the 10MB used by an Xbox 360 port would get about 88GB/s. Yet there is another problem: the eDRAM in the Xbox 360 functions considerably differently than on the Wii U, since it is not embedded.
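In code, the proportional-scaling premise in the paragraph above amounts to this; the proportionality is the poster's assumption, not a measured behaviour:

```python
# The proportional-scaling premise: a cheap Xbox 360 port touching only 10 MB
# of the 32 MB pool drives only that fraction of the macros' buses. This is
# the poster's assumption, not a measured behaviour.

def effective_bw_gb_s(full_pool_bw: float, used_mb: float, pool_mb: float = 32.0) -> float:
    return full_pool_bw * used_mb / pool_mb

print(effective_bw_gb_s(70.4, 10))   # 22.0 GB/s if the full pool peaks at 70.4 GB/s
print(effective_bw_gb_s(281.6, 10))  # 88.0 GB/s if the full pool peaks at 281.6 GB/s
```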

There are countless differences between the Wii U and the Xbox 360; for example, the Xbox 360 has 2 threads per core while the Wii U has just 1 thread per core.

Since most 3rd-party games on the Wii U are cheap ports, those teams only try to achieve acceptable/bearable performance rather than rewrite the game engine in any major way; plus, those port teams are the company's less experienced employees.

Look at Assassin's Creed 3 on the Wii U and compare it to Assassin's Creed 4 on the Wii U: the difference is noticeable, and in some areas huge!

(Assassin's Creed 3 Wii U)

(Assassin's Creed 4 Wii U)

Both of these games are cheap ports, yet AC4 looks better than AC3 because they rewrote some engine code and optimized AC4 better for the Wii U.



megafenix said:

Come on, dude: even by today's PC standards, like DDR3's 50GB/s, 70GB/s falls short considering we are talking about eDRAM embedded directly in the GPU die, not main RAM like the DDR3.

Also, Shin'en commented that with just 7 megabytes of the Wii U's eDRAM you can do 720p with double buffering, while on the Xbox you need the full 10 megas for that, and even Microsoft admits it. (Doesn't this kind of suggest that you get the same bandwidth with 7 megabytes of eDRAM as you would with the Xbox's 10 megabytes? Remember that the ROPs' actual 256GB/s bandwidth is later limited by a 32GB/s channel, and the Wii U GPU doesn't have this bottleneck, so it sounds about right.)


I'll chime in here.
If you run quad-channel DDR3 at 3,000MHz you can get a bandwidth of 96GB/s on the PC.
DDR4 is also coming.
Bandwidth = DDR clock rate x bits per clock / 8.
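Plugging the formula into the quad-channel example, assuming standard 64-bit DDR3 channels:

```python
# Pemalite's formula applied to the quad-channel example. Desktop DDR3
# channels are 64 bits wide; "3,000 MHz" DDR3 performs 3000 mega-transfers
# per second (DDR moves data on both clock edges).

channels = 4
transfers_per_second = 3000e6
bits_per_transfer = 64

bandwidth_gb_s = channels * transfers_per_second * bits_per_transfer / 8 / 1e9
print(bandwidth_gb_s)  # 96.0 GB/s, matching the figure above
```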

However, from my own tests (I have a quad-channel DDR3 set-up), memory bandwidth isn't actually that important; games see a negligible difference whether you have 12GB/s of memory bandwidth or 90GB/s.
Remember, the PC in general doesn't share its system memory bandwidth with a graphics processor, so far fewer memory bandwidth demands are placed on it.
The general consensus on the PC is that more RAM is preferred over faster RAM, because if you run out of RAM you take a much larger performance penalty. (IGPs notwithstanding.)
Because of the PC's constantly evolving nature, graphics processors long ago out-stripped the bandwidth the rest of the system could provide, not just memory bandwidth but interconnect bandwidth too; hence GPUs come with dedicated high-speed GDDR5 memory, which will soon be supplanted by GDDR6 memory. Consoles can get away with a shared memory/eDRAM set-up because, well, they aren't trying to push 20 teraflops of single-precision floating point and run at resolutions like 1440p, 1600p, 2K and 4K with 8-16x anti-aliasing and all the other bells and whistles.

Then we fast-forward to APUs and integrated graphics.
Intel went with an eDRAM approach for Iris Pro, but they included 128MB of the stuff; now, regardless of the bandwidth, much of it goes unused because the rest of the GPU can't make use of it all. However, in Intel's case the CPU treats it as an L4 cache, so it's not entirely going to waste.
And even if you have 1024GB/s of memory bandwidth, if your GPU is sub-par it's going to be a waste.

As for the 4k-8k textures, it's not entirely impossible. The Wii U has a relatively modern GPU architecture, complete with a modern implementation of texture compression (3Dc), so it wouldn't be surprising to see 8:1 or even 16:1 levels of texture compression, which saves a massive amount of memory and bandwidth.
The downside to using lots of compression is that you create a lot of artifacts in the textures; however, that's really not an issue for a game such as this, because you aren't going to get down close-and-personal with the textures to see them. (Not to mention how fast your movement is.)
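To put rough numbers on the savings, an illustrative sketch; the texture size and format here are assumptions, not figures from the game:

```python
# Illustrative texture-memory math; the size and format are assumptions,
# not confirmed figures for FAST Racing NEO.

def texture_mib(width_px: int, height_px: int, bytes_per_pixel: float) -> float:
    return width_px * height_px * bytes_per_pixel / 2**20

raw = texture_mib(4096, 4096, 4)  # uncompressed RGBA8 4k texture
print(raw)        # 64.0 MiB
print(raw / 8)    # 8.0 MiB at 8:1 compression
print(raw / 16)   # 4.0 MiB at 16:1 compression
```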

Basically it's a concession, but it should result in an overall larger improvement in image quality in the end; it's just not something all games can or will use.



--::{PC Gaming Master Race}::--

fatslob-:O said:

 

eyeofcore said:

@Bold Just how much pointless shit are you going to post just to respond to me?

Why is it "pointless shit"? It is, above all, on topic and directly related to your statements, and it involved bandwidth, a crucial difference between the Wii U and the Xbox 360; it is aimed at your post/reply, so I assume you are avoiding the part that you call "pointless shit".

I don't give a rat's ass about the difference between the Wii U and the X360! You didn't respond to the question I was asking with the pointless shit you gave me.

The problem is that this "pointless shit" is only pointless shit to you, and you don't care about the difference between the Wii U and the Xbox 360 enough to get the picture; yet I did respond to your question, while you are avoiding mine like the plague!

eyeofcore said:

"Secret sauce"? Really, dude? I guess you don't really know how semiconductor foundries work then, eh? Renesas were the ones DESIGNING the eDRAM, and the one who FABBED it was TSMC. If Renesas were to go under, it would mean shit to Nintendo, because they can just license the IP from them to continue manufacturing the Wii U.

I was quoting ("") the article; tell me something that I don't know... I love how you presume that thing about me.

Quote from the article:
"As part of that restructuring Renesas, that merged with NEC Electronics in 2012, announced that it decided to close four semiconductor plants in Japan within 2-3 years, including the state-of-the-art factory based in Tsuruoka, Yamagata Prefecture (as reported by the Wall Street Journal), and this may spell trouble for Nintendo and the Wii U.

The reason is quite simple. The closing factory was responsible for manufacturing the console's Embedded DRAM, that is quite properly defined the "life stone" of the console."

 

What do you understand by manufacturing?

I'll tell you something: you gave even more pointless shit to the discussion at hand and still haven't answered my question! I'll say it once again and no more: RENESAS DOES NOT MANUFACTURE THE eDRAM ITSELF; THAT IS SUPPOSED TO BE TSMC's JOB.

You told me pure BS. Wow... you accuse me of not answering the question, yet I answered it; your ignorance is shocking! Renesas does produce the eDRAM in their own plants, like NEC did for the Xbox 360 in theirs (and NEC was acquired by Renesas). The reason the Xbox 360 did not have its eDRAM embedded into the GPU is that it was produced at a separate factory, while for the Wii U's GPU the eDRAM is embedded into the GPU, and that is possible if the GPU and eDRAM are manufactured at the same plant!

(picture of Wii U GPU from Chipworks)

Quote from the ARTICLE:
"Nintendo could try to contract another company to produce the component, but there are circumstances that make it difficult. According to a Renesas executive the production of that semiconductor was the result of the "secret sauce" and state-of-the-art know-how part of the NEC heritage of the Tsuruoka plant, making production elsewhere difficult. In order to restart mass production in a different factory redesigning the component may be necessary." [CONCLUSION: It is not produced at TSMC, further confirmed by the Wii U GPU die]

Just because AMD's GPUs were produced at TSMC does not mean that they cannot be produced elsewhere, and it has been confirmed that AMD can produce their GPUs elsewhere, like at GlobalFoundries with AMD's APUs that contain a GPU portion!

eyeofcore said:

Duh. The Wii U is SUPPOSED to be easier to develop for because it has better programmability. (Again, don't give me pointless shit in your post, because I have followed a lot of tech sites while doing research and I'm not clueless about these subjects.)

Another thing that I already know... :/

Of course it's easier, but if it's a port, what can you expect?

Even the PlayStation 4 build of Assassin's Creed Black Flag doesn't look that impressive, despite how powerful the PlayStation 4 is and the fact that it is even easier to develop for, since it has no eDRAM, just main RAM! So no matter how lazy/sloppy the developers are, that should ease things a lot, yet they fail to deliver an impressive port with all of those advantages!

AGAIN YOU CONTINUE TO NOT ANSWER MY QUESTION AT HAND. YOU GAVE ME ANOTHER POINTLESS RESPONSE. DO YOU HAVE AN ISSUE WITH ENGLISH?!

Yet again, I did answer your freaking question, and then I added the "pointless" part, which is not pointless but connected to what you were saying. You have an issue, bro, and I am afraid I can't help you; your ignorance is out of my league to cure.

You can't comprehend that people around the world have a different native language than yours; when English is their second language, not their first, they will most likely not speak and write it perfectly, and even you stumble sometimes, so shut up. Nobody is perfect.

Deal with it... for once in your life!

eyeofcore said:

What about the Rayman Legends creator saying that the Wii U is "powerful"? The word "power" is a vague term once again.

The word "power" is not a vague term; nice try. What about him, you ask? I already pointed out that he is a graphic artist and a programmer, not just a game designer: he works with the hardware and he codes for it! He says that the Wii U's hardware is powerful because he worked on it and programmed for it; he said the game ran with uncompressed textures on the Wii U, which takes more RAM and also more resources, because more data is processed.

Again, the word "power" is undefined. What are we talking about specifically? Is he talking about the fillrates, the shaders, or its programmability? No context was given, and thus the meaning is lost. What does this have to do with my initial comment at hand?!

Definition of "power": the rate of doing work, measured in watts or, less frequently, horsepower; also, the capacity or performance of an engine or other device.
Definition of "powerful": having great power or strength.

@underlined You made a cowardly move, my friend; that does not invalidate his statement. He is under NDA, so he can't talk about the fillrates, shaders or programmability; of course, being under NDA, he cannot share that. Your counterargument just got denied and is labelled invalid! At least Shin'en was able to share some things:

"Getting into Wii U was much easier then 3DS. Plenty of resources, much faster dev environment."

"Wii U GPU and its API are straightforward. Has plenty of high bandwidth memory. Really easy."

"You need to take advantage of the large shared memory of the Wii U, the huge and very fast EDRAM section and the big CPU caches in the cores."

[Yes, the website's car is directly from the game. / Well, we are simply too small to build HQ assets just for promos.]

@italic&underlined It has to do with your assumption, which is incorrect, yet you will still continue to force it...

eyeofcore said:

Wow, you're really hurt about another man over the internet, aren't you? You're afraid of somebody disproving you?! *Facepalm*

You are just forcing a premise that is not correct, and why would I be hurt over a person that I don't know? It is inlogical, and you are trying to provoke me, since you are a troll and force a rumor as a fact, thus damaging your own credibility. If I were afraid, I would not be responding to you in the first place, so you should be facepalming at yourself, not at me, for asking these two questions.

Then why don't you disprove the rumor at hand?! There's more evidence for it than against it. "inlogical"?! LMAO, you really have terrible English, don't you? It's "illogical"!

Why would I, since the person that created the thread said you should treat it as a rumor? The guy said it is a rumor and mere speculation, so deal with it; claiming that there is more evidence for it than against it is really a fallacy that became the "truth" thanks to people who keep fooling and lying to themselves.

First of all, the shaders or ALUs would have to be 80 to 90% larger for no apparent reason, and no reason is given why they would be 80 to 90% larger; plus there is about 90mm^2 of die space available/dedicated exclusively to the GPU. The Wii U's die size is 146.48mm^2 according to Chipworks, or 156mm^2 with the protective shell according to Anandtech. Chipworks confirmed it is made on 40nm lithography.

A Radeon HD 6570M (VLIW5) is 104mm^2 with its protective shell, roughly 96mm^2 without it, and its specifications are:
480 SPUs / 24 TMUs / 16 ROPs / 6 CUs, with a clock of 600MHz and a TDP of 30 watts.

Lowering the clock by about 8% to 550MHz would lower the TDP to around 25 watts. Now you will try to deny that it can fit into 90mm^2 and claim that it consumes far too much for the Wii U. I can counter that easily: as you yourself said, fatslob-:O, they can shrink/increase density by 10% by optimizing the design, so it would be 86.4mm^2 and would shave off another 10% of TDP, putting it at about 22.5 watts! Considering that the Wii U's GPU, codenamed Latte, was customized by Nintendo and AMD, there would be further improvements to the design; and since the Chipworks guy said he cannot identify the series or exact model of this GPU, that shows Latte is indeed a heavily customized GPU.

Now you will still argue that it is not possible because the Wii U consumes 30 to 35 watts maximum, yet I can counter that simply: the Wii U has a Class A power supply with 90% efficiency, rated at 75 watts, so it can deliver 67.5 watts at maximum. Since the Wii U consumes 30-35 watts, we can assume that Nintendo locked some resources in the Wii U's SDK/firmware. You will try to counter that, yet the PlayStation 3's SDK/firmware was updated to free up an additional 70MB of OS memory for games; a 3DS update allowed more CPU cycles for games; and the Xbox 360's SDK was updated to use less RAM, CPU and GPU resources, with one core no longer exclusive to audio as it was in the beginning. Shin'en said this in an interview:

"And for the future, don’t forget that in many consoles, early in their life cycle, not all resources were already useable or accessible via the SDK (Software Development Kit). So we are sure the best is yet to come for Wii U."

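Spelled out, that back-of-envelope looks like this; every input is a thread assumption rather than a measured Wii U spec, and note that strictly linear clock-to-power scaling gives about 27.5 W rather than the 25 W claimed above (which would additionally need a voltage drop):

```python
# eyeofcore's back-of-envelope, spelled out. Every input here is a thread
# assumption (Radeon HD 6570M-class baseline), not a measured Wii U spec.

base_die_mm2 = 96.0    # 6570M die with the shell removed (thread's estimate)
base_tdp_w = 30.0
base_clock_mhz = 600.0

clock_scale = 550.0 / base_clock_mhz     # ~0.917: an ~8% clock reduction
tdp_at_550 = base_tdp_w * clock_scale    # ~27.5 W if power scales linearly

density_gain = 0.10                      # assumed 10% from design optimization
die_opt = base_die_mm2 * (1 - density_gain)   # ~86.4 mm^2
tdp_opt = tdp_at_550 * (1 - density_gain)     # ~24.7 W

print(f"{die_opt:.1f} mm^2, {tdp_opt:.1f} W")
```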
eyeofcore said:

Who the hell cares what Renesas says?! I want to see the end result!

So nobody cares about a statement from the company that designed and produces the eDRAM for the Wii U? It was said by a Renesas executive, not by some guy on a forum who may or may not be correct; this is straight from Renesas itself. When DualShockers wrote "secret sauce", that means it is most likely a custom design, and "state-of-the-art know-how" literally hints that the best of the best is being applied to the Wii U. If you want results, then wait; like chemistry, the result is not instant, nor is designing a weapon such as a mortar. It needs time to shine, as on the PlayStation 3, the Xbox 360, or the Source engine.

You cannot get the result right from the start... :P

By your logic we would have seen everything from the start in every console/handheld/mobile/technology generation... Your logic. :/

I like how you ignore the PC. Are you going to answer the comment directly, or will you continue posting more pointless crap for me to read?

I answered you directly; it seems you cannot comprehend that I answered your question directly... I am sorry for ignoring the PC, did I hurt your feelings or something? When it comes to devices, the PC is configurable, while consoles, handhelds and mobile phones are basically a bunch of SKUs: you cannot upgrade their CPU, GPU or RAM, except for the N64, which had the Expansion Pak that doubled its amount of RAM.


eyeofcore said:

Who gives a damn about Shin'en?! Are they really that important?!

The same could be said of Crytek, Rare, Sony Santa Monica, Guerrilla Games, Valve, id Software, Epic Games, etc., when they were small and not really relevant. They really are that important, because they use the hardware and try to get the most out of it, and you forget that these guys came from the demo scene, with different principles than modern-day programmers.

Look at FAST Racing League and Jett Rocket, which people would have taken for Xbox 360 games rather than Wii games; Shin'en did things on the Wii that others did not, including ambient occlusion in their games. Don't doubt these guys who came from the PC demo scene.

More pointless crap for not answering my initial concerns at hand, eh?

Hey, I can't blame you. For you it is pointless crap, while for others it is an answer; and I was discussing your logic, so by your own logic you actually think that your opinion is pointless crap. I am not trolling you; I am just stating the opinion/behaviour that you have shown.

eyeofcore said:

More people?! Wow, man, you really can't defend your own grounds.

DID I CLAIM THAT THE eDRAM ONLY HAD 35GB/s BANDWIDTH? (Again, your lack of English understanding makes everyone confused.)

Seriously?

Sorry for the misunderstanding, then. Would you be so kind as to tell me how much you think it has?
Can you also, at the very least, provide some kind of proof, like a formula to calculate it?

And please, don't forget I can get about 10GB/s of bandwidth with just 1 MB of the old embedded memory on the GameCube's GPU!

Please answer the last two questions involving bandwidth!

It depends on where the memory needs to flow, and BTW, it's 3MB of eDRAM. Kinda sad how you don't know the hardware of a certain console from your favourite console manufacturer.

Agreed, and I know that, except we have two separate pools of eDRAM, not one, if that is what you are suggesting. It is actually sad that you assume a certain company is my favorite console manufacturer: the only console I ever owned was the original PlayStation, which I got in 1999, and I have been on the PC/x86 platform since 2004. I admit I had brief contact with the PlayStation 2 and PlayStation 3 at a friend's house, and that's about it. I've never even seen a Nintendo platform in person, so I have not touched, let alone played, games on Nintendo's platforms. I don't own any Nintendo platform, nor their games; your point and provocation are without any foundation, so move along. Your desperation and ignorance stink up this place.

The memory flow is not constant, and what's more, the constraints aren't clear for every different workload.

Everything has variables... Tell me something I did not already know.


Your ignorance should be studied; you would be an amazing individual to subject to some psychological experiments...

You are both highly and lowly intelligent at the same time, like a CPU with a small cache: even an idiot can have a moment of pure genius!



On purpose? Who reported the Wii U eDRAM as having 70GB/s? Let me guess, begassin, right?

Dude, for 70GB/s you need an eDRAM with a total of 2048 bits, and just by looking at the macros you can clearly see that you only have 1024, 4096 and 8192 bits.

So how the ... do you get that bandwidth if you don't have the type of eDRAM needed for it?

Third strike.

You can't make assumptions by using ports as examples. If you could, I could also go around forums saying PS4 power sucks because AC4 doesn't look at all impressive compared to the current consoles, even though the PS4 is easier to develop for thanks to its PC architecture similarities and packs lots of main RAM gigabytes and bandwidth.

With the Wii U, ports are more difficult because programmers have to be more clever about how to use the eDRAM properly; it's easier to use the main RAM, and daddy Sony has confirmed this.

Not to mention that, as eyeofcore says, when you port from the 360 to the Wii U, the source code tells the Wii U to use just 1 mega of cache and 10 megas of eDRAM, and to waste a CPU core on sound instead of using the DSP, since the Xbox uses one of its cores for that.

Everything the developers didn't think to put into the Wii U's additional cache and eDRAM goes to the main RAM, since that's how it was conceived in the Xbox game. To solve it, developers would have to rework the source code, reallocate resources, etc., but in most cases they won't even bother and will try to get around the problem with the extra RAM.

 



megafenix said:


And you say you are not a troll, claiming 1024 bits when the old Xbox had 4096 bits?

70GB/s, yeah right.

Renesas said that the Wii U uses their latest technology.

From 1024, 4096 and 8192, which is latest to you, dude?

Shin'en also says that the Wii U's eDRAM bandwidth is huge.

Come on, dude: even by today's standards, like DDR3's 50GB/s, 70GB/s falls short considering we are talking about eDRAM embedded directly in the GPU die.

Also, Shin'en commented that with just 7 megabytes of the Wii U's eDRAM you can do 720p with double buffering, while on the Xbox you need the full 10 megas for that, and even Microsoft admits it.

So no, anything below 4096 bits is bullshit and trolling.

Obviously the right choice, and the most logical, is 8192 bits, giving something like 563GB/s.

My god, you need some tampons, man, and I really mean it! You're seriously that hurt over another master race's evaluation of the Wii U?!

It sounds about right, because after all the X1 is using 32MB of integrated SRAM, which is significantly faster than integrated DRAM. You have a problem with 70 GB/s? Then prove me wrong. Why would the Wii U's eDRAM have MORE bandwidth than Intel's eDRAM for Iris Pro 5200?! It doesn't sound right to me for a bleeding-edge manufacturer like Intel to have lower bandwidth than what the Wii U features at its relatively small die size. If anything, it sounds like the eDRAM's bandwidth may actually be 35 GB/s.

@Bold When did they say that? Provide me a source. They did not disclose the numbers, just so you know; the word "latest" means newest, BTW, but it doesn't mean more "powerful". We all know the GT 640 is the newer card, yet it's weaker than the older HD 7770, so your point about "latest" is moot.

That 50 GB/s figure only applies to multichannel set-ups.

Who cares about Shin'en!

The one bullshitting here is you, obviously, because none of these things add up.

I'm straight up done talking to an apologist like you. Go and take this up with Pemalite and see if he'll say anything different from me!