megafenix said:


Pathetic, so you like to mess things up so only you can understand yourself, and then say that others who don't get what you say are fools?

Yeah, right.

I was being sarcastic lol

Sandy Bridge and your blah blah blah, just to go off on a tangent

Just look at the GameCube, 512 bits dude, so why after more than a decade do you suggest only double that?

Yeah, I brought up the Xbox, so what?

Is the Wii U eDRAM on a separate die like the Xbox GPU's was?

No dude, it's on the same die as the GPU, just like the GameCube's Flipper, except this time we have lots of megabytes and a newer eDRAM design

 

Don't joke around dude, that's completely illogical

And again, Renesas says it's their best eDRAM, which is 8192 bits

Shin'en says there's lots of bandwidth on the eDRAM

1024 bits will give you just 35 GB/s since the clock speed shared with the GPU is 550 MHz; obviously that's to have coherency, just like the GameCube did with its embedded memory

You can't get 70 GB/s when you don't have the 2048-bit option, and even that is very little

The choice is 8192 bits, 563 GB/s

the gpu is able to handle that


Instead of resorting to personal attacks because you don't understand something... try to be a little more tactful in your replies; it will get you further in life in general.
Nothing I have said should be construed as nasty or rude.

And do keep in mind that internal bandwidth isn't the same thing as what the eDRAM's clock rate and external bus width give you; they're treated as separate entities, something you seem to be getting confused about.

Here, allow me to offer a simple explanation.
Let's say the Xbox 360's eDRAM runs at 500 MHz, and the GPU runs at 500 MHz.
There is still only going to be 32 GB/s of interconnect bandwidth between them.

However, the eDRAM has an internal bus which shuffles data around to all the separate pieces of logic inside the eDRAM at 256 GB/s; the GPU doesn't get 256 GB/s of bandwidth, far from it.
The Xbox 360's internal eDRAM bus operates at a 2 GHz frequency on a 1024-bit bus, despite the eDRAM's actual frequency ending up at something like 500 MHz.
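
If it helps, here's that arithmetic spelled out as a quick Python sketch. The helper function and the 512-bit interconnect width are just my own illustration (single data rate assumed); the 1024-bit / 2 GHz internal figures are the ones above:

```python
# Peak bandwidth of a simple parallel bus:
# bytes/s = (bus width in bits / 8) * clock * transfers per clock
def bandwidth_gbs(bus_bits, clock_mhz, transfers_per_clock=1):
    """Peak bandwidth in GB/s, assuming a single data rate unless stated."""
    return bus_bits / 8 * clock_mhz * 1e6 * transfers_per_clock / 1e9

# Xbox 360 eDRAM internal bus: 1024 bits at 2 GHz -> 256.0 GB/s
print(bandwidth_gbs(1024, 2000))

# A 32 GB/s GPU<->eDRAM interconnect at 500 MHz only needs a 512-bit-wide link
print(bandwidth_gbs(512, 500))   # -> 32.0 GB/s
```

Same clock on both sides doesn't mean the GPU sees the internal number; the width of the link between them is what matters.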

Keep in mind, different transistors operate at different frequencies, and they all have different power and leakage characteristics; thus a processor such as a CPU may have different parts of the chip running at different speeds, and the same goes for eDRAM.
The only difference is that Microsoft decided to advertise the 256 GB/s number, and people thought that was its actual bandwidth. (Unfortunately, it's far more complex than that.)

Also, another thing to keep in mind is that the Wii U GPU doesn't exactly have a massive performance advantage over the Xbox 360's GPU, so its bandwidth requirements are not going to be stupidly massive either, thanks to improvements in bandwidth conservation such as texture compression and culling. But that's just simple logic.

However, where the Wii U really has an edge is in efficiency: the Wii U can do far more work, with extra effects, per gigaflop than the HD twins can. That's mainly due to the advancements in efficiency made in the PC space, which the Wii U will benefit greatly from, even if it is VLIW5.
It's also part of the reason why developers have a hard time extracting better performance out of ports: the GPU architecture has changed substantially, not to mention the eDRAM.

One thing's for sure, the Wii U isn't a powerhouse; its performance is only "good enough".
However, Nintendo in general has very good art styles to hide such deficiencies. From a purely graphical perspective nothing has "wowed me" since the GameCube days, but the art, on the other hand, has often blown me away.

Ryudo said:


Bits were always a marketing term in the past, up to the Dreamcast. It's not an actual measurement when it comes to game consoles. So bits haven't really mattered since the GBA and have always been worthless. It's not an OS.

You're correct to an extent.
Back in the SNES era, the "bits" were used as a way to work out colour depth or CPU register sizes, etc.
For example, the Nintendo 64 had a 64-bit processor.

However, if you fast forward to today, its use has shifted from that.
Bits and clock rate can be used to gauge the bandwidth of particular buses, but you can't just go by the bit width or the clock rate separately to gauge such things; they have to be used in conjunction.

For example, DDR3 on a 512-bit memory bus running at 800 MHz is equal in bandwidth to GDDR5 on a 256-bit memory bus also running at 800 MHz.
If both use the same 512-bit bus, the GDDR5 will win thanks to its quad data rate.
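
To put rough numbers on that, here's the same peak-bandwidth arithmetic as a small Python sketch. The function is just illustrative; it assumes DDR3 transfers data twice per clock and GDDR5 effectively four times per clock:

```python
def bandwidth_gbs(bus_bits, clock_mhz, transfers_per_clock):
    # Peak GB/s = (bus width in bytes) * clock * transfers per clock
    return bus_bits / 8 * clock_mhz * 1e6 * transfers_per_clock / 1e9

print(bandwidth_gbs(512, 800, 2))  # DDR3,  512-bit bus -> 102.4 GB/s
print(bandwidth_gbs(256, 800, 4))  # GDDR5, 256-bit bus -> 102.4 GB/s
print(bandwidth_gbs(512, 800, 4))  # GDDR5, 512-bit bus -> 204.8 GB/s
```

Same formula, just a different number of transfers per clock, which is why the narrower GDDR5 bus keeps pace.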

Essentially... the larger the bus width to system memory, the larger the bandwidth (if everything else is kept equal) and the more performance you will obtain, hence why it's important.

Wyrdness said:


Oh yes they did, and they did it quite brutally. He asked you a question which you completely tried to dodge. Hynad made the point that even with lower specs, consoles run games at a level that PCs need significantly higher specs for, and all in all he's right; his point wasn't really disproved, so learn to read. Point to where I mentioned eyecore, or is that another reading problem you have? If it is, I'm disappointed in the village elder's lack of obligation to ensure his follower can read. As well as the village elder, you have a man crush on this eyecore, huh?

I can give you plenty of examples of PCs with lesser hardware running the same game as a console and still looking better.
Oblivion? Call of Duty 4? BioShock? Unreal Tournament? All can be run on Radeon 9500 to X850 XT class hardware, while the Xbox 360 has a Radeon X1800/X2900 hybrid class GPU.



--::{PC Gaming Master Race}::--