
Forums - Nintendo Discussion - FAST Racing NEO powered by 2nd generation engine for Wii U supports and uses 4k-8k textures

fatslob-:O said:
megafenix said:
fatslob-:O said:

It doesn't matter whether they're bought out or not, they obviously have some agenda.

History tells me that you didn't know how to do math LOL. I wonder how hard figuring out the pixel output of the Wii U was? The ones who are uneducated will continue to fear, and I obviously abide by the rules.


Quite the argument from the guy who didn't know how to calculate bandwidth and had to be schooled.

Quite the argument from someone who jumps to conclusions, like the Wii U eDRAM bandwidth, without any basis to back them up.

Quite the argument from someone who needs the same schooling over and over from Shin'en's same comments.

Quite the argument from someone who doesn't want to admit defeat when he has clearly been beaten.

Quite the argument from someone with a clear agenda to troll the Wii U hardware, who can't even do a better job than what Reggie does in his work.

Wow, I really pity you then. You're obsessing over another man on the internet so much that I got you to hate me LOL. Damn, you must have gotten your mind bent so much by me and Pemalite LMAO.

Still haven't answered my math question yet?


hahaha

lol, say what you want

You may not be the dumbest person in the world, but you're earning some merits...!!!

Here's your shitty Iris Pro:

http://www.engineering.com/ElectronicsDesign/ElectronicsDesignArticles/ArticleID/5838/Intel-Launches-Next-Generation-of-Microprocessors.aspx

"

The Iris Pro is a system-in-package (SiP) or multi-chip module (MCM) design with the micro/graphics processor die placed next to the eDRAM chip on the pin grid array package substrate. Although manufactured as a separate integrated circuit, the eDRAM gets that moniker because the DRAM cell is built using the low power SoC version of Intel's 22 nm TriGate transistor. It is a DRAM cell with one access transistor and one storage capacitor. Although no specific material details were forthcoming, the capacitor is a typical M-I-M crown device using a high-K dielectric located above the bitlines.

 

"

 

That looks like the Xbox 360 eDRAM and GPU, doesn't it?

 

Does it really look like this to you?

 

 

 

I will answer your question once you answer mine; pretty much all this time I have been the one answering questions for you.

Tell me, how much bandwidth do you calculate for the Wii U eDRAM, and please show the formula?

~ Mod edit ~

This user was moderated by TruckOSaurus for this post.




Why are you guys even bothering to reply to megafenix? It's obvious he has no idea what he's talking about; it's all fanboy speculation that's impossible.



fatslob-:O said:
dahuman said:

LOL WHAT? They are both important. You can't think of it as the same thing as a PC to start with; if you look at how the circuitry is laid out on the Wii U, you'd know that all the memory is very close together, so it's good for fast fetching, and there's plenty for it to work with in the 32MB eDRAM once compression is involved.

What's even more important is the main memory bandwidth in the Wii U's case. What's the problem with thinking of it like a PC? All next-gen consoles are literally dumbed-down PCs, much like the PS360 were. Fast fetching is nice and all, but what if I want to access a massive amount of textures? (Something that is fairly limited by the main memory bandwidth.) Compression can only take you so far.


I'm saying they are made differently, not that they function differently. A PC has higher latency no matter what, and we offset that with high bandwidth; consoles are generally much faster at data fetching because of the way they are built (I can think of an exception or two, but those were made pretty shitty). Texture compression will take you very far, especially at the hardware level and with a solution similar to MegaTextures or DX11.2's software/hardware tiled resources. With all the RAM being so close together, if properly utilized, you will cap out the Wii U's GPU capabilities before running into memory speed problems, assuming the Wii U GPU is where people think it is right now. Program code doesn't take much bandwidth or speed to run from main RAM, so that bandwidth wouldn't bottleneck execution speed at all.
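
To put rough numbers on that compression point, here is a minimal back-of-the-envelope sketch in Python. The texture size, the BC1/BC3 compression ratios and the 10% residency figure are generic illustrations assumed for the example, not numbers from Shin'en or Nintendo:

    # Back-of-the-envelope texture footprint sketch (illustrative numbers only).
    MB = 1024 * 1024

    def texture_bytes(width, height, bits_per_pixel):
        """Size of a single mip level at the given bits per pixel."""
        return width * height * bits_per_pixel // 8

    # A 4096x4096 "4k" texture:
    uncompressed = texture_bytes(4096, 4096, 32)   # RGBA8: 32 bits per pixel
    bc1          = texture_bytes(4096, 4096, 4)    # BC1/DXT1: 4 bits per pixel
    bc3          = texture_bytes(4096, 4096, 8)    # BC3/DXT5: 8 bits per pixel

    print(f"RGBA8: {uncompressed / MB:.1f} MB")    # 64.0 MB
    print(f"BC1:   {bc1 / MB:.1f} MB")             # 8.0 MB
    print(f"BC3:   {bc3 / MB:.1f} MB")             # 16.0 MB

    # With a virtual-texturing / tiled-resources style scheme, only the tiles
    # visible this frame need to be resident. Assume 128x128 BC1 tiles and that
    # roughly 10% of the texture is actually sampled (pure illustration):
    tile = texture_bytes(128, 128, 4)
    tiles_total = (4096 // 128) ** 2
    resident = int(tiles_total * 0.10) * tile
    print(f"Resident tiles (~10%): {resident / MB:.2f} MB")  # ~0.80 MB

The exact residency fraction depends entirely on the scene; the point is only that compressed, partially resident textures need far less fast memory than their raw sizes suggest.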



gmcmen said:
Why are you guys even bothering to reply to megafenix? It's obvious he has no idea what he's talking about; it's all fanboy speculation that's impossible.


You are not much better with your Dolphin pictures.



fatslob-:O said:
FrancisNobleman said:
fatslob-:O said:

This doesn't mean that Shin'en won't exaggerate the Wii U's performance.


They are 3 guys, not bought out like other companies. They have clearly enjoyed maximizing Nintendo's hardware for some years now. You're the one exaggerating.

 

But oh well, history tells me that you won't last long on this site, son =)

It doesn't matter whether they're bought out or not, they obviously have some agenda.

History tells me that you didn't know how to do math LOL. I wonder how hard figuring out the pixel output of the Wii U was? The ones who are uneducated will continue to fear, and I obviously abide by the rules.

You are the one with an agenda in here, in case you haven't noticed =)

Reported =)


Lol Red Dead the second got banned.



Wyrdness said:
Lol Red Dead the second got banned.


Your avatar totally matches that comment.

 

I wonder if we're gonna get some videos and pics of this game before he comes and ruins the thread once again with his nonsense.



FrancisNobleman said:
Wyrdness said:
Lol Red Dead the second got banned.


Your avatar totally matches that comment.

 

I wonder if we're gonna get some videos and pics of this game before he comes and ruins the thread once again with his nonsense.


I hope we do. He'll be hopping mad after this though. I have to commend your earlier post's timing: you said people like him don't last that long here, then a few posts later he's gone. That was some hilarious Doctor Who-style timing there.



gmcmen said:
Why are you guys even bothering to reply to megafenix? It's obvious he has no idea what he's talking about; it's all fanboy speculation that's impossible.


Like 35GB/s for the Wii U eDRAM bandwidth wasn't speculation lol (even 70 or 130GB/s is still speculation).

It makes no sense, since that would mean an eDRAM bus of only 512 bits, which even the GameCube packed more than a decade ago.

It also makes no sense because ports would be impossible. Sure, the Xbox 360 GPU and eDRAM were separated by an external bus of 32GB/s, but the ROPs were inside the eDRAM die and had full access to its 256GB/s of bandwidth. Dude, the numbers don't even match.

Here you can confirm that: just look for the render output units and you can clearly see that they are inside the eDRAM die. That's why they have full access to the eDRAM's 256GB/s of bandwidth. The reason the GPU didn't have full access to the eDRAM is that it was on a separate die and had to wait for the eDRAM+ROPs to deliver. But the Wii U GPU and eDRAM are both on the same die, so obviously there is no external bus, and the GPU gets full access to the eDRAM bandwidth.

Xbox 360 eDRAM + render output units (ROPs):

http://meseec.ce.rit.edu/551-projects/spring2012/2-4.pdf

 

If the Wii U eDRAM were really 32GB/s, then divide that among the ROPs, texture units, etc.; it clearly makes no sense.

At the very least you need 4096 bits to have something like 282GB/s; that should prove to be enough, but only for resolution.

That's why 8192 bits sounds more logical, since 563GB/s is enough for resolution, textures, vertex data, etc.
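
All of these figures fall out of the same simple formula: bandwidth = (bus width in bits / 8) x clock. A quick sketch in Python, assuming the commonly reported 550MHz Wii U GPU clock (the actual eDRAM bus width and clock have never been confirmed, so every row here is speculation):

    # bandwidth (GB/s) = bus_width_bits / 8 * clock_hz / 1e9
    def bandwidth_gb_s(bus_width_bits, clock_mhz):
        return bus_width_bits / 8 * clock_mhz * 1e6 / 1e9

    CLOCK_MHZ = 550  # commonly reported Wii U GPU clock; treat as an assumption

    for bits in (512, 1024, 4096, 8192):
        print(f"{bits:5d}-bit bus @ {CLOCK_MHZ} MHz -> {bandwidth_gb_s(bits, CLOCK_MHZ):6.1f} GB/s")
    # -> 35.2, 70.4, 281.6 and 563.2 GB/s respectively

    # One common way the quoted Xbox 360 figure is derived: a 4096-bit internal
    # path at the eDRAM daughter die's 500 MHz clock.
    print(f"Xbox 360 check: {bandwidth_gb_s(4096, 500):.0f} GB/s")  # 256 GB/s

Which of those bus widths, if any, matches the real hardware is exactly what is being argued about here.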

It is consistent with Renesas' statements about the Wii U eDRAM being their latest technology.

It matches the statement from Shin'en that the Wii U eDRAM has so much bandwidth that it's scary.

It also makes sense, since 10 megabytes at 256GB/s is only enough for a 720p framebuffer, like Microsoft stated; they admitted it is only for that and can't be used for textures or vertex data or anything else except resolution or MSAA, etc.

 

Besides, Shin'en mentioned that with the Wii U eDRAM you only require 7 megabytes for 720p with double buffering, yet on the Xbox 360 you need the whole 10 megabytes for that. This kind of suggests that you get the same bandwidth as those 10 megabytes of eDRAM (the ROPs have 256GB/s of bandwidth to the eDRAM since they are on the same die) with just 7 megabytes of Wii U eDRAM.
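
The 7 megabyte figure does line up with straightforward arithmetic if you assume 720p, 32-bit color and no MSAA (those assumptions are mine, not Shin'en's):

    # Size of a double-buffered 1280x720 color target at 4 bytes per pixel.
    MB = 1024 * 1024
    width, height, bytes_per_pixel = 1280, 720, 4  # 720p, RGBA8, no MSAA

    color_double_buffered = width * height * bytes_per_pixel * 2
    print(f"double-buffered 720p color: {color_double_buffered / MB:.2f} MB")  # ~7.03 MB

    # A 32-bit depth/stencil buffer at the same resolution is another ~3.5 MB,
    # which is why 720p render targets eat through a 10 MB pool so quickly.
    depth = width * height * 4
    print(f"depth buffer: {depth / MB:.2f} MB")  # ~3.52 MB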

 

I don't see 563GB/s as too much or impossible. Come on, even Sony said they were considering aiming for 1 terabyte per second of bandwidth using eDRAM.

 

Here, Sony both mentions over 1 terabyte per second from eDRAM and also explains why they decided not to use eDRAM and instead increased both the amount and the bandwidth of the GDDR5:

http://www.gamechup.com/ps4-sony-earlier-thought-about-slow-gddr5-edram-1088gbs/

"

PS4: Sony Earlier Thought About Slow GDDR5 + eDRAM (1088GB/s)

 

The PS4 has a super fast memory architecture in GDDR5 which is capable of 176GB/s of bandwidth; however, Mark Cerny has mentioned that they had considered an alternative memory architecture.

As you can see in the above image on the right, one of the memory architectures they had thought of going with was GDDR5 memory with 88GB/s of bandwidth, slower than the one they finally went with, along with a small amount of eDRAM providing a bandwidth of over 1000GB/s. That sounds great, doesn't it? But Cerny said that it would have been difficult to code for in a straightforward way, and developers would have had to come up with a separate technique to take full advantage of it.

The Xbox One features a small amount of eSRAM and offers the same functionality of high bandwidth to make up for the slow DDR3 RAM.

That's the main reason Sony chose the memory architecture on the left of the image. A wide bus and unified memory at 176GB/s would be really easy for developers to code for, and it was their philosophy to provide a simple architecture with the PS4. Cerny said that the 256-bit bus GDDR5 bandwidth of 176GB/s "is quite a lot".

He revealed that third parties had demanded a unified memory architecture with a powerful GPU. This is in complete contrast with the PS3, which had a fairly weak GPU in the RSX and a divided XDR memory. Sony had originally planned to go with just 4GB of GDDR5, but certain third-party developers managed to convince them that it was a really bad idea.



"



FrancisNobleman said:
Wyrdness said:
Lol Red Dead the second got banned.


Your avatar totally matches that comment.

 

I wonder if we're gonna get some videos and pics of this game before he comes and ruins the thread once again with his nonsense.

Sadly, at this point the game has been built up so much that no matter how great it looks, some people are going to be all "lol, this is what they were bragging about? So unimpressive, could be done on PS3, lololololol".