
Forums - Nintendo Discussion - FAST Racing NEO powered by 2nd generation engine for Wii U supports and uses 4k-8k textures

Roma said:
damn I forgot what this thread was about until curl mentioned it in his last post

came in here hoping for some footage. I wonder when we will get some

Tell me about it!

They are being such incredible teases, talking up their new engine for months, then giving us only a title screen with a single in-game car on it.

I want footage, dammit! XD

People keep asking them on twitter when they'll show gameplay, or even screens, but all they'll say is "we don't know yet."

They're blue-balling us hardcore.



fatslob-:O said:

I couldn't care less about top notch voice acting and story, but at least give me a lot of content.

I don't want a game that just looks better! I want a game to have more content for my money while also being enjoyable too. Both Forza 5 and Killer Instinct were disappointments because of the lack of content, not their technical showcases! Forza 5 looks like a current gen title and Killer Instinct is in 720p, but that's not the issue! The most important parts of games are enjoyment and replayability. If Shin'en can't deliver a lot of fun content then I'm going to be disappointed because of it.

I'll tell you right now; Shin'en make fun $10 downloadable games. For that price, you get a ton of content, far more than is usual for $10 eshop titles. But if you're expecting the amount of content you'd get from a $60 retail game, you're going to be disappointed.



megafenix said:


pathetic, so you like to mess things up so only you can understand yourself, and then say that others who don't get what you say are fools?

yeah right

I was being sarcastic lol

Sandy Bridge and your bla bla bla, just to go off on a tangent

just look at the GameCube, 512 bits dude, why after more than a decade would you suggest only double that?

yeah, I brought up the Xbox, so what?

is the Wii U eDRAM on a separate die like the Xbox 360 GPU was with its eDRAM?

no dude, it's on the same die as the GPU, just like the GameCube's Flipper, except this time we have lots of megabytes and a newer eDRAM design

don't joke around dude, that's pretty illogical

and again, Renesas says best eDRAM, which is 8192 bits

Shin'en says lots of bandwidth on the eDRAM

1024 bits will give you just 35GB/s since the clock speed with the GPU is 550MHz, obviously that's to have coherency just like the GameCube did with its embedded memory

you can't get 70GB/s when you don't have the 2048 bits option, and even that is very little

the choice is 8192 bits, 563GB/s

the GPU is able to handle that


Instead of resorting to personal attacks because you don't understand something... try to be a little more tactful in your replies; it will get you further in life in general.
Nothing I have said should be construed as nasty or rude.

And do keep in mind that internal bandwidth isn't equal to the eDRAM's clock rate and external bus width; they're treated as separate entities, something you seem to be getting confused about.

Here, allow me to assist with a simple explanation.
Let's say the Xbox 360's eDRAM runs at 500MHz, and the GPU runs at 500MHz.
There is still only going to be 32GB/s of interconnect bandwidth between them.

However, the eDRAM has an internal bus which shuffles data around to all the separate pieces of logic inside the eDRAM at 256GB/s; the GPU doesn't get 256GB/s of bandwidth, far from it.
The Xbox 360's internal eDRAM bus operates at a 2GHz frequency on a 1024-bit bus, despite the eDRAM's actual frequency being something like 500MHz.

Keep in mind, different transistors operate at different frequencies and they all have different power and leakage characteristics, thus a processor such as a CPU may have different parts of the chip running at different speeds; the same thing goes for eDRAM.
The only difference is, Microsoft decided to advertise the 256GB/s number and people thought that was its actual bandwidth. (Unfortunately, it's far more complex than that.)
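If you want to sanity check those figures, the usual back-of-envelope formula is just bus width times clock divided by 8. A minimal sketch (assuming a 512-bit eDRAM-to-GPU link, which is simply what 32GB/s at 500MHz works out to; none of these are official figures):

# Rough bus bandwidth estimate: width (bits) / 8 * clock (MHz) / 1000 -> GB/s
def bus_bandwidth_gb_s(width_bits, clock_mhz):
    return width_bits / 8 * clock_mhz / 1000

# Xbox 360 eDRAM <-> GPU interconnect (assumed 512-bit link at 500MHz)
print(bus_bandwidth_gb_s(512, 500))    # 32.0 GB/s
# Xbox 360 eDRAM internal bus (1024-bit at an effective 2GHz)
print(bus_bandwidth_gb_s(1024, 2000))  # 256.0 GB/s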

Also, another thing to keep in mind is that the Wii U GPU doesn't exactly have a massive performance advantage over the Xbox 360's GPU, so its bandwidth requirements are not going to be stupidly massive, thanks to improvements in bandwidth conservation such as texture compression and culling, but that's just simple logic.

However, where the Wii U really has an edge is in efficiency; the Wii U can do far more work, with extra effects, per gigaflop than the HD twins can. That's mainly due to the advancements in efficiency made in the PC space, which the Wii U benefits greatly from, even if it is VLIW5.
It's also part of the reason why developers have a hard time extracting better performance out of ports; the GPU architecture has changed substantially, not to mention the eDRAM.

One thing's for sure, the Wii U isn't a powerhouse; its performance is only "good enough".
However, Nintendo in general has very good art styles to hide such deficiencies, but from a graphical perspective, nothing has "wowed me" since the GameCube days; art, on the other hand, has often blown me away.

Ryudo said:


Bits were always a marketing term in the past, up to the Dreamcast. They were never an actual measurement when it came to game consoles. So bits haven't really mattered since the GBA and have always been worthless. It's not an OS.

You're correct to an extent.
Back in the SNES era the "bits" were used as a way to work out colour depth or CPU register sizes, etc.
For example, the Nintendo 64 had a 64-bit processor.

However, if you fast forward to today, its use has shifted from that.
Bits and clock rate can be used to gauge the bandwidth of particular buses; you can't go by the bit width or the clock rate separately to gauge such things, they have to be used in conjunction.

For example, DDR3 on a 512-bit memory bus running at 800MHz is equal to GDDR5 on a 256-bit memory bus also running at 800MHz.
If both use the same 512-bit bus, the GDDR5 will win thanks to its quad data rate.

Essentially, the larger the bus width to system memory, the larger the bandwidth (if everything else is kept equal) and the more performance you will obtain, hence why it's important.
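As a rough sketch of that example (assuming the usual transfer multipliers, 2 per clock for DDR3 and 4 per clock for GDDR5; the 800MHz clock and bus widths are just the hypothetical numbers above):

# Effective memory bandwidth: width (bits) / 8 * clock (MHz) * transfers per clock / 1000 -> GB/s
def mem_bandwidth_gb_s(width_bits, clock_mhz, transfers_per_clock):
    return width_bits / 8 * clock_mhz * transfers_per_clock / 1000

print(mem_bandwidth_gb_s(512, 800, 2))  # DDR3,  512-bit @ 800MHz -> 102.4 GB/s
print(mem_bandwidth_gb_s(256, 800, 4))  # GDDR5, 256-bit @ 800MHz -> 102.4 GB/s (equal)
print(mem_bandwidth_gb_s(512, 800, 4))  # GDDR5 on the same 512-bit bus -> 204.8 GB/s (wins)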

Wyrdness said:


Oh yes they did, and they did it quite brutally; he asked you a question which you completely tried to dodge. Hynad made the point that even with lower specs, consoles run games at a level that PCs need significantly higher specs to match, and all in all he's right; his point wasn't really disproved, so learn to read. Point to where I mentioned eyecore, or is that another reading problem you have? If it is, I'm disappointed in the village elder's lack of obligation to ensure his follower can read; as well as the village elder, you have a man crush on this eyecore, huh.

I can give you plenty of examples of PCs with lesser hardware running the same game as a console and still looking better.
Oblivion? Call of Duty 4? BioShock? Unreal Tournament? All can run on Radeon 9500 to X850 XT class hardware; the Xbox 360 has a Radeon X1800/X2900 hybrid class GPU.



--::{PC Gaming Master Race}::--

curl-6 said:
fatslob-:O said:

I couldn't care less about top notch voice acting and story, but at least give me a lot of content.

I don't want a game that just looks better! I want a game to have more content for my money while also being enjoyable too. Both Forza 5 and Killer Instinct were disappointments because of the lack of content, not their technical showcases! Forza 5 looks like a current gen title and Killer Instinct is in 720p, but that's not the issue! The most important parts of games are enjoyment and replayability. If Shin'en can't deliver a lot of fun content then I'm going to be disappointed because of it.

I'll tell you right now; Shin'en make fun $10 downloadable games. For that price, you get a ton of content, far more than is usual for $10 eshop titles. But if you're expecting the amount of content you'd get from a $60 retail game, you're going to be disappointed.

Thanks for killing the little hopes I had left about having more game content in the future.



Here's Shin'en's Twitter, guys. If any of you would like to ask them about the game or their engine, they've been known to answer technical questions.

https://twitter.com/ShinenGames



curl-6 said:
Roma said:
damn I forgot what this thread was about until curl mentioned it in his last post

came in here hoping for some footage. I wonder when we will get some

Tell me about it!

They are being such incredible teases, talking up their new engine for months, then giving us only a title screen with a single in-game car on it.

I want footage, dammit! XD

People keep asking them on twitter when they'll show gameplay, or even screens, but all they'll say is "we don't know yet."

They're blue-balling us hardcore.

I hope for their sake it does not disappoint





    R.I.P Mr Iwata :'(

fatslob-:O said:
eyeofcore said:
fatslob-:O said:
megafenix said:

Again, when did I ever claim the bandwidth to be 35.2 GB/s? You clearly made a strawman argument there.

The GameCube had 3MB of embedded RAM! My god, how can an apologist like yourself not know about the hardware of the company you're obviously biased towards?

BTW, the Wii U's reported embedded RAM bandwidth is 70.4 GB/s, which gives it a bus width of 1024 bits.

The one with strikes here is you, clown, because you're clearly some stranger who hasn't proven his own ground yet on this site.

How about solving the volume integral of 2x + 2y where the boundaries for both x and y are -1 to 1? (This is something any engineer could do, even prospective ones! You gonna cop out on this one or what?)

You sound like some fraud getting overly defensive about a PC master race's evaluation of your beloved Wii U.
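(For what it's worth, assuming the intended region is the square -1 <= x, y <= 1, that integral works out, by direct calculation or simply by odd symmetry, to zero:

\int_{-1}^{1}\int_{-1}^{1} (2x + 2y)\,dx\,dy = \int_{-1}^{1} \left[ x^{2} + 2xy \right]_{x=-1}^{1} dy = \int_{-1}^{1} 4y\,dy = 0 )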

@bold You take Fourth Storm's speculation/theory as fact? It has not been confirmed, you fraud!

@underlined Your ignorance is remarkable; he did not say that you claimed 35.2GB/s, and I haven't read anything like that... He was explaining the Xbox 360's eDRAM and the bandwidth of the ROPs that are embedded into the eDRAM, while the eDRAM is not embedded into the GPU and thus needs to go through a narrow bus. He is explaining the difference between the Wii U and Xbox 360, and he just assumes that you think the Wii U's bandwidth is low since you said it suffers from low bandwidth, yet he and I can disprove that anytime.

Most ports, if not all ports, from major 3rd party developers are cheap ports of Xbox 360 builds/versions of their games, so those games use just 10MB of eDRAM. So if it were 70.4 GB/s with all 32MB of eDRAM, the 10MB that would be used would mean 23.46 GB/s, compared to the Xbox 360's external bus of 32 GB/s, so it would be seriously bottlenecked and the game would not run, nor would ports like Need for Speed: Most Wanted U run or be any better at all.

Renesas has 1024 available, yet using that would cause trouble and would be a waste if Nintendo was not aiming for 1080p for their own games and thinking of the future. 4096 is the next option available and would mean 281.6 GB/s for all 32MB of eDRAM, so with the 10MB used by an Xbox 360 port it would have 93.86 GB/s, yet there is another problem: the eDRAM in the Xbox 360 functions considerably differently than on the Wii U, since it is not embedded.

There are countless differences between the Wii U and Xbox 360; for example, the Xbox 360 has 2 threads per core while the Wii U has just 1 thread per core.

Since most 3rd party games on the Wii U are cheap ports, those teams only try to achieve acceptable/bearable performance rather than rewrite the game engine in any major way; plus those port teams are the less experienced employees of the company.

Look at Assassin's Creed 3 on Wii U and compare it to Assassin's Creed 4 on Wii U; the difference is noticeable, and in some areas huge!

(Assassin's Creed 3 Wii U)

(Assassin's Creed 4 Wii U)

Both of these games are cheap ports, yet AC4 looks better than AC3 because they rewrote some engine code and optimized AC4 better for the Wii U compared to AC3.

Still haven't addressed my first comment yet, eh?

I did, you are just in denial and I can't help you with that...

BTW, I wonder what happened to COD Ghosts and Origins for the Wii U?

Origins got outsourced to Human Head Studios, and it's their first time working on the Wii U for all we know (this has been known for several weeks), while COD Ghosts was done by a small team of 20-30 people porting the Xbox 360 build of the game, so they were most likely in a rush because they were understaffed, with a budget only big enough to fund the porting; the difference between the Wii U and Xbox 360 also played a role. They did not have the budget or staff to rewrite major game/game engine code.

fatslob-:O said:
eyeofcore said:
fatslob-:O said:

 

eyeofcore said:

@Bold Just how much pointless shit are you going to post just to respond to me?

Why is it "pointless shit"? It is above all on topic and directly related to your statements; it involved bandwidth, a crucial/important difference between the Wii U and Xbox 360, and it is aimed at your post/reply, so I assume that you are avoiding this part that you call "pointless shit".

I don't give a rat's ass about the difference between the Wii U and X360! You didn't respond to the question I was asking with the pointless shit you gave me.

The problem is that this "pointless shit" is only pointless shit to you, and you don't care about the difference between the Wii U and Xbox 360 enough to get the picture, and yet I did respond to your question while you are avoiding mine like the black plague!

eyeofcore said:

"Secret sauce"? Really, dude? I guess you don't really know how semiconductor foundries work then, eh? Renesas were the ones DESIGNING the eDRAM and the one who FABBED it was TSMC. If Renesas were to go under it would mean nothing to Nintendo, because they could just licence the IP from them to continue manufacturing the Wii U.

I was quoting ("") the article. Tell me something that I don't know... I love how you presume this and that about me.

Quote from article;
"As part of that restructuring Renesas, that merged with NEC Electronics in 2012, announced that it decided to close four semiconductor plants in Japan within 2-3 years, including the state-of-the-art factory based in Tsuruoka, Yamagata Prefecture  (as reported by the Wall Street Journal), and this may spell trouble for Nintendo and the Wii U.

The reason is quite simple. The closing factory was responsible for manufacturing the console’s Embedded DRAM, that is quite properly defined the “life stone” of the console."

 

What do you understand by manufacturing?

I'll tell you something. You gave even more pointless shit to the discussion at hand and still haven't even gotten around to answering my question! I'll say it once again and no more: RENESAS DOES NOT MANUFACTURE THE eDRAM ITSELF, THAT IS SUPPOSED TO BE TSMC'S JOB

You told me pure BS. Wow... You accuse me of not answering the question yet I answered it; your ignorance is shocking! Renesas does produce the eDRAM in their own plants, like NEC did for the Xbox 360 in their own plants, and NEC got acquired by Renesas. Also, the reason the Xbox 360 did not have the eDRAM embedded into the GPU is because it was produced at a separate factory! Whereas for the Wii U's GPU the eDRAM is embedded into the GPU, and that is only possible if the GPU and eDRAM are manufactured at the same plant!

(picture of Wii U GPU from Chipworks)

Quote from the ARTICLE;
"Nintendo could try to contract another company to produce the component, but there are circumstances that make it difficult. According to a Renesas executive the production of that semiconductor was the result of the “secret sauce” and state-of-the-art know-how part of the NEC heritage of the Tsuruoka plant, making production elsewhere difficult. In order to restart mass production in a different factory redesigning the component may be necessary." [CONCLUSION; It is not produced at TSMC, further confirmed by Wii U GPU die]

Just because AMD's GPUs were produced at TSMC does not mean that they cannot be produced elsewhere, and it has been confirmed that AMD can produce their GPUs elsewhere, like at GlobalFoundries with AMD's APUs that contain the GPU portion!

eyeofcore said:

Duh. The Wii U is SUPPOSED to be easier to develop for because it has better programmability. (Again, don't give me pointless shit in your post, because I have followed a lot of tech sites while doing research and I'm not clueless on these subjects.)

Another thing that I already know... :/

Of course it's easier, but if it's a port what can you expect?

Even the PlayStation 4 version/build of Assassin's Creed: Black Flag doesn't look that impressive on PlayStation 4, despite how powerful it is and the fact that it is even easier to develop for since it has no eDRAM, just main RAM! So no matter how lazy/sloppy the developers are, that should ease things a lot, yet they fail to deliver an impressive port/build with all of those advantages!

AGAIN YOU CONTINUE TO NOT ANSWER MY QUESTION AT HAND. YOU GAVE ME ANOTHER POINTLESS RESPONSE. DO YOU HAVE AN ISSUE WITH ENGLISH?!

Yet again, I did answer your freaking question, then I added the "pointless" part that is not pointless, but rather connected to what you were saying. You have an issue bro, and I am afraid I can't help you. Your ignorance is out of my league to cure, bro.

You can't comprehend that people around the world have a different native language than yours, and that English is their second language, not their first, so they most likely won't speak and write it perfectly, while even you stumble sometimes, so shut up. Nobody is perfect.

Deal with it... For once in your life!

eyeofcore said:

What about the Rayman Legends creator saying that the Wii U is "powerful"? The word "power" is a vague term once again.

The word "power" is not a vague term, nice try. What about him, you say? I already pointed out that he is a graphic artist and a programmer, so he is not just a game designer; he works with the hardware and he codes for it! He says that the Wii U's hardware is powerful since he has worked on it and programmed for it; he said that the game worked with uncompressed textures on the Wii U, so it takes more RAM and also more resources because more data is processed.

Again, the word "power" is undefined. What are we talking about specifically? Is he talking about the fillrates, shaders, or its programmability? No context was given and thus the meaning is lost. What does this have to do with my comment initially at hand?!

Definition of the word "power": the rate of doing work, measured in watts or (less frequently) horsepower / capacity or performance of an engine or other device
Definition of the word "powerful": having great power or strength

@underlined You made a cowardly move, my friend; that does not invalidate his statement, and he is under NDA so he can't talk about the fillrates, shaders or the programmability, so of course he cannot share it. Your counter argument just got denied and is labelled invalid! At least Shin'en was able to share some things;

"Getting into Wii U was much easier then 3DS. Plenty of resources, much faster dev environment."

"Wii U GPU and its API are straightforward. Has plenty of high bandwidth memory. Really easy."

"You need to take advantage of the large shared memory of the Wii U, the huge and very fast EDRAM section and the big CPU caches in the cores."

[Yes, the website's car is directly from the game. / Well, we are simply too small to build HQ assets just for promos]

@italic&underlined It has to do with your assumption, which is incorrect, yet you will still continue to force it...

eyeofcore said:

Wow, you're really hurt about another man over the internet, aren't you? You're afraid of somebody disproving you?! *Facepalm*

You are just forcing a premise that is not correct, and why would I be hurt over a person that I don't know? It is inlogical and you are trying to provoke me since you are a troll, forcing a rumor as a fact and thus damaging your own credibility. If I were afraid then I would not be responding to you in the first place, so you should be facepalming at yourself and not at me for asking these two questions.

Then why don't you disprove the rumor at hand?! There's more evidence for it than against it. "inlogical"?! LMAO, you really have terrible English, don't you? It's "illogical"!

Why would I, since the person that created the thread said you should treat it as a rumor; the guy said it is a rumor and mere speculation, so deal with it. Claiming that there is more evidence for it than against it is really a fallacy that became the "truth" thanks to people who keep fooling and lying to themselves.

First of all, the shaders or ALUs would have to be 80 to 90% larger for no apparent reason, and no reason is given why they would be 80 to 90% larger; plus there is about 90mm^2 of die space available/dedicated exclusively to the GPU. The Wii U GPU die size is 146.48mm^2 according to Chipworks, while with the protective shell it is 156mm^2 according to AnandTech. Chipworks confirmed it is made on 40nm lithography.

A Radeon HD 6570M (VLIW5) is 104mm^2 with the protective shell, and without it should be roughly 96mm^2; its specifications are:
480 SPUs / 24 TMUs / 16 ROPs / 6 CUs, with a clock of 600MHz and a TDP of 30 watts.

Lowering the clock by 8.5% to 550MHz would lower the TDP to about 25 watts. Now you will try to deny that it can fit into 90mm^2 and claim that it consumes far too much for the Wii U. I can counter that easily, since, as you, fatslob-:O, said, they can shrink/increase density by 10% by optimizing the design, so it would be 86.4mm^2 and would also shave off another 10% of TDP, putting it at 22.5 watts (see the quick sketch of this arithmetic below)! Considering that the Wii U's GPU, codenamed Latte, was customized by Nintendo and AMD, there would be further improvements to the design, and since the Chipworks guy said he cannot identify which series and exact model this GPU is, that shows that Latte is indeed a customized, heavily customized GPU.
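A minimal sketch of that back-of-envelope arithmetic (the ~25W figure after the clock drop and the 10% optimization saving are rough assumptions from this post, not measured values):

# Back-of-envelope only; the inputs are the rough assumptions used in the post above
base_area_mm2 = 96.0   # assumed Radeon HD 6570M-class die area without the protective shell
tdp_at_550mhz = 25.0   # assumed TDP after dropping the clock ~8.5% from 600MHz to 550MHz

optimized_area = base_area_mm2 * 0.9  # assumed 10% density optimization -> 86.4 mm^2
optimized_tdp  = tdp_at_550mhz * 0.9  # the same assumed 10% saving      -> 22.5 W

print(optimized_area, optimized_tdp)  # 86.4 22.5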

Now you will still argue that it is not possible because the Wii U consumes 30 to 35 watts maximum, yet I can counter that simply by saying that the Wii U has a Class A power supply with an efficiency of 90%, rated at 75 watts, so it can handle 67.5 watts at maximum. Since the Wii U consumes 30-35 watts, we can assume that Nintendo locked some resources in the Wii U's SDK/firmware. You will try to counter that, yet the PlayStation 3 SDK/firmware was updated and freed up an additional 70MB for games from the OS, and on the 3DS an update allowed more CPU cycles for games; the Xbox 360 SDK also got updated, which resulted in less usage of RAM, CPU and GPU resources, with one core no longer being exclusive to audio like in the beginning. Shin'en said this in an interview;

"And for the future, don’t forget that in many consoles, early in their life cycle, not all resources were already useable or accessible via the SDK (Software Development Kit). So we are sure the best is yet to come for Wii U."

eyeofcore said:

Who the hell cares what Renesas says?! I want to see the end result!

So nobody cares about a statement from the company that designed and produces the eDRAM for the Wii U? It was said by a Renesas executive and not by some guy on a forum who may or may not be correct; this is straight from Renesas itself, and when DualShockers wrote "secret sauce" that means it is most likely a custom design, and "state-of-the-art know-how" literally means/hints that the best of the best is being applied to the Wii U. If you want results then wait; it's like chemistry, the result is not instant, nor is designing a weapon like a mortar. It needs time to shine, like on the PlayStation 3, like on the Xbox 360, like the Source engine.

You cannot get the result right from the start... :P

By your logic we would have seen everything from the start in every console/handheld/mobile/technology generation... Your logic. :/

I like how you ignore the PC. Are you going to answer the comment directly or will you continue posting more pointless crap for me to read?

I answered you directly; it seems you cannot comprehend that I answered your question directly... I am sorry for ignoring the PC, did I hurt your feelings or something? When it comes to devices, the PC is configurable while consoles, handhelds and mobile phones are basically a bunch of SKUs, and you cannot upgrade their CPU, GPU or RAM, except for the N64, which had the Expansion Pak that doubled its amount of RAM.


eyeofcore said:

Who gives a damn about Shin'en?! Are they really that important?!

The same could be said for Crytek, Rare, Sony Santa Monica, Guerrilla Games, Valve, id Software, Epic Games, etc... when they were small and not really relevant. Also, they really are that important, because they use the hardware and try to get the most out of it, and you forgot that these guys came from the demo scene, with different principles than modern day programmers.

Look at FAST Racing League and Jett Rocket, which people would have thought were Xbox 360 games and not Wii games; Shin'en did things on the Wii that others did not, including ambient occlusion in their games. Don't doubt these guys that were in the PC demo scene.

More pointless crap instead of answering my initial concerns at hand, eh?

Hey, I can't blame you. For you it is pointless crap, while for others it is an answer, and I was discussing your logic; so by your own logic you actually think your opinion is pointless crap. I am not trolling you, I am just describing the opinion/behaviour that you have shown.

eyeofcore said:

More people?! Wow man, you really can't defend your own ground.

DID I CLAIM THAT THE eDRAM ONLY HAD 35GB/s BANDWIDTH? (Again, your lack of English understanding makes everyone confused.)

Seriously?

Sorry for the misunderstanding then, so would you be so kind to tell me how much you think it has?
Can you also at the very least provide some kind of proof like a formula to calculate it?

And please, don't forget I can get about 10GB/s of bandwidth with just 1MB of the old embedded memory on the GameCube's GPU!

Please answer the last two questions involving bandwidth!

It depends on where the memory needs to flow, and BTW it's 3MB of eDRAM. Kinda sad how you don't know about the hardware of a certain console from your favourite console manufacturer.

Agreed, and I know that, except we have two separate pools of eDRAM, not one pool, if you are suggesting it is one pool (it's not). It is actually sad that you assume a certain company is my favorite console manufacturer, since the only console I ever had was the original PlayStation, which I got in 1999, and I have been on the PC/x86 platform since 2004. I admit that I had brief contact with the PlayStation 2 and PlayStation 3 at my friend's house, and that's about it. I've never seen any Nintendo platform in person, so I have not even touched, let alone played, games on Nintendo's platforms. I don't own any Nintendo platform or their games; your point and provocation is without any foundation, so move along. Your desperation and ignorance stink up this place.

The memory flow is not constant, and what's more, the constraints aren't clear either for different workloads.

Everything has variables... Tell me something I did not know already.

Your ignorance should be studied; you would be an amazing individual to subject to some psychological experiments...

You are both highly and lowly intelligent at the same time, like a CPU with a low amount of cache; even an idiot can have a moment of pure genius!

Another shitty post from you! I'm not surprised. Cool story you gave me. Now go and directly answer my first comment instead of going on a stupid worthless tangent with me.

I am not surprised you are in denial, and as I said countless times before, I did, and you went off on a stupid worthless tangent by avoiding my questions and counter arguments. You are just a troll with no credibility, and you avoided the question about the freaking bandwidth...

Megafenix uses the formula every single day in the forums, yet you, the so-called "PC master race", know nothing.

fatslob-:O said:
curl-6 said:
fatslob-:O said:

Who cares about Shin'en!

They're an excellent source to use here because (A) they are a very technically talented studio who know their stuff, (B) they are one of the few studios seriously invested in Wii U development, and hence have more experience with the hardware, and (C) they are pretty much the only experienced Wii U dev willing to discuss tech details. (Good luck getting EAD Tokyo to talk texture resolution, rendering methods, etc. haha)

I'm not dissing them, but if eyeofcore keeps typing up more shitty, long posts for me to read I'll only get more irritated, because he's not answering my first comment directly! Seriously, does that dude even read what he posts or some shit?! His long paragraphs or essays are redundant too, which makes him even more tiresome.

You are; stop being in denial. It is "shitty" to you because you can't answer it, and it's your issue if you get irritated by my long responses, since you don't have patience, and I did answer your first comment. Stop being in denial. I read before I reply. You have an issue bro.

You are tiresome to the whole community... :/

@eyeofcore If you're listening to me, why don't you stop writing an essay in response to a couple-line comment and start typing up less shit while getting to the point directly?!

As I wrote before, it is "pointless shit" to you, and I see that you don't have the virtue of patience... If discussing things that are related to the topic is "shit", then all your responses are, by your own definition and logic, "shit" and "pointless shit" also. It is sad that you and some other people never learned to counteract one thing that our brain does, which is to find the simplest way to get an answer while maintaining some sort of logic; the brain does that because it wants to save energy for upcoming decisions and tasks.

You always choose the simplest decision/answer, and it is predictable; you don't think about other factors, you don't do the research, and you don't connect the dots at all to have that "Eureka!/Aha!" moment when you solve a problem. You continuously prolong things, you are looping.

 

fatslob-:O said:
megafenix said:
fatslob-:O said:
megafenix said:

thought you were smarter than that, or maybe you are but just decided to play the fool when it's not convenient

very well

here

4096 bits * 500MHz / (8 bits * 1000) = 256GB/s

that's how you get the internal bandwidth of the Xbox 360 eDRAM, to which the ROPs have full access; then when the ROPs finish the job they pass the result to the GPU framebuffer through an external bus of 32GB/s

so clearly no port can work with the bandwidth you are claiming, and it's also very strange that you fail to provide more bandwidth than what I can get with just 4 megabytes of the old GameCube embedded memory on the GPU, with a brand new eDRAM

so, tell me, how did you get the Wii U eDRAM bandwidth? don't forget it's embedded directly on the GPU, no external buses, so please

let's see, eyeofcore suggested that you are claiming 512-bit wide eDRAM for the Wii U, right?

512 bits eDRAM * 550MHz GPU / (8 bits * 1000) = 35.2GB/s

coincidence?

yeah right, two strikes dude, one more and you are out

Renesas doesn't support 512 bits, go to their website dude

Memory Size | Configuration (word x bit) | Random Access Freq. | DVCC (VDD)    | Tj
64 Mb       | 256 Kw x 256 b             | 133 MHz             | 1.1 V ± 0.1 V | 0°C to 105°C / -40°C to 125°C
8 Mb        | 32 Kw x 256 b              | 220 MHz             | 1.1 V ± 0.1 V | 0°C to 105°C / -40°C to 125°C
8 Mb        | 64 Kw x 128 b              | 220 MHz             | 1.1 V ± 0.1 V | 0°C to 105°C / -40°C to 125°C


 

just to let you know, Mb means megabits, just in case you didn't know

so

8 megabyte macro of 256 bits

1 megabyte macro of 256 bits

1 megabyte macro of 128 bits

can you please do the honors?

Wii U eDRAM is 32 megabytes, so...?

Again, when did I ever claim the bandwidth to be 35.2 GB/s? You clearly made a strawman argument there.

The GameCube had 3MB of embedded RAM! My god, how can an apologist like yourself not know about the hardware of the company you're obviously biased towards?

BTW, the Wii U's reported embedded RAM bandwidth is 70.4 GB/s, which gives it a bus width of 1024 bits.

The one with strikes here is you, clown, because you're clearly some stranger who hasn't proven his own ground yet on this site.

How about solving the volume integral of 2x + 2y where the boundaries for both x and y are -1 to 1? (This is something any engineer could do, even prospective ones! You gonna cop out on this one or what?)

You sound like some fraud getting overly defensive about a PC master race's evaluation of your beloved Wii U.


and you say you are not a troll, claiming 1024 bits when the old Xbox had 4096 bits?

70GB/s, yeah right

Renesas said that the Wii U uses their latest technology,

from 1024, 4096 and 8192, which is the latest to you dude?

Shin'en also says that the Wii U eDRAM bandwidth is huge

come on dude, even by today's standards, like DDR3 at 50GB/s, 70GB/s falls short considering we are talking about eDRAM embedded directly in the GPU die

also, Shin'en commented that with just 7 megabytes of the Wii U's eDRAM you can do 720p with double buffering, while on the Xbox 360 you need the full 10 megabytes for that, and even Microsoft admits it

so no, anything below 4096 bits is bullshit and trolling

obviously the right choice and the most logical is 8192 bits, giving something like 563GB/s (quick calc below)
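quick calc with the same width * clock / 8 formula as before, so you can see where each of those numbers comes from (550MHz is just the commonly reported GPU clock and the widths are the candidate configurations being argued about, not confirmed specs):

# Candidate Wii U eDRAM bus widths at an assumed 550MHz clock
def edram_bandwidth_gb_s(width_bits, clock_mhz=550):
    return width_bits / 8 * clock_mhz / 1000

for width_bits in (512, 1024, 2048, 4096, 8192):
    print(width_bits, "bits ->", edram_bandwidth_gb_s(width_bits), "GB/s")
# 512 -> 35.2, 1024 -> 70.4, 2048 -> 140.8, 4096 -> 281.6, 8192 -> 563.2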

My god, you need some tampons man, and I really mean it! You're seriously that hurt over another master race's evaluation of the Wii U?!

The only person in here who needs tampons is you, and why would he be hurt over you... You can't even evaluate poo.

It sounds about right because, after all, the X1 is using 32MB of integrated SRAM which is significantly faster than integrated DRAM. You have a problem with 70 GB/s? Then prove me wrong. Why would the Wii U's eDRAM have MORE bandwidth than Intel's eDRAM for the Iris Pro 5200?! It doesn't sound right to me that a bleeding edge manufacturer like Intel would have lower bandwidth than what the Wii U features for its relatively small die size. If anything, it sounds like the eDRAM's bandwidth may actually be 35 GB/s.

The Xbox One and Wii U embedded RAM implementations are different; we have proven you wrong and you are denying it, so how many times do we need to repeat it so you can learn for once? The Wii U eDRAM and Intel eDRAM implementations are different, and the eDRAM in the Iris Pro 5200 acts as an L4 cache; as you know, L1 is the fastest, then you have the slower L2, then the even slower L3 cache, and then you have the L4 cache, which is the eDRAM on the Iris Pro 5200. So it takes time to get to...

http://www.anandtech.com/show/6911/intels-return-to-the-dram-business-haswell-gt3e-to-integrate-128mb-edram

Just because Intel is a bleeding edge manufacturer does not mean they are always perfect, and it also does not mean their design is the best of the best, because they can screw up in the design or other related things. Now you are forcing a premise of 35 GB/s that is not correct, and there is a problem with your logic and your way of thinking.

Your logic is broken; your brain simplifies things too much, so I doubt that you can comprehend complex things, judging by your responses, behaviour and denial. Our brain loves to simplify things to save energy for upcoming decisions and tasks, so of course you will choose the easier/simpler route, and you have been doing just that in this thread the whole time. You are a victim of our natural design.

You did not learn to resist our natural flaw, so you don't consider other factors, you are not really curious, you don't do the research.

You are practically gimped when it comes to logical and rational thinking...

@Bold When did they say that? Provide me a source. They did not disclose the numbers, just so you know, because the word "latest" means newest, BTW, but it doesn't mean more "powerful". We all know the GT 640 is the newer card, yet it's weaker than the older HD 7770, so your point about "latest" is moot.

I already provided the source; you went into denial mode and flat out ignored those two links... You will do it again!

http://www.dualshockers.com/2013/08/04/nintendo-and-the-wii-u-may-be-in-trouble-due-to-closure-of-vital-semiconductor-factory/

That 50 GB/s figure only applies to multichannel set-ups. 

Who cares about Shin'en!

Oh this again!

The same could be said for Crytek, Rare, Sony Santa Monica, Guerrilla Games, Valve, id Software, Epic Games, etc... when they were small and not really relevant. Also, they really are that important, because they use the hardware and try to get the most out of it, and you forgot that these guys came from the demo scene, with different principles than modern day programmers.

Shin'en tries to maximize the utilization of the console and push the graphics as much as they can, same as Crytek, Sony Santa Monica, Epic Games, Guerrilla Games and so on. So people do care about Shin'en, you just can't handle the truth.

Look at FAST Racing League and Jett Rocket, which people would have thought were Xbox 360 games and not Wii games; Shin'en did things on the Wii that others did not, including ambient occlusion in their games. Don't doubt these guys that were in the PC demo scene.

The one bullshitting here is obviously you, because none of these things add up.

He is not BSing anyone; you are BSing everyone in the forum, acting innocent while you are just a troll avoiding questions...

I'm straight up done talking to an apologist like you. Go and take this up with Pemalite to see if he'll say anything different from me!

The problem is that he is not an apologist; the guy does the research, you don't, and you just look for rumors, then state them as facts, and when you get busted you go into denial while also insulting people and claiming things that are not true, like when you said I was a Nintendo fanboy in an indirect way and I swiftly countered it. You are a narrow minded person, what a waste of genes/DNA...

Fellow members, if you are curious then check this out:

http://gamrconnect.vgchartz.com/post.php?id=5830987

http://gamrconnect.vgchartz.com/post.php?id=5831479



fatslob-:O said:
Pemalite said:
megafenix said:

more poor excuses and bullshit from this supposedly tech guy

selecting 1024 bits over 8192 or even 4096 bits when the Xbox used 4096 bits almost a decade ago


What? Are you talking memory bits? Or something more internal?

Because the original Xbox had a 128-bit memory bus with 6.4GB/s of bandwidth.

The Xbox 360, however, had a 128-bit memory bus with 22.4GB/s.
Where the Xbox 360 deviates is with the eDRAM.
It has 256GB/s of bandwidth inside the eDRAM, which is actually 1024-bit @ 2GHz = 256GB/s.
However, that's not interconnect bandwidth to the GPU or system memory; the bandwidth between the eDRAM and the GPU is 32GB/s.
If you want a fair comparison, Sandy Bridge has an internal ring bus that's 436GB/s.
The Radeon 3850's ring bus was also 1024-bit.
The Radeon 48xx series had a 2048-bit internal hub bus.


So this is actually nothing new.
However, when you start talking interconnects, it's simply not going to happen; 4096-bit buses would drive up PCB/package complexity exponentially.


Basically, no part of the original Xbox or Xbox 360 had a 4096-bit bus, unless you know something I don't know from an architectural perspective?

Gee, I wonder who's the poor excuse for a tech guy right now? /sarcasm *rollseyes*

I wonder if we can't take our leave now from these peas... I mean people, Pemalite?

@pemalite

Megafenix and I said that a couple of pages ago... You mostly said things that Megafenix and I already knew and used; thanks for the bolded part. That's new. :)

@fatslob-:O

You are full of excuses, avoidance, sidesteps and ignorance... You could not even answer the question about bandwidth... Bwahaha.

Is Pemalite your dad or brother, your boyfriend or your husband? Every time you lose you call him... Are you crying or something?

fatslob-:O said:
megafenix said:
fatslob-:O said:
AgentZorn said:
fatslob, are you interested in getting a Wii U? You seem to be very interested.

No sir, I'm only interested in knowing about its hardware, sir. LOL

Once the games come I'll eventually pick it up and enjoy it, unlike others here arguing about something pointless.


like the Wii U eDRAM being only 70GB/s, which suggests an eDRAM of 2048 bits despite Renesas not supporting it?

saying things like there being little use for the eDRAM since it's only 32 megabytes, despite eyeofcore schooling you, no sorry, I mean, Shin'en schooling you?

choosing 1024 bits despite the GameCube being 512 bits more than a decade ago?

choosing the worst eDRAM configuration instead of the best, or at least the middle one of 4096 bits, despite Renesas saying that the Wii U uses the best of the best?

 

talk about pointless

8192 bits

LOL. You know that these corporations don't care about you. Quit defending the Wii U's specs and start accepting reality. Nintendo obviously cheaped out on the Wii U hardware as a whole. Just deal with it.

Why don't you deal with the fact that we don't care about that corporation, and you can shove that up your ass; then deal with the fact that we are not defending the Wii U specs, because the specs are not known, while you just state rumors/speculation as facts, so you are the prick who is not accepting reality. Megafenix and I are doing a freaking reality check on you, yet you are in denial. And so what if Nintendo cheaped out on this or that, they are not the only ones. I will allow you to think that Nintendo is the only one that cheaped out; live in a fairy tale.

You should deal with the fact that Megafenix and I tried to explain to you, in a polite way, that you are full of shit.

You don't read, you can't comprehend, you stalk people, you don't give evidence, you try to dismiss our evidence with pathetic arguments that are not valid, then you spread Fear, Uncertainty and Doubt, and when you get pwned you call your dad/brother/boyfriend/husband (Pemalite).

You are pathetic... Ignorant a**hole of a person.

 

Moderated - Kresnik.



fatslob-:O said:
megafenix said:
fatslob-:O said:
megafenix said:
fatslob-:O said:
megafenix said:
ports will always make the Wii U underperform unless the developers change the source code to reallocate resources from main RAM to cache or eDRAM, and of course, they also have to leave the sound processing to the DSP and use the other core of the Wii U CPU for other jobs to speed things up

of course this takes time and effort and most of them don't bother doing it

This sounds like damage control for the Wii U.

If the Wii U truly had more "power" these ports wouldn't be struggling, because a game will automatically take advantage of more powerful hardware, such as on the PC. It wouldn't take any effort whatsoever if the hardware was truly more powerful.

Clearly the Wii U fails to provide the much needed bandwidth, fillrate, and processing power for games like COD and Batman.

it doesn't really matter how powerful it is if the source code doesn't take advantage of the system

the port comes from the Xbox 360, so what does the source code take into account?

only 1 megabyte of cache

only 10 megabytes of eDRAM

even if the Wii U has triple that, the source code doesn't use it since it's a port from the Xbox 360; everything that could have been fitted into the extra cache and eDRAM goes directly to main RAM, since the source code tells the system to do so

so, to solve it, developers have to reallocate the resources, but most of them don't bother since they are in a hurry

the PS4 doesn't have to struggle with this since it doesn't have eDRAM, just a big pool of GDDR5 main RAM, so no matter how lazy the developers are, making a crap port is very unlikely

You need to understand that there are more factors to a game's performance than memory usage!

It is not memory usage alone that makes a game perform better. There are other things, like fillrate, which determines how fast textures can be written to a surface and how many pixels can be pushed out, or the processing power required to calculate animations and lighting as well, but none of those matter if they are underutilized due to low bandwidth either.

The Wii U has clearly demonstrated a lack of these elements altogether in some of the games shown.

 

you underestimate the eDRAM too much; unlike the Xbox 360, it's not just for the framebuffer, the Wii U also uses the eDRAM for textures, for doing intensive CPU work as if through an L3 cache, etc.

the problem is that when developers don't make good use of the eDRAM and use too much RAM, the main RAM bottlenecks the GPU and CPU capabilities, that's what's going on

imagine an HD 8800 with only 128 megabytes of VRAM

does it matter? can't we just rely on the main RAM?

I'll say it one more time and never again in this thread. You cannot rely on the eDRAM every time, because it is only used for caching frequently accessed data. There will be times when the eDRAM misses, and that's when the Wii U is going to have to access the main memory and only the main memory. The Wii U cannot be fed with just 32MB of data; it needs more than that, and accessing the main memory is important for that operation.

There are other bottlenecks in the Wii U, such as the fillrates and processing power too! The Wii U is theoretically weaker than the PS360 all around!
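To put the cache-miss point in rough numbers (purely illustrative: the hit rates are invented, 70.4 GB/s is the eDRAM figure being argued about in this thread, and 12.8 GB/s is the commonly cited Wii U main DDR3 bandwidth):

# Purely illustrative: crude weighted mix of eDRAM hits and main-memory misses
EDRAM_BW_GB_S = 70.4   # the eDRAM figure under debate in this thread
MAIN_BW_GB_S  = 12.8   # commonly cited Wii U main DDR3 bandwidth

def rough_effective_bw(hit_rate):
    # real memory systems behave far less linearly than this; it only shows the trend
    return hit_rate * EDRAM_BW_GB_S + (1 - hit_rate) * MAIN_BW_GB_S

for hit_rate in (1.0, 0.9, 0.5):
    print(hit_rate, "->", round(rough_effective_bw(hit_rate), 1), "GB/s")
# 1.0 -> 70.4, 0.9 -> 64.6, 0.5 -> 41.6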

http://www.techpowerup.com/gpudb/1919/xbox-360-gpu.html

Look at the specifications of the Xbox 360's GPU; if the Wii U's GPU has more of anything, it won't be used if the game is a port of an Xbox 360 build that uses a specific number of SPUs, TMUs, ROPs and CUs. You also ignore differences in the hardware that cause further differences; the Xbox 360's and the Wii U's eDRAM operate differently in a couple of ways. The ROPs are embedded into the eDRAM on the Xbox 360, while on the Wii U they are not! That's a major difference...

The only reason you can't get the picture about the Wii U's hardware is that you don't look at the differences in architecture and how they operate...

If the Wii U has fewer SPUs, TMUs, ROPs or CUs than the Xbox 360 then it would most likely not perform well, since the build of that game is designed to use a specific amount of each. If the game uses just 10MB of eDRAM on the GPU then it will only use 10MB, with all the tricks that were needed on the Xbox 360 and that are not needed on the Wii U, yet those tricks may cause issues for the Wii U's GPU, which is different in every way.

You are basically forcing a premise that, let's say, code for Itanium should work on x86-64 because it is 64-bit code... It won't, the architecture is different.

Can you finally understand now?



Holy freaking Christ that's a lot of text, eyeofcore. XD
Maybe trim the quote trees, eliminate the stuff we've already covered? That's just too much.