
Forums - Nintendo Discussion - FAST Racing NEO powered by 2nd generation engine for Wii U supports and uses 4k-8k textures

fatsob, are you interested in getting a Wii U? You seem to be very interested.




I would love it if these guys can match the best of 360/PS3 on my Wii U. Heck, a Metroid with Halo 4 graphics, or a Zelda that matches TLOU, would make me very happy, especially considering the Wii U has only been out for a year.



Wyrdness said:
fatslob-:O said:

The clowns didn't school me. Unfortunately for yourself those clowns got schooled by others too in this thread. 

http://www.ign.com/boards/threads/official-wii-u-lobby-specs-graphics-power-thread.452775697/page-195#post-483763771

I guess it's impossible for someone like yourself to tell, because after all you don't speak Latin LOL. Nobody here knows who's the pope or who's the educated one, so why don't you come back and get schooled before calling others out, when a fraud like yourself doesn't know LOL. BTW have fun agreeing with a clown like eyeofcore, who got called out by everyone at AnandTech LMAO. 

http://forums.anandtech.com/showthread.php?p=35562063

http://forums.anandtech.com/showthread.php?t=2346766

Hynad didn't disprove ANYTHING in that thread. Clearly someone here has a very short memory of what happened. 


Oh yes they did, and they did it quite brutally. He asked you a question which you completely tried to dodge. Hynad made the point that even with lower specs, consoles run games at a level that PCs need significantly higher specs to match, and all in all he's right; his point wasn't really disproved, so learn to read. Point to where I mentioned eyeofcore, or is that another reading problem you have? If it is, I'm disappointed in the village elder's failure to ensure his follower can read. As well as the village elder, you have a man crush on this eyeofcore, huh?

He didn't ask me that question, and besides, they have no sources for their own claims either. Hynad could be wrong, because consoles practically have no more advantages left for them to be deemed more efficient than PCs, but once again you jump to the same conclusion as everyone else when you let John Carmack do the speaking for you. 

FYI you couldn't even read your own posts LOL. "Ironic that the clowns are schooling you in bandwidth and even giving you clues exposing your own ignorance, what's worse the clown or the person schooled by them?"

I wonder if you're going to backtrack that statement, because obviously you were referring to those two clowns. Is that hate consuming you? LOL

Have fun with that neverland gaming rig you've got going there LOL. "I game on PC." LMAO That statement will haunt you till the end of the earth because you've obviously shown ignorance on the subject no matter what. BTW even if you were referring to the other clown megafeenix it's no use, cause they both agree with each other LOL.

I guess having nothing to play must suck hard for you, eh? LMAO

How ironic that a man from the UK doesn't even know that he was referencing the two jesters LOL. A place that prides itself on being the origin of the language. Bwahahaha. You couldn't even get the simple concept of referencing! I guess you need to be taught a hard lesson on the English language in the UK's own schools, which are masters of it LOL. A fraud that keeps arguing at the end of the day is still a fraud that knows nothing. 



AgentZorn said:
fatsob, are you interested in getting a Wii U? You seem to be very interested.

No sir, I'm only interested in knowing about its hardware, sir. LOL

Once the games come I'll eventually pick it up and enjoy it unlike others here arguing something pointless. 



fatslob-:O said:
AgentZorn said:
fatsob, are you interested in getting a Wii U? You seem to be very interested.

No sir, I'm only interested in knowing about its hardware, sir. LOL

Once the games come I'll eventually pick it up and enjoy it unlike others here arguing something pointless. 


like wii u edram being only 70gb/s, which suggests an edram of 2048 bits despite renesas not supporting it?

saying things like there is little use for edram since it's only 32 megabytes, despite eyeofcore schooling you, no sorry, i mean, shinen schooling you?

choosing 1024 bits despite gamecube being 512 bits more than a decade ago?

choosing the worst edram configuration instead of the best, or at least the middle one of 4096 bits, despite renesas saying that wii u uses the best of the best?

 

talk about pointless

8192 bits
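For anyone following the numbers: the peak figures being thrown around are just bus width times clock. Here's a minimal sketch of that arithmetic (assuming a single-data-rate bus and the 550 MHz GPU clock cited later in the thread; a DDR bus would double every figure):

```python
def peak_bandwidth_gbps(bus_width_bits: int, clock_mhz: float) -> float:
    """Peak theoretical bandwidth in GB/s for a single-data-rate bus."""
    return bus_width_bits / 8 * clock_mhz * 1e6 / 1e9

# Bus widths debated in this thread, at a 550 MHz clock:
for bits in (512, 1024, 2048, 4096, 8192):
    print(f"{bits:>5} bits -> {peak_bandwidth_gbps(bits, 550):.1f} GB/s")
```

By this formula 8192 bits at 550 MHz works out to ~563 GB/s, matching the figure above, while 1024 bits gives ~70 GB/s; the 35 GB/s quoted elsewhere in the thread would correspond to a 512-bit bus.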



megafenix said:
fatslob-:O said:
AgentZorn said:
fatsob, are you interested in getting a Wii U? You seem to be very interested.

No sir, I'm only interested in knowing about its hardware, sir. LOL

Once the games come I'll eventually pick it up and enjoy it unlike others here arguing something pointless. 


like wii u edram being only 70gb/s, which suggests an edram of 2048 bits despite renesas not supporting it?

saying things like there is little use for edram since it's only 32 megabytes, despite eyeofcore schooling you, no sorry, i mean, shinen schooling you?

choosing 1024 bits despite gamecube being 512 bits more than a decade ago?

choosing the worst edram configuration instead of the best, or at least the middle one of 4096 bits, despite renesas saying that wii u uses the best of the best?

 

talk about pointless

8192 bits

LOL You know that these corporations don't care about you. Quit defending the Wii U's specs and start accepting reality. Nintendo obviously cheaped out on the Wii U hardware as a whole. Just deal with it. 



megafenix said:
Pemalite said:
megafenix said:
Pemalite said:
megafenix said:

well, i would like to think the way you do, but 2ghz for the edram is unlikely, nec reported 40nm edram at a maximum of 800mhz in 2007


You're getting yourself confused.
I never stated the actual eDRAM runs at 2GHz.


sorry, it's just that you got many things mixed up here and there and i couldn't get what you tried to mean

could you simplify it and make your point?

what does that have to do with the wii u edram anyway?

gamecube was capable of 512 bits, why would wii u bottleneck with more than 1024 bits anyway?

and isn't that like saying that nintendo selected the worst edram instead of the best, like the 8192 bits or 4096 bits?

isn't 1024 bits for wii u edram too short compared to the main ddr3 ram of 51.2gb/s in current standard pcs, or even the xbox one esram of 200gb/s?

That was simplified.

And you brought up the Xbox.

As for the eDRAM in the Wii U, it's not the best and it's also not the worst. If Nintendo wanted the best, they would have paid a lot of cash for it and it wouldn't have been only 32MB; the cost of the console could have gone up as a result, same with TDP and power requirements.
It is however *good enough* for everything the Wii U currently needs.

As for the rest, I explained all that in my last couple of posts, I don't feel inclined to explain that all again, nor do I have the time currently. (I.E. I'll get back to you in 3-4 hours if need be.)


pathetic, so you like to mess things up so only you can understand yourself, and say that others who don't get what you say are fools?

yea right

i was being sarcastic lol

sandy bridge and your bla bla bla, just going off on a tangent

just look at gamecube, 512 bits dude, why after more than a decade would you suggest only double that?

yea, i brought up the xbox, so what?

is the wii u edram on a separate die like the xbox gpu was with its edram?

no dude, it's on the same die as the gpu, just like gamecube's flipper, only this time we have lots of megabytes and a newer edram design

dont joke around dude, that's totally illogical

and again, renesas says best edram, which is 8192 bits

shinen says lots of bandwidth on the edram

1024 bits will give you just 35gb/s since the clock speed with the gpu is 550mhz, obviously it's to have coherency just like gamecube did with its embedded memory

you cant get 70gb/s when you dont have the 2048 bits option, and even that is very little

choice is 8192 bits, 563gb/s

the gpu is able to handle that


Bits were always a marketing term in the past, up to the Dreamcast. It was never an actual measurement when it came to game consoles. Bits have not really mattered since the GBA and have always been worthless as a spec. It's not an OS.



fatslob-:O said:
megafenix said:
fatslob-:O said:
AgentZorn said:
fatsob, are you interested in getting a Wii U? You seem to be very interested.

No sir, I'm only interested in knowing about its hardware, sir. LOL

Once the games come I'll eventually pick it up and enjoy it unlike others here arguing something pointless. 


like wii u edram being only 70gb/s, which suggests an edram of 2048 bits despite renesas not supporting it?

saying things like there is little use for edram since it's only 32 megabytes, despite eyeofcore schooling you, no sorry, i mean, shinen schooling you?

choosing 1024 bits despite gamecube being 512 bits more than a decade ago?

choosing the worst edram configuration instead of the best, or at least the middle one of 4096 bits, despite renesas saying that wii u uses the best of the best?

 

talk about pointless

8192 bits

LOL You know that these corporations don't care about you. Quit defending the Wii U's specs and start accepting reality. Nintendo obviously cheaped out on the Wii U hardware as a whole. Just deal with it. 


no, i am not defending the specs, just exposing facts

you should stop deceiving people and lowering the actual specs

come on, does it really hurt if wii u has 500 gigaflops in the gpu and 563 gb/s of edram bandwidth?

that's not going to make it better than xbox one and ps4, yet it clearly sets it atop the current consoles

what you do is the contrary, you lower the specs so much that they aren't believable



Ryudo said:
megafenix said:
Pemalite said:
megafenix said:
Pemalite said:
megafenix said:

well, i would like to think the way you do, but 2ghz for the edram is unlikely, nec reported 40nm edram at a maximum of 800mhz in 2007


You're getting yourself confused.
I never stated the actual eDRAM runs at 2GHz.


sorry, it's just that you got many things mixed up here and there and i couldn't get what you tried to mean

could you simplify it and make your point?

what does that have to do with the wii u edram anyway?

gamecube was capable of 512 bits, why would wii u bottleneck with more than 1024 bits anyway?

and isn't that like saying that nintendo selected the worst edram instead of the best, like the 8192 bits or 4096 bits?

isn't 1024 bits for wii u edram too short compared to the main ddr3 ram of 51.2gb/s in current standard pcs, or even the xbox one esram of 200gb/s?

That was simplified.

And you brought up the Xbox.

As for the eDRAM in the Wii U, it's not the best and it's also not the worst. If Nintendo wanted the best, they would have paid a lot of cash for it and it wouldn't have been only 32MB; the cost of the console could have gone up as a result, same with TDP and power requirements.
It is however *good enough* for everything the Wii U currently needs.

As for the rest, I explained all that in my last couple of posts, I don't feel inclined to explain that all again, nor do I have the time currently. (I.E. I'll get back to you in 3-4 hours if need be.)


pathetic, so you like to mess things up so only you can understand yourself, and say that others who don't get what you say are fools?

yea right

i was being sarcastic lol

sandy bridge and your bla bla bla, just going off on a tangent

just look at gamecube, 512 bits dude, why after more than a decade would you suggest only double that?

yea, i brought up the xbox, so what?

is the wii u edram on a separate die like the xbox gpu was with its edram?

no dude, it's on the same die as the gpu, just like gamecube's flipper, only this time we have lots of megabytes and a newer edram design

dont joke around dude, that's totally illogical

and again, renesas says best edram, which is 8192 bits

shinen says lots of bandwidth on the edram

1024 bits will give you just 35gb/s since the clock speed with the gpu is 550mhz, obviously it's to have coherency just like gamecube did with its embedded memory

you cant get 70gb/s when you dont have the 2048 bits option, and even that is very little

choice is 8192 bits, 563gb/s

the gpu is able to handle that


Bits were always a marketing term in the past, up to the Dreamcast. It was never an actual measurement when it came to game consoles. Bits have not really mattered since the GBA and have always been worthless as a spec. It's not an OS.

i am not talking about bit wars, that's another matter

here

http://www.segatech.com/gamecube/overview/

"

Texture Cache

*Note: Data bus transfer is now 2.6 GB/sec due to revised specs, and the external 64-bit data-bus runs at 324 MHz now. 

More info on the internal bandwidth of the caches from here:

Both internal memory buffers have a sustained latency of under 5 nanoseconds. The frame and z-buffer memory is capable of 7.68 Gbytes/second of bandwidth. The texture buffer boasts an even faster bandwidth of 10.4 Gbytes/s because it's divided into 32 independent macros, each 16 bits wide for a total I/O of 512 bits. This gives each macro its own address bus, so that all 32 macros can be accessed simultaneously, said Mark-Eric Jones, vice president of marketing for Mosys. 

"

 

if you go to the renesas website, you will clearly see 1024 bits, 4096 bits and 8192 bits for 32 megabytes, just transform the megabits to megabytes and multiply by the necessary number of macros to get 32 megabytes
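The macro arithmetic in the GameCube quote above can be checked directly: 32 independent macros, each 16 bits wide, give a 512-bit aggregate I/O, and at the Flipper GPU's 162 MHz clock (an assumed figure; the clock isn't stated in the quote) that reproduces the 10.4 GB/s texture-buffer number:

```python
MACROS = 32          # independent memory macros in the texture buffer
BITS_PER_MACRO = 16  # I/O width of each macro
CLOCK_HZ = 162e6     # Flipper GPU clock (assumed; not stated in the quote)

total_bus_bits = MACROS * BITS_PER_MACRO             # aggregate 512-bit I/O
bandwidth_gbps = total_bus_bits / 8 * CLOCK_HZ / 1e9

print(total_bus_bits, round(bandwidth_gbps, 1))      # 512 bits, ~10.4 GB/s
```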

 

talking about bit wars, this should prove to be funny

did you know that the game boy was the most sold console of the 90s?

do you think virtual boy has been the worst console?

maybe we should go back to the past

enjoy

http://www.youtube.com/watch?v=_u5dtBtG9yU



fatslob-:O said:
megafenix said:
fatslob-:O said:
AgentZorn said:
fatsob, are you interested in getting a Wii U? You seem to be very interested.

No sir, I'm only interested in knowing about its hardware, sir. LOL

Once the games come I'll eventually pick it up and enjoy it unlike others here arguing something pointless. 


like wii u edram being only 70gb/s, which suggests an edram of 2048 bits despite renesas not supporting it?

saying things like there is little use for edram since it's only 32 megabytes, despite eyeofcore schooling you, no sorry, i mean, shinen schooling you?

choosing 1024 bits despite gamecube being 512 bits more than a decade ago?

choosing the worst edram configuration instead of the best, or at least the middle one of 4096 bits, despite renesas saying that wii u uses the best of the best?

 

talk about pointless

8192 bits

LOL You know that these corporations don't care about you. Quit defending the Wii U's specs and start accepting reality. Nintendo obviously cheaped out on the Wii U hardware as a whole. Just deal with it. 

And do you think nintendo fans really care? Sure, they argue about the wii u being much more powerful than current gen, just to feel better about the wii u purchase, but they really don't care. Just look at 3d mario world, it's a nice looking game by all means, but nothing close to being top tier when you compare it to 360/ps3, yet nintendo fans feel it's next gen cause it's mario, finally in HD. Now replace mario and put knack in those mario 3d world pics and everybody would be saying how ugly it is.