
Forums - Nintendo Discussion - FAST Racing NEO powered by 2nd generation engine for Wii U supports and uses 4K-8K textures

Ninjahound101 said:
fatslob-:O said:
curl-6 said:
fatslob-:O said:
The real question here is whether the game is going to show all 4K-8K textures at once? I presume it won't, because the Wii U is severely lacking in memory bandwidth, memory size, and TMUs to even attempt showing every detail of those textures. The engine they have has probably been modified to support John Carmack's megatexture technology, which dynamically streams textures at different resolutions.

Shin'en have said on Twitter that memory size was not a problem, as they could compress a 4K texture down to 10MB.
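For a sense of scale, here is a rough back-of-the-envelope sketch (the 4096×4096 size and BC1/DXT1-style block compression are my assumptions; Shin'en haven't said which format they actually use):

```python
# Rough texture-size arithmetic; format assumptions are mine, not Shin'en's.
width = height = 4096                      # one "4K" texture
uncompressed = width * height * 4          # RGBA8: 4 bytes per texel
bc1 = width * height * 0.5                 # BC1/DXT1: 0.5 bytes per texel
bc1_with_mips = bc1 * 4 / 3                # a full mip chain adds roughly a third

for label, size in [("RGBA8", uncompressed),
                    ("BC1", bc1),
                    ("BC1 + mips", bc1_with_mips)]:
    print(f"{label}: {size / 2**20:.1f} MiB")
# RGBA8 ~64 MiB, BC1 ~8 MiB, BC1 + mips ~10.7 MiB -- in the ballpark of the
# "10MB" figure quoted above.
```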

As for memory bandwidth they previously said that:

"Theoretical RAM bandwidth in a system doesn’t tell you too much because GPU caching will hide a lot of this latency."

Just how many 4K-8K textures are they going to use then? Once you get a lot of objects on screen, the complexity increases.

As for Shin'en's overstatement about memory bandwidth, there's only so much you can do with 32MB. The reason bandwidth was important in the first place was to feed the GPU; otherwise the functional units start to get under-utilized. TMUs, shaders, ROPs, and everything else in the GPU are extremely dependent on it. The reason the X1 wasn't able to achieve 1080p for some multiplatform titles or exclusives is that main memory bandwidth is a bottleneck (aside from the lowered amount of ROPs, of course). How else does the GPU get fed with all its other data? You cannot keep constantly relying on the eDRAM to feed the GPU, much like how the X1 relies on the eSRAM! It has to eventually access main memory, and only main memory, because that is where most of the data resides. The purpose of caching is to SAVE BANDWIDTH by storing frequently accessed data. It was not meant to COMPLETELY FEED THE GPU.
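To make the caching point concrete, here is a minimal sketch of how a hit rate in the eDRAM/cache reduces, but does not eliminate, the traffic that still has to come from main memory (all numbers are illustrative, not real Wii U figures):

```python
def main_memory_traffic_gbs(requested_gbs, hit_rate):
    """Traffic that still has to be served by main RAM after the
    cache/eDRAM absorbs the hits."""
    return requested_gbs * (1.0 - hit_rate)

# Illustrative only: the GPU wants 40 GB/s of texture/buffer traffic.
for hit_rate in (0.0, 0.5, 0.9):
    print(f"hit rate {hit_rate:.0%}: "
          f"{main_memory_traffic_gbs(40, hit_rate):.1f} GB/s from main RAM")
```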

Now don't get me wrong! I'm not saying that the Wii U isn't capable of handling 4K-8K textures, but it shouldn't be able to handle them on a regular basis, given that it likely lacks the TMUs and everything else I stated before. Do not fret about this issue; there are other ways of solving it, as I said before. The megatexture technology introduced by John Carmack in RAGE resolves a lot of issues around the Wii U's lack of bandwidth and TMUs by streaming only the highest-resolution texture data actually required for the assets in a scene. That conserves a lot of texture fillrate and bandwidth: the level of detail gets scaled back when objects are far away and scaled up near the camera, so objects close to the camera get the highest level of detail.
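A minimal sketch of that streaming idea (the names and budget are hypothetical; this is not Shin'en's or id's actual implementation): request the finest mip level each visible asset wants, then fall back to coarser mips once a fixed memory budget runs out.

```python
def plan_streaming(requests, budget_mb):
    """requests: list of (asset, desired_mip_size_mb, priority).
    Returns how many MB of texture data each asset actually gets resident."""
    resident, remaining = {}, budget_mb
    # Serve the most important assets (closest / largest on screen) first.
    for asset, size_mb, _prio in sorted(requests, key=lambda r: -r[2]):
        while size_mb > remaining and size_mb > 1:
            size_mb /= 4              # drop one mip level: quarter the memory
        if size_mb <= remaining:
            resident[asset], remaining = size_mb, remaining - size_mb
        else:
            resident[asset] = 0.0     # budget exhausted for this asset
    return resident

print(plan_streaming([("player_ship", 10, 1.0),
                      ("track_near", 10, 0.9),
                      ("track_far", 10, 0.2)], budget_mb=16))
```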

The car/spaceship models will probably always be 4K-8K (maybe 8K when few are on screen and 4K when there are more), and other significant assets will be in those resolutions, while the rest will be 1080p as far as I can tell.

I was expecting just the car/spaceship and part of the track to have 8K textures, whereas everything else within 20 meters would have 4K textures and everything from 20-50 meters would sport 2K textures. Everything past 50 meters would just have 1080p to 720p textures.
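That distance-banded guess is easy to express directly; a minimal sketch using the 20 m / 50 m thresholds from the post above (the bands are this poster's speculation, not anything Shin'en has confirmed):

```python
def texture_resolution(distance_m, is_hero_asset=False):
    """Pick a texture resolution band from camera distance, using the
    speculative 20 m / 50 m thresholds suggested above."""
    if is_hero_asset:        # player ship, the track section right under it
        return "8K"
    if distance_m < 20:
        return "4K"
    if distance_m < 50:
        return "2K"
    return "1080p-720p"

for d in (5, 30, 80):
    print(f"{d} m -> {texture_resolution(d)}")
```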



fatslob-:O said:
megafenix said:


here we go again

"

The easiest way that I can explain this is that when you take each unit of time that the Wii U eDRAM can do work with separate tasks as compared to the 1 Gigabyte of slower RAM, the amount of actual Megabytes of RAM that exist during the same time frame is superior with the eDRAM, regardless of the fact that the size and number applied makes the 1 Gigabyte of DDR3 RAM seem larger. These are units of both time and space. Fast eDRAM that can be used at a speed more useful to the CPU and GPU have certain advantages, that when exploited, give the console great gains in performance.

The eDRAM of the Wii U is embedded right onto the chip logic, which for most intent and purposes negates the classic In/Out bottleneck that developers have faced in the past as well. Reading and writing directly in regard to all of the chips on the Multi Chip Module as instructed.

"

Do you even read what you post?

This crappy post has absolutely nothing to do with what I stated.


really?

the amount of actual Megabytes of RAM that exist during the same time frame is superior with the eDRAM

 

By the way, any luck figuring out why the Xbox struggles with 1080p?

Even though I gave you a clue and even mentioned your daddy Sony?



megafenix said:
fatslob-:O said:
megafenix said:


here we go again

"

The easiest way that I can explain this is that when you take each unit of time that the Wii U eDRAM can do work with separate tasks as compared to the 1 Gigabyte of slower RAM, the amount of actual Megabytes of RAM that exist during the same time frame is superior with the eDRAM, regardless of the fact that the size and number applied makes the 1 Gigabyte of DDR3 RAM seem larger. These are units of both time and space. Fast eDRAM that can be used at a speed more useful to the CPU and GPU have certain advantages, that when exploited, give the console great gains in performance.

The eDRAM of the Wii U is embedded right onto the chip logic, which for most intent and purposes negates the classic In/Out bottleneck that developers have faced in the past as well. Reading and writing directly in regard to all of the chips on the Multi Chip Module as instructed.

"

Do you even read what you post?

This crappy post has absolutely nothing to do with what I stated.


really?

the amount of actual Megabytes of RAM that exist during the same time frame is superior with the eDRAM

Again, you didn't even address the initial concerns I had in the first comment of this thread.

That line was worthless and unnecessary as well. Another crappy post from a corporate apologist like yourself, but at this point I'm not really surprised.

User was banned for this post - Kantor



fatslob-:O said:
megafenix said:
fatslob-:O said:
megafenix said:


here we go again

"

The easiest way that I can explain this is that when you take each unit of time that the Wii U eDRAM can do work with separate tasks as compared to the 1 Gigabyte of slower RAM, the amount of actual Megabytes of RAM that exist during the same time frame is superior with the eDRAM, regardless of the fact that the size and number applied makes the 1 Gigabyte of DDR3 RAM seem larger. These are units of both time and space. Fast eDRAM that can be used at a speed more useful to the CPU and GPU have certain advantages, that when exploited, give the console great gains in performance.

The eDRAM of the Wii U is embedded right onto the chip logic, which for most intent and purposes negates the classic In/Out bottleneck that developers have faced in the past as well. Reading and writing directly in regard to all of the chips on the Multi Chip Module as instructed.

"

Do you even read what you post?

This crappy post has absolutely nothing to do with what I stated.


really?

the amount of actual Megabytes of RAM that exist during the same time frame is superior with the eDRAM

Again, you didn't even address the initial concerns I had in the first comment of this thread.

That line was worthless and unnecessary as well. Another crappy post from a corporate apologist like yourself, but at this point I'm not really surprised.


Well, it's not me you should get angry at.

It's Shin'en. Why don't you tweet them about what you think of their comments?

And why did you bring up Haswell anyway?

I didn't know they also had the eDRAM on the same die as the GPU or CPU.

The Wii U has the eDRAM inside the GPU; the only thing that accesses it as a cache is the CPU (probably between 2 and 4 megabytes, like the 476FP L3 cache).

Besides, it's not that impressive; even the GameCube had 512 bits for its embedded memory.



megafenix said:


Well, it's not me you should get angry at.

It's Shin'en. Why don't you tweet them about what you think of their comments?

Once again, I already addressed the issue a few pages ago. Shin'en is obviously damage controlling, much like Cevat Yerli did for the X1 when developing Ryse: Son of Rome. The ones who should be angry are you and a few other Nintendo apologists like yourself and eyeofcore, because they were being partially dishonest about the capabilities of the Wii U.



fatslob-:O said:
megafenix said:


Well, it's not me you should get angry at.

It's Shin'en. Why don't you tweet them about what you think of their comments?

Once again, I already addressed the issue a few pages ago. Shin'en is obviously damage controlling, much like Cevat Yerli did for the X1 when developing Ryse: Son of Rome. The ones who should be angry are you and a few other Nintendo apologists like yourself and eyeofcore, because they were being partially dishonest about the capabilities of the Wii U.


Dishonest, huh?

Yeah, right.

Once again the GameCube has schooled you.

1024 bits is dishonest, especially since the GameCube had 512 bits and the 360 had 4096 bits.

Ports are impossible with eDRAM on a 1024-bit bus, since that only gives you 35GB/s.

I can beat that with just 4 megabytes of the GameCube's old embedded memory.

Already, not just Shin'en but also Renesas has pointed out that the eDRAM should be 8192 bits.

Sorry, but they are more trustworthy than you.

By the way, the Haswell comparison is a joke, especially since Haswell has its eDRAM on a separate die and not on the same die like the Wii U does.

Haswell's eDRAM is limited by an external 512-bit bus; the Wii U's eDRAM is not limited by such a thing.
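For reference, a bus width on its own isn't a bandwidth figure; you also need a clock (and transfers per clock). A minimal sketch of the arithmetic behind numbers like the 35GB/s above (the clocks are illustrative; 550 MHz is the commonly cited Wii U GPU clock, not an official figure):

```python
def peak_bandwidth_gbs(bus_width_bits, clock_mhz, transfers_per_clock=1):
    """Peak bandwidth = bus width in bytes * clock * transfers per clock."""
    return bus_width_bits / 8 * clock_mhz * 1e6 * transfers_per_clock / 1e9

print(peak_bandwidth_gbs(1024, 275))   # ~35.2 GB/s: a 1024-bit bus at half clock
print(peak_bandwidth_gbs(1024, 550))   # ~70.4 GB/s: the same bus at 550 MHz
print(peak_bandwidth_gbs(8192, 550))   # ~563 GB/s: the 8192-bit claim at 550 MHz
```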



megafenix said:
fatslob-:O said:
megafenix said:


Well, it's not me you should get angry at.

It's Shin'en. Why don't you tweet them about what you think of their comments?

Once again, I already addressed the issue a few pages ago. Shin'en is obviously damage controlling, much like Cevat Yerli did for the X1 when developing Ryse: Son of Rome. The ones who should be angry are you and a few other Nintendo apologists like yourself and eyeofcore, because they were being partially dishonest about the capabilities of the Wii U.


Dishonest, huh?

Yeah, right.

Once again the GameCube has schooled you.

1024 bits is dishonest, especially since the GameCube had 512 bits and the 360 had 4096 bits.

Ports are impossible with eDRAM on a 1024-bit bus, since that only gives you 35GB/s.

I can beat that with just 4 megabytes of the GameCube's old embedded memory.

Already, not just Shin'en but also Renesas has pointed out that the eDRAM should be 8192 bits.

Sorry, but they are more trustworthy than you.

By the way, the Haswell comparison is a joke, especially since Haswell has its eDRAM on a separate die and not on the same die like the Wii U does.

Haswell's eDRAM is limited by an external 512-bit bus; the Wii U's eDRAM is not limited by such a thing.

It doesn't matter whether the die is separate or not. You still don't understand the purpose of a cache! I wonder why you aren't answering my question as to why X1 multiplat games are having trouble achieving 1080p?

Thanks for giving the shittiest post on the planet. Why don't you take the fight to Tom's Hardware or AnandTech? Oh right, because you'll get pwned badly like eyeofcore did, LMAO.



fatslob-:O said:
megafenix said:
fatslob-:O said:
megafenix said:


Well, it's not me you should get angry at.

It's Shin'en. Why don't you tweet them about what you think of their comments?

Once again, I already addressed the issue a few pages ago. Shin'en is obviously damage controlling, much like Cevat Yerli did for the X1 when developing Ryse: Son of Rome. The ones who should be angry are you and a few other Nintendo apologists like yourself and eyeofcore, because they were being partially dishonest about the capabilities of the Wii U.


Dishonest, huh?

Yeah, right.

Once again the GameCube has schooled you.

1024 bits is dishonest, especially since the GameCube had 512 bits and the 360 had 4096 bits.

Ports are impossible with eDRAM on a 1024-bit bus, since that only gives you 35GB/s.

I can beat that with just 4 megabytes of the GameCube's old embedded memory.

Already, not just Shin'en but also Renesas has pointed out that the eDRAM should be 8192 bits.

Sorry, but they are more trustworthy than you.

By the way, the Haswell comparison is a joke, especially since Haswell has its eDRAM on a separate die and not on the same die like the Wii U does.

Haswell's eDRAM is limited by an external 512-bit bus; the Wii U's eDRAM is not limited by such a thing.

It doesn't matter whether the die is separate or not. You still don't understand the purpose of a cache! I wonder why you aren't answering my question as to why X1 multiplat games are having trouble achieving 1080p?

Thanks for giving the shittiest post on the planet. Why don't you take the fight to Tom's Hardware or AnandTech? Oh right, because you'll get pwned badly like eyeofcore did, LMAO.


It doesn't matter?

Yeah, right.

Tell that to the old Xbox; if it weren't on a separate die, there wouldn't be an external bus limitation of 32GB/s.

I understand perfectly well what a cache is. What you don't understand is that 32 megabytes is far more than the typical cache that standard PC GPUs pack.

I am not saying it's not used as a cache, just that the cache is so big compared to GPU standards (like 8KB or 64KB for the texture cache or global data share, etc.) that it could be considered the GPU's RAM.

Analogies, dude, analogies.

They have trouble achieving 1080p because using eDRAM or eSRAM is trickier than RAM. Besides, the developers of Ryse: Son of Rome already said they are using the eSRAM more for rich environments and other things, using G-buffers and other stuff like MSAA.

If they were to remove those other buffers and only used the eSRAM for the 1080p framebuffer, there would be no such struggling; of course there would be sacrifices, like lower quality textures, etc.

We don't even know if they use compression for textures; if they don't, they are using too much memory for something that could have used less.

Again, ask daddy Sony why it's trickier to use eDRAM or eSRAM than main RAM.
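As a rough illustration of why those extra buffers crowd out a native 1080p target, here is a back-of-the-envelope footprint for a generic deferred-shading setup at 1080p against a 32 MB eSRAM budget (the render-target layout is my guess, not Crytek's actual configuration):

```python
# Generic deferred-renderer G-buffer guess at 1080p vs a 32 MB eSRAM budget.
width, height = 1920, 1080
bytes_per_pixel = [
    4,   # albedo (RGBA8)
    4,   # normals (RGBA8 / 10-10-10-2)
    4,   # material / specular parameters
    4,   # depth + stencil (D24S8)
    8,   # HDR light accumulation (RGBA16F)
]
total_mb = width * height * sum(bytes_per_pixel) / 2**20
print(f"{total_mb:.1f} MB of render targets vs a 32 MB eSRAM budget")
# ~47.5 MB: it does not fit, so either the resolution or some of the buffers
# have to move out of the fast memory.
```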



megafenix said:

 

 

Pemalite vs. Renesas.

Pemalite vs. Shin'en.

Sorry, can't believe him; besides, the GameCube had 512 bits of embedded memory.

A decade later and only 1024 bits?

Yeah, right.

The Xbox 360 was 4096 bits; anything below that is bullshit.

8192 bits is the right answer. Didn't Shin'en already school you and your friend?

Renesas: their latest technology, to the point that making it anywhere else would be difficult.

Shin'en: saying that the Wii U eDRAM bandwidth is huge and scary.


For the record, I don't contradict Shin'en or Renesas; you just don't seem to understand that it's the internal bandwidth that's being discussed.
The external bandwidth that connects the eDRAM to the GPU and/or CPU is substantially lower.

Allow me to break it down for you some more, again.

With the eDRAM you have a bunch of functional units; think of these as people's houses. For all the functional units to talk to each other and pass on work, you need a big, fat, wide road connecting them all; that's where the internal bandwidth comes into play.
Typically, internal bandwidth is stupidly fast and low latency. (It comes in different implementations, such as a ring or hub bus, etc.)
However, once it comes time for all the work the functional units have done to be passed on to the GPU, it has to exit through a much, much slower toll booth onto a smaller, skinnier road.
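A minimal sketch of that bottleneck (numbers are illustrative, not measured figures for any of these chips): however large the internal figure is, sustained throughput to the GPU is capped by the external link.

```python
def effective_throughput_gbs(internal_gbs, external_gbs):
    """Data produced inside the eDRAM reaches the GPU no faster than
    the slower, external link allows."""
    return min(internal_gbs, external_gbs)

# Illustrative only: a huge internal figure still funnels through the external bus.
print(effective_throughput_gbs(internal_gbs=563.0, external_gbs=70.0))  # -> 70.0
```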



--::{PC Gaming Master Race}::--

megafenix said:
fatslob-:O said:
megafenix said:
fatslob-:O said:
megafenix said:


Well, it's not me you should get angry at.

It's Shin'en. Why don't you tweet them about what you think of their comments?

Once again, I already addressed the issue a few pages ago. Shin'en is obviously damage controlling, much like Cevat Yerli did for the X1 when developing Ryse: Son of Rome. The ones who should be angry are you and a few other Nintendo apologists like yourself and eyeofcore, because they were being partially dishonest about the capabilities of the Wii U.


Dishonest, huh?

Yeah, right.

Once again the GameCube has schooled you.

1024 bits is dishonest, especially since the GameCube had 512 bits and the 360 had 4096 bits.

Ports are impossible with eDRAM on a 1024-bit bus, since that only gives you 35GB/s.

I can beat that with just 4 megabytes of the GameCube's old embedded memory.

Already, not just Shin'en but also Renesas has pointed out that the eDRAM should be 8192 bits.

Sorry, but they are more trustworthy than you.

By the way, the Haswell comparison is a joke, especially since Haswell has its eDRAM on a separate die and not on the same die like the Wii U does.

Haswell's eDRAM is limited by an external 512-bit bus; the Wii U's eDRAM is not limited by such a thing.

It doesn't matter whether the die is separate or not. You still don't understand the purpose of a cache! I wonder why you aren't answering my question as to why X1 multiplat games are having trouble achieving 1080p?

Thanks for giving the shittiest post on the planet. Why don't you take the fight to Tom's Hardware or AnandTech? Oh right, because you'll get pwned badly like eyeofcore did, LMAO.


It doesn't matter?

Yeah, right.

Tell that to the old Xbox; if it weren't on a separate die, there wouldn't be an external bus limitation of 32GB/s.

I understand perfectly well what a cache is. What you don't understand is that 32 megabytes is far more than the typical cache that standard PC GPUs pack.

They have trouble achieving 1080p because using eDRAM or eSRAM is trickier than RAM. Besides, the developers of Ryse: Son of Rome already said they are using the eSRAM more for rich environments and other things, using G-buffers and other stuff like MSAA.

If they were to remove those other buffers and only used the eSRAM for the 1080p framebuffer, there would be no such struggling.

We don't even know if they use compression for textures; if they don't, they are using too much memory for something that could have used less.

Again, ask daddy Sony why it's trickier to use eDRAM or eSRAM than main RAM.

Once again another crappy post.