fatslob-:O said:
curl-6 said:

360's use of eDRAM is terrible, 10MB is just too small for a HD system; in fact, it forced sub-HD resolutions in many games.

Both Shin'en and the dev from that Eurogamer article on early Wii U development have stated that, thanks to the eDRAM, memory bandwidth is not a problem for Wii U.

It may have been an issue when hitting HD resolutions, but developers had a form of solution called "tiling", which allowed them to split render targets and depth buffers into "tiles" small enough to fit in the eDRAM. 

Another unique thing about the Xbox 360's eDRAM was that it had 8 ROPs on the same die, which essentially allowed "free anti-aliasing". This meant that blending operations could happen on a separate die, the eDRAM, rather than eating main memory bandwidth. 
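The arithmetic behind tiling is easy to sketch. Assuming 4 bytes per pixel of colour plus 4 of depth/stencil, and that MSAA multiplies the per-pixel footprint by the sample count (these formats are my assumptions, though they match commonly cited figures), a rough calculation shows why 720p with 4xMSAA needed three tiles in 10 MB:

```python
# Rough sketch: why 10 MB of eDRAM forced tiling (or sub-HD) on Xbox 360.
# Assumes 4 bytes/pixel colour + 4 bytes/pixel depth/stencil, and that
# MSAA multiplies the per-pixel footprint by the sample count.
import math

EDRAM_BYTES = 10 * 1024 * 1024  # 10 MB of eDRAM

def tiles_needed(width, height, msaa=1, bytes_per_pixel=8):
    framebuffer = width * height * bytes_per_pixel * msaa
    return framebuffer, math.ceil(framebuffer / EDRAM_BYTES)

for msaa in (1, 2, 4):
    size, tiles = tiles_needed(1280, 720, msaa)
    print(f"720p {msaa}xMSAA: {size / 2**20:.1f} MB -> {tiles} tile(s)")
```

At 720p with no MSAA the buffers just fit in one pass; 2xMSAA needs two tiles and 4xMSAA needs three, which is where the cost of the "free" anti-aliasing shows up.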

Memory bandwidth is still an issue, seeing as the eDRAM doesn't solve the texturing bandwidth of the Wii U. It essentially limits the amount of high resolution textures the Wii U can have.

Tiling still wasn't as good as actually having the space.

And again, both Shin'en and the dev from the Eurogamer article stated that memory bandwidth was not an issue. Plus, it didn't stop Need for Speed and Trine 2 using better-than-PS360 textures.




curl-6 said:

Tiling still wasn't as good as actually having the space.

32 MB isn't a whole lot either, so it really doesn't do much in the grand scheme of things compared to having 8 extra ROPs ...

And again, both Shin'en and the dev from the Eurogamer article stated that memory bandwidth was not an issue. Plus, it didn't stop Need for Speed and Trine 2 using better-than-PS360 textures.

Shin'en has never clarified the bottleneck of the Wii U ... They have only spoken about its strengths, not its weaknesses. Those games had better mipmapping to mitigate the bandwidth weaknesses, and they weren't much better than the last-generation HD twins.
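The mipmapping point can actually be quantified: a full mip chain only costs about a third more memory, while letting distant surfaces sample much smaller levels, which is where the bandwidth saving comes from. A back-of-envelope sketch (RGBA8 texture size assumed for illustration):

```python
# Back-of-envelope: a full mip chain costs about 1/3 extra memory,
# while letting the GPU sample far-away surfaces from much smaller
# mip levels -- that is where the bandwidth saving comes from.
def mip_chain_bytes(width, height, bytes_per_texel=4):
    total = 0
    while True:
        total += width * height * bytes_per_texel
        if width == 1 and height == 1:
            break
        width, height = max(width // 2, 1), max(height // 2, 1)
    return total

base = 1024 * 1024 * 4                 # 1024x1024 RGBA8 base level
chain = mip_chain_bytes(1024, 1024)
print(f"overhead: {(chain - base) / base:.1%}")   # ~33.3% extra memory
```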





Wait, there's going to be a PS3 version of Bayonetta 2? When was this announced?




 

fatslob-:O said:


Shin'en has never clarified the bottleneck of the Wii U ... They have only spoken about its strengths, not its weaknesses. Those games had better mipmapping to mitigate the bandwidth weaknesses, and they weren't much better than the last-generation HD twins.



32MB is still over three times more than 10MB. Instead of just being a framebuffer, Wii U's eDRAM is available for other tasks requiring fast memory, like some GPU writes and CPU-intensive work.
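A quick back-of-envelope check of the "room to spare" claim, assuming a 32-bit colour buffer and a 32-bit depth/stencil buffer at 1080p (the buffer formats are my assumption, not a confirmed spec):

```python
# Rough check of the "room to spare" claim for Wii U's 32 MB eDRAM,
# assuming 32-bit colour and 32-bit depth/stencil at 1920x1080.
MB = 2**20
edram = 32 * MB

colour = 1920 * 1080 * 4          # one RGBA8 colour buffer
depth  = 1920 * 1080 * 4          # 24-bit depth + 8-bit stencil
spare  = edram - (colour + depth)

print(f"used: {(colour + depth) / MB:.1f} MB, "
      f"spare: {spare / MB:.1f} MB")   # used: 15.8 MB, spare: 16.2 MB
```

Under those assumptions, a 1080p colour plus depth/stencil pair leaves roughly half the pool free for scratch data.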

"Theoretical RAM bandwidth in a system doesn’t tell you too much because GPU caching will hide a lot of this latency" - Shin'en on RAM bandwidth not being a bottleneck.

"This wasn't really a problem for us. The GPU could fetch data rapidly with minimal stalls (via the EDRAM) and we could efficiently pre-fetch" - Dev from the Eurogamer article on the same thing.

Trine 2 improvements:

-       Textures

-       Normal map compression

-       Anti-aliasing

-       Locked screen resolution

-       Physics

-       Water effects

 

Need for Speed improvements:

-       Textures

-       Reflection mapping

-       Framerate



curl-6 said:

32MB is still over three times more than 10MB. Instead of just being a framebuffer, Wii U's eDRAM is available for other tasks requiring fast memory, like some GPU writes and CPU-intensive work.

"Theoretical RAM bandwidth in a system doesn’t tell you too much because GPU caching will hide a lot of this latency" - Shin'en on RAM bandwidth not being a bottleneck.

"This wasn't really a problem for us. The GPU could fetch data rapidly with minimal stalls (via the EDRAM) and we could efficiently pre-fetch" - Dev from the Eurogamer article on the same thing.


It's best suited to just being a framebuffer ... Why would the CPU need that much bandwidth when it only has a 64-bit wide SIMD engine?

As for your second quote, that doesn't matter, because GPUs are meant to work on large data sets, so more bandwidth is more beneficial than lower access times. Despite the fact that the Xbox One can cache its data faster, it still gets thrashed hard by the PS4. 



fatslob-:O said:

It's best suited to just being a framebuffer ... Why would the CPU need that much bandwidth when it only has a 64-bit wide SIMD engine?

As for your second quote, that doesn't matter, because GPUs are meant to work on large data sets, so more bandwidth is more beneficial than lower access times. Despite the fact that the Xbox One can cache its data faster, it still gets thrashed hard by the PS4. 

But it's big enough to have room to spare even after a 1080p frame is stored, and the free space is available to both the CPU and GPU for anything that needs fast memory access.

Basically, you prioritise high speed tasks to the eDRAM, so that the main RAM bandwidth doesn't become an issue as it's primarily used for slower operations.
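That prioritisation idea can be sketched as a toy allocator: give the small fast pool to the most bandwidth-hungry buffers and spill everything else to main RAM. All names, sizes and priorities here are made up for illustration, not actual Wii U allocations:

```python
# Hypothetical sketch of the prioritisation idea: place the most
# bandwidth-hungry allocations in the small fast pool (eDRAM) and
# spill everything else to main RAM. Sizes/priorities are invented.
def place_allocations(requests, fast_capacity):
    """requests: list of (name, size_bytes, bandwidth_priority)."""
    placement, used = {}, 0
    # Try highest-priority requests first; spill the rest.
    for name, size, _prio in sorted(requests, key=lambda r: -r[2]):
        if used + size <= fast_capacity:
            placement[name] = "eDRAM"
            used += size
        else:
            placement[name] = "main RAM"
    return placement

MB = 2**20
requests = [
    ("framebuffer",   8 * MB, 10),
    ("depth_buffer",  8 * MB,  9),
    ("cpu_scratch",   4 * MB,  7),
    ("texture_pool", 64 * MB,  3),   # too big for the pool: main RAM
]
placement = place_allocations(requests, fast_capacity=32 * MB)
print(placement)
```

The framebuffer, depth buffer and CPU scratch land in the fast pool while the large texture pool falls through to main RAM, which is essentially the division of labour being described.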



curl-6 said:

But it's big enough to have room to spare even after a 1080p frame is stored, and the free space is available to both the CPU and GPU for anything that needs fast memory access.

Basically, you prioritise high speed tasks to the eDRAM, so that the main RAM bandwidth doesn't become an issue as it's primarily used for slower operations.

You need more than just a 1080p frame though ... Did you forget about the depth and stencil buffers, which are required for z-buffering and shadowing? 

The only thing the eDRAM can meaningfully prioritise is render targets, which are tied to framebuffer operations ... Great, that'll be enough to saturate the ROPs, but how do I solve bandwidth for texturing purposes? 
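To put a rough number on the texturing question: a heavily hedged back-of-envelope estimate of texture-fetch traffic against the commonly reported ~12.8 GB/s Wii U main-RAM figure. Every input here (overdraw, texture layers, cache-miss rate) is a guess for illustration only:

```python
# Very rough texture-fetch bandwidth estimate, to show how texturing
# leans on main RAM rather than eDRAM. All inputs are ballpark guesses;
# ~12.8 GB/s is the commonly reported Wii U main-RAM bandwidth figure.
def texture_bandwidth_gbs(width, height, fps, layers, bytes_per_texel,
                          overdraw=1.5, cache_miss_rate=0.25):
    texels = width * height * overdraw * layers * fps
    # Only cache misses actually hit main RAM.
    return texels * bytes_per_texel * cache_miss_rate / 1e9

needed = texture_bandwidth_gbs(1920, 1080, fps=60, layers=4,
                               bytes_per_texel=4)
print(f"~{needed:.1f} GB/s of a reported ~12.8 GB/s main-RAM budget")
```

The point isn't the exact figure (change the guesses and it moves a lot) but that texture fetches draw on the same main-RAM budget as the CPU and everything else, which the eDRAM doesn't help with.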



st0pnsw0p said:
BMaker11 said:

The reason for this comparison was twofold:

#1 The PS3 was the most powerful console last gen and #2 everyone says that the Wii U is barely more powerful than the PS3.

So they're making an example of how the Wii U can "blow the PS3 out of the water" when comparing two similar games. Problem being: the PS3 version of Bayonetta was a shitty, muddy looking port made by a completely different studio.

People can't talk about the 360 version (i.e. definitive version by a landslide) in this comparison because the 360, on paper, was weaker than the PS3.

Let's not talk about how Arkham City/Origins, ME3, Splinter Cell Blacklist, etc. (games that are on an even playing field between the PS3 and Wii U, not a port being compared to a sequel made by the original studio, which 9 times out of 10 looks way better than the original) have their best versions on the PS3. Even when they rebooted GoldenEye specifically for the Wii, and then made 007 Legends, which is built around that same Wii engine, the PS3 version is better than the Wii U version (though it's really moot, since that game was terrible, period).

The thing is, though, that the PS3 was much more difficult to develop for than the 360 because of its Cell processor, so multiplatform games ran better on the 360 because they were coded better (at least for the first part of the generation; I'm not sure if devs have gotten used to the Cell enough to close the gap). Neither Bayonetta 1 on 360 nor Bayonetta 2 on Wii U has this problem, so it makes more sense to compare those two versions.

Yes, it does make more sense, but like I said, on paper, the PS3 was more powerful than the 360. People are saying that the WiiU is only barely more powerful than the most powerful console from last gen. So, the Nintendo crowd, therefore, wants the most brownie points for showing how much "better" a WiiU game looks than a PS3 game, since the PS3 produced the best graphics last gen, overall. That's all.

They don't want to factor in everything that went into the PS3 edition of Bayonetta, resulting in a really sloppy port. They just want to say "WiiU >>>> PS3" and use this as an example of it, as if all things are equal between the two games. It'd be no different if you just took Bayonetta 360 vs PS3. The 360 crowd could just say "oh, the PS3 is so much more powerful than 360? Well look at this. Clearly the PS3 isn't" and ignore the fact that this particular game was just sloppily made on the PS3.



fatslob-:O said:

You need more than just a 1080p frame though ... Did you forget about the depth and stencil buffers, which are required for z-buffering and shadowing? 

The only thing the eDRAM can meaningfully prioritise is render targets, which are tied to framebuffer operations ... Great, that'll be enough to saturate the ROPs, but how do I solve bandwidth for texturing purposes? 

Actually, you can use it for more than just framebuffer operations:

"We use the eDRAM in the Wii U for the actual framebuffers, intermediate framebuffer captures, as a fast scratch memory for some CPU intense work and for other GPU memory writes." - Shin'en



AZWification said:

"konzoloa"? What the hell is that? Is that how rednecks spell the word "console"?


It's probably the name of his maid.