
5th year of Xbox 360: The Unreal Engine Effect

selnor said:

Ok. Let's do this. Firstly, there is no way Digital Foundry knows how an engine uses the system. Example: Digital Foundry states GeoW 1 has partial AA; Epic states it has no AA. If I were you I'd listen to devs more than some guys in a basement.

2nd, your FF13 article is a myth article. It talks about the PS3 demo and what the 360 could 'perhaps' do to handle the game. There's nothing about any official statement of how the engine is used on the 360. They talk about what eDRAM may do for FF13, not that it is used.

Here is a straight question to Sweeney that proves Digital Foundry often gets things wrong:

Jacob- "Will UE3.0 support predicated tiling to make use of 4xAA on Xbox 360?"

Sweeney- Gears of War runs natively at 1280x720p without multisampling. MSAA performance doesn't scale well to next-generation deferred rendering techniques, which UE3 uses extensively for shadowing, particle systems, and fog.

http://www.nvnews.net/vbulletin/showthread.php?s=2a2cdbfa9f935a60371587e9b684e6f4&t=70056

Further, here are Remedy's official comments on their official forums, just in case anyone thinks it's made up, you know, to suit their theory that eDRAM gives bugger all for free.

Here's the comment referring to hardware AA and free alpha blends. Also, if you know how much alpha blending can tax a system, you'd know that getting the effect without actually having to use it is incredible.

"We like 4xAA. Due to the alpha-to-coverage feature on the Xbox 360 GPU, it's one of the key reasons we can render a lot of "alpha test" foliage like trees and bushes without them starting to shimmer or dither (as alpha-to-coverage with 4xAA effectively gives us 5 samples of alpha "blend" without actually using alpha blend). Of course that leads into a lot of interesting ways how to get the the other "standard" z-buffer based rendering schemes to not alias, but let's not get into that discussion right now."

http://translate.googleusercontent.com/translate_c?hl=de&ie=UTF-8&sl=de&tl=en&u=http://forum.alanwake.com/showthread.php%3Fp%3D60357&rurl=translate.google.de&usg=ALkJrhiWwBc4YXXqgcDkxIRUL3dYSC4ATg#post60357

So there you have it. Actual developer info. Not basement boys or a hypothetical article.

Uhm, I wonder if you read what I wrote. At all. Or if you read what I linked to.

Because you just linked what I had already linked and commented on, and still got almost everything wrong.

The Sweeney interview (which I quoted in my point 1b) dates back to 2006, as I already said, and is about UE3 in the form used in Gears 1. By the time Gears 2 came out, the updated UE on 360 allowed limited 2xMSAA even with deferred particles, and the same was true of the first Mass Effect, which is what Digital Foundry refers to. Nowhere is DF saying that Gears 1 had 2xMSAA. So you got that one wrong.

The post in the Remedy forum (which I quoted in my point 3a) does indeed talk about using 4xAA, as I myself said. What I disputed is that it is neither full-screen 4xMSAA "for free" nor new, as you made it sound; it's merely the result of optimizing a resource known to any developer on the 360.

Please, go read beyond3d or any other 3D developer forum. Every 360 developer has to cope with optimizing for the eDRAM size. They all have since development on the 360 began six years ago, and that's why many multiplatform games had an easy 2xAA in their 360 port.

Facts are: the eDRAM is a very fast embedded memory buffer that solves bandwidth issues as long as you can fit everything inside those 10MB. You can't fit 720p, a 32-bit color space, a 32-bit Z-buffer and 4xAA in there unless you resort to tiling, and tiling poses other performance problems. Some games resort to tiling; others, like Halo 3, cut something (e.g. resolution) to fit in the eDRAM.
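
To put rough numbers on that claim, here is a back-of-the-envelope Python sketch (my own illustrative arithmetic with assumed 32-bit color and depth formats, not official figures) of how a 720p color + depth target compares to the 10MB of eDRAM:

import math

WIDTH, HEIGHT = 1280, 720
COLOR_BYTES, DEPTH_BYTES = 4, 4        # assumed 32-bit color and 32-bit depth/stencil
EDRAM = 10 * 1024 * 1024               # the 10MB of eDRAM

def target_bytes(msaa_samples):
    # Total color + depth storage with msaa_samples samples per pixel.
    return WIDTH * HEIGHT * (COLOR_BYTES + DEPTH_BYTES) * msaa_samples

for msaa in (1, 2, 4):
    size = target_bytes(msaa)
    tiles = math.ceil(size / EDRAM)
    print("%dxAA: %.1f MB -> %d tile(s)" % (msaa, size / 2.0 ** 20, tiles))

Under those assumptions 1x fits in a single tile (about 7MB), while 2x and 4x overflow the 10MB and force predicated tiling or a resolution cut, which is exactly the tradeoff described above.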

Just to explain something else: the alpha-to-coverage feature the Remedy guy talks about basically means simulating transparency with dithered pixels. You might have seen an example in the (in)famous character hair in FFXIII. The trick he explains is that they avoid the bad "grainy" or shimmering look because 1) they are applying it to foliage, which is by its nature a dithered, irregular material, and 2) applying 4xAA to the foliage blurs the grain enough to be comparable to a many-step alpha blend without the shimmering.
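
For anyone curious, here is a minimal Python sketch of that alpha-to-coverage idea (an illustration of the concept only, not the actual 360 hardware behaviour): the shader's alpha value is quantized into how many of the N MSAA samples are covered, which with 4 samples gives the five blend levels the Remedy post mentions.

def alpha_to_coverage_mask(alpha, samples=4):
    # Clamp alpha to [0, 1] and turn it into a count of covered samples;
    # real hardware also dithers this per pixel to hide the banding.
    alpha = max(0.0, min(1.0, alpha))
    covered = round(alpha * samples)
    return (1 << covered) - 1          # bitmask with 'covered' sample bits set

for a in (0.0, 0.3, 0.5, 0.8, 1.0):
    print(a, bin(alpha_to_coverage_mask(a)))

The MSAA resolve then averages those samples, which is where the "five levels of blend without actually alpha blending" effect comes from.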

Nowhere in there does he say they are using 4xMSAA full screen. They might be, as far as we know, as other games such as Fallout 3 did, but they would need to use tiling because they simply would not have enough space in the eDRAM. And/or they might use a smaller color space and then simulate a bigger range through careful application of bloom and other post effects; palette-wise, AW does look quite muted.

The point is: they didn't say what you reported as a fact. And even if they did use 4xMSAA full screen at 720p, there would be tradeoffs, as is always the case. Nothing comes for free.

Finally, I don't take everything DF writes as the final word. I pointed to them as a place where you could learn more about the eDRAM optimizations from people who certainly know a lot more about those technical aspects than you do, and who still often explain them in layman's terms. You're welcome to browse the technical forums where the developers discuss the very same issues in much greater and bloody detail, but that would be less fruitful.

Trying to downplay DF because they don't fit your make-believe alternate reality where people have only just discovered the embedded RAM in the 360 GPU? That reeks of intellectual dishonesty.

 

 

 



"All you need in life is ignorance and confidence; then success is sure." - Mark Twain

"..." - Gordon Freeman


2006, LMAO. Great link, but this is 2010.



WereKitten said:
[...]

Utilizing the eDRAM and just using it are two different things. Grid super-sampling is a thing of the past if Metro 2033 is anything to go by. "The 360 was running deferred rotated grid super-sampling for the last two years, but later we switched it to use analytical anti-aliasing (AAA)," reveals Shishkovtsov. "That gave us back around 11MB of memory and dropped AA GPU load from a variable 2.5-3.0 ms to constant 1.4ms. The quality is quite comparable."

Scarier still, they comment that if you take away V-Sync from Metro 2033 they believe they have 100MB of memory left to play with. They believe the 360 can do a lot more. You speak of trade-offs, while companies keep making more room to do even more. So it's less about trading off and more about being able to do more.
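
For a feel of where roughly 11MB can come from, here is a purely illustrative Python sketch (the render resolution, sampling factor and buffer formats below are my assumptions, not confirmed Metro 2033 figures): keeping extra super-sampled color and depth samples around costs memory that a post-process AA working at native resolution does not need.

WIDTH, HEIGHT = 1024, 576              # assumed sub-HD render target
BYTES_PER_SAMPLE = 4                   # assumed 32-bit color / 32-bit depth samples

def extra_supersample_mb(extra_samples):
    # Additional storage for extra_samples more color samples per pixel,
    # plus the matching extra depth samples.
    extra = WIDTH * HEIGHT * BYTES_PER_SAMPLE * extra_samples * 2
    return extra / (1024.0 * 1024.0)

for extra in (1, 2, 3):
    print("%d extra sample(s) per pixel: %.1f MB" % (extra, extra_supersample_mb(extra)))

A couple of extra color + depth samples per pixel already lands in the same ballpark as the ~11MB the developer quotes, and a post-process pass like the "analytical" AA he describes works on the native-resolution buffer, so that memory comes back.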


The PS3 can and does do a lot more, point?



@Selnor

I said previously that I agreed with the sentiment of your piece: there's still headroom for the 360 hardware to shine if developers competently squeeze everything out of it, and middleware such as UE has been a drag in some regards. Let new engines rise and shine.

Still, you were factually wrong about many technical details and quotations, and by throwing in some other (admittedly impressive) new tech pieces you're just muddying the waters with unrelated information.

The eDRAM is just 10MB; it is very well known to developers, and there are known tradeoffs in its use. Just admit that you were very wrong about this and that some of what you called facts were just your speculation, and we'll move on to greater and better things.
(If you don't we'll move on anyway, of course, but you'll lose a great chance for intellectual honesty. We'll love you anyway :) )



"All you need in life is ignorance and confidence; then success is sure." - Mark Twain

"..." - Gordon Freeman


If you are using something, it's being utilized..... You can of course say people are getting better at optimizing their engine to be used with the eDRAM to get the look they want, instead of going "ZOMG BEST EVA! NOBODY USED THAT PIECE OF MEMORY BEFORE!" or something......



dahuman said:
If you are using something, it's being utilized..... You can of course say people are getting better at optimizing their engine to be used with the eDRAM to get the look they want, instead of going "ZOMG BEST EVA! NOBODY USED THAT PIECE OF MEMORY BEFORE!" or something......

Tell that to all the people who claim that "PS3 is only using *insert % here* of its power".




NightAntilli said:
dahuman said:
If you are using something, it's being utilized..... You can of course say people are getting better at optimizing their engine to be used with the eDRAM to get the look they want, instead of going "ZOMG BEST EVA! NOBODY USED THAT PIECE OF MEMORY BEFORE!" or something......

Tell that to all the people who claim that "PS3 is only using *insert % here* of its power".

Yeah, exactly.

What is this anyway, Dragon Ball Z?

I wasn't aware people could use scouters to tell a console's power level.



newfgamer said:

These same sites have played Alan Wake, you ever hear of previews? It's out in just a couple of months. Again, if you think the 360 is the only system that is always being tapped into, you have no clue; this is not new, it's a constant thing with console games. You try to make it sound like this is a 360-only thing. Please, everyone knows the PS3 is newer and has not been tapped as much; there is a reason Crytek stated they will max out the PS3, because no one has done it before. UC2 only used 70% of the PS3, and it is far and away the best looking console game.

 

The small linear levels and the lack of scale and large battles mean AW won't touch GOW 3. The detail and scope of GOW 3 are unparalleled.

 

In other words you don't know, but it's your prediction that they will say that about AW? Pure speculation. Seven years in the making actually points to an older engine; games that change so much through their lifecycle usually suffer. It was first announced as a PC-only game, then they changed it from sandbox to linear, all the while other games kept making better-looking games.

 

Huge levels? I suggest you do some research; they scrapped the open world for a linear approach. AW does not come close to the epic scale and chaos, nor the texture detail, of GOW 3. AW has smaller linear levels with light action compared to the scope of the battles in GOW.

 

The same sites that have sat down and played AW, IGN and GameSpot, never said anything about its graphics, whereas they have all said one thing: GOW 3 is a new landmark in console graphics and sheer scale.

 

I'm not defending anything; the PS3 speaks for itself, GOW 3 speaks for itself.

Alan Wake linear?

God of War 3 scope...?

Can someone please let this guy know that GOW as a whole is as linear as it gets?



NightAntilli said:
dahuman said:
If you are using something, it's being utilized..... You can of course say people are getting better at optimizing their engine to be used with the eDRAM to get the look they want, instead of going "ZOMG BEST EVA! NOBODY USED THAT PIECE OF MEMORY BEFORE!" or something......

Tell that to all the people who claim that "PS3 is only using *insert % here* of its power".

Yeah, I don't get that shit either. It's all about optimizing and working around the console, because that's the point of console programming, since the hardware is static for the most part.