
5th year of Xbox 360: The Unreal Engine Effect

These continual leaps in technology on Xbox 360 never cease to astound.

It's hard to imagine how any other game is going to top Crysis 2 on Xbox 360. I'm guessing they just won't.




Are there any companies that specifically design and build engines for console manufacturers or game developers to maximize a console's power?

Say, Microsoft hiring a third party to build such an engine, then leasing the engine out to anyone who wanted to make an exclusive game.



markers said:
Say, Microsoft hiring a third party to build such an engine, then leasing the engine out to anyone who wanted to make an exclusive game.

Wouldn't be much point, really. You can get more performance out of the leading-edge engines (CE3 and Tech 5) than most, if not all, exclusives have shown on any console, and all the major engines this gen perform better on Xbox 360 anyway: UE3, Tech 4, Tech 5, EGO, Capcom's framework 1-2, Frostbite. It's only really behind on Crystal Tools, assuming all that FF stuff is true.



selnor said:
TheLastGuardian said:

That's OK news for Xbox owners, but seriously, the PS3 is leaps ahead when it comes to anti-aliasing. The Xbox 360 maxes out at 4x MSAA; the PS3 is capable of much more than that (16x MSAA), more advanced than most high-end computers, all thanks to the Cell. For instance, Uncharted 2 is flawless. Here, read this article:

http://gamer.blorge.com/2010/01/05/ps3-smoothing-beyond-that-of-high-end-pc-graphics-card/

LOL. The 360 could do 16x as well if it wanted to. The point is that doing so costs HUGE resources on Cell and leaves a lot fewer resources for other parts of the game. Why do you think KZ2 opted for 2xQSAA instead of the superior 2xMSAA? QSAA cuts the performance cost by more than half, but leaves textures blurred. U2 is also only 2xMSAA. The first game with 4xMSAA across the entire screen is Alan Wake.

The 360 does 4xMSAA for free. That's the point. The PS3 would use considerable Cell resources to do 4xMSAA, meaning less can be used for other things. That's why it's so great that the 360 has a specially built chip just for AA: it means the rest of the system is never compromised.

And it's not much more than 16x MSAA as you quote. It's MLAA, which is software-driven: less reliable, with many issues of further blur. It has a smaller performance hit, like QSAA, but a worse image on textures, shadows, etc. It's only good for polygon jaggies.
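To illustrate why a software post-process pass like MLAA can soften textures: it only sees the finished colour buffer, so it has to guess where edges are from local contrast and blends across them whether they belong to a polygon silhouette or to sharp texture detail. Below is a minimal toy sketch of that idea in Python; it is purely illustrative, not the SPU implementation actually used on PS3, and the function names and threshold are made up.

```python
# Toy illustration of a post-process (MLAA-style) AA pass. It runs on the
# finished colour buffer, so it can only guess where edges are from local
# contrast -- which is why it also softens sharp texture detail, unlike MSAA,
# which only resolves extra geometry samples at polygon edges.
# NOT the SPU implementation used on PS3; names and threshold are made up.

def luma(pixel):
    r, g, b = pixel
    return 0.299 * r + 0.587 * g + 0.114 * b

def post_process_aa(image, threshold=0.1):
    """image: 2D list of (r, g, b) tuples in [0, 1]; returns a blended copy."""
    h, w = len(image), len(image[0])
    out = [row[:] for row in image]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            c = image[y][x]
            left, right = image[y][x - 1], image[y][x + 1]
            up, down = image[y - 1][x], image[y + 1][x]
            # "Edge" wherever luminance changes sharply -- the pass cannot
            # tell a polygon silhouette from a hard texture transition.
            if any(abs(luma(c) - luma(n)) > threshold
                   for n in (left, right, up, down)):
                out[y][x] = tuple(
                    (c[i] + left[i] + right[i] + up[i] + down[i]) / 5
                    for i in range(3))
    return out

# A hard black/white *texture* edge inside a single surface:
img = [[(0.0,) * 3 if x < 1 else (1.0,) * 3 for x in range(3)] for y in range(3)]
print(post_process_aa(img)[1][1])   # (0.8, 0.8, 0.8) -- softened anyway
```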

From Digital Foundry:

PS3 is curious, however, in that it has hardware support for two widely used AA techniques. We've discussed MSAA already, but quincunx AA is the other most frequently implemented technique. Unique to the NVIDIA hardware, it uses approximately the same amount of resources as MSAA but produces superior edge-smoothing at the expense of adding a blur to the entire screen. The use of quincunx and the impact on overall image quality varies game by game - intricately detailed textures will suffer much more than a more flat, anime style of art. However, this Assassin's Creed comparison of 2x MSAA up against quincunx demonstrates both the edge-smoothing advantages and the detail blur.


Sounds like 2xMSAA would use up the same resources as 2x quincunx, with superior edge smoothing. Like you said, though, the devs have to compromise with blur. Sounds like they used 2x quincunx and were prepared to take the hit on texture detail.
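To make the Digital Foundry description concrete: both 2xMSAA and quincunx store two samples per pixel, so their storage and fill costs are similar, but the quincunx resolve also mixes in corner samples shared with neighbouring pixels, which is where the full-screen blur comes from. A minimal sketch follows, assuming the commonly cited quincunx weights (centre 1/2, each corner 1/8); it is illustrative only, not how RSX actually resolves.

```python
# Both 2xMSAA and quincunx store two samples per pixel, so memory and fill
# cost are similar; the difference is the resolve. The quincunx weights
# below (centre 1/2, four shared corner samples 1/8 each) are the commonly
# cited kernel -- treat them as an assumption, and the whole snippet as
# illustrative.

def resolve_2x_msaa(samples):
    """Average of the pixel's own two samples; nothing is borrowed from
    neighbouring pixels, so in-surface texture detail stays sharp."""
    return sum(samples) / len(samples)

def resolve_quincunx(centre, corner_samples):
    """Centre sample plus four corner samples that are *shared* with
    neighbouring pixels -- that sharing is where the full-screen blur
    comes from."""
    return 0.5 * centre + sum(0.125 * s for s in corner_samples)

# A bright texel next to dark texels, all inside one polygon:
print(resolve_2x_msaa([1.0, 1.0]))                    # 1.0 -> stays sharp
print(resolve_quincunx(1.0, [1.0, 1.0, 0.0, 0.0]))    # 0.75 -> softened
```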

Also, that 360 chip isn't restricted to just improving anti-aliasing. I'm pretty sure Bungie used it to do HDR on Halo 3, which was consequently blamed for the sub-HD resolution, as the 10MB proved a bit too small for that particular task.



Gears Of War 1 had no AA at all!

Thanks for putting that there all by itself; that way I don't have to waste my time reading, because you obviously have no idea what you're talking about.



Fuck... I'm dead!

selnor said:

All the top games apart from one (Forza 3) use UE3, which means there is a wealth of games not using the 360 to its potential. Even Splinter Cell: Conviction is running on Unreal Engine 3.5. It's time we had devs really pushing the 360 like Remedy and Bungie. Every game should utilize the eDRAM chip, as it saves huge resources for other things and allows the crispest games ever seen on a console. Remedy state the eDRAM chip essentially gives them five free alpha blends. They have full 4xMSAA on everything, including foliage (which is the most taxing on a system). Five years in, and we are only just seeing its potential. Here's to the next two years of Xbox 360 technical capabilities. I don't know whether to thank UE3 or curse it now.

 

Modern Warfare 2 and Halo 3 (until Reach is released, it's still one of the top titles) do not use UE3, and neither do other popular games like Left 4 Dead, Assassin's Creed and FIFA.

And Splinter Cell: Conviction used UE2.5, not 3.5.



selnor said:


LOL. The 360 could do 16x as well if it wanted to. The point is that doing so costs HUGE resources on Cell and leaves a lot fewer resources for other parts of the game. Why do you think KZ2 opted for 2xQSAA instead of the superior 2xMSAA? QSAA cuts the performance cost by more than half, but leaves textures blurred. U2 is also only 2xMSAA. The first game with 4xMSAA across the entire screen is Alan Wake.

The 360 does 4xMSAA for free. That's the point. The PS3 would use considerable Cell resources to do 4xMSAA, meaning less can be used for other things. That's why it's so great that the 360 has a specially built chip just for AA: it means the rest of the system is never compromised.

And it's not much more than 16x MSAA as you quote. It's MLAA, which is software-driven: less reliable, with many issues of further blur. It has a smaller performance hit, like QSAA, but a worse image on textures, shadows, etc. It's only good for polygon jaggies.

For the bolded part: as far as I know, the eDRAM chip provides "free" MSAA only up to its 10MB internal memory limit, after which you have to tile and incur heavy performance hits. For example, once you use real 720p and HDR, even 2xMSAA breaks that limit, which is why Bungie actually chose to render sub-HD with the Halo 3 engine.
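Back-of-envelope arithmetic for that 10MB limit, assuming 4 bytes per colour sample and 4 bytes per depth/stencil sample; the Halo 3 line assumes the commonly reported two-colour-buffer HDR setup, so treat the numbers as illustrative rather than an official breakdown.

```python
# Back-of-envelope check on the 10MB eDRAM limit. Assumptions: 4 bytes per
# colour sample and 4 bytes per depth/stencil sample; the Halo 3 lines use
# the commonly reported two-colour-buffer HDR setup. Illustrative only.

EDRAM_BYTES = 10 * 1024 * 1024

def framebuffer_bytes(width, height, msaa, colour_buffers=1,
                      bytes_colour=4, bytes_depth=4):
    samples = width * height * msaa
    return samples * (colour_buffers * bytes_colour + bytes_depth)

def tiles_needed(width, height, msaa, **kw):
    return -(-framebuffer_bytes(width, height, msaa, **kw) // EDRAM_BYTES)

mb = 2 ** 20
print(framebuffer_bytes(1280, 720, 1) / mb)                    # ~7.0 MB: 720p, no AA, fits
print(framebuffer_bytes(1280, 720, 2) / mb,
      tiles_needed(1280, 720, 2))                              # ~14.1 MB -> 2 tiles
print(framebuffer_bytes(1280, 720, 1, colour_buffers=2) / mb)  # ~10.5 MB: 720p "HDR" no longer fits
print(framebuffer_bytes(1152, 640, 1, colour_buffers=2) / mb)  # ~8.4 MB: Halo 3's sub-HD fits
```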

Thus, when you say that U2 is "only" doing 2xMSAA, you're talking about an engine that actually provides something the eDRAM can't provide "for free". I'll wait for a proper tech analysis of the final Alan Wake to see what tradeoff they chose to overcome the eDRAM limitations, but it will either a) not use real 720p all the time, b) not use "proper" 10-bit HDR, or c) incur a tiling performance hit, which they might mitigate with other tradeoffs in their engine.

As for your speculation about "HUGE" resources being drained from the Cell when SPU-based anti-aliasing techniques are used, or about them being "less reliable" and "blurry", I'd like to see some basis for that, considering that:

1) the Cell is designed very much along the lines of dedicated stream processors;

2) it's not as if such techniques blur the textures for some mystical reason. Different techniques simply bring different results.

Finally, the proof-is-in-the-pudding argument.

I've seen you refer to ME2 as the "best graphics ever on console". That's a game that uses no AA afaik, except in cutscenes.

On the other hand, Uncharted 2 has the most lavish graphical quality I've ever witnessed on consoles with its "mere" 2xMSAA, HD colour processing and lots of action happening on screen, and I include in that comparison anything I've seen of Alan Wake so far, even though you say it sports 4xMSAA.

And from what I've seen, GOWIII seems to up the ante and - from what I know and the videos, though I'm ready to stand corrected once I try the real final version of the game - it doesn't seem to suffer from "HUGE" performance penalties, as it sports incredible scaling, many dozens of AI characters, a lot of fast physical interaction and continuous data streaming.



"All you need in life is ignorance and confidence; then success is sure." - Mark Twain

"..." - Gordon Freeman

Scoobes said:
selnor said:
TheLastGuardian said:

That's OK news for Xbox owners, but seriously, the PS3 is leaps ahead when it comes to anti-aliasing. The Xbox 360 maxes out at 4x MSAA; the PS3 is capable of much more than that (16x MSAA), more advanced than most high-end computers, all thanks to the Cell. For instance, Uncharted 2 is flawless. Here, read this article:

http://gamer.blorge.com/2010/01/05/ps3-smoothing-beyond-that-of-high-end-pc-graphics-card/


LOL. The 360 could do 16x as well if it wanted to. The point is that doing so costs HUGE resources on Cell and leaves a lot fewer resources for other parts of the game. Why do you think KZ2 opted for 2xQSAA instead of the superior 2xMSAA? QSAA cuts the performance cost by more than half, but leaves textures blurred. U2 is also only 2xMSAA. The first game with 4xMSAA across the entire screen is Alan Wake.

The 360 does 4xMSAA for free. That's the point. The PS3 would use considerable Cell resources to do 4xMSAA, meaning less can be used for other things. That's why it's so great that the 360 has a specially built chip just for AA: it means the rest of the system is never compromised.

And it's not much more than 16x MSAA as you quote. It's MLAA, which is software-driven: less reliable, with many issues of further blur. It has a smaller performance hit, like QSAA, but a worse image on textures, shadows, etc. It's only good for polygon jaggies.

From Digital Foundry:

PS3 is curious, however, in that it has hardware support for two widely used AA techniques. We've discussed MSAA already, but quincunx AA is the other most frequently implemented technique. Unique to the NVIDIA hardware, it uses approximately the same amount of resources as MSAA but produces superior edge-smoothing at the expense of adding a blur to the entire screen. The use of quincunx and the impact on overall image quality varies game by game - intricately detailed textures will suffer much more than a more flat, anime style of art. However, this Assassin's Creed comparison of 2x MSAA up against quincunx demonstrates both the edge-smoothing advantages and the detail blur.


Sounds like 2xMSAA would use up the same resources as 2x quincunx, with superior edge smoothing. Like you said, though, the devs have to compromise with blur. Sounds like they used 2x quincunx and were prepared to take the hit on texture detail.

Also, that 360 chip isn't restricted to just improving anti-aliasing. I'm pretty sure Bungie used it to do HDR on Halo 3, which was consequently blamed for the sub-HD resolution, as the 10MB proved a bit too small for that particular task.

No, it doesn't use the same resources. Take a PC game, for instance, especially one that offers both QSAA and MSAA: 2xQSAA will give you about 10 FPS extra over 2xMSAA. On PC you can see the performance difference, and also the blur difference, in the same game on the same PC. Yes, QSAA is easier on the system, but lower quality.

EDIT: Also, like you say, a test on Wake would be amazing. According to the devs the shadows are also AA'd extremely well, as are the textures. Apparently it's the entire screen, unlike any other game where a lot of stuff is not AA'd (usually just the polygons).



WereKitten said:
selnor said:


LOL. The 360 could do 16x as well if it wanted to. The point is that doing so costs HUGE resources on Cell and leaves a lot fewer resources for other parts of the game. Why do you think KZ2 opted for 2xQSAA instead of the superior 2xMSAA? QSAA cuts the performance cost by more than half, but leaves textures blurred. U2 is also only 2xMSAA. The first game with 4xMSAA across the entire screen is Alan Wake.

The 360 does 4xMSAA for free. That's the point. The PS3 would use considerable Cell resources to do 4xMSAA, meaning less can be used for other things. That's why it's so great that the 360 has a specially built chip just for AA: it means the rest of the system is never compromised.

And it's not much more than 16x MSAA as you quote. It's MLAA, which is software-driven: less reliable, with many issues of further blur. It has a smaller performance hit, like QSAA, but a worse image on textures, shadows, etc. It's only good for polygon jaggies.

For the bolded part: as far as I know, the eDRAM chip provides "free" MSAA only up to its 10MB internal memory limit, after which you have to tile and incur heavy performance hits. For example, once you use real 720p and HDR, even 2xMSAA breaks that limit, which is why Bungie actually chose to render sub-HD with the Halo 3 engine.

Thus, when you say that U2 is "only" doing 2xMSAA, you're talking about an engine that actually provides something the eDRAM can't provide "for free". I'll wait for a proper tech analysis of the final Alan Wake to see what tradeoff they chose to overcome the eDRAM limitations, but it will either a) not use real 720p all the time, b) not use "proper" 10-bit HDR, or c) incur a tiling performance hit, which they might mitigate with other tradeoffs in their engine.

As for your speculation about "HUGE" resources being drained from the Cell when SPU-based anti-aliasing techniques are used, or about them being "less reliable" and "blurry", I'd like to see some basis for that, considering that:

1) the Cell is designed very much along the lines of dedicated stream processors;

2) it's not as if such techniques blur the textures for some mystical reason. Different techniques simply bring different results.

Finally, the proof-is-in-the-pudding argument.

I've seen you refer to ME2 as the "best graphics ever on console". That's a game that uses no AA afaik, except in cutscenes.

On the other hand, Uncharted 2 has the most lavish graphical quality I've ever witnessed on consoles with its "mere" 2xMSAA, HD colour processing and lots of action happening on screen, and I include in that comparison anything I've seen of Alan Wake so far, even though you say it sports 4xMSAA.

And from what I've seen, GOWIII seems to up the ante and - from what I know and the videos, though I'm ready to stand corrected once I try the real final version of the game - it doesn't seem to suffer from "HUGE" performance penalties, as it sports incredible scaling, many dozens of AI characters, a lot of fast physical interaction and continuous data streaming.

You sir are massively off the beaten track.

The 10MB you refer to is eDRAM, faster than the main RAM in either the PS3 or the 360. It's designed to handle up to 4xMSAA on its own. Again, you can use it for QSAA if you want, but that's an inferior AA, just like MLAA. QSAA and MLAA both blur textures. It's a fact; you can see it in any test, and in PC games that let you switch between MSAA, QSAA and MLAA. MSAA is harder on performance but gives better overall results.
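For reference, the same back-of-envelope arithmetic as the earlier sketch (4-byte colour and depth samples, purely illustrative) applied to full 720p at 4xMSAA:

```python
# Full 720p at 4xMSAA, assuming 4-byte colour + 4-byte depth per sample:
size_bytes = 1280 * 720 * 4 * (4 + 4)
print(size_bytes / 2 ** 20)                   # ~28.1 MB
print(-(-size_bytes // (10 * 1024 * 1024)))   # 3 tiles of 10MB eDRAM
```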

Even more impressive is Reach. What the engine is doing as a whole is scary as hell.



slowburn said:
Gears Of War 1 had no AA at all!

Thanks for putting that there all by itself; that way I don't have to waste my time reading, because you obviously have no idea what you're talking about.


"Sweeney- Gears of War runs natively at 1280x720p without multisampling. MSAA performance doesn't scale well to next-generation deferred rendering techniques, which UE3 uses extensively for shadowing, particle systems, and fog.


heh. Anyways 2xMSAA doubles the cost of shadow rendering, I guess epic doesn't care to solve that problem"
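A rough sketch of the point Sweeney is making: with a deferred renderer the whole G-buffer, not just the final colour buffer, gets multiplied by the MSAA sample count. The G-buffer layout below is an assumption chosen for illustration, not UE3's actual layout.

```python
# Rough sketch of the Sweeney quote (assumed G-buffer layout, not UE3's
# actual one): with deferred shading, every G-buffer target -- albedo,
# normals, depth, material parameters -- is stored per sample, so memory,
# bandwidth and per-sample lighting work all grow with the MSAA level.

def gbuffer_bytes(width, height, msaa,
                  target_bytes=(4, 4, 4, 4),   # four 32-bit render targets (assumption)
                  depth_bytes=4):
    per_sample = sum(target_bytes) + depth_bytes
    return width * height * msaa * per_sample

for msaa in (1, 2, 4):
    print(msaa, "x:", round(gbuffer_bytes(1280, 720, msaa) / 2 ** 20, 1), "MB of G-buffer")
# 1x: ~17.6 MB, 2x: ~35.2 MB, 4x: ~70.3 MB
```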

You sir haven't got a clue. You should research a bit. Gears of War 1 has no AA. Gears 2 has partial AA. The first console game this gen to have AA across the entire screen, including shadow rendering, is Alan Wake. AA is more complicated than most on here realise.