
Guerrilla Games: Regarding Killzone Shadow Fall and 1080p

Nice explanation of what they did here; it does make things kind of blurry, though. I wonder if there's a way to improve upon this technique that doesn't require too much more horsepower. I think it's funny they even had to explain it, but these resolution wars have become serious business, to the point of bloodshed. Ha.



RJ_Sizzle said:
Nice explanation of what they did here; it does make things kind of blurry, though. I wonder if there's a way to improve upon this technique that doesn't require too much more horsepower. I think it's funny they even had to explain it, but these resolution wars have become serious business, to the point of bloodshed. Ha.

Of course this technique will improve in the future, as everything does. A programmable pipeline provides the tools to do some awesome things when used right. The problem here is that the developers aren't the PR guys. This Q&A should have been provided before the shitstorm.



fallen said:

Next thing you know, people will start claiming it's actually better than real 1080p. Because Sony. Mark my words.

I mean, come on, Guerrilla felt the need to even note "it's very computationally expensive" or whatever. Obviously defensive. If it's so computationally expensive, then why didn't you just use real 1080p?

I think Xbox should start using this technique; then, when people criticize it, we can go point out all these past posts saying how good it was for Shadow Fall :)

It is not as demanding as native 1080p, but it is more demanding than 720p, or I'd guess even 900p upscaled, while giving better IQ too...

BTW it needs a good chunk of fast read/write RAM... I can't see how devs can store the framebuffer and these additional frames in the 32MB of eSRAM to work with... storing them in DDR3 would be too slow to do that every frame at 60fps... so I guess this tech won't be used in Xbone games.

I could be wrong, though.
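For a rough sense of scale, here's a back-of-the-envelope sketch in Python (the RGBA8 format and the helper names are my assumptions; the 4-frame history count is the number being thrown around in this thread, not a confirmed figure):

```python
# Rough buffer-size arithmetic for the 960x1080 reprojection setup.
# Assumes RGBA8 render targets (4 bytes/pixel); real games also need
# depth, G-buffer, etc., which this sketch ignores.
MiB = 1024 * 1024

def buffer_mib(width, height, bytes_per_pixel=4):
    return width * height * bytes_per_pixel / MiB

full = buffer_mib(1920, 1080)   # ~7.9 MiB per full-HD target
half = buffer_mib(960, 1080)    # ~4.0 MiB per half-width target

print(f"one 1920x1080 target:  {full:.1f} MiB")
print(f"one 960x1080 target:   {half:.1f} MiB")
print(f"4-frame half history:  {4 * half:.1f} MiB (vs 32 MiB eSRAM)")
```

Whether those targets fit in 32MB alongside depth and G-buffer targets is exactly what's being argued here.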



ethomaz said:

fallen said:

Next thing you know, people will start claiming it's actually better than real 1080p. Because Sony. Mark my words.

I mean, come on, Guerrilla felt the need to even note "it's very computationally expensive" or whatever. Obviously defensive. If it's so computationally expensive, then why didn't you just use real 1080p?

I think Xbox should start using this technique; then, when people criticize it, we can go point out all these past posts saying how good it was for Shadow Fall :)

It is not as demanding as native 1080p, but it is more demanding than 720p, or I'd guess even 900p upscaled, while giving better IQ too...

BTW it needs a good chunk of fast read/write RAM... I can't see how devs can store the framebuffer and these additional frames in the 32MB of eSRAM to work with... storing them in DDR3 would be too slow to do that every frame at 60fps... so I guess this tech won't be used in Xbone games.

I could be wrong, though.

You shouldn't guess at or compare with 720p or 900p requirements, because you don't have the graphics development knowledge. That conclusion on the DDR3 is wrong; copying 960x1080 images from/to RAM isn't very bandwidth expensive.



Kynes said:

You shouldn't guess at or compare with 720p or 900p requirements, because you don't have the graphics development knowledge. That conclusion on the DDR3 is wrong; copying 960x1080 images from/to RAM isn't very bandwidth expensive.

It is... you need to move these 4 frames at least 60 times per second and assemble the framebuffer to work on before sending it to the screen... you can't do that on PS3 or 360... I guess you can't on Xbone either, because the eSRAM is limited to 32MB, which is already not enough for the framebuffer.



RJ_Sizzle said:
Nice explanation of what they did here; it does make things kind of blurry, though. I wonder if there's a way to improve upon this technique that doesn't require too much more horsepower. I think it's funny they even had to explain it, but these resolution wars have become serious business, to the point of bloodshed. Ha.


This^. And because of the resolution wars, people don't think clearly anymore. I think GG should get praise for this technique instead of accusations of lying. I don't mind if they trick my eyes or whatever; as long as it looks good, it's good!! I don't care what kind of spell or magic they used, I don't give a f*ck to be honest. Hell, even if they made something 480p look as good as 1080p, I'd be OK with that (but unfortunately that is technically impossible). For it to take this long to be revealed, that's one hell of a technique. Think about the future: this technique should be adopted by other studios. All it needs is improvement, so people won't be able to tell the difference anymore between 960x1080 and 1920x1080. And the most important thing: just be honest and tell people the truth from the beginning.



ethomaz said:

fallen said:

Next thing you know, people will start claiming it's actually better than real 1080p. Because Sony. Mark my words.

I mean, come on, Guerrilla felt the need to even note "it's very computationally expensive" or whatever. Obviously defensive. If it's so computationally expensive, then why didn't you just use real 1080p?

I think Xbox should start using this technique; then, when people criticize it, we can go point out all these past posts saying how good it was for Shadow Fall :)

It is not as demanding as native 1080p, but it is more demanding than 720p, or I'd guess even 900p upscaled, while giving better IQ too...

BTW it needs a good chunk of fast read/write RAM... I can't see how devs can store the framebuffer and these additional frames in the 32MB of eSRAM to work with... storing them in DDR3 would be too slow to do that every frame at 60fps... so I guess this tech won't be used in Xbone games.

I could be wrong, though.

Yep. They need at least 2.5 full HD frames' worth of buffers for their technique. The most I have seen on XB1 is a double-buffered full HD frame (used by every 1080p XB1 game), so I think 32MB of video RAM is not enough for this technique.

And don't forget that the 3 (half-width) full HD frames need to store additional information: the per-pixel motion vectors.
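To make it concrete, here's a minimal sketch of how that reinsertion step could look; the array layout, the nearest-neighbour fetch, and all the names are my own illustration, not Guerrilla's actual code:

```python
import numpy as np

# Illustrative temporal reprojection: rebuild a 1920x1080 frame from
# the 960 freshly rendered columns plus the previous full frame,
# shifted by per-pixel motion vectors (nearest-neighbour fetch).
H, W = 1080, 1920

def reproject(half_frame, prev_full, motion, even_phase):
    # half_frame: (H, W//2, 3) columns rendered this frame
    # prev_full:  (H, W, 3)    last reconstructed full frame
    # motion:     (H, W, 2)    per-pixel motion in pixels (dy, dx)
    out = np.empty((H, W, 3), dtype=half_frame.dtype)
    new_cols = np.arange(0 if even_phase else 1, W, 2)
    old_cols = np.arange(1 if even_phase else 0, W, 2)
    out[:, new_cols] = half_frame          # fresh columns drop straight in
    # Missing columns: look up where each pixel was last frame.
    ys, xs = np.meshgrid(np.arange(H), old_cols, indexing="ij")
    src_y = np.clip(np.rint(ys - motion[ys, xs, 0]).astype(int), 0, H - 1)
    src_x = np.clip(np.rint(xs - motion[ys, xs, 1]).astype(int), 0, W - 1)
    out[:, old_cols] = prev_full[src_y, src_x]
    return out
```

Wherever the motion vectors are wrong (disocclusions, transparencies, shading changes), the fetched pixel won't match its neighbours, which is presumably the slight blur RJ_Sizzle noticed.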



Overcomplicated technique; just set it to 720p and be done with it.



ethomaz said:

Kynes said:

You shouldn't guess at or compare with 720p or 900p requirements, because you don't have the graphics development knowledge. That conclusion on the DDR3 is wrong; copying 960x1080 images from/to RAM isn't very bandwidth expensive.

It is... you need to move these 4 frames at least 60 times per second and assemble the framebuffer to work on before sending it to the screen... you can't do that on PS3 or 360... I guess you can't on Xbone either, because the eSRAM is limited to 32MB, which is already not enough for the framebuffer.

A single 32-bits-per-channel 960x1080 RGB image weighs less than 12 MB; copying from/to 4 of them 60 times per second uses less than 3 GB/s, much less than the 68 GB/s of the X1. This is a worst-case scenario, considering that no one uses 32 bits per channel in gaming, and I'm not taking color compression into account. Using 8 bits per channel you spend more or less 1 GB/s on this, something I'm sure is doable.
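Those figures are easy to sanity-check; a quick sketch (one-way copies, no compression; counting read plus write would roughly double the numbers):

```python
# Sanity check on the bandwidth estimate above: 4 images of
# 960x1080, 3 channels, moved 60 times per second.
def copy_rate_gb_per_s(bytes_per_channel):
    bytes_per_image = 960 * 1080 * 3 * bytes_per_channel
    return 4 * 60 * bytes_per_image / 1e9

print(copy_rate_gb_per_s(4))  # 32-bit channels: ~2.99 GB/s
print(copy_rate_gb_per_s(1))  # 8-bit channels:  ~0.75 GB/s
```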



A really interesting technique; so, MP is 1080p.   



“Every great dream begins with a dreamer. Always remember, you have within you the strength, the patience, and the passion to reach for the stars to change the world.”

Harriet Tubman.