
Guerrilla Games: Regarding Killzone Shadow Fall and 1080p

ethomaz said:

Kynes:

You shouldn't guess/compare with 720p or 900p requirements, because you don't have the graphics development knowledge. That conclusion on the DDR3 is wrong; copying 960x1080 images to and from RAM isn't very bandwidth expensive.

It is... you need to move those 4 frames at least 60 times per second and build the framebuffer to work on before sending it to the screen... you can't do that on PS3 or 360... I guess you can't on Xbone either, because the eSRAM is limited to 32MB, which is already not enough for the framebuffer.


I love it when you start making things up.

You don't need the whole 4 frames stored anywhere to calculate the motion vectors, and you don't even need those pixels to have the full color range. You can also work with blobs and other image features rather than a per-pixel motion vector. XOne BW is more than enough to handle this situation, maybe not in exactly the same way GG used on the PS4, but a "temporal upscale" is not much more demanding than a "spatial upscale".
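
(A rough sketch of the point being made here, in Python with NumPy: per-block motion vectors can be estimated from a downsampled, luma-only copy of two frames, so the estimation never has to touch the full-resolution, full-color images. The block size, search radius and helper names are my own assumptions for illustration, not anything from Guerrilla's implementation.)

```python
import numpy as np

def to_luma_quarter(rgb):
    """8-bit luma, downsampled 2x per axis: a fraction of the bytes of a full RGB frame."""
    luma = rgb @ np.array([0.299, 0.587, 0.114])   # rgb assumed float in [0, 1]
    return (255 * luma[::2, ::2]).astype(np.uint8)

def block_motion(prev, curr, block=16, radius=4):
    """Brute-force block matching: one (dy, dx) vector per block x block tile."""
    h, w = curr.shape
    vectors = np.zeros((h // block, w // block, 2), dtype=np.int8)
    for by in range(h // block):
        for bx in range(w // block):
            y, x = by * block, bx * block
            ref = curr[y:y + block, x:x + block].astype(np.int32)
            best_sad, best_v = None, (0, 0)
            for dy in range(-radius, radius + 1):
                for dx in range(-radius, radius + 1):
                    yy, xx = y + dy, x + dx
                    if yy < 0 or xx < 0 or yy + block > h or xx + block > w:
                        continue
                    cand = prev[yy:yy + block, xx:xx + block].astype(np.int32)
                    sad = np.abs(ref - cand).sum()        # sum of absolute differences
                    if best_sad is None or sad < best_sad:
                        best_sad, best_v = sad, (dy, dx)
            vectors[by, bx] = best_v
    return vectors

# Two made-up 960x1080 frames; the luma copies are ~0.25 MB each instead of several MB.
prev_rgb = np.random.rand(1080, 960, 3)
curr_rgb = np.roll(prev_rgb, 2, axis=1)                   # fake 2-pixel horizontal pan
mv = block_motion(to_luma_quarter(prev_rgb), to_luma_quarter(curr_rgb))
print(mv.shape)                                           # (33, 30, 2): one vector per 16x16 tile
```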



Kynes said:
ethomaz said:

Kynes:

You shouldn't guess/compare with 720p or 900p requirements, because you don't have the graphics development knowledge. That conclusion on the DDR3 is wrong; copying 960x1080 images to and from RAM isn't very bandwidth expensive.

It is... you need to move those 4 frames at least 60 times per second and build the framebuffer to work on before sending it to the screen... you can't do that on PS3 or 360... I guess you can't on Xbone either, because the eSRAM is limited to 32MB, which is already not enough for the framebuffer.

A single 32-bits-per-channel 960x1080 RGB image weighs less than 12 MB; copying 4 of them 60 times per second uses less than 3 GB/s, much less than the 68 GB/s of the X1. This is a worst-case scenario, considering that nobody uses 32 bits per channel in gaming, and I'm not even counting color compression. Using 8 bits per channel you spend more or less 1 GB/s on this, something I'm sure is doable.

It doesn't work like that... you are working in an eSRAM framebuffer, so you need to go to DDR3 to read/write data while you work with it... on PS4 you already have that data in the main RAM... on Xbone you don't... you have additional read/write cycles not found on PS4... you are already using Xbone bandwidth for other things, like textures for example, and you need to move everything between main RAM and eSRAM... so adding these tasks on top will compromise things, and in the end devs will have to sacrifice something.

This won't work on Xbone for a native 1080p framebuffer... maybe 720p or 900p, which is not what people here are expecting.
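
(For reference, a quick back-of-envelope check of the numbers Kynes quotes above; the frame size, channel count and frame count are taken straight from the post, nothing here is measured console data.)

```python
# 960x1080 frames, 3 color channels, 4 stored frames touched once per 60 Hz refresh.
W, H, CHANNELS, FRAMES, HZ = 960, 1080, 3, 4, 60

def gb_per_s(bytes_per_channel):
    frame_bytes = W * H * CHANNELS * bytes_per_channel
    return FRAMES * HZ * frame_bytes / 1e9

print(f"32-bit channels: {gb_per_s(4):.2f} GB/s")   # ~2.99 GB/s
print(f"8-bit channels:  {gb_per_s(1):.2f} GB/s")   # ~0.75 GB/s
# Both are a small fraction of the ~68 GB/s DDR3 peak quoted for the X1,
# which is the comparison the quoted post is making.
```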



ethomaz said:
Kynes said:
...

It doesn't work like that... you are working in an eSRAM framebuffer, so you need to go to DDR3 to read/write data while you work with it... on PS4 you already have that data in the main RAM... on Xbone you don't... you have additional read/write cycles not found on PS4... you are already using Xbone bandwidth for other things, like textures for example, and you need to move everything between main RAM and eSRAM... so adding these tasks on top will compromise things, and in the end devs will have to sacrifice something.

This won't work on Xbone for a native 1080p framebuffer... maybe 720p or 900p, which is not what people here are expecting.

As always, you are talking out of your ass. You don't know how it works, and you say whatever makes you feel better. Of course this is an incremental thing, as you have a bandwidth pool to work with, but you should take into consideration that "creating" the 960x1080 frame is tons more bandwidth intensive than reading/writing three 960x1080 textures.



A LIE is a LIE. I bashed Albert Penello for lying about the RYSE resolution. Now it's Guerrilla Games' turn.



Dark_Feanor said:

I love it when you start making things up.

You don't need the whole 4 frames stored anywhere to calculate the motion vectors, and you don't even need those pixels to have the full color range. You can also work with blobs and other image features rather than a per-pixel motion vector. XOne BW is more than enough to handle this situation, maybe not in exactly the same way GG used on the PS4, but a "temporal upscale" is not much more demanding than a "spatial upscale".

There is almost no cost to upscaling a framebuffer... any weak CPU does that with video on cellphones without trouble... this tech is demanding... it is not like a native 1080p render, but it is demanding.

And Xbone BW is a limitation for working with it, unless you weren't already using that BW to render the game lol.
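
(To illustrate the distinction being drawn: a plain spatial upscale of a 960x1080 buffer to 1920x1080 is a single cheap resampling pass, as in the toy sketch below using nearest-neighbour sampling; the temporal technique instead has to track and reproject pixels from previous frames, which is where the extra work comes from.)

```python
import numpy as np

# Made-up 960x1080 frame; a real scaler would use bilinear/bicubic instead of
# column duplication, but the cost is of the same trivial order.
low = np.random.randint(0, 256, (1080, 960, 3), dtype=np.uint8)
upscaled = np.repeat(low, 2, axis=1)   # duplicate every column: 960 -> 1920
print(upscaled.shape)                  # (1080, 1920, 3)
```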




I think it's a great technique and should be used in all graphically challenging games. It's too bad you can't quite categorize it. It's not really a lower resolution, since nothing is scaled up, and it's not quite half the frames either.
Looks to me like the perfect compromise between resolution and framerate.
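
(A minimal sketch of the idea as described publicly: each frame natively renders 960 of the 1920 columns at full 1080 height, and the other half is filled in by reprojecting the previously reconstructed frame along motion vectors. The interleave pattern, function names and zero-motion toy input are my assumptions for illustration, not Guerrilla's actual code.)

```python
import numpy as np

W_FULL, W_HALF, H = 1920, 960, 1080

def reconstruct(half_frame, prev_full, motion_x, even_phase):
    """Interleave this frame's 960 rendered columns with reprojected history.

    half_frame : (H, 960, 3)  columns actually rendered this frame
    prev_full  : (H, 1920, 3) last reconstructed 1920x1080 frame
    motion_x   : (H, 1920)    horizontal motion in pixels for each output pixel
    even_phase : True if the rendered columns are 0, 2, 4, ... this frame
    """
    out = np.empty((H, W_FULL, 3), dtype=half_frame.dtype)
    rendered_cols = np.arange(0, W_FULL, 2) if even_phase else np.arange(1, W_FULL, 2)
    missing_cols = np.arange(1, W_FULL, 2) if even_phase else np.arange(0, W_FULL, 2)

    out[:, rendered_cols] = half_frame                    # fresh pixels, rendered at 960x1080

    # For each missing column, look up where that pixel came from in the previous frame.
    rows = np.arange(H)[:, None]
    src_x = np.clip(missing_cols[None, :] - motion_x[:, missing_cols], 0, W_FULL - 1).astype(int)
    out[:, missing_cols] = prev_full[rows, src_x]         # reprojected history
    return out

# Toy usage with made-up data: a static scene (zero motion), even columns rendered this frame.
prev = np.zeros((H, W_FULL, 3), dtype=np.uint8)
half = np.full((H, W_HALF, 3), 200, dtype=np.uint8)
frame = reconstruct(half, prev, np.zeros((H, W_FULL)), even_phase=True)
print(frame.shape)   # (1080, 1920, 3)
```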




Kynes said:
ethomaz said:
...

As always, you are talking out of your ass.

No. I'm talking about what I have already worked on... you are assuming no other work is being done with the DDR3 while you have to deal with storing those past frames outside the eSRAM... a hint: while you are working on the current framebuffer, the GPU is already processing the next one using the DDR3 BW in parallel... it's not like the GPU is only doing the predictions for the current framebuffer or the post-processing and not working on the results for the next framebuffer... or like the DDR3 BW is not being used.

I can bet you this tech won't be used to reach 1080p on Xbone, because they can't.



ethomaz said:
Kynes said:
...

As always, you are talking out of your ass.

No. I'm talking about what I have already worked on... you are assuming no other work is being done with the DDR3 while you have to deal with storing those past frames outside the eSRAM... a hint: while you are working on the current framebuffer, the GPU is already processing the next one using the DDR3 BW in parallel... it's not like the GPU is only doing the predictions for the current framebuffer or the post-processing and not working on the results for the next framebuffer... or like the DDR3 BW is not being used.

I can bet you this tech won't be used to reach 1080p on Xbone, because they can't.

So you have been developing an XB1 game? Interesting.



Anybody gonna tell me what other game has used this technique? And why did they do it in the first place, if the PS4 should be able to handle native 1920x1080?



Kynes:

So you have been developing an XB1 game? Interesting.

No... you sound like you have never worked with GPUs and bandwidth on PC.

But of course... Xbone has a magical secret sauce that makes the impossible possible... I think you are reading too much of misterxmedia's blog.