Digital Foundry: PS4 KILLZONE uses 3GB RAM! (CONFIRMED)


About the memory I found something interesting...

Of the 4.6GB, at least 600MB is not meant to be used from memory... it is supposed to be streamed directly from the HDD (like dialogue), but Guerrilla is using an in-memory pool for it... that can change in the final build.

And about the CPU: the guys at B3D are saying that the devkit was running at 1.6GHz and Killzone was using 6 threads... that information came from analysing the profiler output.



Pemalite said:
You can tell they are getting GPU limited with the demo, given that it's only 30fps and the AA method used. (It's still early days, though, and 30fps is their framerate target anyway.)
Then, to make 30fps feel smoother, they added blurring techniques; this demo has a bit of blur on the objects and characters. It's not very pronounced in the video, but it is there.

If you look back at the Xbox 360 and PS3 launches, most games were in a similar situation: many ran at 1080p initially, but once developers started adding more effects and geometry, sacrifices had to be made, and a few years into the consoles' life one of those sacrifices became the resolution. Overall, graphics did look better though, so it was a trade-off that was well worth it.

Man, I wish it would come to PC though, so we could dial up the graphics, framerate and resolution even more. With that said, I'm actually looking forward to getting my hands on a PS4; a new Uncharted trilogy, perhaps?

Slightly off-topic, but have you grabbed Last Light yet? It will make those PC specs of yours work!

OT: It will be interesting to see what benefits are generated from all of that RAM, for both it and PC.




ethomaz said:

zarx said:

Wait isn't the extra RAM in dev kits usually taken up by the debugging and performance analysis tools? And surely it would be considered bad practice to use more RAM than final hardware for the game, especially when they go out of their way to say "no cheats".

Memory optimizations, compression and other stuff are not cheats...

They said in the slides they have not done memory optimizations yet... so they are using more than the final game will use.

Well, that is my guess from looking at the slides.

I know code running in debug mode uses more RAM than normal... they are running in debug mode with other tools attached to trace everything that happens while the game is running.

 

I meant that running 1.5GB over their presumed memory target would have been a cheat.



@TheVoxelman on twitter

Check out my hype threads: Cyberpunk, and The Witcher 3!

ethomaz said:

About the memory I found something interesting...

Of the 4.6GB, at least 600MB is not meant to be used from memory... it is supposed to be streamed directly from the HDD (like dialogue), but Guerrilla is using an in-memory pool for it... that can change in the final build.

And about the CPU: the guys at B3D are saying that the devkit was running at 1.6GHz and Killzone was using 6 threads... that information came from analysing the profiler output.


All streaming engines I know of use a pool in memory for streamed content; it's basically a cache for the in-use files. You can manually tweak the pool sizes in the ini files of Unreal Engine games on PC.
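To illustrate the idea, here is a minimal sketch of such a streaming pool: a byte-budgeted LRU cache keyed by file path. Everything here is hypothetical (the class name, the `load_file` callback standing in for an HDD read, the budget value); it is just the pattern being described, not any engine's actual implementation.

```python
from collections import OrderedDict

class StreamingPool:
    """Toy in-memory pool for streamed assets: an LRU cache with a byte budget.
    load_file is a stand-in for an actual disk read (path -> bytes)."""

    def __init__(self, budget_bytes, load_file):
        self.budget = budget_bytes
        self.load_file = load_file
        self.cache = OrderedDict()   # path -> data, in LRU order
        self.used = 0

    def get(self, path):
        if path in self.cache:
            self.cache.move_to_end(path)        # mark as most recently used
            return self.cache[path]
        data = self.load_file(path)             # "stream" it in from disk
        # Evict least-recently-used entries until the new file fits.
        while self.used + len(data) > self.budget and self.cache:
            _, old = self.cache.popitem(last=False)
            self.used -= len(old)
        self.cache[path] = data
        self.used += len(data)
        return data
```

Tweaking a pool size in an engine's ini file effectively adjusts the `budget_bytes` value here: a bigger pool means fewer evictions and re-reads from disk.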

I wonder if the PS4 has 2 cores locked for the OS and background tasks. They used one SPE on the PS3, but the PS4 is doing a lot more demanding stuff in the background.




If it looks that good using only 3GB... wow.



zarx said:
ethomaz said:

About the memory I found something interesting...

Of the 4.6GB, at least 600MB is not meant to be used from memory... it is supposed to be streamed directly from the HDD (like dialogue), but Guerrilla is using an in-memory pool for it... that can change in the final build.

And about the CPU: the guys at B3D are saying that the devkit was running at 1.6GHz and Killzone was using 6 threads... that information came from analysing the profiler output.


All streaming engines I know of use a pool in memory for streamed content; it's basically a cache for the in-use files. You can manually tweak the pool sizes in the ini files of Unreal Engine games on PC.

I wonder if the PS4 has 2 cores locked for the OS and background tasks. They used one SPE on the PS3, but the PS4 is doing a lot more demanding stuff in the background.

A lot of that demanding stuff is actually being done by a separate CPU (the video processor). A single Jaguar core will do more than an SPU, unless it's raw calculations, so one should be more than enough (though I'm talking out of my ass here).



ethomaz said:

The game is not optimized yet.

The CPU jobs are way more threaded than on the PS3... eight-core PC games will benefit from that too.

 


"No low-level CPU optimizations"

 

And most likely no game in the next gen will. Almost nobody uses low-level x86 code nowadays because compilers are so good. Intrinsics are OK, sure, and I think the guys from Naughty Dog / ICE have done a good job.

 

@Ethomaz:  eight-core PC games will benefit from that too.

Why? I don't see how this will also affect PC, because the API is built by ND and is used by an internal team working for Sony.



walsufnir said:

@Ethomaz:  eight-core PC games will benefit from that too.

Why? I don't see how this will also affect PC, because the API is built by ND and is used by an internal team working for Sony.


You reckon it will?

Take a look at any multi-platform game: the Xbox 360 had 6 available threads, and so did the PS3. Yet the majority of console ports were only dual-threaded, leaving me with 10 threads idle more often than not.
The PS4 and Xbox 720 (maybe) will only have 2 extra threads over the current-generation consoles... I'm just hoping developers don't take the lazy way out and throw everything onto a couple of threads for the PC version like they did this generation.
There are a few exceptions to that rule, like Battlefield 3, but that was built for PC and downscaled for the consoles.

In all honesty, I share Anandtech's sentiment: I wish console manufacturers would take CPU performance a little more seriously. I can understand why they don't, though; cost being a major reason.
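The complaint above is about work being pinned to one or two threads instead of fanned out across everything available. As a rough sketch of the fan-out pattern, here is per-entity game work spread over a worker pool. The function names are made up for illustration, and note that CPython threads won't speed up CPU-bound work because of the GIL; real engines use native job systems with dependencies between tasks.

```python
import os
from concurrent.futures import ThreadPoolExecutor

def simulate_entity(entity_id):
    # Stand-in for per-entity work (animation, AI, physics integration).
    return entity_id * 2

def update_frame(entity_ids, workers=None):
    """Fan per-entity updates out across worker threads, one pool
    sized to the hardware rather than a hard-coded two threads."""
    workers = workers or os.cpu_count()
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(simulate_entity, entity_ids))
```

The point is only the shape of the code: the worker count follows the hardware, so the same build uses 6 threads on one machine and 12 on another without any porting effort.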

CGI-Quality said:

Slightly off-topic, but have you grabbed Last Light yet? It will make those PC specs of yours work!

OT: It will be interesting to see what benefits are generated from all of that RAM, for both it and PC.


I haven't actually. Waiting for a Steam sale to send me broke and pick it up then. :P



Pemalite said:
walsufnir said:

@Ethomaz:  eight-core PC games will benefit from that too.

Why? I don't see how this will also affect PC, because the API is built by ND and is used by an internal team working for Sony.


You reckon it will?

Take a look at any multi-platform game: the Xbox 360 had 6 available threads, and so did the PS3. Yet the majority of console ports were only dual-threaded, leaving me with 10 threads idle more often than not.
The PS4 and Xbox 720 (maybe) will only have 2 extra threads over the current-generation consoles... I'm just hoping developers don't take the lazy way out and throw everything onto a couple of threads for the PC version like they did this generation.
There are a few exceptions to that rule, like Battlefield 3, but that was built for PC and downscaled for the consoles.

In all honesty, I share Anandtech's sentiment: I wish console manufacturers would take CPU performance a little more seriously. I can understand why they don't, though; cost being a major reason.


Where did you get the info about the 6 threads?

And yes, given that devs can now share even more code between consoles and PC, I think PC games will end up more threaded too. But the reason is not Killzone or GG, because they don't develop PC games; it's up to the devs of Frostbite and other multiplatform engines to do so. Also keep in mind that 2 threads of a Core i7 could easily outperform the 8-thread Jaguar ;) But either way: even if you use 100 threads, that does not mean they are being used properly. If you are hoping for better AI, I think you will mostly be disappointed.
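The "100 threads doesn't mean they are used properly" point is essentially Amdahl's law: the serial portion of the work caps the speedup no matter how many threads you throw at it. A quick illustration:

```python
def amdahl_speedup(parallel_fraction, n_threads):
    """Amdahl's law: overall speedup from n threads when only
    parallel_fraction of the work can actually run in parallel."""
    p = parallel_fraction
    return 1.0 / ((1.0 - p) + p / n_threads)
```

With 70% of a frame parallelizable, even 100 threads give barely over a 3x speedup; the remaining 30% of serial work dominates. That is why better AI doesn't fall out of more cores automatically.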



bananaking21 said:
dahuman said:
ethomaz said:
dahuman said:
You generally don't want MSAA these days though; HQ FXAA/MLAA would be a good choice anyway.

Because MSAA uses more resources... maybe they will use it again with PS4 games.


Yeah, I think it's fine to use it for first-gen PS4 games, but later down the line they can put that RAM toward other things to save resources, since MSAA uses a lot of RAM and bandwidth (depending on the engine) without most people even noticing the difference. It would be a really good and smart trade for future engines.
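To put a rough number on the RAM cost being discussed, here is a back-of-envelope calculation for a multisampled color buffer. It assumes an uncompressed RGBA8 target and ignores the depth buffer and any hardware color compression, so real costs differ per GPU and engine; it only shows why sample count multiplies memory use.

```python
def msaa_color_buffer_mb(width, height, samples, bytes_per_pixel=4):
    """Rough size in MiB of an uncompressed multisampled RGBA8
    color buffer: every pixel stores one value per sample."""
    return width * height * samples * bytes_per_pixel / (1024 ** 2)
```

At 1080p, going from no MSAA (~7.9 MiB) to 4x MSAA (~31.6 MiB) quadruples the color buffer alone, and the depth buffer scales the same way, which is exactly the kind of budget a console developer might rather spend elsewhere.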


An off-topic question, but since you know what you're talking about, I'd like to ask: on PC, it seems FXAA uses fewer resources than MSAA? Should I just use that?


I'd just use it personally; the performance hit is much lower overall and the quality is quite good these days.