Mystro-Sama said:
Michelasso said:
Lortsamler said:
Michelasso said:
Lortsamler said:
I've got a 200 Mb/s connection, and that's equal to about 25 MB/s of bandwidth. How this cloud computing is supposed to work is beyond me. The only thing I can think of that would work is post-processing. I need to see this before I believe it.


It's remote processing, and it is quite old technology in IT terms. Sun Microsystems implemented RPC (Remote Procedure Calls) in the 1980s. Simply put, developers have a set of APIs where, when they call a subroutine, the execution can be requested on a remote server. So in parallel programming one can send some jobs off to a bunch of servers, freeing the CPU to do other tasks.
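As a rough illustration of what an RPC looks like from the developer's side, here is a minimal sketch using Python's standard xmlrpc module; the host, port, and heavy_job function are placeholders for this thread, not anything MS or Azure actually uses.

```python
# --- server side (the "cloud") ---
from xmlrpc.server import SimpleXMLRPCServer

def heavy_job(data):
    # stand-in for an expensive computation done on the remote server
    return sum(ord(c) for c in data)

server = SimpleXMLRPCServer(("0.0.0.0", 8000), allow_none=True)
server.register_function(heavy_job, "heavy_job")
server.serve_forever()

# --- client side (the console), run separately ---
# import xmlrpc.client
# proxy = xmlrpc.client.ServerProxy("http://example-cloud-host:8000")
# result = proxy.heavy_job("some payload")  # blocks until the server answers
```

From the caller's point of view it looks like an ordinary subroutine call; the network transfer happens underneath.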

Both latency and bandwidth are relevant, with the latter affecting the total "computation time" more the larger the chunks of data being sent are. But for example, one could use RPCs to have a game like Watch Dogs decrypt some user-defined keys: the amount of data would be negligible, but the computation time could be exceptionally high.
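A back-of-the-envelope way to see why payload size matters: total transfer time is roughly latency plus payload size over bandwidth. The latency and link speed below are assumptions for illustration only.

```python
def transfer_time(payload_bytes, latency_s=0.05, bandwidth_bps=200e6):
    """Rough model: one-way latency plus time to push the bytes over the link."""
    return latency_s + payload_bytes * 8 / bandwidth_bps

# A tiny key to decrypt: latency dominates, bandwidth is irrelevant.
print(transfer_time(64))          # ~0.05 s
# A large chunk of physics state: bandwidth starts to matter.
print(transfer_time(5_000_000))   # ~0.25 s on a 200 Mb/s link
```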

In Crackdown's case the bandwidth should matter, because the "Cloud" (the remote servers) has to send back the shape and position of each piece. The lower the bandwidth, the longer it would take for a building to explode, or the fewer pieces can be computed. It may also happen that in the real environment (the Internet) some pieces that don't arrive in time get discarded. Another implementation would be deferred computation: after the explosion is triggered, the remote servers compute its whole dynamic and send it as a stream to the client (the console). For each frame the client then picks up the relevant data (already stored) and does the final rendering, taking the point of view and other factors into account.
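A sketch of that deferred-computation idea, with made-up data structures: the servers stream debris transforms for future frames, the client buffers them, and at render time it only looks up the chunk for the current frame, dropping anything that arrived too late.

```python
from collections import defaultdict

# frame index -> list of (piece_id, position, rotation) tuples sent by the cloud
debris_stream = defaultdict(list)

def on_chunk_received(frame_index, pieces, current_frame):
    """Called by the network layer. Chunks that arrive too late are discarded."""
    if frame_index <= current_frame:
        return
    debris_stream[frame_index].extend(pieces)

def render_frame(current_frame):
    # The client only does the final rendering from its point of view;
    # the rigid-body simulation itself happened on the remote servers.
    for piece_id, position, rotation in debris_stream.pop(current_frame, []):
        draw_piece(piece_id, position, rotation)  # hypothetical renderer call

def draw_piece(piece_id, position, rotation):
    pass  # placeholder for the actual renderer
```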

Still, there is a lot of smoke and mirrors. "20x the power of the XB1" means nothing. Is MS referring to the CPU (the one usually computing this kind of physics) or the GPU? In the first case it would be about 2 TFLOPS per explosion; in the latter, about 26 TFLOPS. I highly doubt that kind of effect takes so much.
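For what it's worth, the "20x" arithmetic works out roughly like this, assuming the commonly cited ~1.31 TFLOPS for the XB1 GPU and a ballpark ~0.1 TFLOPS for its Jaguar CPU (the CPU figure especially is an estimate):

```python
xb1_gpu_tflops = 1.31   # commonly cited figure
xb1_cpu_tflops = 0.10   # rough ballpark for the 8-core Jaguar CPU

print(20 * xb1_cpu_tflops)  # ~2 TFLOPS if "20x" means the CPU
print(20 * xb1_gpu_tflops)  # ~26 TFLOPS if it means the GPU
```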

Also, this is not the "Cloud power" MS promised at the reveal, where the Cloud can help with rendering. That one simply isn't possible: the total latency over the Internet is so high that by the time the data reaches the client, the console is already computing other frames.

I'm thinking there must be some procedural generation techniques involved, because I can imagine that would work.


"No Man's Sky" indeed could use the Cloud to generate the worlds. But as far as I understood  (with my surprise) that is not the case. It's probably like in Mineraft where each single seed generates its own world which is the same for everyone for that seed. And the world generation is computed in game.

Still, it doesn't really matter. Communication performance apart, from a theoretical point of view any algorithm that can run on the local host can also be implemented using the Cloud. The cloud servers can be seen as extra CPU cores dedicated to specific tasks. But the problem is indeed the time it takes to get the data back from the remote servers to the "hub". If receiving the data takes longer than computing it locally would, it's just a waste. This greatly reduces the number of possible applications, especially real-time applications like video games.
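That argument can be written as a simple break-even rule; the latency, bandwidth, and compute times below are placeholders you would have to measure for a real game.

```python
def worth_offloading(payload_bytes, result_bytes,
                     local_compute_s, remote_compute_s,
                     latency_s=0.05, bandwidth_bps=200e6):
    """Offload only if the round trip plus remote compute beats doing it locally."""
    transfer_s = 2 * latency_s + (payload_bytes + result_bytes) * 8 / bandwidth_bps
    return transfer_s + remote_compute_s < local_compute_s

# A long-running physics job is worth shipping out...
print(worth_offloading(10_000, 100_000, local_compute_s=2.0, remote_compute_s=0.1))    # True
# ...but per-frame work with a ~16 ms budget is not.
print(worth_offloading(10_000, 100_000, local_compute_s=0.016, remote_compute_s=0.001))  # False
```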

Then let's not forget about the cost. Even if it is internal "phoney money", XBL must pay Azure for the resources used.

No Man's Sky doesn't need the cloud. Everything is generated around you within a certain proximity, and more things are generated as you move. They have a video explaining it.

We know that No Man's Sky doesn't use the cloud; we were only discussing what type of code might benefit from the cloud.