Azzanation said:

MS was quite clear the first time about Cloud computing, when they said Crackdown 3 would use it. So instead of trying to downplay what you don't understand, wait for Crackdown 3, which had a playable demo at Gamescom not even a couple of weeks ago. This tech has just been proven to work; now all we need is a game designed around it, which is coming soon.


No, MS stated at the Xbone reveal that the Cloud would give the Xbone three times its power, and reiterated that at the Japanese presentation:

http://www.dualshockers.com/2014/04/26/xbox-one-microsoft-claims-cloud-computing-can-provide-power-of-3-xbox-ones-32-xbox-360s/

"combining the the local machine with the cloud, the computational juice of roughly three Xbox ones units could be used"

The Cloud, as I wrote before but you clearly didn't read, can theoretically make available to the client (in batch processes) even thousands of times its power. This is nothing new; in fact it is so old that it is lame to brag about it. I'll give you a simple but very old example: dumb X terminals (thus less powerful than modern televisions) running X11 sessions. The computation is all done on the remote machine(s), and the terminal is effectively as powerful as they are. I used this myself in 1990, when we needed to run in the USA a program we had developed in Italy. Just imagine the pain of working at 1 frame per minute.

But since I know this answer doesn't satisfy you, because it sounds more like Cloud streaming, I'll repeat the example I made before. One could use the Cloud in a game like Watch Dogs to decrypt strings created by the users. For each key there would be a connection to a supercomputer (or a distributed algorithm) in the Cloud. There: thousands, if not millions, of times the power of the client, depending on what is available (the lower the power available, the longer it takes to decrypt the string). Or take a chess game, where the opponent runs on a supercomputer instead of the local host.
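To make that concrete, here is a minimal sketch of the offload pattern in Python. A local thread pool stands in for the remote supercomputer, and the hash target and key space are purely illustrative; the point is only the shape of the exchange, not a real service API:

```python
import concurrent.futures
import hashlib
import time

def crack_key(target_hash, max_key=1_000_000):
    """Brute-force the numeric key behind a known SHA-256 hash.
    On a real service this loop would run in the cloud; here it just
    stands in for the expensive remote computation."""
    for key in range(max_key):
        if hashlib.sha256(str(key).encode()).hexdigest() == target_hash:
            return key
    return None

def main():
    # The "encrypted string" picked up in game (illustrative value).
    target = hashlib.sha256(b"424242").hexdigest()

    # The thread pool plays the role of the Cloud: the game thread submits
    # the job and is never blocked by it.
    with concurrent.futures.ThreadPoolExecutor() as cloud:
        job = cloud.submit(crack_key, target)

        while not job.done():
            time.sleep(0.05)  # the local game loop keeps running meanwhile
            print("frame rendered, still waiting for the cloud...")

        print("decrypted key:", job.result())

if __name__ == "__main__":
    main()
```

The heavy work runs elsewhere; the client just keeps rendering and picks up the answer when it arrives.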

Crackdown 3 isn't much different. Trigger the explosion in game, send the corresponding data to the Cloud, and have the Cloud compute and stream back the result of the explosion simulation for each frame (numbered or otherwise). The game (maybe after some buffering "just in case", to ride out latency spikes) picks up the data for the first frame, already stored locally, and starts rendering, then proceeds with the following frames while others may still be arriving. That's a possible framework that would work.
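As a rough sketch of that framework (again Python, with a background thread standing in for the Cloud servers; the frame count, debris data and buffer size are made up for illustration):

```python
import queue
import random
import threading
import time

FRAME_BUFFER = 5   # frames stored locally before rendering starts ("just in case")
TOTAL_FRAMES = 60  # length of the explosion simulation, purely illustrative

def cloud_simulation(out):
    """Stand-in for the Cloud: computes each frame of the explosion and
    streams the numbered result back with variable latency."""
    for frame_no in range(TOTAL_FRAMES):
        time.sleep(random.uniform(0.005, 0.03))      # compute + network jitter
        debris = [(frame_no * 0.1, frame_no * 0.2)]  # placeholder physics result
        out.put((frame_no, debris))
    out.put(None)  # end-of-stream marker

def client():
    incoming = queue.Queue()
    threading.Thread(target=cloud_simulation, args=(incoming,), daemon=True).start()

    # Buffer a few frames before rendering, to ride out latency spikes.
    buffered = [incoming.get() for _ in range(FRAME_BUFFER)]

    while True:
        item = buffered.pop(0) if buffered else incoming.get()
        if item is None:
            break
        frame_no, debris = item
        print(f"rendering frame {frame_no} with {len(debris)} debris chunks")

if __name__ == "__main__":
    client()
```

The renderer only ever touches data that has already arrived; how much you buffer decides how much latency you can hide.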

Techniques like this do not need to be proven to work; they have been working practically forever in IT terms. The main difference is that running them for hundreds of thousands of consoles at the same time is costly. And if it works fine immediately, with all the servers under stress, that will be an achievement in itself. Quantity does matter in this case; just look at the many MP games broken at launch.

And for your reference, I do understand what I am talking about.