
Forums - Microsoft Discussion - What do you think of Crackdown's Cloud Computing?

Mystro-Sama said:

Don't get me wrong, Crackdown 3 looks fantastic so far and the physics with the cloud is amazing, but I'm not sure how great it will be in the long run. Servers cost money to run and they can't run forever, so I wonder what will be left of this game once the servers turn off, or if I go to my house in the country without internet? Will there possibly be 2 modes to this game? One that gives you the full experience with the cloud, and another that only uses the X1, where a few things can't be destroyed?


From what I have heard, they are keeping the 100% destructible environments for online multiplayer only. The single-player campaign can run offline and will not feature this mode. So if MS happened to go bankrupt and shut the servers down, you could still play the single-player mode. Just like every other game that offers both single-player and multiplayer modes.

The way these consoles are going, being without an online connection isn't wise.



Michelasso said:
Lortsamler said:

I agree with all of this. That's why I am questioning this claim from Microsoft.


Ah, don't get me wrong. I thought you were asking how it worked. The original MS claim at the Xbone reveal that the Cloud will give the Xbone 3 times more power is pure bullcrap indeed. The Cloud can give even thousands of times more power in batch jobs (like in the decryption example), but in real time? Not a chance. Internet latency kills it all. It's even paradoxical: more (total) computing power would also mean a higher frame rate, but the higher the frame rate, the smaller the time window to produce a frame, and the smaller that window, the higher the chances the client has to skip the remotely computed data. And Internet latency has a hard constraint: the speed of light. Packets can't travel a geographical distance faster than that (about 1 ms per 150 km / 100 miles, round trip). And we know that in real life the Internet is quite a bit slower.

Quote: *The original MS claim at the Xbone reveal that the Cloud will give the Xbone 3 times more power is pure bullcrap indeed.*

MS just showed the world a demo where it was using 20x the X1's power to achieve these physics.

And you're claiming MS spoke BS when they stated 3 times the X1's power? I don't understand your point at all.

https://www.youtube.com/watch?v=s1b97RAZUBE

https://www.youtube.com/watch?v=OnzHNXFZ768&feature=youtu.be

 

Who do we listen to: an in-game demo of an upcoming game, or just assumptions from fans? The in-game demo just put the deniers to shame.



Azzanation said:
Michelasso said:
Lortsamler said:

I agree with all of this. That's why I am questioning this claim from Microsoft.


Ah, don't get me wrong. I thought you were asking how it worked. The original MS claim at the Xbone reveal that the Cloud will give the Xbone 3 times more power is pure bullcrap indeed. The Cloud can give even thousands of times more power in batch jobs (like in the decryption example), but in real time? Not a chance. Internet latency kills it all. It's even paradoxical: more (total) computing power would also mean a higher frame rate, but the higher the frame rate, the smaller the time window to produce a frame, and the smaller that window, the higher the chances the client has to skip the remotely computed data. And Internet latency has a hard constraint: the speed of light. Packets can't travel a geographical distance faster than that (about 1 ms per 150 km / 100 miles, round trip). And we know that in real life the Internet is quite a bit slower.

Quote: *The original MS claim at the Xbone reveal that the Cloud will give the Xbone 3 times more power is pure bullcrap indeed.*

MS just showed the world a demo where it was using 20x the X1's power to achieve these physics.

And you're claiming MS spoke BS when they stated 3 times the X1's power? I don't understand your point at all.

https://www.youtube.com/watch?v=s1b97RAZUBE

https://www.youtube.com/watch?v=OnzHNXFZ768&feature=youtu.be

 

 

Who do we listen to: an in-game demo of an upcoming game, or just assumptions from fans? The in-game demo just put the deniers to shame.

What demo? The one on PC, still about explosions, made in a controlled environment using Ethernet/fiber-optic connections? That's the definition of a computer farm, not Cloud computing. Yes, computer farms can multiply the available real-time power. I have known this since 1990, when I worked on exactly that. Even so, the efficiency of that parallel system, which used a SCSI bus as its communication channel, deteriorated quite quickly. The CPUs were obviously inferior to anything available right now, but the SCSI latency was still much lower than that of any remote Internet connection.

Or are you talking about Crackdown 3? Those explosions are deferred computations; they are not in real time. The Cloud can't help improve the computation within a frame, which is what MS meant when they said "the Cloud will make the XB1 3x more powerful" (or whatever the full sentence was). Bollocks. That would mean the same game running locally at 30 fps could easily reach 90-120 fps via Cloud computing. I invite MS to show us how that would be possible, because I have been waiting forever.



It's multiplayer only, so it's great imo, since you'd need a decent connection in the first place anyway.



Michelasso said:
Azzanation said:
Michelasso said:
Lortsamler said:

I agree with all of this. That's why I am questioning this claim from Microsoft.


Ah, don't get me wrong. I thought you were asking how it worked. The original MS claim at the Xbone reveal that the Cloud will give the Xbone 3 times more power is pure bullcrap indeed. The Cloud can give even thousands of times more power in batch jobs (like in the decryption example), but in real time? Not a chance. Internet latency kills it all. It's even paradoxical: more (total) computing power would also mean a higher frame rate, but the higher the frame rate, the smaller the time window to produce a frame, and the smaller that window, the higher the chances the client has to skip the remotely computed data. And Internet latency has a hard constraint: the speed of light. Packets can't travel a geographical distance faster than that (about 1 ms per 150 km / 100 miles, round trip). And we know that in real life the Internet is quite a bit slower.

Quote: *The original MS claim at the Xbone reveal that the Cloud will give the Xbone 3 times more power is pure bullcrap indeed.*

MS just showed the world a demo where it was using 20x the X1's power to achieve these physics.

And you're claiming MS spoke BS when they stated 3 times the X1's power? I don't understand your point at all.

https://www.youtube.com/watch?v=s1b97RAZUBE

https://www.youtube.com/watch?v=OnzHNXFZ768&feature=youtu.be

 

 

Who do we listen to: an in-game demo of an upcoming game, or just assumptions from fans? The in-game demo just put the deniers to shame.

What demo? The one on PC, still about explosions, made in a controlled environment using Ethernet/fiber-optic connections? That's the definition of a computer farm, not Cloud computing. Yes, computer farms can multiply the available real-time power. I have known this since 1990, when I worked on exactly that. Even so, the efficiency of that parallel system, which used a SCSI bus as its communication channel, deteriorated quite quickly. The CPUs were obviously inferior to anything available right now, but the SCSI latency was still much lower than that of any remote Internet connection.

Or are you talking about Crackdown 3? Those explosions are deferred computations; they are not in real time. The Cloud can't help improve the computation within a frame, which is what MS meant when they said "the Cloud will make the XB1 3x more powerful" (or whatever the full sentence was). Bollocks. That would mean the same game running locally at 30 fps could easily reach 90-120 fps via Cloud computing. I invite MS to show us how that would be possible, because I have been waiting forever.

MS stated it quite clearly the first time they talked about Cloud computing, when they mentioned that Crackdown 3 would use it. So instead of trying to downplay what you don't understand, wait for Crackdown 3, which had a playable demo at Gamescom not even a couple of weeks ago. This tech has just been proven to work; now all we need is a game designed around it, which is coming soon.



Azzanation said:

MS stated it quite clearly the first time they talked about Cloud computing, when they mentioned that Crackdown 3 would use it. So instead of trying to downplay what you don't understand, wait for Crackdown 3, which had a playable demo at Gamescom not even a couple of weeks ago. This tech has just been proven to work; now all we need is a game designed around it, which is coming soon.


No, MS stated at the Xbone reveal that the Cloud will give the Xbone 3x its power, and reiterated that at the Japanese presentation:

http://www.dualshockers.com/2014/04/26/xbox-one-microsoft-claims-cloud-computing-can-provide-power-of-3-xbox-ones-32-xbox-360s/

"combining the the local machine with the cloud, the computational juice of roughly three Xbox ones units could be used"

The Cloud, as I wrote before but you clearly didn't read, can theoretically make available to the client (in batch processes) even thousands of times its power. This is nothing new; actually, it is so old that it is lame to brag about it. I'll give you a simple but very old example: dumb X terminals (thus less powerful than modern televisions) running X11 sessions. The computation is all done on the remote machine(s), and the session is as powerful as they are. I used this back in 1990, when we needed to run in the USA a program we had developed in Italy. Just imagine the pain of working at 1 frame per minute.

But since I know this answer doesn't satisfy you, because it sounds more like Cloud streaming, I'll repeat the example I made before. One could use the Cloud in a game like Watch Dogs to decrypt strings made by the users. For each key there would be a connection to a supercomputer (or a distributed algorithm) in the Cloud. There: thousands if not millions of times the power of the client, depending on what is available (the less power available, the longer it takes to decrypt the string). Or take a chess game, where the opponent runs on a supercomputer instead of the local host.

Crackdown 3 isn't much different. Trigger the explosion in game, send the corresponding data to the Cloud, and have the Cloud compute and stream back the result of the explosion simulation for each frame (numbered or otherwise). The game (maybe after some buffering, "just in case", to ride out latency spikes) picks up the data for the first frame, already stored locally, and starts rendering. Then it proceeds with the following frames, while others may still be arriving. That's a possible framework that would work.
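That buffered, deferred framework can be sketched in a few lines. This is purely illustrative (a toy producer thread stands in for the remote servers; all names and timings are invented, not Crackdown 3's actual code):

```python
# Toy sketch of deferred cloud simulation: a producer thread stands in for the
# remote servers streaming precomputed frames; the client buffers a few frames
# "just in case" before it starts rendering, while later frames keep arriving.
import queue
import threading
import time

def cloud_simulation(frames_out, n_frames):
    # Pretend remote solver: each frame is just a list of debris positions.
    for f in range(n_frames):
        time.sleep(0.001)                      # network + compute delay per frame
        frames_out.put({"frame": f, "debris": [(f, f * 0.5)]})

def client_render(frames_in, n_frames, prebuffer=3):
    rendered = []
    while frames_in.qsize() < prebuffer:       # buffer before starting playback
        time.sleep(0.001)
    for _ in range(n_frames):
        frame = frames_in.get()                # later frames may still be arriving
        rendered.append(frame["frame"])
    return rendered

buf = queue.Queue()
threading.Thread(target=cloud_simulation, args=(buf, 10), daemon=True).start()
print(client_render(buf, 10))                  # frames come out in order: 0..9
```

The key point is that the console never waits on a per-frame round trip; it only plays back a stream that was computed ahead of real time.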

Techniques like this do not need to be proven to work; they have been working practically forever in IT terms. The main difference is that running them for hundreds of thousands of consoles at the same time is costly. And if it works fine immediately, with all the servers under stress, that will be an achievement. Quantity does matter in this case. Just look at the many MP games broken at launch.

And for your reference, I do understand what I am talking about.

 



Michelasso said:
Azzanation said:

MS stated it quite clearly the first time they talked about Cloud computing, when they mentioned that Crackdown 3 would use it. So instead of trying to downplay what you don't understand, wait for Crackdown 3, which had a playable demo at Gamescom not even a couple of weeks ago. This tech has just been proven to work; now all we need is a game designed around it, which is coming soon.


No, MS stated at the Xbone reveal that the Cloud will give the Xbone 3x its power, and reiterated that at the Japanese presentation:

http://www.dualshockers.com/2014/04/26/xbox-one-microsoft-claims-cloud-computing-can-provide-power-of-3-xbox-ones-32-xbox-360s/

"combining the the local machine with the cloud, the computational juice of roughly three Xbox ones units could be used"

The Cloud, as I wrote before but you clearly didn't read, can theoretically make available to the client (in batch processes) even thousands of times its power. This is nothing new; actually, it is so old that it is lame to brag about it. I'll give you a simple but very old example: dumb X terminals (thus less powerful than modern televisions) running X11 sessions. The computation is all done on the remote machine(s), and the session is as powerful as they are. I used this back in 1990, when we needed to run in the USA a program we had developed in Italy. Just imagine the pain of working at 1 frame per minute.

But since I know this answer doesn't satisfy you, because it sounds more like Cloud streaming, I'll repeat the example I made before. One could use the Cloud in a game like Watch Dogs to decrypt strings made by the users. For each key there would be a connection to a supercomputer (or a distributed algorithm) in the Cloud. There: thousands if not millions of times the power of the client, depending on what is available (the less power available, the longer it takes to decrypt the string). Or take a chess game, where the opponent runs on a supercomputer instead of the local host.

Crackdown 3 isn't much different. Trigger the explosion in game, send the corresponding data to the Cloud, and have the Cloud compute and stream back the result of the explosion simulation for each frame (numbered or otherwise). The game (maybe after some buffering, "just in case", to ride out latency spikes) picks up the data for the first frame, already stored locally, and starts rendering. Then it proceeds with the following frames, while others may still be arriving. That's a possible framework that would work.

Techniques like this do not need to be proven to work; they have been working practically forever in IT terms. The main difference is that running them for hundreds of thousands of consoles at the same time is costly. And if it works fine immediately, with all the servers under stress, that will be an achievement. Quantity does matter in this case. Just look at the many MP games broken at launch.

And for your reference, I do understand what I am talking about.

 

Yep, this. The destruction is basically scripted from the console's point of view: a script remotely generated on the fly and downloaded to the console to be rendered locally. The console is not calling remote functions; the multiplayer game plays out on dedicated virtual servers. It's not that different from WoW in that respect, except that there are far more moving things and more dynamic geometry in view.

I expect it to be a combination of a script and local tracking of objects. I doubt the server will send data for every piece every frame. Rather, it will send simple commands: delete object A; add objects X, Y, Z with geometry, position, and velocity info; change the direction and rotation of object B to ...; and only update the pieces involved in a collision. Meanwhile the console can track the falling pieces with simple gravity rules.
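A command-style protocol like that could look something like this sketch (command names and fields are invented for illustration; the console applies small server deltas and integrates simple gravity itself between updates):

```python
# Sketch of a command-based update protocol: the server sends small deltas,
# the client keeps the authoritative scene state locally.
def apply_command(scene, cmd):
    op = cmd["op"]
    if op == "add":
        scene[cmd["id"]] = {"pos": cmd["pos"], "vel": cmd["vel"]}
    elif op == "delete":
        scene.pop(cmd["id"], None)
    elif op == "set_velocity":                 # e.g. after a server-side collision
        scene[cmd["id"]]["vel"] = cmd["vel"]
    return scene

def local_gravity_step(scene, dt=1 / 30, g=-9.8):
    # Between server updates the console integrates simple ballistic motion itself.
    for obj in scene.values():
        vx, vy = obj["vel"]
        x, y = obj["pos"]
        vy += g * dt
        obj["vel"] = (vx, vy)
        obj["pos"] = (x + vx * dt, y + vy * dt)
    return scene

scene = {}
apply_command(scene, {"op": "add", "id": "A", "pos": (0.0, 10.0), "vel": (1.0, 0.0)})
local_gravity_step(scene)                      # piece "A" starts falling locally
apply_command(scene, {"op": "delete", "id": "A"})
```

The bandwidth win comes from sending only the deltas for pieces whose trajectory actually changed, while everything in free fall is predicted locally.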


The two big problems are latency and bandwidth.
Latency buggers up real-time improvements, so forget about cloud-powered VR devices.
Bandwidth limitations bugger up big dynamic environments, so forget about cloud-powered realistic water simulations.
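A quick back-of-envelope calculation shows why latency rules out real-time help (the round-trip figures are illustrative, not measured):

```python
# Frame budget vs. network round trip: if the round trip exceeds the frame
# budget, remotely computed per-frame data arrives too late to be used.
def frames_missed(fps, round_trip_ms):
    frame_budget_ms = 1000.0 / fps
    return round_trip_ms / frame_budget_ms    # frames that pass before the reply

print(frames_missed(30, 33.3))   # ~1 full frame late, even on a good connection
print(frames_missed(60, 33.3))   # ~2 frames late at 60 fps
print(frames_missed(60, 2.0))    # LAN / "computer farm" latency: within the frame
```

Note how the same 33 ms round trip gets worse, not better, as the frame rate rises, which is exactly the paradox pointed out earlier in the thread.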


Saying the cloud makes the Xbox One more powerful is just marketing speak. Get WoW and your PC is now 75,000 times as powerful!
http://techcrunch.com/2009/09/18/blizzard-reveals-some-technical-data-about-world-of-warcraft/



Michelasso said:
Azzanation said:

MS stated it quite clearly the first time they talked about Cloud computing, when they mentioned that Crackdown 3 would use it. So instead of trying to downplay what you don't understand, wait for Crackdown 3, which had a playable demo at Gamescom not even a couple of weeks ago. This tech has just been proven to work; now all we need is a game designed around it, which is coming soon.


No, MS stated at the Xbone reveal that the Cloud will give the Xbone 3x its power, and reiterated that at the Japanese presentation:

http://www.dualshockers.com/2014/04/26/xbox-one-microsoft-claims-cloud-computing-can-provide-power-of-3-xbox-ones-32-xbox-360s/

"combining the the local machine with the cloud, the computational juice of roughly three Xbox ones units could be used"

The Cloud, as I wrote before but you clearly didn't read, can theoretically make available to the client (in batch processes) even thousands of times its power. This is nothing new; actually, it is so old that it is lame to brag about it. I'll give you a simple but very old example: dumb X terminals (thus less powerful than modern televisions) running X11 sessions. The computation is all done on the remote machine(s), and the session is as powerful as they are. I used this back in 1990, when we needed to run in the USA a program we had developed in Italy. Just imagine the pain of working at 1 frame per minute.

But since I know this answer doesn't satisfy you, because it sounds more like Cloud streaming, I'll repeat the example I made before. One could use the Cloud in a game like Watch Dogs to decrypt strings made by the users. For each key there would be a connection to a supercomputer (or a distributed algorithm) in the Cloud. There: thousands if not millions of times the power of the client, depending on what is available (the less power available, the longer it takes to decrypt the string). Or take a chess game, where the opponent runs on a supercomputer instead of the local host.

Crackdown 3 isn't much different. Trigger the explosion in game, send the corresponding data to the Cloud, and have the Cloud compute and stream back the result of the explosion simulation for each frame (numbered or otherwise). The game (maybe after some buffering, "just in case", to ride out latency spikes) picks up the data for the first frame, already stored locally, and starts rendering. Then it proceeds with the following frames, while others may still be arriving. That's a possible framework that would work.

Techniques like this do not need to be proven to work; they have been working practically forever in IT terms. The main difference is that running them for hundreds of thousands of consoles at the same time is costly. And if it works fine immediately, with all the servers under stress, that will be an achievement. Quantity does matter in this case. Just look at the many MP games broken at launch.

And for your reference, I do understand what I am talking about.

 

There's no point in trying to justify your point; I am not disagreeing or agreeing with your post. However, MS stated 3x the power of the X1, which they have proven. MS have the server structure, and it looks like they're trying to implement it in their gaming division. Regardless, MS haven't lied; what they stated has so far been true. Those who want to shoot it down can, but the problem with listening to them is: what if they're wrong? In this case it's looking like the truth is finally coming out, and the next-gen wave of online games might be around the corner.

From what I have seen from Gamescom, there is proof that what they stated in 2013 is more true than false. It's amazing technology, and Sony or Nintendo probably couldn't afford to do this.

You have to take risks to break the boundaries of reality. You will be made to look like a fool, but if you're successful, you will have made the world look just as foolish. The greatest scientists in the world have visions that sound stupid; however, it's their visions that continue to advance mankind.

What MS is doing is unique and risky, but I like what I see, and they're the only ones pushing it.



The misinformation and assumptions in this thread are funny, as many are incorrect. Azure, or 'the Cloud', will use up to 11 servers to render the structure of the city, which is large and destructible. Then your local machine will decompress the data, like it does when you watch a movie off YouTube or Netflix, and apply your perspective and other details. It doesn't use that much bandwidth: 2-4 Mbps down and 1 Mbps up.
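As a sanity check on those bandwidth figures, here is the arithmetic (the 32 bytes per object update is an assumed figure for illustration, not a published number):

```python
# How many small object updates fit in a 2 Mbit/s downstream at 30 fps?
def updates_per_frame(mbit_per_s, fps, bytes_per_update):
    bytes_per_frame = (mbit_per_s * 1_000_000 / 8) / fps
    return int(bytes_per_frame // bytes_per_update)

# Assume ~32 bytes per update (id + position + rotation, quantized).
print(updates_per_frame(2, 30, 32))   # roughly 260 updates per frame
```

So a couple of Mbit/s is plausibly enough for a few hundred changed pieces per frame, as long as the rest of the debris is simulated locally.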

Here is a demo from a few years ago of what Microsoft and Nvidia were working on together.



 

I'm really not sure I see any point of consoles over PCs, since Kinect, Wii, and other alternative ways to play have been abandoned.

Top 50 'most fun' game list coming soon!

 

Tell me a funny joke!

Michelasso said:
Lortsamler said:
Michelasso said:
Lortsamler said:
I've got a 200 Mb connection, and that's equal to 25 MB/s of memory bandwidth. How this cloud computing is supposed to work is beyond me. The only thing I can think of that will work is post-processing. I need to see this before I believe it.


It's remote processing. And it is quite old technology in IT terms. Sun Microsystems implemented RPC (Remote Procedure Calls) in the 1980s. Simply put, developers have a set of APIs where, when they call a subroutine, the execution can be requested on a remote server. So in parallel programming one can send the execution of some jobs to a bunch of servers, freeing the CPU to do other tasks.
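The RPC pattern described above can be demonstrated with Python's standard-library XML-RPC modules. This is a minimal sketch: the "remote" server runs in a background thread on localhost, whereas a real deployment would put it on another machine, where network latency dominates (the `simulate_explosion` job is an invented stand-in for an expensive physics computation):

```python
# Minimal RPC sketch: a "remote" server does the heavy work on the client's behalf.
import threading
from xmlrpc.server import SimpleXMLRPCServer
from xmlrpc.client import ServerProxy

def simulate_explosion(pieces):
    # Stand-in for an expensive physics job: a fake trajectory per piece.
    return [{"id": i, "x": i * 0.5, "y": i * i * 0.1} for i in range(pieces)]

# Server side: register the procedure and serve in the background.
server = SimpleXMLRPCServer(("127.0.0.1", 8099), logRequests=False, allow_none=True)
server.register_function(simulate_explosion)
threading.Thread(target=server.serve_forever, daemon=True).start()

# Client side: the call looks like a local function call but runs remotely.
client = ServerProxy("http://127.0.0.1:8099")
result = client.simulate_explosion(4)   # blocks; real clients would call asynchronously
print(len(result), result[0])
```

The point is exactly the one made in the post: the calling code is ordinary subroutine-call code, and only the transport decides whether the work happens locally or on a server farm.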

Both latency and bandwidth are relevant, with the latter affecting the total computation time more, the larger the chunks of data sent. For example, one could use RPCs to have a game like Watch Dogs decrypt some user-defined keys: the amount of data would be negligible, but the computation time could be exceptionally high.

In Crackdown's case the bandwidth should matter, because the "Cloud" (remote servers) has to send back the shape and position of each piece. The lower the bandwidth, the longer it would take for a building to explode, or the fewer pieces can be computed. It may also happen that in the real environment (the Internet) some pieces that don't arrive in time get discarded. Another implementation would be a deferred computation: after the explosion is triggered, the remote servers start to compute its whole dynamic and send it as a stream to the client (the console). The client then, for each frame, picks up the relevant data (already stored) and does the final rendering, considering the point of view and other factors.

Still, there is a lot of smoke and mirrors. 20x the power of the XB1 means nothing. Is MS referring to the CPU (the one usually computing this kind of physics) or the GPU? In the first case it would be about 2 TFLOPS per explosion; in the latter, 26 TFLOPS. I highly doubt it takes that much for that kind of effect.

Also, this is not the "Cloud power" MS promised at the reveal, where the Cloud can help with rendering. That simply isn't possible. The total latency on the Internet is so high that by the time the data reaches the client, the console is already computing other frames.

I am thinking there must be some procedural generation techniques involved, because I can imagine that would work.


"No Man's Sky" indeed could use the Cloud to generate the worlds. But as far as I understood (to my surprise), that is not the case. It's probably like in Minecraft, where each single seed generates its own world, which is the same for everyone with that seed. And the world generation is computed in game.

Still, it doesn't really matter. Communication performance aside, from a theoretical point of view any algorithm that can run on a local host can also be implemented using the Cloud. The Cloud servers can be seen as extra CPU cores dedicated to specific tasks. But the problem is indeed the time it takes to send the data from the remote servers to the "hub": if receiving the data takes longer than computing it locally, it's just a waste. This greatly reduces the number of possible applications, especially real-time applications like video games.
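That trade-off reduces to a one-line rule of thumb (a sketch; all inputs are in milliseconds and the example numbers are illustrative):

```python
# Offloading pays off only if the remote path beats local computation end to end.
def worth_offloading(local_ms, remote_compute_ms, round_trip_ms, transfer_ms):
    return remote_compute_ms + round_trip_ms + transfer_ms < local_ms

# Batch-style job (e.g. decryption): huge local cost, so the network overhead is fine.
print(worth_offloading(local_ms=500, remote_compute_ms=5, round_trip_ms=60, transfer_ms=20))

# Per-frame work at 60 fps: a 16 ms budget can never absorb a 60 ms round trip.
print(worth_offloading(local_ms=16, remote_compute_ms=1, round_trip_ms=60, transfer_ms=5))
```

This is why batch and deferred workloads benefit enormously while per-frame rendering help does not.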

Then let's not forget about the cost. Even if it is internal "phoney money", XBL must pay Azure for the resources used.

No Man's Sky doesn't need the cloud. Everything is generated around you within a certain proximity, and more things are generated as you move. They have a video explaining it.