
Forums - Microsoft Discussion - What do you think of Crackdown's Cloud Computing?

smroadkill15 said:
SvennoJ said:

https://www.youtube.com/watch?v=hysScgGboBA
Red Faction Armageddon in 2011

Impressive, but still not at the same scale. The buildings are much larger in Crackdown, and the destruction is more realistic.

Well, you would expect the scale to be bigger, since Red Faction: Armageddon ran at 1080p60 on a dual core with a Radeon HD 6950. The buildings in Crackdown 3 are larger, but the pieces are larger too.

That's not the point anyway: total destruction has been done before, and it didn't revolutionize gameplay.

As it stands, Red Faction: Armageddon is a decent enough shooter, but one that fails to really live up to its potential. Clearly, the GeoMod technology is worth evolving still further for future Red Faction games - but it's evident that any evolution in the destruction really needs to be matched by an equivalent boost in the ingenuity of the game design. In the here and now, Red Faction: Armageddon is a solid game, but not an outstanding one.

The tech is impressive, but will the gameplay follow suit? If not, it might suffer the same fate as Red Faction.


Then you have people already calling Crackdown 3 out for not looking all that great, and I expect it to be toned down further to get all the mayhem to render fast enough in the final version. It's not by chance that Crackdown 3, a mostly cel-shaded game, is getting this tech. For example, the producer of Saints Row was asked about adding total destruction for next gen (PS4/X1):

“Here’s the issue: we would still have to make compromises to the graphic fidelity by still sharding it up and making it so they can break apart and all these pieces.

“My suspicion is that if you looked at another game coming out that didn’t have that technology, that their buildings would look so superior to ours, the gamers would still look at it and say, ‘Wow, what’s wrong with Saints Row, why does it look so horrible? I know the buildings come apart but you know.’

“It’s tough to communicate, and we always had that problem even on Red Faction, that there’s no easy way to clearly communicate that yeah, the graphics may not be as amazing as these other cutting-edge games, but look at the engine. Look at all these things it does. It’s just when you’re looking at a screenshot, or when you’re looking at a trailer you just kind of look at it and it’s like, ‘Nah, it doesn’t look as good as some of these other games that are out there’.

“There’s a reason for it but at the end of the day I don’t think gamers necessarily care. It’s just, My suspicion is that if they saw a Saints Row game that didn’t look nearly as good as the competition that’s out there, that they would just feel bad and say, ‘What the hell’s happened to Volition? What the hell’s wrong with that? The city just looks awful compared to what I’m seeing in these other games because of the destruction.

“With the kind of competition that’s out there I think, I suspect it would almost be impossible to do it and still remain competitive visually.”
http://www.vg247.com/2013/08/12/a-fully-geo-mod-enabled-saints-row-is-literally-impossible-in-this-gen-says-volition/

Of course, that was without off-loading the CPU to the cloud, yet the rendering problems remain the same.




I've got a 200 Mb/s connection, and that's equal to a memory bandwidth of 25 MB/s. How this cloud computing is supposed to work is beyond me. The only thing I can think of that will work is post-processing. I need to see this before I believe it.
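
As a quick back-of-the-envelope check of that conversion (a sketch that assumes the full nominal line rate and ignores protocol overhead):

```python
# Rough conversion of a nominal line rate into usable bytes per second.
# Assumes the whole 200 Mb/s is available and ignores protocol overhead.
line_rate_mbit = 200                    # megabits per second
bytes_per_sec_mb = line_rate_mbit / 8   # 8 bits per byte

print(f"{line_rate_mbit} Mb/s = {bytes_per_sec_mb} MB/s")   # 25.0 MB/s
# For comparison, the Xbox One's local DDR3 bandwidth is roughly 68 GB/s,
# thousands of times more than even a fast home connection.
```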



If it is also used in single-player, wouldn't that mean you need Xbox Live Gold for single-player too?



AnthonyW86 said:
If it is also used in single-player, wouldn't that mean you need Xbox Live Gold for single-player too?


It is not used in single-player. For the millionth time.



Lortsamler said:
I've got a 200 Mb/s connection, and that's equal to a memory bandwidth of 25 MB/s. How this cloud computing is supposed to work is beyond me. The only thing I can think of that will work is post-processing. I need to see this before I believe it.


It's remote processing, and it is quite old technology in IT terms. Sun Microsystems implemented RPC (Remote Procedure Calls) in the 1980s. Simply put, developers have a set of APIs through which a subroutine call can be executed on a remote server. So in parallel programming one can send some jobs to a bunch of servers, freeing the local CPU to do other tasks.
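
A minimal sketch of that pattern with Python's standard xmlrpc module; the port, payload and the "simulate_debris" job are made up for illustration, not anything from Crackdown's actual API:

```python
# Minimal sketch of remote procedure calls with Python's standard xmlrpc
# module. The "simulate_debris" job, port and payload are hypothetical;
# this only illustrates the offloading pattern, not a real game engine.
import threading, time
from xmlrpc.server import SimpleXMLRPCServer
from xmlrpc.client import ServerProxy
from concurrent.futures import ThreadPoolExecutor

def simulate_debris(num_pieces):
    # Stand-in for an expensive physics job executed on the remote machine.
    return [{"id": i, "x": i * 0.1, "y": 0.0, "z": 0.0} for i in range(num_pieces)]

def run_server():
    server = SimpleXMLRPCServer(("localhost", 8000), logRequests=False)
    server.register_function(simulate_debris)
    server.serve_forever()

threading.Thread(target=run_server, daemon=True).start()
time.sleep(0.5)                       # give the toy server time to start

proxy = ServerProxy("http://localhost:8000")
with ThreadPoolExecutor() as pool:
    future = pool.submit(proxy.simulate_debris, 1000)
    # ... the local CPU is free to do other work while the job runs remotely ...
    pieces = future.result()          # collect the remotely computed result
print(len(pieces), "pieces received from the 'cloud'")
```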

Both latency and bandwidth are relevant, with bandwidth weighing more on the total "computation time" the larger the chunks of data being sent. But, for example, one could use RPCs to have a game like Watch Dogs decrypt some user-defined keys: the amount of data would be negligible, but the computation time could be exceptionally high.
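
A rough timing model for that trade-off (every number below is an illustrative assumption, not a measurement):

```python
# Rough model: offload time = round-trip latency + transfer time + remote compute time.
def offload_time(rtt_s, payload_bytes, bandwidth_bytes_per_s, remote_compute_s):
    return rtt_s + payload_bytes / bandwidth_bytes_per_s + remote_compute_s

# Watch Dogs-style key cracking: tiny payload, huge compute -> bandwidth barely matters.
print(offload_time(rtt_s=0.05, payload_bytes=1_000,
                   bandwidth_bytes_per_s=25e6, remote_compute_s=30.0))   # ~30.05 s

# Debris physics: modest compute, but large piece data -> bandwidth dominates.
print(offload_time(rtt_s=0.05, payload_bytes=5_000_000,
                   bandwidth_bytes_per_s=25e6, remote_compute_s=0.005))  # ~0.26 s
```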

In Crackdown's case the bandwidth should matter, because the "Cloud" (remote servers) has to send back the shape and position of each piece. The lower the bandwidth, the longer it would take for a building to explode, or the fewer pieces that can be computed. It may also happen that, in the real environment (the Internet), pieces that don't arrive in time simply get discarded. Another implementation would be deferred computation: after the explosion is triggered, the remote servers compute its whole dynamic and send it as a stream to the client (the console). For each frame the client then picks up the relevant, already stored data and does the final rendering, taking the point of view and other factors into account.
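
A sketch of what that deferred, streamed variant could look like on the client side, assuming hypothetical frame-indexed debris updates coming from the servers (this is not Crackdown's real protocol):

```python
# Hypothetical client-side buffer for remotely simulated debris.
# The servers stream per-frame piece transforms ahead of time; the client
# renders whatever has already arrived for the current frame and drops late data.
from collections import defaultdict

class DebrisBuffer:
    def __init__(self):
        self.frames = defaultdict(list)   # frame number -> list of piece transforms
        self.current_frame = 0

    def on_packet(self, frame, pieces):
        if frame <= self.current_frame:
            return                        # arrived too late: discard, don't stall
        self.frames[frame].extend(pieces)

    def pieces_for_next_frame(self):
        self.current_frame += 1
        # Data already sits in local memory, so rendering never waits on the network.
        return self.frames.pop(self.current_frame, [])

buf = DebrisBuffer()
buf.on_packet(frame=2, pieces=[{"id": 0, "pos": (1.0, 2.0, 0.5)}])
print(buf.pieces_for_next_frame())   # frame 1: [] (nothing streamed yet)
print(buf.pieces_for_next_frame())   # frame 2: the precomputed piece
```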

Still, there is a lot of smoke and mirrors. "20x the power of the XB1" means nothing. Is MS referring to the CPU (the one usually computing this kind of physics) or the GPU? In the first case it would be about 2 TFLOPS per explosion, in the latter about 26 TFLOPS. I highly doubt it takes that much for this kind of effect.
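
For reference, the arithmetic behind those two figures, using the commonly quoted Xbox One peak numbers:

```python
# "20x the power of the XB1", using commonly quoted peak figures.
xb1_cpu_tflops = 0.11   # ~8-core Jaguar @ 1.75 GHz, single precision
xb1_gpu_tflops = 1.31   # 12 CUs @ 853 MHz, single precision

print(20 * xb1_cpu_tflops)   # ~2.2 TFLOPS if "power" means the CPU
print(20 * xb1_gpu_tflops)   # ~26.2 TFLOPS if it means the GPU
```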

Also, this is not the "Cloud power" MS promised at the reveal, where the Cloud would help with rendering. That simply isn't possible: the total Internet latency is so high that by the time the data reaches the client, the console is already computing other frames.



Around the Network
Michelasso said:
Lortsamler said:
I've got a 200 Mb/s connection, and that's equal to a memory bandwidth of 25 MB/s. How this cloud computing is supposed to work is beyond me. The only thing I can think of that will work is post-processing. I need to see this before I believe it.


It's remote processing, and it is quite old technology in IT terms. Sun Microsystems implemented RPC (Remote Procedure Calls) in the 1980s. Simply put, developers have a set of APIs through which a subroutine call can be executed on a remote server. So in parallel programming one can send some jobs to a bunch of servers, freeing the local CPU to do other tasks.

Both latency and bandwidth are relevant, with bandwidth weighing more on the total "computation time" the larger the chunks of data being sent. But, for example, one could use RPCs to have a game like Watch Dogs decrypt some user-defined keys: the amount of data would be negligible, but the computation time could be exceptionally high.

In Crackdown's case the bandwidth should matter, because the "Cloud" (remote servers) has to send back the shape and position of each piece. The lower the bandwidth, the longer it would take for a building to explode, or the fewer pieces that can be computed. It may also happen that, in the real environment (the Internet), pieces that don't arrive in time simply get discarded. Another implementation would be deferred computation: after the explosion is triggered, the remote servers compute its whole dynamic and send it as a stream to the client (the console). For each frame the client then picks up the relevant, already stored data and does the final rendering, taking the point of view and other factors into account.

Still, there is a lot of smoke and mirrors. "20x the power of the XB1" means nothing. Is MS referring to the CPU (the one usually computing this kind of physics) or the GPU? In the first case it would be about 2 TFLOPS per explosion, in the latter about 26 TFLOPS. I highly doubt it takes that much for this kind of effect.

Also, this is not the "Cloud power" MS promised at the reveal, where the Cloud would help with rendering. That simply isn't possible: the total Internet latency is so high that by the time the data reaches the client, the console is already computing other frames.

I am thinking there must be some procedural generation techniques involved, because I can imagine that would work.



Lortsamler said:
Michelasso said:
Lortsamler said:
I've got a 200 Mb/s connection, and that's equal to a memory bandwidth of 25 MB/s. How this cloud computing is supposed to work is beyond me. The only thing I can think of that will work is post-processing. I need to see this before I believe it.


It's remote processing, and it is quite old technology in IT terms. Sun Microsystems implemented RPC (Remote Procedure Calls) in the 1980s. Simply put, developers have a set of APIs through which a subroutine call can be executed on a remote server. So in parallel programming one can send some jobs to a bunch of servers, freeing the local CPU to do other tasks.

Both latency and bandwidth are relevant, with bandwidth weighing more on the total "computation time" the larger the chunks of data being sent. But, for example, one could use RPCs to have a game like Watch Dogs decrypt some user-defined keys: the amount of data would be negligible, but the computation time could be exceptionally high.

In Crackdown's case the bandwidth should matter, because the "Cloud" (remote servers) has to send back the shape and position of each piece. The lower the bandwidth, the longer it would take for a building to explode, or the fewer pieces that can be computed. It may also happen that, in the real environment (the Internet), pieces that don't arrive in time simply get discarded. Another implementation would be deferred computation: after the explosion is triggered, the remote servers compute its whole dynamic and send it as a stream to the client (the console). For each frame the client then picks up the relevant, already stored data and does the final rendering, taking the point of view and other factors into account.

Still, there is a lot of smoke and mirrors. "20x the power of the XB1" means nothing. Is MS referring to the CPU (the one usually computing this kind of physics) or the GPU? In the first case it would be about 2 TFLOPS per explosion, in the latter about 26 TFLOPS. I highly doubt it takes that much for this kind of effect.

Also, this is not the "Cloud power" MS promised at the reveal, where the Cloud would help with rendering. That simply isn't possible: the total Internet latency is so high that by the time the data reaches the client, the console is already computing other frames.

I am thinking there must be some procedural generation techniques involved, because I can imagine that would work.


"No Man's Sky" indeed could use the Cloud to generate the worlds. But as far as I understood  (with my surprise) that is not the case. It's probably like in Mineraft where each single seed generates its own world which is the same for everyone for that seed. And the world generation is computed in game.

Still, it doesn't really matter. Communication performance apart, from a theoretical point of view any algorithm that can run on a local host can also be implemented using the Cloud. The Cloud servers can be seen as extra CPU cores dedicated to specific tasks. But the problem is indeed the time it takes to send the data from the remote servers to the "hub". If receiving the data takes longer than computing it locally, it's just a waste. This greatly reduces the number of possible applications, especially real-time applications like video games.

Then let's not forget about the cost. Even if it is internal "phoney money", XBL must pay Azure for the resources used.



Michelasso said:
Lortsamler said:
Michelasso said:
Lortsamler said:
I've got a 200 Mb/s connection, and that's equal to a memory bandwidth of 25 MB/s. How this cloud computing is supposed to work is beyond me. The only thing I can think of that will work is post-processing. I need to see this before I believe it.


It's remote processing, and it is quite old technology in IT terms. Sun Microsystems implemented RPC (Remote Procedure Calls) in the 1980s. Simply put, developers have a set of APIs through which a subroutine call can be executed on a remote server. So in parallel programming one can send some jobs to a bunch of servers, freeing the local CPU to do other tasks.

Both latency and bandwidth are relevant, with bandwidth weighing more on the total "computation time" the larger the chunks of data being sent. But, for example, one could use RPCs to have a game like Watch Dogs decrypt some user-defined keys: the amount of data would be negligible, but the computation time could be exceptionally high.

In Crackdown's case the bandwidth should matter, because the "Cloud" (remote servers) has to send back the shape and position of each piece. The lower the bandwidth, the longer it would take for a building to explode, or the fewer pieces that can be computed. It may also happen that, in the real environment (the Internet), pieces that don't arrive in time simply get discarded. Another implementation would be deferred computation: after the explosion is triggered, the remote servers compute its whole dynamic and send it as a stream to the client (the console). For each frame the client then picks up the relevant, already stored data and does the final rendering, taking the point of view and other factors into account.

Still, there is a lot of smoke and mirrors. "20x the power of the XB1" means nothing. Is MS referring to the CPU (the one usually computing this kind of physics) or the GPU? In the first case it would be about 2 TFLOPS per explosion, in the latter about 26 TFLOPS. I highly doubt it takes that much for this kind of effect.

Also, this is not the "Cloud power" MS promised at the reveal, where the Cloud would help with rendering. That simply isn't possible: the total Internet latency is so high that by the time the data reaches the client, the console is already computing other frames.

I am thinking there must be some procedural generation techniques involved, because I can imagine that would work.


"No Man's Sky" indeed could use the Cloud to generate the worlds. But as far as I understood  (with my surprise) that is not the case. It's probably like in Mineraft where each single seed generates its own world which is the same for everyone for that seed. And the world generation is computed in game.

Still, it doesn't really matter. Communication performance apart, from a theoretical point of view any algorithm that can run on a local host can also be implemented using the Cloud. The Cloud servers can be seen as extra CPU cores dedicated to specific tasks. But the problem is indeed the time it takes to send the data from the remote servers to the "hub". If receiving the data takes longer than computing it locally, it's just a waste. This greatly reduces the number of possible applications, especially real-time applications like video games.

Then let's not forget about the cost. Even if it is internal "phoney money", XBL must pay Azure for the resources used.

I agree with all of this. That's why I am questioning this claim from Microsoft.



Lortsamler said:

I agree with all of this. That's why I am questioning this claim from Microsoft.


Ah, don't get me wrong. I thought you were asking how it worked. The original MS claim at the Xbone reveal that the Cloud would give the Xbone 3 times more power is pure bullcrap indeed. The Cloud can give even thousands of times more power in batch jobs (like in the decryption example), but in real time? Not a chance. Internet latency kills it all. It's even paradoxical: more (total) computing power would also mean a higher frame rate, but the higher the frame rate, the smaller the time window to produce a frame. And the smaller that window, the higher the chance that the client has to skip the remotely computed data. And Internet latency has a die-hard constraint: the speed of light. Packets can't travel a geographical distance faster than that (1 ms for every 150 km / 100 miles round trip). And we know that in real life the Internet is quite a bit slower.
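
To put rough numbers on that paradox (illustrative frame rates and distances only):

```python
# Frame budget vs. the physical floor on round-trip latency.
SPEED_OF_LIGHT_KM_PER_MS = 300          # in vacuum; fiber is noticeably slower

def frame_budget_ms(fps):
    return 1000 / fps

def min_rtt_ms(distance_km):
    return 2 * distance_km / SPEED_OF_LIGHT_KM_PER_MS   # there and back

print(frame_budget_ms(30), frame_budget_ms(60))   # ~33.3 ms and ~16.7 ms
print(min_rtt_ms(150), min_rtt_ms(1500))          # 1 ms and 10 ms, best case
# Real connections add routing, queuing and processing delay on top of this,
# so a remotely computed result can easily miss several frame windows.
```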



Michelasso said:
Lortsamler said:

I agree with all of this. That's why I am questioning this claim from Microsoft.


Ah, don't get me wrong. I thought you were asking how it worked. The original MS claim at the Xbone reveal that the Cloud would give the Xbone 3 times more power is pure bullcrap indeed. The Cloud can give even thousands of times more power in batch jobs (like in the decryption example), but in real time? Not a chance. Internet latency kills it all. It's even paradoxical: more (total) computing power would also mean a higher frame rate, but the higher the frame rate, the smaller the time window to produce a frame. And the smaller that window, the higher the chance that the client has to skip the remotely computed data. And Internet latency has a die-hard constraint: the speed of light. Packets can't travel a geographical distance faster than that (1 ms for every 150 km / 100 miles round trip). And we know that in real life the Internet is quite a bit slower.

I think we are on the same page here :)