vivster said:

I think it would give developers a false sense of expanded capabilities. It might even be detrimental to a game's development if they run into unexpected issues that they can't solve due to the limitations of this technology.

In the end, all this does is offload computing power, nothing more, nothing less. However, I have yet to hear a developer complain about insufficient CPU resources. In fact, I believe the trend is towards offloading work from the CPU to the GPU rather than to the cloud.

We don't know how this will end because there is no proper field study of this technology in games yet. Best case, a handful of games will run some extra physics calculations that make them slightly prettier. Worst case, it will result in a bunch of broken games, wasted development money, and focus being pulled away from advancing CPU-to-GPU offloading.

My guess is somewhere in the middle: it will be used in one or two games, work almost properly, and then be forgotten in favor of more promising gaming technologies.

I have more faith in MS than that. If this were such a "false" way of pushing cloud computing, MS would be shooting themselves in the foot. What I saw in Crackdown 3 was a level of detailed destruction unlike anything I have ever seen before. I am not interested in that game at all, but rather in what they can make of the technology.

Technologies evolve; the way you put it, though, it sounds like no developer could ever find better ways of using old technologies. I remember how every knowledgeable person "knew" it would not be possible to stream HD video with anything but a massive broadband connection. Yet developers found ways to do it. I don't care how they do it, I only care whether they do it.