ElPresidente7 said:
Exactly. If you're going to offload enough data processing to actually make an impact on what the machine can do locally, you're probably sending more data back and forth than a much simpler constant-data-rate image stream would use. Plus the headache of getting it all to work together while dealing with variable latencies is not worth it.
Imagine something cloud computing would actually be good for: a Skyrim-type game with a From Dust engine. A fully dynamic, constantly changing, persistent world. The server would have to update your console with the local geometry, weather conditions, basically all the data that normally comes off the HDD, and then the game would still need to render it locally. You might as well let the server render it and send the image over.
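A rough back-of-envelope sketch of that bandwidth argument. All the numbers here are illustrative assumptions (stream bitrate, patch size, update rate), not measured figures:

```python
# Back-of-envelope comparison: a constant-rate video stream vs. shipping
# dynamic world data down for local rendering. All numbers are assumed
# for illustration, not measurements.

VIDEO_BITRATE_BPS = 5_000_000   # assumed ~720p game stream at 5 Mbit/s

# Assumed dynamic-terrain update: one 256x256 heightmap patch of 4-byte
# floats, re-sent 10 times per second as the world deforms.
PATCH_CELLS = 256 * 256
BYTES_PER_CELL = 4
UPDATES_PER_SEC = 10

video_bytes_per_sec = VIDEO_BITRATE_BPS / 8
geometry_bytes_per_sec = PATCH_CELLS * BYTES_PER_CELL * UPDATES_PER_SEC

print(f"video stream:     {video_bytes_per_sec / 1e6:.2f} MB/s")
print(f"geometry updates: {geometry_bytes_per_sec / 1e6:.2f} MB/s")
# Even one modest uncompressed terrain patch stream already exceeds the
# whole video feed -- before textures, weather, or physics state.
```

Under these assumed numbers a single terrain patch costs roughly 2.6 MB/s against about 0.6 MB/s for the video feed, which is the point: once the world is that dynamic, the "raw data" path loses to just streaming the rendered picture.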
That also means it's only really economically viable for MMO-type games, with lots of people sharing the same world. Cloud-streamed games run at very modest graphics settings; it's not cost-effective to have a 4x more powerful machine sitting in the cloud for every single user who wants to run the game at peak hours. But the server could host a very detailed, persistent, dynamic world and render lots of cheaper views of it for the users.
For now I expect the cloud to be used for more static, traditional MMO-type worlds, limiting the data that needs to be sent over. It's not going to give Forza or Halo better physics or graphics.
It could stream picture-in-picture views of what your friends see. Maybe a Battlefield-style commander mode: all your squadmates upload a compressed image of their viewpoint, the server puts it all together around a strategic map, and your squad commander directs all of you. Plenty of possibilities. A Minecraft cloud version with shared worlds the size of countries. Just forget about offloading time-critical stuff to enhance single-player games.