Stinky said: I've got 50ms ping to a few CDN endpoints, not bad latency for gaming.
That should be good enough for most MMOGs, and even for running the AI and the physics of objects far from the player in the cloud, since at 50ms the cloud could send you refreshed data about 20 times per second. Even for far-away scenery graphics, though, it would still be excessive, unless you accept them being updated at 20fps and assume the bandwidth is sufficient. I guess it should be possible to encode that data on the fly and send it as a compressed video stream, like an MPEG, but I don't know whether the GPU can seamlessly combine its locally generated, higher-fps graphics with a remote video stream along an irregular edge between the two parts. Every GPU can show a video stream in a rectangular window inside other 2D or 3D graphics it generates itself, but seamlessly joining a stream to the local graphics along an irregular edge is something I've never heard of. Also, once you encode those graphics as a video stream you lose all the 3D information about the images, so the GPU can do simple things with them, but without any position or depth data for those remote polygons it can't apply 3D operations to them, for example to compute clipping. And assuming that locally computed graphics are always closer and cloud-computed parts are always far away could still leave holes in the frames, unless a very conservative redundancy is applied.
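To make that last point concrete, here is a minimal, purely illustrative sketch in Python/NumPy of the "local geometry is always closer" compositing rule and where it breaks. Nothing in it comes from a real engine or streaming API; the frame size, the moving red square, and the assumption that the cloud skips encoding pixels hidden behind local geometry are all made up for the example. It only shows that, with no depth in the remote stream, local pixels can only overwrite the decoded frame, and that any pixels the cloud never sent show up as holes once local objects move between remote updates.

```python
import numpy as np

H, W = 120, 160  # toy frame size

def square_mask(x, size=30):
    """Boolean coverage mask of a locally rendered square at column offset x."""
    m = np.zeros((H, W), dtype=bool)
    m[40:40 + size, x:x + size] = True
    return m

def composite(local_rgb, local_mask, remote_rgb):
    """'Local is always closer' compositing: the remote stream carries no
    depth, so local pixels simply overwrite the decoded remote frame."""
    out = remote_rgb.copy()
    out[local_mask] = local_rgb[local_mask]
    return out

# Remote scenery: a horizontal gradient standing in for distant geometry.
remote_rgb = np.tile(np.linspace(0, 255, W, dtype=np.uint8)[None, :, None],
                     (H, 1, 3))

# Hypothetical bandwidth saving: the cloud only encoded pixels that were NOT
# hidden behind local geometry at the moment it rendered this stream frame
# (square at x=60), i.e. no conservative redundancy.
remote_valid = ~square_mask(x=60)

# Several local frames are shown before the next remote update arrives
# (e.g. 60fps local vs. 20fps remote); the local square drifts right meanwhile.
for i, x in enumerate([60, 64, 68]):
    mask = square_mask(x)
    local_rgb = np.zeros((H, W, 3), dtype=np.uint8)
    local_rgb[mask] = (255, 0, 0)  # the local object, drawn in red

    frame = composite(local_rgb, mask, remote_rgb)

    # A "hole" is a pixel neither covered locally nor ever sent by the cloud.
    holes = (~mask) & (~remote_valid)
    print(f"local frame {i}: square at x={x}, hole pixels = {holes.sum()}")
```

In this toy setup the hole count grows with each local frame as the square moves, which is exactly why the post's "very conservative redundancy" would be needed: the cloud would have to encode scenery behind and around local objects it expects to move, at the cost of extra bandwidth.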