SvennoJ actually has a pretty good point. A mere 4 ms of added round-trip latency would beat the speed of light for any server more than roughly 600 km away. And light in optical fiber travels at only about two-thirds of c, before you even count the routing hardware and the encode/decode work on both ends.
Unless the servers are really, really decentralized, which seems... expensive for an affordable streaming service, there's no way even the optimal, best-case added latency is just 4 ms. The xCloud demo was probably hosted in a neighboring borough or something.
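For anyone who wants to sanity-check the numbers, here's a quick back-of-the-envelope calculation. The 0.67 fiber factor is a rough assumption for the refractive-index penalty, and this ignores all routing/encoding overhead, so it's a best-case bound, not a real latency estimate:

```python
# How far away can a server be if the round trip alone must fit in 4 ms?
C_KM_PER_MS = 299_792.458 / 1000  # speed of light in vacuum, km per ms
FIBER_FACTOR = 0.67               # light in fiber is ~2/3 c (rough assumption)

def max_distance_km(rtt_ms, speed_km_per_ms):
    # The signal covers the distance twice (there and back), so halve the path.
    return rtt_ms * speed_km_per_ms / 2

print(round(max_distance_km(4, C_KM_PER_MS)))                 # ~600 km in vacuum
print(round(max_distance_km(4, C_KM_PER_MS * FIBER_FACTOR)))  # ~402 km in fiber
```

So even with zero processing time, 4 ms added latency over fiber caps the server at roughly 400 km away, which is why a heavily decentralized deployment would be the only way to hit that figure.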