Too inefficient for normal gaming, and it will only introduce judder every time you do something unpredictable, especially in first-person horror games where you're frantically looking around or suddenly changing direction.
Something similar will be the only possible way to get VR to work via cloud gaming. The server will have to send a larger field-of-view image, so that the local device can update head movements with the lowest latency possible, like the 360-degree views of Google Street View, though only big enough to cover the fastest possible head movement, with minimal distortion in 3D.
That is already necessary for VR gaming on a local machine, so that head movements can be updated at 120 Hz with ~8 ms pose-to-display latency, while new frames come in at 60 Hz with 50 ms input latency.
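As a rough sketch of the idea above (my own numbers, not from this post): if you assume a peak head-turn speed, the 50 ms frame latency directly tells you how much extra field of view the server would have to render so the local device can re-project without running off the edge of the image.

```python
# Back-of-envelope estimate of the streaming overscan margin.
# Assumptions (mine, for illustration): peak head rotation ~300 deg/s,
# server frame latency 50 ms, base headset FOV 100 degrees.

PEAK_HEAD_SPEED_DEG_S = 300.0   # assumed peak angular velocity of a fast head turn
FRAME_LATENCY_S = 0.050         # 50 ms server frame latency, as in the post

# Worst case, the head rotates this far before a fresh server frame arrives,
# so the streamed image needs at least this much margin on every edge.
margin_deg = PEAK_HEAD_SPEED_DEG_S * FRAME_LATENCY_S

BASE_FOV_DEG = 100.0
padded_fov_deg = BASE_FOV_DEG + 2 * margin_deg   # margin on both edges

print(margin_deg)      # 15.0 degrees per edge
print(padded_fov_deg)  # 130.0 degrees total
```

So even under these fairly generous assumptions, the server has to render and stream a noticeably wider view than the headset actually displays.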
Anyway, back on topic: latency is noticeable whenever you pull the trigger, stop, start, reverse, turn left/right/up/down, or jump. How is it going to stream all those possibilities continuously? Apparently with 5x the bandwidth, and 5x the rendering power needed...
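The bandwidth multiplier can be sanity-checked with a quick calculation (again my own estimate, assuming a flat rectilinear projection; a real system would likely stream a curved or equirectangular projection with somewhat different numbers). For a rectilinear image, width scales with tan(FOV/2), so pixel count, and roughly bandwidth, scales with the square of that ratio:

```python
import math

def rel_pixel_cost(padded_fov_deg: float, base_fov_deg: float) -> float:
    """Approximate pixel-count (and bandwidth) multiplier for widening a
    rectilinear image from base_fov_deg to padded_fov_deg on both axes."""
    ratio = (math.tan(math.radians(padded_fov_deg / 2))
             / math.tan(math.radians(base_fov_deg / 2)))
    return ratio ** 2

# Widening a hypothetical 100-degree view to 130 degrees:
print(round(rel_pixel_cost(130.0, 100.0), 2))  # ~3.2x the pixels
```

Because tan() blows up toward the edges, the cost grows much faster than the FOV itself, which is roughly the regime where a "5x" figure starts looking plausible.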