Irides also does this with a technique called "likelihood-based foveation," which renders the parts of an image the user is likely to look at in higher quality and 'gracefully degrades' the rest. It also maps the image onto a spherical mesh. In a sense this mimics the human eye, offering the user lower latency and higher quality at the same time.
So does it track your pupils at super low latency, or are you in for a blurry mess when you look anywhere you're not supposed to be looking? The technique is sound, yet "likely to look" is anything but.
And I'm not sure what the cloud does when you suddenly turn your head. Either it has to send a much wider FOV image several frames ahead all the time, so the glasses can pick out a 100-degree field to display, or there will be some weird catching-up effect. The glasses need to detect your head movement and display the correct image, all in 13ms.
My average ping time to Bing is 30ms, so the cloud would have to render at least 3 frames ahead at a wider FOV to compensate for how fast I can turn my head, and for all the direction changes I can make in those 3 frames.
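To put rough numbers on that, here's a back-of-the-envelope sketch. The figures beyond the ones above are my own assumptions: a max head rotation speed of 300 degrees per second (people can peak higher), and that the cloud must stay ceil(RTT / frame time) frames ahead, ignoring render and encode time.

```python
import math

def required_fov(display_fov_deg=100.0, fps=75, rtt_ms=30.0,
                 max_head_speed_dps=300.0):
    """Estimate how wide the cloud-rendered image must be so the
    glasses can always crop out a correct 100-degree view despite lag.

    Assumptions (mine, not from any spec): the cloud stays
    ceil(rtt / frame_time) frames ahead, and the head rotates at
    up to max_head_speed_dps degrees/second in any direction.
    """
    frame_ms = 1000.0 / fps                      # ~13.3 ms at 75 fps
    frames_ahead = math.ceil(rtt_ms / frame_ms)  # round-trip measured in frames
    stale_ms = frames_ahead * frame_ms           # worst-case age of the image
    # The head can swing this many degrees either way before a fresh frame lands
    margin_deg = max_head_speed_dps * stale_ms / 1000.0
    return frames_ahead, display_fov_deg + 2 * margin_deg

frames, fov = required_fov()
print(frames, round(fov, 1))  # 3 frames ahead, ~124-degree image
```

So even under these generous assumptions the cloud has to ship a noticeably wider image every frame, and the margin grows linearly with ping.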
Maybe make cloud gaming work as well as local gaming first, before attaching the much higher demands of a 75fps headset to it. Sounds like another tech project that works in the lab but will be problematic out in the real world.
