Frequency said:
I am more than happy to go into great detail to show you all exactly why cloud processing for live graphics isn't going to happen, and why at best it will be used for storing non-critical data such as NPC text and player stats rather than running actual, genuine graphical processes.
TCP/IP ensures packets are correctly received. Hashing is used to verify that the content received is the same as the content sent (a checksum). It is not error correction.
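To make that distinction concrete, here's a minimal Python sketch of what a checksum actually buys you: detection of a mismatch, not repair. SHA-256 and the `verify_payload` helper are just illustrative choices, not anything from the video.

```python
import hashlib

def verify_payload(payload: bytes, expected_digest: str) -> bool:
    """Compare the SHA-256 digest of received data against the digest
    the sender published. A mismatch tells you the data is corrupt or
    tampered with; it does not tell you how to fix it."""
    return hashlib.sha256(payload).hexdigest() == expected_digest

# The sender ships the digest alongside the content.
data = b"npc dialogue blob"
digest = hashlib.sha256(data).hexdigest()

print(verify_payload(data, digest))          # True  -> content intact
print(verify_payload(data + b"x", digest))   # False -> error detected, not corrected
```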
You didn't watch the video or listen to it. It's sent back down as a video stream, and that stream is then processed on the local system. The point of the video was to demonstrate the quality achievable at a given latency. Even at 1000ms latency (30 FPS) it actually offered decent quality. That said, the highest average latency in the US is 70ms, while the lowest average is 12ms. Theoretically one could do lighting rendering with 100ms latency and still offer a high-quality image, especially if, as with the Xbox One, you have a separate video stream processor that allows you to interpolate frames to increase the frame rate.
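To put those latency figures in frame terms: at 30 FPS a frame lasts about 33ms, so a 100ms round trip means the remote lighting data arrives roughly three frames late. A quick sketch of that arithmetic (the latencies are the ones quoted above; the function name is mine):

```python
def frames_of_delay(latency_ms: float, fps: float) -> float:
    """How many rendered frames elapse while a round trip is in flight."""
    frame_time_ms = 1000.0 / fps
    return latency_ms / frame_time_ms

# 12ms and 70ms are the US averages cited above; 100ms and 1000ms are the
# hypothetical budgets discussed for remote lighting and the video demo.
for latency in (12, 70, 100, 1000):
    for fps in (30, 60):
        print(f"{latency:>4} ms at {fps} FPS ~= {frames_of_delay(latency, fps):.1f} frames behind")
```

At 30 FPS, 12ms is well under a single frame, 70ms is about two frames, and 1000ms is a full 30 frames behind, which is why frame interpolation matters so much in the 1000ms case.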







