Frequency said:
Adinnieken said:
the-pi-guy said:

Does this actually mean that the cloud can boost graphics or does it mean that the cloud can offload some of the processes from the box, so that the box itself has more to give to graphics?  

This is what Nvidia is doing. 

http://www.youtube.com/watch?v=CkSSUChfjLk


Yes, offloading to a networked server. These demonstrations are done on a local network, not a standard ISP internet connection, and even with the processing being done locally they still ended up with 100-500ms of latency, because of the time spent collecting pools of data to process, transmitting that data, having it processed externally, receiving the processed data back, and finally handing it to the GPU (and that's assuming they didn't bother with error control, such as hashing, to verify the received packets were correct). The process isn't perfect even on closed networks and never will be, so the chances of it ever being a viable solution for real-time graphics processing on standard network connections pretty much rule out this generation, or the next two or three, being able to use it.
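To give a rough sense of how those stages add up against a frame budget, here is a minimal latency-budget sketch in Python. The stage names and millisecond figures are illustrative assumptions, not measurements from the Nvidia demo.

```python
# Rough round-trip budget for offloading rendering work to a remote server.
# All numbers below are illustrative assumptions, not measured values.
STAGES_MS = {
    "collect_input_batch": 20,    # gather a pool of data worth sending
    "upload_to_server": 30,       # one-way transfer over the network
    "remote_processing": 40,      # GPU work done on the server
    "download_result": 30,        # processed data coming back
    "integrity_check": 5,         # optional hash/checksum verification
    "hand_off_to_local_gpu": 5,   # copy into the local frame pipeline
}

total_ms = sum(STAGES_MS.values())
frame_budget_ms = 1000 / 60  # ~16.7 ms available per frame at 60 FPS

print(f"Estimated round trip: {total_ms} ms")
print(f"Frame budget at 60 FPS: {frame_budget_ms:.1f} ms")
print(f"Frames of delay: {total_ms / frame_budget_ms:.1f}")
```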

I am more than happy to go into great detail to show you all exactly why cloud processing for live graphics isn't going to happen, and why at best it will be used for storing non-critical data such as NPC text and player stats rather than for processing actual, genuine graphical work.

TCP/IP already ensures packets are correctly received.  Hashing is used to ensure that the content received is the same as the content sent (a checksum); it is not error correction.
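To illustrate the distinction: a hash or checksum only detects that data changed in transit, it cannot repair it; the remedy, as with TCP's own checksum-plus-retransmission, is to send the data again. A minimal sketch using an application-level SHA-256 digest (the payload and helper names are hypothetical):

```python
import hashlib

def verify_payload(data: bytes, expected_digest: str) -> bool:
    """Detect corruption by comparing digests; this cannot repair the data."""
    return hashlib.sha256(data).hexdigest() == expected_digest

sent = b"lighting data"
digest = hashlib.sha256(sent).hexdigest()

received = b"lighting dat\x00"   # simulate a corrupted transfer
if not verify_payload(received, digest):
    # As with TCP, the only remedy on a mismatch is to request the data again.
    print("Checksum mismatch: re-request the payload")
```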

You didn't watch the video or listen to it.  The result is sent back down as a video stream, which is then processed on the local system.  The point of the video was to demonstrate the quality at a given latency.  Even at 1000ms latency (30FPS) it actually offered decent quality.  That said, the highest average latency in the US is 70ms, while the lowest average is 12ms.  Theoretically one could do lighting rendering with 100ms of latency and still offer a high-quality image, especially if, as with the Xbox One, you have a separate video stream processor that lets you interpolate frames to increase the frame rate.
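A crude sketch of what frame interpolation means here: synthesizing an in-between frame from two decoded frames so a 30FPS stream can be presented at 60FPS. This is a generic linear blend for illustration only, not the Xbox One video processor's actual method, which would use motion estimation rather than a plain average.

```python
import numpy as np

def interpolate_frame(prev_frame: np.ndarray, next_frame: np.ndarray,
                      t: float = 0.5) -> np.ndarray:
    """Synthesize an in-between frame by linearly blending two decoded frames.

    A real interpolator would use motion estimation; a plain blend is only a
    minimal illustration of raising the frame rate locally.
    """
    blended = (1.0 - t) * prev_frame.astype(np.float32) + t * next_frame.astype(np.float32)
    return blended.astype(prev_frame.dtype)

# Two fake 720p RGB frames from a 30 FPS stream; inserting one blended frame
# between each pair yields a 60 FPS output sequence.
frame_a = np.zeros((720, 1280, 3), dtype=np.uint8)
frame_b = np.full((720, 1280, 3), 255, dtype=np.uint8)
mid = interpolate_frame(frame_a, frame_b)
print(mid[0, 0])  # roughly halfway between the two source frames
```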