Adinnieken said:
the-pi-guy said:

Does this actually mean that the cloud can boost graphics or does it mean that the cloud can offload some of the processes from the box, so that the box itself has more to give to graphics?  

This is what Nvidia is doing. 

http://www.youtube.com/watch?v=CkSSUChfjLk


Yes, this is offloading to a networked server, but these demonstrations are done on a local network, not a standard ISP internet connection. Even with the processing done locally, they still ended up with 100-500ms of latency, because of the time spent collecting pools of data to process, transmitting that data, having it processed externally, and then receiving the processed data back before it can finally be used by the GPU. And that assumes they didn't bother with error control (hashing) to verify the received packets were correct. The process isn't perfect even on closed networks, and never will be, so the chance of this ever being a viable solution for realtime graphics processing on standard network connections pretty much rules out this generation, or the next 2 or 3, being able to use it.
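
To make those stages concrete, here's a minimal sketch in Python, entirely my own illustration and not from Nvidia's demo, that round-trips a batch of data to a simulated "cloud" server over loopback and times the whole thing, including the hash check on the returned result. The payload size, the 50ms of pretend server work, the port number, and all the names are assumptions for illustration:

```python
# Minimal sketch of where latency accrues in a cloud-offload round trip:
# batch the data, transmit it, process it remotely, return it, verify it
# with a hash, and only then could the GPU use it. The "cloud" here is a
# local thread; payload size and server delay are made-up numbers.
import hashlib
import socket
import threading
import time

HOST, PORT = "127.0.0.1", 5000        # assumed address for the sketch
PAYLOAD = b"x" * 1_000_000            # ~1 MB batch of scene data (assumed)

def cloud_server():
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
        srv.bind((HOST, PORT))
        srv.listen(1)
        conn, _ = srv.accept()
        with conn:
            data = b""
            while len(data) < len(PAYLOAD):   # collect the full batch
                data += conn.recv(65536)
            time.sleep(0.05)                  # pretend the server computes
            # return a SHA-256 digest ahead of the data so the client
            # can do the error control the post mentions
            conn.sendall(hashlib.sha256(data).digest() + data)

threading.Thread(target=cloud_server, daemon=True).start()
time.sleep(0.2)                               # let the server start

start = time.perf_counter()
with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as cli:
    cli.connect((HOST, PORT))
    cli.sendall(PAYLOAD)                      # upload the batch
    received = b""
    expected = 32 + len(PAYLOAD)              # 32-byte digest + result
    while len(received) < expected:
        received += cli.recv(65536)
digest, result = received[:32], received[32:]
# integrity check (the "hashing" step) before the result could be used
assert hashlib.sha256(result).digest() == digest
elapsed_ms = (time.perf_counter() - start) * 1000
print(f"round trip incl. verification: {elapsed_ms:.1f} ms")
```

Even on loopback, with zero real network distance, the batching, the copies through the socket stack, and the hashing all take measurable time. Put a real ISP connection in the middle and every one of those stages gets worse, which is exactly the problem.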

I am more than happy to go into great detail to show you all exactly why cloud processing for live graphics isn't going to happen, and why at best it will be used for storing non-critical data such as NPC text and player stats rather than performing actual, genuine graphical processing.