Adinnieken said:
Frequency said:
Adinnieken said:
the-pi-guy said:

Does this actually mean that the cloud can boost graphics or does it mean that the cloud can offload some of the processes from the box, so that the box itself has more to give to graphics?  

This is what Nvidia is doing. 

http://www.youtube.com/watch?v=CkSSUChfjLk


Yes, offloading to a networked server. These demonstrations are done on a local network, not a standard ISP internet connection, and even with the processing being done locally they still ended up with 100-500ms of latency, because of the time spent collecting pools of data to process, transmitting that data, having it processed externally, receiving the processed data back and, assuming they didn't bother with error control to make sure the received packets were correct (hashing), finally having it used by the GPU. The process isn't perfect and never will be, even on closed networks, so the chances of it ever being a viable solution for realtime graphics processing on standard network connections pretty much rule out this generation, or the next two or three, being able to use it.
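To put rough numbers on that round trip, here's a back-of-the-envelope sketch in Python. The per-stage timings are made-up placeholders purely to show how the stages stack up against a 60fps frame budget, not measurements from any demo:

```python
# Rough budget for the offload round trip described above.
# All stage timings are illustrative placeholders, not measurements.
STAGES_MS = {
    "collect_batch":  5,    # pooling scene data on the console
    "upload":         30,   # client -> server transfer
    "remote_process": 15,   # GPU work on the server
    "download":       30,   # server -> client transfer
    "verify_and_use": 5,    # integrity check + hand-off to the local GPU
}

total_ms = sum(STAGES_MS.values())
frame_ms = 1000 / 60  # time budget for a single 60 fps frame

print(f"round trip: {total_ms} ms ({total_ms / frame_ms:.1f} frames at 60 fps)")
```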

I am more than happy to go into great detail to show you all exactly why cloud processing for live graphics isn't going to happen, and why it will at best be used for storing non-critical data such as NPC text and player stats rather than doing any actual, genuine graphical processing.

TCP ensures packets are correctly received, retransmitting any that are lost or corrupted.  Hashing is used to verify that the content received is the same as the content sent (a checksum).  It detects errors; it is not error correction.
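To illustrate the distinction: a hash or checksum only detects that a payload was corrupted, it can't repair it; the remedy (in TCP and anywhere else) is retransmission. A minimal Python sketch of that idea:

```python
import hashlib

def payload_is_intact(received: bytes, expected_digest: str) -> bool:
    """Detect corruption by comparing digests; this cannot repair anything."""
    return hashlib.sha256(received).hexdigest() == expected_digest

sent = b"lightmap chunk 42"
digest = hashlib.sha256(sent).hexdigest()

# A flipped bit is detected, not corrected; the only remedy is re-sending.
corrupted = b"lightmap chunk 43"
print(payload_is_intact(sent, digest))       # True
print(payload_is_intact(corrupted, digest))  # False -> request retransmission
```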

You didn't watch the video or listen to it.  The result is sent back down as a video stream, which is then processed on the local system.  The point of the video was to demonstrate the quality at different latencies.  Even at 1000ms of latency (30FPS) it actually offered decent quality.  That said, the highest average latency in the US is 70ms, while the lowest average is 12ms.  Theoretically one could do lighting rendering with 100ms of latency and still offer a high quality image, especially if, as with the Xbox One, you have a separate video stream processor that lets you interpolate frames to increase the frame rate.
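As a rough illustration of those latency figures, here's a quick Python sketch of how many frames behind a remotely computed lighting result would be by the time it arrives. The 12ms, 70ms and 100ms numbers are the ones quoted above; the frame rates are the usual 30/60fps targets:

```python
# How stale a remotely computed lighting result is on arrival, in frames.
def frames_stale(latency_ms: float, fps: int) -> float:
    return latency_ms / (1000 / fps)

for latency in (12, 70, 100):   # lowest avg, highest avg, hypothetical budget
    for fps in (30, 60):
        print(f"{latency:>3} ms at {fps} fps -> "
              f"{frames_stale(latency, fps):.1f} frames behind")
```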

For CPU loads, computation is done in batch jobs, which is the route Microsoft will take; processing and outputting video puts too much strain on the network infrastructure to be used effectively outside of lab experimentation. Developing engines to work with lightmap muxing and processing that live video stream eats roughly 50% of the power you saved by having an external server render it for you instead of just rendering it locally. And again, these are lab tests. In the real world gaming environments change vastly, and once you include the unpredictable nature of online worlds or multiplayer servers, processing lighting externally with any degree of latency will result in very choppy lighting solutions, because the environment has dramatically changed before the data has had time to be processed and received.
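For what a batch-style offload looks like in practice, here's a minimal, purely illustrative Python sketch: submit a job, keep rendering with local data every frame, and apply the result whenever it happens to arrive. The job fields and the 0.2s delay are invented for the example:

```python
import queue
import threading
import time

results: "queue.Queue[dict]" = queue.Queue()

def remote_batch_job(job: dict) -> None:
    """Stand-in for a cloud worker; the 0.2 s delay is an arbitrary placeholder."""
    time.sleep(0.2)                                 # network + compute latency
    results.put({"job_id": job["job_id"], "ai_paths": ["..."]})

def game_loop(frames: int = 30) -> None:
    threading.Thread(target=remote_batch_job,
                     args=({"job_id": 1, "npc_count": 64},), daemon=True).start()
    for frame in range(frames):
        try:
            update = results.get_nowait()           # use the result if it is ready
            print(f"frame {frame}: applied batch result {update['job_id']}")
        except queue.Empty:
            pass                                    # otherwise keep using local data
        time.sleep(1 / 60)                          # simulate a 60 fps frame

game_loop()
```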

Additionally, the latency involved in sending a job back and forth and then processing the result locally is barely any different from having to process a video stream; the only difference is that you're saving a fraction of the time and bandwidth in exchange for sacrificing overall image quality, since compression of the video stream results in a loss of clarity.

Of course, you could stream uncompressed video but then you're looking at insane latency and bandwidth usage.
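The arithmetic behind that is straightforward; for an uncompressed 1080p60 stream at 24 bits per pixel:

```python
# Bandwidth needed for an uncompressed 1080p60 stream at 24 bits per pixel.
width, height, fps, bytes_per_pixel = 1920, 1080, 60, 3

bytes_per_second = width * height * bytes_per_pixel * fps
print(f"{bytes_per_second / 1e6:.0f} MB/s "
      f"(~{bytes_per_second * 8 / 1e9:.1f} Gbit/s)")   # ~373 MB/s, ~3.0 Gbit/s
```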

Microsoft will of course say all of this is possible, and indeed it is, because they want the feature to be a good talking point, regardless of whether actually implementing it makes any sort of sense. The reality is that the cloud is going to be used as a glorified perpetual storage system for profile and NPC data.

Of course, cloud processing of video will end up getting used eventually, but it's going to be like the "1080p60" situation on PS3 and 360, where barely anything actually uses it, but because a couple of games do, people act like it's a super important thing.

Turn 10 are basically using the cloud as a perpetual storage system: a player's profile data is randomly plucked from the cloud by the client machine when joining a race, and the result and overall progress of that race is then uploaded and added to that profile data, ready to be used by the next client, or by the profile owner when they next sign in.
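A minimal sketch of that storage pattern, with the store, field names and skill update invented purely for illustration (this is not Turn 10's actual API):

```python
import json
import random

# In-memory stand-in for the cloud store; a real client would hit a web service.
CLOUD_PROFILES = {
    "driver_001": {"skill": 0.72, "races": 41},
    "driver_002": {"skill": 0.55, "races": 12},
}

def fetch_random_profile() -> tuple:
    """Client pulls an arbitrary stored profile when a race is being set up."""
    driver_id = random.choice(list(CLOUD_PROFILES))
    return driver_id, json.loads(json.dumps(CLOUD_PROFILES[driver_id]))

def upload_race_result(driver_id: str, finished_race: bool) -> None:
    """After the race, the updated progress is written back for the next client."""
    profile = CLOUD_PROFILES[driver_id]
    profile["races"] += 1
    if finished_race:
        profile["skill"] = min(1.0, profile["skill"] + 0.01)

driver_id, profile = fetch_random_profile()
upload_race_result(driver_id, finished_race=True)
print(driver_id, CLOUD_PROFILES[driver_id])
```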

Of course, if you can point to a single instance of a game on the Xbox One that will be using remote deferred computation, I would be more than happy to debate this further. But as of right now you are cherry-picking alpha-stage demonstration concepts from a manufacturer that has nothing to do with either console and trying to apply them to the Xbox One, and it just isn't working.