Microsoft’s Phil Harrison Explains that Xbox One’s Cloud Can Actually Improve Graphics

Adinnieken said:
Frequency said:
Adinnieken said:
the-pi-guy said:

Does this actually mean that the cloud can boost graphics or does it mean that the cloud can offload some of the processes from the box, so that the box itself has more to give to graphics?  

This is what Nvidia is doing. 

http://www.youtube.com/watch?v=CkSSUChfjLk


Yes, offloading to a networked server. These demonstrations are done on a local network, not a standard ISP internet connection, and even with the processing being done locally they still ended up with 100-500ms of latency, because of the time spent collecting pools of data to process, transmitting that data, having it processed externally, then receiving the processed data back and, assuming they didn't bother with error checking (hashing) to make sure the received packets were correct, finally using it on the GPU. The process isn't perfect and never will be, even on closed networks, so the chances of it ever being a viable solution for real-time graphics processing on standard network connections pretty much rule out this generation, or the next two or three, being able to use it.

I am more than happy to go into great detail to show you all exactly why cloud processing for live graphics isn't going to happen, and why at best it will be used for storing non-critical data such as NPC text and player stats rather than performing actual, genuine graphical processing.
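As a rough illustration of the round trip described above, here is a minimal Python sketch that simply adds up the stages of one offload cycle. Every stage time below is a made-up placeholder for illustration, not a measurement from those demos:

# Rough budget for one cloud-offload round trip.
# All stage times are illustrative placeholders, not measurements.
stages_ms = {
    "batch_collection": 16,    # gather a pool of work to send
    "upload": 30,              # client -> server transit
    "server_processing": 25,   # the remote machine does the actual work
    "download": 30,            # server -> client transit
    "verify_and_unpack": 4,    # check the payload, copy it somewhere the GPU can use
}

total_ms = sum(stages_ms.values())
frames_late = total_ms / (1000 / 60)  # how many 60fps frames old the result is on arrival

print(f"round trip: {total_ms} ms (~{frames_late:.1f} frames at 60fps)")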

TCP/IP ensures packets are correctly received.  Hashing is used to ensure that the content requested is the same as the content sent (a checksum).  It is not error correction.
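To make the checksum point concrete, here is a minimal sketch using only the Python standard library. A hash or checksum can only detect that a received payload differs from what was sent; it cannot repair it, so on a mismatch the client has to request the data again:

import hashlib

def payload_is_intact(payload: bytes, expected_sha256: str) -> bool:
    # A hash/checksum only *detects* corruption; it cannot correct it.
    # On a mismatch, the only remedy is to re-request the data.
    return hashlib.sha256(payload).hexdigest() == expected_sha256

data = b"lightmap chunk 42"          # hypothetical payload
digest = hashlib.sha256(data).hexdigest()

print(payload_is_intact(data, digest))           # True
print(payload_is_intact(data + b"!", digest))    # False -> would trigger a re-send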

You didn't watch the video or listen to it.  It's sent back down as a video stream.  The video stream is then processed on the local system.  The point of the video was to demonstrate the quality at different latencies.  Even at 1000ms latency (30 FPS) it actually offered decent quality.  That said, the highest average latency in the US is 70ms, while the lowest average is 12ms.  Theoretically one could do lighting rendering with 100ms latency and still offer a high-quality image, especially if, as with the Xbox One, you have a separate video stream processor that allows you to interpolate frames to increase the frame rate.
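For a sense of scale on those latency figures, a minimal sketch of the arithmetic: at a given frame rate, a result computed in the cloud is a fixed number of frames stale by the time it arrives, and interpolation can only paper over that if the scene has not changed much in the meantime. The 12ms and 70ms values are the averages cited above; the rest is plain division:

def frames_stale(latency_ms: float, fps: float) -> float:
    """How many frames old a cloud-computed result is when it arrives."""
    return latency_ms / (1000 / fps)

for latency in (12, 70, 100, 1000):   # ms
    print(f"{latency:>5} ms -> {frames_stale(latency, 30):5.1f} frames at 30fps, "
          f"{frames_stale(latency, 60):5.1f} frames at 60fps")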

Frequency said:

For CPU loads, computation is done in batch jobs, which is the route Microsoft will take. Processing and outputting video puts too much strain on the network infrastructure to be used effectively outside of lab experimentation. Developing engines to work with lightmap muxing, and then processing that live video stream, uses roughly 50% of the power saved by having an external server render it for you instead of just rendering it locally. And again, these are lab tests; in the real world, gaming environments change vastly. Once you include the unpredictable nature of online worlds and multiplayer servers, processing light externally with any degree of latency will result in very choppy lighting, as environments change dramatically before the data has had time to be processed and received.

Additionally, the latency involved in sending a job back and forth and then processing the result locally is barely any different from having to process a video stream. The only difference is that you're saving a fraction of time and bandwidth in exchange for sacrificing overall image quality, since compression of the video stream results in a loss of clarity.

Of course, you could stream uncompressed video, but then you're looking at insane latency and bandwidth usage.

Microsoft, of course, will say all of this is possible, and indeed it is, because they want the feature to be a good talking point regardless of whether actually implementing it makes any sort of sense. The reality is that the cloud is going to be used as a glorified perpetual storage system for profile and NPC data.

Of course, cloud processing of video will end up getting used eventually, but it's going to be like "1080p60" on PS3 and 360: barely anything actually uses it, but because a couple of games do, people act like it's a super important thing.

Turn 10 are basically using the cloud as a perpetual storage system: a player's profile data is plucked from the cloud by the client machine when joining a race, and the result and overall progress of that race are then uploaded and added to that profile data, ready to be used by the next client, or by the profile owner when they next sign in.

Of course, if you can point to a single instance of a game on the Xbox One that will be using remote deferred computation, I would be more than happy to debate this further. But as of right now you are cherry-picking alpha development demonstration concepts from a manufacturer that has nothing to do with either console and trying to apply them to the Xbox One, and it just isn't working.

If the bandwidth is sufficient for the data being requested and you have 10ms latency, then you will have 10ms latency.  Latency doesn't increase because you request an uncompressed file, provided the bandwidth is sufficient.  There are several causes of high latency, but compressed vs. uncompressed video isn't an inherent cause of latency.

To put this a different way, latency is pre-existing and contingent on several different in-flight factors.  There is a pre-existing latency based on the number of hops and the ability of each hop to adequately service each connection.  It also depends on the bandwidth of the equipment, the bandwidth of the network connection, the state of the equipment, whether it's hot, cold, wet, or dry where the equipment is, the number of clients the switch or server is servicing, and the time it takes to service those client requests.
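The bandwidth-versus-latency point can be put into numbers. Here is a minimal sketch where delivery time is approximated as fixed latency plus payload size divided by link bandwidth; the frame sizes and the 20 Mbps link are assumptions for illustration only. It shows why an uncompressed frame is hopeless on a home connection even when the latency itself is low:

def delivery_ms(payload_bytes: int, bandwidth_mbps: float, latency_ms: float) -> float:
    # One-way delivery time: fixed latency plus time to push the bits onto the link.
    transfer_ms = payload_bytes * 8 / (bandwidth_mbps * 1_000_000) * 1000
    return latency_ms + transfer_ms

uncompressed_720p = 1280 * 720 * 3        # ~2.7 MB for one raw 24-bit frame
compressed_frame = 5_000_000 // 8 // 30   # ~21 KB/frame from a 5 Mbps, 30fps stream

for name, size in [("uncompressed 720p frame", uncompressed_720p),
                   ("compressed frame from a 5 Mbps stream", compressed_frame)]:
    print(f"{name}: {delivery_ms(size, bandwidth_mbps=20, latency_ms=10):.0f} ms "
          "on a 20 Mbps link with 10 ms base latency")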

I'm on a connection right now that has 10ms latency to the backbone.  I can't stream shit with this connection because the bandwidth of the pipe chokes it off.  I use another connection that has up to 1300ms latency to the backbone (that's between me and the Internet -- not even getting out to the Internet) and I stream video content perfectly, using the lowest video quality (highest compression) settings.  The difference is the bandwidth.

Latency can be an issue with video, but video buffers, which typically offsets the latency.  YouTube is an exception with its new just-in-time delivery, which doesn't work.

Sorry, but the Xbox One uses discrete video processing, so you can have a video stream coming into the system and the stream processors would do all the work of breaking the video into frames and sending them to the GPU with the pre-rendered lighting.  Because of the discrete video processors, this would offload processing from the CPU and GPU, allowing those two to do other work.
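As a sketch of what combining a decoded lighting stream with locally rendered frames could look like in principle, here is a toy example using NumPy. It assumes both buffers arrive as plain RGB arrays and uses simple multiplicative blending; it is purely illustrative and not a description of the console's actual pipeline:

import numpy as np

def composite(local_frame: np.ndarray, cloud_lightmap: np.ndarray) -> np.ndarray:
    # Modulate a locally rendered frame by a cloud-delivered lighting buffer.
    # Both inputs are H x W x 3 arrays in [0, 1].  Real engines do this per-texel
    # in shaders with far more sophistication; multiplication is just the simplest case.
    return np.clip(local_frame * cloud_lightmap, 0.0, 1.0)

local = np.random.rand(720, 1280, 3)      # stand-in for the locally rendered frame
lighting = np.random.rand(720, 1280, 3)   # stand-in for one decoded frame of cloud lighting

print(composite(local, lighting).shape)   # (720, 1280, 3)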

I think what is important to consider here is everything that has been discussed in the "Secret Sauce" articles.  How many ingredients are there in McDonald's Big Mac sauce?  It isn't just one.  There are several.  Voxel lighting.  Discrete processors.  The Cloud.  300,000 servers.  You can laugh it off as wishful thinking.  You can push it aside and say it's impossible.  You can do everything possible to minimize it, but the fact is both GPU manufacturers are thinking about this and moving toward it, and Microsoft is designing the Xbox One to take advantage of it.

Does it mean that the Xbox One will be 1:1 with the power of the PS4?  Probably not.  Does it mean that the Xbox One will be able to do anything different from the PS4?  Not exactly.  But the design of the Xbox One is one in which Microsoft is planning on cloud-based processing/rendering to free up resources on the console so that the console can actually do more.

Is that going to happen Day One?  No, obviously not, but the capability over the next ten years is there.

As for Turn 10, they're using the cloud for AI.  It's a little more than storage.  The AI engine uses the play style of you and your friends to emulate their driving.  It doesn't replay their performance on a track.  You can be playing a track that they've never played and their play style will be emulated in a driver on that track.  The more varied the tracks they play, the better the game will emulate their play style, but even if all they ever did was play a single track, their play style will still be emulated on any track you play on.
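A hypothetical sketch of the "emulate the play style, not the lap" idea: what gets stored is a small profile of aggregated tendencies, and an opponent is generated from that profile on whatever track happens to be loaded. Every field and number here is invented for illustration; it is not Turn 10's actual model:

from dataclasses import dataclass
import random

@dataclass
class DriverProfile:
    # Aggregated tendencies rather than recorded laps (all fields are illustrative).
    avg_brake_distance_m: float   # how early the player tends to brake before a corner
    avg_corner_speed_kmh: float
    aggression: float             # 0..1, how readily they go for an overtake

def ai_action(profile: DriverProfile, distance_to_corner_m: float) -> str:
    # Decide an action on *any* track from the profile alone.
    if distance_to_corner_m < profile.avg_brake_distance_m:
        return "brake"
    if random.random() < profile.aggression:
        return "attempt_overtake"
    return "hold_line"

friend = DriverProfile(avg_brake_distance_m=90.0, avg_corner_speed_kmh=140.0, aggression=0.6)
print(ai_action(friend, distance_to_corner_m=75.0))   # prints brake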

The amount of data sent to and received from a multiplayer server is minuscule.  It may be constant, but you're talking about small data packets.  You're sending coordinate data, numbers.  This hardly puts a load on servers.  This was why Microsoft used peer-to-peer with the Xbox and Xbox 360: it doesn't put a massive strain on the box.  The challenge is bandwidth with home connections.
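To show how small that coordinate data really is, here is a minimal sketch of a hypothetical per-tick state update packed into a fixed binary layout. The field choice is an assumption for illustration, not any particular game's netcode:

import struct

# player_id plus position, heading, and speed: one unsigned int and five 32-bit floats.
STATE_FORMAT = "<Ifffff"

packet = struct.pack(STATE_FORMAT, 7, 1024.5, 88.25, -3.0, 179.5, 212.0)
print(len(packet), "bytes per update")                       # 24 bytes
print(len(packet) * 30, "bytes per second at 30 updates/s")  # 720 bytes/s per player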

So in multiplayer games rendering is done on the local client.  Yes, you have to wait to get data, but typically every client will slow to the pace of the slowest client in the game, or the game will automatically reposition the player onto a server with clients matching the same speed.  This happened in some cases on P2P multiplayer games, though there was the host advantage.

Bottom line, latency from off-system processing (cloud processing) isn't going to create a significant problem.  Fiber-to-the-home is where things are going, and it offers significantly lower latency, in the 10ms range.  That's what Microsoft is counting on.



Wright said:
pokoko said:

As for single-player, no.  I don't want my single-player to be internet-dependent.

 

What if the single-player games are already maxed out, and connecting to the cloud pushes them even further? That way you have a single-player game that is great on its own and receives further benefits (not possible on the game by itself) if you happen to be connected.

 

Also, great gif.

I'll leave that to the GDDR5.