
Would it be feasible for a streaming Xbox to have a local graphics card, but let the cloud do everything else?

For example, the rumored streaming Xbox is supposed to be released around 2020. Would it make sense to let the cloud handle everything other than graphics to decrease latency? That way, the streaming Xbox wouldn't cost as much and could still deliver a great experience.



If Google can do it, MS won't be far behind.



 "I think people should define the word crap" - Kirby007

Join the Prediction League http://www.vgchartz.com/predictions

Instead of seeking to convince others, we can be open to changing our own minds, and seek out information that contradicts our own steadfast point of view. Maybe it’ll turn out that those who disagree with you actually have a solid grasp of the facts. There’s a slight possibility that, after all, you’re the one who’s wrong.

It does not work that way. If anything, what you are suggesting would increase latency, not reduce it.



Separating two distinct compute points like that would increase latency and be inefficient, which is why I found the initial plug for the XBONE kind of dumb, especially given where we _still_ are with the internet today. There are some advantages, but only for certain kinds of games.






BasilZero said:

Why would you separate the two?

So you can have the option to stream and to play games physically? It doesn't work like that, though...

No, I mean the cloud handles CPU-related tasks. I imagine the reason there is so much latency in regular streaming is the graphics.



Snoopy said:

No, I mean the cloud handles CPU-related tasks. I imagine the reason there is so much latency in regular streaming is the graphics.

Rather, that's a bandwidth issue, which is why streaming games will end up costing more data than streaming Netflix at the same resolution. It can be mitigated somewhat for video, because video and audio streams can be buffered while game streams really cannot. The latency comes from the controller input having to travel from the controller to the receiver, over the internet to the server, get computed, and then the video travelling from the server back over the internet to your screen. In traditional gaming it's controller to console, computed, to screen. The internet is the bottleneck, twice.
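A rough back-of-the-envelope comparison of the two paths (the per-hop numbers below are assumptions purely for illustration; the point is that the internet shows up twice in the streaming path and not at all in the local one):

# Illustrative button-to-photon latency budgets; all figures are assumed, in ms.
LOCAL_PATH = {
    "wireless controller": 8,
    "game update + render (one 60 fps frame)": 17,
    "TV/display processing": 10,
}
STREAMING_PATH = {
    "wireless controller": 8,
    "client device, input out / video decode": 15,
    "internet: client -> server": 25,
    "game update + render (one 60 fps frame)": 17,
    "video encode on the server": 5,
    "internet: server -> client": 25,
    "TV/display processing": 10,
}

for name, path in [("local", LOCAL_PATH), ("streaming", STREAMING_PATH)]:
    print(f"{name:9s}: ~{sum(path.values())} ms")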






That's basically setting an "obsolete" timer on all games.



Snoopy said:

No, I mean the cloud handles CPU-related tasks. I imagine the reason there is so much latency in regular streaming is the graphics.

Nope...

In a perfect streaming world... the graphics on display are irrelevant. What is relevant, though, is the actual video bandwidth and the time it takes for the client and the server hosting the game to talk to each other.

Those are the only two real hurdles faced by every game streaming platform ever made, and they will face anyone who tries getting into game streaming. The resolution on the display (it doesn't matter what kind of graphics you are pushing) and the framerate determine the bandwidth required. If either of those two variables goes up, so does the bandwidth.
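A rough sketch of how resolution and framerate drive the required bandwidth (the bits-per-pixel figure is an assumption standing in for whatever the codec actually achieves on real content):

# Very rough estimate: compressed video at an assumed ~0.1 bits per pixel per frame.
def stream_bandwidth_mbps(width, height, fps, bits_per_pixel=0.1):
    return width * height * fps * bits_per_pixel / 1_000_000

print(stream_bandwidth_mbps(1280, 720, 30))    # ~2.8 Mbps
print(stream_bandwidth_mbps(1920, 1080, 60))   # ~12.4 Mbps
print(stream_bandwidth_mbps(3840, 2160, 60))   # ~49.8 Mbps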

Then you have latency, which comes down to the number of connections in the chain. In a typical setup (local hardware connected to a display) you have (1) a controller sending input to the hardware, (2) the hardware sending a video signal to the TV/display, and (3) the TV/display outputting the image. And that's it. Every connection point listed there makes its own contribution to the overall latency. Even something as small as switching the controller in (1) between wired and wireless adds or removes some latency.

With a streaming service you have (1) input being sent to the server and (2) the server sending video back to the client. That's it... Sounds simple enough, doesn't it? The issue is that with streaming, each of those points is a lot more complicated. Take (1) for instance: what are the connection speed and quality of the client (do you have enough bandwidth for decent resolution and framerate, and is the connection strong and stable), and how many points (local processing hardware like a laptop, Android TV box, Chromecast, your house router, your ISP, etc.) does that input bounce off or pass through before reaching the primary host (Stadia in this case)? Then there is the "road back" issue, which is something local hardware has never had to deal with: to even see the result of that input, it pretty much has to make a round trip from you, out into the world, and then back to you and your display.

And that's all just talking about the first part... That's what adds to the latency, and processing half of the game locally isn't going to change any of that. There is something that can change some of it, though... but that's another story.
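To put rough numbers on that round trip (the network and encode figures are assumptions, just to show the scale of the problem relative to a single frame of gameplay):

# At 60 fps a locally rendered game shows a new frame every ~16.7 ms.
FRAME_TIME_60FPS_MS = 1000 / 60   # ~16.7 ms
NETWORK_RTT_MS = 30               # assumed round trip to the data centre and back
ENCODE_DECODE_MS = 10             # assumed video encode (server) + decode (client)

overhead = NETWORK_RTT_MS + ENCODE_DECODE_MS
print(f"frame time at 60 fps: {FRAME_TIME_60FPS_MS:.1f} ms")
print(f"streaming overhead  : {overhead} ms (~{overhead / FRAME_TIME_60FPS_MS:.1f} frames of extra lag)")
# Running the 'CPU side' in the cloud and the graphics locally would not remove this
# overhead: the input still has to cross the internet, and the results still have to come back.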



Intrinsic said:
In a perfect streaming world... the graphics on display are irrelevant. What is relevant, though, is the actual video bandwidth and the time it takes for the client and the server hosting the game to talk to each other. [...]

That's why I thought having the local graphics card handle the resolution would take some of the load off the bandwidth.



Snoopy said:
That's why I thought having the local graphics card handle the resolution would take some of the load off the bandwidth.

Naa... that's sending significantly more information down the chain. Think of it like a PC with a discrete CPU and GPU, where the CPU and GPU are connected via a PCIe 3 bus. What you are suggesting basically turns the internet into your PCIe bus. That is a lot of bandwidth... a LOT.
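For a sense of scale, here is that comparison with numbers (the home connection speed is an assumed example; the PCIe figure follows from the PCIe 3.0 spec):

# PCIe 3.0: 8 GT/s per lane with 128b/130b encoding ≈ 0.985 GB/s per lane.
pcie3_gbytes_per_lane = 8 * 128 / 130 / 8        # ~0.985 GB/s per lane
pcie3_x16_gbit = pcie3_gbytes_per_lane * 16 * 8  # ~126 Gbit/s for an x16 GPU slot

home_internet_gbit = 0.1                         # assumed 100 Mbit/s home connection

print(f"PCIe 3.0 x16 : ~{pcie3_x16_gbit:.0f} Gbit/s")
print(f"home internet: {home_internet_gbit * 1000:.0f} Mbit/s, "
      f"roughly {pcie3_x16_gbit / home_internet_gbit:.0f}x slower")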

What they could do, though, is make some really proprietary video codec, something that would require a good amount of power to decode, and pair it with some sort of dedicated, specialized chip that pretty much only excels at decoding that codec. Such an approach could cut the bandwidth required for the video stream by as much as 70%, if it were even possible.
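Just to show what that hypothetical 70% saving would mean against the assumed 1080p60 figure from earlier:

# Hypothetical proprietary codec cutting the video bitrate by 70%.
baseline_mbps = 12.4                       # assumed 1080p60 stream from the earlier estimate
reduced_mbps = baseline_mbps * (1 - 0.70)  # ~3.7 Mbps
print(f"{baseline_mbps} Mbps -> {reduced_mbps:.1f} Mbps")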

I think it's possible, though. But the reason no one is doing it is that it would mean only people who buy your hardware can use your service, and at that point you might as well just build a conventional console.

Anyways, you were talking about latency... bandwidth and latency are mostly two very different things.