
Forums - Gaming Discussion - Would it be feasible for a streaming Xbox to have a local graphics card, but let the cloud do everything else?

Snoopy said:

For example, the rumored streaming Xbox is supposed to be released around 2020. Would it make sense to let the cloud handle everything other than graphics, to decrease latency? That way the streaming Xbox wouldn't cost as much and could still deliver a great experience.

The graphics rendering isn't actually what adds latency... it's the "other stuff".
Your idea would lower the amount of bandwidth needed to stream, but the graphics part of a console is what makes up most of its cost... so you'd basically be paying almost the same as for a complete console, for a lesser experience. Why would anyone do that?

So your thinking is kind of backwards... which is also why Microsoft is rumored to be doing it the other way around from what you're suggesting.
MS's plan was to keep the input and CPU calculations in a box you buy, have them do the rendering server-side, and stream that to you.
That way you get a streaming box that has less latency.
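To spell out the two splits being compared, here is a small descriptive sketch. The component lists are just my own summary of the thread, not a confirmed MS design:

```python
# The two ways of splitting a "streaming" console discussed in this thread.
# These are summaries of the forum discussion, not a confirmed MS design.

op_idea = {
    "local box":  ["input", "GPU rendering", "video output"],
    "cloud":      ["game logic / CPU simulation", "physics", "AI"],
    "network carries": "game state down to the local GPU every frame",
}

rumored_ms_split = {
    "local box":  ["input", "game logic / CPU simulation"],
    "cloud":      ["GPU rendering", "video encoding"],
    "network carries": "a compressed video stream back to the box",
}

for name, split in (("OP's idea", op_idea), ("Rumored MS split", rumored_ms_split)):
    print(f"{name}:")
    for key, value in split.items():
        print(f"  {key}: {value}")
```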





Intrinsic said:
It does not work that way. If anything, what you are suggesting will increase latency, not reduce it.

He basically got it backwards :)
It's the complete opposite of what MS was rumored to be doing to help reduce latency when streaming.

His idea would greatly reduce the amount of bandwidth needed to stream, but it would basically require a complete console.
The money saved would be minimal, to the point where you might as well just have a real console.

Like, why would anyone want to be forced to stream if, for the same money, they could get an actual console that gives a better experience?



It is what MS promised with the Xbox One.



 "I think people should define the word crap" - Kirby007

Join the Prediction League http://www.vgchartz.com/predictions

Instead of seeking to convince others, we can be open to changing our own minds, and seek out information that contradicts our own steadfast point of view. Maybe it’ll turn out that those who disagree with you actually have a solid grasp of the facts. There’s a slight possibility that, after all, you’re the one who’s wrong.

kirby007 said:
It is what MS promised with the Xbox One.

Not quite the same. AFAIK it was mostly about particle effects and such, which could be introduced as some sort of filter over the original, console-generated image, so even if there was latency, it wouldn't affect the game. No explanation was ever given, however, as to how the streaming service would interact with the GPU frame buffer etc. and actually do what it promised to do. I suppose there was a way, and it was actually a thing at some point, before being shelved, so who knows.

Edit - an alternative could be, say, the local hardware rendering a 1080p image with 8 TFLOPS and the streaming service using ~2 TFLOPS or so to upscale the image to a very respectable 4K with checkerboarding. But again, the latency would have to be on the order of a single frame buffer, or it would just be as bad as or worse than standard streaming.
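As a rough sanity check on that caveat, here is the upstream cost of shipping an uncompressed 1080p frame buffer to the cloud every frame. The 100 Mbit/s uplink is an assumed figure for illustration; compressing the frame first would help, but then you are back to encode/decode latency, which is the same caveat:

```python
# Rough arithmetic for the "local 1080p, cloud upscales to 4K" idea.
# The 100 Mbit/s uplink is an assumed figure for illustration.

WIDTH, HEIGHT, BYTES_PER_PIXEL = 1920, 1080, 4   # uncompressed RGBA frame
FPS = 60
UPLINK_MBIT = 100.0

frame_mb = WIDTH * HEIGHT * BYTES_PER_PIXEL / 1e6     # ~8.3 MB per frame
needed_mbit = frame_mb * 8 * FPS                      # ~4000 Mbit/s sustained
upload_ms = frame_mb * 8 / UPLINK_MBIT * 1000         # ~660 ms per frame

print(f"one 1080p frame:        {frame_mb:.1f} MB")
print(f"sustained upstream:     {needed_mbit:.0f} Mbit/s at {FPS} fps")
print(f"upload time per frame:  {upload_ms:.0f} ms on a {UPLINK_MBIT:.0f} Mbit/s uplink")
```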

Last edited by haxxiy - on 24 March 2019

Intrinsic said:
Snoopy said:
That's why I thought the local graphics card handling the resolution would take some of the payload off the bandwidth.

Naa... that's sending significantly more information down the chain. Think of it like a PC with a discrete CPU and GPU: the CPU and GPU are connected via a PCIe 3 bus, and what you are suggesting basically turns the internet into your PCIe bus. That is a lot of bandwidth... a LOT (some rough numbers are below this post).

What they could do, though, is make a really proprietary video codec. Something that would require a good amount of power to decode, paired with some sort of dedicated, specialized chip that pretty much only excels at decoding that codec. Such an approach could cut the bandwidth required for video streams by as much as 70%, if it were even possible.

I think it's possible, but the reason no one is doing it is that only people who buy your hardware could use your service. At that point you might as well just build a conventional console.

Anyway, you were talking about latency... bandwidth and latency are mostly two very different things.
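For a sense of the gap being pointed at with the PCIe analogy, here is the raw arithmetic. The PCIe figure is the nominal spec throughput; the broadband speed is an assumed "decent home connection" value:

```python
# How far apart a PCIe 3.0 x16 link and a home internet connection really are.
# 100 Mbit/s is an assumed "decent broadband" figure for illustration.

PCIE3_X16_GBS = 15.75            # nominal PCIe 3.0 x16 throughput, GB/s
BROADBAND_MBIT = 100.0           # assumed home connection
broadband_gbs = BROADBAND_MBIT / 8 / 1000   # Mbit/s -> GB/s

print(f"PCIe 3.0 x16 : {PCIE3_X16_GBS:.2f} GB/s")
print(f"broadband    : {broadband_gbs:.4f} GB/s")
print(f"ratio        : ~{PCIE3_X16_GBS / broadband_gbs:,.0f}x")
```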

Your analogy is actually spot on.

The codecs are already in AMD's video decode block, though, albeit... it's fallen severely behind nVidia in recent times (the exception being the Ryzen notebook APUs).
H.266 should provide significant reductions in bandwidth and enable 8K.
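As a rough, hedged rule of thumb: the 35 Mbit/s baseline below is an assumed 4K game-stream bitrate, the ~50% saving is H.266/VVC's commonly cited design target rather than a guaranteed result, and treating bitrate as scaling linearly with pixel count is pessimistic compared to what real encoders achieve:

```python
# Rough bitrate rule of thumb for the H.265 -> H.266 (VVC) step.
# The 4K baseline bitrate is an assumption; VVC's ~50% saving at equal
# quality is its commonly cited design target, not a guarantee.

BASE_4K_H265_MBIT = 35.0          # assumed 4K HEVC game stream
PIXEL_RATIO_8K_VS_4K = 4.0        # 8K has four times the pixels of 4K
VVC_SAVING = 0.5                  # ~50% bitrate reduction vs HEVC

h265_8k = BASE_4K_H265_MBIT * PIXEL_RATIO_8K_VS_4K
h266_8k = h265_8k * (1 - VVC_SAVING)

print(f"8K with H.265: ~{h265_8k:.0f} Mbit/s")
print(f"8K with H.266: ~{h266_8k:.0f} Mbit/s")
```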

Snoopy said:

For example, the rumored streaming Xbox is supposed to be released around 2020. Would it make sense to let the cloud handle everything other than graphics, to decrease latency? That way the streaming Xbox wouldn't cost as much and could still deliver a great experience.

There are so many ways to skin a cat.

In my opinion... if latency is the biggest issue, then you want as much work performed on the host hardware as possible... but then offload whatever is not latency-sensitive to the cloud, such as weather effects.
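A minimal sketch of that idea; the task list, latency budgets, and round-trip time are all assumed, illustrative values:

```python
# Sketch of partitioning work between the box and the cloud by latency budget.
# Tasks, budgets, and the RTT are illustrative assumptions.

CLOUD_RTT_MS = 30.0   # assumed round trip to the data centre

# How quickly each subsystem's result must be reflected on screen.
latency_budget_ms = {
    "player input / camera":   16,
    "core physics":            16,
    "rendering":               16,
    "enemy AI planning":       250,
    "crowd / background NPCs": 500,
    "weather simulation":      1000,
}

for task, budget in latency_budget_ms.items():
    where = "local box" if budget < CLOUD_RTT_MS else "cloud"
    print(f"{task:<24} budget {budget:>4} ms -> {where}")
```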





--::{PC Gaming Master Race}::--