Intrinsic said:
Naa... that's sending significantly more information down the chain. Think of it like having a PC with a discrete CPU and GPU, where the CPU and GPU are connected via a PCIe 3 bus. What you are suggesting basically turns the internet into your PCIe bus. That is a lot of bandwidth... a LOT. What they could do, though, is make a really proprietary video codec, something that requires a good amount of power to decode, and pair it with some sort of dedicated specialized chip which pretty much only excels at decoding that codec. Such an approach could cut down the bandwidth required for video streams by as much as 70%, if it were even possible. I think it's possible, though; the reason no one is doing it is that it means only people who buy your hardware can use your service, and at that point you might as well just build a conventional console. Anyway, you were talking about latency... bandwidth and latency are mostly two very different things. |
Your analogy is actually spot on.
The codecs are already in AMD's video decoder, though... albeit it has fallen severely behind nVidia in recent times. (The exception being the Ryzen notebook APUs.)
H.266 should provide significant reductions in bandwidth and enable 8K.
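To put some rough numbers on the bandwidth discussion above, here is a back-of-envelope sketch. The stream bitrate, the 70% reduction, and the PCIe 3.0 figures are illustrative assumptions for comparison, not measurements of any real service:

```python
# Back-of-envelope comparison: raw video vs. a compressed stream vs. PCIe 3.0.
# All figures are illustrative assumptions, not benchmarks.

def raw_video_mbit(width, height, bytes_per_pixel, fps):
    """Bandwidth of uncompressed video, in megabits per second."""
    return width * height * bytes_per_pixel * fps * 8 / 1e6

# Uncompressed 1080p60 with 24-bit colour: ~2,986 Mbit/s.
raw = raw_video_mbit(1920, 1080, 3, 60)

# A commonly quoted bitrate for a 1080p60 game stream (assumed): 20 Mbit/s.
typical_stream = 20.0

# The hypothetical proprietary codec cutting bandwidth by 70%, as discussed.
hypothetical_codec = typical_stream * (1 - 0.70)   # ~6 Mbit/s

# PCIe 3.0 x16: 16 lanes at 8 GT/s with 128b/130b encoding overhead,
# expressed in Mbit/s: ~126,000 Mbit/s.
pcie3_x16_mbit = 16 * 8_000 * 128 / 130

print(f"raw 1080p60:        {raw:10.1f} Mbit/s")
print(f"typical stream:     {typical_stream:10.1f} Mbit/s")
print(f"hypothetical codec: {hypothetical_codec:10.1f} Mbit/s")
print(f"PCIe 3.0 x16:       {pcie3_x16_mbit:10.1f} Mbit/s")
```

Even the uncompressed frame buffer is tiny next to what the PCIe link carries between CPU and GPU (textures, geometry, commands), which is why "internet as your PCIe bus" doesn't work, while a compressed video stream fits comfortably in a home connection.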
| Snoopy said: For example, rumor has it a streaming Xbox is going to be released around 2020. Would it make sense to have the cloud handle everything other than graphics, to decrease latency? That way, the streaming Xbox wouldn't cost as much and could still deliver a great experience? |
There are so many ways to skin a cat.
In my opinion... if latency is the biggest issue, then you want as much work performed on the host hardware as possible, but offload whatever is not latency-sensitive to the cloud, such as weather effects.
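The hybrid split above can be sketched as a simple placement rule. The task names, tolerances, and the 50 ms cloud round-trip budget are all made-up illustrative values, not details of any real console:

```python
# Toy sketch of the hybrid approach: keep latency-sensitive work on the host,
# push latency-tolerant work to the cloud. All numbers are assumptions.

# How much delay (ms) each workload can tolerate before the player notices.
TASKS = {
    "input handling":     5,     # must be immediate
    "rendering":          16,    # one frame at 60 fps
    "physics":            16,
    "ai pathfinding":     500,   # a beat of hesitation is fine
    "weather simulation": 2000,  # the cloud candidate mentioned above
}

CLOUD_ROUND_TRIP_MS = 50  # assumed cost of a network hop to the cloud

def place_tasks(tasks, round_trip=CLOUD_ROUND_TRIP_MS):
    """Run a task locally if it cannot absorb a cloud round trip."""
    return {name: ("host" if tolerance < round_trip else "cloud")
            for name, tolerance in tasks.items()}

for name, where in place_tasks(TASKS).items():
    print(f"{name:20s} -> {where}")
```

The design point is simply that the cut is made per workload by latency tolerance, not by how expensive the workload is.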

www.youtube.com/@Pemalite