EpicRandy said:
Yes, but the difference is all win. Cloud compute requires less bandwidth, and it isn't dependent on lag or latency issues. Microsoft has already stated that a 1.5 Mbps connection would be fine. The worldwide average bandwidth is about 2.9 Mbps. http://www.examiner.com/article/xbox-one-has-a-1-5mbps-download-speed-internet-requirement So what's the big deal?
It IS a big deal.
"Yes but the differences is all win."
No! With full game streaming and online gaming, the dev writes a single specification for how those things work. The only variable left is lag, which depends directly on network conditions.
With this cloud compute, a partial set of calculations is chosen to be done online. The whole problem comes down to that choice. The choice has to be scalable - i.e. it has to change as network conditions change. So now a dev decides (or is forced) to run a certain aspect of their rendering through cloud compute. But what happens if the connection falters mid-render? And what happens for the offline player?
What you end up with is SERIOUS fragmentation. How many sets of rendering code paths will there be? What does the offline gamer lose? Can't sacrifice AI or physics for those without cloud compute, or can we?
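To make that concrete, here's a minimal sketch of the branching a dev gets stuck maintaining. This is purely hypothetical code - not any real SDK, and every name and quality tier in it is my own assumption:

```cpp
// Hypothetical sketch: three code paths a dev must maintain once part of
// the simulation can be offloaded to the cloud. No real API is implied.
#include <iostream>
#include <string>

// Assumed network states; a real title would have to probe these constantly.
enum class NetworkState { Offline, Degraded, Good };

struct SimResult {
    std::string detail;  // which quality tier actually ran this tick
};

// Path 1: full local fallback -- must exist for the offline player.
SimResult simulateLocal() {
    return {"local: reduced AI crowd, coarse physics"};
}

// Path 2: degraded hybrid -- cloud results may arrive late or not at all,
// so the frame still needs a cheap local stand-in while it waits.
SimResult simulateHybridDegraded() {
    return {"hybrid: local physics + stale cloud AI from last good tick"};
}

// Path 3: full cloud-assisted tier -- the marketing-demo version.
SimResult simulateCloudAssisted() {
    return {"cloud: dense AI crowd, fine-grained destruction physics"};
}

// The "scalable choice" itself: the engine has to pick a tier every tick,
// because network conditions can change mid-session.
SimResult stepSimulation(NetworkState net) {
    switch (net) {
        case NetworkState::Good:     return simulateCloudAssisted();
        case NetworkState::Degraded: return simulateHybridDegraded();
        case NetworkState::Offline:  return simulateLocal();
    }
    return simulateLocal();
}

int main() {
    // A connection dropping mid-game: three ticks, three different games.
    for (NetworkState net : {NetworkState::Good, NetworkState::Degraded,
                             NetworkState::Offline}) {
        std::cout << stepSimulation(net).detail << '\n';
    }
}
```

Three tiers means three things to build, test, and balance - and the player can slide between them mid-session. That's the fragmentation.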
Headaches for devs. That's what this is. Serious, serious headaches. And for what!?
10% better performance half the time? 20%?
You don't need to be a tech guru to be able to stare down this rabbit hole. It is ugly. It is very deep and very ugly.
*note: features such as Drivatars are not part of this problem whatsoever. Online (and off-time!) calculations are a fantastic idea, and I think Drivatars are awesome. But this is hardly new.