Arkaign said: I can explain the Build demo quite easily: they didn't show us anything at all in terms of how they connected the units. For all we know they ran 10GbE or a direct-bus connection between the systems to handle the bandwidth involved. It was also quite sluggish, with a poor framerate even so. If they really wanted to impress people, they would have said "here we are logging on to our Microsoft data center in South Africa, and watch how our client can manipulate this physics test!". But they didn't even go so far as running a test across town. I don't have to tell you that even Gbit LAN, good old lowly Gbit, is an order of magnitude quicker than any average home ISP in the 1.5 to 50 Mbit range. Even 'Mbit' is deceptive, because the hops are mind-boggling sometimes, and neither the ISP nor Microsoft can completely control all the various hardware between the client and the server. Run a tracert on a few of your favorite sites and enjoy watching the various layers tell you how 20th century our global network truly is.

In closing, I have no problem with people keeping the faith that this is somehow going to work in a way that's relevant in the near future. I also have complete faith in the fact that this is unfeasible on a variety of levels so overwhelming that it's almost comedic to entertain. I've seen COUNTLESS promises from a huge variety of tech companies over the years that existed either only as R&D/patent progress to eventually be repurposed or sold elsewhere, or as a PR puff piece to boost their image or stock price and raise investor confidence. This smells 1000% of that, particularly when you match it up with the fact that Microsoft has been beaten to hell and back for having irrefutably second-rate hardware that launched at a higher price. They DESPERATELY need a win in the 'tech/specs' PR campaign, in their minds. Whereas in my mind they're better off keeping their eye on the real aspects that they CAN control that will help: a great message and a great library of titles. If heaven and earth part and they DO get the impossible cloud physics model working in the real world, with real people, with real crappy internet, then that would be that much more impressive than touting something that they can't even demo in an open and believable manner. If they had a demo that could work on XB1, they KNOW how much that would impress people and get good word of mouth. All they would have to do is put it on the XB store. But they can't, they won't, and almost certainly this will never work in this gen.
It would not need anywhere near that much bandwidth. But I think I know why you believe it would: you're picturing the console having to send the server every piece of information about the world, with data transiting back and forth between console and server the way it does with a GPU. Isn't that it?
Actually, it will be much more similar to PS Now than you seem to think.
On PlayStation Now, every aspect of the game is processed on a server (graphics, physics, pathfinding, AI) exactly the same way it would be on your console, and the output is then streamed to your TV, Vita, etc., the same way the console would output it (see the sketch below).
Benefit: you could play any game on any device that can connect to the internet and output video.
Drawback: you are dependent on bandwidth and latency to get a good experience.
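To make that concrete, here's a minimal, purely illustrative Python sketch of the PS Now model: the server simulates and renders everything, then streams encoded frames down, while the client only sends controller input up and displays what arrives. Every name here is made up, and simple queues stand in for the network; it's a toy under those assumptions, not how Sony actually implements it.

```python
from collections import deque

# Hypothetical stand-ins for the network links (no real sockets or encoder).
to_client = deque()   # downstream: encoded video frames
to_server = deque()   # upstream: controller input

def encode(frame_text):
    # Stand-in for H.264-style encoding of a rendered image.
    return frame_text.encode("utf-8")

def server_tick(game_state, inputs):
    # The server runs EVERYTHING: input handling, AI, physics, rendering...
    game_state["x"] += inputs.get("move", 0)
    frame = f"frame: player at x={game_state['x']}"
    # ...then encodes the finished frame and streams it down.
    to_client.append(encode(frame))

def client_tick(move):
    # The client only forwards input and displays whatever video arrives.
    to_server.append({"move": move})
    while to_client:
        print(to_client.popleft().decode("utf-8"))

state = {"x": 0}
for _ in range(3):
    client_tick(move=1)
    server_tick(state, to_server.popleft() if to_server else {})
    # The client only sees the result on the next tick: that round trip,
    # plus video encode/decode, is where bandwidth and latency bite.
```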
To solve the bandwidth problem, we could build a system that runs every aspect of the game on a server except the graphics.
The server still processes physics, pathfinding, AI, etc., but only sends you the position and rotation of the objects in the scene for your console to render; Diablo 3, multiplayer games with dedicated servers, and multiplayer games in general (except for the host) use this technique. See the second sketch below.
Benefit: far lower bandwidth use.
Drawback: you need a console to render the scene, and you're still dependent on latency.
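Here's the same idea sketched with the graphics kept local: the server still runs the simulation, but a snapshot is only a few dozen bytes per object (an id, a position, a rotation) instead of a video stream. The names and packing format are mine, chosen just to show the scale of the bandwidth difference.

```python
import struct

# One record per object: 4-byte id + position (x, y, z) + rotation quaternion
# (x, y, z, w) = 4 + 7 * 4 = 32 bytes. Format chosen purely for illustration.
RECORD = "<Ifffffff"

def server_snapshot(objects):
    # The server has already run physics/AI; it only serializes transforms.
    return b"".join(
        struct.pack(RECORD, obj_id, *pos, *rot)
        for obj_id, (pos, rot) in objects.items()
    )

def console_apply(payload, scene):
    # The console unpacks the transforms and renders the scene locally.
    size = struct.calcsize(RECORD)
    for offset in range(0, len(payload), size):
        obj_id, px, py, pz, rx, ry, rz, rw = struct.unpack_from(RECORD, payload, offset)
        scene[obj_id] = {"pos": (px, py, pz), "rot": (rx, ry, rz, rw)}

objects = {1: ((0.0, 1.0, 2.0), (0.0, 0.0, 0.0, 1.0)),
           2: ((5.0, 0.0, -3.0), (0.0, 0.7, 0.0, 0.7))}
snapshot = server_snapshot(objects)
scene = {}
console_apply(snapshot, scene)
print(len(snapshot), "bytes for", len(scene), "objects")  # 64 bytes vs. a whole video frame
```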
But why would I need a server for that computation when my console could do it?
Here's the trick: servers, and above all virtualized servers, are not limited in resources the way consoles are. You could create better physics, better AI, better pathfinding, etc.
And if you have access to a datacenter, you could create games whose worlds are more persistent and keep evolving.
But you're still dependent on latency?
Yes, but no more than a streaming service like PS Now is.
And here's where the cloud processing Microsoft is promoting comes in.
Instead of running every aspect of the game except graphics on the server, the servers only process certain features; the console still runs everything that needs to stay in sync.
So in the case of Crackdown 3's destructible environment, the console will almost certainly still handle particles and smaller chunks, while the server handles the large chunks and the bigger events, as Machiavellian already pointed out (a rough sketch of that split follows below).
Benefit: it does not need high bandwidth and is less dependent on latency (depending on the kind of feature).
Drawback: you need a console to render the scene.
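And to illustrate the kind of split I mean for the destruction case (this is purely my guess at how the work could be carved up, with made-up names and an arbitrary threshold, not anything Microsoft has shown): the console keeps the latency-sensitive small debris, and only the big, shared, latency-tolerant chunks get shipped off to the server.

```python
# Hypothetical cutoff between "cosmetic debris" and "structural chunk".
LOCAL_MASS_LIMIT = 50.0

def on_explosion(chunks, simulate_locally, send_to_server):
    for chunk in chunks:
        if chunk["mass"] <= LOCAL_MASS_LIMIT:
            # Small particles/debris: must react instantly, nobody needs them synced.
            simulate_locally(chunk)
        else:
            # Large chunks: expensive, shared by every player, can tolerate a round trip
            # before the server sends their new transforms back (as in the sketch above).
            send_to_server(chunk)

local_log, remote_log = [], []
on_explosion(
    chunks=[{"id": 1, "mass": 3.0}, {"id": 2, "mass": 400.0}, {"id": 3, "mass": 12.0}],
    simulate_locally=local_log.append,
    send_to_server=remote_log.append,
)
print("console:", [c["id"] for c in local_log])   # ids 1 and 3 stay local
print("server :", [c["id"] for c in remote_log])  # id 2 goes to the datacenter
```

If the server result arrives a couple of hundred milliseconds late, a falling wall section is still believable; a muzzle flash or a puff of dust would not be, which is exactly why that part stays on the console.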