
Cloud Computing is a newfangled marketing phrase for Client/Server.

Client/Server applications are what we had when dumb terminals were hooked up to mainframes or Unix servers; later, PCs were hooked up to them. The dumb terminals or PCs ran applications that essentially lived on the server or mainframe, which let the far more powerful machine do the majority of the work.

For the past decade or so, Web-based applications have been doing a similar thing. At one time, ActiveX or Java was needed on the client to provide the UI while the application itself ran on the server; the stub application then sent data up to the server. Today, Web-based applications use scripting languages to present the UI in the Web browser, but the apps still send data up to the server to get processed. About 7 years ago, Oracle and Microsoft started working on competing standards for remote computing that were essentially the same thing: Web services. A Web service is an application that runs on a server and can be called remotely by any type of application. The majority of an application's functionality can be provided through a Web service, or just certain feature sets can.
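As a rough sketch of the idea (the URL and payload shape here are made up purely for illustration, not any real service), calling a Web service from TypeScript looks something like this:

    // The client sends data up, the server processes it, and the
    // result comes back down. Any type of application can make this
    // same call: a browser script, a desktop app, or a game console.
    async function callWebService(payload: { values: number[] }): Promise<number> {
      const response = await fetch("https://example.com/api/sum", {
        method: "POST",
        headers: { "Content-Type": "application/json" },
        body: JSON.stringify(payload),      // data goes up to the server
      });
      const result = await response.json(); // processed result comes back down
      return result.sum;
    }

    callWebService({ values: [1, 2, 3] }).then((sum) => console.log(sum));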

With the Xbox One, some aspects of the game can be housed on a server and accessed by the game on the console, much like with Web services. The game sends data up, it gets processed, and the results are sent back down to the console to present on screen. There are a lot of features in a game that can be off-loaded as server-side processes. Running NPC AI, for example. Why waste processing power on the console running AI for characters that aren't on screen? Off-load that to a server, let it run there, and access the results when the game actually needs them.
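Purely as a hypothetical sketch (the endpoint and data shapes are invented for illustration, not anything Microsoft has published), off-loading off-screen NPC AI could look something like this:

    // State for an NPC that is currently off screen.
    interface NpcState {
      id: number;
      position: { x: number; y: number };
      goal: string;
    }

    // Send off-screen NPCs up to a server to be simulated there, instead of
    // spending console CPU time on characters nobody can see. The updated
    // state only has to be pulled back down when those NPCs come into view.
    async function simulateOffscreenNpcs(npcs: NpcState[]): Promise<NpcState[]> {
      const response = await fetch("https://example.com/npc/simulate", {
        method: "POST",
        headers: { "Content-Type": "application/json" },
        body: JSON.stringify({ npcs, deltaSeconds: 60 }),
      });
      const { npcs: updated } = await response.json();
      return updated as NpcState[];
    }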

What it won't be used for, at least not in this generation, is real-time rendering.