NathObeaN said:
Michelasso said:

You're comparing two completely different types of computational processes. Memory calculations happen extremely quickly on local hardware. That information would NOT be sent across the wire (in the same way that, today, it is NOT sent over an HDMI cable attached to your television). Local computations run on the hardware; once processed, the results are passed to the output medium. Example:

Code (written by developers) > Calculations (required depending on variables such as player movement) > Processor (the calculations are processed) > Output (the result of these calculations is passed to your screen).

Neither the code nor the calculations ever reach the output. They are processed by the hardware and forwarded to the output device.
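As a toy Python sketch of that flow (all names here are made up purely for illustration, not any real engine API):

```python
# Toy version of the pipeline above; none of these names are a real API.

def simulate(position, movement):
    # Calculations: update the game state from player input
    return position + movement

def render(position):
    # Processor: turn the updated state into a finished frame
    return f"player at x={position:.2f}"

def display(frame):
    # Output: only the finished result ever reaches the screen
    print(frame)

position = 0.0
for movement in (1.0, 0.5, -0.25):  # simulated per-frame player input
    position = simulate(position, movement)
    display(render(position))  # the intermediate calculations never leave the box
```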

It would work exactly the same with a server. The server processes the calculations required for physics/light/AI etc., and then the output is passed to the end user via the network (internet). The only difference between the technology used by game streaming and Crackdown's tech is that instead of passing the raw output (a video stream), the servers pass a partially computed result to the end user's device for further processing.
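The same split, sketched in the same toy style (hypothetical names; the plain function call below stands in for the actual network hop):

```python
# Same toy style, but the heavy step runs "on the server" and only a
# compact partial result crosses the wire.

def server_step(world, dt):
    # Heavy physics/lighting/AI work happens server-side
    updated = {name: pos + dt * 9.8 for name, pos in world.items()}  # stand-in physics
    return {"positions": updated}  # the partially computed result that gets sent

def client_step(payload):
    # The console merges the server's result and finishes the frame locally
    for name, pos in payload["positions"].items():
        print(f"render {name} at y={pos:.3f}")

world = {"debris_1": 0.0, "debris_2": 5.0}
payload = server_step(world, dt=1 / 60)  # in reality: a round trip over the internet
client_step(payload)
```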

With regards to 'can this tech improve frame rate?', of course it can. It cannot improve the frame rate of existing games, because they simply aren't designed with this technology in mind. New games, however, can take advantage of it. Crackdown is the perfect example: if it weren't for the power of the servers running the physics calculations, the frame rate on the Xbox would grind to a halt. Instead, it runs more fluidly because some of the calculations are offloaded to a separate device (the servers). Physics is just one area where this technology can be used. As other companies (including NVIDIA) have demonstrated, it can also be used to calculate some lighting information, AI, etc.

The real question is not CAN it be done, but WILL it be done. Most developers simply will not entertain the idea of using cloud computing for a long time due to:

  • Development budget - it's new tech, and it will cost more to learn and develop for.
  • Audience - not everyone has a good enough internet connection, and currently Microsoft is the only platform holder with this technology, so what third party in their right mind would limit themselves?
  • Risk - it's new, the tech is uncertain, its capabilities are uncertain, and publishers are never keen to take risks when they already have a working formula.

Crackdown is indeed the perfect example, because the bandwidth required to update the geometry and collisions of angular building pieces is relatively small.

To help the GPU you need a lot more data. CloudLight's photons already proved to be up in the 40 Mbps range even after lossy compression. Light maps and shadow maps are huge, especially if you want to see a significant improvement at 1080p. Other things, like reflections, are very latency sensitive. Using cloud computing for ray tracing is very possible, but only if you send the completed image to the client; there's no point in sending intermediate results, as that's just too much data.
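To put numbers on why full-resolution lighting data is so heavy, here's a quick back-of-the-envelope in Python. The assumptions are mine (a raw 1080p buffer, 3 bytes per pixel, 60 Hz, no temporal reuse); the 75:1 ratio is only there to show what it would take to land in the ~40 Mbps ballpark mentioned above:

```python
# Back-of-the-envelope for streaming a raw 1080p lighting buffer at 60 Hz.
# Assumptions are mine: 3 bytes per pixel, no temporal reuse.

width, height = 1920, 1080
bytes_per_pixel = 3
fps = 60

raw_bps = width * height * bytes_per_pixel * fps * 8
print(f"{raw_bps / 1e9:.2f} Gbps uncompressed")   # ~2.99 Gbps

# Even an aggressive 75:1 lossy codec only brings that down to roughly
# the ~40 Mbps range.
print(f"{raw_bps / 75 / 1e6:.1f} Mbps at 75:1")   # ~39.8 Mbps
```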

Physics has its limits too. It would be cool to have realistic collision physics in car games. Yet considering that car models are already over 100k polygons, a multi-car collision that needs to update all the cars easily comes to 12 Mbps per car, and that's a very conservative estimate (e.g. a quarter of the car is affected in the collision, 25k polygons, and assume you can compress the displacements to 1 byte per polygon: 25 KB per frame x 60 fps = 12 Mbps). A big pile-up would swamp the best of internet connections. The same goes for fluid dynamics: way too much data. I don't expect a larger version of From Dust to be able to run through cloud servers. Too much data to transfer.
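That estimate, worked out explicitly using the same assumed figures (a quarter of the car affected, 1 byte per polygon, 60 fps):

```python
# The conservative per-car estimate above, worked out.

polygons_affected = 25_000  # a quarter of a 100k-polygon car model
bytes_per_polygon = 1       # compressed displacement per polygon
fps = 60

mbps_per_car = polygons_affected * bytes_per_polygon * fps * 8 / 1e6
print(f"{mbps_per_car:.0f} Mbps per car")    # 12 Mbps

# A six-car pile-up at the same rate:
print(f"{6 * mbps_per_car:.0f} Mbps total")  # 72 Mbps
```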

What it is suited for is AI, which is as old as the first MMORPG; a few things that rely on small yet hard-to-calculate data sets, like Crackdown 3; and simply sharing data, like Drivatars and community-created content. Now combine it all to make a persistent, slowly evolving world: RPGs with actual seasons and weather affecting and wearing away the terrain. You'll have a relatively big download when you log back in, yet while the game is running the changes should be gradual enough for the bandwidth to keep up. Godus is such a game; too bad that's about the only thing it has going for it. Eco will be such a game, and hopefully future RPGs will go this route too. I had expected MMORPGs to be there already, yet WoW's popularity has held innovation back, plus there are the enormous costs involved in setting up such a risky project.
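A minimal sketch of that login-snapshot-plus-deltas pattern (the data layout is hypothetical; a real game would use a compact binary protocol):

```python
import copy

# Whole world state; in a real game this would be a compact binary format.
world = {"tile_3_7": {"erosion": 0.12}, "tile_3_8": {"erosion": 0.09}}

def full_snapshot():
    # Sent once at login: the "relatively big download"
    return copy.deepcopy(world)

def delta_since(snapshot):
    # Sent continuously: only the tiles that changed since the snapshot
    return {k: v for k, v in world.items() if snapshot.get(k) != v}

client_state = full_snapshot()
world["tile_3_7"]["erosion"] = 0.13  # weather slowly wears the terrain away
print(delta_since(client_state))     # only the changed tile goes over the wire
```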

So I fully agree with your final points.
- Expensive to develop, and also to rent or maintain all that extra server time compared to a static game (which already requires massive resources).
- The world average internet speed is 3.9 Mbps. It's higher in developed countries, yet it's also often going through Wi-Fi shared with a whole bunch of devices.
- And indeed risk. THQ went bankrupt after Red Faction: Armageddon; total destruction didn't help to popularize the game. Godus is in shambles, and the Eco sim Kickstarter is only at 19k of 100k (408 backers...).