ArnoldRimmer said:
SvennoJ said:

You are partly right. The average quality of the game will not exceed what you can get with local hardware.
However, peak demand can be absorbed better in a server farm setup than on your local hardware. Not everyone plays at the same time, and most games don't demand that much all the time.

Sure, I absolutely get that. Home PCs and consoles are both "wasting" lots of resources, as they're idle or in standby 95% of the time, while server farms have much better utilisation, can be load-balanced, etc.

Not everyone plays at the same time, but Xbox Live, for example, has sometimes had over 2 million people playing simultaneously. If each of these gamers used the processing power of even a single cloud server, they would already require 2 million cloud servers. That's over 6 times the total current capacity of the Azure cloud, and since Xbox Live is just one small part of what Azure's resources are used for, Azure would have to be even bigger.
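The back-of-envelope arithmetic behind that claim can be sketched as follows. The Azure server count is an assumption implied by the "over 6 times" figure in the post, not an official number:

```python
# Sketch of the server-count estimate above.
# All figures are illustrative assumptions, not official numbers.
peak_concurrent_players = 2_000_000   # Xbox Live peak cited in the post
servers_per_player = 1                # one dedicated cloud server per player
assumed_azure_servers = 300_000       # implied by the "over 6 times" claim

servers_needed = peak_concurrent_players * servers_per_player
ratio = servers_needed / assumed_azure_servers
print(f"Servers needed: {servers_needed:,}")      # → Servers needed: 2,000,000
print(f"About {ratio:.1f}x the assumed Azure capacity")
```

The point stands under any plausible assumption: dedicating a whole server per concurrent player would dwarf the capacity a general-purpose cloud had at the time.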

SvennoJ said:

Plus, in a multiplayer game, the world and physics only need to be simulated once for all the clients. So when you play a streamed game you can get about the same level of graphics as you would locally, but instead of heavy explosions being restricted to cutscenes, they can now play out in real time.

I admit I have no experience developing 3D world games, but while I agree there will be parts that only need to be computed once for all players, I doubt that makes a huge difference in practice.

For example, I would assume that the game "world" itself hardly needs to be computed at all. The vast majority of calculations are needed for rendering the world from a specific point of view that is unique to every single player and every single frame.

Physics calculations - yeah, I guess that's indeed something that could be reduced in multiplayer worlds with dedicated servers. But what percentage would that actually account for? I have no idea, but I assume that for most games it's not that high after all.

Yep, connection problems during the release of a new game will reach new heights when 2 million people try to play the latest cloud game simultaneously. Those early connection problems only exist because companies don't want to spend the extra money to absorb the peak demand of a new release. It sorts itself out after a few days, but then all that extra capacity sits there wasting money. With more games sharing the same hardware it will be a bit better, but the hype of a new release will still bring it to its knees. Or a game at release will look pretty average with severe downgrades, while playing it during off-peak hours will get you closer to what was promised at E3. New generation of bullshots ahead.

For the second part, yes, currently the world is hardly being computed. MMORPGs all have very static worlds. They could have very dynamic worlds with distributed computing, the advantage being that a large living world is shared by many players. But the bottleneck is getting all that data to the client. Large living worlds are only really viable when streaming becomes the norm. Physics is restricted in the same way. You can calculate accurate wave patterns and true interaction with water, mud, and snow for your world in a distributed system. However, sending all that geometry to the client is a huge chunk of data, far more than rendering the view on the server and sending that instead.
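A rough comparison illustrates why sending geometry is the bottleneck. The vertex counts, update region, and video bitrate below are all illustrative assumptions, not measurements:

```python
# Rough bandwidth comparison: streaming simulated geometry updates
# vs streaming a rendered, encoded video feed of the same scene.
# All numbers are illustrative assumptions, not measurements.
vertices_updated_per_frame = 200_000  # e.g. a deforming water/snow mesh region
bytes_per_vertex = 12                 # three 32-bit floats (x, y, z)
fps = 30

geometry_mbps = vertices_updated_per_frame * bytes_per_vertex * 8 * fps / 1e6
video_mbps = 15                       # ballpark bitrate for a 1080p game stream

print(f"Raw geometry stream: ~{geometry_mbps:.0f} Mbit/s")  # hundreds of Mbit/s
print(f"Encoded video stream: ~{video_mbps} Mbit/s")
```

Even with aggressive compression of the geometry, the rendered video stream has a fixed, modest cost regardless of how complex the simulation gets, which is the argument for server-side rendering here.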

Anyway, I agree that a large, dynamic, persistent living world for a single-player game is unlikely. Not cost effective.