
Microsoft shows off potential power of the Cloud at BUILD

Machiavellian said:
SvennoJ said:

At the end of the demo the cloud is doing the physics for 37,000 chunks: position, velocity, rotation. Updating all those independently moving chunks at 32fps, plus the geometry changes for new chunks, broken-up chunks and left-behind gaps, is a lot more data than an H.265 compressed video stream.
And the client still needs to be able to render all that extra geometry. You're taking the physics calculation away, not the strain of rendering 37k objects.

It's a good example of how a Gaikai/OnLive server can deal with high-stress situations by utilizing a server farm to spread peak demand. As long as not everybody is blowing up everything at the same time, it will be a more economical solution than having the peak power available for each client separately. It's a tech demo of how slowdown can be eliminated in a server-type situation. I don't see it being practical as a way of helping out a local machine with physics.

The data from those calculations can be heavily compressed. It's position data, and I am sure that the data is less than an H.265 video stream. There's no way to know right now since none of that info was made available, but I am sure a simple demo could easily be done where the only inputs needed are the calculated data. As for the geometry, even current-gen consoles could perform those calculations. The demo is a stress test; real-world types of games would be nowhere close to those kinds of calculations. Also, you can limit the data based on what the user can see at any one time.
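For scale, here is a rough back-of-envelope sketch of what the raw state for those 37,000 chunks could take per second. The per-chunk fields, float sizes, compression ratio and the H.265 comparison bitrate are all assumptions for illustration, not figures from the demo:

```python
# Back-of-envelope estimate of raw physics-state bandwidth for the demo.
# Assumes each chunk sends position (3 floats), velocity (3 floats) and
# rotation as a quaternion (4 floats); all figures below are assumptions.

CHUNKS = 37_000
FLOATS_PER_CHUNK = 3 + 3 + 4      # position + velocity + quaternion
BYTES_PER_FLOAT = 4               # 32-bit floats
FPS = 32

raw_bytes_per_frame = CHUNKS * FLOATS_PER_CHUNK * BYTES_PER_FLOAT
raw_mbps = raw_bytes_per_frame * FPS * 8 / 1e6

print(f"{raw_bytes_per_frame / 1e6:.2f} MB per frame")   # ~1.48 MB
print(f"{raw_mbps:.0f} Mbit/s uncompressed")             # ~379 Mbit/s

# Even assuming 16-bit quantization plus delta encoding buys a 10x
# reduction, that is still ~38 Mbit/s, versus maybe 5-10 Mbit/s for a
# 1080p H.265 stream; which side wins is exactly what is argued here.
quantized_mbps = raw_mbps / 10
print(f"~{quantized_mbps:.0f} Mbit/s with aggressive compression (assumed)")
```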

From my understanding, Gaikai does not leverage spreading a game over multiple servers; instead, Gaikai spins up an instance of the game for each user that is playing it. What Gaikai could do is take the rendered output from the game and spread that out to multiple servers for processing, which is pretty easy with video output. The difference between MS's solution and Gaikai's is that MS's solution does not need an instance of the game running for each user. Instead, one main hosted instance can be used to support multiple users, as the hosted instance just needs to sync the streams and send the data off to be processed by multiple servers.

Another advantage of MS's solution is that it's not dependent on the hardware. For Gaikai to work, the game is actually rendered on PS3 hardware. I am sure PS4 games might need PS4 hardware as well. MS's solution can use any combination of hardware and software, making it cheaper to support.

I didn't mean that Gaikai/OnLive currently use load balancing for existing games. The game would have to be specifically developed with that goal in mind, just as code has to change to utilize multiple CPU cores instead of one. It is, however, much more efficient in a streamed situation. I've done a lot with compression and know the limitations. Sending intermediate data to be rendered quickly exceeds sending a fixed final image stream. It's not practical in the end either: you're taking away some work from the CPU, yet nothing from the GPU.




Yes it seems interesting, they could probably run Resogun at 30fps with this thing :-/ so many particles!

More seriously, their cloud computing was probably using GPUs and running on a LAN (you know... 100 times faster than most internet connections, with very little lag), and even then it only runs at 32fps. And nobody had touched the controller; it would feel off if the delay were too long. You can render locally at 32fps, but the effects could be deferred by a few frames: one frame at 30fps is 33ms, so if the interactions are 3 frames behind you are almost at 100ms, which is enough to make it feel "off".
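A quick sketch of that frame-delay arithmetic (a minimal restatement of the numbers in the paragraph above, nothing more):

```python
# How many milliseconds of lag N frames of deferred cloud effects add
# at a given framerate.

def deferred_latency_ms(frames_behind: int, fps: float) -> float:
    return frames_behind * 1000.0 / fps

for frames in (1, 2, 3):
    print(f"{frames} frame(s) behind at 30 fps: "
          f"{deferred_latency_ms(frames, 30):.0f} ms")
# 1 frame -> ~33 ms, 3 frames -> 100 ms: right around the threshold
# where interactions start to feel "off", as the post argues.
```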

But hey, if I am wrong we are in for some amazing things. I would just not count on it until there is a critical mass of people with connections above 100Mbps, maybe more depending on the number of objects they want to move around in a given frame. And you have to consider that if they did not sell you the machine to run these simulations locally, nor the electricity to run it, they will want to recoup that cost somewhere. No game will let you play day in and day out with effects that require the equivalent of two or three Titan GPUs to render in real time without asking you to pay some monthly fee. And that kind of computing is not happening with the 300,000 "servers" they advertised at first; many more will be required for 4 million people to play.



SvennoJ said:

You are partly right. The average quality of the game will not exceed what you can get with local hardware.
However peak demand can be better absorbed in a server farm setup than on your local hardware. Not everyone plays at the same time and most games don't demand that much all the time.

Sure, I absolutely get that. Home PCs and consoles are both "wasting" lots of resources, as they're idle or in standby 95% of the time, while server farms have a much better utilisation, can be balanced etc.

Not everyone plays at the same time, but Xbox Live, for example, sometimes had over 2 million people playing simultaneously. If each of these gamers used the cloud processing power of even a single cloud server, they would already require 2 million cloud servers. That's over 6 times the total current capacity of the Azure cloud, and since Xbox Live is just one small part of what Azure's resources are used for, Azure would have to be even bigger.
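That scaling claim as plain arithmetic; the concurrency and server counts are the figures cited in this thread, not official numbers:

```python
# Peak-demand arithmetic behind the "over 6 times" claim above.

peak_concurrent_players = 2_000_000   # peak Xbox Live concurrency cited here
azure_servers = 300_000               # server count advertised at the time
servers_per_player = 1                # one dedicated cloud server each

required = peak_concurrent_players * servers_per_player
print(f"Servers required: {required:,}")
print(f"Ratio to Azure capacity: {required / azure_servers:.1f}x")  # ~6.7x
```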

SvennoJ said:

Plus, in a multiplayer game, the world and physics only need to be done once for all the clients. So when you play a streamed game you can get about the same level of graphics as you would locally, but instead of heavy explosions being restricted to cutscenes, they can now play out in real time.

I admit I have no experience with developing 3D world games, but while I agree that there will be parts that only need to be computed once for all players, I doubt this makes a huge difference in practice.

For example, I would assume that the game "world" itself is hardly even being computed. The vast majority of calculations is necessary for rendering the world from a specific point of view that is unique for every single player and every single frame.

Physics calculations - yeah, I guess that's indeed something that could be reduced in multiplayer worlds with dedicated servers. But what percentage would that actually account for? I have no idea, but I assume that for most games it's not that high after all.



ArnoldRimmer said:
[...]

Yep, connection problems during the release of a new game will reach new heights when 2 million people try to play the latest cloud game simultaneously. Those early connection problems only exist because companies do not want to spend the extra money to absorb the peak demand of a new release; it sorts itself out after a few days, and then all that extra capacity sits there wasting money. With more games using the same hardware it will be a bit better, but the hype of a new release will still bring it to its knees. Or a game at release will look pretty average with severe downgrades, while playing during off-peak hours will get you closer to what was promised at E3. A new generation of bullshots ahead.

For the second part, yes, currently the world is hardly being computed. MMORPGs all have very static worlds. They could have very dynamic worlds with distributed computing, the advantage being that a large living world is shared by many players. But the bottleneck is getting all that data to the client. Large living worlds are only really viable when streaming becomes the norm. Physics is restricted in the same way. You can calculate accurate wave patterns and true interaction with water, mud and snow for your world in a distributed system. However, sending all that geometry to the client is a huge chunk of data, far more than rendering the view on the server and sending that instead.
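A toy sketch of that "compute the world once, share it with many" amortization; the cost split and all numbers are illustrative assumptions, not measurements:

```python
# Per-player server cost when world/physics simulation is shared but
# view-dependent work (rendering, visibility, streaming) is not.
# Cost units are arbitrary; the point is the shape of the curve.

def cost_per_player(players: int,
                    shared_sim_cost: float = 100.0,
                    per_player_cost: float = 10.0) -> float:
    return shared_sim_cost / players + per_player_cost

for n in (1, 10, 100, 1000):
    print(f"{n:>5} players: {cost_per_player(n):.1f} cost units each")
# The shared simulation amortizes toward zero as players grow, but the
# per-player rendering/bandwidth term never does; that per-player term
# is the bottleneck the post identifies.
```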

Anyway, I agree that a large, dynamic, persistent living world for a single-player game is unlikely. Not cost effective.



TimCliveroller said:
I tell you this: If Gaikai and Onlive is/was/is possible, why ...


The big difference is this: Gaikai is a completely separate platform from the PS4.

 

Gaikai will work for (let's say) 30% of the market. That 30% can use the Gaikai service, and the other 70% can choose to ignore it and use what works for them, which is to buy discs and process everything locally.

The thing with cloud processing is that (like Gaikai) it has to be handled as a separate service. MS cannot integrate it into their Xbox One platform in any meaningful way: the games have to be playable for the 70% of the market that cannot or will not use the cloud assist. That means that for the 30% of the market that may want to use it, it cannot be used in any meaningful way.



ArnoldRimmer said:
Machiavellian said:
ArnoldRimmer said:

That's great, yet completely meaningless.

There has never been any doubt that graphics rendering calculations can generally be outsourced to other computers. Raytracing clusters have done this for decades.

The problem is, and has always been, that this has very little practical implications for real-world games.

Because even setting aside the network latency/bandwidth problems of real-world internet connections, there's still the problem that the actual amount of calculation necessary does not decrease, it's just distributed differently: if, in this setup, a high-end PC managed to show the scene at about 2fps while "the cloud" managed 32fps, then the cloud computing resources required for this demo equalled at least 16 high-end PCs. These resources cost money, quite a lot of it, that someone has to pay for.

So there's simply a huge rift between theory and practice when it comes to the "power of the cloud" for graphics: Relevant improvements are very well possible in theory - but unrealistic in practice, because no gamer would currently be willing to pay for the resources required to actually do so: Too costly for very limited improvements.

You must remember that there is a difference between profit and cost. What it costs MS to distribute processing across a certain number of virtual servers is not the same as the margin they earn when selling those resources for profit. The gain for MS is to get as many customers using Azure as possible and to get them tightly integrated with its services. There will be aspects of this business plan which will require them to give away resources for free to gain more subscriptions.

Phil tweeted that we will be seeing more of this tech, and that the demo is not a throwaway piece but could represent something larger they are doing. What I'm guessing we will see is how MS will monetize their platform, which may not mean charging customers but instead getting developers, publishers and other content providers on Azure, who would pay for the cloud compute on the customer's end. I am also sure MS is looking at a good percentage of XBL Gold subs going toward the cost of providing the service.

The thing for MS is that they have already paid for the servers and infrastructure. They are not renting; they are the provider, so the cost of resources can be balanced against the cost of usage. Another thing that is not known is how MS is spreading the work among multiple servers. I hope we get more technical detail on the process, as I am interested in some of the problems MS must face in delivering this tech.

I don't know, that doesn't really convince me. Sure, Microsoft could sell cloud computing resources so cheap that they're actually making a loss, to attract new customers etc. They could cross-finance etc. But those are not sustainable solutions. In the end, someone has to pay for the costs. If someone wants to use the processing power equivalent of 16 high-end PCs, 40 Xbox Ones or whatever, someone has to pay for that.

A few years from now, it will probably be like this: video game consoles will indeed be hardly more than very simple and cheap streaming devices, with pretty much all processing done in the cloud - and people will pay different amounts of money depending on how good they want the graphics quality to be, maybe something like "2 cents per minute for Xbox 360-quality graphics, 2 dollars per minute for raytracing-quality graphics in 4K resolution".

16 top-end PCs is probably way too many resources. Let's put the number where MS made the statement first, which was 4 X1 systems. That type of cloud power is probably what MS is going for, and it's very obtainable based on their current infrastructure. As a business model, it's very sustainable. The company I work for does this type of stuff all the time: we will give you the software for free to get you on a monthly service fee. The idea is that once you use our stuff, you will be with us for years. Subscription revenue based on what's given away is where MS is going. They want you dependent on their services so you remain a customer for years. MS doesn't have to maintain such services at cost, or even at a loss, for long - just long enough to get you hooked. It's no different from the current app model on smartphones: give consumers the software for free and then recoup the cost by selling you everything else.

As I mentioned, the price MS charges for a service is of course much less than what it takes them to run it. The service is already running; it makes more sense to use what's available than to let it sit idle. Also, it's evident that MS has run the numbers, from the number of servers they believe is needed to run XBL to their cloud compute infrastructure.



Quite honestly, for places that have the bandwidth, I don't see why cloud-assisted computing can't work (limited at first, given latency, but it will improve over time). The concern I have is those countries that have bandwidth caps, or simply piss-poor broadband infrastructure (of which there are many). Are we now going to shift from the hardware-dependent performance variation that we see on PC to bandwidth-dependent performance variation with the cloud?

And if a country has poor bandwidth or data caps... is the game going to be unplayable or substantially downgraded?



lol what a disappointing demo.
I'd still pick the same scene pre-rendered. It might look the same every time, but at least it won't look laggy all the time.



Soleron said:
If people with an internet connection get a better experience than those without on a single-player game, something is very wrong.


Welcome to the future, and the future is now. Titanfall, Destiny, The Division: all online-only games. It's going to become more and more common.



lucidium said:
Machiavellian said:

To set the facts straight, I called you out on stating you were a developer. It was you who first mentioned being a licensed developer and called me an armchair poster. So no, I suggested we both send each other our creds and then start again, instead of calling people names or making assumptions about their knowledge. Even then, just because you develop does not mean you develop in this area. The type of software I develop does not mean I can claim to know how a game developer does his job. The whole point is that you presented being a licensed developer as if it qualified you as knowledgeable about the subject, which it does not. Even I, who have developed on cloud-based platforms, can only reference the type of work I do, which is totally non-game-related and in most areas probably not relevant - but then again, I was not going to come off that way in the first place.

Alright, this is my final post on that stuff, as I am not interested in waving creds around, just in the topic.

I've got ample experience with the latest SDKs for the current consoles and with how currently-in-development as well as already-released titles utilize the cloud. That's my point: outside of gaming, the usefulness of cloud-based computation increases dramatically, because load-based scheduling that would take a long time locally anyway loses nothing if that load is pushed elsewhere, computed remotely, and the results passed back.

Cloud computation for rendering, or cloud-like swarming for deformation, fluid dynamics, thermodynamics and so on, is just as viable. But in gaming, the viability of cloud-based compute drops considerably. The best we will get this generation is sub-processed light math: lighting computed externally and the result streamed back as low-quality monochromatic video, essentially used as a dynamic lightmap. Even then, the local processing saved by doing this is minimal, and it would not be able to react fast enough to user input to be permissible on highly dynamic objects. If you're muxing in the stream for a specific area, cast from a building in a set piece with real-time lighting, then depending on the speed and predictability of motion of the objects you want to calculate, the viability for such processing is there, and it would cause minimal issues should the connection drop in quality or altogether.
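A minimal sketch of that streamed-lightmap idea, assuming the server's grayscale frame is simply upsampled and multiplied over the local albedo; the shapes, names and multiply-blend are my assumptions, not anything from a console SDK:

```python
import numpy as np

# The server computes expensive lighting and streams it back as a
# low-res monochromatic frame; the client upsamples it and uses it to
# modulate locally rendered color, like a dynamic lightmap.

def apply_streamed_lightmap(albedo: np.ndarray,
                            streamed_light: np.ndarray) -> np.ndarray:
    """albedo: (H, W, 3) local color; streamed_light: (h, w) grayscale."""
    H, W, _ = albedo.shape
    # Nearest-neighbour upsample of the low-res streamed frame.
    ys = np.arange(H) * streamed_light.shape[0] // H
    xs = np.arange(W) * streamed_light.shape[1] // W
    light = streamed_light[ys][:, xs]          # (H, W) luminance
    return albedo * light[..., None]           # modulate local shading

# e.g. a 1080p surface lit by a 270x480 streamed luminance frame
albedo = np.random.rand(1080, 1920, 3).astype(np.float32)
light = np.random.rand(270, 480).astype(np.float32)
lit = apply_streamed_lightmap(albedo, light)
```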

Beyond that, though, we lack the connectivity to the average home, and the hardware in these clustered servers, to handle the demand of hundreds of thousands of clients connecting to process entirely different chunks of resources for various applications and games - hence why lab environments are the only places we've seen such computational demonstrations thus far.

Sure, Microsoft would love to use it as a springboard for Azure adoption, but it's not like they'll get much from additional XBL subscribers. Azure is doing fine as it is without the gaming side of things chipping in; the gaming side has little impact other than being a bullet point for presentations and something to whet the appetite of conference attendees who want to see how flexible the technology can be.

I maintain that we are years, perhaps a decade or more, away from cloud computation making a discernible difference in games that could not be achieved with simple server-side communication on existing infrastructure, or by simply having additional hardware resources locally (such as additional power in the PS4).

As for MS cloud compute, it really is around the corner. Just like Sony with PS Now, it will be limited to just physics and AI. Developers will do light offloading of tasks, which will let them use the local resources for other things like rendering. I believe MS will pace out their tech, but we should see something that uses it within this year, as MS needs to lead in this direction to get developer support.

Ahhh, you are only looking at it from the gaming side. MS is looking at the X1 as a consumer device, just like your smartphone. This is their plan (not saying they can execute it correctly), but MS wants the X1 to be an integral device in the home, with each person using it for different purposes based on their services.