Norris2k said:

Not to reopen a discussion that nicely ended, but just my 2 cents.

First, millions of servers sounds like science fiction, but Google is already said to have 2-3 million servers worldwide, and Microsoft announced 1 million. And cloud providers are massively scaling up. So such a massive scale is not doable right now just for games, but I believe it will become possible in the next few years.

Second, more than scale, it's all about how much it costs per user and how much you can charge for it, even indirectly (that's how you get this million-server infrastructure from Google for "free"). Let's make a very simplified calculation, out of thin air, just to explain how it works. You have 5 hours of peak usage every day, 30 days a month, that's 150 hours. With time zones, if only half the players in a region (USA, Europe) share the same peak time, one server effectively covers two peak windows, so you have 300 usable peak hours. Say a player plays 30 hours a month, all at peak time: he only uses 10% of that peak capacity, which costs 10% of the cost of a server. And he's not using 100% of the CPU/GPU at any moment, and not every game is GPU intensive, so the real share will be even lower. Take a super cheap PS4-like server at, say, $200 a month for hardware and maintenance, and the cost per user comes out to $20 a month, which is manageable (I'm not saying it's realistic, but why not at some point in the future if prices go down or sales go up).
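A minimal sketch of that back-of-the-envelope math, assuming the hypothetical numbers from the post (nothing here is real pricing):

```python
# Poster's hypothetical figures, not real data.
peak_hours_per_day = 5
days_per_month = 30
single_zone_peak = peak_hours_per_day * days_per_month      # 150 h/month

# Time zones stagger the peak: if only half the players in a region peak
# at the same time, one server effectively covers two peak windows.
usable_peak_hours = single_zone_peak * 2                    # 300 h/month

player_hours = 30                                           # all played at peak
share_of_server = player_hours / usable_peak_hours          # 0.10 -> 10%

server_cost_per_month = 200                                 # "PS4-like" blade, $/month
cost_per_user = share_of_server * server_cost_per_month     # $20/month

print(f"Each player occupies {share_of_server:.0%} of a server, "
      f"so roughly ${cost_per_user:.0f}/month before any CPU/GPU sharing gains")
```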

But we are talking games here.

One single server blade in, say, Netflix HQ can most likely decode at least 20 video streams simultaneously on the hardware end, and then it's just a case of bandwidth.

With games you have to have the hardware to actually run each game, and then the bandwidth to handle the stream.
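A rough comparison of what that difference does to per-user hardware cost, reusing the hypothetical $200/month blade from the quoted post and assuming one game session needs a whole blade (both assumptions, not measured figures):

```python
# Hypothetical numbers only: $200/month blade, 20 concurrent video streams
# per blade (the estimate above), one game session per blade.
blade_cost_per_month = 200
video_streams_per_blade = 20
game_sessions_per_blade = 1

video_cost_per_stream = blade_cost_per_month / video_streams_per_blade   # ~$10
game_cost_per_session = blade_cost_per_month / game_sessions_per_blade   # ~$200

print(f"Video: ~${video_cost_per_stream:.0f} per concurrent stream")
print(f"Games: ~${game_cost_per_session:.0f} per concurrent session")
```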