Intrinsic said:
but we are talking games here. One single server blade in, say, Netflix HQ can most likely decode at least 20 video streams simultaneously on the hardware end, and then it's just a case of bandwidth. With games you have to have the hardware to actually run the games, and then the bandwidth to handle the stream.
To simplify even more, what I'm trying to say is that if only 10% of the users play simultaneously (because players don't play all the time, and because of time zones), you have to provide "only" 2 million servers for 20 million paying users.

Also, depending on the server-side hardware, the architecture, and the target rendering, you will be able to share some resources. As an example, currently you would be able to run a few instances of Miami Heat on a single PS4. For a real-world example, you can watch https://www.youtube.com/watch?v=LuJYMCbIbPk — they run 2 games on 2 virtual OSes with 2 GPUs (not counting the cheap main GPU), 1 CPU, and shared memory. You will save not only on memory (you don't need 8 GB per instance if games use 6 GB on average) but also a lot on disks (with caching and SAN storage); you won't need a Blu-ray player, and you get rid of a lot of chips (Bluetooth or whatever). So you will not have this exact 1-to-1 hardware.

It's clearly much more expensive than Netflix, but it's also not the same price per unit (movies are cheaper than games), and for the hardware (because the user saves $400 on the hardware, you can charge more for the service).
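The back-of-the-envelope math above can be sketched out. This is just a rough capacity estimate using the numbers from the post (20 million paying users, 10% peak concurrency); the instances-per-server figure is an assumption based on the linked video showing 2 games running on one physical box.

```python
def servers_needed(paying_users, concurrency_ratio, instances_per_server):
    """Estimate physical servers required at peak load."""
    concurrent_players = int(paying_users * concurrency_ratio)
    # Round up: a partially used server is still a whole server.
    return -(-concurrent_players // instances_per_server)

# Naive 1:1 hardware, as in the post's first estimate:
print(servers_needed(20_000_000, 0.10, 1))  # 2,000,000 servers

# Sharing resources, 2 game instances per box (as in the video):
print(servers_needed(20_000_000, 0.10, 2))  # 1,000,000 servers
```

Even this ignores regional peaks (time zones shift load but the hardware has to sit somewhere), so the real fleet would likely be sized per region rather than globally.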