I figure I'd chime in to help people understand what goes into building a network like XBL or PSN. I can't say I've worked on a project at the assumed scale of PSN, but I do work for a networking company and have been a part of building out some rather large networks. Whoever said that having servers on the internet is cheap is not looking at the full picture.
Everything related to hosting costs money. Setting up a large CDN (content delivery network) is a phenomenal up-front cost. That said, it's likely Sony leverages other CDNs to help offload some of the burden. Microsoft uses Akamai all the time for their download services (PC-related stuff for sure, not sure about XBL). CDNs basically host your files from multiple farms across the globe. Downloading something in China from the US may not be the service experience you are expecting, so it's important to make sure service satisfaction is equal across your major target regions. Servers, networking equipment, bandwidth, and storage are going to be the key expenses in building a network. Storage, and making that storage global, is most likely their biggest hurdle. If I remember correctly, XBL puts a limit on some file sizes, where Sony does not. Because they charge their clients to host demos/files, it helps offset bandwidth and storage costs.
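To make the CDN idea concrete, here's a toy sketch of edge selection: route each client to the nearest farm instead of one central origin. The region names, coordinates, and distance math are all made up for illustration; real CDNs use DNS/anycast tricks and actual latency measurements.

```python
# Toy illustration of the CDN idea: serve each client from the
# nearest edge location instead of one central origin.
# Regions and coordinates are invented for this example.

EDGE_LOCATIONS = {
    "us-east": (39.0, -77.0),
    "eu-west": (53.0, -6.0),
    "ap-east": (22.0, 114.0),
}

def nearest_edge(client_lat: float, client_lon: float) -> str:
    """Pick the edge farm closest to the client (crude squared-distance math)."""
    def dist(loc: str) -> float:
        lat, lon = EDGE_LOCATIONS[loc]
        return (lat - client_lat) ** 2 + (lon - client_lon) ** 2
    return min(EDGE_LOCATIONS, key=dist)

# A client in Shanghai gets routed to the Asia edge, not a US farm.
print(nearest_edge(31.2, 121.5))  # -> "ap-east"
```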
Services are all the rage in the technology space. Everyone wants to sell something that keeps selling (recurring revenue) rather than selling a product once and having to release new products to incur more sales. There are some great advantages to selling a service vs. a product, but there are also some major things to keep in mind. As things become more service-based, reliability and experience are key. To host a few websites, you can get away with a few servers and some bandwidth, but to host a network, you need a minimum of two of everything. Have a service that tracks player locations? Maybe it takes 20 servers to track all your clients, but you'd better get 40 to make sure that if one stack fails, the other can still handle the load. The network switches those servers connect into? Better have two in case of switch failure. The routers that route the traffic? Better have two in case of router failure. Modules, cables, switches, routers, fans, power supplies all have to be redundant. When I submit a kit list that totals 250 thousand to a client and they flip out, it's because they didn't realize you have to double everything. Once you have that built out, say in North America, you'd better build that same footprint in Europe, and one in Asia too.
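Here's a back-of-the-envelope sketch of why the kit list doubles. Every quantity and unit price here is invented; the point is just that redundancy multiplies the whole bill, not one line item.

```python
# Rough kit-list math showing how redundancy doubles the bill.
# All prices are made up for illustration; real quotes vary wildly.

BASE_KIT = {
    # item: (quantity needed for the load, unit price in USD)
    "app server":   (20, 4_000),
    "core switch":  (2, 12_000),
    "edge router":  (2, 15_000),
    "power supply": (4, 500),
}

def kit_total(redundant: bool) -> int:
    factor = 2 if redundant else 1  # double everything for failover
    return sum(qty * factor * price for qty, price in BASE_KIT.values())

print(f"single stack:    ${kit_total(False):,}")  # what clients expect
print(f"with redundancy: ${kit_total(True):,}")   # what they actually pay
```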
After the up-front costs to build a network, you have the MRCs: monthly recurring costs that datacenters charge you. Power, cooling, bandwidth, and any support services you contract all cost money, every month. They themselves are providing a service. New technologies like VMware and virtualization make things easier, but that comes at a cost too. Take those 40 servers: maybe you can consolidate them onto 8 big beefy servers, but licensing the virtualization technology costs money. Add in the storage networks that get introduced because you reduced your hard disk count when you downsized all your hardware, and you're right back where you started. At least, hopefully, your monthlies are lower than they could be.
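And a rough sketch of the monthly side, comparing 40 physical boxes to 8 virtualized hosts plus a hypervisor license and a storage network. All the rates, wattages, and fees below are made up; the takeaway is how the consolidation savings get eaten.

```python
# Back-of-envelope monthly recurring costs. Every number is invented.

def monthly_cost(servers: int, watts_each: int, power_rate: float = 0.12,
                 cooling_factor: float = 0.5, bandwidth: int = 3_000,
                 extra: int = 0) -> float:
    """Power + cooling + bandwidth + anything else, per month."""
    kwh = servers * watts_each * 24 * 30 / 1000
    power = kwh * power_rate
    cooling = power * cooling_factor  # cooling roughly tracks power draw
    return power + cooling + bandwidth + extra

physical = monthly_cost(servers=40, watts_each=500)
# Fewer, hotter boxes, plus hypervisor licensing and a SAN amortized monthly.
virtual = monthly_cost(servers=8, watts_each=800, extra=1_700)

print(f"40 physical servers: ${physical:,.0f}/mo")  # ~$5,592
print(f"8 virtualized hosts: ${virtual:,.0f}/mo")   # ~$5,529, barely lower
```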
Fact is, nobody really knows. I haven't seen anything disclosed in anything I've read, but it would be interesting. Setting up PSN is costly for sure, but a lot of that is the up-front cost to build. Once you have revenue coming in and some infrastructure already laid out, you concentrate on selling shit. I think it makes sense to charge your clients to upload demos and utilize the network. You are selling them a service that helps them advertise and reach their targets. I suppose at some point everything becomes funny money, because you're paying one guy to provide you a service which you turn around and sell to another guy.
Anyway, just a perspective...