richardhutnik said:
In short, I am going to ask this:
Do you believe the model currently used by the PC will continue indefinitely, or do you believe cloud computing is eventually the future of computing? In short, is Cloud Computing NEVER going to supplant the PC's current model for gaming? Or, to ask a more specific question about the technology of OnLive: do you believe Cloud Computing, rendering graphics remotely and streaming them over a network with the computing done elsewhere, will serve NO function at all in gaming at any point? That OnLive will go bankrupt and no one will take anything from it to any degree?
If you are saying that OnLive is going to fail, but that in the next decade or two this technology will become a relevant part of the gaming ecosystem, I would be able to agree with that. If you believe it will fade away, and computing power will reside on the client side FOREVER, then I would debate you on it.
I believe that computing will continue as it is for about 5 years. Then processing power will be a commodity for all but high-end scientific research - every computer that you could buy will have the power to play any video game and will cost the same as the cheapest PC you could buy today. There will be no differentiation in the market.
If the lowest-end computer you can buy is capable of that, where is the space for cloud computing? Since you need some kind of computer to access OnLive anyway, you might as well use that client to play games too. And the client will always have one thing going for it: you can own a copy of the game, or at least something halfway between ownership and the cloud, like Steam.
The reason this will happen is twofold: 1) the graphical fidelity of video games is plateauing - there has been no major technical leap since about 2007; and 2) on-die integration of fast GPUs (Llano and Sandy Bridge next year), followed by frequent die shrinks until today's high-end graphical power sits in the lowest-end CPU models (which I estimate will take about 4 years - see the rough sketch below).
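To make the die-shrink timeline concrete, here is a minimal back-of-envelope sketch. The transistor figures and the two-year shrink cadence are my own rough assumptions, not numbers from the post:

```python
# Back-of-envelope sketch: how many process shrinks until an integrated GPU's
# transistor budget matches a 2010 high-end discrete GPU.
# All figures are rough assumptions: ~3.0e9 transistors for a high-end discrete
# GPU (Fermi-class, 2010), ~0.5e9 for the GPU portion of a first-generation
# on-die part, and a budget that roughly doubles per full node shrink (~2 years).

HIGH_END_DISCRETE = 3.0e9      # assumed transistor count, 2010 flagship GPU
INTEGRATED_GPU    = 0.5e9      # assumed transistor budget for on-die graphics
YEARS_PER_SHRINK  = 2          # assumed cadence of a full node shrink

budget, years = INTEGRATED_GPU, 0
while budget < HIGH_END_DISCRETE:
    budget *= 2                # budget roughly doubles per shrink
    years += YEARS_PER_SHRINK

print(f"~{years} years of shrinks until integrated parts reach "
      f"today's high-end budget ({budget:.1e} transistors)")
# -> ~6 years under these assumptions, in the same ballpark as the 4-year
#    estimate above if shrinks arrive a little faster than every 2 years.
```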
Cloud computing has a role to play in business and scientific research, which still need all the processing power you can throw at them. Remember that all computing was originally 'cloud' (timesharing on a mainframe), and we moved to clients both for cost reasons and because interconnect (and now Internet) bandwidth and latency can't keep up with hardware advances.
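To illustrate why latency, rather than raw processing power, is the sticking point for streamed games, here is a minimal sketch comparing an input-to-photon budget for a local client against a cloud-rendered one. Every individual figure is an illustrative assumption, not a measurement of OnLive:

```python
# Rough latency-budget sketch (assumed, illustrative figures, in milliseconds):
# the extra network and codec stages are what a cloud service adds on top of
# local rendering, and no amount of server-side horsepower removes them.

FRAME_TIME_60FPS = 1000 / 60          # ~16.7 ms per frame at 60 fps

local = {
    "input sampling":   5,            # assumed
    "render frame":     FRAME_TIME_60FPS,
    "display scanout":  8,            # assumed
}

cloud = {
    "input sampling":   5,            # assumed
    "uplink to server": 20,           # assumed one-way Internet latency
    "render frame":     FRAME_TIME_60FPS,
    "encode video":     5,            # assumed
    "downlink":         20,           # assumed one-way Internet latency
    "decode video":     5,            # assumed
    "display scanout":  8,            # assumed
}

for name, budget in (("local client", local), ("cloud streaming", cloud)):
    print(f"{name}: {sum(budget.values()):.0f} ms input-to-photon")
# local client:    ~30 ms
# cloud streaming: ~80 ms, i.e. ~50 ms of pure network and codec overhead
```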
Beyond that, in about 10-15 years' time, video games will not be something rendered with conventional rasterisation on a monitor anyway. They will be virtual reality: input and output connected directly to the senses or the brain. For that, ultra-low latency and adaptation of the environment to you as a person are key, and a local client will still be needed.
So let's talk in the short term, no more than 10 years away, because no one really knows whether "games" or "graphics" as we use the terms will last that long. Where is the opportunity for the cloud to become widespread before graphics power is a commodity available on the lowest-end Dell/HP boxes?