Over the last couple of weeks, there has been quite a bit written about cloud computing, most recently about the joint effort by Intel, HP, and Yahoo. The article I linked to makes a very important point: the notion of a dumb terminal that has services delivered to it by a server is as old as computing itself. Whenever an application can only be run in a timely manner on expensive hardware, centralized computing becomes the paradigm for those who want to run that application. When the PC proved up to the task of word processing and spreadsheets in the early '80s, server sales went into a nosedive. There were still databases that needed to be managed centrally, so the server never went away, but its role was diminished.

Currently, smartphones just don't have the processing power to do a whole lot, but that will change rather rapidly, as I suggested a couple of weeks ago. We are getting to the point where many users don't know what to do with the capabilities of even a low-end PC or notebook. Why do we think inflicting the poor reliability of network access on computing would be better than the dominant decentralized paradigm?