In the 19th century, electricity production was highly decentralized. With neither a transmission grid nor storage infrastructure in place, factories, towns, and homes each had to generate their own power. Over time it became clear that economies of scale in generation, transmission, and storage warranted a highly centralized network, with power available simply by plugging into the wall.

Several historians see a clear analogy between the birth of widespread electrical usage over a century ago and cloud computing today, and predict the demise of onsite computing within the next decade. I don't see it happening so quickly, for three key reasons. First, there is the existing data center, storage, and application infrastructure built over the past 50 years, including many proprietary legacy applications that are poorly suited to virtual or cloud environments (ever try to find a COBOL programmer to fix a mission-critical application written in the 1970s?). Second, while electricity is a classic commodity (a watt is a watt), IT infrastructure is not. Newly emerging storage vendors, for instance, each offer unique platforms designed around BYOD and thin-client architectures, and disaster recovery, security, and backup requirements vary by industry, with different vendors offering different solutions. I could go on with more technical considerations, but let me conclude on a personal level. Third, there are real organizational factors: existing IT staff are (understandably) reluctant to advocate outsourcing, and with it the possible elimination of their own jobs.

A blogger in the 22nd century will likely have a different perspective. But for now, even with cloud providers a rapidly growing part of IT, technology professionals should keep their skills current and concentrate on finding the right mix of internal and external solutions.