Economics and Platform Architecture (Part 3)

Part 2 of this series looked at how the transition from scarcity to an abundance of fundamental computing resources broke the historic one-to-one relationship between operating systems, applications and the underlying hardware. Part 3 examines how the ability to decouple hardware and software evolved into a new strategy for managing IT systems – saving companies millions of dollars in the process – and laid the foundation for today’s cloud computing architectures.

The Mac on which this article was written can run not only Apple’s native OS X operating system, but also Microsoft Windows, multiple variants of Linux and – if needed – Oracle’s OpenSolaris. A modern personal laptop has such an abundance of computational capacity that multiple distinct hardware architectures can be emulated in software.

This ability to virtualize the link between hardware and software with little or no performance penalty is a direct benefit of the economics of abundance. New layers of virtualization software allow a copy of Microsoft Windows or Linux to run on a Mac alongside the Mac’s native operating system, with each system sharing the same computational, storage, memory and network resources. Decoupling hardware and software in this way provides significant economic benefits.

The ability to run multiple – but distinct – software architectures on the same physical hardware allows companies to amortize hardware costs across a wider range of application workloads. Virtualization is interesting and helpful on a laptop, but it is truly transformative when applied to multi-million-dollar server infrastructures.

Under the economics of scarcity, a single enterprise software application would typically be tightly coupled to a single hardware platform, and the underlying hardware would often run at only a small fraction of its available capacity. By running multiple application workloads on the same hardware, virtualization enables IT departments to manage their hardware resources much more efficiently, driving them to far higher levels of utilization.
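
The arithmetic behind that claim is easy to sketch. The figures below are hypothetical – legacy servers idling at around 10% utilization, virtualized hosts run at a 70% target – but the shape of the calculation is general:

```python
import math

# Back-of-the-envelope consolidation estimate.
# All figures are hypothetical, chosen only to illustrate the arithmetic.
legacy_servers = 1000        # one-application-per-server machines
avg_utilization = 0.10       # each idling at ~10% of capacity
target_utilization = 0.70    # a sensible ceiling for a virtualized host

# Useful work actually being done, expressed in "whole server" units.
useful_work = legacy_servers * avg_utilization             # 100.0

# Hosts needed to carry that same work at the higher target utilization.
hosts_needed = math.ceil(useful_work / target_utilization)

print(f"Virtualized hosts needed: {hosts_needed}")                    # 143
print(f"Consolidation ratio: {legacy_servers / hosts_needed:.1f}:1")  # 7.0:1
```

At these assumed figures, roughly 143 virtualized hosts carry the load previously spread across 1,000 dedicated servers – a consolidation ratio of about 7:1.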

The use of virtualization strategies is now standard practice in many – if not most – enterprise computing environments. In many cases a few hundred servers running virtualized workloads have replaced thousands of servers previously dedicated to individual workloads, saving enterprise IT departments many millions of dollars.
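
Continuing the hypothetical figures from the sketch above, the savings follow directly; the unit costs here are placeholders to be replaced with real procurement and facilities numbers:

```python
# Hypothetical unit costs; substitute real procurement and facilities figures.
server_capital_cost = 5_000      # purchase cost per server ($)
annual_power_cooling = 800       # power and cooling per server per year ($)

servers_eliminated = 1000 - 143  # from the consolidation sketch above

print(f"One-off hardware savings: ${servers_eliminated * server_capital_cost:,}")
print(f"Recurring savings: ${servers_eliminated * annual_power_cooling:,} per year")
```

Even with these modest placeholder costs, eliminating 857 servers yields a one-off saving of over $4 million plus recurring operational savings every year.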

At least one company – VMware – became successful by meeting the market demand for virtualization solutions, and was perhaps the first child of this new era of resource abundance. VMware is no longer the only player: the economic benefits of this approach to software architecture have spawned a diverse and increasingly competitive ecosystem, including offerings from Microsoft, Oracle, Citrix and Parallels, as well as the open-source Xen hypervisor.

The ability to turn the compute, memory, storage and network resources of many independent servers into a virtualized pool, allocated as needed to software applications, has laid the foundation for the emergence of cloud computing.
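
A minimal sketch of what "allocated as needed" means in practice is a first-fit placement loop over a pool of hosts. The Host structure and the figures are invented for illustration; production schedulers weigh many more constraints, such as affinity, licensing and failover headroom:

```python
from dataclasses import dataclass

@dataclass
class Host:
    """One physical server contributing capacity to the shared pool."""
    name: str
    free_cpus: int       # unallocated CPU cores
    free_mem_gb: int     # unallocated memory in GB

def place_vm(pool: list[Host], cpus: int, mem_gb: int) -> str | None:
    """First-fit placement: put the VM on the first host with enough headroom."""
    for host in pool:
        if host.free_cpus >= cpus and host.free_mem_gb >= mem_gb:
            host.free_cpus -= cpus
            host.free_mem_gb -= mem_gb
            return host.name
    return None  # pool exhausted: a real system would queue or add capacity

# A small pool of two physical hosts.
pool = [Host("host-a", free_cpus=16, free_mem_gb=64),
        Host("host-b", free_cpus=16, free_mem_gb=64)]

for vm, cpus, mem in [("web-1", 4, 8), ("db-1", 8, 32), ("web-2", 8, 16)]:
    print(f"{vm} -> {place_vm(pool, cpus, mem)}")
```

First-fit is the simplest possible policy; the essential point is that applications request resources from a shared pool rather than owning dedicated machines.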

Part 4 of this series will look at how the commoditization of computing resources, and the ability to manage and allocate them on demand, forms the foundation of new cloud computing architectures – and a radical change in the way software applications and services are created, managed and consumed.