Over the past decade, data center virtualization has underpinned every major trend in software, from search to social networks to software as a service (SaaS). Most of the applications we use, and cloud computing as we know it today, would not have been possible without the server utilization and cost savings that virtualization delivered.
However, new cloud architectures are re-imagining the entire data center. Virtualization, the practice of carving a large, expensive server into several virtual machines, is taking a back seat and struggling to keep up.
Instead of dividing the resources of individual servers, these architectures aggregate large numbers of servers into a single warehouse-scale "computer" that runs highly distributed applications.
These changes will affect every IT organization, and developers need to start paying attention.