Getting Deeper into Infrastructure with Server Virtualization
Virtualization in computing dates back to the 1960s. Since then, software-based emulation of hardware has been fundamental to computer science and engineering. When computer time was a scarce resource, time-sharing was a welcome innovation, and institutional data centers grew as fast as companies like IBM and DEC could deliver multi-user technology.
By the early 1970s, IBM's first experimental hypervisors had evolved into commercially available software capable of running virtual machines. When the first virtual machines reached the market, they offered a refreshingly flexible alternative to batch processing.
Virtual Machine Infrastructure in Windows and Linux
Virtual machines have grown in capability even faster than the silicon circuits of the hardware they run on. Virtualization has evolved into complete software bundles built on hypervisors, extending, where required, to the entire hyper-converged infrastructure stack.
Servers based on the x86 architecture first entered service in the late 1990s, and the digital realm has never looked back. The x86 processor family, along with compatible chips from AMD, has become the dominant platform powering personal computers and servers alike. Today, software-based data management applications build almost any conceivable digital solution on top of inexpensive, off-the-shelf compute infrastructure built around commodity x86 processors.
In Linux, x86 found a friend and ally. Linux's creator, Linus Torvalds, chose x86 over the ARM architecture. ARM-based chips have been very successful in mobile chipsets; however, according to Torvalds, the ARM hardware market is too fragmented to serve reliably as a platform for Linux infrastructure.
Understanding the Architecture to Realize Unlimited Potential
Multiple paths lead to SDDCs, but solutions remain vendor-specific. As standards emerge, abstraction will become easier and delivery faster. Hyper-convergence is one such path, providing the tools to run processes more efficiently, with cloud-like agility and scalability. However, this approach carries the risk of lock-in, limiting the choice of applications to a single vendor.
To avoid such traps, organization leaders with ambitions to be at the leading edge need to understand the architecture at the highest level. Commoditizing the hardware and abstracting the compute, storage, and networking layers has removed physical constraints, making possible software-defined data center solutions with enormous potential and agility.
Abstraction by Virtualization Drives Data Center Innovation
Virtualization is the first layer of abstraction. Along with the commoditized x86 physical layer, abstraction has opened up new digital possibilities. Computing has evolved from the early days of time-sharing, separating the digital logic engines from the physical machines that cradle them. With this shift, data centers have become further removed, abstracted, from the hardware.
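On Linux, the x86 hardware support that modern hypervisors build on is visible directly in /proc/cpuinfo: Intel's VT-x and AMD's AMD-V extensions show up as the `vmx` and `svm` CPU flags. As a minimal illustration (the helper name here is ours, not part of any standard tool), a small parser for that flags field:

```python
def detect_virt_extension(flags: str) -> str:
    """Map the 'flags' field of /proc/cpuinfo to the x86 hardware
    virtualization extension it advertises, if any."""
    flag_set = set(flags.split())
    if "vmx" in flag_set:   # Intel VT-x
        return "Intel VT-x"
    if "svm" in flag_set:   # AMD-V
        return "AMD-V"
    return "none"

if __name__ == "__main__":
    # On a live Linux host, read the real flags line from /proc/cpuinfo.
    try:
        with open("/proc/cpuinfo") as f:
            for line in f:
                if line.startswith("flags"):
                    print(detect_virt_extension(line.split(":", 1)[1]))
                    break
    except FileNotFoundError:
        print("not a Linux host")
```

A result of "none" on physical x86 hardware usually means the extensions are disabled in firmware; hypervisors such as KVM require them for full-speed virtual machines.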
In software-defined data centers (SDDCs), the infrastructure extends virtualization's principle of abstraction across all data center resources. Automation via hyper-convergence, together with intelligent software control of configuration and management, makes IT infrastructure far more manageable. The improved data center efficiency that results from abstraction drives down capital costs and cuts deployment provisioning from days to hours.
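One concrete instance of infrastructure expressed as software rather than hardware is a hypervisor domain definition. The sketch below is a minimal libvirt domain XML file for a KVM guest on x86 (the VM name, disk path, and sizes are illustrative); a host could register it with `virsh define` and provision the machine in minutes rather than racking new hardware.

```xml
<!-- Illustrative libvirt domain definition: names, paths, and sizes are examples -->
<domain type="kvm">
  <name>demo-vm</name>
  <memory unit="MiB">2048</memory>
  <vcpu>2</vcpu>
  <os>
    <type arch="x86_64">hvm</type>
    <boot dev="hd"/>
  </os>
  <devices>
    <disk type="file" device="disk">
      <driver name="qemu" type="qcow2"/>
      <source file="/var/lib/libvirt/images/demo-vm.qcow2"/>
      <target dev="vda" bus="virtio"/>
    </disk>
    <interface type="network">
      <source network="default"/>
    </interface>
  </devices>
</domain>
```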
Infrastructure Trends: Leaner and Lighter with SDDCs
According to Gartner, the majority of enterprises will consider SDDCs mandatory by 2020. The need to pivot with unprecedented agility while maintaining continuity will drive the growth of abstraction. Microsoft and VMware are the current leading SDDC technology vendors, with Dell, HPE, Cisco, Lenovo, and Oracle developing strong positions.
Early-mover industry sectors include financial services, healthcare, utilities, and communications. Other sectors, such as hospitality, transport, and construction, remain holdouts. The business cases, best use cases, and risks associated with SDDCs are still not widely understood, which is why leaders with leading-edge ambitions need to understand the architecture at least at a high level.