Virtualization is the creation of a virtual — rather than actual — version of something, such as an operating system (OS), a server, a storage device or network resources.
Virtualization uses software that simulates hardware functionality to create a virtual system. This practice allows IT organizations to run multiple operating systems, virtual machines and applications on a single physical server. The benefits of virtualization include greater efficiency and economies of scale.
OS virtualization is the use of software to allow a piece of hardware to run multiple operating system images at the same time. The technology got its start on mainframes decades ago, allowing administrators to avoid wasting expensive processing power.
Virtualization describes a technology in which an application, guest OS or data storage is abstracted away from the true underlying hardware or software.
A key use of virtualization technology is server virtualization, which uses a software layer, called a hypervisor, to emulate the underlying hardware. This often includes the CPU, memory, input/output (I/O) and network traffic.
Hypervisors take the physical resources and separate them so they can be utilized by the virtual environment. They can sit on top of an OS or they can be directly installed onto the hardware. The latter is how most enterprises virtualize their systems.
The Xen hypervisor is an open source software program that is responsible for managing the low-level interactions that occur between virtual machines (VMs) and the physical hardware. In other words, the Xen hypervisor enables the simultaneous creation, execution and management of various virtual machines in one physical environment.
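As a rough illustration of how management software talks to a hypervisor such as Xen or KVM, the sketch below uses the libvirt Python bindings (the libvirt-python package, an assumption about the tooling rather than anything required by Xen itself) to connect to a local hypervisor and list the VMs it manages. The connection URI depends on which hypervisor is actually running: "qemu:///system" is typical for KVM, while "xen:///system" targets Xen.

```python
# Minimal sketch: query a local hypervisor through libvirt.
# Assumes the libvirt daemon and the libvirt-python package are installed;
# the URI ("qemu:///system" for KVM, "xen:///system" for Xen) is an
# assumption about the environment.
import libvirt

def list_vms(uri="qemu:///system"):
    conn = libvirt.open(uri)                 # connect to the hypervisor
    try:
        model, mem_mb, cpus, *_ = conn.getInfo()
        print(f"Host: {conn.getHostname()} ({model}, {cpus} CPUs, {mem_mb} MB RAM)")
        for dom in conn.listAllDomains(0):   # every defined VM, running or not
            state, _, mem_kib, vcpus, _ = dom.info()
            running = "running" if state == libvirt.VIR_DOMAIN_RUNNING else "stopped"
            print(f"  {dom.name()}: {vcpus} vCPUs, {mem_kib // 1024} MB, {running}")
    finally:
        conn.close()

if __name__ == "__main__":
    list_vms()
```

The same handful of calls works against several hypervisors because libvirt hides the low-level differences behind one API, which is exactly the kind of abstraction the hypervisor itself provides to its guests.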
With the help of the hypervisor, the guest OS, normally interacting with true hardware, is now doing so with a software emulation of that hardware; often, the guest OS has no idea it’s on virtualized hardware.
While the performance of this virtual system is not equal to the performance of the operating system running on true hardware, the concept of virtualization works because most guest operating systems and applications don’t need the full use of the underlying hardware.
This allows for greater flexibility, control and isolation by removing the dependency on a given hardware platform. While initially meant for server virtualization, the concept of virtualization has spread to applications, networks, data and desktops.
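To make the earlier point concrete, that most guests never need the full capacity of the underlying hardware, here is a purely illustrative back-of-the-envelope estimate of how many lightly loaded VMs one host might support. Every number in it is a hypothetical assumption, not a measurement or a sizing recommendation.

```python
# Hypothetical consolidation estimate; all figures are illustrative assumptions.
host_cores = 32              # physical CPU cores on the host
host_ram_gb = 256            # physical RAM on the host

vm_vcpus = 4                 # vCPUs allocated to each guest
vm_ram_gb = 8                # RAM allocated to each guest
avg_cpu_utilization = 0.20   # guests assumed to use ~20% of their vCPUs

# CPU can be oversubscribed because guests rarely run flat out;
# RAM generally cannot be oversubscribed as aggressively.
effective_cpu_demand = vm_vcpus * avg_cpu_utilization
vms_by_cpu = int(host_cores / effective_cpu_demand)
vms_by_ram = int(host_ram_gb / vm_ram_gb)

print(f"CPU allows ~{vms_by_cpu} VMs, RAM allows ~{vms_by_ram} VMs")
print(f"Practical limit: ~{min(vms_by_cpu, vms_by_ram)} VMs on this host")
```

Under these assumptions the host is limited by memory rather than CPU, which is a common pattern in consolidation planning.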
The virtualization process follows these general steps:
1. The hypervisor detaches the physical resources from their physical environment.
2. Those resources are divided, as needed, among the various virtual environments.
3. Users work with applications and perform computations inside the virtual environment, which behaves like an independent machine.
The virtual environment is often referred to as a guest machine or virtual machine. The VM acts like a single data file that can be transferred from one computer to another and opened in both; it is expected to perform the same way on every computer.
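That portability comes from the fact that a VM boils down to a disk image plus a description of its virtual hardware. As a hedged sketch using the libvirt Python bindings, you could export a VM's definition from one host and register it on another; the host names and domain name below are hypothetical, and the disk image itself would still need to be copied separately (for example, over shared storage or scp).

```python
# Sketch: move a VM definition between two hosts with libvirt.
# "source-host", "target-host" and "example-vm" are hypothetical names;
# the guest's disk image must be transferred to the target host separately.
import libvirt

src = libvirt.open("qemu+ssh://source-host/system")
dst = libvirt.open("qemu+ssh://target-host/system")
try:
    dom = src.lookupByName("example-vm")
    xml = dom.XMLDesc(0)      # the VM's virtual hardware, described as XML
    dst.defineXML(xml)        # register the same VM definition on the target
finally:
    src.close()
    dst.close()
```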
You probably know a little about virtualization if you have ever divided your hard drive into different partitions. A partition is the logical division of a hard disk drive to create, in effect, two separate hard drives.
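If you want to see those logical divisions on your own machine, the short sketch below lists them with the third-party psutil package (an assumption; it is not part of the Python standard library).

```python
# List the logical partitions the operating system currently exposes.
import psutil

for part in psutil.disk_partitions():
    print(f"{part.device} mounted at {part.mountpoint} ({part.fstype})")
```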
Virtualization is making headway in six areas of IT: network virtualization, storage virtualization, server virtualization, data virtualization, desktop virtualization and application virtualization.
The layer of software that enables this abstraction is often referred to as the hypervisor. The most common type, a Type 1 hypervisor, sits directly on bare metal and virtualizes the hardware platform for use by the virtual machines. KVM (Kernel-based Virtual Machine) is an open source, Linux kernel-based hypervisor that provides the same Type 1 benefits as other bare-metal hypervisors. A Type 2 hypervisor runs on top of a host operating system and is more often used for testing and lab environments.
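On a Linux host you can check whether KVM has what it needs before installing anything. The sketch below is Linux-specific and assumes the usual locations of /proc/cpuinfo and /dev/kvm; it looks for the Intel VT-x (vmx) or AMD-V (svm) CPU flag and for the /dev/kvm device that the KVM kernel module exposes.

```python
# Linux-only sketch: check whether this host can run KVM guests.
import os

def kvm_ready():
    # x86 CPUs advertise hardware virtualization support as the "vmx"
    # (Intel VT-x) or "svm" (AMD-V) flag in /proc/cpuinfo.
    flags = set()
    with open("/proc/cpuinfo") as f:
        for line in f:
            if line.startswith("flags"):
                flags.update(line.split(":", 1)[1].split())
    hw_virt = "vmx" in flags or "svm" in flags

    # The KVM kernel module exposes /dev/kvm once it is loaded.
    kvm_device = os.path.exists("/dev/kvm")
    return hw_virt, kvm_device

if __name__ == "__main__":
    hw, dev = kvm_ready()
    print(f"Hardware virtualization flag present: {hw}")
    print(f"/dev/kvm device available: {dev}")
```

If both checks pass, the machine can host hardware-accelerated guests with KVM and a management layer such as libvirt.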
Virtualization can be viewed as part of an overall trend in enterprise IT that includes autonomic computing, a scenario in which the IT environment will be able to manage itself based on perceived activity, and utility computing, in which computer processing power is seen as a utility that clients can pay for only as needed. The usual goal of virtualization is to centralize administrative tasks while improving scalability and workloads.
The main advantage of a virtualized environment is that it lets companies get the most output from their hardware. Additional benefits for both businesses and data centers include greater efficiency and economies of scale, more flexibility and isolation, centralized administration and improved scalability.
Before converting to a virtualized environment, it is important to consider the various upfront costs. The necessary investment in virtualization software, as well as hardware that might be required to make the virtualization possible, can be costly. If the existing infrastructure is more than five years old, an initial renewal budget will have to be considered.
Fortunately, many businesses have the capacity to accommodate virtualization without spending large amounts of cash. Furthermore, the costs can be offset by collaborating with a managed service provider that provides monthly leasing or purchase options.
There are also software licensing issues to consider when creating a virtualized environment. Companies must ensure that they have a clear understanding of how their vendors treat software use within a virtualized environment. This is becoming less of a limitation as more software providers adapt to the increased use of virtualization.
Converting to virtualization takes time and comes with a learning curve. Implementing and managing a virtualized environment requires IT staff members to be trained in, and develop expertise with, virtualization. Furthermore, some applications do not adapt well when moved into a virtual environment, so the IT staff should identify and address these challenges before converting.
There are also security risks involved with virtualization. Data is crucial to the success of a business and, therefore, is a common target for attacks. Consolidating many workloads onto shared virtual infrastructure can enlarge the attack surface and increase the potential impact of a data breach.
Finally, in a virtual environment, users give up some direct control because several layers, including the hypervisor, the underlying hardware and the management tools, must work together to perform a task. If any one of them fails, the entire operation can fail.