Understanding Virtualisation [Podcast]

The foundation of a typical virtualised platform is its virtualisation software, known as a hypervisor, which emulates computer hardware and allows multiple operating systems to run on a single physical host computer.

From an application perspective, each guest operating system appears to have exclusive access to the host's processor, memory and other I/O resources.

Ben Chai, Partner/Analyst at Incoming Thought and senior professional with Lanix

In reality, of course, the hypervisor controls the IT system's resources and allocates elements of them to each guest operating system on a dynamic basis. This ensures that the guest operating systems - known as virtual machines - do not interfere with each other.
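The dynamic allocation described above can be sketched in a few lines. This is a minimal, hypothetical illustration of proportional-share allocation - not any real hypervisor's scheduler - in which each guest is granted a slice of the host's CPU in proportion to a configured share, so no guest can crowd out the others. The VM names, share values and host capacity are all invented for the example.

```python
HOST_CPU_MHZ = 8000  # hypothetical host CPU capacity

def allocate(vms):
    """Grant each VM a CPU slice proportional to its configured share."""
    total_shares = sum(vm["shares"] for vm in vms)
    return {vm["name"]: HOST_CPU_MHZ * vm["shares"] / total_shares
            for vm in vms}

vms = [
    {"name": "guest-a", "shares": 2},
    {"name": "guest-b", "shares": 1},
    {"name": "guest-c", "shares": 1},
]

print(allocate(vms))  # guest-a gets half the host; b and c a quarter each
```

Real schedulers are far more sophisticated - they account for idle guests, memory ballooning and I/O contention - but the principle of carving one physical resource into enforced, proportional slices is the same.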

Most modern virtual machines run on hosted hypervisors, which require a host operating system - Apple Mac OS, Linux or Windows - and install in much the same way as an ordinary software application.

The key advantage of virtualisation is that few computers running a single operating system make full use of their hardware and system resources.

Running multiple virtual machines on one or more physical boxes means that, for example, ten users can run their virtual machines on as few as four system boxes, maximising the efficiency of the hardware.
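The arithmetic behind that consolidation can be made concrete with a simple first-fit packing sketch. This is an illustrative toy, not a real capacity planner: each VM's load is an invented fraction of one physical server's capacity, and VMs are placed on the first host with room to spare.

```python
def consolidate(vm_loads, host_capacity=1.0):
    """First-fit packing: place each VM on the first host with spare capacity.

    Returns a list of hosts, each host being the list of VM loads placed on it.
    """
    hosts = []
    for load in vm_loads:
        for host in hosts:
            if sum(host) + load <= host_capacity:
                host.append(load)
                break
        else:
            hosts.append([load])  # no host had room: bring another box online
    return hosts

# Ten VMs, each using roughly 30% of a physical server's capacity:
hosts = consolidate([0.3] * 10)
print(len(hosts))  # ten single-purpose boxes collapse onto four hosts
```

No host is ever loaded beyond its capacity, yet the ten workloads need only four boxes instead of ten - which is precisely the utilisation gain the article describes.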

Each user, of course, 'sees' a single instance of a computing platform, just as if they were given a unique computer system for their exclusive use.

From an IT perspective, virtualisation is rather like spinning plates in the air - with practice and a good platform, it will work and work well. But if the underlying system crashes, it can crash multiple instances of virtual machines.

That's the downside. The upside is the vastly improved usage of IT resources and, in a large organisation, the ability to load balance multiple instances of virtual machines between multiple physical servers. Despite all the media hype, virtualisation is actually not a new technology, and dates all the way back to the 1960s.
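The load balancing mentioned above can be sketched as a greedy rebalancer: while moving the smallest VM from the busiest server to the idlest one would narrow the gap between them, make the move. The server and VM names and loads below are invented, and real platforms use live migration with many more constraints; this only illustrates the idea.

```python
def total(vms):
    """Sum the load of a host's (vm_name, load) pairs."""
    return sum(load for _, load in vms)

def rebalance(servers):
    """Greedily shift VMs from the busiest to the idlest physical server.

    servers: dict mapping server name -> list of (vm_name, load) pairs.
    """
    while True:
        busiest = max(servers, key=lambda s: total(servers[s]))
        idlest = min(servers, key=lambda s: total(servers[s]))
        if not servers[busiest]:
            return servers
        gap = total(servers[busiest]) - total(servers[idlest])
        vm = min(servers[busiest], key=lambda v: v[1])  # smallest VM to move
        if 2 * vm[1] >= gap:  # moving it would not narrow the imbalance
            return servers
        servers[busiest].remove(vm)
        servers[idlest].append(vm)

servers = {
    "host-1": [("vm-a", 0.5), ("vm-b", 0.4)],
    "host-2": [],
}
rebalance(servers)  # vm-b migrates to the idle host-2
```

After rebalancing, each physical server carries one VM, which is the essence of spreading virtual machine instances across multiple hosts.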

According to Professor John Walker, a member of the Security Advisory Group of ISACA, the not-for-profit IT security association, the technology has recently come back into vogue, largely as a result of the financial benefits it brings to most organisations.

Professor Walker, who is managing director of Secure Bastion, says that, whilst virtualisation's benefits include reduced server sprawl and a quicker build time, there are clear security issues.

As with any system or application configuration, he adds, control is vital to security, and IT professionals should remember that this security principle applies to online and offline images alike.

Against this backdrop, he argues that IT professionals should take care to ensure that new builds are tracked, and that, again, as with conventional systems and applications, virtualised environments need to be patched and fixed.
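The tracking discipline Professor Walker calls for can be pictured as a simple image inventory check. This is a hypothetical sketch - the image records and patch levels are invented - but it captures the key point: offline images must be audited against the required patch level just as online ones are.

```python
REQUIRED_PATCH_LEVEL = 7  # hypothetical current patch baseline

images = [
    {"name": "web-01",  "online": True,  "patch_level": 7},
    {"name": "db-01",   "online": True,  "patch_level": 6},
    {"name": "test-01", "online": False, "patch_level": 4},  # offline images count too
]

def needs_patching(images, required):
    """List every image below the required patch level, online or offline."""
    return [img["name"] for img in images if img["patch_level"] < required]

print(needs_patching(images, REQUIRED_PATCH_LEVEL))  # ['db-01', 'test-01']
```

An audit that skipped offline images would miss test-01 entirely - exactly the gap between online and offline images that the security principle above warns against.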

Despite all of this, the current implementation of virtualisation technology is still a relatively young science, with an increasingly powerful range of hardware-based servers now able to support multiple virtual machines without breaking the bank.