Virtualization controls tech costs by sharing resources.
By Rick Telberg
Hewlett-Packard, HP Technology at Work, May 2007
You’ll be hearing more about “virtualization,” especially with the release of Microsoft’s Windows Vista operating system. Exactly what virtualization is depends on the context in which the term is being used. For the most part, virtualization refers to the process of creating a software emulation of a hardware platform.
That sounds like a mouthful to most non-techies. But virtualization, especially in the form of hardware emulation, has been around for decades.
If you haven’t encountered it yet, you will soon, and it’s worth getting to know a little better. Accountants will be especially interested in virtualization because it can reduce costs by increasing the utilization of the servers you may already have. It can also make your IT systems more flexible and help ensure that computing capacity automatically keeps pace with demand.
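To see why higher utilization translates into lower costs, consider a rough back-of-the-envelope calculation. The short Python sketch below is purely illustrative; the server count and utilization figures are hypothetical, not drawn from this article.

import math

# A rough, hypothetical illustration of server consolidation savings.
# The server count and utilization figures below are made up for the example.
physical_servers = 10      # existing servers, each hosting one application
avg_utilization = 0.15     # each runs at roughly 15% of capacity (hypothetical)
target_utilization = 0.60  # comfortable load for a consolidated host (hypothetical)

# Total workload, expressed in "whole servers" of capacity actually in use
workload = physical_servers * avg_utilization

# Hosts needed if each consolidated server is loaded to the target level
hosts_needed = math.ceil(workload / target_utilization)

print(f"Workload equals {workload:.1f} fully used servers")
print(f"Physical hosts needed after consolidation: {hosts_needed}")

Under these made-up assumptions, ten lightly used machines could be replaced by three consolidated hosts running virtual servers, which is where the cost savings come from.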
The first really popular personal computer virtualization system was the Connectix Virtual PC for the Macintosh, first introduced in 1997. Running under the Mac OS, this piece of software emulated a Pentium CPU and its associated motherboard and peripherals, allowing the Windows operating system and Windows-based applications to run on the Mac. Microsoft bought the company, and its technology, in 2003. With the advent of Intel-based Macs, the original reason for Virtual PC is now moot.
Instead, a major application of the current version of Microsoft’s Virtual PC, and of similar products from other vendors such as VMware, is to run multiple copies of an operating system simultaneously on a single PC. These can be diverse operating systems, including Windows Vista, Windows XP, and even Linux, and each OS can be running a different application or applications. Microsoft touts its newest version of Virtual PC, still in beta, as a great tool for migrating to Windows Vista.
Another area where virtualization is making great headway is virtual servers. With virtual servers, you can consolidate the workloads of several physical machines onto a single server, or keep virtual machines configured as standbys in case one of your production servers goes down. If you preserve an image of the original server running the application as a backup, it can be transferred to the standby virtual machine (probably running on a different physical server) with no noticeable downtime.
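In concept, the standby scenario just described comes down to three steps: keep a saved image of the production server, watch the production server, and bring the image up on a standby host if production stops responding. The Python sketch below is only a conceptual illustration; the image paths, health check, and start-up routine are hypothetical stand-ins, not the API of any particular virtualization product.

# Conceptual sketch of image-based failover; all names here are hypothetical.
PRODUCTION_IMAGE = "backups/accounting-server.vhd"  # preserved image of the production server
STANDBY_IMAGE = "standby/accounting-server.vhd"     # where the standby host expects to find it

def production_is_healthy():
    # Stand-in for a real health check (ping, service probe, monitoring alert).
    return False  # pretend the production server has just gone down

def transfer_image(source, destination):
    # Stand-in for copying the preserved image to the standby host.
    print(f"Transferring {source} to {destination}")

def start_virtual_machine(image_path):
    # Stand-in for whatever your virtualization product uses to boot an image.
    print(f"Starting virtual machine from {image_path}")

# The failover logic: if production fails, bring the saved image up on standby.
if not production_is_healthy():
    transfer_image(PRODUCTION_IMAGE, STANDBY_IMAGE)
    start_virtual_machine(STANDBY_IMAGE)

In practice the health check, image transfer, and restart would be handled by the virtualization product or the IT department, but the sketch shows why a preserved image makes near-zero-downtime recovery possible.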
It takes a fair amount of technical sophistication to set up a virtual PC, so it’s probably a task best left to IT departments. In the near future, you can expect to see more releases of pre-configured virtual systems designed to do one specific task or to demo a virtual application or operating system.
Microsoft already has several of these available for download on its Virtual PC site. These come in the form of virtual hard disks (VHDs) and, like Virtual PC itself, they are free. The current crop of VHDs are test drives of Microsoft’s various server products, including Windows Server 2003, Exchange Server 2007, and even SQL Server 2005.
The big advantage of this approach is that the VHD versions are all installed, pre-configured, and ready to run with your applications. All you have to do is install Virtual PC and load the virtual hard disk containing the server that you want to test.
The VHD program is also an excellent way to get your feet wet and learn what virtualization is and how it works. Just be careful with the applications that you try running on a virtual server: it is just as possible to alter or even corrupt data files on a virtual server as it is on a regular one. For safety’s sake, it’s better to practice on a dedicated PC with copies of your applications and data until you feel confident that you understand the underlying processes.