SCCyberworld

Tuesday, February 17, 2009

Shopping is not the answer

By: Mr. Gery Messer, Red Hat’s President of Asia Pacific and Japan

Here’s a common problem in many data centers in Malaysia: one application overloads its server while other servers stand underutilized. That is probably not an issue in good times, when IT budgets are healthy, but in the current period of tight credit and shrinking budgets it is not smart. The solution to overloading is not to buy more servers, of which the company already has plenty, but to buy more flexibility. Flexibility can be achieved through virtualization: when one application is struggling to respond to requests, the IT manager simply moves it to another server without having to take it offline.
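As a minimal sketch of what such a live move can look like with the libvirt Python bindings (assuming libvirt-python is installed; the domain name 'app-vm' and destination host 'host2.example.com' are hypothetical placeholders):

import libvirt

# Connect to the local hypervisor and to the destination host.
# 'qemu:///system' is the standard libvirt URI for a local KVM/QEMU host;
# the domain and host names here are placeholders.
src = libvirt.open('qemu:///system')
dst = libvirt.open('qemu+ssh://host2.example.com/system')

dom = src.lookupByName('app-vm')

# VIR_MIGRATE_LIVE copies the guest's memory while it keeps running,
# so the application is never taken offline.
dom.migrate(dst, libvirt.VIR_MIGRATE_LIVE, None, None, 0)

dst.close()
src.close()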

Server consolidation is an obvious benefit of virtualization. With server virtualization, a physical server can be partitioned into multiple virtual servers such that each appears as a dedicated machine with its own operating system and capabilities. This is a simple and effective way to make use of underutilized server CPU resources.
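As an illustrative sketch, a physical host can be partitioned into guests with the libvirt Python bindings; the guest definition below is deliberately minimal and hypothetical (real definitions carry far more detail):

import libvirt

conn = libvirt.open('qemu:///system')

# A minimal, hypothetical guest: 1 GiB of RAM, one vCPU, booting from
# a pre-built disk image. To the software inside, it appears to be a
# dedicated machine with its own operating system.
guest_xml = """
<domain type='kvm'>
  <name>web-vm-01</name>
  <memory unit='MiB'>1024</memory>
  <vcpu>1</vcpu>
  <os><type arch='x86_64'>hvm</type></os>
  <devices>
    <disk type='file' device='disk'>
      <source file='/var/lib/libvirt/images/web-vm-01.img'/>
      <target dev='vda' bus='virtio'/>
    </disk>
  </devices>
</domain>
"""

dom = conn.defineXML(guest_xml)  # register the virtual server
dom.create()                     # boot it
conn.close()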

By breaking the one-to-one relationship between hardware and software, virtualization makes it possible to move from the “one application, one server” model common in today’s data center to an environment with five to ten virtual servers running on a single physical server. Typically, individual machines in data centers run at 10 to 15 percent utilization. With virtualization, data centers can achieve utilization rates of 90 percent or more; the arithmetic is straightforward, since eight guests that each kept a dedicated machine about 11 percent busy will keep a shared host close to 90 percent busy. That’s efficiency worth working for.

Essentially, what virtualization does is decouple an operating system, and the services and applications it supports, from specific physical hardware. Viewed this way, virtualization performs the same job as one half of an operating system: the part that interfaces with the hardware.

Virtualization is now estimated to run on only six to eight percent of servers. This is about to change very quickly: industry pundits forecast that virtualization will run on some 90 percent of servers in the next two years. Combined with the utilization gains described above, that would mean 90 percent of servers being used 90 percent of the time.

But the real potential of virtualization lies beyond server consolidation. Data centers are only beginning to learn what they can do with virtualization, and how it can help businesses cope with the new paradigm of slow or no growth. New use cases of virtualization include application migration, high availability and failover, virtual desktops and the new technology paradigm of cloud computing.

Virtualization makes it possible for several different operating systems to co-exist seamlessly while sharing the same hardware resources. It will be an integral part of the next-generation operating system. Currently, there are different virtualization offerings.

Some virtualization technology is offered as a suite of standalone emulation software. Such software enables native virtualization through a hypervisor, which grants the guest partial direct access to the hardware and simulates the rest, allowing the user to load an operating system on top.

A new and significant virtualization technology is the kernel-based virtual machine, or KVM. KVM has been integrated into the Linux kernel, taking advantage of the operating system to deliver virtualization with robust security, high performance and a wide range of hardware support. KVM provides the capability to split a single physical computer into multiple virtual machines. With KVM, the virtualization capability is built into the operating system itself and does not require new bolt-on or graft-underneath technology.
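Because KVM lives inside the kernel, its presence can be checked from ordinary userspace with nothing more than the Python standard library; a small sketch:

import os

# KVM relies on hardware virtualization extensions: the 'vmx' flag on
# Intel CPUs or the 'svm' flag on AMD CPUs.
with open('/proc/cpuinfo') as f:
    cpuinfo = f.read()
hw_virt = 'vmx' in cpuinfo or 'svm' in cpuinfo

# When the kvm kernel module is loaded it exposes /dev/kvm, the device
# node that user-space tools such as QEMU open to create virtual machines.
kvm_ready = os.path.exists('/dev/kvm')

print('CPU virtualization extensions:', hw_virt)
print('/dev/kvm present (KVM module loaded):', kvm_ready)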

In tomorrow’s operating systems, virtualization will simply be a standard operating system feature. This is why virtualization is so important in today’s data center. In essence, the choice of a virtualization solution is a choice about the operating system that supports it.

In tough business times, the ability to create new systems and services for competitive advantage without having to install additional hardware is a real incentive for firms to implement virtualization. In fact, Malaysia’s Open Source Competency Centre (OSCC) has set up the Advanced Virtualization Facility (AVF) to accommodate government agencies that are interested in using or testing OSS products without incurring the high acquisition cost of procuring traditional servers or new hardware. The AVF also aims to provide the agencies an incentive to try out OSS products and ease the migration from proprietary software.

Companies today would not exist without technology; hence, high availability is a top business priority. Being able to create virtualized disaster recovery environments assures organizations that they can maintain availability through failover, and it can be achieved without the high cost of replicating identical hardware environments. For the business, this translates into real cost savings. Equally compelling is the ability to create virtualized environments to test different disaster scenarios at greatly reduced cost.

One step further along the path of virtualization is the virtual desktop. Desktop virtualization brings the ability to deploy both Linux and Windows desktops in the enterprise. Enterprises will be able, for example, to serve a Windows or Linux desktop from a remote server in the data center to users on inexpensive Linux thin clients. This resolves current management and scalability problems. Like server virtualization, desktop virtualization enables the consolidation of hardware resources and delivers cost savings.

But the greatest potential for exploiting virtualization lies in cloud computing, which grows out of a dynamic, virtualized infrastructure. Virtualization is a foundational technology for cloud computing: it enables flexible, shared pools of resources such as servers, storage and network connections. Although cloud computing is still at an early stage, its potential is already identifiable.

Cloud computing is a way to increase capacity or capability quickly without having to invest in new infrastructure and new software licenses. It shifts spending from capital expenditure to operating expenditure, which for companies in tough times is a boon, and it affords a more predictable and manageable cost structure. In tough times, the tough don’t go shopping.
