SELECT
VOL 02 | ISSUE 13 MAY 2010 | RS. 50 A 9.9 MEDIA PUBLICATION
SERIES
VIRTUALISATION Still confused about the concepts and technologies behind virtualisation? Wondering how you can benefit as a solution provider? Dip into our special package inside and get ready for multiple business opportunities
DESKTOP VIRTUALISATION
There is tremendous scope and value in taking desktops virtual PAGE 11
OPEN SOURCE
Its marriage with virtualisation can create the best offspring PAGE 08
NETWORK VIRTUALISATION
More and more companies are using it to their benefit PAGE 20
editorial
Virtualisation is for Real
sanjay.gupta@9dot9.in
Virtualisation may have started off on a slow note in India, but it can take a quantum jump in the not-too-distant future
Most hyped technologies finally percolate down to mass adoption and typically see great channel participation – and virtualisation should be no exception. According to IDC, 18.2% of all servers shipped worldwide in the fourth quarter of 2009 were virtualised, compared to 15.2% in Q4, 2008. Senior Research Analyst Brett Waldman puts it in no uncertain terms in a news release: “Virtualisation is no longer the cool, unproven technology that people are looking into for quick cost savings. It has matured into an integral piece of the IT infrastructure.”

There’s no denying that the phenomenon is finding growing acceptance across different pieces of the tech pie – be it storage, networking, servers or desktops. And then there’s security for virtual environments and virtualisation management.

In India, the virtualisation story has just begun. There are some big enterprises that have used it in storage and as part of their server consolidation efforts, but largely, the huge potential of virtualisation remains virgin territory. A territory that is wide open for vendors as well as partners to explore.

I know this is easier said than done. The other day during a storage event for CIOs, I overheard a couple of guys expressing their confusion surrounding virtualisation and whether their organisations
are indeed ready for it. On the solution provider side, too, one doesn’t get to hear about many virtualisation projects going on. But virtualisation as a technology “will be the highest-impact trend changing infrastructure and operations through 2012” in the words of research firm Gartner. In fact, Gartner has predicted that the number of virtualised PCs will go up from under 5 million in 2007 to as many as 660 million by 2011. In all likelihood, virtualisation may have started off on a slow note in India, but it can take a quantum jump in the not-too-distant future. From what we keep on hearing, there’s a lot of interest about virtualisation – only, people need to be convinced of its benefits and relevance. Our Virtualisation Special this fortnight is an attempt to dispel the myths surrounding the technology and bring its various constituents on a single platter for you to savour. Do let us know how it tastes.
SANJAY GUPTA Editor Digit Channel Connect
sounding board
Rahul Meher, Managing Director, Leon Computers, Pune: “Virtualisation requires regular technical upgrade and proper understanding of the technology. Post-deployment, a key challenge a solution provider could face is to make sure that the customer understands well that adopting virtualisation does not mean large overheads and multiple servers.”
Research firm Gartner: Through 2012, 60% of virtualised servers will be less secure than the physical servers they replace, though this figure is expected to fall to 30% by the end of 2015. Furthermore, by 2012, as much as 50% of total enterprise datacentre workloads would be virtualised.
Vishak Raman, Regional Director, SAARC and Saudi Arabia, Fortinet: “A growing number of companies are now deploying virtualisation capabilities. Those with multiple sites or different, clearly separated business units or departments are progressively relying on virtualisation.”
Write to the Editor E-mail: editor@digitchannelconnect.com Snail Mail: The Editor, Digit Channel Connect, K-40, Connaught Circus, New Delhi 110 001
contents
VIRTUALISATION
Managing Director: Dr Pramath Raj Sinha Printer & Publisher: Kanak Ghosh EDITORIAL Editor: Sanjay Gupta Copy Editor: Akshay Kapoor Sr. Correspondents: Charu Khera (Delhi), Soma Tah (Mumbai) DESIGN Sr. Creative Director: Jayan K Narayanan Art Director: Binesh Sreedharan Associate Art Director: Anil VK Manager Design: Chander Shekhar Sr. Visualisers: PC Anoop, Santosh Kushwaha Sr. Designers: Prasanth TR & Anil T Photographer: Jiten Gandhi
Still confused about the concepts and technologies behind virtualisation? Wondering how you can benefit as a solution provider? Dip into our special package inside and get ready for multiple business opportunities

VIRTUAL REALITY OF THE CLOUD
Organisations using cloud computing are now taking advantage of virtualisation to make it easier to provision resources PAGE 05

VIRTUALISATION TO THE MAX
IT departments are nowadays asked to do more with less and virtualisation is the answer they are looking for PAGE 06
BRAND COMMUNICATION Product Manager: Ankur Agarwal SALES & MARKETING VP Sales & Marketing: Navin Chand Singh National Manager - Events and Special Projects: Mahantesh Godi (09880436623) Business Manager (Engagement Platforms) Arvind Ambo (09819904050) National Manager - Channels: Krishnadas Kurup (09322971866) Asst. Brand Manager: Arpita Ganguli Co-ordinator - MIS & Scheduling: Aatish Mohite Bangalore & Chennai: Vinodh K (09740714817) Delhi: Pranav Saran (09312685289) Kolkata: Jayanta Bhattacharya (09331829284) Mumbai: Sachin Mhashilkar (09920348755)
TWO TO TANGO (OPEN SOURCE)
While virtualisation has been a major success delivered in a proprietary form, its marriage with open source goes beyond today’s virtualisation achievements PAGE 08

DESKTOP VIRTUALISATION
There is tremendous scope and value in taking desktops virtual PAGE 11

A QUESTION OF BEST PRACTICES
What are the key things to keep in mind for security compliance in a virtualised environment? Here you go PAGE 16

NETWORK VIRTUALISATION
More and more companies are using it to their benefit PAGE 20

TAKING CONTROL
With the right tools and best practices, organisations can better manage virtualisation PAGE 22

OTHERS
EDITORIAL 01 | EVENT 03 | VENDOR SPEAK 04 | VIRTUAL DESKTOP 11 | SERVER CONSOLIDATION 15 | ANALYST SPEAK 19 | NETWORK VIRTUALISATION 20 | TRENDS 24

advertisers index
HP............................................................ Cover on Cover IBM...................................................... Inside Front Cover
PRODUCTION & LOGISTICS Sr. GM Operations: Shivshankar M Hiremath Production Executive: Vilas Mhatre Logistics: MP Singh, Mohd. Ansari, Shashi Shekhar Singh CHANNEL CHAMPS Sr Co-ordinator - Events: Rakesh Sequeira Events Executives: Pramod Jadhav, Johnson Noronha Audience Dev. Executive: Aparna Bobhate, Shilpa Surve OFFICE ADDRESS
Nine Dot Nine Interactive Pvt Ltd., KPT House, Plot 41/13, Sector 30, Vashi, Navi Mumbai - 400 703 Phone: 40789666 Fax: 022-40789540, 022-40789640 Printed and published by Kanak Ghosh for Nine Dot Nine Interactive Pvt Ltd. C/O KPT House, Plot 41/13, Sector 30, Vashi (Near Sanpada Railway Station), Navi Mumbai 400703 Editor: Anuradha Das Mathur C/O KPT House, Plot 41/13, Sector 30, Vashi (Near Sanpada Railway Station), Navi Mumbai 400703 Printed at Silverpoint Press Pvt. Ltd, TTC Ind. Area, Plot No. : A - 403, MIDC, Mahape, Navi Mumbai - 400709
Seagate...........................................................Back Cover K7 Computing........................................Inside Back Cover Epson...........................................................................25 COVER DESIGN: SAMEER KISHORE
event
Springboard Research organises Cloud India 2010
Cloud India 2010 conferences in New Delhi and Mumbai showcased the business advantages of cloud computing
CHARU KHERA
Springboard Research recently organised its Cloud India 2010 conferences in New Delhi and Mumbai. The event saw a great turnout of prominent people from the IT industry, channel fraternity as well as the media. It was sponsored by companies such as Citrix, Microsoft, Jindal Steel, Google, SAP and HCL. The conference presented CEOs, CFOs, CIOs, Heads of IT, Line of Business (LOB) Managers, ISVs, System Integrators (SIs), and other potential cloud channel providers with an opportunity to learn about the business advantages of cloud computing, best practices for extracting maximum value from cloud solutions, and the types of solutions currently available throughout Asia Pacific. Attendees were provided a clear understanding of cloud computing as a technology and
an IT sourcing trend, and the implications for their business. The key topic discussed at the event was that cloud computing is a fundamentally new approach to designing, delivering, and managing IT-enabled resources and capabilities. It can help organisations overcome the costs and complexities associated with more traditional technology, while allowing these same organisations to cost-effectively support innovation. Speaking at the event, Michael Barnes, VP Software & Asia Pacific Research, Springboard Research, also shared the key findings of the survey conducted by Springboard Research. “42 percent of respondents are using cloud services. The figures indicate the acceleration of cloud technology across the emerging Indian market. The key reasons for the growth in adoption of cloud technology are the economic slowdown, a favourable technology climate around virtualisation, open standards, and standardised and commoditised IT infrastructure,” he said. Barnes further shared that 38 percent of respondents believe that flexibility and scalability are the key reasons (as mentioned by CIOs) for adoption of cloud services in India, and 34 percent of respondents view cost (which includes investment in hardware, software and IT staff) as the key reason. Enterprise applications (36 percent), storage (31 percent) and servers (24 percent) were other major technologies that favoured the growth of cloud computing as a technology in India. Also, web conferencing/collaboration and e-mail were among the other reasons stated by respondents. Sharing insights about the event, Fred Giron, VP India & Research Operations, Springboard Research said, “About 250 CIOs and IT decision makers attended the Cloud India 2010 events, both in Delhi and Mumbai. It was great to be able to interact with high level IT managers and leading vendors on a hot topic such as cloud computing. We are happy to see such a success for our very first two events in India. As a matter of fact, we are already planning a new round of events in these cities for the third quarter of this year.” Ajay Kumar Dhir, Group CIO of Jindal Stainless Steel and a keynote speaker, shared, “The event helped us learn all about cloud computing and how one can leverage it to drive increased business value. The event also provided insights to help build a business case for cloud-based initiatives and solutions within an organisation through end-user case studies.” “Cloud computing pro-actively prevents the threats right at the source (Cloud – Internet) from intruding into our network. For new virus outbreaks, it generally takes a minimum of 4-6 hours for the pattern file to be developed and then deployed at the endpoint, and till such time, our endpoint/network is unprotected from the particular virus. The cloud version of OS 10.0 is able to find out the source of infection within a few minutes and it blocks the infection right at the source, which will give immediate protection to our clients,” shared Dhir. Adding to Dhir’s thoughts was Neeraj Mediratta, CEO, Ace Data Devices. “Cloud computing is not only the future of computing, it is the present and the entire past,” he said. Rajiv Bhalla, a guest present at the occasion, said, “All sessions were very insightful. The content shared was highly specialised, the speakers provided unique research insights, along with vendor-side briefings, case studies and go-to-market advisories.”
charu.khera@9dot9.in
DIGIT CHANNEL CONNECT
3
MAY 2010
vendor speak
“We aim to become one of the top 3 brands for multimedia speakers by 2015” Rajesh Bansal, Director – Marketing, Fenda Audio
Bansal talks to DCC about Fenda’s channel plans, segmentation in the multimedia speaker market and the challenges for partners that lie ahead DCC: Fenda has been present in India since 2004. Can you briefly tell us about your channel journey so far?
Though Fenda Audio started its operations in 2004, its foundation was laid way back in 1990 when K.K.Bansal (Founder & M.D.) started the business of manufacturing and marketing radios. Gradually, with word of mouth, the dealer network in major states grew. As the year 2000 approached, the audio market in India shifted to a combination of passive speakers and VCD/DVD players. At this time, Mukesh Bansal joined the group. DCC: Do you follow a regional or national distribution model? How is it evolving and are you planning to sign on more distributors?
We have more than 100 distributors across India, serving close to 10000 dealers. We are focussed on distribution as our main channel and planning some innovative programmes for growth of this channel. DCC: Can you shed some light on the size and growth of the computer multimedia speaker market in India?
Computer multimedia speakers are today the key element of any PC/laptop. In India, where people normally like good sound, multimedia speakers become even more
important. Currently, this market is supposed to be growing at 20% CAGR and will continue to grow at this pace till 2015. DCC: This market seems to be pretty fragmented, with a large number of vendors fighting for market share. Where is Fenda positioned in the market and what are your goals for the next one to two years?
By 2015, we wish to make F&D one of the top 3 brands for multimedia speakers in India, and this will be done by adding more innovative products and also expanding the current channel. In addition to this, we will make good investments in marketing in the next 2 years. DCC: What according to you are the key differentiating factors for vendors in the multimedia speaker market?
One of the key and most important differentiators is after sales service. For a sound to be good, it is equally important that the customer is well aware about the proper placement of speakers. Another major differentiator in coming years will be the design and materials used in these speakers. DCC: Do you think the Indian consumers have begun to look beyond price when they buy products?
In metros and in A-class towns,
Computer multimedia speakers are today the key element of any PC/laptop. In India, where people normally like good sound, multimedia speakers are quite important. One of the key differentiators is after sales service. It is important that the customer is aware about the proper placement of speakers.
customers have become more educated and are choosing more stable and trustworthy brands. Also, factors like ease of use and sound quality are becoming important parameters at the time of purchase. DCC: Can you share details of your service support network?
We have close to 70 technicians across India who provide regular service at dealer point. They are in turn well supported by close to 6 executives who work in back end in terms of stock maintenance, spare dispatch and complaint handling. In addition to this, we also have our own service centres in 10 cities. DCC: What challenges lie ahead for you or your partners and how do you plan to tackle them?
We are mainly focussing on the middle income group, who are still too price sensitive and not value sensitive. The challenge is to make them more educated, so that they can take the right decision while making a particular purchase. Another big challenge is choosing the right formula for marketing in class B&C towns because most of advertising firms which operate in India are focussed on metros and A -Class towns and have little knowledge on innovations possible in small towns. n
cloud computing
VIRTUAL REALITY OF THE CLOUD
Organisations using cloud computing are now taking advantage of virtualisation to make it easier to provision resources
VAMSI KRISHNA
Cloud computing is a term that has grown in ubiquity over the past several years, to a point where it now has multiple meanings and there is a danger that confusion could overtake the market hype. Computing in the cloud is essentially a method of computing in which dynamically scalable resources are provided as a service over the Internet – be it applications, infrastructure, storage or platform. It is the underlying technology, in particular the design changes to servers and processors, that is helping this new, advantageous sector match predictions. For users, cloud computing offers the ability to gain additional resources, such as access to new applications, additional storage space or faster computing, minus the infrastructure complexity that usually comes with these upgrades. A real world example of this would be when a company runs an ad on TV. For the 24 hours after the ad is shown, the company would expect traffic to its website to increase drastically; so it may need to buy additional CPU capacity from a cloud provider for the anticipated period of peak demand. However, to be effective, engineers and IT managers need to have an accurate idea of where those resources are most needed, and therefore, having a transparent infrastructure is a crucial element to being able to allocate resources accordingly. This is the reason many companies deploy virtualisation before they embark on a cloud computing strategy. Virtualisation is a cloud enabler that typically makes it easier to provision resources.
Virtualisation: a cloud enabler Virtualisation enables software to be decoupled from hardware, resulting in multiple operating systems and applications running in virtual machines on the same physical server with resources such as memory and storage allocated according to the need of the software. But, underlying both virtualisation and cloud computing is the server architecture - and a vital component that has enabled both virtualisation and cloud computing to progress at such a pace is the CPU. In order to effectively run virtualisation servers and take the first step to adopting cloud computing, the hardware and the software should be optimised in terms of balancing power consumption and raw performance. By enabling this balance, server overload is addressed and resources for applications and software can be allocated accordingly to run in the most efficient way.
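To make the idea of decoupling software from hardware a little more concrete, here is a minimal sketch (not drawn from the article) that uses the open source libvirt toolkit's Python bindings to define and start a KVM guest with an explicit memory and vCPU allocation. The connection URI, guest name and disk image path are illustrative placeholders, and a production domain definition would carry many more elements.

```python
# Minimal sketch: define and start a KVM guest through libvirt, with memory
# and vCPU explicitly allocated to the workload. Assumes the libvirt-python
# package is installed and a local qemu/KVM hypervisor is running.
import libvirt

DOMAIN_XML = """
<domain type='kvm'>
  <name>web-frontend-01</name>                 <!-- placeholder guest name -->
  <memory unit='MiB'>2048</memory>             <!-- RAM carved out for this guest -->
  <vcpu>2</vcpu>                               <!-- virtual CPUs allocated -->
  <os><type arch='x86_64'>hvm</type></os>
  <devices>
    <disk type='file' device='disk'>
      <driver name='qemu' type='qcow2'/>
      <source file='/var/lib/libvirt/images/web-frontend-01.qcow2'/>
      <target dev='vda' bus='virtio'/>
    </disk>
    <interface type='network'>
      <source network='default'/>
    </interface>
  </devices>
</domain>
"""

conn = libvirt.open("qemu:///system")    # connect to the local hypervisor
dom = conn.defineXML(DOMAIN_XML)         # register the guest definition
dom.create()                             # boot the virtual machine
active = [d.name() for d in conn.listAllDomains() if d.isActive()]
print("Running guests on this host:", active)
conn.close()
```

The same host could carry several such guests, each with its own slice of memory and CPU, which is exactly the provisioning flexibility the cloud model builds on.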
The magical combo
As the need to handle more data, perform faster calculations and access more memory has increased, transitioning from 32-bit x86 processors to 64-bit x86 technology has enabled organisations to keep up with processing demand. This has been followed by an increase in the number of processor cores – allowing multiple tasks to execute at the same time while helping to ensure no one core gets overloaded. Multi-core processing technology has made faster, less power-hungry computing possible. Additional features have also been developed that allow the CPU to take over some functions from the virtualisation software, helping to speed up the computing process.

TO ACHIEVE OPTIMUM UTILIZATION IN A VIRTUALISED ENVIRONMENT, ORGANISATIONS MUST DO A GOOD JOB OF MAINTAINING THE ENVIRONMENT. THIS MEANS DATA AND OTHER ASSETS MUST BE MONITORED, PROTECTED AND PRESERVED.

The increasing functionality of the CPU has had major advantages for users looking at deploying a cloud computing strategy. Essentially, deploying a server with quad-core or six-core processors can help lower power requirements, which in turn helps save on power consumption and helps the server achieve greater energy efficiency. The CPUs nowadays are virtualisation-ready and designed for workload-intense infrastructures, such as those within a data centre, or those needed for cloud computing. The sheer number of servers in a data centre or required for cloud computing can usually be reduced by consolidating them under a virtualisation strategy, and the energy consumption of those servers can be cut back by the CPU in the server. Also, by utilising a low-power CPU, the heat generated by the CPU is minimised and therefore it requires less cooling and reduces fan capacity, saving even more power and hence enabling more cost savings. IDC has predicted that IT spending on cloud services will grow almost 300 per cent, reaching US$42 billion by 2012. As the technology develops, firms are already witnessing a charge-per-use model for CPU power and in the future, this could lend itself to even greater sharing of technology in the cloud, where the majority of businesses ‘rent’ CPU usage, powered by mega-servers running an almost infinite number of CPU cores in giant data centres. By keeping server efficiency high and power consumption low, businesses operating in the cloud could be entering a new era where computing becomes a business accelerator, not just a cost centre.
Vamsi Krishna is Senior Technical Manager, AMD
overview
VIRTUALISATION TO THE MAX IT departments are nowadays asked to do more with less and virtualisation is the answer they are looking for CHARU KHERA
The times were easier when an organisation had just a few servers, had little data to manage, and had very few network nodes. But today, when all of these have reached an unmanageable limit, virtualisation is often touted as the solution that can address the complexity issues. As per VMware, virtualisation is a proven software technology that is rapidly transforming the IT landscape and fundamentally changing the way people compute. In simple terms, virtualisation as a concept lets one run multiple virtual machines on a single physical machine, sharing the resources of that single computer across multiple environments. The technology helps improve the efficiency and availability of resources and applications in an enterprise. A decade back, the common phenomenon in almost every organisation was that the internal resources were underutilised, the model followed was ‘one server, one application’
and IT teams spent too much time managing these servers. But today, virtualisation enables one to run multiple operating systems on a single computer, helps an organisation with reduced capital costs and increased energy efficiency, ensures that all applications perform best, enables business continuity, and also improves the overall enterprise application management. Explaining further, Krishna Upadhyay of Ahmadabad-based Care Office Equipment says, “Virtualisation improves efficiency and availability of IT resources for an organisation. It’s true that it starts by eliminating the old “one server, one application” model and runs multiple virtual machines on each physical machine, thus freeing your IT admins from spending so much time managing servers rather than innovating.” The area in which virtualisation is making inroads is network virtualisation, storage virtualisation (commonly used in storage area networks), server virtualisation, applica-
tion virtualisation and desktop virtualisation. Some of the key vendors that offer virtualisation solutions in India are VMware, Citrix, Microsoft, Dell, IBM, and Red Hat among others.
VIRTUALISATION EXPLAINED
Virtualisation, as a topic, is pretty broad in itself, but let’s get to know some of the key virtualisation concepts.
Network virtualisation: Network virtualisation is the process wherein an enterprise takes the traditional client/server-based services and puts them on the network.
Storage virtualisation: Storage virtualisation helps an organisation boost the value of all its storage assets and enables simplified storage management in a virtualised server environment. Data intensive sectors like BFSI, telecom, retail and government have been early adopters of storage virtualisation.
Desktop virtualisation: Also known as client virtualisation, it separates a personal computer desktop environment from a physical machine using a client–server model of computing. The central machine could be put at a residence, business or data center and all users would be connected to the central machine by LAN/WAN or the Internet. Government, education, and ITeS sectors will be the key adopters of desktop virtualisation solutions.
Server virtualisation: Server virtualisation is a way to share one physical server with multiple people in a way that gives the illusion that each customer has its own dedicated server and thus, a key benefit is flexible resource allocation. Frost & Sullivan estimates that server virtualisation is the forerunner in the adoption of virtualisation due to the benefits it offers to the organisation – namely consolidation, reduced operating expenditure, and limited impact on user operations. As per the firm’s estimates, in 2008, 24.5 percent of the servers sold were virtualisation enabled and this figure is expected to grow to 60.2 percent by 2015. The entry of Microsoft will further change the dynamics of the market, thereby making server virtualisation solutions more affordable.

KEY BENEFITS OF VIRTUALISATION
• Enhances business agility and flexibility.
• Desktop virtualisation allows running multiple operating systems simultaneously on a single machine.
• Server virtualisation helps organisations with power conservation and also significant cost savings.
• Helps ease complexities of an organisation and consolidate applications at a centralised level.
• It can also be used for disaster recovery and securing desktop environments.
• Run multiple operating systems on a single computer including Windows, Linux and more.
• Let your Mac run Windows, creating a virtual PC environment for all your Windows applications.
• Reduces capital costs by increasing energy efficiency and requiring less hardware while increasing your server to admin ratio.
• Ensures that your enterprise applications perform with the highest availability and performance.
• Builds up business continuity through improved disaster recovery solutions and delivers high availability throughout the datacenter.
• Improves enterprise desktop management and control.
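On the hardware side, a “virtualisation enabled” server generally means a processor that exposes the Intel VT-x or AMD-V extensions. As a rough, illustrative check (not part of the original article), a Linux host advertises these as the vmx and svm CPU flags:

```python
# Check whether a Linux host's CPUs advertise hardware virtualisation
# support (Intel VT-x shows up as the "vmx" flag, AMD-V as "svm").
def has_hw_virtualisation(cpuinfo_path="/proc/cpuinfo"):
    try:
        with open(cpuinfo_path) as f:
            for line in f:
                if line.startswith("flags"):
                    flags = line.split(":", 1)[1].split()
                    return ("vmx" in flags) or ("svm" in flags)
    except OSError:
        pass  # not a Linux host, or /proc unavailable
    return False

if __name__ == "__main__":
    print("Hardware virtualisation supported:", has_hw_virtualisation())
```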
KEY TRENDS Though the overall market for virtualisation in India is still nascent, businesses are waking up to the concept of virtualisation. Gartner predicts that virtualisation will be part of nearly every aspect of IT by 2015. As per Frost & Sullivan, the trend is expected to influence the end-to-end infrastructure of an enterprise, namely server, storage, network, application, desktop, data and so on. As per Vishak Raman, Regional Director, SAARC and Saudi Arabia, Fortinet, a growing number of companies are now deploying virtualisation capabilities. Those with multiple sites or different, clearly separated business
units or departments are progressively relying on virtualisation.
CHALLENGES
KRISHNA UPADHYAY, DIRECTOR, AHMADABAD-BASED CARE OFFICE EQUIPMENT
“VIRTUALISATION IMPROVES EFFICIENCY AND AVAILABILITY OF IT RESOURCES FOR AN ORGANISATION. IT RUNS MULTIPLE VIRTUAL MACHINES ON EACH PHYSICAL MACHINE.”
Though virtualisation offers enterprises numerous benefits and reduces the physical requirements of the data center, a key challenge faced by most organisations, as per SPs, is the increased level of management and complexity of servers. Another challenge faced by an SP who is looking at virtualisation as a potential area for business could be the lack of technical know-how of the technology. “Virtualisation requires regular technical upgrades and a proper understanding of the technology, and post deployment, a key challenge an SP could face is to make sure that the customer understands well that adopting virtualisation does not mean large overheads and multiple servers,” explains Rahul Meher, Managing Director, Leon Computers, Pune. Then, there are security threats to virtualisation as well. “With virtualisation, enterprises have multiple applications lying on one server and if a hacker manages to break into the virtualisation layer, he could have access to all the applications hosted on the server and could attack all hosts sitting on that server. To manage this and provide security could be a key challenge,” adds Upadhyay.
THE ROAD AHEAD
SANCHIT VIR GOGIA, SENIOR RESEARCH ANALYST, SPRINGBOARD RESEARCH
“Virtualisation has quickly evolved as a technology and will continue to provide numerous benefits. The more it is propagated by vendors, the more it will be used by enterprises.”
Indian enterprises have started to understand the concept of virtualisation quite clearly. Solution providers as well as vendors like VMware, Citrix, Microsoft, Red Hat, etc are all pitching hard and aggressively launching their products and solutions in the Indian market. A recently conducted survey by Cisco shows that virtualisation will be the top networking investment over the next five years, as CIOs continue to be hyper-focused on reducing IT costs and organisations are increasingly adopting and will continue to adopt virtualisation as an energy, space and cost-saving measure. According to Sanchit Vir Gogia of Springboard Research, “Virtualisation has quickly evolved as a technology and will continue to provide numerous benefits. The more it is propagated by vendors, the more it will be used by enterprises - broadening its value at each step.” n charu.khera@9dot9.in
open source
TWO TO TANGO While virtualisation has been a major success delivered in a proprietary form, its marriage with open source goes beyond today’s virtualisation achievements BERNARD GOLDEN
Virtualisation offers enormous benefits to IT organisations. However, several factors have limited its adoption. First, it has traditionally been the province of proprietary vendors. Some of these vendors are hardware manufacturers who provide virtualisation software that enables better use of their hardware. Unfortunately, this approach allows the software to be used only on hardware from that manufacturer. For organisations that use hardware from multiple vendors, this is clearly undesirable, since it presents the need to run multiple virtualisation products, which increases infrastructure complexity and reduces operational efficiency, thereby raising costs. A more attractive proprietary approach comes from independent software vendors (ISVs); their software can be run on servers from many different manufacturers. This enables organisations to reduce the variety of virtualisation software needed and to use a consistent virtualisation infrastructure throughout their data centre. Using ISV software from a vendor like VMware is clearly more attractive than attempting to manage multiple
products from several hardware manufacturers. However, using an ISV’s virtualisation offering presents other shortcomings. Relying on a single proprietary software vendor, while reducing complexity, creates dependency on that vendor. This dependency can also be characterized as vendor lock-in, symbolizing the fact that the user organisation is trapped into a restrictive relationship, dependent upon the vendor’s product decisions and time frames. From an end user perspective, this situation is extremely undesirable, since it means that end user plans are subject to the feature road map and release schedule of the vendor. If a user organisation needs a particular feature to meet business requirements, it may be unable to obtain that feature in the necessary time frame. An associated shortcoming of the single vendor decision is early retirement; in this scenario, a vendor ends support for a software product, forcing the end user to upgrade or modify its infrastructure on a time frame not of its own choice. Single ISV providers of virtualisation present another kind of challenge in terms of evolution of the product and, crucially,
inclusion of new technology innovations in the product. For virtualisation products that come from proprietary ISVs, integration with these new technology products can proceed only at the pace of the ISV. This means that important technology innovations may not be available in the time frame a user organisation desires. Finally, of course, proprietary software is typically associated with high prices. Many users feel that virtualisation products carry a burdensome price tag, diminishing the potential attractiveness of the technology. The high price of proprietary virtualisation software means that many organisations are precluded from using it or are applying it in useful, but less critical applications. So, open source offers significant benefits when compared to the proprietary approach to virtualisation.
Open source virtualisation Proprietary software means that a single vendor controls the code base of the product. Only the vendor’s employees can make changes to the product’s code. Moreover, people or companies
with innovative ideas, unless they work for the vendor, are precluded from direct contribution to the product and forced to work through the proprietary vendor. Naturally, this arm’s length approach retards the evolution of the product. By contrast, open source products welcome participation from a much larger pool of contributors. New ideas and technical approaches can be incorporated into the virtualisation product much more rapidly than a proprietary counterpart. Furthermore, support for complementary products such as new networking technologies or processor components can move forward very quickly. The developers of the technologies can directly participate in developing support for their products in the virtualisation software. This means that innovations can be tested and incorporated by the experts in the technology, thereby accelerating final support of the technology in the product. Naturally, this has the effect of rapidly improving the functionality of the open source product. The ability of open source to improve more quickly than proprietary software has been seen in a number of software markets, and this dynamic is present in the virtualisation market as well.

The Business Benefits of Virtualisation
Better hardware utilization: By implementing virtualisation on data centre servers, IT organisations can raise the utilization of their servers from 10% or 15% to as high as 70% to 75%, achieving much better efficiency while still allowing performance headroom for demand spikes. Higher utilization rates also make more efficient use of corporate capital (see the back-of-the-envelope sketch after this box).
Reduced energy costs: Running servers at low utilization rates not only wastes capital, it wastes energy spending as well. Even though a machine may be running at only 15% of its capacity, the machine may still use almost the same amount of energy as if it were running at 90% utilization. By using virtualisation, fewer pieces of hardware are required, and thereby energy use is reduced.
Reduced crowding in data centres: Many organisations have maxed out their data centre capacity. They cannot fit any more machines into the rack space they have available. This is a critical issue, since the cost of increasing space is typically not incremental; increasing data centre space easily can run into tens of millions of dollars. Virtualisation reduces the number of servers needed and obviates the need to increase data centre floor space.
Better operational efficiency: All hardware needs to be maintained and repaired. Sprawling data centres packed full of machines impose large operational costs as systems are taken out of service for repair and upgrade. Using less hardware means less system administration work and enables better operational efficiency.
Contribution One of the key challenges for proprietary vendors is the limited perspective they bring to the development of their software. This is inevitable, since they have less experience using their software than real-world end users. No single company can know as much about the variety of applications of their software as the overall ecosystem of system users. The ability of complementary technology providers to directly test and integrate their products offers the opportunity for their domain knowledge to be included in the open source product. This means that the project benefits from the detailed knowledge of these members of the product ecosystem. Other members of the ecosystem can directly contribute to the product as well. If we look at Linux, it’s clear that much of the code contribution to the project comes from individuals employed by vendors providing products complementary to the operating system. By having their employees working on the Linux code base, these vendors are able to ensure that drivers for their products work well with Linux. Furthermore, by participating in the product development community, these vendors are also able to ensure that developers working in other portions of the kernel are aware of hardware requirements and can keep these requirements in mind while designing and coding their work. This kind of consistent, engineer-toengineer communication pays dividends well beyond the kind of business development relationship typically engaged in by proprietary companies. By collaborating on open source work, members of a product’s ecosystem can improve the product’s support of new hardware and software technologies. Contributions by domain experts can extend well beyond vendors that manufacture products for the domain. For many open source projects, sophisticated end users also contribute source code improvements to the project, thereby ensuring that innovative product functionality is integrated into the product. This can extend to product features, which, were the product proprietary, would not be deemed important enough for the vendor to implement. In open source, there is a way for these features to be implemented and offered to other product users.
Inclusion in operating system
In the proprietary world, as companies seek to protect their markets, product integrations are often implemented by inferior technical solutions. While not desirable from a product use perspective, these integrations are necessary to ensure that each vendor’s product can continue to be sold separately. Put another way, in the proprietary world, technical usefulness often takes a back seat to marketplace realities. The less desirable technical outcomes manifest through complex and unstable integration mechanisms, as well as through poor levels of product performance. By contrast, open source technologies can be contributed and directly integrated into the operating system kernel. This means that integration is high performance and elegant.
The inclusion of features in the operating system also pays benefits to end users in terms of reduced costs. Because features are integrated directly into the operating system, it is no longer necessary to purchase and install additional software products. Instead, the functionality is easily available in the base operating system, often made available by nothing more complex than filling in an option during operating system install. Furthermore, with the inclusion of key functionality into the operating system itself, ongoing upgrades and maintenance is easier as well. Instead of the end user needing to keep track of multiple products’ release schedules and installing and testing a software stack each time one of the products is updated, the software distributor takes care of ensuring that the products are kept up to date and released in a consistent, tested package.
Lack of vendor lock-in User organisations have learned from the past. Ceding control of significant parts of their computing infrastructure imposes significant costs and risks upon them. While vendors cite the benefits of standardizing on a single vendor’s product stack, users have learned through experience that these benefits are inevitably accompanied by significant drawbacks: poor responsiveness, ongoing mandatory upgrades, and inability to influence product direction. Many organisations have concluded that the latter drawbacks outweigh the benefits of single vendor dependence. In fact, the industry term for this relationship is “vendor lock-in,” which illustrates
the restrictions users experience when dependent upon a single vendor. Open source software, by its very nature, reduces vendor lock-in. The source code for every open source product is available for any user to examine, modify, and redistribute. Consequently, this means that should a user be dissatisfied with an open source vendor, the user can terminate their relationship with the vendor and look to other avenues for development and support for the product. These other avenues can include other vendors, other members of the product community, or even the enduser. The fact that end users have options beyond the vendor tends to discipline vendors and ensure a positive working relationship. With regards to virtualisation, the lack of vendor lock-in means that, should a customer or user decide that they wish to move to a different vendor, it’s easy to migrate. Since all participants in open source share a common code base, vendors are precluded from holding users hostage. Should a user of Linux-based virtualisation choose to move away from their current supplier, they can do so.
Community
Community is one of the most impressive aspects of open source. Consisting of developers, users, and other members of the product ecosystem like hardware manufacturers and system integrators, the community provides a rich resource for learning, support, and collaboration. The concept of community extends well beyond people contributing code to a project. Because of the transparency typical of open source, there is open and frequent conversation among all community members. Community provides a way for a broad range of users to contribute their knowledge and experience to the product, unlike the proprietary world where only some customers are able to make their views and requirements known. The meritocracy of community means that all users can share their perspective. Moreover, community members can contribute to the product, even if they do not submit code. Bug reports, tutorials, and technical support are all valuable contributions from community members. One has only to look at the mailing list or forums of a typical open source product to recognize the way that community members assist one another, often enabling product users to solve problems much more quickly than in a commercial support arrangement. In the area of open source virtualisation, the involvement of community members has enabled awareness and product uses to spread much more quickly than if the product were commercially based.
Cost Last, but certainly not least, is the cost advantage open source presents versus the proprietary competition. Because of the licensing conditions proprietary software is distributed under, proprietary products require a costly license fee prior to use, with mandatory maintenance fees a pre-requisite for continued use and support. By contrast, the expansive licensing conditions of open source imply that software is freely available for download at no cost. Users have full access to the product with no requirement for a financial transaction. And, while commercial support and services are available if desired, there are no financial requirements for continued use of the product. Consequently, open source software can be far more cost-effective than proprietary software. The cost advantage of open source is very important with respect to virtualisation software. Because previous virtualisation products have been an add-on software product, carrying their own licensing fee, users have been forced to find additional budget in order to implement it. Furthermore, because commercial virtualisation products are so expensive, IT organisations have been forced to move to dense virtualisation infrastructures in which they load the largest possible number of virtual machines per physical server in order to reduce the overall cost of virtualisation. Clearly, this high density poses risk, since maximizing server load means that any individual failure can affect many systems. Despite the manifest business benefits of virtualisation, the high cost of commercial virtualisation has put IT organisations in an awkward position. They need to spend significant sums on software before beginning to realize the financial benefits of virtualisation technology. The high cost of commercial virtualisation certainly has slowed the pace of adoption of virtualisation. But with the advent of open source virtualisation, the adoption of virtualisation technology is expected to sky-rocket. n (This article has been adapted from a white paper sponsored by Red Hat.) Bernard Golden is the CEO of Navica.
virtual desktop
DEMYSTIFYING
DESKTOP VIRTUALISATION
Desktop virtualisation offers several business benefits such as lower cost of ownership and ease of management, but challenges in implementation remain SHIV KUMAR
There is a lot of hype around desktop virtualisation, which is also referred to as ‘Hosted Virtual Desktop’. One common virtualisation goal is to improve the way in which IT manages and optimizes its resources. This may take the form of improved service level, increased peak capacity, reduced configuration costs, and reduced operating expenses. One or more of these goals is often achievable through desk-
top virtualisation if designed and implemented in the right manner. Perceived vir tualisation savings may also take the form of reducing the amount of hardware that IT purchases or manages. Although there may be some concerns about the upfront investment on the hardware and storage, there is little doubt that desktop virtualisation has value and is here to stay. Virtualisation at the desktop level promises many desirable attributes such as centralization, better management, improved security, lower operational costs, and remote and branch access that are difficult to achieve in traditional PC environments. One of the prime drivers of desktop virtualisation is that the cost of managing PCs continues to rise, dwarfing the cost of acquisition easily by 2 to 1. According to IDC, the costs associated with managing PCs could be well over $1,000 per PC per year and there are 1.2 billion PCs world-
wide, of which 50% come from enterprises. Businesses want technology solutions and infrastructure strategies that contribute to IT optimization, lower TCO, increased ROI and ease of use, and that empower greater staff productivity. Desktop virtualisation can also contribute to green IT adoption, as the virtualised desktop helps a firm stretch its hardware for more like five or six years, rather than the standard three-year PC refresh cycle. In this context, the market for virtualisation has huge potential, with Microsoft, Citrix and VMware competing for the major market share along with a few other players such as VERDE, Red Hat, Novell SUSE, Oracle and Parallels. Unfortunately, most of the customers who get impressed by desktop virtualisation end up holding off because of the upfront hardware costs and expensive licenses. Another factor hindering the wider adoption of virtualisation is the sheer number of varieties it comes in - even the vendors get confused.
Desktop virtualisation readiness
A typical desktop virtualisation implementation requires the following three major components: server virtualisation software to host desktop images; broker software to connect users with their desktop environment; and tools for managing the provisioning of virtual desktops and images. Moving from the physical server world to a virtual infrastructure requires adherence to industry best practices for virtualisation. Best practices include activities and procedures that create outstanding results in a given situation. These practices can then be efficiently and effectively
adapted to other situations. Desktop virtualisation is a major infrastructure overhaul that can lead to resistance from users and the business if performance levels suffer because of mistakes at the design and implementation stages. Therefore, an organisation needs to pay attention to hardware compatibility lists. When deploying virtualisation technologies in a production environment, it needs to make sure that all host hardware is on the virtualisation software vendor’s hardware compatibility list. Users who need a lot of capacity on their local desktop or ones who often work offline are not ideal candidates for desktop virtualisation. Most customers will begin their desktop virtualisation project by piloting Proof-Of-Concept (POC) implementations. These projects tend to be limited in scope, involving small subsets of users as a part of larger client-computing implementations. For these POCs , Microsoft, VERDE, VMware or Citrix’s products will work effectively to carry out the evaluation well. It is advisable for firms to test and evaluate the products from the performance and ease of use perspective,
and to make a decision at the end of the POC cycle. The software vendor or the implementation provider of desktop virtualisation must offer all three necessary components in combination (server virtualisation platform, brokering software and session management layer, and management tools). Some may be brought in from third parties, but the vendor or solution provider must take responsibility for integration and implementation. Factors that will influence the product choice include the number of users, users’ locations, application type and performance requirements, and manageability requirements. It’s important to remember that an established relationship with a trusted channel partner or systems integrator should have weight in the decision process. The major desktop virtualisation software vendors are heavily dependent on their channel for delivering their product and implementation services. Furthermore, many channel partners are vendor-neutral and offer a solution which is priced and featured right for the business.
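As a purely illustrative model of the “broker software” component mentioned in the readiness checklist above (and not any particular vendor's product), the sketch below captures the basic job a connection broker performs: handing an authenticated user an available virtual desktop from a pool and reclaiming it afterwards.

```python
# Toy connection broker: assigns each user a virtual desktop from a pool.
# Names are hypothetical; a real broker would also handle authentication,
# session reconnection, power management and protocol hand-off (ICA/RDP).
class DesktopBroker:
    def __init__(self, pool):
        self.free = list(pool)        # VMs ready to accept a session
        self.sessions = {}            # user -> VM name

    def connect(self, user):
        if user in self.sessions:     # reconnect to an existing session
            return self.sessions[user]
        if not self.free:
            raise RuntimeError("no desktops available in the pool")
        vm = self.free.pop(0)
        self.sessions[user] = vm
        return vm

    def disconnect(self, user):
        vm = self.sessions.pop(user, None)
        if vm:
            self.free.append(vm)      # return the desktop to the pool

broker = DesktopBroker(["vdi-desktop-01", "vdi-desktop-02", "vdi-desktop-03"])
print(broker.connect("asha"))   # -> vdi-desktop-01
print(broker.connect("ravi"))   # -> vdi-desktop-02
```

A production broker layered on XenDesktop or View would of course add far more around this core assignment step, which is part of why integration and testing effort matters in the product choice.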
Planning
Storage and network bandwidth management should be planned properly during desktop virtualisation design and deployment. A lot of resources can be wasted if storage and network optimization is not done properly. For example, organisations looking to deploy across complex network configurations, where guaranteed bandwidth is unknown and network latency is an issue, tend to choose Citrix’s XenDesktop because of the ICA protocol. Although VMware has partnered with other protocol vendors such as HP and Oracle/Sun, most customers deploying VMware’s View do so by using RDP. If desktop virtualisation is part of a broader application delivery strategy that also involves server-based computing and application streaming, then Citrix’s XenApp could be a complementary solution to XenDesktop. Adding more products to the mix adds complexity. However, specialized solution providers or systems integrators can help to address the application integration complexity issues during design and testing.
Shiv Kumar is Executive Vice-President, Zylog Systems Ltd

Security goes VIRTUAL
Virtualisation of security features in all likelihood heralds the next generation of firewalls and is proving to be profitable for organisations as well
VISHAK RAMAN

The virtualisation of security features is entering the next generation of firewalls. What started as the integration of security functions into a single appliance has evolved into completely virtualised firewalls, which is game-changing for traditional firewalls.

Game-changing dynamics
The pooling of security functions such as inspection firewalling, anti-virus, intrusion prevention and detection (IPS/IDS), anti-spam, web content filtering, traffic shaping, and dynamic routing in a single appliance is completely virtualised. IDC has predicted that this market would grow to twice the size of today’s market for traditional firewalls and VPNs (Virtual Private Networks). Previously, this was known as Unified Threat Management (UTM). This concept has been taken a step further by vendors who completely virtualise all the integrated UTM security functions. Fortinet, for instance, can partition its FortiGate UTM firewall into several, separately managed and provisioned instances. With Fortinet, these instances are called VDOMs, short for Virtual Domains. On the largest FortiGate multi-threat security appliances, customers can operate up to 4,000 virtual UTM firewalls, but the smaller models also offer virtualisation functionality. To communicate between multiple virtual firewalls, Fortinet allows the activation of Inter-VDOM routing. This involves packets being routed internally between the virtual firewalls, making communication via physical network interfaces redundant. This results in savings in terms of physical network interfaces and increased performance. Physical network interfaces can be virtualised via Virtual LANs (VLANs). Under the right conditions, up to 4,000 virtual VLAN interfaces can be used simultaneously. Apart from UTM firewall functions, static and dynamic routing can also be virtualised.

Roots
Virtualisation of firewalls is not an entirely new topic in the field of network security. For years, carriers, Internet service providers (ISPs), hosting and managed security service providers (MSSPs) have been virtualising traditional network firewalls for their customers. They primarily used larger, redundant cluster firewall systems shared by several end customers. Each customer could use its own virtual firewall, with appropriately separated configuration capabilities, to deliver savings in terms of hardware and software licences, and this enabled providers to offer their customers cost-effective and highly available firewall services. Today, all the other UTM security functions can be virtualised. At the touch of a button, these features can be set up within a virtual firewall. Even the operating mode can be combined as required: one virtual firewall can, for example, run in NAT/route mode, while a second operates in transparent mode (layer 2). Firewall, IPS and anti-virus functions can run on the first instance, and a pure web filter on the second. A growing number of companies are now deploying virtualisation capabilities. They find the necessary flexibility in increasingly complex enterprise networks, in particular with virtualisation of complete firewall functions or in the virtualisation of network interfaces. Companies with multiple sites or different, clearly separated business units or departments are progressively relying on virtualisation. The administration can be delegated to various administrators who see and manage only their own virtual firewall.
also to introduce increased revenue opportunities for the channel. The early adopters of virtualised UTM have been the organisations with
Virtualising UTM
UTM has always been about doing more with less, and whilst not always considered overtly green, there is little doubt that replacing a multitude of boxes out in the field with a single high-performance platform is certainly efficient. Better security effectiveness, higher cost savings, easier management and optimum space and power consumption are the driving forces behind both virtualisation and UTM, so in combination their benefits are multiplied. Numerous security players have been trying to get in on the act by enabling their individual security functions to be virtualised. It's all very laudable, but limited for the end-user and the channel. Instead, virtualising UTM (that is, integrated security functions such as IPS, AV, anti-spam, firewall and content filtering, together with the switching functions that operate in tandem) represents a far more powerful proposition, not only for an end-user's environmental efficiency and security protection, but also for the increased revenue opportunities it introduces for the channel.
VISHAK RAMAN
VIRTUALISED UTM CAN PROVIDE SUMPTUOUS REVENUE OPPORTUNITIES FOR THE CHANNEL, AS MANAGED SECURITY SERVICES OPEN UP THE PROSPECT FOR MAXIMUM VALUE-ADD WITH MINIMUM EFFORT.
The early adopters of virtualised UTM have been the organisations with the biggest networks and the biggest traffic demands. For example, HEAnet, the service provider operating Ireland's research and education network, is charged with providing numerous security functions for over 800,000 users and has accomplished this feat with a solution that occupies just two data centre racks. UTM is increasingly popular amongst larger enterprises as they begin to seek the economic, performance and efficiency benefits of multi-threat security. Channel players can take full advantage of this trend with virtualised UTM, by developing Managed Security Service Provider (MSSP) capabilities. Many resellers already providing UTM solutions for customers have witnessed an increased demand to turn these security functions into services. Those channel players that have already answered the call of the MSSP have been reaping the benefits of virtualised UTM for almost 18 months now. Those seeking to meet this need for security services, often delivered on a 'per seat, per month' basis, have found a simple route to this recurring revenue opportunity by investing in a high-end UTM infrastructure themselves. Rather than spending vast sums of money and time building distinct security architectures for each service offering (managed firewall, managed IPS, managed AV etc.), the use of a centralised UTM architecture is a near-perfect method for dramatically reducing capital and operating costs, as well as providing the most flexible services possible to customers. We all know virtualisation is green and so is UTM; for a reseller, combining these two initiatives is a blessing for end-users as they look to keep up with the flood of corporate environmental initiatives. More than keeping up with customer demand for green IT, however, virtualised UTM can provide sumptuous revenue opportunities for the channel, as managed security services open up the prospect of maximum value-add with minimum effort. In the future, hardly any firewalls will be purchased without virtualisation and UTM functionality. The growing demand for security functions, increasingly complex networks, and the pressure for companies to be cost-efficient speak for themselves. n Vishak Raman is Regional Director, SAARC and Saudi Arabia, Fortinet.
server consolidation
Doing MORE with Much LESS
Virtualisation is one key technology strategy that can help any SMB transform IT into a competitive advantage
SATYEN VYAS
The current economic environment is challenging for any company – managing cash flow, profitability and customer requirements in a sea of change can test even the most experienced corporate manager. The same conditions are amplified for small and medium businesses that are expected to compete with large enterprises for the same pool of customers using a fraction of the resources. SMBs are faced with a classic dilemma: while IT is critical to meeting customer and business requirements as well as accommodating business growth, budget restraints mean SMBs have fewer resources to invest in IT.
SMBS ARE FACED WITH A CLASSIC DILEMMA: WHILE IT IS CRITICAL TO MEETING CUSTOMER AND BUSINESS REQUIREMENTS AS WELL AS ACCOMMODATING BUSINESS GROWTH, BUDGET RESTRAINTS MEAN SMBS HAVE FEWER RESOURCES TO INVEST IN IT.
Virtualisation as an answer
One of the most exciting new technologies to emerge in recent years is virtualisation. It's the ultimate "do more with less" answer for business. The technology increases the percentage of work each server is doing so that companies need less hardware to do more work – transforming the work of one server into that of many. The savings benefits:
• Buying fewer servers helps manage the IT budget;
• Deploying fewer servers means more
space is freed up in facilities;
• Lowered electricity and cooling costs (key parts of maintaining hardware).
In short, virtualisation allows a company to cut its IT budget yet do more with existing resources: an excellent solution to the ultimate SMB IT dilemma. Although virtualisation is still relatively new, its benefits have been proven time and time again. In fact, Gartner reports that by 2012 most midsize businesses are expected to consider taking 100 percent of their servers virtual, primarily driven by savings from server consolidation. Real proof of savings helps to boost the case for virtualisation: even Dell has reaped the benefits, implementing virtualisation strategies in its own facilities with dramatic cost savings. Dell has used virtualisation to consolidate thousands of servers to date.
VIRTUALISATION IS ONE KEY TECHNOLOGY STRATEGY THAT CAN HELP ANY SMB TRANSFORM IT INTO A COMPETITIVE ADVANTAGE. BUSINESSES OF ANY SIZE CAN TAKE ADVANTAGE OF THIS PROVEN TECHNOLOGY.
The effort has led to cost savings of more than $29 million when the impacts of reduced hardware purchases, reduced space requirements and reduced power use are all counted.
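As a back-of-the-envelope illustration of where such savings come from, the short sketch below estimates what retiring physical servers could be worth in avoided hardware spend and running costs. All figures are invented for the example; they are not Dell's cost model.

# Rough consolidation-savings estimate; all input figures are illustrative assumptions.
def consolidation_savings(physical_servers: int,
                          consolidation_ratio: float = 4.0,
                          server_cost: float = 4000.0,        # avoided purchase per retired server
                          rack_space_cost: float = 300.0,     # per server per year
                          power_cooling_cost: float = 500.0): # per server per year
    hosts_needed = max(1, round(physical_servers / consolidation_ratio))
    retired = physical_servers - hosts_needed
    capex_avoided = retired * server_cost
    opex_per_year = retired * (rack_space_cost + power_cooling_cost)
    return hosts_needed, capex_avoided, opex_per_year

hosts, capex, opex = consolidation_savings(8, consolidation_ratio=4.0)
print(f"8 servers -> {hosts} hosts, ~${capex:,.0f} avoided hardware, ~${opex:,.0f}/year running costs")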
Real world impact
What does this mean for the average SMB? When first introduced, virtualisation was only practical for the largest of enterprises and cost millions to implement. New technologies have made virtualisation more accessible to companies of all sizes. For example, one SMB that has taken advantage of these technologies is Data Guard Systems. Although the company had filled a data centre in Boston, it still needed more power to meet its computing needs. Dell recommended a virtualisation solution that cut the number of machines the company needed while increasing available computing power. Data Guard realized immediate savings of almost $20,000 per month simply from a 50 percent reduction in power needs, and estimated additional savings of nearly $60,000. The company was able to immediately expand its computing power within its existing data centre. By going all-virtual, Data Guard was able to use the money and time saved on infrastructure to innovate instead of just maintaining the status quo. The company was also able to quickly scale up and deploy the new managed encryption services without having to deal with the physical limitations of upgrading the data centre. As a result, Data Guard had the service up and running for its partners and customers in a matter of minutes instead of several days. Implemented at the SMB stage, virtualisation can enable the average company to consolidate eight servers to two – already a dramatic saving. The technology allows workloads to be redistributed across virtual machines, making the movement of applications between them seamless. Company employees will have all of the tools they need to meet customer needs, but the underlying IT infrastructure will be dramatically more efficient and less costly. Virtualisation is one key technology strategy that can help any SMB transform IT into a competitive advantage. Businesses of any size can take advantage of this proven technology. It enables SMBs to compete with the largest of enterprises, as the playing field gets more level with each passing day. n Satyen Vyas is Director, Advanced Systems Group – SMB, Dell India.
security compliance
A Question of BEST PRACTICES
What are the key things to keep in mind for security compliance in a virtualised environment? Here you go
The following best practices are widely accepted in the field of information security. It is important to recognise that they apply not only to securing virtual systems but also to physical systems. However, there are some unique aspects that must be managed when virtualisation is introduced.
Platform hardening Virtualisation adds new layers to the computing infrastructure, including the hypervisor layer which allocates the hardware resources of the “host” to each of the virtual machines or “guest” operating systems. Virtualisation infrastructure also includes virtual networks with virtual switches connecting the virtual machines. All of these components which, in previous systems, used to be physical devices, are now implemented via software. Virtualisation also introduces a new administrative layer for managing the virtualisation infrastructure. To reduce the risk of unauthorized access, just as with physical systems, the hypervisor layer and the administrative layer must be properly “hardened.” System hardening helps minimize security vulnerabilities by taking a series of actions such as configuring the virtualisation platform with secure settings, removing unused components and applying the latest patches. Organisations should work with internal and external auditors in selecting the right hardening guide.
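One practical way to act on such guidance is to express the chosen hardening checklist as data and compare every host against it. The sketch below illustrates the idea; the setting names and expected values are assumptions, not items from any particular hardening guide.

# Compare a virtualisation host's settings against a hardening baseline.
# Setting names and values are illustrative, not from a specific benchmark.
HARDENING_BASELINE = {
    "ssh_enabled": False,          # no direct shell access to the hypervisor
    "unused_services_removed": True,
    "mgmt_network_isolated": True,
    "patch_level": "2010-05",      # latest approved patch bundle
}

def audit_host(host_name: str, actual_settings: dict) -> list:
    findings = []
    for setting, expected in HARDENING_BASELINE.items():
        observed = actual_settings.get(setting)
        if observed != expected:
            findings.append(f"{host_name}: {setting} is {observed!r}, expected {expected!r}")
    return findings

for line in audit_host("esx-host-01", {"ssh_enabled": True, "patch_level": "2010-03"}):
    print(line)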
Configuration and change management
In virtualised environments, configuration and change management is governed by the same principles as in the physical environment. First, it is important to ensure that the system has been set up properly ("hardened" as discussed above). Then, on an on-going basis, it must maintain a "gold standard" of configuration – it must have all the right settings and patches and meet any required security policies. As changes occur, a good change management process ensures that a virtual system continues to meet the "gold standard," and that any changes made are limited to authorized changes.
As in the physical environment, the goal of configuration and change management is to reduce the risk of someone exploiting a vulnerability caused by an error or omission in the way the machines have been set up, and then gaining unauthorized access. In a virtualised environment, such configuration and change management must be applied to the virtualisation software in addition to the virtual machines it hosts. In virtualised environments, there is no longer a one-to-one relationship between a physical host and a server. System administrators can no longer rely on knowing that a particular physical host is running a particular server, with a particular configuration. Now, a virtual machine can run on one of many physical hosts, while a host can run a wide variety of virtual machines. This association changes dynamically, making it difficult to keep up with changes. The result is that configuration and change management practices become even more critical in virtualised environments. While this may require extra diligence, it is worth the time to ensure that well-documented configuration and change management processes exist which include approvals by appropriate personnel. To meet compliance requirements, organisations will have to prove to internal and/or external auditors that good configuration and change management procedures are in place. The underlying virtualisation host must be properly configured, as well as the association between the host and guests, the network configuration and the storage infrastructure. There are configuration management tools that can help maintain
conformance to the "gold standard" by generating alerts on any deviations. Organisations should take advantage of the tools that already exist in their physical environment in order to lessen the burden of managing conformance within a virtual infrastructure. One characteristic of a virtual environment that drives extra diligence in change management is that, in a virtual world, things can be done or changed extremely quickly. The ability to make fast changes is a great benefit. Resources can be allocated quickly when and where they are needed, but there is also a need to mitigate the risk of malicious activity and errors just as quickly. If organisations lack good change management, the speed factor can exacerbate weaknesses in existing processes. Organisations should consider instituting a proper workflow for changes in order to obtain all necessary approvals and ensure servers are properly configured before they are deployed. Good change management processes will enable organisations to save time and reduce risk. A related challenge is VM mobility. Virtual machines are a lot more mobile than physical machines; servers and applications can be moved to hardware within the same data centre or even to a data centre located across the planet. Again, this is an enormous benefit, enabling optimization of resources. However, there is still the need to mitigate the risk that a server will be moved to an unauthorized location. To meet requirements regarding the physical location of data, organisational security policies should dictate any restrictions regarding where certain VMs can be located. These policies must then have corresponding controls in place. Virtualisation systems that allow VM mobility to be controlled should be used, so that servers that process protected data cannot be deployed to or moved to unauthorized locations.
Another area that may require special attention is bringing offline VMs online again, which is very easy to do. If a VM has been offline for too long, it may be out of date with respect to patches, anti-virus signatures and so on. Organisations should be aware of any test or development environments that may be turning VMs on and off. A possible scenario: a developer takes some VMs, uses them for a week or two, and turns them off; two months later, he or she goes back to that particular project and brings them online again without proper patches. Also watch for redundant servers that will only be used when other servers go down. Processes should be developed to ensure that before any servers are brought online, they meet the gold standard, have all required patches and meet the expected secure configuration profile. Finally, another crucial aspect is patch management. Guest virtual machines and the virtualisation infrastructure itself must be regularly evaluated and the latest security patches applied, using the standard processes that most organisations follow for patching systems within their physical infrastructure.
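The "gold standard" and the offline-VM problem both lend themselves to a simple automated gate that runs before a virtual machine is powered on or brought back online. The following sketch is illustrative; the field names and the 30-day staleness threshold are assumptions.

# Gate a VM before it is powered on: check config drift and patch/signature age.
# Field names and the 30-day threshold are assumptions for the example.
from datetime import date, timedelta

GOLD_STANDARD = {"firewall_profile": "strict", "monitoring_agent": "installed"}
MAX_STALENESS = timedelta(days=30)

def ready_to_power_on(vm: dict, today: date = date(2010, 5, 1)) -> list:
    problems = []
    for key, expected in GOLD_STANDARD.items():
        if vm.get(key) != expected:
            problems.append(f"config drift: {key}={vm.get(key)!r}, expected {expected!r}")
    if today - vm["last_patched"] > MAX_STALENESS:
        problems.append("patches older than 30 days - remediate before going online")
    if today - vm["av_signatures"] > MAX_STALENESS:
        problems.append("anti-virus signatures out of date")
    return problems

dev_vm = {"firewall_profile": "strict", "monitoring_agent": "installed",
          "last_patched": date(2010, 2, 20), "av_signatures": date(2010, 2, 20)}
print(ready_to_power_on(dev_vm))   # the two-month-old development VM is flagged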
Administrative access control
Summary: Maintain separation of administrative duties even when server, network and security infrastructure is consolidated within the virtualisation infrastructure. Define specific roles and granular privileges for each administrator in the central virtualisation management software and limit direct administrative access to the hypervisor to the extent possible.
With a physical infrastructure, servers and networks are managed through separate and numerous applications. Within a virtual infrastructure, servers and networks may all be managed using software provided by the virtualisation vendor.
SYSTEM HARDENING HELPS MINIMIZE SECURITY VULNERABILITIES BY TAKING A SERIES OF ACTIONS SUCH AS CONFIGURING THE VIRTUALISATION PLATFORM WITH SECURE SETTINGS, REMOVING UNUSED COMPONENTS AND APPLYING THE LATEST PATCHES. HARDENING CHECKLISTS FOR VIRTUALISATION PLATFORMS ARE AVAILABLE FROM SEVERAL SOURCES. To meet compliance requirements, organisations will have to prove to internal and/or external auditors that good configuration and change management procedures are in place. The underlying virtualisation host must be properly configured; as well as the association between the host and guests, the network configuration and the storage infrastructure.
How to Know the Hypervisor is Secure?
There has been a lot of media attention focused on potential vulnerabilities in the hypervisor. It is important to consider this, just as one would consider the security of any third-party software being introduced into the environment. Ask the vendor to provide assurance regarding the security of its product. Keep in mind that hypervisor vulnerabilities need to be addressed by the virtualisation vendor; they are not actionable by your organisation. It is a misperception that the biggest risks in virtualisation lie in the hypervisor software. The biggest risks are actually operational, and stem from misconfiguration and mismanagement rather than the core platform. There is no "silver bullet" answer to the question of whether hypervisor technology is secure; it depends on the specific technology and how well it is deployed.
Control over the virtual servers and virtual networks should be assigned to server and networking administrators respectively, and they should update their skills to operate the virtualisation software proficiently and avoid misconfiguration. The specific privileges of each administrator need to be mapped to the particular organisational structure. Careful separation of duties and management of privileges is an important part of mitigating the risk of administrators gaining unauthorized access, either maliciously or inadvertently. Depending on the sophistication of the virtualisation software, it is possible to define specific roles and granular privileges and assign them to individual administrators. A virtual switch is a software representation of a layer-2 network switch commonly implemented in hardware. It allows virtual machines on the same host to communicate with each other using the same protocols that would be used over physical switches, without the need for additional networking hardware. In assigning control of the virtual switches to network administrators, one challenge has been that networking administrators are familiar with controlling switches through the network management application they use for their physical network. Network equipment vendors have therefore introduced virtual network switches which are software implementations of their hardware switches. By deploying such a switch in a virtualised environment, the network administrator is able to manage and monitor the virtual switch using the exact same interface they use for managing the other physical switches. Another important practice is ensuring that only authorized administrators are able to access the hypervisor for administrative functions. An administrator can perform several critical functions locally if he or she gains administrative access to a single physical server. This access should be managed carefully because it is difficult to monitor access to many thousands of physical servers in several data centres. It is preferable for organisations to disable all local administration of the hypervisor and require the use of a central management application.
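Separation of duties ultimately comes down to roles with granular privileges enforced by the management layer. The sketch below shows that check in a few lines of Python; the role names, privilege strings and user assignments are assumptions and will differ between virtualisation products.

# Minimal role-based access control check for virtualisation admin actions.
# Role and privilege names are illustrative assumptions.
ROLES = {
    "server-admin":  {"vm.create", "vm.power", "vm.migrate"},
    "network-admin": {"vswitch.configure", "vlan.assign"},
    "auditor":       {"logs.read", "config.read"},
}

ASSIGNMENTS = {"priya": "server-admin", "arun": "network-admin", "meera": "auditor"}

def is_allowed(user: str, privilege: str) -> bool:
    role = ASSIGNMENTS.get(user)
    return role is not None and privilege in ROLES.get(role, set())

assert is_allowed("arun", "vlan.assign")
assert not is_allowed("priya", "vswitch.configure")  # a server admin cannot touch virtual switches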
Network security and segmentation
Summary: Use virtual switches and virtual firewalls to segment virtual networks. Extend and replicate your physical network controls within virtual networks. Use change management tools and workflow to limit the risk of misconfiguration. Work with internal and external auditors to determine the acceptable mix of physical and virtual network segmentation.
As with a physical infrastructure, organisations need to securely set up the virtual network. One of the important aspects for compliance is making sure machines that process protected information are isolated, so that the data is not co-mingled or accessible through other machines. A virtual network is similar to a physical network except that it is embodied in software within the virtualisation platform and not in hardware. Virtualisation platforms provide the capabilities to implement effective network segmentation.
It is possible to logically group virtual machines and virtual switches into virtual LANs and provide further segmentation using virtual firewalls. The segmentation policy can be enforced regardless of the physical server on which the virtual machines are running. In addition, just as a physical network administrator can misconfigure network equipment, it is possible for virtual networks to be misconfigured. But if network administrators understand and follow proper configuration procedures, segmentation on a virtual network can be just as effective as on a physical system. Virtualisation software and network management vendors provide a rich combination of supporting capabilities. Network security and segmentation is a matter of understanding the capabilities available and avoiding configuration errors. Another aspect is making sure there is visibility into the traffic on the virtual network to reduce the risk of unauthorized access to this traffic. Organisations need to use virtual security devices such as virtual firewalls and virtual IDS/IPS. Security companies are starting to port their existing physical hardware-based network security appliances to run in
virtual machines or as virtual appliances. For example, some commercially available firewalls can now be run as virtual appliances and can secure both physical and virtual networks. As with physical systems, an auditor will require proof that the network has been well-designed by showing him/her the entire view of the network and the controls that are in place. Organisations should demonstrate that they can audit the system on a regular basis and can effectively manage change. For example, if a virtual switch has been set up on a sensitive network, demonstrate how virtualisation software can be configured to
send an alert if a new virtual machine is added to that network. In most organisations, there will be a hybrid of networks end-to-end, from the desktop all the way to the server. Given this situation, how can it be proven to auditors that the network is indeed the one that carries protected information and is isolated? Organisations will need to produce configurations and logs from both the physical and virtual firewalls and switches to show how the network segmentation has been set up and whether it is effective. For virtual networks and firewalls, the configuration information may be in a different place than for the physical network. Network management software and security information and event management (SIEM) solutions that manage and monitor physical and virtual networks from a single console can help with such tasks. One of the most discussed issues regarding network segmentation for virtualisation concerns PCI DSS requirements and whether in-scope systems (e.g., virtual machines that process cardholder data) and out-of-scope systems can be consolidated on the same
WITHIN A VIRTUAL INFRASTRUCTURE, SERVERS AND NETWORKS MAY ALL BE MANAGED USING SOFTWARE PROVIDED BY THE VIRTUALISATION VENDOR. CONTROL OVER THE VIRTUAL SERVERS AND VIRTUAL NETWORKS SHOULD BE ASSIGNED TO SERVER AND NETWORKING ADMINISTRATORS RESPECTIVELY, AND THEY SHOULD UPDATE THEIR SKILLS TO OPERATE THE VIRTUALISATION SOFTWARE PROFICIENTLY AND AVOID MISCONFIGURATION. Virtualisation platforms provide the capabilities to implement effective network segmentation. It is possible to logically group virtual machines and virtual switches into virtual LANs and provide further segmentation using virtual firewalls. LOGS FROM VIRTUALISED SYSTEMS CAN ALSO BE IMPORTED TO A SECURITY INFORMATION AND EVENT MANAGEMENT (SIEM) SOLUTION SO THAT THE VIRTUAL INFRASTRUCTURE CAN BE MONITORED AND EVENTS ANALYSED IN THE CENTRAL SECURITY OPERATIONS CONSOLE FOR THE ENTIRE IT INFRASTRUCTURE.
physical server. The PCI DSS does not directly address this question, leaving it open to interpretation. Organisations should work with their auditors to demonstrate how they plan to securely segment their network, including their specific use of virtual network segmentation technologies. Depending on the applications and current security posture, organisations and auditors can work together to determine the acceptable mix of virtual and physical network segmentation to isolate in-scope systems.
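A useful control behind this discussion is an automated check that in-scope workloads are attached only to approved, isolated segments. The sketch below illustrates one way to express it; the VLAN names, the in-scope flag and the idea of an approved-segment list are assumptions for the example.

# Verify that VMs processing cardholder data sit only on approved, isolated segments.
# Segment names and the in-scope tag are illustrative assumptions.
APPROVED_CDE_SEGMENTS = {"vlan-cde-app", "vlan-cde-db"}

vms = [
    {"name": "pos-app-01",  "in_scope": True,  "segments": {"vlan-cde-app"}},
    {"name": "intranet-01", "in_scope": False, "segments": {"vlan-office"}},
    {"name": "pos-db-01",   "in_scope": True,  "segments": {"vlan-cde-db", "vlan-office"}},
]

def segmentation_violations(inventory):
    for vm in inventory:
        if vm["in_scope"]:
            stray = vm["segments"] - APPROVED_CDE_SEGMENTS
            if stray:
                yield f"{vm['name']} is in scope but attached to {sorted(stray)}"

for violation in segmentation_violations(vms):
    print(violation)   # flags pos-db-01, which also touches the office VLAN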
Audit logging
Summary: Monitor logs from the virtual infrastructure along with the rest of the IT infrastructure. Correlate server and network logs across physical and virtual infrastructures to reveal security vulnerabilities and risk.
Audit logging is critical to managing the security of any IT environment and is a specific requirement of many regulations and standards. It provides the ability to track and monitor activities within IT systems. Depending on the product, virtualisation software records all system events and administrator actions for alerting and reporting. Logs from virtualised systems can also be imported into a Security Information and Event Management (SIEM) solution so that the virtual infrastructure can be monitored and events analysed in the central security operations console for the entire IT infrastructure. In order to really understand what is going on in the entire IT environment, logs must be reviewed from all devices in the environment, such as VPNs, IDSs and firewalls; they should not be viewed in isolation. With a SIEM solution, events in the virtualisation infrastructure can be correlated with events within other parts of the IT infrastructure. For example, consider that an administrator moved a virtual machine from one server to another. This event can be correlated with other events to show that the administrator logged in over the VPN at four o'clock in the morning, made incorrect login attempts on four servers, and then logged into the virtual system to make this change. Through correlation of events, organisations are provided with a complete picture of what occurred. Historic reports from virtualisation software and from a SIEM solution are also effective tools in unambiguously demonstrating compliance to internal and external auditors. n This article has been adapted from a white paper. Courtesy: RSA
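The correlation example described above (an odd-hour VPN login, failed logins, then a VM move) can be approximated with a simple time-window join over normalised events. The sketch below is illustrative; the event schema and the one-hour window are assumptions, not any SIEM product's format.

# Correlate a VM migration with earlier suspicious activity by the same admin.
# The event schema and the one-hour window are assumptions for the sketch.
from datetime import datetime, timedelta

events = [
    {"time": datetime(2010, 5, 1, 4, 0),  "user": "admin1", "type": "vpn_login"},
    {"time": datetime(2010, 5, 1, 4, 5),  "user": "admin1", "type": "failed_login"},
    {"time": datetime(2010, 5, 1, 4, 6),  "user": "admin1", "type": "failed_login"},
    {"time": datetime(2010, 5, 1, 4, 30), "user": "admin1", "type": "vm_migrated"},
]

WINDOW = timedelta(hours=1)

def context_for_migrations(log):
    for e in log:
        if e["type"] == "vm_migrated":
            related = [x for x in log
                       if x["user"] == e["user"]
                       and e["time"] - WINDOW <= x["time"] < e["time"]]
            yield e, related

for migration, context in context_for_migrations(events):
    print(migration["user"], "moved a VM; preceding events:",
          [c["type"] for c in context])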
analyst speak
Where the RISK Lies
Virtualisation may be the talk of the town, but there are many security risks associated with it
Gartner has identified the six most common virtualisation security risks, together with advice on how each issue might be addressed. The risks are as follows:
Information security isn't initially involved in virtualisation projects: Survey data from Gartner conferences in late 2009 indicates that about 40 percent of virtualisation deployment projects were undertaken without involving the information security team in the initial architecture and planning stages. Typically, the operations teams will argue that nothing has really changed — they already have the skills and processes to secure workloads, operating systems (OSs) and the hardware underneath. Gartner said that security professionals need to realise that risk that isn't acknowledged and communicated cannot be managed.
Many virtualisation deployment projects are being undertaken without involving the information security team in the initial architecture and planning stages.
As virtualised workloads become more mobile, the security issues associated with virtualisation become more critical to address.
They should start by looking at extending their security processes, rather than buying more security, to address security in virtualised data centers.
A compromise of the virtualisation layer could result in the compromise of all hosted workloads: The virtualisation layer represents another important IT platform in the infrastructure and, like any software written by human beings, this layer will inevitably contain embedded and yet-to-be-discovered vulnerabilities that may be exploitable. Given the privileged level that the hypervisor/VMM holds in the stack, hackers have already begun targeting this layer to potentially compromise all the workloads hosted above it. From an IT security and management perspective, this layer must be patched, and configuration guidelines must be established. Gartner recommends that organisations treat this layer as the most critical x86 platform in the enterprise data center, keep it as thin as possible and harden the configuration against unauthorized changes. Virtualisation vendors should be required to support measurement of the hypervisor/VMM layer on boot-up to ensure it has not been compromised.
The lack of visibility and controls on internal virtual networks created for VM-to-VM communications blinds existing security policy enforcement mechanisms: For efficiency in communications between virtual machines (VMs), most virtualisation platforms include the ability to create software-based virtual networks and switches inside the physical host to enable VMs to communicate directly. This traffic will not be visible to network-based security protection devices, such as network-based intrusion prevention systems. Gartner recommends that, at a minimum, organisations require the same type of monitoring they place on physical networks, so that they don't lose visibility and control when workloads and networks are virtualised. To reduce the chance of misconfiguration and mismanagement, they
should favor security vendors that span physical and virtual environments with a consistent policy management and enforcement framework.
Workloads of different trust levels are consolidated onto a single physical server without sufficient separation: As organisations move beyond the "low-hanging fruit" of workloads to be virtualised, more critical systems and sensitive workloads are being targeted for virtualisation. This is not necessarily an issue, but it can become one when these workloads are combined with other workloads from different trust zones on the same physical server without adequate separation.
Adequate controls on administrative access to the hypervisor/VMM layer and to administrative tools are lacking: Because of the critical support the hypervisor/VMM layer provides, administrative access to this layer must be tightly controlled, but this is complicated by the fact that most virtualisation platforms provide multiple paths of administration for this layer. Gartner recommends restricting access to the virtualisation layer as with any sensitive OS, and favoring virtualisation platforms that support role-based access control of administrative responsibilities to further refine who can do what within the virtual environment. Where regulatory and/or compliance requirements dictate, organisations should evaluate the need for third-party tools to provide tight administrative control.
There is a potential loss of separation of duties for network and security controls: When physical servers are collapsed into a single machine, it increases the risk that both system administrators and users will inadvertently gain access to data that exceeds their normal privilege levels. Another area of concern is which group configures and supports the internal virtual switch. Gartner recommends that the same team responsible for the configuration of network topology (including virtual LANs) in the physical environment should be responsible for this in virtual environments. They should favor virtualisation platform architectures that support replaceable switch code, so that the same console and policies span physical and virtual configurations. n
network virtualisation
BORDERLESS CONNECTIONS
Network virtualisation provides a powerful way to run multiple networks, and more and more companies are taking advantage of it
SUMIT MUKHIJA
Virtualisation is the buzzword for organisations these days. It is now more in the limelight with the advent of server virtualisation in the data center, enabling organisations to simplify management and optimize performance of vast computing and networking infrastructures. The current global economic climate presents an opportune time to fully utilize the benefits of virtualisation technologies beyond the data center to achieve greater cost savings through increasing network utilization and efficiency. Successful network virtualisation solutions use proven technologies to reduce management complexity and service rollout time while increasing operational control and service flexibility.
The business need for virtualisation Today’s collaboration-based businesses require networks that support communication among workgroups and also with business partners. In this “new economy” of a globally distributed workforce and global competition, enterprise organisations continue to use collaboration technologies to help connect geographically dispersed user groups to act and feel like a single, centralized entity. These collaboration technologies improve employee productivity while reducing operational expenses by creating a “borderless enterprise” where employees, customers, and partners can all share significant information and connect their business processes
more efficiently. Integrating all services into a single IP infrastructure can reduce the cost of building and maintaining multiple networks while enabling innovative applications that increase user productivity and corporate competitiveness. Businesses need the ability to manage services in a way that optimizes their performance, speeds rollout, offers flexible delivery options, and supports rapid troubleshooting and resolution. To attain these advantages, network administrators must isolate some services and allow communication among others, all while maintaining security and regulatory compliance. For example, banks need to isolate ATM transactions, manufacturing floors need to protect and prioritize traffic from plant automation systems, and many organisations are familiar with the network and application integration issues that accompany mergers and acquisitions. Today’s LAN virtualisation technologies can ease the management of these and other services that users expect to use anytime, anywhere. Business and IT leaders have been seeking an IT infrastructure that is more responsive to business initiatives and removes inefficiencies. The traditional campus architecture, which delivers basic connectivity to isolated departments with fortress-like barriers, must evolve into an agile and resilient architecture that delivers orchestrated services to integrated teams collaborating throughout the enterprise, and that supports service-level agreements (SLAs). With these changes, the IT department becomes more of a business unit that delivers services to improve the enterprise rather than burdening it as a cost center. Virtualisation helps enable a new, dynamic IT infrastructure that is more responsive to ever-changing business requirements.
What is network virtualisation? Network virtualisation enables IT groups to deploy and manage IT resources as logical services instead of physical resources. Using network virtualisation, IT administrators can segment and align IT services to meet the specific needs of the users and groups on the network. Logical, secure segmentation also helps IT groups comply with regulations for resource and information security. IT staff can use network virtualisation to manage Internet access, voice and
video services, RFID-enabled inventory applications, and so on. By managing the network to correspond more directly with the services that people use, IT personnel can focus on adding value to user productivity and closely managing operational expenditures. Therefore, network virtualisation can accomplish many business objectives, helping organisations to deploy and operate services while maintaining security and compliance. Typical objectives include the following:
• Guest access: Most organisations allow guest (non-employee) access to the network, usually to use the Internet. A virtualised network prevents guests from accessing confidential information and resources or from inadvertently introducing malware into the network.
• Partner access: Business partners often need access to authorized applications and data.
• Protection: Mobile devices are vulnerable to infections from spyware, viruses and worms, and other malware. The network needs the ability to quarantine devices during Network Admission Control (NAC) remediation. When devices are deemed safe and compliant, the network can log the device into its authorized virtualised network.
• Divisional separation: Using virtualisation to separate users, groups, stores, or divisions helps protect sensitive information from intrusion, unauthorized alteration, or theft.
• Device isolation: Certain devices may need to be isolated from others for security or performance reasons.
• Hosted services: Commercial real estate managers may include hosted IT services as part of a lease agreement with tenants. Network virtualisation is especially useful for tenants that run point-of-sale transactions.
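Several of these objectives reduce to the same mechanism: deciding, per user or device, which virtual network segment a connection lands on. The sketch below expresses that decision as a small function; the segment names and the posture check are assumptions for the illustration.

# Assign a connecting device to a virtual network segment based on who it is
# and whether it passes posture checks. Segment names are illustrative.
def assign_segment(user_type: str, posture_ok: bool) -> str:
    if not posture_ok:
        return "quarantine-vlan"          # NAC remediation before full access
    return {
        "employee": "corp-vlan",
        "partner":  "partner-vlan",       # authorised applications and data only
        "guest":    "guest-internet-vlan" # Internet access, nothing internal
    }.get(user_type, "guest-internet-vlan")

print(assign_segment("guest", posture_ok=True))      # guest-internet-vlan
print(assign_segment("employee", posture_ok=False))  # quarantine-vlan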
Business Applications Here are some scenarios that may help solve a variety of business problems that organisations face and help businesses optimize the value of their networks:
Merger and acquisition A banking conglomerate acquires a specialty investment firm. To facilitate integration of IT resources, the IT group uses network virtualisation in multiple steps to merge the talents of both companies and help them focus on corporate initiatives. The first step isolates divisional operations, allowing correspond-
ing departments (such as HR to HR, or finance to finance) in both companies to share resources. This first step does not allow cross-divisional sharing, such as HR at the conglomerate with finance at the investment firm. This step facilitates business integration efforts until the acquisition receives regulatory approval. After the transaction is finalised, IT can resegment the network resources to integrate resources at the investment firm with the entire conglomerate according to policy.
Manufacturing An automobile manufacturing company uses network virtualisation across its converged IP network to support disparate requirements for the manufacturing, facilities, and IT groups. It created five virtualised domains: manufacturing floor, voice and video, other mission-critical departments, guest access, and administration. The virtualised architecture segments applications, resources, and users, yet shares a common set of management tools and staff. The solution allows the manufacturer to maximize application availability throughout its manufacturing processes, where downtime can cost tens of thousands of dollars per hour. The network virtualisation optimizes network and application performance for enhanced productivity of all groups and helps secure vital resources. In doing so, the firm leverages IT to gain a competitive advantage, improve service flexibility, and reduce manufacturing costs.
Retail A retail corporation with 2500 stores uses network virtualisation to enable partner access to individual stores. The retail company has outsourced its energy management, weight and scales, and in-store video demonstrations to three separate partners. The virtualised network separates each partner’s traffic from proprietary store traffic, and from other partners. It reduces cost and complexity, because everyone uses a common infrastructure, and it enables the retailer to hire the best services from multiple partners at a lower overall cost.
Landlord-tenant Commercial real estate companies often manage hundreds of tenants on real estate with a high value per square foot. Cisco network virtualisation can isolate tenants at a shopping mall, allowing every business to use the same
network infrastructure, yet maintain separate voice, data, and video services for information privacy and compliance with Payment Card Industry (PCI) Data Security Standard (DSS) requirements. Virtualisation allows tenants to access their services from anywhere on the premises, through wired or wireless connections.
SUMIT MUKHIJA
INTEGRATING ALL SERVICES INTO A SINGLE IP INFRASTRUCTURE CAN REDUCE THE COST OF BUILDING AND MAINTAINING MULTIPLE NETWORKS WHILE ENABLING INNOVATIVE APPLICATIONS THAT INCREASE USER PRODUCTIVITY AND CORPORATE COMPETITIVENESS.
Using network virtualisation, IT administrators can segment and align IT services to meet the specific needs of the users and groups on the network. NETWORK VIRTUALISATION CAN ACCOMPLISH MANY BUSINESS OBJECTIVES, HELPING ORGANISATIONS TO DEPLOY AND OPERATE SERVICES WHILE MAINTAINING SECURITY AND COMPLIANCE.
Transportation Airports seek new efficiencies to manage thousands of travelers passing through them each day. They lease space to airlines and supporting companies such as caterers and retailers. Applying network virtualisation, an airport can attain higher usage rates at gates, and airlines can avoid paying full-time rents for gates they may not always use. Gate agents can log into the system at a particular gate, and the network automatically assigns the feature and application sets associated with that airline and flight to the gate, such as PC workstation applications, boarding pass printers, and overhead flight displays.
Healthcare
Healthcare institutions in the United States are required to protect the privacy of patient records under the Health Insurance Portability and Accountability Act (HIPAA). A network virtualisation solution helps institutions maintain compliance by separating users who access patient data from those who do not, and by enforcing information security policies for all virtualised LANs from a central point. Virtualisation is useful in hospitals that are moving toward "hotel-type" network services, such as allowing patients to use the phone, watch television, or access the Internet on one domain, while maintaining electronic protected health information (PHI) on another domain.
Government Governments can facilitate information sharing and collaboration among agencies and control IT costs through a single, distributed, converged network that all agencies share. With thousands of users, governments can realize substantial cost savings with resource consolidation and centralized IT services, yet each agency can customise its applications and services to meet its particular needs. n Sumit Mukhija is National Sales Manager - Data Centre, Cisco India & SAARC.
virtualisation management
Taking CONTROL
With the right tools and best practices, organisations can better manage virtualisation
KRISHNAN THYAGARAJAN
Following years of growth and change, most organisations have many specialized physical servers and workstations that remain underutilized. Virtualising such an environment increases and balances utilization by consolidating the physical machines into a single physical host that runs multiple virtual machines (VMs). The VMs share the core four resources—CPU, memory, disks and network cards—of one physical host. While the physical machine does the same work, it can support multiple operating system (OS) instances, related applications and data. To achieve optimum utilization in a virtualised environment, organisations must do a good job of maintaining the environment. This means data
and other assets must be monitored, protected and preserved.
Challenges posed by virtualisation
Understanding the performance of the virtual infrastructure is not an easy task, because so many layers of components must come together and work as an integrated system. Many administrators are required to go beyond just managing their virtual systems to perform capacity planning and analysis. In many cases, a better understanding of the utilization and costs associated with the virtual environment becomes essential. Apportioning these costs to the consumers of the virtual infrastructure, internal or external to the organisation, is often a manual and complex task. Furthermore, each of these activities requires expertise, as virtualisation adds new management and operational challenges. In broader terms, some of the challenges that virtualisation brings are:
Clarity Virtualisation introduces a layer of abstraction that does not exist in physical server infrastructures. This abstraction can reduce the visibility into the
multiple resources running concurrently across the multi-layered virtual environment.
Expertise
Virtualisation introduces a new and different technology that requires a steep learning curve. Many organisations try to use traditional operating system and application monitors with a VM. However, the metrics they provide are often inaccurate due to the complexity of factors such as dynamic resource scheduling, live migrations to alternate hardware, and time-keeping issues inside VMs. Organisations can turn to expert consultants, but that is not a very viable option as they are expensive.
Tracking
The introduction of Live Migration, as well as the advancement of high-availability technologies, has made tracking VM movement more critical to understanding performance management. Traditional monitoring solutions make it very difficult to track the occurrence of a migration as it relates to changes in VM performance. Efficiently predicting the state of a VM prior to a manual migration is a complex challenge, and traditional tools do not have the capacity to forecast such trends.
Prevention Portability of VMs provides huge benefits, but it also can introduce risk. Before performing a Live Migration, administrators need all the information possible about the core four resources discussed earlier, along with an understanding of growth trends. A pragmatic approach to monitoring is needed to help one understand configurations and patterns—and ultimately enable one to be more predictive and leverage automation in managing the data center. In total, these challenges vary in significance based on a variety of factors and can be compounded by the number of VMs in the environment, the number of administrators provisioning VMs, the lifecycle or movement of VMs within the data center, or the need to understand the impact on the core four resources before a planned or unplanned change. Most importantly, organisations that are running business-critical applications on VMs must carefully mitigate these challenges or otherwise risk missed service levels or
poor end-user experiences.
Transparency
Over the last several years, there has been growing interest among organisations in better transparency and visibility into utilization and costs within the virtual infrastructure. With multiple groups and workloads consuming resources, both physical and virtual, allocating costs for the use of hosts and groups of hosts can be challenging. Additionally, VM sprawl only exacerbates this challenge.
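Chargeback is essentially an allocation problem: split each host's cost across the groups whose workloads ran on it, in proportion to what they consumed. The sketch below shows one simple model; the cost figure and the choice of CPU-hours as the allocation key are assumptions.

# Allocate a host's monthly cost to the groups whose VMs ran on it,
# proportionally to CPU-hours consumed. Figures are illustrative assumptions.
HOST_MONTHLY_COST = 1200.0   # hardware amortisation + power + space, per host

usage = [   # (group, cpu_hours consumed on this host during the month)
    ("finance", 400.0),
    ("hr", 100.0),
    ("ecommerce", 500.0),
]

def chargeback(host_cost: float, consumption):
    total = sum(hours for _, hours in consumption)
    return {group: round(host_cost * hours / total, 2) for group, hours in consumption}

print(chargeback(HOST_MONTHLY_COST, usage))
# {'finance': 480.0, 'hr': 120.0, 'ecommerce': 600.0}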
Better management
The ways in which an organisation can better ensure that its virtualisation initiative effectively supports its application and end-user satisfaction objectives are as follows:
1. Better understand the relationships and interactions of all the components in the virtual and physical infrastructure: Visualising the entire virtual infrastructure in a single view allows one to clearly see the multiple resources concurrently in use across the many layers of the environment, including data centers, data stores, clusters, resource pools, ESX servers and VMs. Furthermore, unless the entire infrastructure has been virtualised, physical server workloads will still need proactive management and visibility.
2. Determine the root cause of an incident or problem before end users are affected: IT administrators need tools that provide detailed alarms, future impact predictions, deviations from normal activity, and specific operational problems. Monitoring solutions that are comprehensive and provide intelligence to help determine that there is a problem, convey why it is a problem, and recommend ways to resolve it are ideal for diagnosing and resolving issues before they impact end users.
3. Track movement of VMs to understand their potential impact on applications and the business: Organisations need to be able to track assets, both in terms of changes to configuration and location, in order to assess the impact of the changes on the performance and availability of the dependent application and on other VMs in the same physical environment.
KRISHNAN THYAGARAJAN
TO ACHIEVE OPTIMUM UTILIZATION IN A VIRTUALISED ENVIRONMENT, ORGANISATIONS MUST DO A GOOD JOB OF MAINTAINING THE ENVIRONMENT. THIS MEANS DATA AND OTHER ASSETS MUST BE MONITORED, PROTECTED AND PRESERVED.
Ensuring the performance and availability of virtual and physical server infrastructures can be an elusive goal. But with the right monitoring tools and the best-practice techniques, organisations can take control of these new operational challenges.
Tracking the movement of VMs will not only give a better understanding of the impact on applications and end users, but also determine the reason.
4. Contain alarm storms from VMs and physical servers for a prioritized IT response: By effectively containing alarm storms and turning data into meaningful information, IT administrators can better understand the correlation between infrastructure, host and VM issues. This is an essential tactic for prioritizing problem-resolution efforts and preventing application performance issues that can lead to a poor end user experience.
5. Identify contention for resources between VMs to prevent overcommitment of resources: Organisations need to be able to show the resource impact of moving a VM image from one physical system to another. This information, together with performance and utilization data at several levels—infrastructure, host and VM—will help identify contention issues in advance. By having a clear understanding of the core resources and historical trend information, organisations can prevent future problems and proactively plan for the future.
6. Provide transparency for IT management and business stakeholders indicating what groups and workloads are consuming resources: To understand resource utilization and reclaim costs for use, organisations need to look at reporting as well as chargeback techniques. Being able to allocate costs for the use of infrastructure to hosts or groups of hosts, whether physical or virtual, using industry-standard chargeback models helps organisations follow best practices. Further, and perhaps more importantly, these objectives help the organisation allocate costs more effectively and proportionately to what resources were truly utilized.
Ensuring the performance and availability of virtual and physical server infrastructures can be an elusive goal. But with the right monitoring tools and the best-practice techniques discussed herein, organisations can take control of these new operational challenges. n Krishnan Thyagarajan is MD, Quest Software, India
trends
Toshiba India appoints Tengguo Wu as Director-PC Division
Toshiba India, the Indian subsidiary of Toshiba Corporation, has announced the appointment of Tengguo Wu as Director of its PC Division. The appointment of Tengguo to spearhead the Indian operations is part of the company's strategy to strengthen its PC division and grow its market share in India. According to Tengguo Wu, Director, PC Division, Toshiba India, "It is my privilege to come back to India to head the PC Division which, I believe, is poised for exponential growth in the immediate future." Tengguo started his career with Toshiba in 1990, in South East Asia and the Indian subcontinent. Based in Singapore in the early 1990s, he led Toshiba's foray into the nascent Indian laptop market. In the following decades, Tengguo has been instrumental in developing new businesses and markets for Toshiba Corporation across 30 countries including South and Southeast Asia, the Middle East, South Africa, and the Pacific Islands. He has managed diverse portfolios across sales, marketing and general management. n
Comstor partners with Cisco as a national distributor
Westcon Group has announced its appointment as national distributor for Cisco's advanced network technology solutions, with a focus on data centre, collaboration and borderless networks technologies, through its Comstor Division, in line with their global model. Comstor India will have a team focusing exclusively on Cisco, offering channel partners a full line of Cisco enterprise and SMB solutions effective immediately. Comstor India will leverage Westcon Group's interlinked network of 39 national resellers and over 220 regional/local system integrators across 10 cities in India. According to Santosh Sankunny, Business Head,
Comstor India, “This partnership would enhance Comstor India’s capability of providing its customers and channel partners with the complete range of solutions and services.” “We, at Cisco believe that along with technological innovation, we need to adopt new business models to drive market transition and deliver optimum value to our customer base. This partnership was a strategic move designed to further strengthen our focus on the advanced technologies space where we see a huge opportunity for incremental growth,” said B. Raghavendran, Vice President, Channel Operations and Commercial Strategy, Cisco India & SAARC . n
Fujitsu launches Celsius workstation portfolio
Cisco WebEx launches web conferencing on smartphones
Cisco WebEx has announced a web conferencing service on smartphones. Through this service, users will be able to join Cisco WebEx Meeting Center web and audio conferences on smartphones including the BlackBerry Bold, BlackBerry Curve 8900, and BlackBerry Storm from RIM, the Nokia E71, Nokia E75, Nokia N97, and other Nokia Eseries and Nseries, and the Samsung Blackjack II. They will be able to participate in audio and web
conferencing via 3G or Wi-Fi, attend scheduled meetings and view presentations, applications and desktops with live annotations. Users will be able to launch Cisco WebEx Meeting Center through the browser on their smartphones to receive integrated audio and web conferencing over 3G or a combination of 2G and Wi-Fi. In addition, users will be able to attend a scheduled meeting and to view presentations, applications and desktops with live annotations. There is no cost to attend these meetings, but to schedule and host a meeting from a computer requires a host account on Cisco WebEx Meeting Center. n
Acer overtakes HP as top desktop vendor
According to the latest Gartner India report, Acer has overtaken HP to become the market leader in the desktop category for Q1 2010. Acer sold approximately 1,58,000 desktops (commercial and consumer) in the quarter. Its feature-rich desktops cater to segments ranging from education to large public sector and government organisations. “The requirement for desktops is still strong in verticals like education and government, as well as in smaller towns and cities when it comes to consumer sales. With the increase in PC penetration in smaller towns, desktop sales will continue to be strong. We are focused on capitalising on this opportunity and have introduced new products, from time to time, focusing on various sub-segments to effectively tap into this market,” said S. Rajendran, Chief Marketing Officer, Acer India. n
Channel mourns accidental death of TAIT Director Anees Khalfay
A renowned veteran reseller of the Mumbai channel community, Anees Khalfay, died in an unfortunate accident on May 19, when he fell from the 15th floor of his residence in Mumbai. Besides being a senior figure in the Mumbai channel fraternity, Khalfay was the owner of Radiant Technologies and a director of the Mumbai-based Trade Association of Information Technology (TAIT). Initially, there was talk that Khalfay had committed suicide, but those reports appear to have been mere rumours. Dismissing such speculation, a close friend and President of TAIT, Ketan Patel, said, “I am sure it is a case of accidental death. His business as well as personal life was good. Anees Bhai (as he was fondly known among the reseller community) was a saintly soul. He connected well with all system integrators and vendors. His death is a big loss to the IT channel fraternity.”
Channel partners as well as vendors are in shock and grief at the death of one of the channel’s most active members. Expressing his condolences, Rajat Sahu, Head of Marketing at MicroWorld, a long-time principal of Radiant, said, “The sudden demise of Anees came as a shock to us. Not only was he a good human being, he was also one of the most ethical businessmen I have known. Anees was instrumental in bringing together the entire channel fraternity of Mumbai, be it for events like roadshows, the Diwali mela or annual expos. I would like to extend full support and condolences to his family and his team members.”
Khalfay, a Bachelor of Engineering, began trading in IT in 1996 with his own startup, Radiant Computer Technologies. Prior to that, he was in the manufacturing of power electronics products. Khalfay also served as the immediate past President of TAIT, and during his long association with the body he fought on several issues troubling the channel. An amiable person, he worked hard to bring good practices back into the trade and to inculcate a sense of ethics among traders. Team DCC and the 9.9 Media family extend our sincere condolences to his family and friends. May his soul rest in peace… n
Hitachi appoints Chakrapani as Director for Channels & Strategic Alliances
Hitachi Data Systems Corporation has announced the appointment of Srikant Chakrapani as Director, Channels and Strategic Alliances, for India. He will be responsible for executing the Hitachi TrueNorth Channel Partner program and for driving the technology alliances and sales strategy through the channel organisation, which consists of systems integrators, value-added resellers and distributors. According to Vivekanand Venugopal, Vice President & General Manager, India, Hitachi Data Systems, “His industry experience, deep understanding of our solutions, field sales experience and knowledge of our internal systems provide a strong foundation for us to increase our coverage and assist channel partners in capitalising on our solutions and rapidly evolving market opportunities.” In his previous role, Srikant was the Regional Sales Director for West India and was instrumental in consolidating the enterprise business in the western region. n
ASUS concludes multi-city dealer meet
ASUS recently concluded a channel drive to promote the newly launched Garmin-Asus M10 in five metros, namely Mumbai, New Delhi, Chennai, Bangalore and Kolkata. The meets provided a communication channel for ASUS to educate dealers and channel partners about the features of Garmin-Asus smartphones, as well as to inform them about special schemes for dealers. Over 150 smartphone dealers attended the meets. According to John Chen, Country Head, Handheld Business Group, ASUS (India), “In order to educate more dealers about the innovative product features of the M10, we organised the dealer meets in these five cities. We will conduct more such eventful meets in the near future.” “Mainly aimed at promoting the newly launched Garmin-Asus M10, we also used the dealer meets to synchronise our smartphone product channel business,” said Gaurav Mehra, Country Product Manager, Handheld Business Group, ASUS (India). ASUS has already planned more such meets to advise dealers on the best ways to sell smartphones. n
AMD launches “Scratch & Win offer” for SIs
Advanced Micro Devices (AMD) has announced a limited-period ‘Scratch & Win’ offer for system integrators. The two-month promotional campaign gives system integrators the opportunity to win prizes such as a Hero Honda Passion Plus bike, AMD-powered netbooks, graphics cards, pen drives and backpacks, and even a Hyundai i10 car or an Apple iPhone. On the purchase of every Phenom II or Athlon II processor, or every five AMD Sempron processors, from AMD authorised distributors, AMD Premier and select partners, the system integrator will get a scratch card carrying one of the assured prizes. All scratch cards also qualify for a lucky draw at the end of the promotion, in which lucky buyers can win a Hyundai i10 car or an Apple iPhone. According to Sandip Naik, Head, Channel Sales, AMD India, “System integrators play a critical role in the whole ecosystem by educating their end customers and offering them value-for-money solutions. Through schemes like these, we would like to acknowledge the importance of system integrators in the ecosystem and create excitement.” The offer is valid till June 19, 2010, across the country. n
HP claims success against counterfeit printing supplies
HP has announced the seizure of more than 12,000 counterfeit products, including ink packages, toner packages, ink security labels, toner security labels, finished inks, finished toners, ink cartridges, toner cartridges, toner bubble bags, barcode stickers, user manuals, a clamshell packing machine, an ink filling machine and laminating machines, along with more than 5,000 additional production items and around 79,900 toner security labels. The seizures were conducted by the SA ACF team in coordination with the local police in Bangalore, Delhi and Gandhidham, with enforcement actions executed on assembly operations in the Ashok Nagar area of Bangalore and the Daryaganj and Bazar Sitaram areas of Delhi. In Gandhidham, the SA ACF team, in coordination with the local police, executed an enforcement action on two printing operations producing counterfeit toner security labels. Six individuals were detained in Bangalore and four in Gandhidham, while a criminal investigation is under way. “With counterfeit HP print cartridges, customers purchase what they often assume to be a genuine HP product, but instead receive a cartridge that provides inferior print quality at best, and oftentimes a cartridge that fails to perform at all. Through our anti-counterfeiting efforts, HP is determined to protect our customers and our brand,” said Jeff Kwasny, Director of HP Brand Protection for print cartridges.