CIO May 15 2010 Issue



From The Editor-in-Chief

I’ve looked at clouds from both sides now
From up and down, and still somehow
It’s cloud illusions I recall
I really don’t know clouds at all.
— Joni Mitchell (Both Sides, Now)

Of Clouds & Silver Linings
Nothing’s going to happen in a hurry.

What is it about buzzwords that makes them clichéd, loved and hated? Is it the mindless repetition of a fervent chant or is it much more than that, a lot more, in fact? Take cloud computing, for instance. Over the past year, you would have found it tough to escape ‘cloud computing’ being thrust at you as the most credible enterprise option from a host of technology and service providers.

While I do not dispute that there are indeed a clutch of vendors who do have interesting solutions based upon the cloud, the typical CIO response is one of interest, followed by confusion, skepticism and cynicism — in that order. This doesn’t surprise me, since the waters have been muddied by technology providers who do not have an adequate understanding of large enterprise realities or, for that matter, solution sets that can leverage the cloud. The way I see it, just too many vendors have been taking one of three approaches to be taken seriously by enterprises. They’re positioning the cloud either as a ‘silver bullet’ that’ll lick all the problems that plague organizations; or they’re suggesting that transitioning to the cloud is as easy as the ‘snap of fingers’; or there is the ‘(dis)integration approach’ that tends to downplay issues that stem from transporting data and maintaining its integrity between legacy apps and infrastructure and the cloud.

Gimme a break. How many large enterprises will be comfortable moving their data to the ‘public cloud’? How many organizations have the standardized processes that will allow them to integrate the ‘cloud’ with the ‘non-cloud’ bits of enterprise IT? What is the extent of virtualization even in organizations high up on the tech maturity curve? These are critical questions that both CIOs and vendors ought to be asking.

I’m not suggesting that cloud computing doesn’t have merit. Far from it. I do see large enterprises plumping for private clouds; the benefits and the promise are just too great to ignore. I see them commencing this journey this year, starting small with ‘non-core’ applications. ERP on the cloud? That’s a while away. What do you feel about this? Write in and let me know.

Vijay Ramachandran, Editor-in-Chief, vijay_r@cio.in



Content | May 15, 2010 | Vol/5 | Issue/07

Case Study

M&A | Design by MM Shanith

Cover Story: Twin Powered | 24

M&M’s acquisition of Kinetic Motors beat the 70 percent failure rate associated with M&As. Its success came on the back of hard decisions and some unusual strategies — including the role of its CIO.

Cover: Photograph by Fotocorp/Shailesh

Feature by Priyanka

Virtualization Busts Fakes | 38
Caught between blistering growth and untamed counterfeit drugs, Bilcare Research was finding it hard to breathe. Here’s how virtualization relieved the company.
Feature by Sneha Jha

Deep Dive: Storage | 49
As enterprise data continues to bloat, CIOs are beginning to look at their storage with new eyes, especially as new technologies allow for greater levels of efficiency.

CIO Skills: Ready to Present? | 44
Less is more when it comes to bang-up business presentations. Here are five tips for better tech talks.
Feature by Mary K. Pratt



Content (cont.)

Departments
Trendlines | 11
Business Issues | Enterprise Software Sucks
Quick Take | Senthil Nathan on BPR
Voices | How Will 3G Benefit Enterprises?
Survey | The ‘Big Four’ IT Apps
Enterprise Apps | How Not to Build a Cloud
Opinion Poll | Going Beyond IT
Social Networking | Social Media Means Business
Internet | Five Things the Internet Has Ruined
Research | It’s Raining Chips
Innovation | And Solar Power Just Got Cheaper
Alternative Views | Remote Infrastructure Mgt.

Thrive | 70
Business Leadership | Working with the New Boss

Feature by Diane Frank

Mentor | 72
Career | Role Playing

Column by Sunil Mehta, JWT

From the Editor-in-Chief | 2
Of Clouds and Silver Linings

By Vijay Ramachandran

NOW ONLINE “The cloud is here to stay. It is a serious innovation in IT business models. All organizations will adapt to it,” says Suresh Vaswani, joint CEO-IT Business, Wipro.


For more opinions, features, analyses and updates, log on to our companion website and discover content designed to help you and your organization deploy IT strategically. Go to www.cio.in


Executive Expectations
View From The Top | 34
Suresh Vaswani, joint CEO-IT Business, Wipro, looks ahead and tells enterprises which technologies they can’t take off their radars and how Wipro plans to help them meet the future.


Interview by Anup Varier

Strategic CIO
The Odd Couple | 22
Organizations still have plenty of work to do in building a strong business and IT coalition. Here’s a look at the state of the CEO-CIO union from the front lines.
Column by Chris Curran



Governing Board
Alok Kumar, Global Head - Internal IT, TCS
Anil Khopkar, GM (MIS) & CIO, Bajaj Auto
Anjan Choudhury, CTO, BSE
Ashish Chauhan, Deputy CEO, BSE
Atul Jayawant, President Corporate IT & Group CIO, Aditya Birla Group
Donald Patra, CIO, HSBC India
Dr. Jai Menon, Director Technology & Customer Service, Bharti Airtel & Group CIO, Bharti Enterprises
Gopal Shukla, VP - Business Systems, Hindustan Coca Cola
Manish Choksi, Chief Corporate Strategy & CIO, Asian Paints
Manish Gupta, Director-IT, Pepsi Foods
Murali Krishna K., Head - CCD, Infosys Technologies
Navin Chadha, CIO, Vodafone
Pravir Vohra, Group CTO, ICICI Bank
Rajesh Uppal, Chief General Manager IT & Distribution, Maruti Udyog
Sanjay Jain, CIO, WNS Global Services
Shreekant Mokashi, Chief-IT, Tata Steel
Sunil Mehta, Sr. VP & Area Systems Director (Central Asia), JWT
T.K. Subramanian, Div. VP-IS, UB Group
V. K Magapu, Director, Larsen & Toubro
V.V.R Babu, Group CIO, ITC

Publisher Louis D’Mello

Editorial
Editor-in-Chief Vijay Ramachandran
Executive Editor (Computerworld) Gunjan Trivedi
Associate Editor (Online) Kanika Goswami
Features Editor Sunil Shah
Copy Editor Shardha Subramanian
Correspondents Anup Varier, Priyanka, Sneha Jha, Varsha Chidambaram
Product Manager Online Sreekant Sastry

Custom Publishing
Associate Editor Arakali A Harichandan
Copy Editor Kavita Madhusudan
Correspondent Deepti Balani

Events & Audience Development
VP Rupesh Sreedharan
Senior Manager Chetan Acharya
Managers Ajay Adhikari, Pooja Chhabra
Manager Projects Sachin Arora
Management Trainee Ramya Menon

Design & Production
Lead Designers Jithesh C.C, Vikas Kapoor, Vinoj KN
Senior Designers Jinan K V, Sani Mani
Designer M M Shanith
Trainee Designer Amrita C Roy
Photography Srivatsa Shandilya
Production Manager T K Karunakaran

Marketing & Sales (National)
President Sales and Marketing Sudhir Kamath
VP Sales Sudhir Argula
General Manager Sales Parul Singh
Asst. GM Brand Siddharth Singh
Asst. Manager Brand Disha Gaur
Associate Marketing Dinesh P
Sr. Manager Client Marketing Rohan Chandhok
Asst. Manager Marketing Sukanya Saikia
Ad Sales Co-ordinators Hema Saravanan, C.M. Nadira Hyder

Regional Sales
Bangalore: Ajay S. Chakravarthy, Kumarjeet Bhattacharjee, Manoj D
Delhi: Aveek Bhose, Mohit Dhingra, Prachi Gupta, Punit Mishra
Mumbai: Dipti Mahendra Modi, Hafeez Shaikh, Pooja Nayak

Advertiser Index
ADC Krone Communications, American Power Conversion, Canon India, EMC Data Storage Systems, Emerson Network Power, Fujitsu Asia, HP Enterprise Services, IBM India Ltd, Intel Technologies, Microsoft Corporation, Oracle, Rittal India, SAS Institute, Smartlink Network Systems, Tulip Telecom

All rights reserved. No part of this publication may be reproduced by any means without prior written permission from the publisher. Address requests for customized reprints to IDG Media Private Limited, Geetha Building, 49, 3rd Cross, Mission Road, Bangalore - 560 027, India. IDG Media Private Limited is an IDG (International Data Group) company.

Printed and Published by Louis D’Mello on behalf of IDG Media Private Limited, Geetha Building, 49, 3rd Cross, Mission Road, Bangalore - 560 027. Editor: Louis D’Mello Printed at Manipal Press Ltd., Press Corner, Tile Factory Road, Manipal, Udupi, Karnataka - 576 104.


IDG Offices
Bangalore: Geetha Building, 49, 3rd Cross, Mission Road, Bangalore 560 027. Ph: 3053 0300, Fax: 3058 6065
Delhi: 410, Hemkunt Towers, 98, Nehru Place, New Delhi 110 019. Ph: 011-4167 4230, Fax: 4167 4233
Mumbai: 201, Madhava, Bandra Kurla Complex, Bandra (E), Mumbai 400 051. Ph: 3068 5000, Fax: 2659 2708

This index is provided as an additional service. The publisher does not assume any liabilities for errors or omissions.



trendlines: new * hot * unexpected

Business Issues: Enterprise Software Sucks

Got Issues? Enterprise software sure does. That's according to a new report from Forrester Research's principal analyst Paul Hamerman, appropriately titled Enterprise Apps Customers Have Issues. Respondents to the survey cite their top five issues:

High Cost of Ownership: 91 percent of respondents said it was a "significant" or "very significant" business problem. Forrester Comment: "We believe that the concerns related to cost of ownership are primarily due to the installations of on-premises packaged applications, where internal support requirements and vendor maintenance contracts are a significant burden to IT shops, often causing other projects to take the back seat."

Difficult Upgrades: 87 percent said it was a "significant" or "very significant" business problem. Forrester Comment: "While upgrading the packaged apps will extend their useful life and provide relief from vendor-imposed version support deadlines, the upgrade process itself is often disruptive and expensive."

Poor Cross-Functional Processes: 86 percent said it was a "significant" or "very significant" business problem. Forrester Comment: "The issue results from the fact that enterprise applications have been designed and implemented as functional modules, whereas the real end-to-end processes span multiple business functions."

Doesn't Match What the Business Requires: 80 percent said it was a "significant" or "very significant" business problem. Forrester Comment: "While packaged applications are mature in many of the core ERP areas (such as finance, procurement and HR), most customers still find gaps that must be addressed via customization or workarounds."

Inflexibility Limits Process Change: 75 percent said it was a "significant" or "very significant" business problem. Forrester Comment: "Inflexibility tends to be more acute in older legacy packages, as well as modern packages that have technically complex tools for workflow and business rules configuration."

— Thomas Wailgum

Quick take

Senthil Nathan on BPR Driven by the need to infuse efficiency and effectiveness in their business processes, CIOs are focusing on business process re-engineering (BPR). Sneha Jha spoke to Senthil Nathan, DGM–IS, CavinKare, about his BPR strategy.

Imaging by MM Shanith

strategy

Why did you embark on a BPR initiative? CavinKare had acquired businesses in diversified sectors, and as a result its business processes were becoming complex and inefficient. BPR helped us solve this problem. We could do away with ineffective and cumbersome business processes, thereby achieving performance improvement. It also enhanced our levels of customer satisfaction and service deliverables. How did you deal with change management? We planned the whole process in advance. We involved all the people that would be affected by the changes. We made sure that the changes were realistic, achievable


and measurable. Also, having a clear vision of what we wanted to achieve with this change helped us in the long run. What were the benefits of the BPR exercise? Enterprise integration led to process improvement and several activities were combined into one. The number of steps in a process has been reduced, and hence checks and controls have been eliminated. BPR also helped us establish a single point of contact.

Senthil Nathan

How can CIOs ensure the success of a BPR project? A successful BPR project needs to be top-down, taking in the complete organization, and end-to-end processes. It needs to be supported by tools that make processes easy to track and analyze. A step-by-step analysis with time frames that determine probable outcomes and alternate solutions is a must. The project has to be reviewed at regular intervals. And the entire process should be documented for future reference.



How Will the 3G Spectrum Benefit Enterprises? As telecom operators scramble for 3G spectrum and enterprises explore its revenue potential, the unprecedented hype surrounding 3G only intensifies. Sneha Jha spoke to your peers about 3G’s enterprise benefits. Here’s what they said:

Mobility

trendlines

“Enterprises will expect better bandwidth and faster downloads for business apps. CRM will be more productive on a 3G platform, improving productivity and decision-making.” Satej Revankar, AVP-IT, NITCO Tiles

"With 3G, video conferencing will be an online facility. This will boost connectivity with business. We will see something called a roaming office, based on the principle of anytime, anywhere access." JaY JaY Yant ant Magar Cio & GM-it, Mahindra Navistar Automotives

“It will revolutionize communication in terms of data speed and security. Also, the possibility of using both voice and data simultaneously will enable the use of enterprise apps on the go.” R. Chandran, CIO, Daimler India Commercial Vehicles

Lend Your Voice: Write to editor@cio.in


The 'Big Four' IT Apps

survey | Every company or organization has a wide range of software in its application portfolio. But some enterprises carry a little more software bloat than others. Any CIO will tell you that a handful of prized apps are core to the company. For many, the ‘big four’ are the most mission-critical and the most expensive and complicated: ERP, CRM, BI and supply chain applications. A CIO magazine survey of 405 IT leaders regarding their technology priorities offers an instructive look at the various cycles of corporate application needs as well as short- and long-term IT strategies for those apps.

First off, it's clear from looking at the survey data that CIOs and their business governance teams have a lot on their plates right now. Their technology radar screens are crowded by cloud computing, business process management, mobile and wireless, data management and security concerns. CIOs are being pulled in two different directions right now. Technology decision-makers are split between growth and innovation, and cost containment when it comes to the primary focus for IT investments this year, notes the survey report. Nearly half (49 percent) plan to focus on enabling business process innovation (29 percent) and creating top-line revenue growth (20 percent) while an equal number are focused on lowering business operations costs (26 percent) and managing the IT infrastructure more efficiently (23 percent).

CIO Tech Priorities: A majority of respondents say they are actively researching BI apps in their organizations. BI: 35 percent; CRM: 21 percent; ERP: 16 percent; supply chain: 16 percent; others: 12 percent.

In terms of the ‘big four’ business applications, the survey data reveals that BI is top of mind: 35 percent of the respondents are "actively researching" BI apps right now. That's not surprising: IT faces more pressure than ever to deliver actionable BI data to the business. As for the other three core software areas, the "actively researching" survey data is less impressive: Just 21 percent are doing that with CRM, 16 percent with ERP, and 16 percent with supply chain management applications. That's not to say those three areas are any less important to companies today, however. They're just more mature types of enterprise software. In fact, the data shows that more than 50 percent of the respondents are piloting, in production or upgrading their CRM and ERP application suites right now. — Thomas Wailgum



enterprise apps | trendlines

How Not to Build a Cloud
Despite rampant interest, few battle-tested best-practice guides exist for hybrid internal/external cloud networks. Here are six starting points to keep in mind.

Don't Centralize Too Much
Using IT services concentrated in datacenters is the whole point of cloud computing and virtualization. But keep your eye on end user response times, says Vince DiMemmo, general manager of cloud and IT services at Equinix, which essentially provides the platform and infrastructure services on which other companies build their own. The lag between when a user presses a key and the response from a server that could be anywhere in the world could spell life or death for cloud computing projects, he says. "If the user experience isn't fast enough, VDI services won't be accepted, just as SaaS wouldn't be," says Mark Bowker, analyst at Enterprise Strategy Group.

Don't Forget About the Hardware
Virtualization and cloud computing are supposed to make hardware invisible, but that doesn't mean the people providing servers and storage can cut corners with servers that are less powerful or even I/O that restricts the flow of data, according to Gordon Haff, high-performance computing analyst at Illuminata. "Every cycle a command is waiting in a queue or to get through the [server or storage I/O bus] just adds more latency," Haff says. "The faster, more powerful the servers are the shorter the latency to the user, whether they're in the cloud or in a more traditional datacenter."

Watch Out for Legacy Issues
"Most legacy applications weren't designed to run in clouds or other elastic environments," according to Steve Yaskin, CTO and founder of Queplix, which makes tools to port legacy apps to clouds. "All the data on one person could be spread around three or four databases — transactions in one, addresses in another." Metadata catalogs from Queplix or Springsource, which was acquired by VMware, can reduce the number of times an application has to assemble data from multiple sources. Good caching can keep frequently used data available, and structuring storage behind the apps according to how frequently it's used will also drastically cut response time, Yaskin says.
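The caching and frequency-based tiering described above can be sketched in a few lines. The snippet below is a minimal, hypothetical Python illustration; the class name, thresholds and the load_from_backend callback are our own inventions, not part of Queplix, SpringSource or any other product mentioned here.

```python
from collections import Counter, OrderedDict

class HotDataCache:
    """Toy illustration: keep recently used records in memory and
    report which keys are accessed often enough to justify faster storage."""

    def __init__(self, capacity=1000, hot_threshold=5):
        self.capacity = capacity            # max records held in memory
        self.hot_threshold = hot_threshold  # accesses before a key counts as "hot"
        self.cache = OrderedDict()          # LRU-ordered in-memory copies
        self.hits = Counter()               # access frequency per key

    def get(self, key, load_from_backend):
        self.hits[key] += 1
        if key in self.cache:
            self.cache.move_to_end(key)     # mark as most recently used
            return self.cache[key]
        value = load_from_backend(key)      # slow path: fetch from the backing store
        self.cache[key] = value
        if len(self.cache) > self.capacity:
            self.cache.popitem(last=False)  # evict the least recently used record
        return value

    def hot_keys(self):
        # Candidates for the fastest storage tier, by observed access frequency.
        return [k for k, n in self.hits.most_common() if n >= self.hot_threshold]
```

The point is simply that access counts gathered on the hot path can later drive decisions about which data deserves the fastest storage tier.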

Don't Let Your Software Get Chatty
Making the network and the servers race only covers two-thirds of the latency pool, DiMemmo says. Many cloud-based applications use standard browsers, rather than interfaces designed for fast-response across the WAN or interfaces at both client and server that size their packets and limit administrative chatter between the two to keep performance as high as possible. "A lot of those APIs are pretty chatty," DiMemmo says. "They have to carve the message up three, four, five times and each of them add 40 or 50 milliseconds. We used to think if we got a 150-millisecond round trip from user to the server, that was pretty good. And it would be if you were only doing it once. These are doing it over and over for every communication."
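To make that arithmetic concrete, here is a small, hypothetical Python sketch using the figures quoted above, roughly 40 to 50 milliseconds per extra message fragment on top of a 150-millisecond round trip. The constants and the ten-calls-per-screen example are illustrative assumptions, not measurements.

```python
# Rough latency-budget arithmetic for a "chatty" cloud API, using DiMemmo's figures.
ROUND_TRIP_MS = 150     # baseline user-to-server round trip
PER_FRAGMENT_MS = 45    # midpoint of the 40-50 ms added per extra fragment

def interaction_latency_ms(fragments: int) -> float:
    """Latency for one interaction when the API carves the message into
    `fragments` pieces, each adding its own 40-50 ms of overhead."""
    return ROUND_TRIP_MS + fragments * PER_FRAGMENT_MS

if __name__ == "__main__":
    for fragments in (1, 3, 5):
        print(f"{fragments} fragment(s): ~{interaction_latency_ms(fragments):.0f} ms per interaction")
    # With 5 fragments, each interaction costs ~375 ms instead of ~150 ms, and a
    # screen that makes 10 such calls spends close to 4 seconds just on chatter.
```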

—By Kevin Fogarty

Going Beyond IT

The CIO role is no longer synonymous with tech only. Today's CIOs are taking up different types of responsibilities — depending on what type of IT leader they are.

                         Type of CIO
Responsibility           Functional    Transformational    Business Strategist
Security                 27%           32%                 23%
Strategy                 11%           29%                 34%
Corporate Operations     14%           18%                 27%
Risk Management          14%           18%                 12%
Customer Service          8%           17%                 30%

Respondents chose all that apply.
Source: CIO Research



social networking | trendlines

Social Media Means Business
Mention a phrase with the word social in it, and many CIOs will cringe. But whether you like it or not, there's no denying social media's presence in the enterprise. Contrary to what some IT stalwarts may think, many employees are using it for good, Forrester Research finds in a new report. Here are four steps that’ll help.

1. Understand the people you're engaging. Profile your employees and any participating customers to understand your demographics and gauge how active they already are in social media. Forrester cites that usually, the top types of users at any given business are spectators, joiners, critics and conversationalists.

2. Determine your business objectives. Try focusing your objectives on innovation (such as a video-posting site where employees are encouraged to submit ideas), collaboration (project groups where members can IM each other, post ideas to a wiki and share documents), learning (like a Facebook site where employees can search based on expertise) etcetera.

3. Develop an implementation strategy. Refer back to your business objective and determine how you can achieve that. Social strategy should revolve around determining how you want to change the relationships between people in the social ecosystem, the report says. "By focusing on the relationships between the people in the community, and not the technology, CIOs can keep an eye on the long-term changes that matter," the report says. You should also consider how you will measure the success of your rollout. "A collaboration initiative might identify productivity or cost measures such as revenue per employee or revenue generated from new initiatives, whereas a customer support project might measure the average resolution time," the report says.

4. Select and deploy appropriate technologies. Aside from researching platforms to determine which system you'll manage, this step also includes devising a social media policy: determining which employees are allowed to access parts of the site, and what can be shared.

—By Kristin Burnham

internet

5 Things the Internet Has Ruined
For some people, the internet is a killer app — literally. Here are five things the Net is making virtually extinct.

1. Trust in Encyclopedias When we were kids, if something was in the Encyclopedia Britannica, it was true. Now — thanks to Wikipedia — having ‘encyclopedic knowledge’ of a topic isn't as impressive when there's a good chance most of what you think you know was concocted by a 12-year-old. After a 2005 study by the British journal Nature showed Britannica and Wikipedia to be equally inaccurate, faith in all encyclopedias plummeted.

2. Bar Arguments It used to be you could kill many hours and even more brain cells drinking beer and arguing over arcane trivia. Now whenever there's a question of fact, somebody just whips out a smartphone and does a search on Google. Where's the fun in that?

3. True Expertise Before the Web, if you wanted to call yourself an expert, you usually needed expertise in some field. Now all you need is a blog and sufficient quantities of chutzpah. For example, in a recent survey by PR Week, 52 percent of bloggers call themselves ‘journalists’. Because calling yourself a ‘typist' isn't nearly as impressive.

4. Gud Spellng You can blame the rise of texting as much as Twitter for the death of the King's English, though ‘relaxed’ standards for bloggers have also played a role. Will the last copy editor left standing please turn off the lites — er, lights?

5. Celebrity In the old days you had to be good-looking or talented to become famous. Now, thanks to reality TV and social media, the fatter and more demented you are, the better your chances of becoming a household name. For example: your last 17 movies may have totally sucked, but if you have over 1.6 million followers on Twitter, who gives a damn? (Guess who we are talking about.) —By Dan Tynan


Illustration by Unnikrishnan AV


It’s Raining Chips

trendlines

research | The global chip industry will rebound sharply from the global recession and post 20 percent year-on-year growth in 2010, market research firm Gartner said. The growth forecast is slightly higher than others have predicted but is in step with bullish forecasts for the global semiconductor industry this year. Strong demand for a variety of chips has contributed to strong earnings growth at chip makers from microprocessor giant Intel to Samsung Electronics. Strong growth in PCs and memory chips will be primary drivers for semiconductor revenue to reach US$276 billion (about Rs 12,42,000 crore) this year, up from $231 billion (about Rs 10,39,500 crore) last year, Gartner said in a statement. Worldwide chip revenue declined 9.6 percent last year. "We have seen clear evidence that the semiconductor industry is poised for strong growth in 2010," said Gartner analyst Bryan Lewis, in the statement.
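A quick back-of-the-envelope check of those figures, as a minimal Python sketch: the growth rate follows directly from the two revenue numbers, and the rupee conversions in the article imply an exchange rate of roughly Rs 45 to the US dollar. That rate is an inference on our part, not a figure Gartner quotes.

```python
# Sanity check of the Gartner figures quoted above.
revenue_2009_usd_bn = 231
revenue_2010_usd_bn = 276
assumed_inr_per_usd = 45   # assumption implied by the article's rupee conversions

growth = (revenue_2010_usd_bn / revenue_2009_usd_bn - 1) * 100
print(f"Year-on-year growth: {growth:.1f}%")   # ~19.5%, consistent with the ~20% forecast

def usd_bn_to_crore(usd_bn: float, rate: float = assumed_inr_per_usd) -> float:
    """Convert billions of US dollars to crore rupees (1 crore = 10 million)."""
    return usd_bn * 1e9 * rate / 1e7

print(f"2010 forecast: ~Rs {usd_bn_to_crore(revenue_2010_usd_bn):,.0f} crore")  # ~12,42,000 crore
print(f"2009 revenue:  ~Rs {usd_bn_to_crore(revenue_2009_usd_bn):,.0f} crore")  # ~10,39,500 crore
```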

Gartner expects 20 percent more PCs to be produced this year than in 2009, a major boon for chip makers. DRAM makers will benefit from a 55 percent rise in revenue this year, making the chip segment the fastest growing "by far," the market researcher said. Most DRAM chips are used in PCs. The full-year chip industry growth target is slightly higher than that of in-house forecasters at Taiwan Semiconductor Manufacturing (TSMC), the world's largest contract chip maker. TSMC recently said it expects the global semiconductor market to grow 18 percent this year, after contracting 9 percent last year, due mainly to strong PC and mobile phone sales. The company plans to spend a historic $5 billion (about Rs 22,500 crore) on new factories and production lines this year to keep pace with fast chip industry growth, and to make up for slower spending during the recession. Samsung, the world's biggest memory chip maker, predicted last month that strong PC sales will raise chip revenue 10 percent to 20 percent this year and that prices of DRAM and NAND flash memory chips will remain strong. Gartner warned that a correction might be needed for the chip industry in the near term in order to head off an inventory glut. The researcher said that according to the data it tracks, there is a need to re-balance chip sales with system sales.

— By Dan Nystedt

innovation | And Solar Power Just Got Cheaper

IBM has found what it claims to be a less costly way to build solar cells. The company's researchers have cooked up prototype photovoltaic cells from commonly found materials, yet the cells can rival the power-conversion efficiency of commercially available solar cells built from rarer, and more expensive, materials, they claim. The prototype solar cells demonstrated a power-conversion efficiency of 9.6 percent, a 40 percent gain in efficiency from solar cells built by others using the same or similar widely available materials, said David Mitzi, who leads the research team at IBM Research. In this case, power-conversion efficiency is defined as the percentage of solar radiation energy that gets converted into an electrical charge via the photovoltaic effect. Overall, the sun radiates 2.6 gigawatts of energy per square mile to Earth, research firm WinterGreen Research has estimated. Yet, according to IBM, solar cells contribute less than .01 percent of the world's power supply. Part of the challenge is producing sun-energy-collecting cells inexpensively enough to compete with other means of energy production, Mitzi said. There are two ways to do this: lower the costs of the cells or improve their power-conversion efficiencies.


Today, commercial solar cells have been capable of achieving a power-conversion efficiency of about 9 percent to 11 percent, though they are based on rare, expensive-to-procure elements such as tellurium and indium. The researchers went with the thin-film method of making the cells. Typically, the approach involves creating a solution by dissolving all the elements in hydrazine. The solution is then applied to a substrate and heated, allowing a film to form. The challenge with this approach is that zinc is not soluble in hydrazine. So the researchers just put zinc particles in with the solution anyway. This turned out to be a better approach because the zinc particles acted as a stress reliever for the film, allowing a thicker film to be made in one deposition cycle without cracking or peeling. Another cost-saving bonus to this work is that it uses a fabrication process that can more easily be duplicated in production settings. "You can create on a simple substrate many cells in a series, so this is a very valuable approach for making lower-cost and larger-area arrays," Mitzi said. — By Joab Jackson
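For context, a small back-of-the-envelope calculation of our own, only as good as the rounded figures above: a 9.6 percent cell that represents a 40 percent gain over earlier cells made from the same class of materials implies those earlier cells converted roughly 6.9 percent of incoming solar energy.

```python
# Implied efficiency of the earlier same-material cells, from the article's figures.
ibm_prototype_efficiency = 9.6   # percent, from the article
relative_gain = 0.40             # the 40 percent improvement cited

previous_same_material_efficiency = ibm_prototype_efficiency / (1 + relative_gain)
print(f"Implied efficiency of earlier same-material cells: ~{previous_same_material_efficiency:.1f}%")

commercial_range = (9.0, 11.0)   # percent, rare-element commercial cells cited in the article
print(f"Commercial rare-element cells: {commercial_range[0]:.0f}-{commercial_range[1]:.0f}%")
```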



Alternative Views | By Priyanka

Remote Infrastructure Management Ayes vs Nays

Remote infrastructure management helps cut costs. There is a direct benefit of not having to spend on employees, etcetera. N. Jayakumar Head-IT Infrastructure, Reliance Capital

Photos by Srivatsa Shandilya

trendlines

More and more organizations are moving towards remote infrastructure management, simply because it helps CIOs concentrate more on the business' needs. Otherwise, they would have to get into the whole process of recruiting people, managing their operational issues, and developing the skill sets required to manage them. We don’t have to deal with all these issues as our data is being remotely managed from Mysore. And it certainly has its benefits. It helps in cutting costs for the company. There is a direct benefit of not having to spend on employees, real-estate, and other mundane bills. It also helps in increasing efficiency and productivity. Also, managing IT in-house is particularly difficult for huge companies with massive infrastructure and country-wide operations. Our users and customers expect 24/7 services. And in today’s competitive environment, one needs to have server and network support at all times. Finally, it can be quite a tedious job to manage the growing demands of customers. Remote infrastructure management is an easy way to manage all this and more. We would rather have it outsourced and managed by somebody who we think is an expert.


Training employees and infrastructure cost is not an issue because we have a highly skilled IT support team. S.S. Soni Executive Director (IS), Indian Oil

In our case, though we have a centralized IT support and infrastructure, it is not managed remotely. We have a separate team that looks into all matters pertaining to the IT infrastructure. This team physically accesses and manages issues related to it. And the people on this team are actually our own employees. We don’t feel the need to let outsourcers remotely manage our IT infrastructure for us. Training employees and cost is not an issue because we have a highly skilled IT support team. And hence the question of managing IT remotely does not arise. And most importantly, we run mission-critical transactions 24 hours a day. We cannot afford any halt in our transactions at any point in time throughout the year. For instance, our supply chain transactions are of mission-critical importance to us and they must operate 24/7. So, I believe that not going for a remotely managed infrastructure is a practical decision. Sure, downtime is always an issue, whether your IT infrastructure is in-house or outsourced. But, in case of a disaster, we do have a business continuity cell in Bangalore. Hence, it is only in extreme conditions that we switch over to a remotely managed IT infrastructure.



Chris Curran

Strategic CIO

The Odd Couple Organizations still have plenty of work to do in building a strong business and IT coalition. Here's a look at the state of the CEO-CIO union from the front lines.

Illustration by MM Shanith

While they are not Felix Unger and Oscar Madison (comic characters from a play about two mismatched roommates that spawned a successful television series and a movie), a company's CEO and CIO can at times make a fairly odd couple. Differing agendas create significant challenges from the outset. Sure they do share a love-hate relationship (though not as intense as the CIO-CFO one) but at the same time, we all understand it is critical for the CIO to engage the CEO and senior business leaders in discussions of IT investments. Considering those somewhat contradictory points, what exactly is the state of the union between business and IT leaders?

Gone are the days when IT used to be a support function and was identified by the ‘cost-center’ tag. Today, it has changed the way business does business and has rightly earned the ‘profit-center’ label. But there is still a wide gap between business’ perception of IT and what it really is. A few years ago, Diamond Management & Technology Consultants launched a broad annual study of various business leaders' 'Digital IQ'. Through the Diamond Digital IQ research, we seek insights into the challenges companies face associated with connecting the enterprise's strategic objectives with the actual business value — which is often reached several years after the big ideas are hatched. The 592 survey respondents this year comprise equal parts business leaders and IT leaders for purposes of balance. The survey covers an array of issues, ranging from attitudes about IT's contribution to corporate competitiveness, to business/IT alignment, to IT management practices. The industries




included in the survey are large or very large companies in banking, financial services, insurance, and consumer products, among others. Some of the most instructive responses in Digital IQ 2010 relate to the senior business executive support for IT. Here are some observations from the survey.

CEOs and Tech Savvy
Observation 1: Our CEO or senior-most business leader is an active champion in the use of information technology to improve our business. The promise of a fully integrated organization where there are no formal 'business' and 'IT' distinctions must begin at the top. The business has to understand that IT not only provides it with a competitive edge but also dictates business opportunities in the future. So, IT’s capability must be viewed by all business leadership as both a driver of growth and a tool to improve efficiency. While 64 percent of respondents agree with this statement, I find it incredible that the figure is not in the 80 percent to 90 percent range. I was reminded by the CIO of an oil & gas company that "IT's role is not always strategic, and that's ok." In this case, I would expect the CEO to still be an active champion of IT to improve the business operations and efficiency (and the CIO to work hard to develop a broader perspective of IT's potential).

CIOs and Business Sense
Observation 2: Our CIO is very involved in the business strategy development process. Participants' responses here provide further insight into how much senior management teams buy into the importance of IT. Only 54 percent of respondents agree with this statement, which makes you wonder what the other 46 percent are doing. An insurance executive told me a story of a claims initiative that some colleagues in 'the business' presented to him, and which was later approved. The initiative involved the use of images, video, and audio to better understand the claims so that experts could do more of the reviews and QA remotely. Late in the project, one of the managers came back to him and admitted a big mistake that would cost them several million dollars. Apparently, they forgot to estimate any storage for all of the digital media.

Observation 3: Our CIO is recognized as a business leader, not just as a leader of IT. Less than half the survey participants say the CIO is recognized as a business leader. This is mind-boggling — is the head of HR a business leader? Maybe he or she is just a 'people leader', not a business leader. This is Exhibit A for why the business-IT line of demarcation needs to be erased once and for all. I believe the way a CIO allocates time is directly related to business' perception of the CIO.


Observation 4: The CIO lacks productive working relationships with the business leaders. Forty-seven percent say they have neutral or negative perceptions of the CIO-business working relationship. This makes me wonder which among the following might be going through respondents' heads as they answer this question: "This is a problem and I should do something about it." "The problem is on the other side. When will the business (or IT) get its act together?" Is the onus solely on the CIO to develop a good working relationship? This reminds me of my mom saying "it takes two to tango" when I would get in a fight with my sister.

Leap of Faith
Observation 5: Business executives are very confident in the company's IT capabilities. Half of our respondents think the business leaders have either neutral or negative attitudes in terms of IT's capabilities. My colleagues, Peter Weill and Jeanne Ross at MIT's Center for Information Systems Research, believe that service delivery is the basis for everything else. Before worrying about improving program management or executive dashboards, make sure that the basic IT services are in order. The CIO at a beverage company, for example, was having significant support issues with her peers in the business units. As a result, she set up a 'concierge' service specifically to deal with their needs and questions. This may seem extreme, but it reduced the noise and improved buy-in. The value gained from IT in an organization depends on everyone's ability to understand it and access it. The attitude and culture required to embrace IT starts at the top. These initial Digital IQ results show us that 'bi-partisanship' between business and IT could still use a boost as well. CIO

Chris Curran is Diamond Management & Technology Consultants' Chief Technology Officer and managing partner of the firm's technology practice. Send feedback on this column to editor@cio.in



Cover Story | M&A

V.S. Parthasarathy, Group CIO & EVP Finance and M&A, M&M, says that the first 100 days are the most important in an M&A.




Twin Power
By Priyanka

Reader ROI:
The benefits of involving the CIO in pre-deal talks
What hard decisions a CIO needs to take in an M&A
Why the first 100 days are crucial

When Mahindra & Mahindra set out to make a foray into India’s burgeoning two-wheeler market, there were heated discussions within the company’s top management on how to go about it. The debate centered around the choice to launch a new set-up from ground up — the green approach — or to buy an existing entity and plunge straight into the market — the brown approach. “We wanted a way that would link our business strategy with the final output we had set for ourselves,” says V.S. Parthasarathy, group CIO and EVP Finance and M&A at Mahindra & Mahindra. “The goal was to get into the two-wheeler market.” The Rs 14,983-crore auto giant’s decision to take an M&A route and buy out a well-known two-wheeler player isn’t unusual — anymore. As the upturn infuses more capital into Indian companies hungry for growth, many are turning to an M&A strategy to consolidate their positions and further their hegemony.


Photo by Fotocorp/Shailesh

M&M’s acquisition of Kinetic Motors beat the 70 percent failure rate associated with M&As. Its success came on the back of hard decisions and some unusual strategies — including the role of its CIO.



“There is no doubt that M&A activity is going to go up this year,” says Arsh Maini, director, Deloitte, whose firm has handled many high-profile M&As. “In my line of business, we’ve already begun to see an increase in M&As.” But as a strategy for growth, M&As are among the most risky business propositions for an enterprise and have a track record for failure. Research from various organizations, and over many years, points to the same thing: between 60 and 70 percent of M&As crash and burn and, worse, bring only what M&A specialists call negative synergies. The fact is M&As are the business equivalent of heart transplants and there are scores of variables that go into pulling one off successfully. One of them is IT. Though traditionally, IT leaders have taken a back seat in M&As, especially pre-deal talks, that’s beginning to change. Company honchos and CEOs are increasingly seeing the importance of bringing their CIOs to the M&A table earlier and making them part of the core deal.


Turbocharged M&As
If few CIOs pay any real attention to honing their M&A skills, they can’t be blamed; for the most part M&As aren’t common among Indian enterprises. But that’s less and less true. According to Ernst and Young, which consults on among the highest number of M&As in India, “about 54 percent of (Indian) businesses state they are likely or are highly likely to acquire other companies in the next 12 months, almost double the number from six months ago.” And another study by Chennai-based Venture Intelligence shows how those plans

are beginning to pan out. According to the analyst house which tracks private equity, venture capital and M&As in India, M&As in the first quarter of 2010 increased by 76 percent over the same period last year. There are many reasons why enterprises turn to M&As. Among the most common are to consolidate their position as market leaders by buying out their competition, like Mittal Steel’s buyout of Arcelor to create the world’s largest steel empire. They also leverage M&As to enter a new — yet adjacent — sector to improve their offering, like M&M’s takeover of Kinetic Motors. Then there are the Crompton Greaves of the world. In the last five years, Crompton Greaves has grown from Rs 4,367 crore to Rs 8,737 crore on the back of 19 acquisitions. J. Ramesh, CIO, Crompton Greaves, says the acquisition decisions were taken solely to augment the company’s brand value and in order to be seen as a larger MNC. “Our acquisitions have helped us to become more

Source: Forrester Report: A CIO’s Guide to Merger And Acquisition Planning by George Lawrie (Principal Analyst)


“It is very important to make sure that a CIO is brought into an M&A discussion even before a deal is finalized because IT is likely to be a key driver in realizing the value that companies expect from an M&A,” says Maini. “If you don’t bring IT in, you are likely to harm your chances of realizing the value of an M&A.”



competitive and grab domestic and export orders against stiff competition from global majors. They have brought more to our core competence in the power systems business and have brought more export earnings than any other segment,” he says. A key learning from companies who have gone down the M&A route is the importance of IT. And that’s more true in M&As in the BFSI and telecom sectors. “A failure to integrate the IT systems here directly leads to a business or a merger failure,” says Sivarama Krishnan, executive director & partner, PwC. “And if they don’t integrate the two systems, there will be no cost reduction and no increase in revenue from customers,” he says. In India, he says, Bharti Airtel’s acquisitions of companies such as Skycell and Hexacom were successful primarily because their IT and network were integrated well. And that’s where the CIO comes in. “A CIO can definitely act as an enabler in leveraging cost optimization because they can give insights to the management that can then help leverage cost optimization,” he says. M&M’s takeover of Kinetic Motors is a great example, but it called for some hard decisions.

You’re the Target

Three ways to prepare in case your company is acquired. A merger or acquisition takes two. And any company can be an acquisition target. Even if you don’t see an acquisition coming your way today, it doesn’t mean you’re in the clear, says Albert Eng, former senior advisor with Cerberus Capital Management. You can increase the chances that your IT department (and maybe you) will survive when another company has yours in its sights if you’re prepared. Here’s how Eng suggests you get ready for integrating your company into another.

Run the numbers: The acquiring company is going to want to assess your IT organization, says Eng. If you’re prepared, structuring the deal will make the process more efficient and support a more accurate valuation of your company. Document and refresh your IT strategy, systems architecture, projects under development, analysis of past projects and, of course, IT costs. Eng wants to see long-term and detailed budgets in order to know how well the IT organization is performing. Most importantly, make sure you can show quantitatively how your department adds business value. “Most IT leaders falter at this part of the process because they don’t have the experience to align costs with business value appropriately,” he says.

Understand all your assets and intangibles: When you’re joining two businesses, you’re also integrating two groups of people. It’s not enough to understand the capacity of your systems — you have to know the strengths and weaknesses of your employees, too — and be able to communicate them to the acquirer.

Get rid of bad habits: When being acquired, it’s an opportunity to look within your IT department for inhibitors to a cost-effective integration. Obsolete or undocumented technology, poor management of your IT budget and inflexible workers can hurt your chances at a successful merge.

—Jarina D’Auria

M&M’s M&A
In the closing months of 2007, several weeks before a deal was brokered between Kinetic Motors and M&M, Kinetic Motors wasn’t the company it once was. It had been about nine years since its partner, Honda, whose designs it had used for over a decade, pulled out of the partnership. As a result, the two-wheeler manufacturer that had given India the Luna, and whose bike, the Kinetic Honda, had become synonymous with all non-geared scooters, was flailing. The broken partnership left Kinetic Motors with no other option but to create their own set of designs. “That was costing them too much,” says Krishnan. What they needed was a partner who would be willing to support and invest in R&D. About 100 km away, but at about the same time, M&M, whose brand was associated with tractors and light commercial vehicles, wanted to enter the two-wheeler market. “A two-wheeler is the first thing a person looks for when it comes to personal mobility, so we


decided we needed to be in the space,” says M&M’s Parthasarathy. The acquisition was a mutually beneficial one, for both Kinetic Motors and M&M. While Kinetic Motors found a company that could help it lower R&D costs, M&M found a perfect way to enter a market that was hot. “The two-wheeler industry in India has been on a tremendous growth path. But in many segments of the two-wheeler sector, there is essentially a duopoly. After doing market research and finding out if growth in this segment was sustainable, we saw that there was room for another player to add value,” says Anoop Mathur, president, Mahindra Two Wheeler Sector. “If we had taken the green-field approach, we would have lost precious time, about two-three years.” In February 2008, a few months before the media caught on to the buyout rumors, pre-deal talks between the companies took off. Like most pre-deal talks, they were kept very quiet. To ensure things stayed under

wraps, both companies decided to keep less than 10 people involved from both sides, says Parthasarathy. And that could be the start of an M&A’s problems. While it makes sense for companies to keep M&A information on a need-to-know basis, IT leadership is excluded from that list. In a recent paper, George Lawrie, principal analyst at Forrester Research, says that “IT implications of M&As are poorly anticipated,” and suggests that CIOs should come forward and claim their right to be heard — even before the deal is finalized. Fortunately, that’s not how it worked at M&M. Parthasarathy was among the handful of people onto the deal from the beginning. That’s because, apart from being M&M’s top IT leader, he is also the automotive giant’s EVP finance and M&A. In the last two years alone, he has led 25 transactions with acquisitions ranging from Rs 200 crore to Rs 2,000 crore.



That combination of duties is rare for a CIO, but it shows M&M’s commitment to having IT leadership present for M&As. And the Mahindra Group takes their M&As seriously: They have consistently used the strategy to keep growing. In fact, the group, which recently took over Satyam, describes itself as a “federation of autonomous businesses united by a common core purpose and a set of core values,” and has a separate M&A cell. But just because he was part of an elite circle didn’t mean Parthasarathy’s worries were over. In fact, they were just beginning. “The systems at Mahindra and those at Kinetics were 180-degrees apart,” Parthasarathy would later say.


Sourav Sinha, CIO, Kingfisher Airlines, says that when his company took over Air Deccan, training staffers took time. That’s vital for CIOs to budget for, because training takes place at a point when integration fatigue could have set in.

“To me, the initial 100 days represent the most important period. We first wanted to focus on health drivers rather than growth drivers,” says Parthasarathy. “You can’t just tell [the acquired company] that we want to be on SAP. They can’t do that right at the start.”

Parthasarathy’s 100-day pre-integration plan aimed to achieve three things: To take full ownership of Kinetic Motors’ IT systems, strengthen them just enough to keep them whole, and then upgrade and move the entire system. Strengthen, lift, move.

Photo by Srivatsa Shandilya
Photo by Datta Kumbhar, Mid-Day

The 100-Day Plan
An M&A is like an arranged marriage: The couple tying the knot is rarely in charge — but they have to make it work. It’s also similar because the first 100 days after an acquisition are very crucial, says PwC’s Krishnan. He says that soon after a deal is closed there is a considerable amount of excitement and anticipation on both sides and companies need to leverage those energies to their benefit. “This is also the time when an acquiring company sets an agenda of what is going to happen in the next two to three months,” he says. “It is during this period when executives need to drive the most for change.” At M&M’s headquarters in Mumbai, Parthasarathy was aware of the importance of the first few months. But he was also faced with two important questions: What did he want to accomplish in that time frame and how fast could he push the integration without creating long-term cracks in the new relationship? Sure he could push for a quick integration and show short-term benefits, but that could also create an ecosystem that needed tweaking — or worse, patching up — every couple of months. That is why he decided instead to slide slowly into the integration. His strategy called for a 100-day pre-integration plan, followed by a 10-month integration process, timed to synchronize with a larger Mahindra Group standardization initiative called Project Harmony. His decision to trade speed for reliability is one that any CIO called into an M&A will have to face.



Although he had started the IT familiarization process at the due diligence stage, Parthasarathy’s team now focused on getting a deep understanding of Kinetic Motors’ systems. They set about studying all of its processes until they knew them as well as their own. Some of what they found echoed the reports Parthasarathy had received from the due diligence, and it didn’t thrill them. “While we had an integrated ERP, their systems were working in silos and not talking to each other,” says Parthasarathy. But the sheer exercise of reviewing every process, asset, and resource opened their eyes to parts of Kinetic Motors’ set up they wanted to keep. Take, for example, Kinetic Motors’ superior sourcing. Parthasarathy was aware of how important sourcing was to their new business. He had learnt from his due-diligence reports that Kinetic Motors was doing a reasonably good job at sourcing low-cost materials. And he chose to keep the practice even after the acquisition had taken place. That’s an approach not all companies believe in. When Kingfisher took over Air Deccan, for instance, the latter’s IT system was scrapped and Air Deccan’s operations were moved to the platform that Kingfisher used. “When I joined, these two airlines were operating on two different systems and they were quite different,” says Sourav Sinha, CIO, Kingfisher Airlines. “The system at Air Deccan was much smaller and from what I experienced in my first few days, it was not stable. It was not exposed to different travel agents and when you merge two airlines you have to operate under one code.” The approach of evaluating and sieving out the best processes, immaterial of who acquired whom, isn’t something all companies follow, and it won Parthasarathy goodwill. “These are trust-building and credibility-building measures,” says PwC’s Krishnan. “It is a good step to adopt the best practices of either of the two companies involved in an acquisition.” At the same time, Parthasarathy made sure that the team from Kinetic Motors was introduced to the systems of M&M. It was explained to them why some of M&M’s systems were better suited for the business. Take, for example, the quality check procedure at Kinetic Motors’ assembly line. Parthasarathy noticed that defects were only picked out at the final stage


of a subcomponent’s production. The new system that M&M used, on the other hand, could do that at an earlier stage of production, meaning defects could also be fixed earlier. “There has to be a proper inspection clause,” he says. “We took them to the shop floor and showed them how the new system worked,” he remembers. He adds that those who showed interest were chosen as IT ambassadors within the new organization and their job was to spread their beliefs. In the meanwhile, the IT team was also busy dealing with other issues. They needed


to update the licenses of all systems, replace old PCs with newer ones and meet all of Kinetic Motors’ IT vendors. The IT team also filled in process cracks with IT putty. For instance, Parthasarathy restructured the two-wheeler manufacturer’s stock so that their systems could work more efficiently. “We were not making major investments,” says Parthasarathy. “We were only revamping their processes and making incremental investments.” But perhaps the most important achievement of the 100-day plan was creating a sub-text. In The Importance of Leadership and Culture to M&A Success, Towers Perrin’s Global Change Management Leader, Mark Arian, says “experience shows that what leaders do during M&As has a significant impact on how employees of both organizations react and

promote a sense of community and purpose. During periods of transition and disruption, employees look first to leaders for guidance about how to react and behave, for motivation, and for focus.” Over the first 100 days, the new team at Kinetic Motors learnt first-hand M&M’s belief in collaboration and co-operation — and that they were looking for what was best for the new entity. That built the foundation of trust, which would go a long way in negating the number one reason most M&As fail: a clash of cultures. But would that be enough?

Driving For Two

The day of reckoning rolled around on November 16, 2008. Twenty-five M&M executives from various departments were sent into marathon talks with their counterparts at Kinetic Motors and bankers to seal the Rs 110-crore deal. "I was being given a second-by-second report of the talks," Parthasarathy recalls. "We started at 8.00 am and were at it until 2.30 the next morning." But before the ink on the deal had dried, Parthasarathy was already looking ahead. With many months of integration ahead of them, he knew he couldn't fritter away the energy and goodwill he had earned. He moved quickly to get organized, starting with tasking the core team, comprising the functional heads of sales, marketing, sourcing, manufacturing and so on, with creating a blueprint for the integration. He also formed a team of key users, made up of the people who were likely to be the biggest users of the ERP.

The core team, which had already spent a considerable amount of time reviewing Kinetic Motors' systems, spent another two to three months creating the blueprint. The process took that long because Parthasarathy wanted to be thorough. He wanted the new entity, Mahindra Two Wheelers, to adopt a system that would help it take a quantum leap and become a market leader in design, manufacturing and sales. And for that he needed to give people the time to debate each process in the blueprint and implement what the group considered the best solution. But Parthasarathy's open policy came with a price: by opening up the debate and putting collaboration center stage, it also became his job to ensure that the minority bought completely into whatever the majority decided, which meant change management.

It wasn't always easy. "Most people at Kinetic were obviously very comfortable using their systems," recalls Parthasarathy. For instance, they had deployed a home-grown package for financing and inventory management. "They wanted to go on with those systems only, because they found them very flexible." Those sentiments are only to be expected, says Forrester's Lawrie. "The prospect of standardizing the IT stack across the new organization inevitably raises dismay or even hostility in the organization to be changed," he writes. "Line-of-business end users often have a strong emotional and professional investment in their apps."

Parthasarathy and his team employed a two-pronged approach to convince them to change their point of view. First, they showed them the limitations of Kinetic Motors' systems, including their inability to cope with an increase in volumes or to operate from different locations concurrently. Parthasarathy also exposed them to the learnings of other Mahindra Group companies. Mahindra's retail arm (better known for its Mom & Me stores), for example, had also worked on a home-grown system before moving to an integrated ERP. "At that point, they accepted. Reluctantly," remembers Parthasarathy. "They must have thought, 'Mahindra is now the big boss, how long can we keep saying no?'" That reluctance disappeared thanks to consistent change management practices. Just before going live in March this year, Parthasarathy remembers meeting the employees of Mahindra Two Wheelers and asking them if, in retrospect, he had done the right thing by getting them to accept the changes he had suggested. "And they all said that it was indeed the right thing to do," says Parthasarathy.

The change management exercises stretched to other areas as well, including system access. Earlier, all of Kinetic Motors' systems were physically located at the plant; now, they needed to sit at M&M's corporate office in Mumbai, which meant staffers could only reach them over a WAN or LAN connection. "It was a major concern for them because now they could not physically access their systems," he says. Parthasarathy got around the problem using a combination of technology and good, old-fashioned communication. He laid a dedicated network between the Kinetic Motors plant in Pithampur (also known as the Detroit of India) and the Mumbai corporate office, and held detailed discussions.

[Photo caption: In a trade-off between immediate gains and long-term synergies, V.S. Parthasarathy, Group CIO & EVP Finance and M&A, M&M, made the hard choice of taking it slow.]


Working in Tandem

With much of the groundwork already behind them, it was time to put the final piece of the puzzle in place to ensure the success of the integration: training. "We had experienced some resistance when the new ERP was being tested," remembers Parthasarathy. If he didn't want that to turn into a full-scale mutiny as he got more users involved with the new IT systems, he knew that they had to train people intensively — even if it increased the integration's costs.



So, 20 members of the core team, 50 key users, IT training experts from the corporate office of Mahindra in Mumbai, and a few external trainers were taken to the Pithampur plant. They were divided into small groups and familiarized with the new ERP and how it could be utilized across business verticals. The sessions, he says, which took place over six weeks, were interactive, with staffers being asked to do transactions on the spot so that they could get the hang of the new systems. The training for each batch, which lasted a couple of days, must have added costs but proved to be a worthwhile investment. That's a lesson that organizations with successful M&A track records have learnt: as banal as training is, it's critical, and IT leaders need to be ready for a long-drawn affair. "Training plays a vital role," says PwC's Krishnan. Take the Kingfisher and Air Deccan merger, for instance. As different as it was from the M&M-Kinetic Motors M&A, Sinha applied the same training lessons. "[Training] is not a single-day affair," he says. And it was just as intensive and probably cost a lot. Like M&M, Sinha transported both Kingfisher and former Air Deccan employees, only in his case he moved them to airports where Kingfisher didn't have operations, so that they didn't get in the way. And at airports where Kingfisher flew from, he says, training was a "bit go-easy" and staff took two to three weeks to get the hang of the new systems.

M&A: Masters In Acquisition

Thanks to the Kinetic Motors acquisition, M&M has successfully entered one of the largest two-wheeler markets in the world. "Two-wheelers have not only been growing for some years now, they will continue to grow in the years to come," says Anoop Mathur, president of Mahindra's Two Wheeler Sector. "When we look at how our business has taken off and the rates at which our volumes and numbers are growing in the scooter market, it gives us the confidence that we made the right decision." As fast as that growth is, it is admittedly off a small base. But that toehold on the market could soon help M&M cross-sell across other areas of the M&M stable. "This acquisition will give us an opportunity to be the first point of contact with the consumer," says Mathur, who hopes customers will move up to geared motorcycles and then cars — all belonging to M&M.

That synergy works both ways. The old Kinetic Motors can now leverage M&M's deep understanding of a certain set of customers to sell. "No other two-wheeler company has the potential of internal synergies as Mahindra. Our market is very wide as we have presence in the rural, urban, and semi-urban markets," says Mathur, adding that the acquisition brought about synergies in terms of managing vendors, procuring auto components, and common manufacturing platforms. "These are all synergies that will be significantly useful as we go forward," he says. And he attributes much of that success to IT's involvement. "IT and the M&A team worked very closely from the time we started evaluating the prospect, and we have benefited greatly."

[Photo caption: Anoop Mathur, President, Mahindra Two Wheeler Sector, says the Kinetic Motors M&A allows M&M to tap a new market and build loyalty at the bottom of the transport value chain, which will drive customers to M&M's other vehicles.]

But admittedly, M&M's story is more the exception than the rule. Most CIOs don't wield the influence Parthasarathy does at the M&A table. Though that is changing, more CIOs need to get on board. "CIOs should have more focus on business than they generally do," says Krishnan. "And they should not hesitate to make whatever changes they have to, to get better business results."

Back at M&M's headquarters, good news from the M&A is making waves. Mahindra Two Wheelers has increased production to 70,000 units in 2009-10. "Our sales have been increasing every month," says Mathur. In addition, it recently launched two new gearless scooters, the Duro and the Rodeo, and plans to bring out a Mahindra motorbike later in the year. "It's a great alliance," says Parthasarathy. "And, I think I can now say: they lived happily ever after." CIO

Priyanka is a correspondent. Send feedback on this feature to priyanka@idgindia.com.



View from the Top

The Shape of Tomorrow

Suresh Vaswani, Jt. CEO, IT Business, Wipro, looks ahead and tells enterprises which technologies they can't take off their radars and how Wipro plans to help them meet the future.

By Anup Varier

Climbing the corporate ladder in an organization as large as Wipro is no mean task, and Suresh Vaswani, Jt. CEO, IT Business, Wipro, has been patient in his rise. It has taken him over 23 years to get to where he is now — at the helm of Wipro's affairs with his joint CEO, Girish Paranjpe. Over the last two years, the duo have proved their skills as they led Wipro through the slowdown and out of it as a healthier company. In this interview, Vaswani shares a picture of what the future will look like and IT's role in it. In short, he expects IT to adopt flexible cost structures, take collaboration further, and be more sustainable.

(View from the Top is a series of interviews with CEOs and other C-level executives about the role of IT in their companies and what they expect from their CIOs.)

What technologies should corporates watch out for?

Suresh Vaswani: There are six themes that we believe are extremely strategic from a customer viewpoint, irrespective of their industry. The first is clearly cloud computing and the second is green IT. The third is collaboration. Most companies are becoming more global as they expand into emerging markets. To do that effectively, they need to collaborate, and these technologies will become essential for companies to succeed. Then there is social computing. All the blogs, Twitters and Facebooks of the world will become an integral part of the organizational ecosystem. That's a fact that cannot be ignored. Mobility is extremely important. In India, mobile penetration goes deep. So, if companies want to penetrate rural or remote markets they need to think of ways of leveraging the mobile to enable business rather than setting up brick-and-mortar infrastructure and adding to costs. Finally, predictive data analytics — rather than reactive analysis — will be very important, because at the end of the day if a company can analyze more data, more productively, it will be able to pick out customer trends and benefit. That will result in a shift from a what-did-we-do-wrong paradigm to a what-is-going-to-happen-and-what-can-be-done-about-it attitude. This will help direct investments in the right direction.

Some believe cloud computing is more hype than substance…

The cloud is here to stay. It is a serious innovation among IT business models. All organizations will adapt to it. Medium-sized organizations will move to it because it will make more business sense in the long run. Larger enterprises will have a combination of private and public clouds. In the mid-term, larger organizations will see an integration of traditional IT systems; they will implement private clouds within their organizations and use external public clouds to offer services to their users.

Isn’t green IT a product of the slowdown? Where, then, is its place tomorrow? I would not link it to the slowdown at all. Green IT is linked to the larger challenge of ecological sustainability. I would say that it is related to an increase in awareness regarding the need for ecological sustainability, which has gone up because of the crisis that could face us if we do nothing about being sustainable in the next 10-15 years. If corporations don’t become proactive about it or don’t create a definite plan, in the end everybody will lose. Green IT provides a double whammy in a manner of speaking. The two issues that can be tackled by it are: How can I make my IT infrastructure greener by using less power and driving productivity; and how can I use IT to propagate sustainability elsewhere? Both are important. And, in the long run, green IT is also cost-effective.

But how does it fit in a world where companies are focused on the short term?

I think companies are looking at both short-term and long-term goals in order to develop a sustainable business model. At the end of the day, there is a new reality today. The economy has been reset, it will move in a particular direction from here on, and many of the expectations of customers, stakeholders and governments have changed. In this new reality, some of the concerns that have surfaced are: Am I cost optimal, and can I transform my cost structure? This is why, I feel, we need to create more demand by enhancing customer experience; such demand, with an optimal cost structure, will drive sustainability.

What key lessons came out of the slowdown?

Today, organizations are becoming more risk-aware. They have started identifying risks and coming up with methodologies to mitigate them. That's something organizations would not have done five years ago. People assumed that the world was a hunky-dory place and expected everything to go fine. But post the slowdown, when things started turning tough, the net learning was to be risk-aware, and that, in turn, would take care of the spikiness in the market. The entire process of identifying risk and solving it has speeded up.

Organizations are also looking seriously at developing a variable cost structure. If I do everything myself, I will have a bloated cost structure. If I do just what I need to do myself and I have partners to do the rest, I have a cost structure that I can scale up and down and not impact my business substantially. And my partner works in a much larger ecosystem because he is doing similar jobs for multiple people and can absorb variations.

Do you expect more Indian companies to outsource? Is your focus moving to the domestic market?

We are very focused on the domestic market. Today, about 21 percent of our business is coming from emerging markets like India and the Middle East. As a company, Wipro has always focused on our home markets because it goes without saying that if you want to be a global player then you have to be a good player in your domestic market. A focus on the domestic market also helps us create solutions and products that we can leverage globally and allows us to further our expansion plans into more emerging markets. Emerging markets like ours are important because they necessarily are high-growth markets and generally do not have legacy systems.

You’ve moved away from dedicated offshore centers. Is that a conscious shift? So far the trend has been that you had dedicated ODCs (offshore development centers) for customers. But that trend is shifting to a more shared-services model. The paradigm is shifting because the customer is saying: Look, this is the service I want, this is the outcome I want, these are the constraints under which we operate. You give me a service outcome rather than giving me a building full of people and networks. This has given rise to the birth of what we now call and have branded as the ‘flex Vol/5 | ISSUE/07

5/7/2010 3:01:16 PM


View from the Top

delivery’ concept. So if I have to do Oracle support for a customer, I can actually do it from one of our flex delivery centers and still meet the security concerns and response time requirements of a customer. At the same time, I am able to leverage more specialized resources more effectively.

Which other emerging markets have you identified as targets?

China is a very big market which we are only now beginning to look at from a market-addressal perspective. There are markets like Latin America which are under-penetrated by service providers like us, and markets like Africa which we started addressing last year. We have a development center in Egypt through which we deliver services to some of our Middle Eastern customers. And, broadly, we do business in South Africa as part of our global thrust. Africa is certainly an interesting market, though there are always country risks and you have to plan to mitigate those risks.

What’s the demand in India like? Is there a difference between verticals? IT demand in India is very broad based. Though some sectors like telecom, defense, and BFSI are more visible in their interest in IT, the drive is widespread and the difference in adoption across verticals is not starkly different. This is because customers have clearly understood the need for IT to differentiate from the competition.

What is Wipro’s approach to M&As? Wipro continues to invest inorganically and we have a clear idea as to where we need to make our inorganic investments. Our acquisition of Infocrossing, for instance, was on the same lines. The company has datacenters across the US which we didn’t have; it has a revenue runrate in the datacenter space which we didn’t

Vol/5 | ISSUE/07

View from the Top.indd 85

SNAPSHOT Wipro have; it is in the managed which means that there is a datacenter space which lot of work. And I still have Revenue: we were not. I could have a lot of work to do despite Rs 27,124 crore tried to build datacenters having another CEO. No. of employees: in the US but that would About making it work, we 108,071 have taken almost as long (Girish and I) have developed Established: as five to seven years. clarity on where we can 1980 And Infocrossing was leverage the ‘power of two’. very synergistic with our Togetherness and working Headquarters: company, so the acquisition independently while keeping Bangalore made sense. We will not each other informed gives acquire a company just to us that ‘power of two’. We increase revenue. That makes no sense. defined it pretty early in the game and I Value-added acquisitions that enable us think we have a happy equation. This helps to enhance our value proposition to our us deliver better and that’s not just empty customers are what we look for. talk as is evident from the last two years. The model needs a lot of trust. It helps that we have been in the same company for What is the role of a CIO long and are aware of each other’s working in a merger? styles. This also sets a collaborative tone Nobody wants to build a high cost in the organization. At the end of the day structure. Say one partner in an M&A is there are separate business units, service running on SAP and the other on Oracle; lines, and functions that need to work one has got 10 datacenters here and the together and a collaborative model right at other three, then at some point or the the top sets the tone for the organization. other they will have to consolidate their IT backbones. In an M&A, one of the first things that gets considered is the cost of What’s the big learning from IT and best practices. If a company has your stint at Wipro? very poor IT systems, I would be very The importance of integrity. More worried about acquiring it. precisely, professional integrity. Not overThe CIO has a part in making those committing at the point of sale either to things happen and in making sure that the clients or internal colleagues and peers is acquisition realizes its potential. He may important. It’s a value that holds you in or may not be involved in the business good stead from a long-term perspective. decision of the acquisition but his role is Doing something to win a deal now, which certainly important in terms of making you shouldn’t have ideally done, is a much sure that the benefits of the merger are larger loss. And it is better to be very clear actualized from an IT perspective. with customers upfront. CIO

Many joint CEO relationships have failed. Why should it work for you? There might be examples that show it doesn’t work but there are also enough that demonstrate that it works. Take how SAP recently announced a joint-CEO model. What needs to be understood is that ours is a high-growth industry and we are fairly ambitious in terms of where we want to be,

Anup Varier is a correspondent. Send feedback on this interview to editor@cio.in



Case File

Busting Fakes with Virtualization

By Sneha Jha

Caught between blistering growth and untamed counterfeit drugs, Bilcare Research, a pharmaceutical packaging company that specializes in anti-counterfeiting solutions, was finding it hard to breathe. Here's how a dose of virtualization relieved the company of its ailments.

Reader ROI:
How virtualization can help drive business growth
What to watch out for when taking up a virtualization project



Manoj Arora, Global CIO, Bilcare Research, a pharmaceutical packaging and research company, hates fakes. And he has reason to. According to figures released by the Center for Medicines in New York, counterfeit drug sales will touch $75 billion (about Rs 3,37,500 crore) globally by the end of this year. It isn't surprising, then, that Arora thought it fit to enhance the company's anti-counterfeiting packaging solution by adding a non-clonable feature to it. This feature, when attached to any product, will vouch for its authenticity and protect the brand from counterfeiters. While that would give the company an upper hand over counterfeiters, it also meant added resources and expenses. "At Bilcare, we were looking to optimize our IT resources. This included all kinds of assets, licenses, hardware, communications, networking and people. If our IT costs continued to rise, it would impact our margins in a big way," says Arora. And that was not the only problem.

Epidemic Outbreak

With over 60 percent market share in the pharmaceutical packaging industry, Bilcare Research has recorded 100 percent growth over the last three years. Headquartered in Pune, the company has manufacturing and R&D units located across the UK, the US, Singapore and India. The 500-strong company has 12 offices and caters to the requirements of over 500 pharmaceutical customers. It recently doubled its capacity of supplies for clinical trial services with the opening of a second unit in the UK.

Fast business growth is something every organization wants to pursue, but growth comes at a price. And that's what happened at Bilcare. The company's IT systems and apps were now threatening to eat into the benefits of this robust growth. If Bilcare had to maintain a sustainable growth pace, IT would have to step up its efforts at optimizing resources and compressing costs. "I wanted to set up a scalable and cost-effective infrastructure for our core business that is easy to deploy. I also wanted to manage the IT infrastructure with little or no disruption to our existing IT services," says Arora.

With immense growth, and the introduction of the non-clonable solution, Arora's IT systems were struggling. "Bringing this solution to the market would need us to build a robust platform. But we did not want to augment our existing resources. This was the threshold when I decided to deploy a cost-effective solution for our IT infrastructure," says Arora.

The Right Remedy

And that cost-effective cure, Arora realized, was virtualization. He knew that it would help him manage rapidly growing technology requirements and retain the lean structure of his IT resources. "My key concern was in the area of server utilization, continuity and performance. I wanted a robust and scalable solution with a minimal footprint," says Arora. Virtualization fit the bill as it would unlock much of the underutilized capacity of the company's existing server architecture.

The next step was to select the solution that best suited the company's needs. Arora's IT team, manned by eight people, was equipped with prior experience in virtualization on UNIX and other proprietary platforms of operating systems and hardware. Convinced of the viable benefits that virtualization could deliver, they began scouting for the right product. But in doing this they adopted a cautious and deliberate approach. "We were concerned about transitioning enterprise applications, server utilization and the stability of the virtualized platform. We wanted to avoid any complexity related to downtime post virtualization," says Arora.

After zeroing in on the right product, the IT team at Bilcare set out to give shape to the project. But the course of the implementation was ridden with impediments. "The challenging part of the project was when the storage area networks were on heterogeneous systems. Before making the change, we had to analyze how the change would affect our organization," says Arora. In order to conquer these challenges at an early stage, Bilcare's IT team carried out several performance and reliability tests. Once they were convinced about its viability, they went ahead with the project.

Speedy Recovery

To avoid downtime, the company decided to carry out the migration in a phased manner. Bilcare's IT infrastructure is centralized in its Pune global headquarters. Physical servers are categorized into infrastructure servers, DMZ (de-militarized zone) servers and SAP servers. Arora decided to migrate around 15 servers in the first phase. The second phase marked the migration of SAP applications to the virtualized environment, marking the completion of the project.

Today, less than a year after the project, the benefits are already beginning to show. The Rs 20 lakh project resulted in savings of 40 percent on power consumption, space and management overhead costs. It freed up the servers and the manpower needed to maintain them. There has been a 25 percent increase in man-hours, enabling the personnel to focus on other important projects. This proved to be one of the most significant benefits, as it lends itself to the success of other projects that were in the pipeline. "Had I not done this (virtualization), our other three growth projects dealing with authentication services and track and trace services would not have been so successful. These are very critical projects for our company. I did not have to augment my team of eight people for these new projects. We are growing at a rate of 50 percent without augmenting our team," says Arora.

That says it all. Arora would have never imagined that his hatred for fakes could save him money and resources. And how! CIO

SNAPSHOT: Bilcare
Revenue: Rs 1,000 crore
Employees: 500
Size of IT team: 8
Headquarters: Pune

Sneha Jha is a correspondent. Send feedback on this feature to sneha_jha@idgindia.com


BI’s Dirty Secret:

Better tools ools No Match for Bad strategy er to deliver actionable BI ev an th re su es pr e or m s IT face CIOs say this is no time st te ar sm e th t Bu . ss ne si data to the bu r tools do no good tte Be s. se as m e th to BI w to blindly thro t starting data. gh ri e th d an gy te ra st ss ne without smart busi

By Thomas Wailgum



Reader ROI:
Why strategy matters more than technology
The right questions to ask
The importance of being intuitive

The pressure on CIOs to deliver business intelligence (BI) tools and analytic applications — on the cheap and ASAP — has been building steadily for years. In 2010, survey results point out that demand has reached a fever pitch with which CIOs are very familiar. "The interest in BI, the use of it and sophistication of the use just grows every year," says Bill Swislow, CIO and SVP for product at Cars.com. "There are always business problems, and people are always looking for new BI tools to solve old problems."

Recent Aberdeen surveys of enterprise executives show that BI has ranked number one (for two years running) as the technology that will have the most impact during the next two to five years. A January 2010 Kognitio and Baseline Consulting survey of BI practitioners noted that they expect to see "deeper use" of BI at their companies this year and plan to add capabilities to more business lines. Almost one-third of those surveyed indicated that they plan to roll out new BI tools into the corporate mix.

[Chart: BI Backlog. Is there a BI backlog in your organization? Yes, 51%; No, 30%; Don't know, 19%. Source: Forrester]

So what does all that mean for CIOs and IT departments? Whether your company is a newbie or a seasoned BI user, the demand for analytic applications will most likely be insatiable for the foreseeable future. "It has certainly in our business become increasingly influential," says Swislow. "Ya know, it gives one a feeling of knowledge, power and influence. And knowledge is power, right?"

Swislow should know. Before he added the CIO role to his title just about a year ago he was, he says, "one of the most active users and strongest advocates for BI initiatives from the business side." Now he's the one having to field the requests, explain and expand IT's capabilities, and help the business get its BI wants and needs, where and when possible.

For sure, BI analytic applications and dashboards are hotter than a recent Tiger Woods photograph. But in a mad corporate rush to deliver BI and analytic applications to ever-eager business users, CIOs should first determine what business processes will be made more efficient by the BI tools; ensure that the right data will get to the right people using the BI solution; and then select the correct software tools that will ultimately help users make more informed and intelligent decisions. In other words, now is not the time to blindly throw BI technology to the masses.

Strategy First, Technology Second

Steve Anthony has been working with and implementing BI applications for a long time. He lists former consulting gigs (rolling out a massive BI system, for instance) and the packages he's worked with (all the biggies: Business Objects, Cognos, Hyperion). Now Anthony is a CIO with his own shop at Charles River Associates (CRA), a global consulting firm that offers financial and business management assistance to companies and governments. And while he's plenty conversant about BI applications and their features and functionalities, he's also well-versed in discussing the critical steps that should precede all of that other tech stuff that comes at the end of the process.

"BI is an interesting animal," Anthony says. Overall, what's important in any BI endeavor, he says, is: making certain the data is right and believable; determining how employees will use the data to get actionable results; and "ensuring that whatever we do aligns to our common business strategy."

All of that can take some time to figure out. At CRA, for instance, "we spent six months before we even started development" on a new and innovative BI system, Anthony says. He lists several key questions that they asked themselves: What key performance metrics do we need to operate as a company? What are the data sources? What are we trying to achieve? How does all of this align to our strategy? What does this mean to us? Where is the data? And lots more. "We went through this whole big six months' worth of getting that information together, and once together, then we did this huge data-vetting exercise," he says. Senior management support throughout the entire process was critical.

The result, he says, will be an all-encompassing, innovative BI system with dashboards and Google-style querying that enable executives and practice leaders from across CRA to view, and slice and dice, the data streams that they need most — whether that's HR or financial data on operating income or SGA revenues; new-business queries on employee skills and capabilities, or potential business conflicts; research pulled together from disparate sources; or social media capabilities to provide context and collaboration. (Some functionalities are already up and running, he says, and new elements are being added to the system on an ongoing basis.) "It's really the overall intelligence that a company has in order to, at the end of the day, get the best revenue generation or the most revenue that you can based on specific opportunities," Anthony says. "It's a big undertaking," he adds, "but companies that do it right and have the stamina can gain a strategic edge in comparison to competition and the folks who may not."

BI Tools Work Better — Finally

The growing corporate attraction to BI applications is not without valid reason: the analytic packages, user-friendly dashboards and data-management tools have, indeed, matured to a more capable state than ever before in their short history. Enterprise software vendors of nearly every stripe peddle some type of BI application. "BI, to me, isn't new," says Jeff Liedel, the CIO of OnStar, the in-vehicle communications company and GM subsidiary. "What's new about it is that the tools have matured enough so that they can take cost out, increase speed and improve the depth of the analysis that you can do with your valuable resources." In other words: BI has finally hit the 'cheaper, faster, better' stage. The huge career-enhancing opportunity for IT right now, Liedel says, is to "lower the cost to generate the data and also give our business colleagues better ways to analyze [the data] to come up with trends and opportunities from the data."

Cars.com's Swislow says, simply, that BI is fundamental to his business, "which means it's a critical area to have processes with IT to mobilize the resources to maintain it and innovate how we use it." He says that Cars.com relies on vendor products for core tools and innovation. "But around the edges," Swislow says, "you have to innovate on your own."

But whether they're using off-the-shelf or in-house BI apps, CIOs need to remember in this new era of Google Apps and SaaSy Web interfaces that they'd better ensure that ease of use is top of mind. "It has to be easy, intuitive and offer the ability for users to drill down and have layers of 'depth,' so that as people get smarter they can drill down even further [into the BI tools' capabilities]," says CRA's Anthony, offering Microsoft's (MSFT) Excel as an example. "But you have to make sure those layers are very easy to begin with."

When that type of situation transpires, a sustained "snowball effect" can occur among the user community, says Swislow, which can then lead to more valuable business results from the BI tools. "It's like every step you take making better use of the tools just opens up more opportunities to use the information and do more interesting analytics," he says. "The questions never end, and when you answer one question, that means you have time to ask another question."

[Chart: What's Jamming Up Your BI? Reasons for the BI backlog: Complex application change process, 54%; Not enough IT resources, 50%; Complexity associated with combining unstructured data into BI, 48%; Complex data modeling change process, 32%. Source: Forrester]

[Chart: Slow Response. Average time it takes for BI requests to be fulfilled: Hours, 23%; Days, 41%; Weeks, 28%; Months, 8%. Source: Forrester]

BI for One and All

In the not so distant future, as the definition of corporate business intelligence continues to sort itself out, it's easy to imagine BI as a highly targeted analytic interface, delivered to various users, that is able to manage, integrate and present the enterprise glut of back-office transactional data that streams through organizations today, such as ERP, CRM and supply chain systems. This personalized BI interface, in other words, would be able to take the pulse of, and easily display, everything relevant happening in an enterprise at any time. CRA's Anthony says that this type of BI system ensures that executives, managers and decision makers are making their decisions based on the analytic and reporting data that is not only most relevant to them but also has the most value from an enterprise perspective. "Everything working together is much more powerful than anyone working in their vacuum," Anthony says. "That's where the new CIOs, in my mind, are becoming the strategic partner because of BI and the data provided. This [type of] CIO understands the business and the technical layer that provides that data. It's a big role."

At Cars.com, Swislow mentions the value that data from "one big analytics engine" can deliver, not only to business users in terms of customer, website and sales data, but also in how Cars.com demonstrates the site's value to advertisers, and in the helpful data that Cars.com can provide to the consumers who make more than 1 million searches on the site each day. "There's a tremendous amount of intelligence there at multiple levels," he says.

No matter where your BI travels take you, though, CIOs say that the BI journey isn't one that will be ending any time soon. "BI is an absolute journey that doesn't end because we must change with the changing business environment," Anthony says. "Certainly in these dynamic times, if we don't have the data and believe that data to make the decisions, we lose." CIO

Send feedback on this feature to editor@cio.in



Ready to Present?

Less is more when it comes to bang-up business presentations. Here are five tips for better tech talks.

By Mary K. Pratt

Reader ROI:
How to get inside your audience's head
Why presentations are more about stories than facts
Ways to use technology


CIO Skills

Thomas Murphy needed $300 million for a new ERP. With a price tag that large, you'd think Murphy would add every justification in his presentation to his business colleagues. But even though he had enough data to fill 200 slides, Murphy, senior vice president and CIO at AmerisourceBergen, resisted that temptation. Instead, he sold his plan — to IT employees, line-of-business colleagues and C-level executives at the pharmaceutical services company — with a mere five slides that used impressive images and just a few persuasive facts to get his point across. For example, he showed an image of an iceberg to demonstrate that little issues with the company's current processes were symptomatic of much bigger problems to come. And when he told audiences that the company's supply chain application was older than Pong, up came a screen shot from the ancient video game. "It always got a big laugh, followed by a look of dawning realization and fear," Murphy says, explaining that "good sales people use analogies or powerful references, because people don't remember the numbers. They remember the story." Three years later, and halfway through the implementation of the ERP system, IT people still talk about the iceberg. "I have always said that the CIO's role is primarily a sales role," Murphy observes. "That's really what we do. We have to sell to people who don't know they want to buy."

Techies Get Talking

By and large, IT types aren't known for their smooth communication styles or savvy presentation skills. That used to be OK. Now, though, as board members want more details about IT spending, and business colleagues want more information on what technology can do for them, technology employees at all levels need good presentation skills — particularly if they want to move up in the ranks.

There's a lot at stake, says Lori Michaels, chief technology officer at The Economist Intelligence Unit, a research and advisory firm. Michaels says she's seen great projects passed over because no one could present a compelling case for them, while flash-in-the-pan "bubblegum tech" that was presented well got funded. Good presentation skills can help IT leaders reach not just their organizational goals, but their personal goals too. As communication becomes increasingly important, presentation-savvy CIOs are often called upon to carry IT's message to the rest of the company, simultaneously increasing their visibility and their perceived value to the organization.

[Fig 1: Give a straightforward PowerPoint slide, like this one, a little life ... (a slide titled "Strategic Initiatives" with blocks for Product Ingredients, Product Creation, Product Organization and Product Enhancement)]

[Fig 2: ... by moving the image mid-presentation to show business benefits.]

Want to create a message that others will remember for years to come? Consider the following tips.

Pointer 1: Give your audience an action item

If you want a persuasive presentation, start by defining its purpose, says Kimberly Douglas, president of FireFly Facilitation and author of The Firefly Effect: Build Teams That Capture Creativity and Catapult Results. "What do you want them to know, think, feel, or do differently?" Douglas asks. "Get extremely clear about what you want to get out of the presentation, from these particular people, at this particular point in time," she says. Ask yourself: Why is this project important? Why is this project going to help those around me? And what do I need from this group? If you are making a pitch to develop a new application for your company's marketing department, for example, you need to demonstrate what the application will do for marketing, articulate why it will be money well spent and spell out the actions you need them to take — all from the audience's point of view.



Answering those questions will help you articulate what you need to convey. Michaels asks her team members to sum up in one sentence what they want to convey in their presentations and what they want their audience to come away with. The exercise helps shift the presentations from a regurgitation of technology facts to an action that the audience can rally behind, she says.

Michaels once worked with her vice president of technology as he was preparing a presentation to stakeholders about a new database architecture. His original presentation had about 30 slides, mostly detailing the benefits of the new technology. To help him tailor his presentation, Michaels asked him to define his audience and explain what he needed from them. "His audience was all high-level stakeholders from business management," she says. "What he needed from that group was to get approval for funding and set expectations on a timeline that would satisfy the business goals." That produced a very different deck, with points on ROI-related information, goals that related to the business's timelines, and examples of features to be delivered that accomplished their strategies. In the end, the presentation didn't mention architecture or relational database structures — and he got the funding he needed.

Pointer 2: Say what the technology does, not what it is

"The majority of presentations I see are, 'We're going to go with Java and it will solve the problems,' whatever the problems are," Michaels says. "But the CFO just hears, 'I want $5 million to make my life easier.'" That's why you need to leave out technical jargon and focus on explaining what that technology will bring to those in front of you, says Abbie Lundberg, president of Lundberg Media and former editor-in-chief of Computerworld (a sister publication of CIO magazine). "Where a lot of IT people fall down is they talk about what the technology does. They tend to talk about the functioning of the systems, which is a big mistake. Most others don't understand it, and they don't care about it," Lundberg explains. "IT people have to be more audience-focused. They have to ask, 'What does my audience care about?'"

Think about what you want the technology to do for each audience. If it's going to help sales deliver goods to customers more quickly, that's what you present to sales. If it helps your call center people handle calls faster, that's your key talking point, Lundberg says. AmerisourceBergen's Murphy, who needed two years to sell his $300 million ERP project, honed his ability to describe technology in business terms by meeting one-on-one with his counterparts in other departments. It wasn't until he framed the need for the ERP project in the terms that his business colleagues focused on — describing it in terms of revenue vs. profit — that he was able to really engage his audiences during presentations, he says.

How to be a Smooth Tech Talker

Think of a presentation as a story with a beginning, middle and end. AmerisourceBergen CIO Thomas Murphy says he has built presentations on storyboards like movie directors do.

Start with a headline, says Suzanne Bates, author of Speak Like a CEO. "Figure out the problem and the solution, and then present that big idea in the first minute or two. That's backward for many technical people, who don't understand how people can make a decision without hearing all the facts." But with this approach, she says, "you'll be far more successful in getting their attention."

Understand your topic thoroughly. You don't want to present everything you know, but having detailed knowledge about the issue will help you present with confidence, answer questions effectively and build credibility with your audience.

Prepare a condensed version. Even if you've been told you'll have a certain amount of time, you might be told to cut it short for myriad reasons, Bates says. So know where you can trim if you have to. She recommends having a three-minute version ready to go.

Test the technology. Abbie Lundberg, president of Lundberg Media, attended a recent meeting where the first 15 minutes or so was spent waiting for someone to get the video working. Not the best way to engage an audience, she points out.

Ask "What questions do you have?" rather than "Does anyone have a question?" It's a small rephrasing, says Kimberly Douglas, president of FireFly Facilitation, but the first version is much more inviting and likely to elicit audience responses.

Practice. Douglas recommends asking someone who can give you honest feedback to watch you run through your presentation to ensure you've got it right.

— M.K.P.

Pointer 3: Use more images, fewer words

As Murphy learned with the success of his iceberg picture, images speak louder than words. "Audiences can only read or listen. They can't do both," says Suzanne Bates, president and CEO of Bates Communications and author of Speak Like a CEO and Motivate Like a CEO. This point is particularly important when speaking in front of a live audience, which is often forced to squint while looking at small type squeezed onto slides by a presenter desperate to cram it all in. Bates tells her clients to resist that urge. "You want people to walk out of the room remembering your presentation and what you said, and the only way you can do that is with powerful imagery and good stories." Bates suggests that clients write a script before compiling any visuals. And then, before they log onto PowerPoint or a similar program, she has them get out crayons — yes, crayons — and draw images based on their messages. Those images then become the basis of a photo or more polished design.

Pointer 4: Think beyond plain-vanilla slides

Despite the high-tech communication tools now at our fingertips, experts say most presentations still feature slides with bullet points and more bullet points. It's time to expand that basic menu.

Michaels, for example, has used PowerPoint's animation feature to help illustrate the transformations that her IT projects can bring. "We're almost always decommissioning and rebuilding, so what better way to illustrate that than to put a picture up and morph it?" she asks. She once needed to deliver a presentation on agile development and why it would work well for her business colleagues, who were still wedded to a two-year development-and-delivery cycle. Michaels knew that her audience didn't care about the process of agile development as much as the results it could bring. So she designed a slide featuring a block arrow with the project's name on it. The arrow was made up of smaller arrows that at first all pointed right (Figure 1). Then, when Michaels talked about how agile development delivers pieces over time, the smaller arrows turned and pointed down to individual business benefits listed along a timeline (Figure 2).

"Good presenters are able to pull on the right tools at the right time," Lundberg says, noting that video and other visuals have more impact than bullet-point presentations. Lundberg says she uses video in the same way as graphics — to highlight a point in a memorable way. She recommends keeping video clips to under 30 seconds, going as long as a minute "only if it's really good." "I tend to use video for humor, too," she says. Lundberg cites a presentation she made on IT's ability to drive both efficiency and innovation. In it, she noted that though it's possible to do both, it can be hard for an immature IT organization to achieve. She used an analogy: "Can you pat your head and rub your belly at the same time?" alongside some YouTube clips, first of a 3-year-old having a tough time and then of a daredevil teen doing that and much more.

"Video is also good if you need to show a demonstration of some kind. In that case, you might want to hire someone to produce it for you. Again, though, you need to keep it short," Lundberg says. "A mistake I've seen a lot of CIOs make is to run a long promotion-type video about their company. Really, nobody cares how fast your cars go or how big your ships are. It just looks like a commercial to them, and we all know how much people love commercials."

On the other end of the spectrum, sometimes nothing works so well as a good, old-fashioned analog prop. When Murphy had to talk about business transformation to 650 salespeople from his company, he opted to unroll a supersized printout representing the company's systems and its complexities. "It was probably four feet wide and 10 to 12 feet long. I rolled it out in front of me on stage and let it roll over the stage floor," he recalls, explaining that the highly detailed technical chart showed all of the data connections among 300 or so master applications. "It wasn't the drawing but rather the act of rolling out the scroll that amplified my point about complexity," Murphy says. Props like that, he says, get the point across in a way that words and data charts can't.

Recommended Reading

The Presentation Secrets of Steve Jobs: How to Be Insanely Great in Front of Any Audience, by Carmine Gallo
Presentation Zen: Simple Ideas on Presentation Design and Delivery, by Garr Reynolds
Presenting to Win: The Art of Telling Your Story, by Jerry Weissman
Slide:ology: The Art and Science of Creating Great Presentations, by Nancy Duarte
Speak Like a CEO: Secrets for Commanding Attention and Getting Results, by Suzanne Bates
Why Business People Speak Like Idiots: A Bullfighter's Guide, by Brian Fugere, Chelsea Hardaway and Jon Warshawsky

Pointer 5: Show your passion Most techies are used to developing highly detailed, well-documented reports and requirements, making it extra difficult to switch from that scientific frame of mind to an emotional one. But Douglas says just such an adjustment can help IT folks connect with their audience. She cites the case of one IT

manager from a large financial institution speaking out at an IT managers meeting. At a point where the meeting seemed to be going off track, the manager rose and issued an impromptu plea regarding the critical need to work more cooperatively. “I can see him standing, I can see the room. He stood up and called people up by name, saying, ‘We are here to come together.’ He was passionate about the need for people to leave their own silos behind,” she recalls. What made the moment so memorable, Douglas says, was the manager’s obvious emotion. “When he talked about the excitement of being in that organization’s IT department, there was a transparency, a vulnerability,” she says. “He made great eye contact with people. He commanded presence from both the tone of his voice and the passion that came through. It was just riveting.” Douglas doubts it would have had the same impact if he had put up a pile of slides and talked through critical data points. AmerisourceBergen’s Murphy would agree with that assessment wholeheartedly. “It’s hard to sell if your passion about what you’re selling doesn’t come across.” CIO Send feedback on this feature to editor@cio.in



everything you wanted to know and more

Storage’s Problem of Plenty As enterprise data continues to bloat, CIOs are beginning to look at their storage with new eyes, especially as new technologies allow for greater levels of efficiencies.


What’s Inside Deep Dive Features Notes on Cloud Storage ������������������������������������������������������������� 50 Storage on a Diet ��������������������������������������������������������������������������56 SSDs in the Datacenter �������������������������������������������������������������� 60 Virtual Headaches �����������������������������������������������������������������������64 Column Scouting for the Perfect Fit ����������������������������������������������������������68



Deep Dive | Storage

[Graphic of cloud storage considerations: Cloud Strategy, Benefits, Downsides, Security, Vendor Lock-in, Cheaper Storage vs. Cloud Storage, Future-Proofing, Are We Ready?]

Notes on Cloud Storage

By Robert L. Mitchell and Stacy Collett

If you aren't even willing to consider moving your data to the cloud, you aren't alone. But if you also get the feeling deep down that you can't procrastinate forever, here are stories from people who are parting with their data today.


L

iz Devereux knows a thing or two about cloud storage. As director of IT storage and digital imaging at Banner Health, Devereux oversaw the construction of an internal 150TB storage grid. The grid delivers storage as a service to the healthcare provider’s network of hospitals and healthcare facilities in seven states, which use it as a repository for radiological images. But she would never entrust that data to an external cloud service provider. “I’m nervous about someone else controlling my data,” Devereux says. Cloud storage offers some enticing advantages. It’s pay as you go, with no capital outlay and no need to buy extra equipment in anticipation of future storage demands. You scale storage dynamically and pay only for what you use. But you must trust your data to the cloud — and the vendor. Few midsize or large businesses are VOl/5 | ISSUE/07



willing to trust the cloud today, although some are experimenting. “There’s a huge amount of interest,” says Gene Ruth, an analyst at Burton Group. But, he adds, none of his firm’s Fortune 100 clients is using a cloud storage service for live data today.
It’s probably wise to proceed with caution, says James Damoulakis, chief technology officer at Glasshouse Technologies, an independent IT consulting and services firm that focuses on enterprise datacenters, storage and other elements of the IT infrastructure. “Cloud storage today is pretty much an early-stage concept,” he says. Aside from a few heavyweights, like Amazon.com’s Simple Storage Service (S3) and Verizon Communications’ Online Backup and Restore service, most offerings come from small start-ups. “It’s best suited for low-priority or low-access, low-touch kinds of applications, primarily file-based as opposed to block-based,” says Damoulakis. But he says he does have clients that use services from Amazon as temporary expansion space for testbeds or marketing programs. The most common storage-as-a-service offerings are online backup and archiving apps.
Things have changed since the days of StorageNetworks, a company that couldn’t make it as a hosted backup provider and shut down in 2003. The idea behind StorageNetworks was outsourcing — providing a service that used the same storage frames that were in the datacenter, says Damoulakis. Now, many cloud storage services use low-cost, commodity storage in a distributed architecture. “We’ve advanced far in virtualization, the Internet, distributed computing and the grid concept,” he says.
Michael Peterson, president of Strategic Research, launched a storage service provider in those early years and was an adviser to StorageNetworks. He says cloud storage is a broad term that incorporates a variety of technologies and business models. For example, some service providers use distributed, commodity storage, while others might use traditional midrange or high-end storage frames. Cloud storage service offerings range from basic file-based storage infrastructure services, like Amazon’s S3, all the way up to storage-as-a-service apps. With the exception of start-up Zetta, most vendors aren’t pitching the cloud for primary storage.

The Security Minefield
Joe Mildenhall, CIO at Apollo Group, is taking baby steps into cloud storage. “We have a lot to lose. If we’re playing, we’re only going to play with


the big guys,” he says. The for-profit educational institution is using Amazon’s S3 to temporarily store papers that some of its 400,000 college students submit through the Apollo Web site. But even with Amazon, Mildenhall will entrust only low-risk data to the cloud. For example, students can submit Word documents to the Apollo Web site, which runs the documents through a grammar-checking engine and then parks them in Amazon’s S3 storage. When a student retrieves his document, the data is purged. “The major characteristic is that it’s not very important storage to us,” Mildenhall says. Still, data security is a challenge. Many cloud storage vendors offer encryption for data in transit and at rest. Some, such as Zetta, make encryption the default setting. That’s important because in a storage cloud, your data might be on the same disks as data from other users, says Ruth. If another customer’s data is raided by law enforcement agencies, for example, could yours go with it? “The laws are not sufficient to protect innocent parties whose data is on the same equipment,” says Ruth. To address that, some vendors keep each customer’s data on a separate disk. Zetta encrypts each customer’s data with a different key. Mildenhall says he feels confident that Amazon will be around for a while, but he still doesn’t trust that the data will be. If he were to entrust business data to Amazon’s storage service, he says he would need a mechanism to ensure that a copy of the data was replicated back to his datacenter. “I’m not willing to say that the copy of data in the cloud is the only copy I’ve got,” Mildenhall says.
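Keeping the only usable copy of the data in-house is one way to act on that caution: encrypt locally, hold the key on-premises, and let the provider store only ciphertext. The sketch below is a minimal illustration of that idea, not how Apollo or Zetta actually implement it; it assumes Python with the cryptography and boto3 packages, and the bucket and object names are hypothetical.

```python
# Illustrative only: encrypt locally, keep the key in-house, upload ciphertext to S3.
# Assumes the 'cryptography' and 'boto3' packages; bucket/key names are hypothetical.
import boto3
from cryptography.fernet import Fernet

key = Fernet.generate_key()          # store this key on-premises, never with the provider
cipher = Fernet(key)

with open("student_paper.docx", "rb") as f:
    ciphertext = cipher.encrypt(f.read())   # the provider only ever sees opaque bytes

s3 = boto3.client("s3")
s3.put_object(Bucket="example-low-risk-docs", Key="paper-123.enc", Body=ciphertext)

# Retrieval: fetch the ciphertext and decrypt with the locally held key.
obj = s3.get_object(Bucket="example-low-risk-docs", Key="paper-123.enc")
plaintext = cipher.decrypt(obj["Body"].read())
```

If the provider folds or its disks are seized along with another customer's, the ciphertext alone is useless without the key that never left the building.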




The Data Transfer Hurdle
Security isn’t the only worry lining the faces of IT leaders. Take the case of Jeff Kubacki, CIO at Kroll, who set a goal for the risk management consulting firm to cut the cost of its 13 petabytes of storage by 25 percent over the next three years. Cloud storage could certainly be part of the answer — if Kroll can beat the network challenges. Kroll’s IT architects will be investigating ways to migrate about 25 percent of the risk assessment firm’s eligible data through its Internet ‘pipes’ and into the cloud. (The majority of data, mostly legal discovery documents, is considered too sensitive to store in the cloud, Kubacki says.) While storage capacity in the cloud is expandable, limits in the capacity of network connections to the cloud can create challenges for enterprises with multiple petabytes of data to move back and forth.




Other concerns: vendor lock-in, lack of standards, hard to use, vendor stability.


Enterprises are asking whether their pipes are big enough to transfer their stored data to the cloud, and often, the answer is no. “The latency is the big inhibitor for what you can use [cloud] storage for,” says Adam Couture, an analyst at Gartner. “Right now, for enterprises, we see the [use restricted to] archiving, backup, maybe some collaboration.” But most cloud providers say there are easy ways around capacity issues when migrating data to the cloud — starting with the physical migration of the initial data to the datacenter location. It’s relatively easy to host and transfer large amounts of data from a day-to-day, user-level perspective, says Rob Walters, general manager of the Dallas office of cloud hosting company The Planet. But moving 20TB to 25TB of data in a chunk continues to daunt current systems. “The networks that we have [today] just aren’t good at it. It’s just a weak point right now, and everybody is looking at dealing with that,” Walters says. For enterprises, the ‘initial ingestion’ of backup data to the cloud can be done by copying data to the cloud over a WAN or LAN link, but “that initial backup, depending on how much data you have on your server, could take weeks,” Couture cautions. Doctors’ offices that hire Nuvolus to create private cloud storage for their sensitive patient data don’t like data to be copied and physically taken

Cloud Vs In-house Storage: Who Wins?
If you believe that storage is cheap and getting cheaper and that you don’t need to move to the cloud, check your assumptions. Cloud storage is priced like a utility. Over and over again, vendors repeat the same mantra: “Customers pay only for what they use.” Pricing for public cloud storage ranges between 12 and 25 cents (between Rs 5 and 11) per gigabyte per month.
Yet the real savings in cloud storage may have more to do with the ancillary costs of storing your own data. “Lots of datacenters are already starting to fold under the amount of data under management,” notes Terri McClure, an analyst at Enterprise Strategy Group. “If you are out of space, power and cooling, it doesn’t matter how cheap [storage] gets — starting to offload data into the cloud is a better and cheaper alternative than building a [new] datacenter.”
Marc Staimer, president of Dragon Slayer Consulting, suggests the best way to calculate storage cost is to take into account ancillary costs, including power and cooling, annual licensing fees for storage software, and the cost of datacenter floor space and security. “The conventional wisdom is overall storage costs per gigabyte are rapidly declining,” Staimer notes. “In fact,” he says, “conventional wisdom is completely wrong. Overall, the burdened storage costs per gigabyte in most datacenters are actually increasing or at best staying flat. This is because the hard disk drive acquisition cost is a very small part of the overall burdened storage costs.”
— Julia King
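To make the per-gigabyte figures concrete, a quick back-of-the-envelope comparison helps. Only the 12-to-25-cent range comes from the piece above; the dataset size and the burdened in-house rate below are assumptions for illustration, not vendor quotes.

```python
# Rough monthly cost comparison (illustrative assumptions, not vendor quotes).
dataset_gb = 50 * 1024                 # assume 50TB of backup/archive data

cloud_low, cloud_high = 0.12, 0.25     # quoted public-cloud range, $ per GB per month
burdened_inhouse = 0.50                # assumed fully burdened in-house rate per GB per month
                                       # (disk + power + cooling + floor space + licensing + staff)

print(f"Cloud:    ${dataset_gb * cloud_low:,.0f} - ${dataset_gb * cloud_high:,.0f} per month")
print(f"In-house: ${dataset_gb * burdened_inhouse:,.0f} per month (assumed burdened rate)")
```

The point of Staimer’s argument is that the right-hand side of that comparison is rarely just the disk purchase price.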


out of their offices, says Nuvolus CEO Kevin Ellis. So the company requires its healthcare industry clients to have “a decent Internet connection” — typically 10Mbit/sec. — to transfer the backup data over the pipes, says Ellis. “Depending on the office, we could be looking at pretty long upload times,” he says. “You’re uploading overnight. We’re trying to make sure we’re not impacting the doctor’s office during the day as well.”
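The upload-time problem is easy to quantify. The arithmetic below simply works out transfer times at the sorts of link speeds mentioned above; it is illustrative only, and it ignores protocol overhead and the much smaller deltas that incremental backups send after the first full copy.

```python
# Time to push an initial backup over a given pipe (illustrative arithmetic only).
def transfer_days(terabytes, megabits_per_sec):
    bits = terabytes * 1e12 * 8                 # decimal terabytes to bits
    seconds = bits / (megabits_per_sec * 1e6)
    return seconds / 86_400

print(f"{transfer_days(1, 10):.1f} days")    # 1TB over a 10Mbit/sec. office link -> ~9.3 days
print(f"{transfer_days(20, 100):.1f} days")  # 20TB over a 100Mbit/sec. line      -> ~18.5 days
```

Numbers like these are why the ‘initial ingestion’ so often happens by shipping disks rather than over the wire.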

Inhibiting Circumstances
The fear of vendor lock-in is another concern. Every storage service provider has its own proprietary APIs. In some situations, the user might also want to define metadata associated with a data set, such as aging information or security parameters. But storage service providers handle that differently as well, says Ruth. “These services shouldn’t require specially designed interfaces to make them work,” he says. Vendors are just starting to work on standards to eliminate the problem.
The lack of common APIs would create problems if a storage service provider were to suddenly shut its doors — and that’s a possibility when you’re dealing with a start-up. “Once you get in bed with a service provider, you hope to heck they’re not going to go out of business,” Ruth says. It’s not how to get the data back that worries Manjit Singh, but whether he’d even have access to the data if the provider went belly up. “If it’s bankrupt, the creditors might just come in and take the equipment, and they don’t care what’s on it,” says Singh, vice president and CIO at Chiquita Brands International. He has yet to give cloud storage a try.
Rich Zoch is experimenting with Zetta’s storage service at the University of Texas at Austin — but not for primary storage. “It’s a great platform to offload backup archives that are encrypted,” says Zoch, senior systems administrator. But so far he has trusted the service only with dummy data. He says he plans to use it as a secondary storage pool for backups as an alternative to tape. Zoch says he likes the fact that Zetta uses public key encryption that’s compliant with Federal Information Processing Standard 140-2, but the university still might decide to encrypt the data itself before transmitting it. And since he’s using Zetta only for secondary copies, he’s not worried about getting it back if something happens on Zetta’s end. Even if the stored data is accessible, some storage-as-a-service applications, such as Zmanda’s


backup and recovery systems, store data on a third-party platform such as S3 on the back end. So it’s important to do due diligence on where and how data is hosted and how to get it back, says Singh. But, he says, that’s no different from the checks one should do with any other software-as-a-service provider that stores data.
What’s the best way to get started with external cloud storage services? “You have to trust, but verify,” Ruth says. That means touring the datacenter to see what’s stored where, creating a service-level agreement with meaningful metrics and performing regular audits to make sure the vendor is living up to them. And if the storage-as-a-service provider is using a third party for the underlying storage infrastructure, you’ll need to perform due diligence on that vendor as well.

Pipeline Problem
What’s the way around the data transfer challenge? Some vendors offer private connections established from the enterprise to one of the provider’s storage nodes. This is well suited for companies with initial data sets between 2TB and 75TB, or fewer than 750 million files, and where data transfer is time-sensitive, according to Nirvanix, a cloud storage provider. It also works well for one-time and ongoing data migration that requires high throughput and moderate latency.
The other option — most often used by enterprises — is the ‘sneakernet’ approach, where data is physically picked up from the customer on a disk, tape or appliance provided by the cloud storage provider, and taken to the datacenter for initial backup. “We’ve had customers that have shipped storage arrays,” says Jon Greaves, chief technology officer at private cloud host Carpathia Hosting. “In some cases, customers have physically removed disks from the chassis after they have been mirrored, and delivered those.” Greaves has seen large companies use both the Internet and sneakernet methods. Nirvanix, for instance, will send its customers storage servers with dual Gigabit Ethernet ports to transfer data within their own facilities. Once the data is transferred, the servers are sent back to Nirvanix and the data is migrated to the cloud. Amazon Web Services LLC supports moving large amounts of data in and out of its cloud using portable storage devices. It uses high-speed internal networks to transfer data directly onto and off of storage devices, bypassing the Internet.
Carpathia builds private clouds for its enterprise customers based on technology from ParaScale.


Important Cloud Storage Questions You Want to Ask

A bird’s-eye view of the technology and the factors to consider before storing your data in the cloud.
What’s the difference between a public cloud and a private cloud?
The public cloud is a pay-per-use storage utility. All components sit outside of the customer’s firewall in a shared infrastructure that is logically partitioned, multitenant and accessed over a secure Internet connection. Public cloud storage providers, such as Amazon.com with its Simple Storage Service, typically charge a monthly usage fee per gigabyte of storage plus an additional bandwidth fee for transferring data to and from the cloud. Public cloud customers require no physical storage hardware or any special in-house technical expertise. Rather, the cloud storage service provider manages the storage infrastructure, pooling its capacity to accommodate the needs of multiple customers. Users typically access their publicly stored data via an Internet connection.
A private storage cloud is usually built behind a company’s firewall, using hardware and software owned or licensed by the company. All of a company’s data remains in-house and is fully controlled by in-house IT staffers. Those staffers can pool storage capacity to be shared across different departments or used by different project teams within the company, regardless of their physical locations. As with public clouds, storage capacity can quickly and easily be increased by adding servers to the pool. ParaScale’s Cloud Storage product and Caringo’s CAStor are two examples of applications that can be used to set up private cloud storage systems. John Engates, CTO of Rackspace Hosting, describes the private cloud differentiation this way: “The user is still in the IT business, the software configuration business, the storage management and support business, and the datacenter business. You still have to have all of those resources.”
There is one exception: when a private storage cloud is created by carving off a piece of a public storage cloud for one customer’s exclusive use. Users pay a premium for private cloud storage, much like one pays a higher rate for a private room in a hospital. Essentially, “the difference between public and private cloud storage comes down to the way you connect,” says Mike Maxey, director of product management at ParaScale, which sells software designed to aggregate disk storage on multiple standard Linux servers to create a single, scalable, self-managing private storage cloud. “If you’re connecting over a wide-area network and sharing the resources with other customers, it’s a public cloud. This makes sense if you’re a highly distributed company and creating applications but don’t have a shared infrastructure,” Maxey explains. “It’s also good if you’re putting out transient data, like movie trailers, that might run for five months. Temporary storage in the [public] cloud makes sense.”
Is cloud storage for all types of data?
No. Cloud storage best handles large volumes of unstructured data and archival material, such as credit card and mortgage applications or medical records. For now, public clouds can’t reliably handle highly transactional files or databases that require consistently fast network connections. Any kind of online transaction processing is a no-go. Cloud storage also isn’t an appropriate choice for Tier 1, Tier 2 or block-based data storage, says Jim Ziernick, president and



CEO of Nirvanix. “If someone is trying to replace a SAN in supporting a transaction-processing system like CRM, we’re not appropriate. Even if we did do block-level storage, the latency of the Internet would cause a noticeable delay,” he says. “What we can do with cloud storage is give users nearly the access that they have with [network-attached storage],” he adds.
Data backup, archiving and disaster recovery are three likely uses for the cloud, says Engates. “The cloud is for any kind of large-scale storage need with any kind of static-type data,” he says. “You don’t want to store a database in the cloud, but you might store a historic copy of your database in the cloud instead of storing it on very expensive SAN or NAS technology.”
“A good rule of thumb is to consider cloud storage only for latency-tolerant applications,” says Terri McClure, a storage analyst at Enterprise Strategy Group. “Backup, archive and bulk file data would all do well in the cloud, where sub-second response time is not a requirement.” On the other hand, databases and any other data that is “performance-sensitive” aren’t suited for cloud storage because of latency, she notes.
But before moving any data to a cloud, public or private, users need to address a more fundamental question, says Mark Tonsetic, program manager at The Corporate Executive Board’s Infrastructure Executive Council. “If you go to cloud storage, does it solve the problem of understanding where and why storage growth is out of control and where the point of value is [in storing a particular set of data] in the entire end-to-end business process? Just moving the technology to a cloud is not an optimal solution,” Tonsetic says.
Who is using cloud storage today, and how are they using it?
Start-ups and new Web 2.0-based service providers are among the biggest users of cloud storage, at least for the time being. On the larger enterprise side, cloud storage customers are fewer and farther between. “We are in the very early stages of adoption. Typically, when we’re talking to customers, we’re talking with classic early adopters,” says Nirvanix’s Ziernick. These include strictly regulated financial services companies that are required by law to store client audio conversations and other large data files, and content delivery networks that store and then stream images, audio and video to customers. More and more, users within companies are tapping cloud storage for pilot projects and proof-of-concept initiatives, Ziernick says.
Does cloud storage eliminate the need for in-house technical resources?
Public clouds remove the need for server and storage administrators, but not all technical resource requirements go away, according to ParaScale’s Maxey. Many public cloud storage services use newer protocols, such as WebDav or REST, for access, he notes. If a customer’s in-house applications don’t support those protocols, the technical staff will need to make changes to code. Otherwise, Maxey says, “it’s like being dropped in a foreign country and not knowing the language.” Newer software applications developed in a modular fashion will be easier to adapt for storage clouds, he says; older applications will be more difficult.
Eliminating IT staff was never the point for Forrester Construction when it signed up with Iron Mountain for cloud storage service, says CIO Tom Amrhein. “The real advantage,” he says, “is the ability to align your IT resources to tasks that make a difference to the business.”
— Julia King


“It depends on how quickly they need to see data up and running, and the use of the data. If it’s long-term archiving, it’s typically a more gradual migration of data,” he explains. “If they need video files for immediate use, and it’s tens to hundreds of terabytes, that’s the time we start looking at alternative methods.” After that initial transfer, Internet bandwidth will rebound because only blocks of data that have been changed are added to the backup.
There is no such thing as ultimate scalability or infinite capacity in the cloud, Walters says. It’s the provider’s responsibility to plan capacity, manage delivery of future storage and stay ahead of demand. “If someone is going to upload 10-plus terabytes [of data], you know about that in advance, and it’s a carefully orchestrated exercise,” he says.
Storage providers use sophisticated methods for capacity planning. Carpathia, for instance, constantly pushes traffic across its network at about 450Gbit/sec. to 500Gbit/sec. and plans for capacity changes using algorithms borrowed from the telecommunications industry. “You have a T1 line and have to figure out how many call minutes you can squeeze through that T1 line, which is really an over-provisioning problem,” Greaves explains. Telecom companies also use a unit of measure called an erlang, which describes total traffic volume in one hour, to help determine where they are in the provisioning cycle. “We use exactly the same approach on our cloud,” Greaves says. “We can figure out that we’re at 1.2, and at 2 we’re going to have capacity challenges. So when we hit that 1.2 threshold, that’s when we order more hardware.”
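Greaves’ approach boils down to a simple rule: measure sustained demand in traffic units and order hardware when it crosses a trigger point well below the ceiling. The sketch below is a loose illustration of that idea, not Carpathia’s actual model; every number in it is invented.

```python
# Loose sketch of threshold-driven capacity planning (all figures are invented).
# An erlang-style load figure is average concurrent demand: arrival rate x average duration.
jobs_per_hour = 36             # assumed arrivals of large transfer jobs per hour
avg_duration_hours = 1 / 30    # assumed average job length (two minutes)

load = jobs_per_hour * avg_duration_hours   # ~1.2 jobs "in flight" on average

ORDER_THRESHOLD = 1.0          # order more hardware here...
CAPACITY_CEILING = 2.0         # ...long before demand reaches the ceiling

if load >= ORDER_THRESHOLD:
    print(f"Load {load:.1f}: order capacity now, before demand approaches {CAPACITY_CEILING}")
```

The lead time between ordering and racking new gear is exactly why the trigger sits so far below the point where capacity actually runs out.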

Moving Forward
Despite the challenges, most users see a bright future for cloud storage. Singh from Chiquita says he could see a role for cloud storage for file services if he had to replace his file servers. Others see the cloud as a potential way to back up remote offices. At Apollo, Mildenhall says he sees a larger role for cloud storage as well. “It would be reasonable to put file sharing and e-mail in the cloud,” he says. And Mildenhall says he envisions a day when core business data might be hosted in the cloud — as long as he has backups of everything.
Ultimately, Burton Group’s Ruth says, IT organizations might use cloud storage as an alternative to building more datacenters to hold copies of critical data. But, he adds, “they need to get over the idea of moving the data off-site.”
Send feedback on this feature to editor@cio.in




Storage on a Diet
Data explosion threatens to overwhelm storage systems, particularly the backup tier. Data de-duplication can help, but it isn’t that simple. By Bill Brenner

Data storage needs continue to grow unabated, straining backup and disaster recovery systems while requiring more online spindles, using more power, and generating more heat. No one expects a respite from this explosion in data growth. That leaves IT leaders to search for technology solutions that can at least lighten the load. One solution particularly well-suited to backup and disaster recovery is data de-duplication, which takes advantage of the enormous amount of redundancy in business data. Eliminating duplicate data can reduce the amount of storage space necessary by ratios from 10:1 to 50:1 and beyond, depending on the technology used and the level of redundancy. With a little help from data de-duplication, CIOs can reduce costs, lighten backup requirements, and accelerate data restoration in the event of an emergency.
De-duplication takes several different forms, each with its own approach and optimal role in backup and disaster recovery scenarios. Ultimately, few doubt that data de-duplication technology will extend beyond the backup tier and




apply its benefits across business storage systems. But first, let’s take a look at why data de-duplication has become so attractive to so many organizations.

Too Much Data, Too Little Time
Duplicated data is strewn all over the enterprise. Files are saved to a file share in the datacenter, with other copies located on an FTP server facing the Internet, and yet another copy (or two) located in users’ personal folders. Sometimes copies are made as a backup version prior to exporting to another system or updating to new software.
A classic example of duplicate data is the e-mail blast. It goes like this: Someone in human resources wants to send out the new Internet acceptable use policy PDF to 100 users on the network. So he or she creates an e-mail, addresses it to a mailing list, attaches the PDF, and presses send. The mail server now has 100 copies of the same attachment in its storage system. Only one copy of the attachment is really necessary, yet with no de-duplication system in place, all those copies sit in the mail store taking up space.
Server virtualization is another area rife with duplicate data. The whole idea of virtualization is to do more with less and maximize hardware utilization by spinning up multiple virtual machines in one physical server. This equates to less hardware expense, lower utility costs, and (hopefully) easier management. Each virtualized server is contained in a file. One of the great features of virtual machines is that IT personnel can stop the virtual machine, copy the VMDK (virtual machine disk) file, and back it up. Simply restart the machine and you’re back online. The IT team keeps golden images of working virtual servers to spawn new virtual machines — not to mention the backup copies. Virtualization is a fantastic way to get the most out of CPU and memory, but without de-duplication, virtual hard



60 percent of CIOs are either in the process of de-duplicating or have plans to de-duplicate their primary, backup, or archive data in 2010. (Source: IDC)

disks can actually increase your network storage requirements.

Straining Backup Systems
Old tape backup systems are too slow and lack the needed capacity. New high-end tape systems have the performance and capacity but are quite expensive. And no matter how good your tape drive is, Murphy’s Law has a tendency to jump all over tape when it comes to restoration.
VTLs (virtual tape libraries) provide a modern alternative to tape, using hard disks in configurations that mimic standard tape drives. But at what cost? Additional spindles equal additional cost and additional power consumption. VTLs are fast and provide a reliable backup and restore destination, but if there were less data to back up, you’d have lower hardware and operating costs to begin with.
Data glut compounds the difficulty of disaster recovery, making each stage of near line and offline storage more expensive. Keeping a copy of the backup in near line storage makes restoration

of missing or corrupt files easy. But depending on the backup set size and the number of backup sets IT teams want to keep handy, your near line storage can be quite substantial. The next tier, offline storage, is comprised of tapes or other media copies that get thrown in a vault or sent to some other secure location. Again, if the dataset is large and growing, this offline media set must expand to fit. Many disaster recovery plans include sending the backup set to another geographical location over a WAN. It would be beneficial to keep the size of the backup set to a minimum. That goes double for restoring data. If the set is really large, trying to restore from an off-site backup will add downtime and frustration.

Defining Data De-duplication
It is the process of detecting and removing duplicate data from a storage medium or file system. Detection of duplicate data may be performed at the file, bit, or block level, depending on the type and aggressiveness of the data de-duplication process. The first time a de-duplication system sees a file or a chunk of file, that data element is identified. Thereafter, each subsequent identical item is removed from the system but marked with a small placeholder. The placeholder points back to the first instance of the data chunk so that the de-duplicated data can be reassembled when needed. This de-duplication process reduces the amount of storage space needed to represent all of the indexed files in the system. For example, a file system that has 100 copies of the same document from HR in each employee’s personal folder can be reduced to a single copy of the original file plus 99 tiny placeholders that point back to the original file. It’s easy to see how that can vastly reduce storage needs.
Another benefit of data de-duplication is the ability to keep more backup sets on near line storage.




With the amount of backup disk space reduced, more point-in-time backups can be kept ready on disk, making file restoration faster and easier. This also allows for a longer backup history to be maintained. Instead of having three versions of the file to restore, users can have many more, enabling a very granular approach to file backups and accommodating loads of backup history.
Disaster recovery is another process that greatly benefits from data de-duplication. For years, data compression was the only way to reduce the overall size of the off-site data set. Add in de-duplication and the backup set can be reduced even more. Why transfer the same data set each night when only a small portion of it changed that day? De-duplication in disaster recovery makes perfect sense: Not only is the transfer time reduced, but the WAN is used more efficiently with less overall traffic.

How Data De-duplication Really Works
Data de-duplication works by analyzing a chunk of data — be it a block of data, a series of bits, or the entire file. This chunk

is run through an algorithm to create a hash, and the unique hash in turn is stored in an index. As each new chunk of data gets ‘hashed’, it is compared to the existing hashes in the index. If the hash is already in the index, this means that chunk of data is a duplicate and doesn’t need to be stored again. If not, the hash is added to the index, and so on. It is easy to see that the number of hashes potentially stored in an index can number in the millions or tens of millions. The greater the amount of data analyzed, the larger the hash index will be.
It is possible, if the hashing algorithm isn’t sophisticated enough, to have a hash collision. In other words, different chunks of data come out with the same hash value. The analysis engine must then use other criteria to determine which hash value is correct and how to deal with the other chunk of data. If no method is available to sort out the hash collision, both hashes must be discarded and those chunks of data cannot be indexed for de-duplication.

Why De-duplicate?
Firms are deploying de-duplication in a number of places in the infrastructure stack to address practical real-world challenges. And the benefits are easy to see.
Driving down cost: De-duplication offers resource efficiency and cost savings through reductions in datacenter power, cooling and space, as well as in storage capacity, network bandwidth and IT staff.
Improving backup and recovery service levels: De-duplication can significantly improve backup performance to meet limited backup windows. De-duplication technology also leverages random-access disk storage for improved recovery performance compared with sequential-access tape methods.
Changing the economics of disk vs. tape: De-duplication makes disk-based backup feasible for a wider set of applications. Tape still has a role in enterprise datacenters due to its economics and archival properties. However, declines in cost per GB for disk used with de-duplication are leading to disk costs equal to or less than tape costs.
Reducing carbon footprint: De-duplication reduces the power, cooling and space required for storage, thus reducing carbon footprint and enabling environmental responsibility.
De-duplication technology addresses many of the long-standing backup challenges that firms large and small have been dealing with for over a decade. These challenges have included keeping up with the doubling of data growth, meeting shorter backup windows, enabling faster recovery from operational and disaster-related failures and the like.
Source: IDC




The system needs to spend some time learning what data it will see in order to build the initial indexes. This means there is a cold penalty the first time a piece of data passes through the de-duplication analysis engine. On subsequent passes, when duplicate data passes through the engine, the duplicates are found and taken out of the system. Only the hash reference is stored or transmitted (depending on the situation), greatly reducing the number of bits needed to represent a much larger piece of data.
To see how this works, let’s return to our example of the PDF e-mail attachments, which went out to 100 employees. Each PDF is a whopping 25MB, which means the mail server’s storage has just increased by roughly 2.5GB (plus the e-mails themselves). The mail server does a nightly backup to an NAS appliance. The backup, as we saw above, has just increased a minimum of 2.5GB from a single e-mail blast. If the network admin wants to keep a week’s worth of backups for disaster recovery, the weekly backup is now 17.5GB larger due to the bulk e-mail. By removing the duplicate attachments and storing only one, de-duplication keeps the nightly backup to a paltry 25MB — the size of the original attachment. For the weekly backups, instead of storing 17.5GB, the backup system stores only 175MB — a 99-percent reduction in storage requirements.
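A toy version of that index makes the mechanics plain. The sketch below is a bare-bones illustration of hash-and-placeholder de-duplication, not any vendor’s engine; real products chunk data far more cleverly and also have to handle hash collisions.

```python
import hashlib

# Toy de-duplication store: keep one copy per unique chunk, placeholders for the rest.
store = {}                       # hash -> actual bytes (stored once)

def ingest(chunk):
    digest = hashlib.sha256(chunk).hexdigest()
    if digest not in store:      # first time this chunk is seen: keep the data
        store[digest] = chunk
    return digest                # every later copy keeps only this small placeholder

pdf = b"x" * (25 * 1024 * 1024)  # stand-in for the 25MB HR attachment
placeholders = [ingest(pdf) for _ in range(100)]   # 100 mailboxes, same attachment

stored_mb = sum(len(c) for c in store.values()) / 2**20
print(f"Logical data: {100 * 25}MB, physically stored: {stored_mb:.0f}MB")  # 2500MB -> 25MB
```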

File, Bit, and Block-level De-duplication
There are three levels of de-duplication: file, bit, and block level. In file-level de-duplication, each file is analyzed as a whole, and a hash value is created. The de-duplication software looks at the whole file regardless of size. The problem is that if a file is different by just a single bit, its hash will be different from other versions. This makes file-level de-duplication inefficient and impractical. For example, one user creates a document in Word 2003. Another user opens it and saves a copy (without any changes) to another location, but in Word 2007 instead. Same file, different format — different hash.
Bit- and block-level de-duplication are vastly more efficient. In both cases, the analysis engine breaks files into chunks or segments and creates hashes based




on those smaller pieces. The smaller pieces allow for a higher probability of a match; the more matches, the greater the reduction of data on the system. Unlike with file-level de-duplication, if a portion of the file changes, bit- and block-level de-duplication only transfer the changed portion, not the entire file. The goal of a bit- or block-level system is to have a very large index of hashes so that it can identify many more duplicate items. But if the placeholder is nearly the same size as the removed data, the effort to create a hash isn’t worthwhile.
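The difference between file-level and block-level matching shows up as soon as a single byte changes. The fragment below illustrates this with an arbitrary fixed block size; production systems typically use variable-length, content-defined chunking rather than fixed 4KB blocks.

```python
import hashlib
import os

BLOCK = 4096  # arbitrary fixed block size, for illustration only

def block_hashes(data):
    # One hash per fixed-size block of the file.
    return [hashlib.sha256(data[i:i + BLOCK]).hexdigest()
            for i in range(0, len(data), BLOCK)]

original = os.urandom(10 * 1024 * 1024)   # a 10MB file
edited = bytearray(original)
edited[123456] ^= 0xFF                     # flip a single byte

# Whole-file hashes differ, so file-level de-dup must store both copies in full.
print(hashlib.sha256(original).hexdigest() == hashlib.sha256(bytes(edited)).hexdigest())  # False

# Block-level de-dup stores only the one 4KB block that changed.
changed = sum(a != b for a, b in zip(block_hashes(original), block_hashes(bytes(edited))))
print(f"{changed} changed block out of {len(block_hashes(original))}")   # 1 out of 2560
```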

Source, Target, Or Inline De-duplication
Just as there are three levels of de-duplication, there are also three ways of performing de-duplication.
Source de-duplication analyzes files on the original server and creates hashes for each file, bit, or block. During a backup operation, a backup program with a built-in de-duplication engine creates hashes for the data and compares them to the backup destination (which is also running the de-dupe backup software). If the hashes are the same, only the placeholder reference is copied over. If not, then the data is copied and the hash value is updated on the backup server. This method is good for keeping network bandwidth under control. De-duped data is transferred over the LAN/WAN instead of each and every bit. The backup destination’s backup set is smaller because it receives optimized data only. Plus, a software package is the only change to the network. No new appliances are required to de-dupe the data. The overall backup process tends to be slower if backup software does the de-duplication, because of the extra overhead needed to create, analyze, and transfer hashes and their associated chunks of data.
Target de-duplication does not impact source server performance in any way. The de-duplication is performed on the destination server once the backup is finished. Any off-the-shelf backup software package can be used with target de-duplication since no hashes are created on the source server. Target de-dupe is backup system agnostic — it really doesn’t



Want Efficiency? De-duplicate

Increasing demand by IT buyers for greater storage efficiencies will drive adoption of de-duplication solutions over the next 12 months. According to new survey results from IDC, over 60 percent of respondents are either in the process of de-duplicating or have plans to de-duplicate their primary, backup, or archive data in the coming year.
“The tipping point for spending on de-duplication solutions stems from larger projects around improving storage performance, virtualizing servers, and disaster recovery,” said Laura Dubois, program director, Storage Software. “The importance of de-duplication and the opportunities it presents were validated by the public bidding war waged in 2009 between EMC and NetApp for de-duplication heavyweight, Data Domain.”
Firms with more than 6 petabytes (PB) of total disk storage place higher priority on storage performance as a driver, and 57.5 percent of survey respondents said their organizations are currently implementing de-duplication or have already de-duplicated primary data including virtual servers. Additionally, users’ satisfaction with de-duplication technology is highest in the areas of performance, overall system, and management. Other key findings from the IDC survey include improvements in implementation, ROI, and vendor commitments; EMC (including Data Domain) and NetApp dominate hardware-based de-duplication while EMC, Symantec, and IBM dominate software-based de-duplication.
— By Channelworld India

care how the data gets to the backup server. Duplicate detection is typically very fast in this situation, making it ideal where performance is a primary concern.
There are two major drawbacks to target de-duplication: storage space and network bandwidth. Because the de-duplication doesn’t take place until after the backup set arrives, the backup server must have a large amount of spare disk space to store the backup set until de-duplication is complete. One of de-duplication’s selling points is the reduction of spindles, but in target de-duplication, a few more are needed to hold the pre-de-duped backup set. Target de-duplication also requires all of the backup data to be sent over the wire to the backup server. Again, because de-duplication doesn’t happen until the backup job arrives at the destination, every bit has to be sent over the network, in some cases via a slower WAN link.
Inline de-duplication splits the difference between the two previous methods by using a specialized appliance to intercept the data in flight before it makes it to the backup server. It creates hashes on the fly

as data is passed through the appliance and only sends through the unique chunks. This approach to de-duplication doesn’t require any special software on the source server (like target de-dupe, it is backup-application agnostic) nor does it need additional storage space on the destination server. Stored data is already optimized, reducing hardware costs and power consumption. Because it does the de-duplication analysis on the fly, the inline approach also reduces the consumption of network bandwidth by taking redundant bits off the wire. This is especially important when backing up to a server via the WAN. Fewer bits on the wire equals faster transfer.
The only real downside to inline data de-duplication is the cost of the specialized appliance — and the possibility that if the appliance is underpowered or the wrong unit is chosen, the device itself can become a bottleneck. For the most part, that won’t happen, but keep scalability in mind when choosing an inline solution.
Send feedback on this feature to editor@cio.in






SSDs in the Datacenter
SSD storage can fix datacenter bottlenecks for a price. And some pioneers say that with declining prices, the technology is worth a try — for certain applications. By Tam Harbert





As a new decade opens, more and more datacenter operators find themselves struggling with an enterprise bottleneck not of their own making. Servers’ hard disk access times have not kept up with the increasing speed of their CPUs, and the resulting lags can be a limiting factor in some database and caching applications, particularly those involving software as a service and cloud computing. As these applications become even more popular, the bottleneck is likely to get worse, analysts predict. And the database appliances designed to target the problem, which have been on the market for years, remain expensive.
Enter flash memory. Until recently, the high cost of flash memory limited it to consumer items with relatively low storage capacities, like digital cameras and MP3 players. But over the last three years, flash prices have declined an average of 60 percent a year — a rate that’s “faster than has ever happened in the world of semiconductors,” according to market research firm In-Stat. This confluence of factors is bringing flash, in the form of solid-state drives (SSD), into datacenters. While it’s still early in the adoption curve, analysts predict an increase in the use of SSDs in the enterprise. At its annual datacenter conference in December 2009, research firm Gartner called flash-based solid-state storage one of the most


important technologies of 2010. Gartner isn’t alone in singing SSDs’ praises. “Anybody that’s managing and buying storage should be taking a look at the flash options on the market and determining whether it’s a good fit for them,” says Andrew Reichman, a storage analyst at Forrester Research.

Costing Out SSDs
To be sure, despite their declining prices, SSDs remain much more expensive than hard drives on a cost-per-gigabyte basis. Depending on performance, SSDs can range in cost from $3 to $20 per gigabyte (about Rs 135 to Rs 900), says Jim Handy, an SSD analyst with Objective Analysis, a semiconductor market research consultancy. Hard drives range in price from 10 cents per gigabyte on the low end to $2 to $3 per gigabyte (about Rs 4 to Rs 135) for enterprise-level, 15,000-rpm models. “The ratio between SSD and HDD pricing is about 20:1 for the bottom end of both technologies, and this will stay in effect for the next several years,” Handy says. Gartner analyst Joe Unsworth puts the average cost of SSDs at about 10 times that of hard disk drives.
But cost-per-gigabyte is not the most important factor in some of these access-heavy datacenter applications. Rather, it’s cost per IOPS (input/output operations per second). The average enterprise-class, 15,000-rpm hard drive achieves 350 to 400 IOPS, says Scott Stetzer, director of enterprise SSD products at STEC, an SSD vendor. The average enterprise-class SSD can push 80,000 IOPS. “It’s a world of difference in the level of performance.”
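Put the two metrics side by side and the argument becomes clear. The drive capacities and unit prices below are assumptions picked from within the ranges quoted above, purely to illustrate how the comparison flips once you divide by IOPS instead of gigabytes.

```python
# Cost per gigabyte vs. cost per IOPS (drive sizes and prices are illustrative assumptions).
hdd_gb, hdd_price_per_gb, hdd_iops = 146, 2.50, 400        # 15,000-rpm enterprise disk
ssd_gb, ssd_price_per_gb, ssd_iops = 100, 10.00, 80_000    # enterprise-class SSD

for name, gb, per_gb, iops in [("HDD", hdd_gb, hdd_price_per_gb, hdd_iops),
                               ("SSD", ssd_gb, ssd_price_per_gb, ssd_iops)]:
    price = gb * per_gb
    print(f"{name}: ${per_gb:.2f}/GB, ${price / iops:.3f} per IOPS")

# HDD: $2.50/GB, ~$0.91 per IOPS; SSD: $10.00/GB, ~$0.013 per IOPS
```

On a per-gigabyte basis the SSD looks four times as expensive; on a per-IOPS basis it is roughly seventy times cheaper under these assumptions.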

7-year Forecast: Enterprise HDDs vs. Enterprise SSDs (units in millions)
Year    HDDs    SSDs
2007    27.8    0.0
2008    33.8    0.1
2009    37.0    0.2
2010    40.2    0.4
2011    42.9    1.4
2012    43.7    3.5
2013    42.7    5.4
The seven-year compound annual growth rate is projected to be 4.8 percent for hard-disk drives and 154.8 percent for solid-state drives.
Source: In-Stat

Such performance benefits can outweigh the cost differential in certain instances, particularly once you factor in savings from energy costs (SSDs use less energy than hard disks). Applications that require extremely fast location and/or retrieval of data — like credit-card transactions, video on demand or information-heavy Web searches — stand to benefit from solid-state technology. Although the total share of the enterprise market that uses enterprise SSDs will remain small, the technology will play an important role in these critical applications, analysts predict.

SSDs in the Datacenter
SSDs are making their way into datacenters in several ways. First, most server vendors




are offering SSDs as options, either as a replacement for a hard drive or in addition to one. Fusion-IO even offers an SSD on a PCI card. Second, most storage vendors are incorporating SSDs into their systems. EMC, for example, buys SSDs from STEC and incorporates them into its Symmetrix and Clariion products. And finally, several companies are building general-use data-access appliances that incorporate SSDs. For instance, Schooner Information Technology has developed appliances for the Memcached caching system and for MySQL databases. Rather than targeting specific functions like business analytics or Web caching, as previous data appliances have, Schooner incorporates flash in an appliance that aims to improve performance of the entire data access tier of the datacenter, explains John Busch, Schooner’s chairman and chief technology officer.

Where SSDs Make Sense
SSDs appeared on the radar of datacenter operators in the last 12 to 18 months. “It became an increasingly frequent conversation [with customers] about 13 to 14 months ago,” says Steve Merkel, director of solutions engineering at Latisys, a managed service provider. “We started running a few scenarios where it looked like SSDs would be a reasonable technology to use.” Today, there are two prime uses. The Planet, which runs eight datacenters and hosts 18 million Web sites worldwide, sees customers using SSDs on their host servers to speed Web analytics





[Chart: Forecast of enterprise SSD shipments, in thousands of units, 2007 to 2013. Source: Gartner Inc.]

applications, says Rob Walters, director of product management. Latisys sees a similar uptake. One customer, for example, does analytics and profiling of Web site visitors, then offers up targeted ads. “They have very large data sets,” says Merkel. “They can’t sit around and wait for traditional storage [response times].” Others simply need their databases to run faster, so they use SSDs rather than traditional storage arrays. One Latisys client, for example, is using SSDs to increase the speed at which it maps address changes for the US Postal Service. Rather than put entire databases in the system’s main memory, which would be extremely expensive, the company stores data on SSDs for quick access. Using flash memory in this way is about 10 times more expensive than storing it on hard drives, but about 10 times less expensive than storing an entire database in dynamic RAM, says Forrester’s Reichman. A third option is to put frequently accessed information in cache memory. To cache or not to cache was a dilemma for NextStop.com, a Web service designed to help people find and recommend things to do in particular locales around the world. Launched in June by two former Google product managers, the service was running into problems with slow page loading. The company had invested

in a caching infrastructure at its managed hosting provider, but that wasn’t helping much because there was no way to predict what needed to be in the cache. “Take a page about a particular restaurant in Singapore, for example,” says NextStop co-founder Carl Sjogreen. “That page isn’t accessed every five minutes — it’s only accessed when people are searching for it, and that happens sporadically.” And yet, when someone does want to access that page, it needs to be available quickly, so caching doesn’t solve the problem.
Self-professed dorks, Sjogreen and NextStop’s other co-founder, Adrian Graham, had been tracking the development of SSD technology for some time and thought it might solve their problem. When Intel announced its X25-M SATA Solid-State Drive in late 2008, NextStop started testing it and pushed its managed service provider to offer it. About four months ago, NextStop replaced the hard drives in its RAID with SSDs and saw an immediate decrease in average page load time of more than 50 percent. “We found SSDs to be an order of magnitude cheaper than putting all that content in memory,” says Graham. “SSDs are more expensive than disk drives, but they are a lot cheaper than RAM. It was a nice middle ground for us.”
In fact, in certain situations, SSDs might actually be less expensive than using hard drives. When trying to squeeze the highest performance out




of hard drives, datacenter operators sometimes use only a small fraction of the outside of each spinning disk, a process called short-stroking, explains Reichman. The technique requires more drives and uses only 10 percent to 15 percent of the capacity of each drive, wasting the rest of that capacity. “So if you’re using only 10 percent of each drive, and [SSDs] are 10 times as expensive, it could be cost-effective to replace 10 hard drives with one SSD, and you’d end up getting better performance and potentially equal or lower cost,” he explains.
Those types of highly specialized uses aside, Reichman isn’t seeing datacenters widely adopting SSDs. While storage vendors see SSDs playing a role in tiered storage, most datacenter operators haven’t yet figured out which data can really cost-justify the high performance of SSDs. “They often don’t know which volumes are the most latency-sensitive,” he says. Without knowing that, why buy media that’s 10 times more expensive?
Another barrier to using SSDs in tiered storage is an inability to accelerate only certain parts of a database. “In most storage systems, you have to provision the whole volume on the same tier,” Reichman explains. “So if anything in that table needs to be accelerated, then the whole thing has to be accelerated.” Storage vendors are starting to add features that allow selective tiering of databases. Such block-level tiering “could be the killer app to allow users to take advantage of solid state economically,” he says. Toward that goal, EMC recently released FAST technology that supports automated database tiering, allowing SSDs to be used to their full potential.
Managed service providers are starting to experiment with ways of integrating SSDs into a storage hierarchy, using each type of storage in the most economical way. For example, The Planet is using SSDs in a shared infrastructure service that it plans to offer its clients soon. The infrastructure consists of 10 diskless host machines using a centralized storage array that contains both hard disks and SSDs.
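Reichman’s short-stroking trade-off is easy to put in numbers. The drive prices below are assumptions chosen only to illustrate his point; the argument is simply that paying roughly ten times more per gigabyte can still come out cheaper once most of each hard drive’s capacity is deliberately wasted.

```python
# Short-stroked HDDs vs. one SSD (all prices are illustrative assumptions).
hdd_cost = 400                    # assumed price of one enterprise 15k-rpm drive
usable_fraction = 0.10            # short-stroking uses only ~10% of each platter
ssd_cost = 3000                   # assumed price of one enterprise SSD (~10x HDD $/GB)

drives_replaced = 10              # per Reichman: roughly 10 short-stroked drives -> 1 SSD
short_stroke_total = drives_replaced * hdd_cost

print(f"10 short-stroked HDDs: ${short_stroke_total}, one SSD: ${ssd_cost}")
# 10 short-stroked HDDs: $4000, one SSD: $3000 -> the SSD can win on both price and IOPS
```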



On the Horizon: Flash Cloud
How cloud computing could be flash storage’s savior.
Cloud computing and flash-based storage, two of the fastest-growing technologies, may drive each other forward as Internet-based service providers demand faster access to large amounts of data. Flash storage has lower reading latency than hard disk drives because it doesn’t need to spin a disk to get data. With SSDs (solid-state disks) and PCI Express flash cards, it’s possible to read data in less than a millisecond, compared with several milliseconds on a hard drive. And this could be useful in public clouds, where one service provider may be delivering data to hundreds or thousands of customers at the same time. “Flash is going to have a very, very significant effect on not just storage, but infrastructure as a whole,” says Facebook VP of Technical Operations Jonathan Heiliger.
Since the 1990s, financial firms and other companies have stored large amounts of transaction data in DRAM for quick access, says storage consultant Tom Coughlin. Flash isn’t quite as fast as DRAM, but it’s less expensive, takes up less space, uses less power and holds onto its contents whether it’s powered or not. As a result, IT leaders are starting to see it as a more affordable path to fast reads of certain types of information, such as metadata, transaction data and bits needed for transactions. And this makes it interesting for online entertainment and Internet-based companies, says Coughlin.
But aside from questions about long-term reliability, one of the biggest remaining challenges for the technology is how to put the right data in it. Cloud computing companies are likely to be at the cutting edge of overcoming those challenges as they try to keep up with booming Internet activity. “The decisions they make, or the innovations they create, are likely to change the market in a big way,” says Forrester Research analyst Andrew Reichman.
— By Stephen Lawson

The Planet will use the SSDs as a caching layer to accelerate the disk performance. The shared infrastructure will be more resilient and offer better performance for the same price that some customers pay to lease low-end servers, says Walters.

Game-changing Technology
Long term, SSDs may end up changing the architecture of the datacenter. “I don’t think we’re seeing all the performance gains from SSDs that we could see,” says NextStop’s Graham. “There are more gains to be made by thinking about how to re-architect some of these systems that have for a long time been optimized to work well with disks.” That’s the focus of some vendors, like Schooner, which emphasizes that it isn’t just adding SSDs, it’s focusing on revamping

the entire data-tier architecture. “The [individual component] technology today is way ahead of what the architectures can utilize,” says Busch, adding that re-architecting could produce 10 times greater improvement in performance.
No one’s arguing that SSDs are going to replace hard drives. Although market research firms, including In-Stat, expect enterprise SSD shipments to remain minuscule for many years to come compared with sales of hard disk drives, those small numbers may belie solid state’s impact. “When these solutions are done right,” notes Gartner’s Unsworth, “SSDs have a transformational capability within the datacenter.”

Send feedback on this feature to editor@cio.in






Virtual Headaches
By Gary Anthes

Storage virtualization is hot, and for good reason. But its benefits bring added layers of complexity. What to watch out for.


Home Remedies
There’s an age-old choice in IT — whether to adopt a best-of-breed strategy for the power and flexibility it can bring, or go with a single vendor for accountability and simplicity. J. Craig Venter Institute (JCVI) believes in best of breed. The genomic research institute runs Linux, Unix, Windows and Mac OS in its datacenter. For storage, it draws on technology from multiple vendors. “It’s quite a heterogeneous environment,” says computer systems manager Eddy Navarro. “Thankfully, we have a very talented staff here.”
And a talented staff was just what was needed to master the many flavors of storage virtualization, which can make multiple physical disks look like one big storage pool. Like JCVI, many organizations are enjoying the lower costs and added flexibility of storage virtualization. But the benefits can come with some headaches. Here, five IT leaders who have led successful storage virtualization projects offer advice for relieving the pain.

Headache 1: Managing Multiple Vendors
For several years, JCVI had employed software-based virtualization in the form of Red Hat’s Linux Logical Volume Manager, which allows logical partitions to span multiple disk drives. More recently, the not-for-profit research institute added hardware-based virtualization in the form of NetApp’s V Series system to create a single virtual pool of storage consisting of EMC Symmetrix disks and legacy Clariion disks. The Clariion drives, which came into the datacenter from a corporate merger, were being poorly utilized, Navarro says. Now, the NetApp V system reformats data going to and from the EMC disks, “and then you carry on just as if it’s another NetApp system,” Navarro says. That enabled JCVI to wring better performance from the legacy disks. Each of JCVI’s vendors makes its own unique contribution to a


powerful and cost-effective storage architecture, Navarro says. But the diversity comes at a cost. “When you are talking about multiple vendors’ hardware — and they compete with each other — it may not be the easiest thing to get support when something goes wrong,” he says. “So you have to ensure compatibility first and foremost, and you have to know in advance something is going to work.” Pain Killer: Study the documentation, do your homework, and ensure that your approach has been tried before and is certified by the vendors, says Navarro. And if you don’t have experienced technical staff, he adds, be prepared to hire some outside professional help.

Headache 2: Dealing with Extra Layers of Technology and Complexity
Even companies with less-complex environments report that although virtualization can ultimately simplify storage administration, putting it in place and tuning it is a demanding job. Lifestyle Family Fitness, a rapidly growing chain of 60 health clubs, is a Microsoft shop built around SQL Server and .Net development of Web applications. For storage virtualization, it uses IBM’s SAN Volume Controller (SVC), disk arrays from IBM and EMC, and IBM Brocade SAN switches. IBM DS4700 disks provide 4Gb/second Fibre Channel connections for the company’s online transaction processing applications, while the Clariion drives handle less-demanding jobs like backups. The IBM SVC was brought in to resolve an I/O bottleneck. The high-speed Fibre Channel drives and cache on the SVC appliance opened up the bottleneck almost like an I/O engine would, says Mike Geis, director of IS operations. Moreover,

Before venturing into storage virtualization, make sure to stick this list on your board.
1. Be realistic: This is going to be complicated.
2. Assign someone on staff who really knows the technology, or hire a consultant, at least at the beginning.
3. Do your homework. Read the documentation and understand the pieces and their interfaces.
4. Be sure that your gear and their interfaces are certified by your vendor(s) for the versions/releases that you have.
5. Consider upgrading your old storage gear when you go to storage virtualization.
6. Make sure you have thoughtful policies and procedures for maintenance and backups.
7. Guard against virtual sprawl (of both storage and servers).
8. Ask yourself: Do I really need this?

the setup allowed Lifestyle Family Fitness to use its new IBM-based SAN while continuing to use its old EMC SAN. “In the past,” he says, “you’d bring in a new SAN and have to unload the old one.” Geis says the SVC architecture promises vendor independence. He says he has a “great relationship” with IBM, but if that ever changed, he could easily bring in drives from another supplier and quickly attach them directly to his storage network. “We aren’t held hostage by the vendor,” he adds. But the advantages come with some difficulties, Geis notes. “You are adding complexity to your environment. You add overhead, man-hours of labor, points of failure and so on. You have to decide if it’s worth it.” Pain Killer: “Pick strong partners — both vendors and implementation partners — and make sure you are not their guinea pig,” Geis advises.


Headache 4: Setting Up Those Important Management Tools
Like Rose, Jon Smith takes a very broad view of virtualization. “For me, a server is no different from a hunk of data storage, and I can move it wherever I want,” says the CEO of ITonCommand, a hosted IT services provider. “Whether it’s running the operating system or it’s just data, it’s all storage.” Smith says that eventually virtualization technology will enable any data to go anywhere — on direct-attached storage when high performance is needed, or somewhere on a SAN when speed is less critical and a higher level of redundancy is required. ITonCommand uses HP BladeSystem c3000 disks for direct-attached storage, and LeftHand Networks Virtual SAN Appliances and LeftHand’s SAN/iQ software on an HP StorageWorks array for storage virtualization on its iSCSI SANs. The company is now standardizing on Microsoft’s Hyper-V hypervisor, part of Windows Server 2008, for server virtualization and on Microsoft’s System Center Virtual Machine Manager


for administration. The glue that holds everything together, Smith says, is Microsoft’s new Virtual Machine Manager for provisioning and managing physical and virtual computers. One of the ideas behind virtualization is to ‘abstract’ the physical layer in IT from the software layer, to in essence mask hardware boundaries from the application and the application’s users. But the benefits of hiding the physical resources — greater flexibility, better utilization and potentially easier administration — come at a price, says Eddy Navarro, computer systems manager at J. Craig Venter Institute. “So you have this abstracted area of storage, and you have a performance issue,” he says. “In the traditional model, it’s a straightforward deduction to say that this area maps to these disks so that must be where the hot spot is. But with virtualization, if things are running slowly and there’s this amorphous pool of storage, where exactly is the problem? You want to make sure you have the proper tools to tell you where the problems are.” Storage virtualization vendors have tools for performance monitoring and troubleshooting. “But with these

enterprise tools, it’s a matter of installing agents everywhere, and it can balloon out of control,” Navarro warns. “The agents themselves can cause this giant admin task. So is it really worth it to have this huge application monitoring things, or do you want a little bit of smarts and do some in-house work to write some custom scripts to tell you what’s going on?” JCVI has chosen to apply carefully targeted smarts via some homegrown software. “Fortunately,” Navarro says, “we have the technical expertise to do that. If you don’t, it’s not easy to set that up.”

“With VMM on a display, a system admin can look at all the virtual servers’ hypervisors across my whole environment, all in one spot, and adjust them,” Smith says. “It’s pretty cool stuff.” It’s cool when it’s set up, but getting there isn’t so easy, he acknowledges. “System Center is new, and so is [Hyper-V]. It took us a while to figure out how to connect all our old virtual machines into the hypervisor. It’s not the easiest setup out of the box.” Smith says continued virtualization at ITonCommand will result in a true “utility computing” model for his clients. “It will take a while, but people will stop thinking of physical boxes running one



operating system. Hardware will be nonexistent to the end user. It’s just going to be, ‘How much horsepower and storage do you want?’” Pain Killer: “Find an expert who knows virtual technology and knows Microsoft System Center,” says Smith.
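Navarro’s “little bit of smarts” approach need not be elaborate. The sketch below is a hypothetical example of the kind of homegrown check he describes — it assumes a Linux host that exposes per-device I/O counters in /proc/diskstats and simply reports which physical devices behind the virtual pool are busiest; it is not JCVI’s actual tooling.

import time

def sample():
    """Read per-device I/O counters from /proc/diskstats (Linux only)."""
    stats = {}
    with open("/proc/diskstats") as f:
        for line in f:
            parts = line.split()
            name, io_ms = parts[2], int(parts[12])   # field 13: total ms spent doing I/O
            stats[name] = io_ms
    return stats

def busiest_devices(interval=5.0, top=5):
    """Sample twice and report which devices were busiest in between --
    a cheap way to locate the hot spot behind an 'amorphous' virtual pool."""
    before = sample()
    time.sleep(interval)
    after = sample()
    util = {dev: 100.0 * (after[dev] - before[dev]) / (interval * 1000.0)
            for dev in after if dev in before}
    for dev, pct in sorted(util.items(), key=lambda kv: kv[1], reverse=True)[:top]:
        print(f"{dev:10s} {pct:5.1f}% busy")

if __name__ == "__main__":
    busiest_devices()

Run during a slowdown, a script like this turns the “amorphous pool of storage” into a ranked list of suspects, without installing agents everywhere.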

Headache 5: Getting the Right Gear
Babu Kudaravalli, senior director of business technology operations at National Medical Health Card Systems, gives this definition of storage virtualization: “The ability to take storage and present it to any host, of any size, from any storage vendor.” He’s pursuing those goals with three tiers of storage, each supported by a different HP StorageWorks product. The technology used in each tier is chosen for the mix of cost, performance and availability it offers. Kudaravalli uses high-end HP XP24000 disk arrays for the most demanding and mission-critical applications, lower-cost Enterprise Virtual Array 8000s for second-tier applications, and Modular Smart Array 1500s for archiving, test systems and the


like. His five SANs hold 70TB of data, of which about 35TB in the EVA and MSA tiers is virtualized, he says. Kudaravalli says there are several things to be careful about when buying storage virtualization products. First, be aware that vendors typically certify their products to work with the latest versions of other vendors’ products. If you don’t have those exact versions, your interfaces might not work. He says this is a good reason to think about replacing your old gear when you go to a heterogeneous storage environment — or at least to keep current on the latest releases. Second, Kudaravalli says that although virtualization should ultimately simplify storage management, setting up a virtual system is complex. Careful planning and an understanding of the limitations of products is crucial. A few years ago, vendors had very different definitions and standards for virtualization, says Kudaravalli. “But now they seem to be coming together,” he says. “They are trying to offer similar features and capabilities, but it is not completely mature.” Pain Killer: Although storage virtualization is often undertaken to better utilize existing resources, it may

have a perverse impact, says Rick Villars, a storage analyst at IDC. “The whole point of virtualization is to make it easier to provision or move a resource, to create a new volume or another snapshot, or to migrate data from one system to another,” he says. “But when you make something easy to do, people are induced to do it more often.” According to Villars, volumes, snapshots, data sets and even applications can needlessly proliferate. “You can go from being more efficient to more wasteful. It’s just what can happen with virtual server sprawl.” Preventing that is a matter of policies, procedures and good business practices, not technology, he says. Users agree that there are many technical details to master when pursuing storage virtualization. But Navarro suggests starting with a basic question: Why am I doing this? “Virtualization is a hot word, a big thing. But is it really necessary? There are benefits, but ask yourself if you are doing it for the right reasons, or just because you want to be on the cutting edge. It’s very easy to get swept up in these groupthink movements.” CIO Send feedback on this feature to editor@cio.in



by Matt Prigge

Scouting for the Perfect Fit

Strategy | The endless variety of enterprise storage solutions means that you can find pretty much exactly what you need — if you have your requirements straight. The trick to avoiding overspending or under-specification is to know what questions to ask when you’re deciding which option is right for you. Here are three questions I seldom see asked or answered with enough care or thought:

What is the expected lifetime of the solution I am designing? If you want to prevent your budget from being shattered by unexpected expenses a few years down the road, give very serious thought to how long you want your storage solution to last. Sudden cost increases in the out years usually stem from unplanned capacity growth and unexpected late-year support costs. Define the period of time that your solution is likely to be used, and you reduce the risk of either eventuality. When there’s a data explosion, it’s always tricky to assess your future storage needs without low-balling them or simply coming up with a fantastically large number. Giving yourself an accurate estimate will require lots of information about your storage needs over the past year or two — both the rate of capacity- and performance-load increases. Keeping future storage-intensive initiatives in mind, such as that pie-in-the-sky paperless office everybody keeps talking about, is important too. Either way, you can’t make useful forward-looking estimates without knowing the period of time that you’re planning for.

Understanding how your prospective storage vendors price their equipment and support also plays into this heavily. Some vendors skew their prices toward the hardware, while the all-important support contracts that apply to it are relatively cheap and easy to plan for. Other vendors will sell the hardware for a lot less, while the support and software subscriptions make up the bulk of what you’re paying for. Knowing which boat you’re in before you buy is absolutely critical. It may lead you to buy a five-year support contract at the outset, when you have more bargaining leverage and competitive advantage. Otherwise, you could be stuck with a support renewal bill three years down the road that makes replacing the entire storage system seem like a more cost-effective option.

How will my storage grow with me? Any enterprise-class storage solution should be able to scale with your needs. The differentiating factor among storage vendors is exactly how they support that growth and how much it will cost you. For example, most vendors scale their storage capacity by simply adding more shelves of disk onto a common controller architecture. Pricing these expansions out before you’ve bought anything is a good way of gauging what these iterative increases may cost if and when you need them. In these architectures, it’s important to recognize that, as you add more storage capacity and disk performance, you may not be adding more resources to the controllers. Knowing that the controller resources you’re purchasing are sized correctly for the maximum potential growth during the solution’s lifetime is critically important. Other vendors scale their storage systems by adding entire shelves that include both disk and controller resources. This architecture tends to scale much more


gracefully from a technical standpoint. As more disk is added, so is a comparable amount of controller and interconnect bandwidth, which essentially guarantees that you won’t create a bottleneck. On the other hand, these architectures generally require more expensive growth iterations. If you simply need a small amount of additional storage, you might not be able to simply add a few disks; instead, you’d end up buying a whole new enclosure complete with controllers and a full stack of disks. Regardless of the architecture you end up with, knowing how your solution will scale — even if you don’t think you’re going to need to scale it — is very important. Before you buy, ask yourself what the solutions you’re evaluating would cost if you needed to scale them at twice the highest rate you think you’ll need. Hopefully your estimates are accurate and you’ll never need to do that, but the answer to that question can often shed light on critical scalability differences between the options you consider.

Will I have enough capacity to make use of the features I’m buying? Capacity planning is an interesting process. On one hand, it seems fairly simple — just add up the amount of storage you’re using now and do your best to assess how much that number is likely to grow. But it certainly doesn’t stop there. One of the reasons we spend the kind of money we do on enterprise-class storage is that it generally comes with a lot of very helpful features, such as site-to-site replication and block-level snapshots. You might even have the ability to de-duplicate your data directly on your primary storage array. None of these features comes without costs: not just any licensing that might be required for them, but also less obvious costs related to capacity and performance. For example, taking regular SAN-side snapshots of your data, a practice I highly recommend, will use additional storage space on your array that you may not have planned for. Depending upon the


Monitor Your Storage, Or Else...

Long-term trending is critical to maintaining a surprise-free storage infrastructure.

Why you need to monitor storage: A comprehensive storage monitoring strategy helps you catch problems before they’re problems. Whether it’s simple stuff, such as being able to forecast when you’ll need more disk, or the more complex task of determining whether an application slowdown is storage-related, monitoring is often the only way to answer these questions with any certainty. Without an early-warning system, sooner or later you will be faced with an unexpected capital investment or a prolonged troubleshooting adventure.

What you need to monitor: So if monitoring is so important, what exactly do you want to see? The most obvious thing people think about is capacity. Having graphs of storage pool usage going back for years is incredibly helpful when figuring out when additional storage resources may be necessary. But if the CFO asks you today whether you’ll need to make additional storage purchases next fiscal year, can you answer that question with any certainty? If not, you may not have the right information in front of you. Simply knowing how much storage is in use now won’t help — you really need to know how much was in use two years ago and how much it has grown since.

How to monitor your storage: This usually depends on what you have. Different SAN and NAS manufacturers provide vastly different monitoring capabilities and tools — and some don’t provide any. Some vendors provide dedicated, easy-to-install tools that monitor just about anything you’d need to know. Other vendors provide complex, hard-to-manage enterprise monitoring tools that lack flexibility. If your storage platform doesn’t come with decent monitoring software, you may be able to get all of the information you’ll need through SNMP or something similar. Then the trick is figuring out how to record the information and store it for review later. — M.P.
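The record-and-review step is the part most shops skip, yet it needs very little to get started. The Python below is a minimal, hypothetical sketch (the file name, growth model and units are placeholders, not a product’s tooling): append one capacity sample per day, then use the oldest and newest samples to estimate the growth rate and project usage over a planning horizon — including the “twice the expected rate” scenario worth pricing before you buy.

import csv, shutil, time

def record_sample(mount_point, history_file="storage_history.csv"):
    """Append one timestamped capacity sample; run daily, e.g. from cron."""
    usage = shutil.disk_usage(mount_point)          # bytes: total, used, free
    with open(history_file, "a", newline="") as f:
        csv.writer(f).writerow([int(time.time()), mount_point, usage.used, usage.total])

def project(history_file="storage_history.csv", horizon_days=365):
    """Estimate compound annual growth from the oldest and newest samples and
    project usage one planning horizon out, plus a 2x-growth scenario."""
    rows = [r for r in csv.reader(open(history_file)) if r]
    (t0, _, used0, _), (t1, _, used1, total) = rows[0], rows[-1]
    t0, t1, used0, used1, total = int(t0), int(t1), int(used0), int(used1), int(total)
    years = max((t1 - t0) / (365 * 86400), 1e-9)
    growth = (used1 / used0) ** (1 / years) - 1      # compound annual growth rate
    for label, rate in [("expected", growth), ("2x rate ", 2 * growth)]:
        projected = used1 * (1 + rate) ** (horizon_days / 365)
        print(f"{label}: {projected / 2**40:.1f} TB of {total / 2**40:.1f} TB "
              f"({100 * projected / total:.0f}% full)")

A cron entry calling record_sample() once a day builds enough history to answer the CFO’s next-fiscal-year question with something better than a guess.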

snapshot algorithm that’s being used, your transactional disk performance may also take a hit. Estimating the performance degradation and planning additional performance capacity to combat it may save you from buying an expensive feature — and turning it off because it slows your storage system down too much. By the same token, if you’re configuring your solution to replicate to another building or site and planning on using synchronous replication, you may need to plan additional disk and controller resources to handle the additional load that synchronous replication will exert on your hardware. The same applies to online de-duplication. Though a huge capacity-saving feature, online de-duplication takes a significant amount of controller resources to accomplish. If that’s a feature you’re planning to use extensively, you need

to know the impact it will have on your storage system’s performance. Good answers to these three questions will undoubtedly raise other, more specific ones. Coming up with detailed, realistic replies is vital — and will give you a much better chance of buying a solution that will fit future storage needs, without a raft of unexpected expenditures. CIO
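The snapshot and replication overheads discussed above can be roughed out before purchase in the same back-of-envelope spirit. The figures below are hypothetical placeholders, not measurements, and the sketch models window-based asynchronous replication; synchronous replication would instead have to be sized for peak write rates.

# Back-of-envelope sizing for snapshot space and replication bandwidth.
# All inputs are hypothetical placeholders -- substitute your own measurements.

primary_tb        = 20.0    # data on the volumes being protected
daily_change_rate = 0.03    # fraction of data rewritten per day (3%)
snapshots_kept    = 14      # daily snapshots retained
replication_hours = 8.0     # nightly window to ship the day's changes off-site

# Copy-on-write snapshots roughly consume the changed blocks per retained snapshot.
snapshot_overhead_tb = primary_tb * daily_change_rate * snapshots_kept

# Window-based asynchronous replication has to move one day's changes in the window.
daily_change_tb = primary_tb * daily_change_rate
required_gbit_s = daily_change_tb * 8 * 1000 / (replication_hours * 3600)

print(f"Extra capacity for snapshots : ~{snapshot_overhead_tb:.1f} TB")
print(f"Replication link needed      : ~{required_gbit_s:.2f} Gbit/s sustained")

With these sample inputs the answer is roughly 8.4TB of snapshot reserve and a sustained 0.17Gbit/s replication link — exactly the kind of number that belongs in the capacity plan before the feature is licensed.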

Matt Prigge is contributing editor to the InfoWorld Test Center, and the systems and network architect for the SymQuest Group. Send feedback on this feature to editor@cio.in



Your Life & Career Path

Working with the New Boss
By Diane Frank

Business Leadership | It doesn't always take a year full of global upheaval for a company to acquire a new CEO. The fact is, a company's top leadership can come and go, just like anyone else. And when money is tight and staff members are stressed, it's more essential than ever for CIOs to have a game plan for what to do when there's a new personality with new priorities in the top office. "Hang on to your hat," says Maurizio Laudisa, CIO of medical services company Lifelabs. "It's all about who you get." Gaining a firm grasp on that subjective aspect of the transition can be the key to doing more than just surviving, according to a CIO Executive Council survey. To flourish, CIOs should get themselves in front of the new CEO and get good and familiar with the person behind the title. As members of your company's CXO club, you need to be able to answer questions like: How do they like to communicate? How do they operate? And what are their experiences with IT? That last question is particularly important to tease out early, says Don Zimmerman, CIO of Wendy's/Arby's Group. No matter what a CEO knows about IT and how technology can advance the business, dealing with a new CEO whose experiences with IT are good versus one who has been burned is a much different situation.

Illustration by MM Shanith

Thrive

What you need to do to impress the new chief executive.




Three-Minute Coach
Help! How do I keep my team from getting derailed by social networking distractions?
Gaylan Nielson and Brent Peterson, authors of Fake Work and co-founders of The Work Itself Group.

Always: Ensure that every employee knows what work is critical. Our research shows that about 50 percent of all work is 'fake work', or work that is not directly linked to organizational strategy. Social networking and other workplace distractions are symptoms of a common problem — workers without clear expectations. Always translate organizational strategies into clear tasks and provide a forum for team discussion and alignment among co-workers. Alignment requires a concerted effort to create ownership, determine task importance, coordinate workloads and establish accountability for results.

Sometimes: Meet and have strategic conversations and reinforce accountability. Use real work tasks — those that are critical and connected to strategy — to manage and monitor performance. Managers should avoid noticing and rewarding non-critical work. Instead, review the obstacles and resource issues that obstruct real work and let the team help curb social networking distractions.

Never: Assume that strategies are understood and are finding their way into daily work. Managers think they can align people, but employees must adjust and find ways to drive strategies day to day. You can't ignore distracting behaviors, but ultimately you have to focus on monitoring real work tasks and expected outcomes. In areas like IT, exciting new projects often dominate, and strategic links are ignored. Most people don't like doing fake work. They want to accomplish work of value. CIO


Even for CIOs with a business background, laying this extensive personal groundwork can seem like a lot of slogging for little gain. But skipping it can have repercussions, warns Jim Burdiss, VP and CIO of Affinia Group. "It went very poorly the first time I went through this," he admits. "You need to gauge in that first meeting where all of their interest and experience comes from, and I couldn't have missed it more if I threw a dart from 100 yards. I was over-anxious, trying to justify what IT was doing and IT expenditures, and I didn't listen enough. I talked too much and I absolutely was not myself." Unless you're at a relatively small company, the CEO isn't going to have a personal bond with every member of the IT staff. But the CIO doesn't have to be the only one in the relationship. Lack of communication leaves the door open for rumors. Some CIOs have even taken the step of asking the new CEO to meet with senior IT staff, ensuring that at least some level of personal relationship goes deeper into the organization. Getting direct reports involved with establishing IT's message to the top also creates a higher level of comfort with long-term benefits for both sides. No matter what steps CIOs take for themselves or their staff, one of the biggest opportunities for the IT organization as a whole can be the chance to establish IT's role as a partner in value for the company and the CEO. "You have to market yourself and your staff as an innovative group that knows how to leverage technology for more effective business purposes," says Chris Barber, SVP and CIO of the Western Corporate Federal Credit Union. And sometimes CIOs need to stop talking and listen to a new perspective. "Be open and honest," suggests Allan Davies, CIO, Asia Pacific, for global logistics company Dematic. "But realize that like with all departments, a fresh pair of eyes on the IT organization can often see beneficial changes." CIO

Gaylan Nielson and Brent Peterson have been organizational consultants for over 20 years, working with Fortune 1000 companies. Send your queries to vijay_r@cio.in

Diane Frank is content development specialist for the CIO Executive Council, a peer advisory service and professional association founded by CIO's publisher. Send feedback on this feature to editor@cio.in



Insights from Members of the CIO Governing Council

Sunil Mehta

The senior VP & area systems director (Central Asia) of JWT joined the company in 1998. Mehta started his career as a programmer and systems analyst at First Financial Compuserve (FFC). After serving the company for five years, he moved to the Business India group — which was a client he had serviced as an employee of FFC — in the same role. Mehta rose to become the CEO of a group company (Business India Graphics) after serving as GM-IT. Before joining JWT, he was VP and CEO-special projects at Dalal Street Journal (DSJ).

Role Playing
From being a CIO to a CEO and back, Sunil Mehta tells you what you need to do to play both roles with aplomb.

Photo by Srivatsa Shandilya

Career | A better car, a bigger house, and more money — people think being CEO means getting all this and more. CIOs often look towards the other side thinking the grass is greener there, but I believe that it is equally green on both sides. It just depends on the way you look at it. Being CEO isn’t a cakewalk. When I was with the Business India group, I set up a new unit called Business India Graphics, of which I was made the CEO. There are serious responsibilities attached to that position. Not just because the title is heavier, but because it asks for a much higher level of maturity. So, if you want to be CEO, you have to go beyond technology. That's why, even as CIO, I used to take an interest in establishing new ideas, the global diversification of the company, and strategies for tying up with MNCs. It isn’t easy. There is no template that you can follow or advice that can help you on your way to CEO-dom. You have to show zeal and potential, be proactive, and come up with things that you would do differently. People keep talking about board-level involvement for a CIO, but it’s the executive management committee that runs the company on a day-to-day basis. The board is a leadership group that meets once or twice a year. So, I feel that getting on that committee is more important in terms of going beyond one’s regular role. I have been lucky to have reported to the MD, Chairman, and CEO in my career. This helped me be a part of the inner circle of


the higher management. I think it is very crucial for CIOs to have constant and close interaction with the management because they need to understand the business intentions of their organizations. At the end of the day, some CIOs are domain experts, some are business enablers, and some others are visionaries. Those are the guys that will become CEOs. But CIOs today are very cautious and tend to play by rule books, policies, compliance guidelines, etcetera. A CEO's role, on the other hand, demands instant decision-making capabilities and an appetite for risk, keeping in mind the accountability that rests on his shoulders. My CEO stint helped me focus more on revenues, finances, cost management, people management, and growth. Being CIO is good enough. There is nothing wrong in sticking to being CIO. You can even come back to being a CIO after holding the CEO position. If what you are doing excites you, then there's nothing like it. Like I said, it becomes really interesting

once you go beyond your IT-only role. The CEOs and CFOs I have worked with have been very supportive in recognizing the business initiatives that I have suggested. So, even if you are not titled as CEO, you definitely assist the company in making sure that good ideas are implemented. And to be able to do that you need to keep yourself updated with the latest. After all, a technology guy is like a doctor: you can't treat a patient with a medicine or a skill you picked up some 30 years back. CIO

As told to Anup Varier. Anup Varier is a correspondent. Send feedback on this interview to anup_varier@idgindia.com


