

From The Editor

The Worm Turns
Business executives are becoming more accountable for IT results.

The last cover story of CIO (A New Game Plan) looked at how IT strategists are evolving into business strategists. It explored how CIOs are beginning to delve into business metrics, and not just IT measures, to determine the success of IT projects. As our advisory board sees it, IT executives are becoming more accountable for business results.

An equally interesting trend is that business executives are becoming more accountable for IT results. Take the case of ICICI Bank, which takes a different approach to IT projects. Its MD & CEO, K.V. Kamath, tells us in this issue that in his organization, business units 'own' technology and implement it. A major benefit of this, according to him, has been a dramatic reduction in the number of failed projects as well as a reduction in implementation time. "Since the user departments are conscious of their own overheads, it leads to further savings," he adds. He goes on to say that the CIO's role in all this is to cut down "technological anarchy", since various user groups can end up creating systems that don't communicate with one another. "Here, a CIO is somebody who brings together and keeps in harmony the various user groups in the context of their technological needs," he observes.

A CIO I was speaking to the other day was appreciative of the banker's remarks, though he observed that if various departments in a multi-business unit scenario did indeed own their technology, then the CIO would have an extremely difficult time attempting to bring all of them onto a common platform, since business unit heads would have no obligation to listen to him: their loyalties would lie with their own units, for which they are responsible and against whose performance they are measured.

If CIOs don't step up efforts to be a part of decision making, they may find themselves out of a meaningful role.

I find this a scary development. Fascinating for sure, but still scary. Corporate executives scale managerial heights because the expertise they bring along is vital to an organization's growth. Whether this means that IT leaders talk business or business leaders tilt toward IT is hardly a moot point from an enterprise's point of view. Either way, if CIOs don't step up their efforts to be a part of corporate decision-making, they may find that if they're still around, it won't be to provide their company with weapons to take on the competition. It will be to take orders and put out technological fires.

Vijay Ramachandran, Editor vijay_r@cio.in



Content | July 15, 2006 | Vol/1 | Issue/17

Executive Expectations
View From The Top | 36
K.V. Kamath, MD & CEO, ICICI Bank, has an unusual approach to IT implementations: he ensures that user groups take complete responsibility. CIOs, he says, should focus on harmonizing these groups.
Interview by Gunjan Trivedi

Executive Coach
IT's Good News | 23
CIOs need to help their staffers understand that if they can hold on during the tough times, the payoff is just around the corner.
Column by Susan Cramm

Photo by Srivatsa Shandilya

Sajid Ahmed, global head-infrastructure, Syntel, has a network philosophy that is rooted in the manufacturing sector: too much inventory is always too bad.


Network Infrastructure

COVER STORY | A Cable Fable | 26

Cover: Imaging by Binesh Sreedharan


The 10G Ethernet standard might have had varied applications for IT services company Syntel and plant nutrients manufacturer Nagarjuna Fertilizers. But both believe it is the way to future-proof the enterprise. Feature by Gunjan Trivedi


Peer to Peer
Red Light, Green Light | 20
How do you align your IT department with the company's business goals? One CIO used project management discipline and the Traffic Light Report.
Column by Dr. Catherine Aczel Boivie

Energy
Powering Down | 40
Electricity-hungry equipment, combined with rising energy prices, is devouring data center budgets. Here's what you can do to get costs under control.
Feature by Susannah Patton

more »



Content (cont.)

Departments

Trendlines | 15
Internet Solutions | Indian SMBs to Spend Big
Staffing | Not More but Better Engineers
Networks | FIFA Network Tackles Tough Challenges
Technology | Next World Cup's IT Test
Security | Robots Patrol Berlin Football Stadium
Innovation | Great Play on Small Screens
Book Review | Why Technology Fails
Management Report | User-Friendly IT Governance
Telephony | Five Steps to VoIP Success

Essential Technology | 54
Data Center Intelligence | The Shrinking Server
By Christopher Lindquist
Security | The Endpoint of Endpoint Security
By Scott Berinato

From the Editor | 3
The Worm Turns | Business executives are becoming more accountable for IT results.
By Vijay Ramachandran

Inbox | 14


NOW ONLINE For more opinions, features, analyses and updates, log on to our companion website and discover content designed to help you and your organization deploy IT strategically. Go to www.cio.in


Govern
Innovation on Track | 46
Transacting with 10 lakh passengers and making Rs 1 crore a day, few e-governance applications can top the success of the Passenger Reservation System. Now, 20 years since its launch, the Indian Railways is steaming ahead to better the system and improve customer experience.


Feature by Rahul Neel Mani

The Electric Reforms | 50
Using IT-powered solutions, Bharat Lal Meena, MD, Karnataka Power Transmission Corporation and chairman of the state's electric supply companies, has managed to track and control transmission and distribution losses — and brought the consumer into the loop too.
Interview by Balaji Narasimhan



Management
President: N. Bringi Dev
COO: Louis D'Mello

Editorial
Editor: Vijay Ramachandran
Assistant Editors: Ravi Menon, Harichandan Arakali
Special Correspondent: Balaji Narasimhan
Bureau Head-North: Rahul Neel Mani
Senior Correspondent: Gunjan Trivedi
Chief Copy Editor: Kunal N. Talgeri
Copy Editor: Sunil Shah
Editorial Director-Online: R. Giridhar

Design & Production
Creative Director: Jayan K Narayanan
Designers: Binesh Sreedharan, Vikas Kapoor, Anil V.K., Jinan K. Vijayan, Unnikrishnan A.V., Sasi Bhaskar, Vishwanath Vanjire, Sani Mani, MM Shanith, Anil T, PC Anoop
Photography: Srivatsa Shandilya
Production: T.K. Karunakaran, T.K. Jayadeep

Marketing and Sales
General Manager, Sales: Naveen Chand Singh
Brand Manager: Alok Anand
Marketing: Siddharth Singh
Bangalore: Mahantesh Godi, Santosh Malleswara, Ashish Kumar
Delhi: Nitin Walia, Aveek Bhose
Mumbai: Rupesh Sreedharan, Nagesh Pai, Swatantra Tiwari
Japan: Tomoko Fujikawa
USA: Larry Arthur, Jo Ben-Atar
Singapore: Michael Mullaney
UK: Shane Hannam

Bangalore: Mahantesh Godi, Tel: +91 9342578822, mahantesh_godi@idgindia.com, IDG Media Pvt. Ltd., 7th Floor, Vayudooth Chambers, 15-16, Mahatma Gandhi Road, Bangalore 560 001
Delhi: Nitin Walia, Tel: +91 9811772466, nitin_walia@idgindia.com, IDG Media Pvt. Ltd., 1202, Chirinjeev Towers, 43, Nehru Place, New Delhi 110 019
Mumbai: Swatantra Tiwari, Tel: +91 9819804659, swatantra_tiwari@idgindia.com, IDG Media Pvt. Ltd., 208, 2nd Floor, 'Madhava', Bandra-Kurla Complex, Bandra (E), Mumbai 400 051
Japan: Tomoko Fujikawa, Tel: +81 3 5800 4851, tfujikawa@idg.co.jp
USA: Larry Arthur, Tel: +1 415 243 4141, larry_arthur@idg.com
Singapore: Michael Mullaney, Tel: +65 6345 8383, michael_mullaney@idg.com
UK: Shane Hannam, Tel: +44 1784 210210, shane_hannam@idg.com

Advertiser Index
AMD 32, 33
Avaya 4, 5
Canon IBC (59)
HP 13
Interface 25
IBM BC (60)
Krone 29
Microsoft Cover Gatefold
Mercury 9
Netmagic 21
Raritan IFC (2)
R&M 11
RIM 57
SAS 19
Wipro 6, 7

All rights reserved. No part of this publication may be reproduced by any means without prior written permission from the publisher. Address requests for customized reprints to IDG Media Private Limited, 10th Floor, Vayudooth Chambers, 15–16, Mahatma Gandhi Road, Bangalore 560 001, India. IDG Media Private Limited is an IDG (International Data Group) company.

Printed and Published by N Bringi Dev on behalf of IDG Media Private Limited, 10th Floor, Vayudooth Chambers, 15–16, Mahatma Gandhi Road, Bangalore 560 001, India. Editor: Vijay Ramachandran. Printed at Rajhans Enterprises, No. 134, 4th Main Road, Industrial Town, Rajajinagar, Bangalore 560 044, India



Reader Feedback

Too Much Data

From all the issues of CIO I have gone through, I noticed that yours is the very first magazine with the perspective of a CIO who is more like a CEO and not merely a technology manager. It's an approach that is quite different from any other publication in the market. You rightly show a CIO as someone who strategizes not only for IT but for business as well. CIOs, I feel, are increasingly getting closer to business and are locating avenues for generating business-oriented productivity as well as profitable value for their organization.

Interestingly, I have noticed that the moment a CIO dons a CEO's hat, he realizes that he is actually providing an abundance of information — maybe too much. As a CIO, he does not realize the business value of the information he is making available in such abundance and continues to generate report after report. CIOs should be more business-oriented and should always look for value while making information available — especially to customers. A CIO should realize that in making information available to customers, there is an investment of the organization's assets and resources. And, therefore, returns are key. This realization can only come about when a CIO gets past his familiar role of information provider and steps into the shoes of a CEO. And here, CIO, as a magazine, is helping.

Speaking of information, I would like to confess that I am inundated with information coming from every medium, be it print or online. It's hitting me hard. The volume of information coming in from different media day-in and day-out makes it difficult to keep track of what I'm reading or where it's from. In fact, it's come to the point where I've had to get into the habit of indexing and filing relevant information to address this problem.

"CIOs should realize that making data available to customers means an investment from the organization. So returns are key."

S.R. Mallela
CTO, AFL

Customer Gauge

I read the CIO View From The Top interview of Habil Khorakiwala, chairman of Wockhardt (Growth Tonic, 1 February, 2006) with keen interest. Being a player in the healthcare sector here in Sri Lanka, I was interested in knowing more about the Infinity Prescription Information System, the IT initiative that Khorakiwala talked about, which enables Wockhardt to gauge a customer's experience and views — and the vendor for this system.

Kasturi Wilson
Director-Shared Services, Hemas Holdings, Sri Lanka

"Infinity is Wockhardt's indigenously developed sales force automation application that front-ends into the portal www.wockinfinity.com. This application facilitates online tracking of customer response and sales activities. I suggest that for more information you get in touch with the Infinity Cell at infinity@wockhardtin.com." — Editor

Management Bestsellers

I would like to compliment the editorial team and staff of CIO for the quality, content and coverage of various issues in the magazine. The selection of topics is admirable; the articles are focused on issues faced by CIOs on a daily basis. After a long stint on the non-IT side of business, doing business development, strategic planning and finance, I am currently a CIO. I find the articles in CIO particularly relevant because they deal with concepts from a management perspective and not just from a technology perspective. Some of my colleagues, who are functional heads, find that the articles make for interesting reading. I would like to suggest you also feature a page or a corner on management tips or excerpts from a current bestselling management book. I think it would be a worthwhile addition.

Vikas Gadre
CIO, Rallis India

What Do You Think?
We welcome your feedback on our articles, apart from your thoughts and suggestions. Write in to editor@cio.in. Letters may be edited for length or clarity.


Trendlines | new * hot * unexpected

INTERNET SOLUTIONS
Indian SMBs to Spend Big

Indian small and medium businesses (SMBs) will invest up to Rs 5,400 crore on beefing up their Internet infrastructure and solutions this year, says AMI Partners, a research firm headquartered in New York. This will account for almost a sixth of the total IT spending in the country this year, according to the firm. "We defined the businesses as small if their staff strength ranged between one and 99, and as medium if their employees numbered between 100 and 499," says Avimanyu Datta, an AMI analyst. "The biggest spenders among small businesses are in the retail sector, and the manufacturing sector for medium-sized businesses," he added.

SMB spending on Internet access is forecast to amount to nearly Rs 3,735 crore this year, a 24 percent jump from last year. Almost all medium businesses and more than half of PC-using small businesses already have access to the Internet. However, nearly half of the small businesses use dial-up access and, therefore, an untapped market still exists in the broadband Internet arena, Datta says.

The market is also untapped in the way the Internet is used in the country. "We use a proprietary 'Global Model' in our surveys. It showed that Indian SMBs lag behind those in South Korea, for instance, in exploiting the Net for business growth," he says. "In India, it's still considered an economical channel for communication rather than a strategic growth driver." Website and e-commerce penetration remains comparatively low among small businesses. Small businesses operate on very low margins as their customers are usually medium businesses, not large ones. As contact with end consumers increases, website and e-commerce penetration is likely to increase alongside the margins, he says.

— By Harichandan Arakali

STAFFING
Not More but Better Engineers

Conventional wisdom says the US must produce more engineers or risk losing its lead in innovation to India and China, which graduate far more engineers each year than the US does. But that's not the problem, according to Forrester Research: the US simply needs better ones. "The race to develop more engineers evokes the Cold War arms race, and it's an approach that won't work in today's global economy," says Navi Radjou, a vice president with Forrester. "The US should not be looking at China and India and saying they are the new Japan and Russia. These countries are trading partners."

Instead, to remain competitive, the US must breed a new type of engineer who is as business-savvy and multi-culturally minded as he is technically trained, says Radjou. Creating better engineers involves retraining current employees and revamping university engineering curricula to reflect interdisciplinary thinking. But even kindergarten teachers can prepare tiny innovators for engineering by encouraging collaboration and promoting multi-cultural education.

Nevertheless, argues Martin Jischke, president of Purdue University, numbers have power. Jischke, who is also an adviser to President Bush, supports interdisciplinary education but insists, "A nation that lacks a critical mass of scientists and engineers will not lead the world in the decades ahead."

— By Lauren Capotosto

Illustration by Binesh Sreedharan


NETWORKS
FIFA Network Tackles Tough Challenge
By John Blau

The World Cup was not only the planet's largest sporting event, it was also home to what many experts call the world's biggest communications network built for a single event. Over 15 terabytes of data, the equivalent of more than 100 million books, traveled across a converged voice and data communications network that linked stadiums, control centers, management offices, hotels, railroad stations and numerous other outlets. With over 3 billion fans following the games — the most viewed World Cup ever — "it wasn't a good time to make a mistake," said Peter Meyer, head of IT at FIFA.

By the end of the games, over 200,000 people, including 15,000 sports reporters, had connected to the network, built and managed by Avaya. The company installed an all-IP network which, for the first time in the history of the games, included voice as an integrated rather than a dedicated service, according to Douglas Gardner, MD of the Avaya FIFA World Cup program. As part of its VoIP (voice over Internet protocol) service, the vendor provided a centralized, server-based directory service, as well as client software that allowed authorized users to make phone calls from their notebook computers.

Toshiba equipped FIFA organizers with more than 3,000 notebook computers for the event. "There were a lot of people moving around at the games," said Toshiba spokesman Manuel Linnig. "We configured the notebooks for quick, easy access to all the LANs and wireless LANs within the FIFA network, and installed several security features, including fingerprint readers." Although WLAN technology was widely deployed, FIFA required all systems to be linked by cable as well.

TECHNOLOGY
Next World Cup's IT Test

Germany put on a good World Cup tournament — so good that some experts wonder whether they'll be able to pull off a similar feat in South Africa four years from now. For sure, Germany raised the bar for technical precision at a World Cup event. More than 3 million fans, who attended 64 games, had tickets embedded with an RFID (radio frequency identification) chip that contained identification information. Police, fire and emergency squads at every stadium used TETRA (tap-proof digital terrestrial trunked radio) phones. The handsets were also equipped with a GPS (global positioning system) transceiver to locate and direct emergency personnel.

The National Information and Cooperation Center (NICC), located inside the German Interior Ministry, was manned around the clock by security experts from around 20 government agencies, including Europol and Interpol. Each of them operated their own communications network and, in some cases, established special units to monitor activities during the games. The Federal Office of Criminal Investigation, for instance, had a unit watching out for possible terrorist attacks. In addition, each stadium was equipped with 23 HDTV cameras and connected via dual fiber optic links to a super high-speed backbone capable of transporting data at speeds of up to 480Gbps.

Web services will also play a big role in South Africa. "A big difference between the German World Cup and the previous tournaments was our extensive use of Web services," said Meyer. "And the big difference between the German games and the South African games will be our efforts to have everything based on Web services. The Internet is going to be bigger than it already is in our operations."

Illustration by Sasi Bhaskar


Imaging by Binesh Sreedharan

SECURITY
Robots Patrol Berlin Football Stadium

Robots had a heyday in Germany. While one group of robots competed in a World Cup in Bremen, another was diligently patrolling Berlin's Olympic Stadium, one of 12 World Cup venues. Germany won the RoboCup 2006 championship. The RoboCup's goal has been to develop a team of autonomous humanoid robots that can win against a human World Cup champion team by 2050.

With RoboCup over, robot enthusiasts shifted their attention to another group of robots busy protecting the historic Berlin stadium. Eleven robots patrolled the stadium every night. The robots, built and operated by Robowatch Technologies GmbH in Berlin, were on contract to provide security services at stadiums in Berlin, Frankfurt and Leipzig. One group of robots was programmed for outdoor surveillance. With the help of Global Positioning System technology, they patrolled the outer area of the Berlin stadium. The robots communicated with the control center via 3G mobile technology, using 3G cards that connected to a dedicated base station in the stadium. All data was encrypted.

The robots were equipped with video cameras, radar sensors, temperature gauges and infrared scanners. Camera heads on the robots could turn in all directions and be controlled remotely. "If a robot registered a change, like a hole in a fence, it sent an alert to the control center," said Stengl. "Radar sensors can also detect human bodies through most walls." The outdoor robots can also be equipped with technology to detect alpha, beta and gamma rays, as well as biological weapons. The robot software system is based on the open source Linux operating system.


INNOVATION
Great Play on Small Screens

Among the list of technology firsts at the World Cup was the use of a system that could format action to fit on mobile phone displays. For the first time at the global sports event, near-live video clips for mobile phones were being produced with a technology known as 'pan and scan', initially developed to adapt widescreen films to the smaller TV format, according to Brian Elliott, head of international broadcast operations at Host Broadcast Services (HBS).

It works like this: production editors select footage and use pan-and-scan technology to zoom in and produce a clip that is exciting and relevant for small screens. Adding to the experience is picture quality. Because the originating feed in the HDTV (high-definition television) digital format ensures that every part of the 16:9 formatted picture is of high quality, any selection of that picture is guaranteed to be equally clear. The edited clip, typically four minutes in length, is encoded and stored on a central file server, from which licensees, such as mobile phone operators, can transport it to their home countries either via dedicated data connections or as a secure FTP download from the Internet. Operators can then stream the clips to customers over their mobile networks.

T-Mobile International struck a deal to stream games live to customers with phones using high-speed connections. In addition to streaming, several service providers offered broadcast mobile TV services, which send signals from TV stations directly to mobile phones equipped with special antennas. A report from the market research arm of Informa Plc projects up to Rs 1,350 crore in revenue from fans watching streamed or broadcast coverage of the World Cup games on their phones.

Imaging by Anil T


BOOK REVIEW
Why Technology Fails
A skeptical view of new technology.

Most business books have one big idea that the author draws out for 200 pages. Usually, the idea can be explained in a review, and the book itself, colored with vague examples, is best skimmed. Pip Coburn's The Change Function: Why Some Technologies Take Off and Others Crash and Burn is different. Yes, he has a big idea — that you can predict which technologies will succeed or fail by applying a simple formula — and this idea is covered in the first 10 pages. (And yes, you'll understand the idea by the end of this review.) But what sets this book apart from so many others is that the rest of it is worth reading. Coburn fills his book with detailed case studies, gleaned from his years as managing director of the technology group at UBS Investment Research, that illustrate his formula. And thanks to the detail, the cases actually teach you something.

The core of Coburn's formula is that a new technology will be widely adopted only if it meets two criteria. First, it has to address a problem and, second, that problem has to be more painful than the perceived pain of adopting the new technology. "We need to balance our wonderment at technology's role in creating nirvana with a skepticism about business models," writes Coburn, now head of his own investment company, Coburn Ventures. The picture-phone, for example, which AT&T pushed from the 1960s to the 1980s, failed because the need to see the person you're talking to isn't a big enough problem to justify buying and learning how to use an expensive new phone system.

That's a lesson that should hit home for CIOs. Applying Coburn's insight, CIOs should force IT projects through a gauntlet of questions, asking not only whether a particular technology will solve a problem but also whether that problem is one that users are desperate enough to do something about.

— By Ben Worthen

The Change Function: Why Some Technologies Take Off and Others Crash and Burn, by Pip Coburn. Portfolio, 2006, Rs 1,347.50.

MANAGEMENT REPORT
User-Friendly IT Governance

A new version of Control Objectives for Information and related Technology (Cobit) is better organized than its predecessors, and provides clearer links between IT processes and business goals, says Craig Symons, an analyst at Forrester Research. The improvements make this IT governance tool something that CIOs should seriously consider using, he adds.

Issued by the IT Governance Institute (ITGI), Cobit is a set of guidelines that IT organizations can use to employ management best practices, measure IT processes, and align IT with business processes. IT departments can use this tool to measure their value to the business, as well as comply with regulations such as Sarbanes-Oxley. Yet less than half of the CIOs in the financial services industry, where Cobit is most popular, are even aware of the guidelines, according to ITGI's own assessment. The reason: since it was created in 1996, Cobit has expanded to cover so many control objectives and management guidelines that it's difficult to make sense of them. A Cobit primer issued by the Sandia National Laboratories in June 2005 lamented: "Of the possible objectives, on which do you spend the effort, and which do you ignore?"

Answering that question has become much easier, Symons says, thanks to Cobit 4.0. The authors have done away with Cobit's multiple volumes, integrating the information about all 34 high-level control processes, 239 detailed control objectives and related management guidelines into one volume. What's more, the material is organized by how one approaches projects: first, plan and organize; next, acquire and implement; then deliver and support; and finally, monitor and evaluate. In addition, Symons says, Cobit 4.0 offers more details on how to measure whether IT processes are delivering what the business needs. For example, under the heading 'defining a strategic plan' (one of the 34 high-level processes), Cobit outlines how to do that: engage executives on alignment with business goals and develop a proactive process to quantify business requirements. Cobit 4.0 is available at www.itgi.org.

Cobit 4.0 offers CIOs more details than its predecessors on how to measure whether IT processes are delivering what the business needs.


— By Allan Holmes



TELEPHONY
Five Steps to VoIP Success

Sage Research recently announced the winners of a contest recognizing organizations that have successfully rolled out voice over IP (VoIP) systems. Here is advice from the top practitioners.

1. Do research. Talk with CIOs who have rolled out VoIP about their experience before you call in vendors.

2. Set clear expectations. Explain to users what the new system will — and won't — be able to handle. "Nothing causes a problem like planning a simple install and discovering that upper management was expecting all the bells and whistles," says one IT manager.

3. Know your network. Gather all documentation for your company's network infrastructure, so that whoever designs the new system has all the specifications of the current IP network. Winners cited network stress tests and bandwidth tests as important planning tools — and noted the importance of upgrading network equipment before rolling out VoIP to guard against failures.

4. Outsource development. Or not. Companies that hired outsiders to design and implement a VoIP system usually lacked the internal expertise to do it themselves. Conversely, organizations that kept design and deployment in-house claimed that it's now easier to support and maintain their systems because their staff knows the current infrastructure better.

5. Train users before the rollout. Give employees time to become familiar with their new phone's functionality before they actually need to rely on it for everyday work.

— By Thomas Wailgum



Dr. Catherine Aczel Boivie

PEER TO PEER

Red Light, Green Light
How one CIO used project management discipline and the Traffic Light Report to align her IT department with her company's business goals.

When I joined Pacific Blue Cross in 2003 as vice president of IT, the CEO and I agreed on two foundational principles: one, technology has no value by itself, and two, technology management must switch its focus from operations to business enablement. These principles may seem self-evident, but the truth is that when there's a flurry of projects, all of them important to some aspect of the business, technology management can all too easily get swept up in putting out fires. This seemed to be what was happening at Pacific Blue Cross when I arrived. With nearly 2 million members covered, Pacific Blue Cross is the market leader in providing health-care and dental coverage to residents of British Columbia. Our subsidiary, Blue Cross Life, also offers life insurance and disability income protection. While I understood my mission — turning the IT department into an enabler of business — the journey has been far from straightforward. It's been a long road with many bends and even a few dead-ends. Even so, there's no doubt we're making progress. How did we do it?

Illustration by MM Shanith

Project Management to the Rescue

First and foremost, we began to align every project to Pacific Blue Cross's Balanced Scorecard. The Scorecard shows and measures the organization's performance from six perspectives: quality, quantity, infrastructure, clients, people and community-related goals. Every project is now justified in terms of how it supports the goals described in the Scorecard. That keeps the company's goals clearly in sight for all and shows how technology relates to and enables the business.



After assembling a list of all the projects we were working on, I introduced the project management office (PMO) function. This office oversees all projects of more than one month's effort — from the business case to a post-implementation review. We fashioned this as a corporate-wide PMO, because all projects require disciplined management and almost all projects at Pacific Blue Cross have a technology component. The business welcomed the PMO, since it gave it an overall view of all projects (in the planning, execution or close-out stages) as well as monthly updates on their status.

To ensure the success of this new process, the PMO (a manager, three project managers and two to five contract project managers, depending on the project mix) conducted three half-day workshops for personnel who would be managing or sponsoring projects. These sessions helped to obtain buy-in. But as theory is nothing without practice, the workshops were followed by individual coaching sessions for the project managers. Finally, we put all the project management-related processes online so that everyone has a shared knowledge base.

The PMO regularly reports project status to IT management, the executive committee and the board of directors through the aptly named Traffic Light Report. This report lists each project along with a short description, its schedule, the stage it is at and a status comment. Next to the project is a red, yellow or green symbol that quickly identifies whether the project is on time, on budget and on scope. The report is also posted on our intranet so that all employees can follow a particular project's progress. The Traffic Light Report has become a critical tool for demonstrating technology's value to the business.

Gate 1, Gate 2, Gate 3...

In my second year at PBC, I introduced IT governance — in the form of a gating process. Why wait until the second year? Because past experience has taught me that too much change, introduced too quickly, does more harm than good. Circuit overload may cause pushback!

Now, before a project can even get to the doorstep of the approval process, it has to be sponsored by a vice president. Only then is it ready for gate 1 — or what we call the 'thumbs up/down' gate. Here, the executive sponsor presents the idea to the executive committee, and the members give it a thumbs up or thumbs down. If the project is approved, the proposal continues to the next level (gate 2). For gate 2, the sponsor presents a detailed cost-benefit analysis, because no matter how wonderful the idea, if the cost is too high for the projected benefits, that's it. If it passes gate 2 and is worth more than $500,000, a gate 3, or detailed business case, is prepared. Gate 4 is only for projects that have to be reviewed





at the executive level because they cost more than $1 million or are very complex. Gate 5 is the post-implementation review.

Last year, for example, our senior VP of client development presented a gate 1 concept of adding dental and extended health usage information to our member portal. The executive committee gave his idea the thumbs up. The PMO then helped him to develop a gate 2 high-level business case, which was also approved. As a result, our members can now obtain information online about their coverage usage, thus reducing the number of calls to our call center. Through our 'gating governance', everyone can see how projects are prioritized and approved. We're able to plan and measure the benefits of projects and assess how they enable our business initiatives.

Even though projects of more than a month's effort are now overseen by the PMO, smaller projects also need to be kept aligned with business needs. To do that, we established the change review board. Headed by a manager from the business area, this board reviews all change requests, assigning priorities based on how the change will enable the business. Three years ago, IT was swamped with more than 700 change requests and there was not much hope we'd get to all of them. So we asked all owners of change requests to re-submit any requests that were over a year old. With the change review board prioritizing the requests, we're now able to see which are the most pressing, which ones overlap and which will be superseded by others that are more encompassing.

In three years we've come a long way, and these new initiatives would not have succeeded without the active involvement of senior management and the IT team's hard work. But we still face a number of challenges. For instance, we still need to do a better job of conducting regular postmortems of larger projects to gather lessons learned. And even when we do gather lessons learned on an ad hoc basis, we still aren't disseminating them to the appropriate personnel so they can learn from previous experiences. Some of the new technology we are implementing — portals, document management and knowledge management — should help with this.

Rome wasn't built in a day. But there's one thing we're confident of: the framework we've put in place not only ensures that IT is more focused on business, but that this focus is advancing our business goals. In short, IT is enabling all the employees of Pacific Blue Cross to serve our members better. CIO

Dr. Catherine Aczel Boivie is senior VP of IT at Pacific Blue Cross. She is also the founding chair of the CIO Association of Canada. Send feedback on this column to editor@cio.in



Susan Cramm

EXECUTIVE COACH

IT's Good News
CIOs need to help their staffers understand that if they can hold on during the tough times, the payoff is just around the corner.

Illustration by Sasi Bhaskar

One of the fundamental jobs of a leader is to paint an exciting, positive view of the future that connects to the emotional concerns of his or her staff. This task is particularly critical for CIOs now, as the stress on their departments intensifies, with the business' hunger for IT services appearing to be bottomless even as it continues to stipulate that IT control its costs. Adding to the demands on the CIO's staff are the growing technical sophistication of internal business partners, intensified competition from external service providers and the increasing trend toward the commoditization of IT processes, jobs and software.

Mark Walton, former CNN chief White House correspondent, writes in his book, Generating Buy-In: Mastering the Language of Leadership, that "stories are the language of our mind". The stories that have the greatest impact — on our thinking, our emotions and, ultimately, our actions — are stories "that project a positive future." The leader's challenge is to "connect the dots between the future you want and the future your audience wants" by:

being clear about what you want your audience to do
describing the positive future your audience should see
illustrating how that future will fulfill their needs, wants and goals
asking for commitment and taking the first steps toward bringing about the future you want.

In June's column (The Worst Job in IT), I challenged readers to begin crafting a story about how IT will exceed the expectations of the enterprise while ensuring the success and satisfaction of the IT staff. I truly believe that IT is entering a new stage of maturity where it will be easier for IT professionals to do their jobs without the fear, overload and confusion that exists today.




IT has always been a difficult profession. At first, business partners were completely dependent on IT and there was, in truth, very little IT could deliver, due to limitations in technology and IT's necessary focus on delivering foundational transaction systems. Then, as PCs and client/server computing became prevalent, IT's frustrated business partners tried to address their own needs through the use of 'end user tools' without involving IT. So IT found itself either fighting for control of systems (and people) that had become enterprise-critical or being held responsible for poorly performing user projects and systems.

Then, as the promise of the Internet and fears of Y2K generated unprecedented demand, the IT budget and organization ballooned. Not coincidentally, systems such as ERP were implemented that either were not ready for prime time or ended up overwhelming the organization's capabilities and finances. As a result, IT's reputation in the organization suffered, and it was forced to retreat and figure out how to satisfy the business' demands by finding efficiencies within core operating costs. But even during this retreat, the importance of managing IT as an enterprise asset and capability became obvious to every layer of the organization. Ultimately, this gave birth to healthy forms of interdependence, i.e., governance and roles, that mirrored practices found in other, more mature areas of the business.

Evolving Into Innovation Experts

In the future, the interdependency of IT and business will mean that IT is no longer responsible for the delivery of IT while the business sits on the sideline waiting to judge the outcome. In this future, IT will be accountable for ensuring that IT is done well. I call this enabling IT, which requires creating relationships, roles, processes and an infrastructure that help the business satisfy its day-to-day needs without involving IT. This means that IT:

facilitates appropriate decision making to protect the interests of the enterprise
defines data, business services, architectural guidelines and technology standards
develops enabling infrastructure, tools, processes and support resources
educates and coaches users and provides resources so that the business can manage projects and change, determine necessary functionality, and develop and deploy systems
provides development and operational resources and services on demand — both staff and technology — in conjunction with external suppliers.

In the future, IT professionals will become innovation experts by combining technology savvy, business knowledge, management discipline and the ability to play well with others. Those with a heavier technical orientation will focus on defining architecture, and designing and developing infrastructure and enabling platforms. Professionals with a strong business orientation will focus on collaborating with the business on strategy, governance, and project and service delivery. Individuals with stronger management discipline will specialize in overseeing technology development, service and operational delivery, resource management and risk management.

The Story You Tell

The enabling IT model will address concerns about job security, the hierarchy of technical skills and the resources squeeze. It's undeniably true that commodity IT jobs will be outsourced. But IT jobs and roles that touch on innovation will not be outsourced, because they demand the ability to comprehend the connections among technology, data and business processes, and the ability to understand the connections between people and how work gets done within the organization.

In this brave new world, IT's influence in a company will increase. Paradoxically, by giving up control over technology delivery, IT will gain authority, as it can no longer be blamed for being a bottleneck to technical innovation. With the business doing (and paying for) the daily development effort, much of the variable demand will be external to the IT team, allowing IT to plan its work and ensure adequate funding.

All this means that there will be an incredible demand for innovation experts, and it will only be satisfied if the talented professionals currently in place adopt lifelong learning as their mantra. Learning can occur on the job and, in some cases, in the classroom, but now more than ever IT professionals need to take a hard look in the mirror, assess their skill sets, and then reach out for the learning opportunities that will expand their capabilities. The projected slowdown in labor growth will play to the advantage of those professionals who possess innovation-expert-type skills. In the future, these people will be able to write their own tickets, personally, professionally and financially.

The future of IT in your organization depends on your ability to communicate this story to your staff and have this model embraced by your company. If your organization doesn't understand already that today's pain is for tomorrow's gain, get busy writing your story. It will ensure that IT's potential is finally realized in the real sense. CIO

Susan Cramm is founder and president of Valuedance, an executive coaching firm. Send feedback on this column to editor@cio.in



Sajid Ahmed, global head-infrastructure, Syntel, has designed a 10G network at the company's Pune facility.

Reader ROI:

What it takes to be an early adopter of technology
How to deal with legacy
Why business expectations need to be tempered



Cover Story | Network Infrastructure

A Cable Fable
Imaging by Binesh Sreedharan


Photo by Srivatsa Shandilya

By Gunjan Trivedi


At first glance, 10G Ethernet would seem to be driven by bandwidth requirements. But a closer look reveals the need for network infrastructure to be designed for the future — especially for global enterprises and those rooted in R&D.

"How much bandwidth is really enough?" It's the conundrum that has haunted IT managers ever since networking guru Robert Metcalfe dreamed up Ethernet — a 'fast' way of connecting hundreds of terminals across a whole building to minicomputers, while bringing in print and file-sharing capabilities. The original Ethernet LAN sent 2.94 Megabits per second (roughly this paragraph) of data over thick coaxial cable at a distance of one kilometer. That has evolved over decades, with present-day Ethernet transferring data at a blazing 10 Gigabits per second.



Sounds like magic, doesn’t it? But ask CIOs investing in upcoming technologies to future-proof their enterprise, and they'll say it’s anything but enchanting. They have to grapple with issues of obsolescence, commercially viable technology, costing, and convincing management of the need to go with the cutting-edge stuff rather than the mature solution approach.

Invigorate the Backbone

Sajid Ahmed faced a similar dilemma eight months ago when he began to design a network for IT services company Syntel's Global Development Center in Pune. The management's directive to Ahmed, the global head-infrastructure of Syntel, was straightforward: build a state-of-the-art facility that spans 40-odd acres and can house around 8,000 engineers. The campus also had to double as a customer-centric and employee-friendly facility.

Ahmed began the arduous task by taking a top-down approach. First, anticipate the streams of applications that the campus will deploy. Then, examine the business case for the networking technology slated to come in. And, when convinced, bring it in. Ahmed was crystal clear about Syntel's requirements and what it wanted to offer. It was the process of designing the switching and cabling infrastructure of the campus that turned out to be time-intensive, as he came to confront a critical question: do I choose the established technology — or risk adopting the latest?

"We are not a research-based or a media company. We are primarily a provider of custom outsourcing solutions for IT application development. For us, our data warehousing activities come closest to choking the bandwidth in our organization," he says. Still, bandwidth wasn't Ahmed's only parameter at that point. Though 100Mbps would have sufficed for the traffic coming in from the access layers, he was sure he didn't want to lay a network that was running anything less than 1Gb Ethernet at the aggregation or distribution layers. "The reason was simple: future proofing," he says. A debate ensued, and Syntel finally standardized its campus backbone, interconnecting seven buildings, on the latest fiber-modular 10Gb Ethernet (10GbE).

The 10GbE standard surfaced a couple of years ago. If adopted, it was bound to disrupt existing infrastructure and force enterprises to rip and replace. Initially, copper cabling proved to be a deterrent to 10GbE, and vendors could only offer the technology on expensive fiber. This kept the standard out of the reach of many businesses. Of late, though, the 10GbE volcano has begun to smolder. Vendors are fast coming out with increasingly affordable products to support a technology that has begun to stabilize. For an enterprise CIO, the big question surrounding 10GbE now is whether it is really stable.

"I would rather have the latest technology for my organization than sit on something old."
— Raj Katari, Head-IT, Nagarjuna Fertilizers


Photo by Suresh



Resist GbE: Gartner Analyst

Globally, enterprise IT and network professionals will toss away more than Rs 45,000 crore on Gigabit Ethernet LAN gear over the next two years that would be better spent on technologies to support increasingly distributed workforces, says Gartner vice president Mark Fabbi.

"The majority of network designers continue to be caught in traditional design practices," says Fabbi. "They continue to spend money on bigger and faster core networking technologies at important locations that don't actually serve the user population." Corporate applications — even videoconferencing and VoIP — do not require more than a few hundred kilobits per second of bandwidth, he points out. "Astute network managers will focus on the upper layers of the stack, and look to security, data control, application optimization and mobility services as key features that will benefit the organization far more than installing Gb Ethernet for all desktops."

From a cost standpoint, the Gigabit option is complex. Application usage, the form factor of the products and the medium of the wiring all contribute to the cost of the technology, according to analysts and users. Averaged across the industry, a Gigabit port cost 80 percent to 300 percent of the price of a Fast Ethernet port in 2005. Still, Gigabit modular ports outsold Fast Ethernet modular ports by 50 percent in 2005. "It's no coincidence that large businesses have adopted modular Gigabit in chassis switches," says Seamus Crehan, an analyst with the Dell'Oro Group. "Generally, large networks tend to have chassis all the way out to the wiring closets, and enterprises future-proof more and have a greater need for bandwidth."

— Phil Hochmuth and Jim Duffy

the like only to get the most out of a 10-Gig pipe simply isn’t good enough. CIOs want to know if the solution works. Do they get 10GbE performance simply by powering up? After that, they need to worry about easy integration into an existing architecture and day-to-day management tools — and they certainly have budget concerns. But most of all, they want to know what’s in it for them besides just a really fat pipe. What’s 10GbE really good for that straight Gigabit Ethernet doesn’t already give them? For Syntel, it was the scaling-up options 10GbE provided, apart from superior network performance. As of today, its campus has seven buildings, three of which are up and working. Each building has eight wings that cater to 120 concurrent users and each wing, in turn, has two 10Gbps uplinks to the core layer. With Ethernet running on full duplex, each uplink port has 20Gbps bandwidth. In effect, the 120 users in each wing have an access to a whopping wire-speed of 40Gbps.

What Lies Beneath Considering the distance between buildings in the campus, optic fiber was the natural choice. While designing the core backbone, Ahmed did think of copper cabling. “But we 30

J U LY 1 5 , 2 0 0 6 | REAL CIO WORLD

Cover story FINAL.indd 30

realized that there was a huge distance restriction (copper cannot go more than 55 meters at one go), and our backbone is slated to run for several kilometers within the facility. So, our vote for optic fiber connectivity was clear,” he recalls. Further, Syntel opted to deploy single mode fiber (SMF) cables instead of the cheaper multi mode fiber (MMF). With Ethernet networking evolving way too fast, when it comes to local area networking, it isn’t just cabling but even the inter-networking electronics that need to comply with upcoming bandwidth standards. “The last thing we want is to realize in the near future that we need to scale up to a faster bandwidth standard. Relaying the cables would be suicidal.” The stakes were high, and it perhaps explains why Ahmed devoted more time on switching and cabling than on any other aspect of the network infrastructure. Being an IT services company, Syntel is not like a Tier-1 bandwidth aggregation point of any business and doesn’t need to deploy an end-to-end 10Gb Ethernet, especially on the desktops. “Though 10GbE cards are now available, a single card is as expensive as a workstation. So, having 10GbE cards on the workstations did not make a viable business case for us. We were still confident that 100Mbps at the desktops will continue to rule the roost for at least two years,” says Ahmed. For the core backbone, Ahmed was keen to have a single, consistent network design. Interestingly though, he opted for a combination of SMF and Cat6 UTP (Unshielded Twisted Pair) in the 10,000-sq ft datacenter. He anticipates Syntel getting into R&D-based projects in the foreseeable future. When that happens, the organization would need to scale up to 10Gbps over copper across the datacenter floor. “This is why we have stuck to Cat6 despite having a cost differentiation of 18 to 20 percent between Cat5 and Cat6 cable. Within the floor, everything is Cat6. But between the core switches or between the core and distribution layer of networking, it is all 10GbE,” he explains. Ahmed’s philosophy of core design is inspired by a home truth in manufacturing circles: too much inventory is always too bad. From an inventory standpoint, he wants to keep the campus design systematic and uncomplicated, and not have his team face the blues managing multiple platforms. “Since we are not a core R&D company, we did not want to invest in technologies that are still vague. There are inherent design and implementation challenges with 10GbE over copper standard and newer UTP cabling technologies that theoretically enable enterprises to run 10Gbps over 300 meters,” he says. Even while contemplating between deploying Cat6 and Cat6A, Ahmed decided to stick to the established standard to avoid both the technological challenges and cost differentiation of at least 30 percent. “Cable engineering is fundamental to design, and we have to be vigilant to the design issues. Even the Cat6 cabling becomes wacky when deployed at angles more than 45 degrees,” he points out.


Through the past eight months, Ahmed has been keen to limit overall utilization of the network to 45 percent to enable a choke-free backbone. In this context, a ‘bandwidth-hungry’ component of the management’s brief to him has posed a challenge: an IP-ready facility to enable concurrent streams of data and voice over the network. What Ahmed's team has done is identify all the applications to be run at the campus and gauge the bandwidth requirements when the systems are fully loaded. “Since we are not sure where certain apps or technologies are going to be housed at the campus, we have opted to go in for open and interoperable networking standards to accommodate future technologies,” he says.

Globally, network professionals agree that for the small price of upgrading to Gigabit in specific cases, the purchase is worth it even if the bandwidth isn’t being used. For instance, at the First American National Bank of Texas, a regional bank with 30 locations in North Texas, almost half the switch ports deployed are 10/100/1000 Megabits per second (Mbps), and almost 60 percent of the Dell desktops have Gigabit Ethernet network interface cards built in. “I wouldn’t consider it overengineering the network,” says Kurt Paige, network administrator for the bank. “I consider it staying on top of the technology. If we’re going to buy a piece of equipment and we can get a 10/100/1000 Mbps port for only a little more, we’ll go with the newer switch, even if the speed may not be used.”

At Syntel, Ahmed has kept the design quite modular to accommodate changes without disturbing the core. “Even if Cat6 evolves tomorrow to effectively run 10Gbps, we can easily upgrade the cables at the datacenter. The backbone on single mode fiber does not even have to be touched,” he explains.
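The planning exercise described above, summing the fully loaded demand of every planned application and comparing it against a utilization ceiling, can be sketched roughly as follows. The application names and bandwidth figures in this sketch are hypothetical, not Syntel's; only the 45 percent ceiling comes from the story.

```python
# Illustrative only: the application names and bandwidth figures below are
# hypothetical. The check mirrors the approach described above: sum the fully
# loaded demand of every planned application and compare it against a
# 45 percent utilization ceiling on the backbone.

BACKBONE_CAPACITY_GBPS = 20.0      # e.g. one full-duplex 10GbE uplink
UTILIZATION_CEILING = 0.45         # keep the backbone choke-free

peak_demand_gbps = {               # hypothetical fully loaded figures
    "voice (IP telephony)": 0.4,
    "video conferencing": 1.2,
    "file and build servers": 3.5,
    "backups (daytime window)": 2.0,
}

total = sum(peak_demand_gbps.values())
budget = BACKBONE_CAPACITY_GBPS * UTILIZATION_CEILING

print(f"peak demand {total:.1f} Gbps vs budget {budget:.1f} Gbps "
      f"-> {'OK' if total <= budget else 'over budget'}")
```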

Testing The Waters

Nagarjuna Fertilizers, a leading manufacturer and supplier of plant nutrients in India, took the 10GbE plunge at around the same time as Syntel did — but for a different reason. Unlike Syntel, whose primary objective was to apply 10GbE to future-proof its facility, Nagarjuna Fertilizers innately has a culture of furthering its R&D capabilities. With 10GbE, it sought to get a technological head start. Belonging to a three-decade-old industrial group, Nagarjuna Fertilizers has evolved phenomenally in several areas except IT infrastructure. The 10GbE standard would provide the organization a platform to leap to future technologies, believed Raj Katari, its head-IT.

Running on a seven-year-old networking infrastructure that offered 10/100 Mbps connectivity, Nagarjuna Fertilizers is now getting into overdrive. The organization has decided to progressively revamp its infrastructure, and either replace the network wherever possible in a phased manner or bring up a new structure from scratch. “Being in the fertilizer industry, the organization does have quite a domestic and bureaucratic way of doing business. Nevertheless, as far as IT and MIS is concerned, the company has now got fairly aggressive,” says Katari. This is evident in the company’s IT budget for 2005-06, which increased by a dizzying 500 percent over the previous year to boost the revamp of its networking infrastructure. “Even this year, our budget saw an almost 80 percent increase, enabling us to scale up the infrastructure seamlessly as per our revamp roadmap,” he says.

Most of its operation locations are situated at the company’s Hyderabad corporate office and the manufacturing plant in Kakinada, Andhra Pradesh. With several buildings spread out across a vast distance, copper cabling would not have been enough, whereas optic fiber easily met the distance requirements. With fiber coming in, Nagarjuna Fertilizers decided to roll out 10GbE and scale up its backbone to run data traffic at 10Gbps.

But the fertilizer manufacturer has a problem. With many buildings still harboring 10/100Mbps Ethernet, which can’t be replaced now, the organization has to support both Ethernet standards. The switches that interconnect the buildings are dual-port switches supporting both fiber optics and 10/100 Mbps Ethernet. Says Katari: “So, between the two buildings, the switches interconnect through fiber on 10Gbps, and, within the building, the Ethernet is on 10/100 Mbps Cat5e. Some buildings are also interconnecting using 1GbE. As per the latest network design, for any new location we add or any existing network we restructure, we are directly going in for 10GbE both as inter- and intra-building backbone.” In Hyderabad, Nagarjuna Fertilizers connects six buildings with an extended LAN running up to 2 km of fiber. At its Kakinada plant, it runs up to 7 km of fiber.

There are a couple of locations where the organization has decided to extend 10Gbps all the way from the backbone to the desktops. At these locations, Nagarjuna Fertilizers has taken a step many would fear to take, and interconnected the workstations with 10GbE over copper. The question arises: does its data traffic require 10GbE technology — is there a real business case for it? Says Katari, “We went by the logic that the latest technology would be available more economically, and easier to maintain, than one that is not so latest. Our experience has been that there is no more than 10 to 15 percent of price difference between the two.”

Another reason why Nagarjuna Fertilizers was so keen on 10GbE was that the two locations, which are completely on 10GbE, are R&D centers. Though workstations on the operations side of the organization connect to the workstations at the R&D center on Fast Ethernet (100Mbps), in effect, the workstations within the R&D center talk to each other at 10Gbps. “We are known to be early adopters of technology and, in some cases, we are almost like test-beds for vendors. Our R&D is one of the unique sites for its implementation in India. We are running a little less than a kilometer of 10GbE over copper at the center, with a backup of wireless connectivity,” says Katari.

The R&D content emanating from the centers comprises a lot of streaming information and data. The centers generate loads of content on a regular basis, which is further collated with more information from different sites across the world. Typically, this would constitute a lot of content-rich graphics and supporting data. “We have a business case where we would be generating 1.5 to 2 terabytes of data every year. While a part of the relevant applications are already implemented and in use, much more is yet in the planning stage. We are required to put in the infrastructure to support these future requirements as well. The cost difference of the new infrastructure here had no more than 15 percent of variance from its predecessor. This proved to be a business case supporting our deployment of 10GbE,” explains Katari.

In terms of operations, the organization would not really have needed to transfer so much data. But Katari didn’t want to get stuck with a standard that would soon phase itself out and prove to be a deterrent to the enterprise advancing to newer standards. “Today, my computing power requirement may not be more than what is offered by a 386 or 486 processor machine. But are these standards available today? No. The market is continuously pushing you out, and the ability to maintain is also getting pushed up further. So, I would rather have the latest than sit on something old,” says Katari.

However, opinions on the extent of future-proofing are divided, especially in the US where Gigabit Ethernet deployments and their utilization are relatively common. At the North Bronx Health Network in New York, LAN ports range from 10Mbps to 10Gbps. Extreme backbone switches link with 10G Ethernet, while some users have Gigabit links to view digital radiology images. But the majority of users still connect with 10/100Mbps, says Adorian Ignat, director of IT. “We have some 10/100/1000 Mbps ports to desktops but not very many right now,” Ignat says. “If I don’t need that bandwidth there’s no sense in putting it in right now.”

Currently, Nagarjuna Fertilizers has about 150 users who interact with its SAP ERP. Over the next six months, Katari plans to increase its usage to about 600 users. With a skeletal IT team, it is important for the enterprise to have end-to-end control over everything possible. While studying the business case, Nagarjuna Fertilizers’ management looks at the cost component — and at the timing of the technology being recommended. Do you really need the technology there? Can you do it sooner or later? Is it really going to help achieve what the organization wants to achieve? These are the standard questions. “If the budget is not there, I might get into the rigor to prove the difference between 10/100 and 10Gbps. But at this time, I would get into the timing and reasoning aspects — the business decisions more than the technical ones. The technical decisions are left to the divisional head,” says Katari. “However, the management will come back to discuss the business differentiations or returns that were expected after a stipulated time period of the implementation. As of now, the business returns are yet to come.”

As far as the technical requirements are concerned, the Hyderabad-headquartered fertilizer manufacturer conducts testing. “We have got a very good response, compared to what we used to get in the past. Experiences are very much technical in nature and not anywhere connected to business. I would not start tracking the returns of the implementation unless I start seeing business results. Technically, we are happy. Business expectations are yet to be realized,” he says. “It is a technology that is bound to come out. It might take a little longer to talk in right volumes and commercial driving mechanisms.” With the content growing phenomenally and applications increasingly demanding bandwidth, companies won’t have any choice but to go for 10GbE, Katari believes. CIO

With inputs from Phil Hochmuth and Jim Duffy. Senior Correspondent Gunjan Trivedi can be reached at gunjan_t@cio.in

How Fiber Stole Copper's Groove

Companies in India believe optic fiber cable (OFC) is vastly superior to copper as a cabling link to run 10 Gigabit Ethernet. “OFC has better flexibility and capability to be used over larger distances, and so it is the preferred technology,” says Chandra Kopparapu, VP (Asia Pacific) at Foundry Networks. Dileep Kumar, product manager-enterprise, ADC Krone, points out that single mode fiber and OM3 multimode fiber have been deployed for 10G backbone connections in India for the last three to four years. “Single mode is preferred for 10G campus backbone (building to building) connectivity and OM3 multimode is preferred for vertical backbone (floor to floor) connectivity,” he explains.

And fiber is the way forward, believes Varghese M. Thomas, spokesperson of Cisco Systems. “10G Ethernet is used to connect distribution switches to core in campus environment and in the core of metro Ethernet rollouts. Hence, most of the deployment is on OFC,” he says. One reason why copper isn’t preferred as much as fiber might be the rising price of the metal. According to the London Metal Exchange, copper prices have tripled over the past four years, and risen more than 59 percent between January and May 2006. Internationally, this has caused several companies to contemplate OFC with renewed vigor. Kumar, however, feels that the price increase has a minimum impact on overall IT spending.

Kopparapu has another point to make: copper has a relatively limited range. So, is this the end of the road for copper in tandem with 10G Ethernet? No, he says. Once new technology comes in that makes it viable between 50 and 100 meters, it could become more attractive. Thomas adds: “The deployments of 10G Ethernet over copper are for connecting high-end servers to the network to avoid the bandwidth choke when multiple clients try to access some server and the clients are on Gigabit connections.”

— Balaji Narasimhan

A Clinical Test for 10G

When Vimta Labs announced its plans in May 2005 to deploy 10G Ethernet at its life sciences facility, it might have raised a few eyebrows. Few setups, leave alone a pharma-oriented company, would have hazarded a deployment that was so new in India. One year on, the Rs 55-crore contract research and testing services provider has gone the whole hog. The first phase of its 10G Ethernet-over-copper installation is nearly through — at a time when most Ethernet buyers are still opting for fiber. The solution fitted in with the company’s “broad policy of protecting itself from obsolescence”, says chairman and MD S.P. Vasireddi. “It suited our need for scaleable architecture. Right now, it delivers 1G. But, with a 10-fold increase capability, it is an option that won’t get obsolete very soon,” he believes.

But how apt is this standard for a company like Vimta? Its storage requirements are immense, owing to applications such as Laboratory Information Management Systems. This entails transfer, management and storage of high-resolution graphics and visual data. At each stage of pre-clinical work, for instance, the findings generated would include large slides and 3D data. Vimta Labs seeks to store all this for its research and to regularly track the results. In doing so, it has also consciously begun to comply with regulatory bodies like the US FDA.

Post the 10G installation, Vimta doesn’t need to worry about bandwidth issues. Consider this: it now has 62 terabytes of data storage that is still scaleable, not to mention the high-speed connectivity between its labs. “Bandwidth is critical because we have to constantly transfer large files. 10G Ethernet ensures smoother movement of data,” says Vasireddi. Another area where 10G is proving more than useful at Vimta Labs is in supporting its Learning Management System. These systems demand architecture that can store high-resolution audio-visual footage of training programmes, while also enabling convergent technologies for interactive training sessions and navigation through the content.

— Kunal N. Talgeri




Handing IT to the Users

By Gunjan Trivedi

At ICICI Bank, it’s users — not the IT team — that own, underwrite and monitor IT implementations. Each user group is responsible for the success or failure of the applications they need. It’s an approach that K.V. Kamath, MD & CEO of ICICI Bank, champions and is founded on his own experience with ICICI's Project Finance Division in the early 1980s. That very stint led him to believe that ‘incomplete communication and inappropriate ownership’ are instrumental in the failures of IT projects.

CIO: In your first stint with ICICI’s Project Finance Division, you implemented its digitization. How did the project influence your outlook on technology as an investment?

View from the top is a series of interviews with CEOs and other C-level executives about the role of IT in their companies and what they expect from their CIOs.


K.V. Kamath: That was way back in 1981 when all we had was an 8-bit machine with 20MB of storage running on an 8086 processor. Despite this, we ran the entire leasing department of ICICI. When I say we ran it, I mean documentation, mail, spreadsheet, database, billing and accounting processes. It taught us that we could effectively run an almost paperless office and that technology can be applied in an easy-to-understand and cost-effective way across the organization.

The division’s computerization also changed the relationship between machines and individuals. Technology was once supposed to be the master and users were slaves. Technology could only be accessed from the CTO’s office and by his team of programmers. With the division-wide computerization, this relationship suddenly changed, and machines and users were on an equal footing. People had developed the ability to interact with machines directly and make them do whatever they wanted.


K.V. Kamath, MD & CEO, ICICI Bank, believes that the CIO must harmonize the organization's user groups in the context of their technological needs.


K.V. Kamath expects IT to:
Continue serving as a key market differentiator
Ensure ICICI Bank's compliance with regulatory requirements
Take banking to rural India




At ICICI Bank, why do business units or user departments own their technological implementations?

To go back to 1981 and the period after, I observed that, by and large, implementations failed because there was an interface in which a third person took in what the user wanted, translated it to somebody else, and then an implementation was attempted. There were two primary problems with this approach: incomplete communication and inappropriate ownership. These are the biggest reasons why IT projects fail. Given this, we felt that user groups should own and implement their own IT.

Does this approach help cut implementation costs and lead to better technology management?

I think those are some of the benefits. The true benefit, however, is in articulating your needs, having a better understanding of technological challenges, finding a way to resolve those challenges, and executing a technological solution. This leads to critical benefits. First, there is a dramatic reduction in the number of failed projects. Second, it enables a reduction in implementation time. Eventually, all this naturally saves costs. Since user departments are conscious of their overheads, this approach leads to further savings.

What then is the role of the CIO at ICICI Bank?

We have to remember that we also need to run technology in such a way that there isn’t technological anarchy in the organization. Various user groups can end up setting up disparate systems that don’t communicate with one another, hardware platforms that are different from each other, and protocols that don’t interact. There is a strong need to ensure this doesn’t occur. A CIO brings various user groups together and keeps them in harmony in the context of their technological needs.

The CIO is like a small clearing house, which ensures that all systems talk to each other, that all software is compliant to a viable extent, and that various divisions understand the costs involved — hardware and software — in specific projects. This can be done with a team of less than 15 people and you don’t need a large office to carry out this function. Here, the people implementing projects are embedded in user groups.

You introduced the Silicon Valley-styled ‘90-day rule’. Is this still the designated time frame for IT projects at ICICI Bank?

This rule not only continues to be the time frame for IT projects, but for any project within ICICI. We’re always asking our business divisions why projects can’t be done in 90 days or less. For the most part, we have found that when we have articulated this rule and made it clear that we’re going to stick to it, we have managed to. A big advantage of the 90-day rule is that it prevents projects from slipping, getting out of hand, or being abandoned for lack of inputs. This rule ensures we have tight control over the implementation of projects.

However, there are exceptions. Certain projects, such as a greenfield project, might take more time to roll out. Also, in projects in which the software is available but people need to figure out compatibility issues, implementation can take more than 90 days. Having said that, we have also seen instances where vendors have quoted a year for implementation, and we decided to run the project ourselves with the vendor acting as an advisor. We found we were able to implement the project well under 90 days. I believe that sometimes there is a lot of leeway built into project implementation schedules. But if you have a confident team, you can roll it out in under 90 days.

Has this approach helped increase ROI?

Honestly speaking, we have never taken the ROI approach. For us, technology is a base case. We can say that we are almost a technology company running a financial services business. When that is the case, which ROI are we talking about? Rather, I stress on figuring out how I can ensure that the cost of a project is in the best interests of the business. I also try to assess by what multiples or factors a project is in the best interests of the business.

How does ICICI Bank deal with compliance? Does Basel II, for instance, put a strain on your IT implementations?

Not at all. In fact, whatever we have had to do in terms of Basel II, we could have almost done in-house. There is no stress at all in terms of what we had to do, as an organization, within the scope of technology or outside. Indeed, technology has gone a long way in ensuring compliance. Increasingly, you have technological solutions that validate whether what you are doing complies with regulatory requirements and internal policies. With technology, the level of compliance comfort increases because real-time validation is introduced, compared to manual or end-of-day interventions. In areas such as trading, in particular, technology gives you real-time monitoring of limits without causing a sense of strain on processes and functions.

SNAPSHOT: ICICI Bank

Offerings: Banking Products and Financial Services
Total assets*: Rs 251,389 crore
Net profit*: Rs 2,540 crore
Customers: 1.5 crore
Employees: 27,000
Branches & counters: 620
ATMs: 2,225
CIO: Pravir Vohra
*As on March 31, 2006

So far, technology has been a key differentiator for ICICI Bank. With other banks adopting a similar approach, how will ICICI retain its lead?

Ultimately, technology is a mindset issue. I can’t say other banks will not do well with technology. However, we have to ensure the strength of our ethos, which is that users own technology. We must maintain the capacity to keep an open mind where technology is concerned, show an ability to take risks, and have an entrepreneurial mindset with technology and leadership. Frankly, when I look around, I find very few people who have these.

For example, let’s say you want to run established CRM software. You find that costs are becoming prohibitive and that your vendors are no longer able to dialogue on costs. You are happy with the system, but you face a cost dilemma. What do you do? Do you encourage a user group which says it is happy with the product but holds you, as a CEO, responsible if things go wrong? This user group does not meet the preconditions I mentioned. But, if you have a user group that agrees to own the challenge, that agrees to take a leap and says that they are ready to take a risk since the benefits are huge, bingo! This is the sort of effort which is required to be successful with a technological edge. Now, if there is somebody else who can also take these kinds of decisions, then he becomes a competent competitor. Point is: can they do it time after time after time?

How will IT help expand ICICI Bank's services to rural India, given the country’s poor technology infrastructure?

I believe that when we roll out into rural India in the next 12 to 24 months, technology — in terms of telephony — would have penetrated deep into rural India. And telephony will be the primary connect between urban technology and rural India. Without technology, we don’t have a chance to execute our rural strategies. In fact, we expect that our rural efforts will probably cause a technological disruption in the way we use IT. We might witness an interesting situation, in which we will have to retrofit what we will be doing in rural India into urban India. I think we will have plenty of cutting-edge technology that will go into rural India before it touches more urban areas. We’d rather not touch already installed bases in urban India and do the cutting-edge implementations — basically aimed at bringing down costs — in rural India. I expect chip-enabled smart cards, biometric verification, etcetera, to be introduced into rural India significantly ahead of urban India.

Rural India will work on a technology-led, branchless model. In this mode, authentication becomes critical and we will need smart technologies to authenticate. Biometrics is one. We will have to leapfrog with technology if we want to make our rural strategies work.

As more IT-driven financial products and services are being commoditized, how will ICICI ensure that it keeps introducing market differentiators?

I don’t think commoditization happens to that extent. There is still a lot of customization required. And that is where the problem lies. What you see is that, however commoditized a product might appear to be, when you implement, you have to be prepared to face some pleasant or unpleasant surprises. We have become immune to this. Whatever shocks exist, we have to be prepared to bear them. There is, however, a learning curve and it is one of the biggest barriers to anybody competing with us. And we’re continuously going up that curve. We have the humility to say that we learn from any business, not just the banking business, and that we keep our ears and eyes open. To be successful, I maintain, we have to have the mindset that we are continuously open to ideas and innovation, and are on the lookout for the next improvement that we can leverage for even greater advantage. CIO

Senior Correspondent Gunjan Trivedi can be reached at gunjan_t@cio.in



Electricity-hungry equipment, combined with rising energy prices, is devouring data center budgets. Here’s what you can do to get costs under control.



Energy

Powering Down

By Susannah Patton

Reader ROI:
Why energy consumption has become a headache for CIOs
Ideas for reducing data center electricity use

A typical 10,000-square-foot data center consumes enough juice to turn on more than 8,000 60-watt lightbulbs. That amount of electricity is six to 10 times the power needed to operate an office building at peak demand, according to scientists at Lawrence Berkeley National Laboratory. Given that most data centers run 24/7, companies that own them could end up paying millions of dollars this year just to keep their computers turned on.

And it’s getting more expensive. The price of oil may fluctuate, but the cost of energy to run the data center probably will continue to increase, energy experts say. This is because global demand for energy is on the rise, fueled in part by the proliferation of more powerful computers. According to Sun Microsystems engineers, a rack of servers installed in data centers just two years ago might have consumed 2 kilowatts and emitted 40 watts of heat per square foot. Newer, ‘high-density’ racks, which cram more servers into the same amount of space, are expected to consume as much as 25 kilowatts and give off as much as 500 watts of heat per square foot by the end of the decade.

The dire predictions keep coming. Most recently, a Google engineer warned in a research paper that if the performance per watt of today’s computers doesn’t improve, the electrical costs of running them could ultimately exceed their initial price tag. “As the demand for computing grows, the cost of power is a larger and larger concern,” says Dewitt Latimer, CTO at University of Notre Dame.
Latimer is grappling with finding the space and adequate power to handle a growing demand for cheaper and ever-more powerful high-performance computer clusters at Notre Dame. The problem comes not just from the computers themselves; Latimer is worried that the air-conditioning needed to keep the machines cool will also eat away at his budget.

Like Latimer, every CIO who is responsible for a data center — even those who outsource data center management to a hosting company — faces this conundrum: how to keep up with ever-increasing performance requirements while taming runaway power consumption. The problem is most pressing for companies on either coast and in large cities in between, where space is at a premium and companies compensate by putting more servers into their existing buildings. And there is no simple solution. Business demand for more applications results in companies adding more servers. According to market research company IDC (a sister company to CIO’s publisher), server sales are growing by 10 to 15 percent annually.

Nevertheless, CIOs with huge energy bills are developing strategies to contain power costs by deploying more energy-conscious equipment and by using servers more efficiently. “There’s no question that the issue of power and cooling is a growing concern,” says John Humphreys, an IDC analyst. “The assumptions used for building data centers have been blown away.”
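To put the opening figures in rough perspective, a minimal back-of-the-envelope sketch follows. The IT load comes from the lightbulb comparison above; the electricity tariff is an assumption for illustration, and the overhead factor borrows the Uptime Institute estimate cited later in this story rather than any figure reported by these sources.

```python
# Rough annual electricity cost for the 10,000-sq-ft data center described above.
# Assumptions (illustrative only): a flat tariff of $0.10 per kWh and a facility
# overhead of 1.4 kW lost per kW of IT load, echoing the Uptime Institute figure
# cited later in this story. Neither number comes from the article's sources.

it_load_kw = 8_000 * 60 / 1000      # ~8,000 60-watt bulbs' worth of IT load = 480 kW
overhead_factor = 1 + 1.4           # cooling, power conversion and other losses
hours_per_year = 24 * 365           # data centers run 24/7
tariff_per_kwh = 0.10               # assumed tariff, in dollars

annual_kwh = it_load_kw * overhead_factor * hours_per_year
annual_cost = annual_kwh * tariff_per_kwh
print(f"{annual_kwh:,.0f} kWh/year -> about ${annual_cost:,.0f} a year")
# roughly 10 million kWh and about $1 million a year for one such facility
```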

The Problem: IT Hogs Energy

IT’s energy woes have a lot to do with market factors that affect everyone; at the start of the year, the price of a barrel of oil was more than double what it was three years earlier. Anyone who thinks the current energy crunch is going away need only look at global energy markets. The oil shocks in the ‘70s and ‘80s stemmed from large, sudden cuts in supply. This time, it’s different. While it’s true that some of today’s high prices stem from supply shocks tied to the US invasion of Iraq and hurricanes on the Gulf Coast, the world’s thirst for oil over the past 25 years has grown faster than the energy industry has been producing it. And rapid economic expansion in China and India has led to greater energy demand, putting further pressure on the world’s energy markets.

Servers in corporate data centers may use less energy than manufacturing facilities for heavy industries, but within a company, IT is an energy guzzler. “We’re pretty hoggish when it comes to power consumption in the data center,” says Neal Tisdale, VP of software development at NewEnergy Associates, a wholly owned subsidiary of Siemens. NewEnergy’s Atlanta data center performs simulations of the North American electric grid to help power companies with contingency planning. “We turn on the servers, and we just leave them on.”

The Solar-Powered Server Farm
How one company got its data center off the grid.

Phil Nail and his wife, Sherry, have learned that green technology and data centers can go together. The couple started their Web-hosting company, Affordable Internet Services Online (AISO), nine years ago and switched to solar power in 2001. The company, located in Romoland, California, provides Internet service to customers that include a film production company and Veggiedate.org, a dating service for vegetarians. The data center’s 200 servers are powered by 120 photovoltaic panels that generate electricity on platforms mounted beside the data center. According to Nail, the panels supply power to run the entire data center, including the offices and air conditioners. In case of a power failure, AISO can get power from its emergency generator (which runs on natural gas) or, as a last resort, the utility grid. The hosting company also uses servers with energy-efficient Advanced Micro Devices Opteron processors from Open Source Storage.

“We built our company to be environmentally friendly because we thought it was the right thing to do,” says Nail. Nail acknowledges that a solar-powered data center isn’t for everyone because startup costs can be expensive; in 2001, it cost him $100,000 to install 120 solar panels for his 2,000-square-foot data center. He says his investment has paid off in low energy costs, and his eco-friendly marketing message has helped to attract some customers. But he acknowledges that the cost of switching to solar power would be steep for a large data center with thousands of servers.

Now Nail is taking green power to another level. Specifically, the data center’s roof, where he intends to put five inches of dirt and cover it with drought-tolerant plants. “That’s supposed to reduce the amount of cooling needed by 60 percent,” he says.

— S.P.


The exact amount of electricity used by data centers in the United States is hard to pin down, says Jon Koomey, staff scientist at Lawrence Berkeley National Laboratory. Koomey is working with experts from Sun and IDC to come up with such an estimate. Nevertheless, most experts agree that electricity consumption by data centers is going up. According to Afcom, an association for data center professionals, data center power requirements are increasing an average of eight percent per year. The power requirements of the top 10 percent of data centers are growing at more than 20 percent per year.

At the same time, business demands for IT are increasing, forcing companies to expand their data centers. According to IDC, at least 12 million additional square feet of data center space will come online by 2009. By comparison, the Mall of America in Minnesota, the world’s largest shopping mall, covers 2.5 million square feet.

Solution 1: More Efficient Computers

Just as automakers built SUVs when oil prices were low, computer manufacturers answered market demand for ever-faster and less expensive computers. Energy usage was considered less important than performance. In a race to create the fastest processors, chip makers continually shrank the size of the transistors that make up the processors. The faster chips consumed more electricity, and at the same time allowed manufacturers to produce smaller servers that companies stacked in racks by the hundreds. In other words, companies could cram more computing power into smaller spaces.

Now that CIOs are beginning to care about energy costs, hardware makers are changing course. Silicon Valley equipment makers are now racing to capture the market for energy-efficient machines. Most chip makers are ramping up production of so-called dual-core processors, which are faster than traditional chips and yet use less energy. Among these new chips is Advanced Micro Devices’ Opteron processor, which runs on 95 watts of power compared with 150 watts for Intel’s Xeon chips. In March, Intel unveiled a design for more energy-efficient chips. Dubbed Woodcrest, these dual-core chips, which Intel says will be available this fall, would require 35 percent less power while offering an 80 percent performance improvement over previous Intel chips. And last November, Sun Microsystems introduced its UltraSparc T1 chip, known as Niagara, which uses eight processors but requires only 70 watts to operate. Sun also markets its Galaxy line of servers as energy-saving equipment.

“The manufacturers are getting better now,” says Paul Froutan, VP of product engineering for Rackspace, which manages servers for clients in its five data centers. With more than 18,000 servers to watch over, Froutan has been worrying about energy costs for years. He’s seen the company’s power consumption more than double in the past 36 months, and in the same period has seen his total monthly energy bill rise five times to nearly Rs 1.35 crore.

Latimer, who oversees Notre Dame’s Center for Research Computing, first appreciated the power consumption problem when the university decided to hire a hosting company to house its high-performance computers off-campus. On-campus electrical costs associated with data centers have generally been rolled together with other facilities costs, and so the Rs 1.35 lakh monthly utility bill from the hosting company — for running a 512-node cluster of Xeon servers — came as a shock. Notre Dame’s provost recently called Latimer and other leaders together to talk about how to handle the increasing demands that a growing research program was beginning to place on the campus utility systems and infrastructure. Faculty members are requiring more space, greater electrical capacity and dedicated cooling for high-powered computers and other equipment such as MRI (magnetic resonance imaging) machines. Latimer’s recent conversations with Intel, AMD, Dell and Sun about his plans to buy new computer clusters “have been very focused on power consumption,” he adds.
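The Woodcrest figures quoted above imply a sizable gain in performance per watt. The arithmetic is sketched below; the percentages are vendor claims rather than measured results, so treat the output as indicative only.

```python
# Performance per watt implied by the Woodcrest claims quoted above:
# 35 percent less power and 80 percent more performance than the previous
# generation. These are vendor claims, so the result is indicative only.

power_ratio = 1.0 - 0.35          # new power draw relative to the old chip
performance_ratio = 1.0 + 0.80    # new performance relative to the old chip

perf_per_watt_gain = performance_ratio / power_ratio
print(f"~{perf_per_watt_gain:.1f}x performance per watt")   # roughly 2.8x
```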

Solution 2: The Latest in Cooling

In September 2005, officials at Lawrence Livermore National Laboratory switched on one of the world’s most powerful supercomputers. The system, designed to simulate nuclear reactions and dubbed ASC Purple, drew so much power (close to 4.8 megawatts) that the local utility, Pacific Gas & Electric, called to see what was going on. “They asked us to let them know when we turn it off,” says Mark Seager, assistant deputy head for advanced technology at Lawrence Livermore.

What’s more, ASC Purple generates a lot of heat. And so, Seager and his colleagues are working on ways to cool it down more efficiently than turning up the air-conditioning. The lab is trying out new cooling units for ASC Purple and the lab’s second supercomputer, BlueGene/L (which was designed with lower-powered IBM chips, but is hot). Lawrence Livermore recently invested in a spray cooling system, an experimental method in which a coolant sprayed onto the hardware vaporizes, carrying heat away before being condensed elsewhere. Seager says this new method, which holds the promise of eliminating air-conditioning units, would allow the lab to save up to 70 percent on its cooling costs.

It’s not only supercomputers that create supersized cooling headaches. Tisdale, with NewEnergy Associates, says maintaining adequate and efficient cooling is one of the hardest problems to solve in the data center. That’s because as servers use more power, they produce more heat, forcing data center managers to use more power to cool down the data center. “You get hit with a double whammy on the cooling front,” says Rackspace’s Froutan.

To address the cooling dilemmas of more typical data centers, hardware makers such as Hewlett-Packard, IBM, Silicon Graphics and Egenera have offered or are coming out with liquid cooling options. Liquid cooling, which involves cooling air using chilled water, is an old method that is making a comeback because it’s more efficient than air-conditioning. HP’s modular cooling system attaches to the side of a rack of HP computers and “provides a sealed chamber of cooled air” separate from the rest of the data center, says Paul Perez, vice president of storage, networking and infrastructure for HP’s Industry Standard Server group.

More efficient servers help too. Last spring, Tisdale discovered that his data centers had reached their air-conditioning limit. While he had always imagined that a lack of physical space would be his biggest constraint, he discovered that if he ever lost power, his main problem would be keeping the air-conditioning going. Tisdale had replaced all 22 of his company’s Intel servers in its Houston data center with two dual-core Sun Fire X4200 servers. The new servers are more energy-efficient, according to Tisdale. And so when he proposed installing the servers in Atlanta, he justified the purchase by arguing that he could avoid having to buy a bigger air conditioner that would have used even more power. Tisdale said that according to company projections, the move will save electricity and reduce heat output by 70 to 84 percent.


There are better ways to use traditional air-conditioning. Neil Rasmussen, CTO and co-founder of American Power Conversion (APC), a vendor of cooling and power management systems for data centers, says CIOs should consider redesigning their air-conditioning systems, particularly as they deploy newer, high-density equipment. “Instead of cooling 100 sq ft, it makes sense to look for the hot spots,” concurs Vernon Turner, group vice president and general manager of enterprise computing at IDC.

Traditional cooling units “sit off in the corner and try to blow air in the direction of the servers,” Rasmussen says. “That’s vastly inefficient and a huge waste of power.” Rasmussen argues that the most efficient way to cool servers is with a modular approach that brings cooling units closer to each heat source. Meanwhile, he adds, CIOs who manage data centers in colder climates should use air conditioners that have ‘economizer’ modes, which can reduce power consumption in the dead of winter. Newer air conditioners have compressors, fans and pumps that can slow down or speed up depending on the outside temperature.

Solution 3: A More Efficient Data Center

Just as aging cars are not as fuel-efficient as newer models, the majority of data centers in the US are using a lot more energy than they should. A survey of 19 data centers by the consultancy Uptime Institute found that 1.4 kilowatts are wasted for every kilowatt of power consumed in computing activities, more than double the expected energy loss. However, like many people who aren’t going to junk their older cars right away, many companies aren’t ready to tear out their data centers to build new ones with a more efficient layout. “We haven’t reached the point yet where it makes financial sense to rebuild most data centers from scratch,” says Rackspace’s Froutan.

And so for most companies, the journey toward an energy-efficient data center will be a gradual one. For NewEnergy Associates’ Tisdale, that means retiring aging servers in one data center seven at a time and replacing them with more energy-efficient equipment. But redesigning your data center also means making the most of what you have through server consolidation and, more specifically, the use of virtualization software. Virtualization allows several operating systems to reside on the same server. Froutan says that virtualization will help his data centers make do with fewer servers by allowing them to perform more tasks on one machine. In addition, he says, energy can be saved by deferring lower-priority tasks and performing them at night, when the cost of power can be three times less expensive.

IDC’s Turner agrees that CIOs need to improve server utilization in order to cut both power and cooling costs. Instead of building one server farm for Web hosting and another for application development, for example, they should use virtualization to share servers for different types of workloads.
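The Uptime Institute figure quoted above translates into a simple overhead multiplier, and the consolidation argument can be expressed the same way. The sketch below is illustrative only; the server counts, per-server wattage and consolidation ratio are assumptions, not figures reported by Froutan or Turner.

```python
# Illustrative arithmetic for the two points above. The Uptime Institute figure
# (1.4 kW wasted per kW of computing) gives an overhead multiplier; the
# consolidation numbers below are assumed purely for illustration.

overhead_multiplier = 1 + 1.4                 # facility kW per kW of IT load

servers_before = 100                          # assumed fleet size
watts_per_server = 400                        # assumed average draw per server
consolidation_ratio = 5                       # assumed: 5 old servers per virtualized host
watts_per_host = 600                          # assumed draw of a beefier virtualization host

before_kw = servers_before * watts_per_server / 1000 * overhead_multiplier
after_kw = (servers_before / consolidation_ratio) * watts_per_host / 1000 * overhead_multiplier

print(f"facility load before: {before_kw:.0f} kW, after consolidation: {after_kw:.0f} kW")
# Every watt saved at the server is worth roughly 2.4 watts at the meter.
```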


Search For Efficiency Begins At Home

Google, typically tight-lipped about the technology behind its data centers, builds its own servers to save costs and because standard products don’t exactly meet its needs. Hardware makers invest heavily in researching and developing reliable products, a feature that most businesses value. But Google doesn’t actually need very reliable servers because it has written its software to compensate for hardware outages, said Urs Holzle, senior vice president (operations), Google. Instead of buying commercial servers at a price that increases with reliability, Google builds less reliable servers at a cheaper cost, knowing that its software will work around any outages. “For us, that’s the right solution,” Holzle said.

Another reason that Google builds its own servers is equally simple: it can save costs on power consumption. Energy efficiency is a subject Holzle speaks passionately about. About half of the energy that goes into a data center gets lost due to technology inefficiencies that are often easy to fix, he said. The power supply to servers is one place that energy is unnecessarily lost. One-third of the electricity running through a typical power supply leaks out as heat, he said. That’s a waste of energy and also creates additional costs in the cooling necessary because of the heat added to a building. Rather than waste the electricity and incur the additional costs for cooling, Google has power supplies specially made that are 90 percent efficient. “It’s not hard to do. That’s why to me it’s personally offensive that standard power supplies aren’t as efficient,” he said. While he admits that ordering specially made power supplies is more expensive than buying standard products, Google still saves money ultimately by conserving energy and cooling, he pointed out.

Google has data centers scattered around the globe but is usually reluctant to divulge details of the hardware and software running in the centers. Holzle spoke to journalists during his visit to Dublin for the final day of the European Code Jam, a contest for programmers sponsored by Google in an effort to identify talented potential workers.

— Nancy Gohring

Finally, advises APC’s Rasmussen, if you are building a new data center, it’s better to design it to accommodate the equipment that you need right now, rather than building facilities designed for what you might eventually need as you grow, as many companies have done. By using a more modular architecture for servers and storage — so capacity can be added when needed — a company can avoid such waste and still be prepared for growth.

How to Start Saving

As CIOs search for more energy-efficient data center equipment and design, they need to educate themselves about which solutions will work best for them. As part of the information-gathering process, CIOs should establish metrics for power consumption in their data centers and measure how much electricity they consume. There aren’t many generally accepted metrics for keeping tabs on power consumption. But according to Turner, such metrics could include wattage used per square foot, calculated by multiplying the number of servers by the wattage each uses and dividing by the data center’s total square footage.
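Turner's metric is straightforward to compute; a minimal sketch follows. The formula is exactly as he describes it above, while the three input values are hypothetical and chosen only to make the arithmetic concrete.

```python
# Watts per square foot, computed exactly as Turner describes above:
# (number of servers x wattage per server) / total square footage.
# The three input values here are hypothetical, for illustration only.

servers = 500                     # hypothetical server count
watts_per_server = 350            # hypothetical average draw, in watts
floor_area_sqft = 10_000          # hypothetical raised-floor area

watts_per_sqft = servers * watts_per_server / floor_area_sqft
print(f"{watts_per_sqft:.1f} W per square foot")   # 17.5 W/sq ft in this example
```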


Sun has come up with a method called SWaP, which stands for Space, Wattage and Performance. The company says this method, which lets users calculate the energy consumption and performance of their servers, can be used to measure data center efficiency. John Fowler, executive VP of the network systems group at Sun, says sophisticated customers are installing power meters at their data centers to get more precise measurements.

It also pays to be an energy-aware buyer. As Fowler says, “Don’t just take the vendor’s word” on the matter. He suggests having a method of testing the server and its energy use before buying. The industry is still working on methods to compare servers from vendors in a live environment.

Ultimately, vendors’ ‘eco-friendly' messages may resonate only slightly. NewEnergy’s Tisdale, for example, still cares most about maintaining server performance. But he is impressed that new equipment will help him add more computing capability while maintaining current power usage levels. “Like a lot of people,” he says, “I’m not interested in turning off the servers.” CIO

Susannah Patton is a senior writer with CIO. Send feedback about this feature to editor@cio.in


By Rahul Neel Mani



Railways

The Centre for Railway Information Systems continues to innovate and enhance an application that has controlled one of the world’s largest rail networks.

Reader ROI:

How handhelds can boost customer service
How the Indian Railways is dealing with capacity under-utilization
What business intelligence can do for the Railways


The Indian Railways’ Passenger Reservation System (PRS) has long been the poster child of Indian e-governance projects. But unlike many winning projects, this one isn’t being left alone — for the better. “The PRS is the lifeline of the railways. As the custodians of this application, we crave to make it better,” says Shashi Bhushan Roy, group general manager, PRS, CRIS (Centre for Railway Information Systems).

Custodian, it would appear, sounds a tad too earnest. But the PRS is indeed a legacy of the Indian Railways. Right from when it was conceptualized in the 1970s, it’s been a shiny blue success that’s made the Railways proud. And with good reason. It’s lived up to the expectations of millions of travelers. It is a giant, complex application that juggles nine train types, 102 coach models and 40 different quotas. It can intimidate the best network managers with its 10 million reservations, cancellations and train enquiries every day. It caters to 11 lakh people on a daily basis and moves over 3,000 trains, covering 1,250 locations with over 4,600 terminals.

These statistics have grown steadily from 1985, when the PRS was piloted. And so have passenger expectations. Initially, the PRS was created purely to automate the process of reserving tickets. “Today passengers want reservations quickly and they don’t want to go to a reservation counter. This puts pressure on us to refurbish the PRS application with more functions,” says S. S. Mathur, general manager, IT infrastructure, CRIS. And if CRIS can pick its way through a number of implementation issues, the PRS will be a ubiquitous way to provide information for passengers on the move. At a strategic level, the new enhancements will buff a shine on system efficiency, and herald in demand forecasting and the ability to streamline railway traffic. It will also give train ticket examiners (TTEs) the power to do their jobs more dynamically, introduce flexibility to customer interfaces and “keep malpractices at bay,” says Mathur.

SNAPSHOT: PRS
Transactions per day: Rs 1 crore
Passengers handled per day: 9.95 to 10.85 lakh
Trains controlled: >3,080
Locations covered: 1,232
Terminals: 4,169
IT Budget: Rs 272 crore (2005-06 CAPEX)

Getting Onboard the New PRS

It wasn’t so long ago that the Railways worked with a system in which a central authority allocated ‘ticket quotas’ to each station along a train’s route. But as traffic increased manifold, and with it the number of long-haul trains, the quota solution was driving the Railways into the ground.

Today’s PRS is the third avatar of its original form, which worked as a host-based system. Individual applications ran on four host-based computers and connected from there to terminals at various locations in different zones. For example, Delhi’s PRS host system connected to terminals in Mumbai, Chennai and Kolkata. It was succeeded by a networked version. But even this version, developed in FORTRAN, was basic. Travelers requesting a ticket would get one if there were available seats, or were placed on a waiting list. “The PRS application has come a long way from when only five zonal railway data centers (Delhi, Mumbai, Kolkata, Chennai and Secunderabad) were networked. Today, it’s all-pervasive and provides the ability to make reservations from anywhere to anywhere. But it’s time to make it an application that can provide us enhanced facilities,” says Roy, who’s overseen the application for many years.

To meet rising passenger expectations, CRIS’s first strategy was to increase the number of offline reservation counters. Today, 1,300 counters have sprung up where there were 200 in 1994-95. Roy and his team are determined to push the system. In just over three years, the PRS has gone from being an application that only permitted ticketing at reservation counters to allowing passengers to buy tickets over the Internet. And now, reservation information and ticket status are even available over mobile phones through SMS. “This [e-ticketing] was a major boost to the facilities provided by the Indian Railways to its passengers. We literally took the ticket to the customer,” says Mathur. Impressive as these improvements are, they are only extensions of the basic application; the front-end was tweaked and extended to the passengers. “Now’s the time for fundamental changes that will introduce transparency and ease of data access,” states Roy.

Coupling Information

Until today, trains pulling out of a station were disconnected from the PRS. There was no way a railway official could tally the number of passengers canceling their journeys at the last minute — which meant that their seats went empty all the way to the last stop. “It’s a big drawback. We aren’t able to utilize that capacity and lost additional revenue,” says Roy.

Among the enhancements planned for the PRS is a move to empower TTEs with handheld terminals connected to the back-end of the PRS. The handhelds allow TTEs to ‘give back’ vacant berths to the system. By updating the PRS in real time, TTEs can signal the next station of the number of vacant seats aboard — after the train has departed. “The effort has been possible with GPRS connectivity provided by BSNL. It is a dedicated link provided to us. We’re piloting the project right now but will replicate it on other routes soon,” says Roy.

One of the problems they faced during early trials was that CRIS was forced to restart the whole process of data gathering every time a link went down. Broken links, however, are a reality on India’s 108,706 km of track, and CRIS installed a connection manager — WebSphere — to pick up signals through patches of service providers. When the link is disrupted, WebSphere waits till it’s restored and then synchronizes data with the back-end by keeping tabs on when the link went down.
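The store-and-forward behavior described above can be sketched roughly as follows. This is an illustrative outline only, not CRIS's or WebSphere's actual implementation, and the class, function and field names are invented for the example.

```python
# Rough sketch of the reconnect-and-resync behavior described above: buffer
# updates while the GPRS link is down, remember when it dropped, and replay
# everything captured during the outage once the link returns. Illustrative
# only; not CRIS's or WebSphere's actual implementation.

import time
from collections import deque

class LinkSync:
    def __init__(self, send_to_backend):
        self.send_to_backend = send_to_backend   # callable that pushes one update upstream
        self.pending = deque()                   # updates buffered while offline
        self.link_down_since = None

    def record_update(self, update, link_is_up):
        update = dict(update, captured_at=time.time())
        if link_is_up:
            self.send_to_backend(update)
        else:
            if self.link_down_since is None:
                self.link_down_since = update["captured_at"]   # keep tabs on when the link dropped
            self.pending.append(update)

    def on_link_restored(self):
        # Replay everything captured since the outage began, oldest first.
        while self.pending:
            self.send_to_backend(self.pending.popleft())
        self.link_down_since = None
```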

Roy says that they plan to use the handheld terminals as an extension of the PRS. This will allow TTEs to issue tickets on the fly to people traveling without a reservation. "More important is the fact that the transaction is done with the system's knowledge and it's all happening in real time," adds Roy. In short, the system will introduce transparency. The handheld will ensure that every seat is accounted for.

It also facilitates better customer service. Since the terminals are connected to the back-end, they can be used as a source of information for traveling passengers. CRIS plans to provide information on connecting trains, and PNR status for travelers who are en route but hold unconfirmed reservations on their connecting trains. CRIS officials say that in the future, these terminals could also be used to book retiring rooms. Extending the PRS to a moving train will both add revenue and boost customer service. Soon, CRIS will enable the handhelds to report air-conditioning failures or mechanical faults — giving approaching stations time to prepare. "With access to information on moving trains, train travel will undergo a significant improvement," says Roy.

Initially, the PRS used with the handhelds crashed, and CRIS continues to tackle problems with the device. It still takes more time to read data off the handheld than off a manual chart, so CRIS will need to make it faster and train TTEs. They've also tempered their plans to load the handhelds with passenger information to let TTEs scroll for passenger names. "The pilot results are giving us a great amount of learning and we are constantly correcting ourselves," says Roy.

The new enhancements will also flatten cumbersome refund procedures. Today, getting a refund requires passengers to locate a counter three hours before a train leaves. Once it departs, getting a refund is very hard, which gives the Railways a reputation for poor service. To counter this, CRIS has introduced Computerized Coaching Refunds, which provides data on passengers who have canceled their travel plans. "Clubbed with handhelds, immediate changes to the PRS make it easier to get a refund," says Mathur. Since handhelds make data available in real time, they allow the Railways to process refund requests more efficiently. They also allow a group of passengers to be upgraded. "Unlike air travel, when a passenger is upgraded on a train, we try not to split a group traveling on one ticket. The system makes it easier to move groups," says Roy.

"The PRS application has come a long way. But now's the time for fundamental changes that will introduce transparency and ease of data access."
— Shashi Bhushan Roy, group general manager, PRS, CRIS

Riding on Business Intelligence

Internally, enhancements to the PRS are making increased business intelligence (BI) possible. According to Roy, the old PRS ran on a flat-file system, which worked perfectly as a transaction processing system but failed to generate good MIS reports. "Data generated by the system isn't of any use unless it ends with business intelligence analysis and is used for commercial benefits," explains Roy.

The improvements will give the Railways more granular information on passengers' travel behavior and patterns. This will, in turn, enable more dynamic re-assignment of capacity and allow the Railways to strategically add or cancel coaches. "Tatkal and other quotas can be re-allocated based on this information," says Mathur. "In the next stage of BI, we will collect data on customer needs. We'd like to know which passenger groups travel on which lines, why they travel, when they travel, and their class preference," says Roy. Some of the data already coming out of the system shows that places like Hyderabad and Chennai are becoming popular medical destinations. "Today, we know the quantum of demand, but with the help of BI, we'll know the reason behind demand swings," says Roy.
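The article does not name the BI tools CRIS uses. Purely to illustrate the kind of cut Roy describes, demand grouped by route and class, which is what a quota re-allocation decision would look at, here is a toy aggregation; the records, field names and routes are invented.

```python
from collections import Counter

# Hypothetical, heavily simplified reservation records pulled from the PRS back-end.
bookings = [
    {"origin": "Secunderabad", "dest": "Chennai", "cls": "SL", "month": "2006-05"},
    {"origin": "New Delhi",    "dest": "Mumbai",  "cls": "3A", "month": "2006-05"},
    {"origin": "Secunderabad", "dest": "Chennai", "cls": "SL", "month": "2006-06"},
    # ... millions more in practice
]

# Demand by route and class: the sort of view that could drive Tatkal and
# other quota re-allocations, or decisions to add and cancel coaches.
demand = Counter((b["origin"], b["dest"], b["cls"]) for b in bookings)
for (origin, dest, cls), trips in demand.most_common(5):
    print(f"{origin} -> {dest} [{cls}]: {trips} bookings")
```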

All this, he adds, will lead to better demand forecasts, streamline railway traffic and increase efficiency. The spadework of the last few years is paying off. During the last fiscal, railway revenues rose by 20 percent compared to the previous fiscal — despite direct competition from low-cost airlines. "The enhancements on the PRS have contributed a lot already," says Roy. The effort of moving from a static computer interface to a dynamic one has resulted in improved customer satisfaction and better utilization of capacity — all without increasing cost. It will also help create special ticket prices for both busy and off seasons. All aboard the new PRS. CIO

Bureau Head North Rahul Neel Mani can be reached at rahul_m@cio.in

The Indian Railways IT Journey

1970s: The Passenger Reservation System is conceived and introduced in Delhi, and then rolled out in three other metros. Each application runs independently.
1993: CRIS creates the Country-Wide Network for Computerized Enhanced Reservation and Ticketing. This enables 'anywhere to anywhere' reservation from any counter.
2001: The PRS migrates to a new platform: VMS OS and Reliable Transaction Router middleware. Written in C, with charting routines in FORTRAN. Host-based systems and VT220 terminals.
2002: Indian Railways introduces Internet-based reservation and the Reservation Status Enquiry System. Today, it covers over 180 cities and draws in Rs 1 crore every day.
2005-06: E-ticketing introduced on all express trains and the PRS implemented on a nation-wide basis with counters at over 1,300 cities.


Interview | Bharat Lal Meena

The Electric Reforms

Using IT-powered solutions, Bharat Lal Meena, MD, Karnataka Power Transmission Corporation and chairman of the state's electric supply companies, has managed to track and control transmission and distribution losses — and brought the consumer into the loop too.

KPTCL seeks 100 percent computerization to give consumers the best access to information, says Bharat Lal Meena, MD of KPTCL.

By Balaji Narasimhan


Last year, Karnataka Power Transmission Corporation (KPTCL) kicked off the centenary celebrations of electric lighting in Bangalore. Lighting in the city may be a hundred years old, but KPTCL itself was incorporated only as late as 1999 as part of the power reforms in Karnataka. It was formed by carving out the transmission function of the Karnataka Electricity Board, while four electric supply companies (ESCOMs) were set up in 2002 to distribute power across the state. Through its corporatization phase, technology has become a part of KPTCL's plans. Though its IT spends weren't too flattering to begin with, KPTCL has been investing in IT — in the installation of micro-controllers to track power supply for stipulated periods and the computerization of its cash counters, for instance.

As with most power companies, transmission and distribution (T&D) losses have been an issue for KPTCL, touching almost 30 percent in rural areas. Bharat Lal Meena, managing director of KPTCL and chairman of ESCOMs, believes that IT can help to cut these losses and eventually enable KPTCL to provide 24-hour power supply — even to the rural hinterland.

CIO: KPTCL’s IT spend has risen from a few lakhs until 2003 to Rs 15 crore in recent times. What triggered this surge in IT investment?

Bharat Lal Meena: We noticed that the usage of IT was low and that we were not making optimum use of computerization, despite being in India's IT capital. We felt the need to improve this, streamline our processes, and use IT properly in order to improve customer service and enhance MIS. That is why we decided to increase our IT spend.

Do you plan to increase investment in this area?

Yes, we will be spending on managing our information systems, as well as on the installation of SCADA (supervisory control and data acquisition) systems. We want last-mile SCADA too, so that we can monitor all power consumption meters online. We have already done this for our HT (high tension) systems, which have remote automatic meter reading. We also have RRAMR (Real-time Remote Automated Meter Reading), which we are going to extend from HT consumers to other consumers.

SNAPSHOT: KPTCL
Turnover (2005-06): Rs 1,859 crore
Customers served: 134 lakh
Area covered: 1.92 lakh sq km
Sub-stations: 830
Distribution transformers: 150,000
Transmission lines: 33 kV and above: 32,407 km; 11 kV: 130,000 km
Low tension lines: 357,000 km

What difference can IT make to an organization like KPTCL?

First of all, IT can remove the manual interface, making things better for the consumer. Hundred-percent computerization gives consumers better access. For instance, we have put the meter information of consumers on the website, and BESCOM consumers' bills for a period of 12 months are already on the Web. So, a consumer can easily track when he has paid and how much.

T&D losses are a major problem for most power companies. How can IT help tackle such issues?

IT has helped us a lot in this sphere. We have introduced a transmission monitoring system, whereby the power output can be metered at the level of the transformer that supplies power. In all urban areas, we have put meters in these transformers. These meters can be read remotely using Real-time Remote Automated Meter Reading. This will help evaluate losses at the transformer level. Different consumers are connected to these transformers. We get readings on their meters on the same day and feed that into the system. We can thereby compare the readings of individual users against the transformer to which they are connected. This would have been impossible using a manual system. Since we have data on the location of individual transformers, we can derive a lot of valuable information.

Similarly, Geographic Information Systems (GIS) can also play an important role. We are already using GIS in Bangalore for linking location to consumer data. For example, if somebody calls and complains, the system can tell us who called, from which area he called, and to which transformer he is linked. Such information is captured and used.
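Meena does not detail KPTCL's software, but the comparison he outlines is simple arithmetic: the energy metered at the transformer against the sum of the readings of the consumers connected to it. A hedged sketch, with invented readings and field names:

```python
# Illustrative only: estimating distribution loss per transformer by comparing
# the energy the transformer meter says was supplied with the energy the
# connected consumer meters say was consumed. All numbers are made up.

transformer_supplied_kwh = {"TX-101": 12_400.0}   # from remote automated meter reading
consumer_readings_kwh = {
    "TX-101": [3_050.0, 2_980.0, 2_890.0, 2_400.0],   # consumers mapped to TX-101
}

for tx, supplied in transformer_supplied_kwh.items():
    billed = sum(consumer_readings_kwh.get(tx, []))
    loss_pct = 100.0 * (supplied - billed) / supplied
    print(f"{tx}: supplied {supplied:.0f} kWh, billed {billed:.0f} kWh, loss {loss_pct:.1f}%")
```

A gap noticeably larger than the expected technical loss points at that one transformer's consumers rather than the network as a whole, which is what makes the per-transformer comparison more useful than an aggregate figure.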

How does KPTCL plan to minimize power losses in rural Karnataka, which are at around 30 percent as opposed to around 9 percent in Bangalore?

We will be putting meters at the transformer level. We already have meters placed at the premises of different consumers in rural Karnataka. We will allow minimal losses that are on account of genuine reasons and then measure consumption, so we know what exactly is happening. The core issue here is: since we have IT, we can work along these lines. We are also monitoring all the installation work online. Thanks to computerization, we are in a position to know where installations are taking place. The status of the work in progress can be measured and analyzed online. Earlier, this was not possible. So, IT is also helping us a lot in the area of project management.

Can you tell us more about the SCADA initiative?

SCADA means that you have a control system to manage your equipment remotely. By using SCADA, you can minimize outages and isolate the fault areas quickly. Without this technology, you will not know if a fault occurs until somebody calls you to complain. Thanks to SCADA, I can find out the status of all my equipment sitting in my office. We have introduced this in Bangalore.

Do you plan to extend SCADA to other parts of Karnataka?

In the first phase, we have implemented SCADA in Bangalore's urban and rural districts. In Phase II, we will target other places. In the third stage, we may cover smaller sub-stations. Ultimately, we want to cover all places.

Does the Rs 100-crore outlay for SCADA cover the whole state?

No, it covers only Bangalore urban and rural districts. It is hard to guess how much SCADA implementation across the state would cost without inviting bids, but this could be anywhere in the range of Rs 200 crore to Rs 300 crore.

What is the status of your effort to attain 100 percent computer literacy among the KPTCL staff?

We have attained IT literacy of around 60 to 70 percent in most areas. We have targeted employees in the A, B and C categories. As per our standards, even if somebody has some basic knowledge, we don't treat him as computer literate until he operates some system. If an employee has undergone training but his skills are still not up to the mark, we send him for training again. We rely on the certificate from the training company to ascertain how well the employee has learnt IT.

You also have other innovations to your credit, such as IVRS. What has the response been?

We have had a huge response, particularly to IVRS. Thanks to IVRS, the number of complaints has come down. And complaints don't get missed. Before IVRS, complaints had to be written down manually using pen and paper. Since the voice is recorded directly into the system, we can monitor the status of the complaint with the aid of a docket number, which is also generated automatically by the computer. This way, IVRS has helped us to respond to complaints in a timely manner. When things were manual, a lot of touts were operating and creating problems. The moment such elements were removed from the system, the efficiency automatically went up.

"Remote management of transformers can go a long way in reducing distribution and transmission losses, and this would not have been possible without IT."

Karnataka Chief Minister H.D. Kumaraswamy recently said he had earmarked Rs 5,700 crore for KPTCL and the ESCOMs to set up new stations. Can you tell us more about IT implementations in these stations?

The biggest advantage to the new stations is that we can integrate IT right from day one. For instance, the new specifications of stations have SCADA built right into the architecture, so we don't have to start a station the old way and then worry about integrating IT automation later.

What are the other plans you have for KPTCL in the IT context?

We have a lot of plans. I have personally listed out best practices and key parameters in my chairman’s agenda. The items include distribution transformer audit to reduce losses, upgradation and ‘reconductoring’ of old conductors in rural areas, introduction of rural load management system, etcetera. If these things are implemented, we can give 24-hour power supply in rural areas. Many of the IT initiatives that we successfully implemented in Bangalore will be ported to other parts of Karnataka.


Have KPTCL’s IT initiatives drawn anything from similar systems in India or abroad?

No. Everything we have done in Bangalore has come from experience. There was an IT task force initiated by the ministry of power, which gave us some useful recommendations. What we did was not something new — actually, many of the things we did have already been around — but we did the implementation effectively.

What is the status of IT initiatives in other urban centers like Mysore, Mangalore and Hubli?

These centers have separate distribution companies or ESCOMs. We plan to follow the same practices there as the ones initiated in Bangalore. This is where the chairman’s agenda helps. The roadmap is based on the implementation in Bangalore, which is used as a benchmark to drive efficiency in other ESCOMs.

You spearheaded IT initiatives at KPTCL at a time when IT was viewed with animosity. What is your advice for somebody on a similar journey?

It is easy to list out initiatives, but the big challenge lies in implementation. What we did — and what others should also do — was to closely monitor the situation. When you apply technology, you must be clear about your objectives. Once you know these objectives, you have to understand the difficulties, and this will help you to understand how you want to use the technology. Everything depends on proper usage. You can list out objectives, but until you put the technology to good use, there is no point. When we first started, we monitored the progress of IT on a weekly basis. Once things stabilized, we started monitoring projects on a fortnightly basis. That was how the message went — since we were serious, we were able to achieve a lot. We have also started outsourcing some activities to private companies now. This has been possible because of computerization, which allowed us to closely monitor the progress and identify activities that could be outsourced. CIO

Special Correspondent Balaji Narasimhan can be reached at balaji_n@cio.in



Essential Technology

From Inception to Implementation — I.T. That Matters

Done well, consolidation and virtualization can cut computing costs while improving performance.


The Shrinking Servers

By Christopher Lindquist

DATA CENTER INTELLIGENCE

You may be installing virtualization tools so that one server can do the job of five. You could be using configuration management tools to swap applications from one machine to another, depending on load. You may simply be looking to retire old hardware in order to run apps on new, more energy-efficient multicore systems. But no matter what your strategy, your goals or your tactics, you still have a problem: how the heck do you even know what's out there to consolidate?

In large, dispersed environments, identifying consolidation opportunities can be a time-consuming job, requiring the combined efforts of engineers and systems architects working with everything from asset management tools to network discovery applications, to performance monitoring utilities, to homegrown spreadsheets, to big, old-fashioned whiteboards in order to determine what pieces of your hardware and software infrastructure might be better off someplace else. But a new segment of products — called data center intelligence or consolidation management tools — promises to help automate consolidation, freeing up some of your most valuable employees while providing the hard numbers to justify a consolidation project. While these tools come primarily from smaller vendors, the big guys are gearing up to include these functions in their own business systems management suites. Here's the hot news from the consolidation front.

Kill the Beige Ones First

A simple, practical scheme for hardware consolidation.

Neal Tisdale, VP of software development at NewEnergy Associates, an energy market services provider owned by Siemens, didn't require a lot of analysis to determine which of his machines needed consolidation. He found many of his best candidates by their color: beige. "We looked at our oldest, beige, putty-colored, 1990s highest-wattage, lowest-performance servers and started virtualizing those," says Tisdale. And while he's aware that a wide variety of tools exist for measuring nearly every aspect of data center performance, he says that just by getting rid of the old boxes (23 servers consolidated to a pair of Sun 4100s running VMware virtualization software) he avoided an expensive upgrade to the cooling and power systems in his data center. That in turn helped him to postpone buying extra tools. About the only consolidation tool Tisdale will recommend is a physical-to-virtual conversion tool called PowerConvert from startup PlateSpin, which helped him completely clone some older boxes right down to the MAC addresses on their network cards, thereby saving him from having to recreate from scratch ancient hardware configurations in a virtual space. PowerConvert is "a good time- and risk-saver," says Tisdale.

— C.L.

A Consolidation Tale

Bell Mobility, one of the largest mobile phone service providers in Canada, had a problem. More accurately, it had one problem (or crash) after another. Recovering some critical systems could take hours, disrupting services and costing the company serious money — Rs 9.45 crore per day for one system alone. So in 2005, Bell Mobility launched a study to find a way to speed disaster recovery. One of the recommendations that emerged was to consolidate applications and servers in order to centralize recovery efforts, with the hoped-for side effect of improving server utilization.

At the beginning of the study, Bell brought in a consultancy to evaluate its operations and offer suggestions for where consolidation made sense. Michel Tremblay, manager for OSS network engineering at Bell Mobility, thinks that's a good way to go. "If you support those systems, it's hard to say, I'm going to cut off my system by so much percent," he says. It's more likely, Tremblay says, that an internal IT staff with relationships to the systems might be tempted to say, 'It's working, so why should I do this?'

Disaster recovery benefits aside, estimates showed that if the company could consolidate its server pool by 25 percent (its goal for 2006), it would save Rs 6.45 crore in hardware replacement expenses over two years — even without including ongoing support costs for systems that would no longer exist. Numbers like those put Bell on a path to find a tool that could help identify which systems were ripe for consolidation. Capacity analysis and asset management tools could do part of the job, but Bell found a product from a small vendor that seemed to address the heart of the issue.

Tool Talk

Bell originally had used startup Cirba's Data Center Intelligence tool purely for auditing purposes. But Cirba believed its tool could also do capacity planning and analysis.



"They took it away and came back with a quick tool that could do just that," says Bell Senior Systems Analyst Lou Fachin. Cirba claimed that its new tools could provide data center intelligence: detailed reporting of asset utilization in the data center combined with cross-referenced information about what systems could be consolidated based on factors such as a server's operating system version, its utilization percentage, its available memory, or seemingly trivial but often critical details such as the time zone setting of the system clock. The tool generates reports that help users identify consolidation opportunities without resorting to extended whiteboard sessions or trial and error. "What I really like is the way they can set you up for consolidation," says Andi Mann, senior analyst at IT consultancy Enterprise Management Associates. "The Cirba stuff gives you some easy-to-use graphics and metrics on utilization and compatibility. It provides sort of a one-stop shop for this specific functionality."
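Cirba's product is proprietary, and the article lists only the kinds of factors it cross-references. As a rough, hypothetical sketch of that style of rule-based matching (not Cirba's actual logic; the thresholds, field names and servers are made up):

```python
# Toy compatibility check for consolidation candidates, loosely based on the
# factors listed above: OS version, utilization, available memory, time zone.

def can_consolidate(source, target):
    return (
        source["os"] == target["os"]                          # same OS version
        and source["cpu_util"] + target["cpu_util"] < 0.70    # combined load leaves headroom
        and source["mem_gb"] <= target["free_mem_gb"]         # target can absorb the workload
        and source["timezone"] == target["timezone"]          # avoids clock and scheduling surprises
    )

web01 = {"os": "Win2003 SP1", "cpu_util": 0.12, "mem_gb": 2, "free_mem_gb": 1, "timezone": "IST"}
db04  = {"os": "Win2003 SP1", "cpu_util": 0.31, "mem_gb": 8, "free_mem_gb": 6, "timezone": "IST"}

print(can_consolidate(web01, db04))   # True: web01 looks like a candidate to fold into db04
```

A real tool would score many more attributes (patch levels, network zones, licensing) and rank candidate pairings rather than return a bare yes or no.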

Consolidation-specific tools aren't a necessity, however, particularly for smaller companies, says Michael Minichino, director of infrastructure at marketing services provider Parago. Heading into 2006, Minichino had new IT initiatives slated that would require either more space or better use of the existing racks. Minichino was intrigued by the power-per-rack-space claims of new Sun hardware — the T2000 series of servers — and decided to try the latter route. But he didn't bother looking for a tool to help him figure his savings; he went straight to spreadsheets. "We've purchased an asset management tool to track workstations," says Minichino. "[But] I haven't really seen anything that would give me more of a return than spreadsheets." His calculations led him to cut 10 servers from his co-location facility. For anyone looking to jump into the consolidation toolset on the cheap, Sun offers a free, downloadable Sim Datacenter Java application that can calculate the power, heat and space requirements of your current data center versus one with different hardware.

Larger companies may see consolidation tools as a way of saving time and effort. As part of an application consolidation and tracking effort, David O'Neill, executive director of IT at Boise State University, used a service discovery tool from startup software vendor nLayers. Previously, identifying assets and connections between systems for audit or troubleshooting purposes meant making demands on systems engineers. "You put your engineering staff at the whiteboard, give them a couple cans of soda, and they spend all day drawing pictures," says O'Neill. With the nLayers tool, O'Neill is able to "let the machine do the inventory", and he can dedicate his engineers to more important tasks. nLayers also claims that its products can map the connections between systems and identify underutilized servers.

If all these tools sound to you like features that should be part of larger-scale asset management, configuration management or business service management tools, BMC Software, IBM and other large vendors want to meet you. BMC says it already has a suite of tools capable of initial device discovery, performance monitoring and analysis, configuration management and ongoing optimization. What BMC's suite lacks, according to Dave Wagner, solutions management director for capacity management and provisioning at BMC, is an easy interface to tie all those operations together. But, he says, customers can expect to see bigger vendors expand and improve their product lines, while the smaller vendors will consolidate or cooperate in order to provide the more wide-ranging management solution large corporations will need.

Who's Being Served?

No matter how insightful they may become about the technical configuration of your infrastructure, these tools can never map your consolidation efforts to the political and contractual landscape of your corporation. A word to the wise: get in touch with the server owners well before you intend to absorb their beloved boxes and applications into your data center. This will help smooth your path as well as help you identify relatively early on in the process if there are good reasons (compliance, security or otherwise) for keeping some seemingly underutilized hardware right where it is. It's also worth noting that internal politics might be the least of your hurdles. "Quite often, [the difficulties lie in] vendor relations," says Bell Mobility's Tremblay.

Essentisl Tec.indd 56

J U LY 1 5 , 2 0 0 6 | REAL CIO WORLD

Virtualization Reality Setting In

Analyst says hardware vendors could be hit.

The virtualization phenomenon is getting too real for hardware server builders, as one Canadian research firm predicts server sales will fall by the end of next year. More companies now realize the benefits of server optimization and consolidation to attain the full potential of their hardware investments. Virtualization technology has emerged as an effective way of implementing consolidation and maximizing computing capacity while reducing server count, according to Darin Stahl, research analyst at London, Ontario-based Info-Tech Research.

"Some [hardware vendors] are going to take the hit on [increasing server virtualization]. Obviously, from consolidating and virtualizing, customers are buying less hardware," explained Stahl. The analyst added that even the bigger margins manufacturers earn from selling bigger-capacity servers are "not going to make up for the loss of sales" from multiple servers.

Server virtualization tools allow organizations to reduce the number of boxes within their IT environments by creating virtual instances of servers within one or two high-capacity x86 physical servers. This enables organizations to efficiently utilize and manage server capacities which, according to market research firm IDC, have generally been underutilized, running at only 10 to 20 percent of capacity. An organization with 60 distributed physical servers, for example, can implement virtualization and end up with only two multiprocessor servers running 10 virtual servers, according to a recent Info-Tech research paper entitled 'The ROI of Server Consolidation'. And each of those virtual servers could have its processing power and storage capacity raised or lowered as necessary. "With virtualization, the enterprise can achieve reduction in the number of physical servers by a factor of five, 10, and even 20 to one," the report stated. Such reduction, it added, spells definite cost savings in support and server maintenance.

The same research document claimed that through virtualization, organizations could reduce server asset requirements and administrative support by up to 40 percent. "If, for example, an enterprise spends $50,000 in new server acquisitions per year, the enterprise can reduce this amount to $30,000 per year. If an enterprise were to reduce IT network operational overhead from 10 staff to six, it could realize savings of up to $40,000."

"As server consolidation initiatives close out, x86 server builders will face a decline in server volumes by the end of 2007," according to Stahl. Virtualization software developer VMware has developed a 'relationship' with hardware makers in an effort to work on standards that enable integration between server hardware and the virtualization tools, said Brian Byun, VP (products and alliances) at VMware. "You'd think that [hardware manufacturers] might not like [virtualization], but we accelerate the refresh of hardware. They may sell less hardware, but they are (now) selling them in larger configurations," said Byun.

— Mari-Len De Guzman
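The report's savings claims are simple arithmetic; as a quick back-of-the-envelope check using only the figures quoted above (US dollars; the variable names are mine):

```python
# Back-of-the-envelope check of the Info-Tech figures quoted above (USD).
acq_before, acq_after = 50_000, 30_000   # annual server acquisition spend
staff_before, staff_after = 10, 6        # network operations headcount
staff_saving = 40_000                    # annual staffing saving cited in the report

acq_cut = 100 * (acq_before - acq_after) / acq_before          # 40 percent
staff_cut = 100 * (staff_before - staff_after) / staff_before  # also 40 percent, matching the "up to 40 percent" claim

print(f"Acquisition cut: {acq_cut:.0f}%, staffing cut: {staff_cut:.0f}%")
print(f"Combined annual saving: ${(acq_before - acq_after) + staff_saving:,}")
```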

He notes that in one case, Bell Mobility discovered a system comprising 19 servers installed for one application by a vendor, which Tremblay's team found could be consolidated to nine. Such inefficient installations will be history, says Tremblay, noting that Bell has created a policy of examining vendor architecture plans for efficiency before it agrees to an implementation. "[Vendors] have to start thinking of redesigning what they are selling" to make systems more efficient, he says. If every IT organization does the same, your next consolidation effort could be your last. CIO

Send your feedback on this feature to editor@cio.in



Pundit


The Endpoint of Endpoint Security

When vendors come up with jargon like endpoint security, it's essential to deconstruct it.

By Scott Berinato

SECURITY

If you've ever watched youth soccer, you instantly understand the term 'swarm ball'. Security marketing isn't so different from youth soccer. Vendors swarm toward the ball — the jargon that will resonate with buyers — and then have a mad battle to control it, only to have the ball squirt out in another direction, whereupon they all swarm that way. Recently, the ball was compliance solutions, until the swarm caught up and jarred that around, and eventually kicked it out to where the vendors swarm now: endpoint security.

Look, endpoint security is jargon, and jargon is spin — an attempt to create buzz while downplaying potential negatives. It's antithetical to substance. It's saying certified pre-owned when what you mean is used. So we work with it, with our grain of salt, but it's useful to deconstruct jargon. Endpoint security, for example, doesn't carry the baggage of older, more specific terms for products that did zilch to dam a rather steady and torrential flow of security failures. Antivirus sounds positively antediluvian. Intrusion detection implies there's already been an intrusion, and intrusion prevention sounds an awful lot like a quixotic epithet for a firewall. One way or another, all these products and others like patch management and anti-spam are associated with failure, management burden and, of course, money spent. For what? Endpoint security, on the other hand, might comprise some of these products while carrying none of the negative connotations of those products.

There's a more manipulative progenitor of new jargon: the analyst community. White papers, market reports and mystical squares can get crowded, and the big vendors often dominate them. But what if there were more squares? "No, no," says Stu, the vendor sales and marketing guy. "We don't belong in the same category as BehemothCo, because they do IPS, and we're more of a dynamic endpoint security solutions provider." Magically, a new quadrant is born, and Stu's company is rocking in that one, according to an analyst report. (Or, do the analysts themselves create these new categories to attract new clients?)

It's worth going back to endpoint security's vagueness. When you think of it, it's bold to call something intrusion detection. Because what if it doesn't? Endpoint security, on the other hand, doesn't promise anything, so it can't really fail.

I can anticipate one reaction to my sportive dig at vendors: Stu would say, "Look, Mr. Cynical, this is just semantics. We have to call products something, so why not focus on the positive? Would you have the cola companies call their product category Cavity-Causing Beverages? It's just marketing. What's your problem?"

"The words we use, in many ways, show what we are. And that matters in an information security industry that profits not from fixing the problem, but from perpetuating it."

I respectfully disagree. The words we use, in many ways, show what we are. And that matters in an information security industry that profits not from fixing the problem but from perpetuating it; that's slow to adapt to new and converged threats; that attacks the problem at its frayed edges, implicitly indicting end users instead of addressing the inherent flaws at the core of infrastructure; that's happy to sell post-facto bandages instead of creating a culture of preventive health. In an industry like that, words like endpoint security speak volumes. They just don't say anything. CIO

Scott Berinato is a senior writer for CSO. Send feedback on this column to editor@cio.in.


