Informationweek December 2012



Edit

Why you shouldn’t be losing sleep over data management

I’ve often asked CIOs what keeps them up at night. Is it the rigid mandate from the regulator in their industry? Perhaps audit compliance or internal policy-based issues? Is it about making IT systems and applications available to the business 24 x 7, and the commitment to 99.999 percent uptime?

The problem that most CIOs talk about is storage. But if you think deeply about it, storage is a ‘container’ for data. So the problem is really about ‘data’ — how to manage it, store it, and derive meaningful business insights from it. And these CIO concerns are really about not having enough budget to invest in incremental storage to contain the exponential growth of data.

You’ve heard the maxim: data is the crown jewels of your business. Life would be a struggle if your database or data warehouse was outdated, with old, redundant and invalid data. How would you be able to make business decisions, or reach out to prospects and existing customers to sell new products and services? Forget about business transformation if your data isn’t in order. To get round the problem, people create their own departmental databases (often using Excel worksheets), and so you have fragmentation.

We have titled this issue ‘The Data Management Conundrum’ because we believe that data management is a confusing and difficult proposition for many organizations. Even with the availability of high-end storage and data management solutions, it is a struggle for many. And we see this struggle most in the banking, healthcare, telecom, and BPO/KPO sectors. Yes, I know that a huge amount of data is generated in oil & gas, scientific research, and space exploration, but we’ll restrict the discussion to the aforementioned data-intensive sectors. Our principal correspondents Jasmine Kohli, Amrita Premrajan and Ayushman Baruah go into the heart of the companies in these sectors to find out their data management strategies.
After discussions with CIOs, analysts, vendors, experts, and consultants, we have a story that is an eye-opener. We now know how certain companies are using technology like virtualization to get round the challenges. For instance, unlike BPOs, most KPOs do not store client data on their own site. So the biggest challenge for a KPO is to make sure client data is accessible to its employees for processing, while ensuring the confidentiality of that data and preventing any form of information leak. Hence, KPOs establish a VPN tunnel to the client site and access its data using virtualization or VDI. Incidentally, BPOs and KPOs generate terabytes of data, and these companies have been extensively using top-notch data management technologies such as storage, deduplication, backup & recovery, and archiving.

So if you are caught in the data management conundrum, you know which CIOs to talk to, and what questions to ask. In ending my last editorial for the year, I wish all our readers and partners a Merry Christmas and a happy and prosperous 2013.


The problem that most CIOs talk about these days is storage. But if you think deeply about it, storage is a ‘container’ for data. So the problem is really about ‘Data’ — how to manage it, store it, and derive meaningful business insights from it

Follow me on Twitter

@brian9p


Brian Pereira is Editor of InformationWeek India. brian.pereira@ubm.com



contents | Volume 2 | Issue 02 | December 2012

20 Cover Story
Tackling the data deluge
By 2020, digital information in India will grow from 40,000 petabytes to 2.3 zettabytes (2.3 million petabytes), twice as fast as the worldwide rate, according to a report by EMC. This astounding rate of data growth means new challenges for businesses of all sizes. However, enterprises in the telecom, banking, healthcare and BPO/KPO sectors are struggling most to cope with the data deluge, owing to the sheer nature of their business and stringent regulatory and compliance guidelines. InformationWeek takes a detailed look at the key trends influencing information needs, and the unique storage requirements and challenges for each sector

32 Data flood opens new opportunities for vendors
With organizations increasingly adopting a more mature approach to data management, vendors are aggressively pursuing the opportunity

Cover Design: Deepjyoti Bhowmik

interview

34 ‘Machine-to-machine data poses greater challenge’
Daya Prakash, CIO, LG Electronics

36 ‘Data management is all about enhancing customer expectations’
Ravikiran Mankikar, GM - IT Department, Shamrao Vithal Co-op Bank (SVCB)

case study

38 DLP solution helps Hitachi Consulting ensure confidentiality of data
While the outsourcing industry has matured, customers are still wary about the way their intellectual property is handled. To address this issue, Hitachi Consulting Services deployed a comprehensive DLP solution

40 Automated backup helps Royal Sundaram Alliance Insurance slash storage space
By using an automated backup solution from Druva Software, Royal Sundaram Alliance Insurance has reduced its storage costs by a significant percentage

opinion

41 The value of data deduplication
Data deduplication can pay great dividends if used in the right situation. Consider it for backup, primary data storage, and wherever flash storage is used

42 How data deduplication technology can enhance disaster recovery
According to Gartner, 50 percent of tape backups fail to restore. As the corporate world moves from terabytes to petabytes, it’s time for organizations to consider a disk-based backup solution

Do you Twitter? Follow us at http://www.twitter.com/iweekindia
Find us on Facebook at http://www.facebook.com/informationweekindia
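Both deduplication pieces above turn on the same core idea: detect chunks of data that are byte-identical, store only one copy, and keep an ordered list of references with which each file or backup can be rebuilt. As a minimal illustrative sketch only (fixed-size blocks and SHA-256 fingerprints; shipping products use variable-size chunking and far more engineering), the mechanism looks like this:

```python
import hashlib

def dedupe(data: bytes, block_size: int = 4096):
    """Split data into fixed-size blocks and store each unique block once.

    Returns (store, recipe): store maps a SHA-256 digest to the block's
    bytes; recipe is the ordered list of digests needed to rebuild data.
    """
    store = {}
    recipe = []
    for i in range(0, len(data), block_size):
        block = data[i:i + block_size]
        digest = hashlib.sha256(block).hexdigest()
        store.setdefault(digest, block)  # keep only the first copy seen
        recipe.append(digest)
    return store, recipe

def rebuild(store, recipe):
    """Reassemble the original byte stream from the dedup store."""
    return b"".join(store[d] for d in recipe)

# A backup stream with repeated content: two identical "A" blocks
# plus one "B" block are stored as only two unique blocks.
backup = b"A" * 8192 + b"B" * 4096
store, recipe = dedupe(backup)
assert rebuild(store, recipe) == backup
assert len(recipe) == 3   # three logical blocks referenced
assert len(store) == 2    # but only two unique blocks stored
```

A second, mostly unchanged backup of the same data would add almost nothing to the store, which is why deduplication pays off most for repetitive backup streams.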

If you’re on LinkedIn, reach us at http://www.linkedin.com/groups?gid=2249272



THE BUSINESS VALUE OF TECHNOLOGY

feature

44 Big Data means big storage choices
It’s tough to keep up with what Big Data you’d like to store, especially when much of the data is unstructured text from outside — perhaps from blogs, wikis, surveys, social networks, and manufacturing systems

46 6 lies about Big Data (global cio)
Our 2013 Big Data Survey shows we’re not lacking facts, figures, or tools to wrangle them. So why do just 9 percent of respondents rate themselves as extremely effective users of data?

49 Marketing analytics: How to start without data scientists
You don’t need a team of highly paid math whizzes to get started with data analytics, says one marketing analytics expert

case study

55 ERP solution helps DIMTS slash operational costs by 30 percent
By deploying an ERP solution in the form of Microsoft Dynamics AX, the Delhi Integrated Multi-Modal Transit System (DIMTS) has gained complete visibility into all aspects of the business, such as trends, routes, and revenues

57 Maruti Suzuki CIO’s next goal: Training 100,000 people
Rajesh Uppal, Executive Director (IT) and CIO, Maruti Suzuki India, established the Maruti Suzuki Training Academy earlier this year to bridge skill gaps across job roles. The Academy has the herculean task of training everyone in the company’s 100,000-strong value chain. Here’s the plan

feature

58 How Dell is helping businesses shed legacy flab to accelerate transformation
Dell acquires Clerity Solutions and Make Technologies to help customers port applications to new and open architectures; focus shifts to tightly integrated, converged systems

News ............................................... 13, 14, 15, 17, 18
It’s time to value information that’s driving thirst for data analytics, says EMC CTO
23 percent of users are running old or outdated web browsers
Indian ecosystem geared up for software products: NASSCOM
Cloud adoption on the rise in India; usage grows by 25 percent Year-on-Year
McAfee: Security will be a challenge with 50 bn connected devices
Indian Big Data solutions market to boom
Mobile data traffic to grow at 50 percent CAGR through 2018
Enterprises still cautious in implementing APM SaaS, says CA survey
Nearly 50 percent of companies in India have security policies that prohibit BYOD
Wipro joins car connectivity consortium

EDITORIAL ........................................................ 4
INDEX ................................................................. 8
Chirpings from twitterati .......................... 10
social sphere ............................................... 11
the Month in technology ......................... 12
outlook 2013 ................................................ 60
events ............................................................. 62
cio profile ..................................................... 64
analyst angle ............................................... 65
blog ................................................................. 67
global cio ..................................................... 68
practical analysis ....................................... 69
down to business ....................................... 70



Imprint

Volume 2 | No. 02 | December 2012

print | online | newsletters | events | research

Managing Director: Sanjeev Khaira
Printer & Publisher: Kailash Pandurang Shirodkar
Associate Publisher & Director: Anees Ahmed
Editor-in-Chief: Brian Pereira
Executive Editor: Srikanth RP
Principal Correspondent: Jasmine Kohli
Principal Correspondent: Ayushman Baruah (Bengaluru)
Senior Correspondent: Amrita Premrajan (New Delhi)
Copy Editor: Shweta Nanda

Design
Art Director: Deepjyoti Bhowmik
Senior Visualiser: Yogesh Naik
Senior Graphic Designer: Shailesh Vaidya
Graphic Designer: Jinal Chheda
Designer: Sameer Surve

Marketing
Deputy Manager: Sanket Karode
Deputy Manager—Management Service: Jagruti Kudalkar

online
Manager—Product Dev. & Mktg.: Viraj Mehta
Deputy Manager—Online: Nilesh Mungekar
Web Designer: Nitin Lahare
Sr. User Interface Designer: Aditi Kanade

Operations
Head—Finance: Yogesh Mudras
Director—Operations & Administration: Satyendra Mehra

Sales
Mumbai: Marvin Dalmeida, Manager—Sales, marvin.dalmeida@ubm.com, (M) +91 8898022365
Bengaluru: Kangkan Mahanta, Manager—Sales, kangkan.mahanta@ubm.com, (M) +91 89712 32344
Delhi: Rajeev Chauhan, Manager—Sales, rajeev.chauhan@ubm.com, (M) +91 98118 20301

Head Office UBM India Pvt Ltd, 1st floor, 119, Sagar Tech Plaza A, Andheri-Kurla Road, Saki Naka Junction, Andheri (E), Mumbai 400072, India. Tel: 022 6769 2400; Fax: 022 6769 2426

Production
Production Manager: Prakash (Sanjay) Adsul

Circulation & Logistics
Deputy Manager: Bajrang Shinde

Subscriptions & Database
Senior Manager—Database: Manoj Ambardekar, manoj.ambardekar@ubm.com
Assistant Manager: Deepanjali Chaurasia, deepanjali.chaurasia@ubm.com

associate office—pune
Jagdish Khaladkar, Sahayog Apartment, 508 Narayan Peth, Patrya Maruti Chowk, Pune 411 030. Tel: 91 (020) 2445 1574, (M) 98230 38315, e-mail: jagdishk@vsnl.com

International Associate Offices
USA: Huson International Media
(West) Tiffany DeBie, Tiffany.debie@husonmedia.com, Tel: +1 408 879 6666, Fax: +1 408 879 6669
(East) Dan Manioci, dan.manioci@husonmedia.com, Tel: +1 212 268 3344, Fax: +1 212 268 3355
EMEA: Huson International Media
Gerry Rhoades Brown, gerry.rhoadesbrown@husonmedia.com, Tel: +44 19325 64999, Fax: +44 19325 64998

Editorial index: Person & Organization
A Balakrishnan, Geojit BNP Paribas Financial Services ..... 64
Alpna J Doshi, Reliance Group ..... 23
Amrita Gangotra, Bharti Airtel ..... 23
Anand Naik, Symantec ..... 60
Arindam Dutta, HP ..... 22
Bhaskar Basak, Delhi Integrated Multi-Modal Transit System ..... 55
Charlotte Davies, Ovum ..... 30
Daya Prakash, LG Electronics ..... 34
Deep Roy, NetApp ..... 32
Deepak Advani, IBM ..... 51
Deepak Varma, EMC ..... 32
Katrina Troughton, IBM ..... 51
Katyayan Gupta, Forrester Research ..... 22

Japan: Pacific Business (PBI)
Shigenori Nagatomo, nagatomo-pbi@gol.com, Tel: +81 3366 16138, Fax: +81 3366 16139

South Korea: Young Media
Young Baek, ymedia@chol.com, Tel: +82 2227 34819, Fax: +82 2227 34866

Printed and Published by Kailash Pandurang Shirodkar on behalf of UBM India Pvt Ltd, 6th floor, 615-617, Sagar Tech Plaza A, Andheri-Kurla Road, Saki Naka Junction, Andheri (E), Mumbai 400072, India. Editor: Brian Pereira. Printed at Indigo Press (India) Pvt Ltd, Plot No 1c/716, Off Dadaji Konddeo Cross Road, Byculla (E), Mumbai 400027.

RNI NO. MAH ENG/2011/39874

Editorial index (continued)
Mahesh Shinde, Hinduja Hospital ..... 30
Narayana Menon K, Sanovi Technologies ..... 32
NS Kannan, ICICI Bank ..... 53
Philip A. Davis, Dell ..... 58
Prashant Tewari, IBM India ..... 51
Rahul Singh, HCL Technologies ..... 21
Rajeev Batra, MTS India ..... 23
Rajesh Uppal, Maruti Suzuki ..... 57
Ramesh Nagarajan, Wipro Technologies ..... 21
Ravikiran Mankikar, Shamrao Vithal Co-op Bank ..... 36
Sachin Jain, Evalueserve ..... 21
Sam Abraham, Royal Sundaram Alliance Insurance Company ..... 40
Sandeep Kejriwal, EMC Corporation ..... 42
Sesanka Pemmaraju, Hitachi Consulting Software Services India ..... 38
Sudipta K Sen, SAS Institute India ..... 33
Suresh Kumar, SevenHills e-Health ..... 30
Venkatesh Swaminathan, Attachmate India ..... 33
Viswanathan N, L&T Infotech ..... 50

ADVERTISERS’ INDEX
Company name | Page No. | Website | Sales Contact
APC | 2 | www.schneider-electric.com | esupport@apc.com
IBM | 3 | www.ibm.com | stgflash@in.ibm.com
Ctrl S | 5 | www.ctrls.in/mumbai-data-center | marketing@ctrls.in
Virtual | 9 | www.interop.in | salil.warior@ubm.com
IBM - Gatefold | 24 - 29 | www.ibm.com |
Symantec | 71 | www.symantec.com | www.symantec.com/in/nbu
Microsoft | 72 | www.windowsserver2012.in | microsoft.in/readynow

Important Every effort has been taken to avoid errors or omissions in this magazine. In spite of this, errors may creep in. Any mistake, error or discrepancy noted may be brought to our notice immediately. It is notified that neither the publisher, the editor or the seller will be responsible in respect of anything and the consequence of anything done or omitted to be done by any person in reliance upon the content herein. This disclaimer applies to all, whether subscriber to the magazine or not. For binding mistakes, misprints, missing pages, etc., the publisher’s liability is limited to replacement within one month of purchase. © All rights are reserved. No part of this magazine may be reproduced or copied in any form or by any means without the prior written permission of the publisher. All disputes are subject to the exclusive jurisdiction of competent courts and forums in Mumbai only. Whilst care is taken prior to acceptance of advertising copy, it is not possible to verify its contents. UBM India Pvt Ltd. cannot be held responsible for such contents, nor for any loss or damages incurred as a result of transactions with companies, associations or individuals advertising in its newspapers or publications. We therefore recommend that readers make necessary inquiries before sending any monies or entering into any agreements with advertisers or otherwise acting on an advertisement in any manner whatsoever.



‘Analytics helps you see your customer as a person’
Three IBM experts — Prashant Tewari, Country Manager - Business Analytics, IBM India; Deepak Advani, VP, Business Analytics Products and Solutions, IBM; and Katrina Troughton, VP, Business Analytics, IBM Software Growth Markets — discuss the application and use of Big Data and analytics in business and government bit.ly/REUKgQ

Jung Kim @azn_cybersleuth tweeted:

InformationWeek – Software > ‘Analytics helps you see your customer as a person’ http://t.co/oGKus2bb via @iweekindia

Tim Powers @timjpowers tweeted:

#Analytics helps you see your customer as a person http://t.co/VWEG8cTB (from @InformationWeek) #smarteranalytics

Abhay @mathurabhay tweeted:

InformationWeek – Software > #Analytics helps you see your customer as a person http://t.co/c6PsCAkk via @iweekindia

It’s time to value information that’s driving thirst for data analytics, says EMC CTO
As part of its innovation and technology roadmap, the company has spent USD 10.5 billion on R&D during 2003-10 http://bit.ly/W8JvNp

pauline hannemann @pjhannemann tweeted:

InformationWeek - it’s time to value information that’s driving thirst for data analytics, says EMC CTO http://t.co/mjkDOhdM via @iweekindia

Kumar Mayuresh @kumarmayuresh tweeted:

#EMC save 112M and 66M USD of #CAPEX and #OPEX savings with their journey to the cloud - http://t.co/UrvbQW7D

Preparing for the next telecom revolution
In an era of price wars, commoditization of core voice-based services, slowing subscriber growth rate and falling profit margins, telecom service providers are increasingly investing in certain technologies that will help them control churn by providing value added and intelligent services http://bit.ly/Ytxsj7

Kishorekumar @indykish tweeted:

InformationWeek – Telecom > Preparing for the next telecom revolution http://www.informationweek.in/telecom/12-11-29/preparing_for_the_next_telecom_revolution.aspx?page=2 … via @iweekindia

emkay @emkaaay789 tweeted:

Preparing for the next telecom revolution - data analytics on Call Data records to customise services http://www.informationweek.in/telecom/12-11-29/preparing_for_the_next_telecom_revolution.aspx?utm_source=newsletter&utm_medium=email&utm_campaign=291112 …

Analytics helps Usha International respond to dynamic market needs
By implementing SAP HANA Database along with SAP NetWeaver Business Warehouse component, Usha International is able to quickly churn out real-time business-critical information and rapidly respond to dynamic market needs http://bit.ly/UyifGV

Dave O’Donoghue @storagesport tweeted:

Usha Int’l reports 50 percent reduction in inventory levels from #SAP #HANA and SAP BW. http://t.co/rq7tYPGU @iweekindia @businessobjects

Nancy Quick Martin @nancyqmartin tweeted:

InformationWeek – Software > #Analytics helps #Usha #International respond to #dynamic #market needs http://t.co/Jj0jmqmO via @iweekindia

Follow us on Twitter
Follow us on Twitter @iweekindia to get news on the latest trends and happenings in the world of IT & Technology, with specific focus on India.



Social Sphere
Join us on facebook/informationweekindia

InformationWeek India posted:
64 percent Indian adults are more comfortable sharing information online than in person: Study
http://www.informationweek.in/Mobile/12-11-16/64_percent_Indian_adults_are_more_comfortable_sharing_information_online_than_in_person_Study.aspx

Sai Manohar Boidapu, Manoj Gupta, Laxmi Singh and 2 others like this.
Sai Manohar Boidapu: yep dats true !!! (November 16 at 4:16 pm)

InformationWeek India posted:
960,000 IT jobs will be created in Asia Pacific by 2015 to support Big Data, says Gartner
http://www.informationweek.in/Storage/12-11-14/960_000_IT_jobs_will_be_created_in_Asia_Pacific_by_2015_to_support_Big_Data_says_Gartner.aspx

Subramanian Ls and Sudharsan Iyengar like this.
Mohan Chandrasekaran: I don’t think so, as majority of business house and leaders don’t know even about Data then where it is Big Data

About
InformationWeek is the leading news and information source for information...

Fan of November
Suman Kumar Jha is a Data Administrator at Tulip Telecom. He completed his graduation from L N M S College, Birpur. He is also a Microsoft Certified Information Technology Professional (MCITP).

Like, Tag and Share us on Facebook
Get links to the latest news and information from the world of IT & Technology with specific focus on India. Read analysis, technology trends, interviews, case studies, whitepapers, blogs and much more.

Participate in quizzes and contests and win prizes!



News

The month in technology

Infosys signs multi-year business project with Ministry of Corporate Affairs
India’s second-largest IT services company Infosys recently signed an agreement with the Ministry of Corporate Affairs (MCA), Government of India, to implement MCA21 v2, a multi-year business transformation project. This latest agreement aims to deliver enhanced services to all of the ministry’s stakeholders: businesses, citizens, and government employees. Infosys will revamp the existing application after the transition from the incumbent vendor, and manage the overall transformation to MCA21 v2. The company will provide services, application support, and infrastructure operations for at least five years, in a contract valued at approximately USD 50 million.

The revamped application will cater to the rapidly changing needs of the industry in the new digital era. The MCA21 project introduced electronic filing, doing away with the need for the filing of paper-based records among its registrars of companies and regional directors. Sanjiv Mital, CEO, National Institute for Smart Government, said, “MCA21 is a transformational program which brought a lot of efficiency and enabled businesses in India. We expect the next phase of MCA21 to enhance the current solution and establish a model for what we like to call the ‘Transition and Transformation of Technology and Partner within Public-Private Deals’.”

EMC to acquire web intelligence provider Silver Tail Systems
EMC Corporation announced that it has signed a definitive agreement to acquire privately held Silver Tail Systems, a leader in real-time web session intelligence and behavioral analysis. Upon closing, Silver Tail will operate within the RSA security division and is expected to extend the capabilities of RSA’s Identity Protection and Verification (IPV) solutions, as well as other areas across RSA’s enterprise security portfolio. The acquisition is expected to be completed in the fourth quarter of 2012; terms of the deal were not disclosed. The acquisition is not expected to have a material impact on EMC GAAP or non-GAAP EPS for the full 2012 fiscal year.

Within RSA, Silver Tail is expected to contribute to multiple areas across RSA’s enterprise security portfolio. Silver Tail’s products will add a disruptive fraud-fighting technology designed to install and begin providing value in a matter of days, lowering complexity and cost of ownership. In addition, Silver Tail’s core transaction and behavioral analysis technology is expected to help further extend the security analytics capabilities across RSA’s enterprise security solutions portfolio.

Rolta establishes global OEM partnership with SAP
Rolta announced that it has entered into a strategic partnership with SAP to deliver SAP’s Business Intelligence technology platform as part of the RoltaOneView Enterprise Suite for customers worldwide. With this approach, the company’s customers will have single-window access to SAP’s BI technology and Rolta’s domain-specific OneView analytics solution.

SAP OEM Head John Stark said, “SAP is pleased to partner with Rolta, and provide SAP technologies as part of RoltaOneView. This powerful combination helps SAP customers to leverage their investments for achieving operational excellence.”

“With this agreement, Rolta has gained the additional capability of delivering RoltaOneView integrated with the SAP software platform comprising Business Objects, BODI, WEBI, Xcelsius, and Mobile, resulting in a higher ROI for customers,” said Pankit Desai, President, Enterprise IT Solutions, Rolta. “This will allow a wider range of customers to benefit from the strong value proposition of RoltaOneView,” he added.

RoltaOneView is a BI solution for asset-intensive process industries, including upstream and downstream oil and gas, petrochemicals, chemicals, and metals, designed to significantly improve operational efficiencies and reliability.



Storage

It’s time to value information that’s driving thirst for data analytics, says EMC CTO
EMC has chalked out its technology roadmap for 2013 with a focus on cloud computing and Big Data analytics, built on a pillar of trust. “All our product roadmap and innovation are targeted at these three pillars,” Jeff Nick, Senior VP and CTO, EMC, told InformationWeek. Nick said that though cloud is overhyped, there is a fundamental disruption and shift occurring in the IT industry with the cloud. “It has taken a long time to get us here but it’s finally here and enterprises and public Internet companies today are adopting cloud capabilities ranging from private, public and hybrid cloud.” Given its huge potential, EMC has internally adopted cloud infrastructure and virtualization for hosting internal apps, as well as outsourcing certain apps to public cloud providers. The company claims to have achieved USD 112 million and USD 66 million of CAPEX and OPEX savings respectively with its private cloud journey.

Commenting on the second area of focus, Big Data, Nick said that while cloud is changing IT, big, fast data is really changing the world. “It’s time to value information that is really driving the thirst for data analytics and the ultimate value proposition is to go for cloud data analytics. More and more data is being generated outside the data center firewall, which is more complex.”

Disruptive technologies like cloud and Big Data require constant innovation, and EMC has been investing significantly in R&D to stay ahead of the curve. EMC spent USD 10.5 billion on R&D between 2003 and 2010, and this figure should only rise over the next eight years, according to EMC Chairman and CEO Joe Tucci. EMC spends 12 percent of its revenue on R&D, which is significant in the IT industry. EMC’s India center of excellence (COE) is the largest R&D hub outside its Boston headquarters. EMC India has over 3,500 employees, most of whom are involved in R&D-related activities. In tune with its focus on innovation and R&D, EMC recently concluded its sixth Innovation Conference, aimed at promoting innovative thinking across EMC’s global presence. Over 2,200 ideas were submitted to this year’s Innovation Showcase from across 28 countries, of which 401 were contributed by EMC’s India COE.

—Ayushman Baruah

Security

23 percent of users are running old or outdated web browsers
Using anonymous data collected from the cloud-based Kaspersky Security Network, Kaspersky Lab analyzed the web browser usage patterns of its customers around the world, and made some alarming discoveries. According to the company’s report, 23 percent of users are running old or outdated web browsers, creating huge gaps in online security: 14.5 percent have the previous version, while 8.5 percent still use obsolete versions. Nearly 77 percent of Kaspersky Lab’s customers use up-to-date browsers (the latest stable or beta versions). When a new version of a browser is released, it takes more than a month for most users to make the upgrade, while cybercriminals can move to exploit known browser vulnerabilities within hours.

The proportion of users with the most recent version installed (August 2012) is: Internet Explorer, 80.2 percent; Chrome, 79.2 percent; Opera, 78.1 percent; and Firefox, 66.1 percent. The transition period is estimated to be 32 days for Chrome, 30 days for Opera and 27 days for Firefox.

The survey revealed that of the 23 percent of users running outdated browsers, almost two-thirds (14.5 percent) have the previous version of a browser, and the remaining 8.5 percent use obsolete versions. That means nearly 1 out of every 10 Internet users is using a woefully outdated web browser to check bank accounts and other personal information. The most notable examples of obsolete browsers are Internet Explorer 6 and 7, with a combined share of 3.9 percent, which represents hundreds of thousands of users worldwide.

Andrey Efremov, Director of Whitelisting and Cloud Infrastructure Research, Kaspersky Lab, said, “Our new research paints an alarming picture. While most users make a switch to the most recent browser within a month of the update, there will still be around a quarter of users who have not made the transition. That means millions of potentially vulnerable machines, constantly attacked using new and well-known web-borne threats.”

—InformationWeek News Network
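For readers checking the arithmetic in the Kaspersky figures, the quoted percentages are internally consistent; a short illustrative check (plain Python, using only the numbers from the story):

```python
# The Kaspersky figures quoted above, cross-checked.
outdated_total = 23.0      # percent of users on old or outdated browsers
previous_version = 14.5    # percent on the immediately previous version
obsolete = 8.5             # percent on obsolete versions

# The two subgroups account for the whole outdated population.
assert previous_version + obsolete == outdated_total

# Share of the outdated group that is merely one version behind.
share_previous = previous_version / outdated_total
print(f"{share_previous:.0%} of outdated users run the previous version")
# roughly two-thirds, matching the article's "almost two-thirds"
```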



Software

Indian ecosystem geared up for software products: NASSCOM
For quite some time now, industry body NASSCOM has been saying the next wave of growth in the IT industry will come from the software products space. On the sidelines of the NASSCOM Product Conclave 2012 held in Bangalore, Som Mittal, President, NASSCOM, explained how the social, financial and technological ecosystem has changed in India to make software products viable in the country. We have changed socially, and failure is accepted today more than ever before, Mittal said. He explained how mobile and the Internet offer great opportunities today, such that products can be created within weeks. The way we market our products has also changed, with online application distribution systems such as app stores. The financial ecosystem has also evolved, with venture capital (VC), angel, and private equity funds entering the market, he said.

Big opportunity for the small
While the software products ecosystem in India is all set for growth, a bulk of this growth is likely to come from the small and medium enterprises (SME) segment. NASSCOM also presented the findings of a survey titled ‘NASSCOM AWSME India Survey 2012’, conducted in association with Nielsen India, which says that it’s time for software developers to tap the SME potential. The survey highlighted that SMEs have a highly favorable disposition to the usage of software, but are currently working with a very limited level of automation. The data revealed that although 90 percent of SMEs surveyed were aware of the kind of software that can be installed on laptops or PCs, only 64 percent were actually using such technology. This presents a major opportunity for Indian technology and software providers, as the survey identifies that the Indian SME market is open to the use of productivity enhancement software.

“We truly believe that small is the next big for the Indian IT industry and it is these SMEs that will be the growth drivers of the economy. Our intention of conducting the survey was to determine the software usage of Indian SMEs and then provide the relevant information to the stakeholders in the Indian software industry. We recognize the immense potential business software and technology can bring to these companies, and this makes them an attractive market to developers provided the right opportunities are leveraged. The survey validates this recognition,” said Mittal.

— Ayushman Baruah

Cloud Computing

Cloud adoption on the rise in India; usage grows by 25 percent year-on-year

The findings of the third annual VMware Cloud Index reveal a surge in cloud usage in India, where half (50 percent) of respondents in the country said that they have already adopted cloud solutions or approaches — a 25 percent growth over last year. An additional 30 percent of respondents said that they are currently planning to deploy cloud solutions within the next 18 months. However, 40 percent of respondents said that internal resistance to change is hindering the adoption of cloud, suggesting that faster cloud adoption is possible for Indian organizations if these hindrances can be overcome. The study also revealed that 54 percent of senior IT professionals surveyed in India consider cloud computing a top business priority. There has also been an improvement in the understanding of cloud computing, with 72 percent of respondents claiming a good understanding, compared to 59 percent last year. In India, while 80 percent of respondents believe that cloud computing will help their organizations reduce IT costs, 82 percent believe that it will help them optimize their existing IT management and automation capabilities. Also, 81 percent of organizations expect cloud to help them compete more effectively in the market. According to the survey, data privacy, legacy or “loss of control” concerns (64 percent), integration with existing on-premise systems (62 percent) and security (60 percent) are the top barriers to cloud adoption in India. — InformationWeek News Network

www.informationweek.in


Security

Storage

McAfee: Security will be a challenge with 50 bn connected devices

Like everything else in Las Vegas, McAfee’s annual conference Focus 12 was big — both in terms of scale and ideas. Security vendor McAfee shared at the conference its vision of making the world a safer place to live in. “Cyber security is a global challenge with no borders,” said Joe Sexton, EVP, Global Sales, setting the stage for the event. Delivering his keynote, Michael De Cesare, Co-President, reinforced that McAfee is a company that is 100 percent focused on security. He explained how factors such as social media, cloud, Big Data and the application explosion make the job of a security vendor harder. Industry experts predict that by 2020 there will be 50 billion devices connected to the Internet, and IT will have to detect and assess these devices as they connect to corporate networks. IT will need real-time visibility and knowledge of these connected devices. This is a major challenge in enterprises today and a serious problem in today’s networks: IT is often unaware of new, potentially unsecured or compromised systems on the network. The gap between connection and awareness is exploited by attackers and places compliance initiatives at risk. “You can’t protect and secure what you can’t see. McAfee is taking a significant step in fulfilling the promise of enterprise security intelligence by providing deep asset visibility with context and putting in place a fundamental layer for achieving continuous monitoring,” said Ken Levine, Senior Vice President and General Manager, Management Systems, McAfee. With McAfee Asset Manager, McAfee promises to close a key gap in enterprise security by offering intelligent, automated, real-time asset monitoring. Part of its Vulnerability Manager solution, McAfee Asset Manager performs asset discovery that is automated, intelligent and connectivity-aware. By providing real-time awareness of newly connected devices, McAfee Asset Manager helps organizations strengthen their security by alerting McAfee Vulnerability Manager to execute targeted vulnerability scans as soon as new or unknown devices connect to the network. McAfee Asset Manager also passes asset information directly to McAfee ePolicy Orchestrator software, allowing IT to bring unmanaged systems under security management. As part of its Security Connected approach, McAfee announced advancements in its endpoint security products to deliver context-aware security that defends against advanced threats. While first-generation security focused on finding and reactively fixing known threats, McAfee’s next-generation endpoint security claims to protect businesses from both known and unknown threats. McAfee’s new solutions include user-centric dynamic whitelisting, day-zero intrusion prevention for master boot records, secure containers for mobile devices, and encrypted remote management to address advanced threats. “A new generation of mobile devices and users requires the next generation of protection,” said Candace Worley, Senior Vice President and General Manager - Endpoint Security, McAfee. “McAfee advancements in endpoint security protect businesses from both the known and unknown across all platforms. By delivering innovative security technology with the highest levels of protection, McAfee is addressing the customer demand for optimum application performance, without impacting user experience.” — Ayushman Baruah

Indian Big Data solutions market to boom

The Indian Big Data solutions market opportunity is estimated to grow to USD 153.1 million by 2014, according to a NetApp-commissioned IDC study. This represents a CAGR of 37.8 percent for the period 2011-2014. The opportunity stood at USD 58.4 million in 2011, with IT services and software contributing the major chunk of the overall Big Data solutions market. According to the study, information deluge has become the order of the day among Indian enterprises, with 40 percent of organizations in verticals like BFSI, media & entertainment, telecommunications, and government currently holding more than 100 TB of data. About a third of Indian organizations are witnessing 60 percent year-on-year growth in data. Such acceleration was unheard of in the last 5-6 years. The study revealed that the Big Data solutions market in India is currently at a nascent stage, with only 5 percent of organizations in the large and very large segment (more than 1,000 employees) having embraced the technology. By 2014, IDC expects market penetration to reach 18.4 percent of organizations in the large and very large segment across different verticals. — InformationWeek News Network
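As a quick sanity check on the figures quoted above (a sketch in Python; the USD 58.4 million base, USD 153.1 million estimate and 37.8 percent CAGR are the reported numbers, so small rounding differences are expected):

```python
# Compound the 2011 base at the reported CAGR for three years (2011 -> 2014).
base_2011_usd_mn = 58.4
cagr = 0.378
projected_2014_usd_mn = base_2011_usd_mn * (1 + cagr) ** 3
# ~152.8, consistent with the reported USD 153.1 million after rounding
print(round(projected_2014_usd_mn, 1))
```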



News: Mobile

Mobile data traffic to grow at 50 percent CAGR through 2018

Mobile data traffic doubled between Q3 2011 and Q3 2012, and is expected to grow at a compound annual growth rate (CAGR) of around 50 percent between 2012 and 2018, driven mainly by video, as per the Ericsson Mobility Report. The report also revealed that 40 percent of all phones sold in Q3 were smartphones, and that online video is the biggest contributor to mobile traffic volumes, constituting 25 percent of total smartphone traffic and 40 percent of total tablet traffic. The report further revealed that Asia Pacific (which includes India) had the largest share of total traffic in 2012. The region is expected to increase its share of data volume from around one third today to almost 40 percent in 2018. Fredrik Jejdling, Head of Region, Ericsson India, said, “APAC represents a large share of the global population. The WCDMA/HSPA population coverage is higher in APAC than the global average. It is estimated that by 2017, 90 percent of the population will be covered by WCDMA/HSPA networks.” Total mobile subscriptions are expected to reach 6.6 billion by the end of 2012 and 9.3 billion by the end of 2018.

Overall, global mobile penetration reached 91 percent in Q3 2012, and mobile subscriptions now total around 6.4 billion. Mobile subscriptions have grown by around 9 percent year-on-year and 2 percent quarter-on-quarter. Douglas Gilstrap, Senior VP and Head - Strategy, Ericsson, said, “Expectations of mobile-network quality have been elevated by the availability of smartphones and tablets that have changed the way we use the Internet.” — InformationWeek News Network
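As a rough illustration of what the forecast above implies (this multiple is derived here, not a figure from the Ericsson report), six years of roughly 50 percent annual growth compounds to about an elevenfold increase in traffic:

```python
# Cumulative growth multiple implied by a ~50 percent CAGR from 2012 to 2018.
cagr = 0.50
years = 6  # 2012 -> 2018
multiple = (1 + cagr) ** years
print(round(multiple, 1))  # ~11.4x the 2012 traffic volume
```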

Cloud Computing

Enterprises still cautious in implementing APM SaaS, says CA survey

Despite escalated industry talk about APM SaaS, a CA Technologies-sponsored survey found that a majority of enterprises are taking a cautious approach to its adoption today. The research found most organizations (61 percent) have no plans to implement APM SaaS. Nearly 24 percent use APM SaaS in some capacity, but only 4 percent use an APM SaaS vendor to monitor all of their critical applications. Conducted by IDG Research Services, the survey polled more than 100 IT managers and executives. It also revealed that approximately 15 percent of respondents plan to implement APM SaaS within the next year, with more than half planning to have their managed service provider deliver it. “We believe one reason for cautious adoption is that many APM SaaS offerings are limited in functionality and therefore don’t meet the needs of the enterprise,” said Mike Sargent, General Manager, Enterprise Management, CA Technologies. “The ability to proactively identify issues and rapidly diagnose root causes is where the value is, and today’s APM SaaS offerings don’t provide this level of capability.” The survey also found that use of on-premise APM and APM SaaS is expected to rise as the DevOps movement matures. By introducing APM early in the application lifecycle, survey respondents expect to derive multiple benefits, including improved application quality, reduced risks and costs, and accelerated time to market for new services. As more comprehensive, full-featured APM SaaS solutions become available that provide capabilities equivalent to on-premise solutions, broader adoption in large enterprises is expected. “We recommend that organizations evaluating APM products ensure they are well suited to the needs and capabilities of both pre-production and production phases, and that they will perform well across all environments, whether in the data center, the cloud or a hybrid environment,” added Sargent. — InformationWeek News Network



Security

Nearly 50 percent of companies in India have security policies that prohibit BYOD

India stood first among its global counterparts in prohibiting BYOD, with nearly half (46 percent) of Indian enterprises deploying a BYOD policy that prohibits the use of personal mobile devices for work in order to mitigate risk to the enterprise, revealed a survey by ISACA. This trend was followed by Europe (39 percent), China (30 percent) and the U.S. (29 percent). Interestingly, the survey revealed that IT professionals in India remain resistant to the BYOD trend — more than half (56 percent) reported that the risk outweighs the benefits. Regarding security controls for employees’ personal devices, nearly half (47 percent) of Indian enterprises reported deploying password management controls as a security layer, compared to China and Europe (44 percent) and the U.S. (42 percent). India registered lower interest in remote wipe capability (29 percent), which allows employers to erase the contents of an employee’s personal device as a security measure, compared to the U.S. (46 percent), China (39 percent) and Europe (37 percent).

The survey also revealed some interesting trends regarding company policies on personal use of work devices. It was observed that 58 percent of Indian respondents say their enterprises prohibit access to social networking sites from a work-supplied device. This was the highest when compared with China (33 percent), Europe (30 percent) and the U.S. (32 percent). Additionally, 45 percent of Indian respondents reported that their enterprises prohibit employees from shopping online using work-supplied devices, whereas enterprises in Europe (21 percent), the U.S. (20 percent) and China (19 percent) are more permissive. The survey also highlighted that 33 percent of respondents felt that business heads are not fully engaged in risk management, and 21 percent said that budget limits remain a barrier to effectively addressing risk. At the same time, 39 percent of Indian respondents felt the situation could be improved by increasing risk awareness among employees. — InformationWeek News Network

Software

Wipro joins Car Connectivity Consortium

Wipro Technologies recently announced it has joined the Car Connectivity Consortium (CCC) to develop smartphone-based connected-car solutions. The Car Connectivity Consortium is a body dedicated to cross-industry collaboration on global standards and solutions for smartphone and in-vehicle connectivity. Members of the CCC include world leaders in auto manufacturing, mobile communications and consumer electronics. MirrorLink, a technology standard created by the consortium, ensures seamless communication between compliant smartphones and in-car systems like steering wheels, dashboard buttons and screens. “Wipro’s deep experience in automotive infotainment and mobile technologies will help the consortium build industry-standard solutions for the growing market,” said Shoji Suzuki of Clarion Co. Ltd., Japan, a manufacturer of in-car infotainment systems and a core member of the CCC. “Membership in the Car Connectivity Consortium is the logical next step in our endeavor to create the best driving experience for consumers. We’ve been working on similar technology for more than two years and have built solutions for interfacing with Terminal Mode and Non-Terminal Mode mobile devices. Our membership in CCC will further enable us to provide MirrorLink-certified, well-tested and up-to-date solutions to our customers,” said John Slosar, GM, Automotive Electronics, Wipro Technologies. He added, “We envision a world where built-in and brought-in devices in a car work in perfect harmony with each other.” Wipro has been working on standardizing experiences across different devices to help achieve seamless infotainment. In the automotive space, the company provides product development solutions for infotainment and telematics systems, body electronics, instrument clusters, and power train & engine management systems. More than 15,000 engineers are engaged in Wipro’s Engineering R&D and Product Engineering Services. R&D services contributed 12.4 percent of Wipro’s IT services revenue in FY 2011-12. — InformationWeek News Network



In this issue

20 Tackling the data deluge
32 Data flood opens new opportunities for vendors
34 ‘Machine-to-machine data poses greater challenge’
38 DLP solution helps Hitachi Consulting ensure confidentiality of data
41 The value of data deduplication
44 Big Data means big storage choices


Cover Story



“We record agents’ screen interactions and calls for compliance and quality assurance purposes. This forms a large chunk of the data that we deal with.”
Ramesh Nagarajan, CIO, Wipro Technologies

BPO/KPO

Governed by stringent regulatory and audit compliance needs, Business Process Outsourcing (BPO) and Knowledge Process Outsourcing (KPO) are two key industry verticals required to store huge volumes of exponentially growing customer data. These verticals also need to maintain the current and historical state of business transactions and data with respect to customer interactions. In addition, they have to ensure that proper security and controls are in place to safeguard confidential customer data. The key challenge for each of these verticals is to efficiently store and preserve the huge volumes of information generated, which run into terabytes. To understand what exactly contributes to the generation of such overwhelming amounts of data, let’s take the example of Wipro BPO, which is currently dealing with 50 terabytes of data. “At Wipro’s BPO operations, we record agents’ screen interactions and customer calls for compliance and quality assurance purposes. Also, in some cases we have to preserve the data for long periods of time, running into many years. Combined across the various locations of our BPO, this type of information forms a large chunk of the total data that we deal with,” says Ramesh Nagarajan, CIO, Wipro Technologies. This holds true for all BPOs, as it is mandatory to record calls and screen interactions owing to regulatory and compliance norms. Similarly, as KPOs manage core business activities of their clients, they are required to preserve huge amounts of raw data that come from clients. KPOs also generate voluminous amounts of data in the form of research work for clients. A case in point is Evalueserve, a major KPO firm that is currently dealing with data in excess of 20 terabytes. “A major part of our data lies in flat files that we generate as part of our research work for our clients, which involves a lot of raw data coming in from various sources, and further doing analytics on these large data sets. Other types of data include databases, system or log files, the data we need for live projects and the data we have in our archives. Data retention as part of compliance governed by clients and local authorities also leads to the increase in data volumes,” informs Sachin Jain, CIO (Global IT Operations) & CISO, Evalueserve.

Critical data management technologies

With BPOs and KPOs grappling with terabytes of data, these verticals are increasingly looking at data management technologies to address their storage and retention, backup and security, and data archival requirements. “Considering the voluminous data churned out in the BPO and KPO verticals, the key technologies that should be deployed by them include data deduplication, data archival and indexing. Another important point is data tiering from Tier I Fiber Channel (FC) storage to Tier III Serial Advanced Technology Attachment (SATA) storage (which is low cost) to keep pace with data volumes for cost optimization,” says Rahul Singh, President, Financial Services and Business Services, HCL Technologies. Echoing the thoughts on data tiering, Evalueserve’s Jain says, “It is extremely important for data-intensive verticals like BPOs and KPOs to segregate different forms of data and, after its evaluation, plan the type and capacity of storage required. It is important for these verticals to ensure that such capacity planning is done regularly on the basis of data usage and growth trends, while taking into account emerging data management technologies as well as dynamic business needs.”

Since BPOs and KPOs are required to ensure continuous availability of business-critical applications and business continuity at all times, these verticals need a strong data backup and archival strategy to ensure that all the data is backed up and can be retrieved quickly in case something goes wrong. As this directly affects their business, most BPO and KPO organizations have already deployed these technologies and are ready to invest in the newer technologies emerging in this area. “Our backup strategy employs multiple levels of backup such as physical and logical database backups, and direct storage backups. The complementary recovery strategies enable data recovery based on recovery requirements,” says Arindam Dutta, Head - Global Business Analytics & Asia Business Services, HP. Similarly, Wipro BPO uses a centralized reporting mechanism that enables the company to regularly evaluate data size and check backup and tape status. Nagarajan of Wipro Technologies asserts that this technology gives them the intelligence to smartly address the cost-intensive challenge of retaining huge amounts of data.

Another highly critical aspect of data management within BPOs and KPOs is ensuring the safety of client data. To make sure client-related data is stored safely and doesn’t get corrupted during the contract period, companies in these verticals are looking at data loss prevention (DLP) technology. For example, HCL’s BPO has deployed DLP technology to ensure the security of critical data, along with advanced storage with deduplication capabilities. Apart from this, it uses a secure IPsec tunnel with 3DES/AES-encrypted communication for call center agents interacting over the Internet. The BPO’s entire call recording data is encrypted, and all passwords are protected and encrypted. Although KPOs deploy all the regular data management technologies like data deduplication, data backup and archiving, unlike BPOs most KPOs do not store much client data on site. Thus, the biggest problem for a KPO is to ensure employees can access client data for work, while maintaining the confidentiality of that data and preventing any form of information leak. And this is where virtualization comes to the rescue. “To tackle the twin challenges of ensuring accessibility of data to employees while ensuring complete security of client data, most KPOs have deployed virtualization technologies where employees are restricted to working on an image of the file on their virtual desktops, while the actual file is saved in real time on the client servers. And once the project is delivered by the KPO to its client, they are usually taken off the client server,” says Katyayan Gupta, Analyst and Connectivity Lead, Telecom and Networking Services, APAC & Emerging Markets, Forrester Research.

With BPOs and KPOs generating terabytes of data, they have been extensively using regular data management technologies ranging from data storage, deduplication, backup and recovery to archiving. What is new is the evolving set of data management and analytics technologies that have the potential to provide a strong competitive edge to the business. Since reliance on data is huge and tied to core business activities, most CIOs in these verticals are now focusing on tapping the next level of data management technologies to build a solid foundation from which insights can be gained for driving better decisions.

Telecom

The telecom industry is another vertical grappling with the challenge of storing petabytes of information. Strict compliance and regulatory requirements governing this vertical are the primary reason the sector is required to store huge volumes of data for specific periods of time, while ensuring easy retrieval of the data. Apart from regulatory requirements, the second major driver of voluminous data storage within this vertical is the need to crunch data sets —

both structured and unstructured — to bring out critical and actionable business insights about customers. A staggering amount of data related to call records, customer IDs, subscription information, etc., is generated every day within this vertical, and it must be stored and preserved. For example, MTS India, currently dealing with a data volume nearing a petabyte (1,000 terabytes), generates cumulative data at a whopping rate of 42 gigabytes per hour across all its systems for its current subscriber base of about 17 million. “Data generation



in the telecom environment is primarily attributed to activities by and for the customer, e.g., Call Detail Records (CDRs) produced due to voice calls, value added services, data sessions created and used, SMSes sent and received, collation of geographical information of the network and distribution networks, employee-generated data and so on,” says Rajeev Batra, CIO, MTS India.
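A back-of-the-envelope reading of the MTS India numbers above (a sketch; it assumes data is generated continuously at the stated 42 GB/hour, and uses decimal units):

```python
# Rough arithmetic implied by the stated generation rate at MTS India.
rate_gb_per_hour = 42
per_day_gb = rate_gb_per_hour * 24            # 1,008 GB -- about a terabyte a day
per_year_tb = per_day_gb * 365 / 1000         # ~368 TB a year
days_to_petabyte = 1_000_000 / per_day_gb     # ~992 days to accumulate 1 PB
print(per_day_gb, round(per_year_tb), round(days_to_petabyte))
```

At that rate, a petabyte accumulates in under three years, which is consistent with the article's description of a data volume "nearing a petabyte".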

Deriving insights from data

In the fiercely competitive telecom vertical, the primary goal of operators is to retain current customers and prevent churn. Until recently, operators were working only with structured data, like CDRs, tapping it to analyze customer usage patterns in near real-time to dynamically create campaigns for instant customer gratification, revenue assurance, and the detection and prevention of fraudulent activities. However, with the huge volumes of unstructured user-generated data streaming through social networking sites, blogs, and wikis, more and more operators are realizing that it is not just structured data they have to crunch for critical business insights. Alpna J Doshi, CIO, Reliance Group, says, “We manage a few petabytes of data, which include both structured and unstructured data. Earlier, regulation and legal causes were the key drivers for storing data. However, with advances in Big Data technology, this is now becoming a gold mine of information, which could enable operators to extract the right kind of intelligence.” For example, earlier, a dissatisfied customer would generally call customer care and voice his or her issues. Today, one of the first things a dissatisfied customer does is vent on social networking sites, which has the potential to severely tarnish the brand image of the company. “One of the major concerns for telecom companies today is, if a customer dislikes a service and posts about it on a social networking site like Twitter, how can they track it, analyze it and draw patterns,” says Forrester’s Gupta.

In order to retain data to meet the primary motives of ensuring compliance with regulatory norms and deriving important business insights from the huge data sets available, companies in the telecom sector are adopting several data management technologies. Data archiving is one such technology increasingly being adopted by the telecom vertical. “We frequently face situations where we are asked to recover a lot of old CDRs owing to financial, regulatory or legal requirements. To enable this, we have put down proper data archiving policies, which are much more formal than what is implemented in some other industry verticals,” informs Amrita Gangotra, Director - Information Technology, Bharti Airtel. Apart from a robust data archiving solution, technologies like data deduplication that ensure data integrity are proving to be highly relevant for companies within this vertical. Data deduplication technology, when deployed, analyzes the data, eliminates duplicate sets and stores only a single version of the data. This drastically reduces storage requirements for a data-intensive vertical like telecom. A case in point is MTS India, which has two solutions in place for data deduplication: an in-house solution for basic deduplication and a commercial deduplication package for heuristics-based deduplication. “To ensure data integrity is maintained throughout the enterprise, we have also implemented tools and packages for data replication, Extract, Transform and Load (ETL) and data quality,” states Batra. The need to manage and tap into huge data volumes clearly points to the high relevance of robust data management technologies and practices within the telecom vertical. As data volumes continue to surge, some major telecom operators are looking at drafting a comprehensive data life cycle management policy for unstructured data and evaluating various Big Data analytics technologies.
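The deduplication principle described above — hash each piece of data, keep a single stored copy, and turn every duplicate into a reference — can be sketched in a few lines of Python. This is only an illustration of the idea, not how any vendor's product is implemented; the `dedupe_store` function and the sample records are hypothetical.

```python
import hashlib

def dedupe_store(chunks, store=None):
    """Keep one copy of each unique chunk, keyed by its content hash.

    A minimal sketch of content-based deduplication: duplicates are
    detected by SHA-256 digest and replaced with a reference.
    """
    store = {} if store is None else store
    refs = []
    for chunk in chunks:
        digest = hashlib.sha256(chunk).hexdigest()
        store.setdefault(digest, chunk)   # store only a single version
        refs.append(digest)               # duplicates become references
    return refs, store

refs, store = dedupe_store([b"CDR-0423", b"CDR-0424", b"CDR-0423"])
print(len(refs), len(store))  # 3 references, but only 2 chunks actually stored
```

Commercial products typically operate on variable-length chunks of backup streams or storage blocks rather than whole records, and layer heuristics on top, but the space saving comes from the same mechanism.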




With hospitals increasingly relying on advanced diagnostic imaging technologies, image management and storage needs are overwhelming Suresh Kumar

CIO, SevenHills e-Health

n the healthcare sector, applications like tele-medicine, digitization of patient records and the storage of high resolution medical images all generate mountains of information. Hospitals also store information related to patients, such as demographics, family history, insurance coverage, lab results, medications, etc. As Indian hospitals go on the digital path and automate their manual processes, they are generating huge amount of data, leading to data management issues. Healthcare firms are also required to preserve data owing to strict regulatory compliance governing this sector. To comply with these regulatory norms, hospitals are not only preserving structured data residing in systems such as Hospital Information System (HIS), Hos-

pital Management System (HMS), and Clinical Support System (CSS), but also unstructured data in the form of images. For example, most hospitals today use Picture Archival and Communication Systems (PACS), which store medical images such as X-Rays, CT scans, and radiology images. As more images are added to these systems, storage needs too increase by an exponential percentage. “Medical image files are extremely large data files and the scale of data generation depends from hospital to hospital. Storing these digitized images of CT Scan and X-rays for a period of 10 years as mandated by regulation and in case of medico-legal case (MLC) for a life time is a challenge on the storage front,” says Mahesh Shinde, Director IT & Telecom, Hinduja Hospital. Echoing similar emotions, Suresh Kumar, CIO, SevenHills e-Health, says, “With healthcare organizations increasingly relying on advanced, complex diagnostic imaging technologies, the image management and storage needs are overwhelming.”

BFSI

one are the days, when people were required to stand in long queues in the bank branches for every single transaction. Thanks to increased digitization and automation of banking processes, today, a customer is able to access banking services over various channels — phone, Internet, and ATMs. Banks are transmitting images of cheques and digitizing forms and documents in order to store everything in a digital format — leading to the information explosion in the banking sector. This massive digitization of information is leading to huge storage

Healthcare

I

G

30

informationweek december 2012

Apart from the storage of these medical images, querying this huge unstructured database is another major challenge being faced by the healthcare industry, as searching a medical image for a particular patient from the database is extremely time consuming. “The present data storage capabilities and the PACS, limit information dissemination from the medical imaging data,” says Shinde. With increasing digitization, the data deluge is further going to increase in the sector. And hospitals are contemplating several data storage technologies to ensure effective storage of patient data. “Healthcare providers will have to work with a range of technology providers to ensure they meet requirements for data in terms of security, privacy and compliance. By doing so, they can maximize the benefits of information and deliver more effective treatment regimes and better co-ordinated care,” says Charlotte Davies Lead Analyst Healthcare & Life Sciences, Ovum.

requirements for banks, as every transaction is digitized, recorded and preserved. This is corroborated by a report by EMC. Though not restricted to the BFSI sector, the report shows a glimpse of storage requirements in India. The report titled ‘The Digital Universe in India’, reveals that by 2020, digital information in India will grow from 40,000 petabytes to 2.3 zettabytes (2.3 million petabytes), twice as fast as the worldwide rate.

Regulations Leading To Data Explosion

Stringent data regulatory and



compliance requirements are another factor adding to data management-related challenges within the BFSI sector. The banking industry faces mounting pressure to comply with industry and government regulations, which are tighter than in many other sectors, as banks deal with customers’ sensitive and private information. As per the directions of the Reserve Bank of India (RBI), banks are required to maintain financial records for eight years, while tax records have to be retained for seven years. “We are required to save voice files — client and dealer interactions — for two years, which are then archived and stored on LTO tapes. The voice files are stored in an encrypted manner in NAS boxes and retrieved periodically through a software solution,” says A Balakrishnan, Chief Technology Officer, Geojit BNP Paribas Financial Services. These regulations are resulting in further data growth in banks, with data also getting stored at an atomic level. This need for storage is bound to grow at an exponential pace, as a majority of India’s population is still unbanked. According to an RBI report, 41 percent of the adult population in India doesn’t have a bank account, which indicates a large untapped market for banking players. The report also identifies about 73,000 unbanked villages, underlining the potential for banks to extend their reach into unbanked areas. As the number of customers increases, so will the storage requirements of banks. To offer banking services in these untapped markets, banks are considering various forms of Information and Communications Technology (ICT)-based models rather

Data is a gold mine for enterprises to drive their business. The challenge is to draw meaningful insights from voluminous data being generated Ravikiran Mankikar

GM - IT Department, Shamrao Vithal Co-op Bank

than establishing branch offices. Banks are mainly considering mobile as a serious platform to connect with end-customers, keeping in mind the rising number of smartphone users in the country. “Mobile subscribers have outnumbered bank customers in the country. Mobile devices, too, have become much smarter and cheaper,” says Sanjay Sharma, MD, IDBI InTech. A recent report by Google, which ranks India as the second-biggest consumer of mobile Internet after the U.S., also substantiates this trend. The strategy to tap these markets will increase subscriber numbers, resulting in further data growth.

Value of data driving need for storage

As companies in the BFSI vertical offer similar products, they are increasingly turning to the huge data sets available to them for business insights about customers, and to gain a competitive edge over their peers. The sector is realizing the importance of data as a business enabler that can help firms provide customized services, retain customers, reduce costs, minimize fraud and increase revenues. This is another major factor compelling the sector to ensure effective storage of the data sets available. “The challenge is to draw

We are required to save voice files — client and dealer interactions — for two years, which are then archived and stored in LTO tapes A Balakrishnan

CTO, Geojit BNP Paribas Financial Services

meaningful insights from the voluminous data being generated through myriad sources. Data at one point of time was just data. Today, the game has changed and data is no longer just data; it is a gold mine for enterprises to drive their business. This voluminous data confronts us with the challenge to manage, store, retrieve, govern and archive data, while keeping costs at their lowest,” says Ravikiran Mankikar, GM - IT Department, Shamrao Vithal Co-op Bank.

Technology adoption for data management

To manage this voluminous amount of data, banks are adopting a host of technologies for data archival, backup and recovery. For example, Dhanlaxmi Bank recently virtualized its storage boxes, as a majority of them had become obsolete. The Bank moved its non-core banking applications to the Hitachi Universal Storage Platform VM (USP VM). These included the core lending system, loan processing, cheque truncation systems and the collaboration suite running on the intranet. The Bank also moved the core finance system, as well as its database, to the USP VM. This has enabled the Bank to set up a three-way disaster recovery mechanism, along with extending the life of the obsolete legacy boxes. The data volume in the financial services sector is expected to increase further at an exponential pace. The need of the hour for CIOs in this sector is to take decisions that shield them from expensive system reconfigurations and inefficient storage migrations.

Jasmine Kohli
jasmine.kohli@ubm.com
Amrita Premrajan
amrita.premrajan@ubm.com



Cover Story

Data flood opens new opportunities for vendors

With organizations increasingly adopting a more mature approach to data management, vendors are aggressively pursuing the opportunity

By Ayushman Baruah

Data is often considered to be the family jewels of an organization — something you can’t afford to compromise. With huge amounts of data being generated every day, organizations are increasingly facing challenges related to data storage and retention. To overcome challenges such as data complexity, retention and the indefinite scaling of storage capacity, organizations are constantly looking at newer technologies in this space. To tap this opportunity, data management vendors are delivering solutions that can help organizations manage their data more efficiently. For example, storage vendor NetApp is offering a single integrated platform for an agile data infrastructure that it claims is intelligent, immortal, and infinite. “Intelligent helps organizations deliver more impact, faster with automatic data management and efficiency. Immortal means non-disruptive operations — continuous data access even during system maintenance and upgrades. And infinite means growing the business without limits with performance, capacity and operational scalability,” informs Deep Roy, Consulting


Systems Engineer, Technology Solutions Organization – APAC, NetApp. According to Disaster Recovery (DR) software company Sanovi, enterprises are concerned primarily about two aspects of data management — protection and recovery. It says application recovery today is largely manual and error-prone, and CIOs do not have the confidence that their data and application recovery will meet the recovery metrics set by their business. “Sanovi DR management software provides automated recovery of databases, applications and infrastructure that includes servers, storage and replication technologies. This makes recovery predictable and less dependent on people and experts,” says Narayana Menon K, Lead – Marketing, APAC &

Middle East, Sanovi Technologies.

MANAGING BIG DATA

As per IDC, Big Data will be among the key challenges going forward: the volume of digital content has grown to 2.7 zettabytes (ZB), up 48 percent from 2011, and is expected to reach 8 ZB by 2015. Over 90 percent of this information is complex, unstructured and difficult to analyze and manage. For organizations across all industries, addressing these limitations requires the deployment of new classes of storage solutions (scale-out storage solutions) that are optimized for rapid data ingest, efficient storage management, and reliable access. “The growth of web- and cloud-based apps, and other data-intensive

The growth of web- and cloud-based apps, and other data-intensive apps have amplified the challenges and ushered in an era of Big Data Deepak Varma

Regional Head-Presales, EMC India & SAARC



apps have amplified the challenges and ushered in an era of Big Data, forcing companies to find new ways to scale and manage their storage environment,” says Deepak Varma, Regional Head-Presales, EMC India & SAARC. EMC claims its Isilon portfolio of hardware and complementary software helps IT organizations implement a number of best practices that can boost storage asset use, reduce operational overhead, and meet high data-availability expectations. “EMC has recently introduced the industry’s first scale-out NAS system with native Hadoop support, where EMC Isilon’s scale-out NAS with HDFS, combined with EMC Greenplum HD, delivers powerful data analytics on a flexible, highly scalable and efficient storage platform. With this solution, EMC eliminates CIOs’ need to spend valuable time investigating and integrating products from multiple vendors,” says Varma. Another offering in the space of Big Data management is Novell Dynamic File Services from Attachmate, which claims to address the continued need for enterprises to manage unstructured data more efficiently and intelligently. “The release of Novell Dynamic File Services is an answer to the rapidly growing concern of managing massive

It continues to be a challenge to control the spiraling cost of new disk storage capacity without hurting the productivity of users

Venkatesh Swaminathan, Country Head, Attachmate India

amounts of unstructured data being developed in the enterprise. Data growth is difficult to manage, and it continues to be a challenge to control the spiraling cost of new disk storage capacity without hurting the productivity of users. Novell Dynamic File Services solves this problem by getting the most out of existing hardware, while at the same time extracting maximum value from the business’s initial storage investment without changing the way users work,” says Venkatesh Swaminathan, Country Head, Attachmate India. Also, Attachmate’s Sentinel Log Manager, from its NetIQ bouquet of offerings, attempts to address the issue of log management within Big Data. “Sentinel Log Manager provides secure, simple and powerful log management that reduces deployment and management costs, while leveraging existing

TECHNOLOGIES

Processing of huge data sets is made possible by four primary technologies:

• Grid computing: A centrally managed grid infrastructure provides dynamic workload balancing, high availability and parallel processing for data management, analytics and reporting. Multiple applications and users can share a grid environment for efficient use of hardware capacity and faster performance, while IT can incrementally add resources as needed.

• In-database processing: Moving relevant data management, analytics and reporting tasks to where the data resides improves speed to insight, reduces data movement and promotes better data governance. For decision makers, this means faster access to analytical results and more agile and accurate decisions.

• In-memory analytics: This technology empowers you to solve complex problems using Big Data and sophisticated analytics in an unfettered manner. Data exploration and visualization is faster and easier with in-memory analytics. It also helps organizations quickly create and deploy analytical models.

• Support for Hadoop: Apache Hadoop is a fast-growing Big Data processing platform. A good analytics solution must provide seamless and transparent data access to Hadoop.

(Source: SAS)
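The Hadoop entry above refers to the MapReduce processing model: split the input, apply a map function to each split in parallel, then reduce the emitted key-value pairs per key. A rough, vendor-neutral sketch of that flow (the data, function names and thread-pool "cluster" are illustrative only, not any vendor's product):

```python
from collections import defaultdict
from multiprocessing.dummy import Pool  # thread pool stands in for a cluster

def map_phase(chunk):
    """Emit (word, 1) pairs for one input split."""
    return [(word.lower(), 1) for word in chunk.split()]

def reduce_phase(mapped):
    """Aggregate counts per key, as Hadoop's reducers do."""
    counts = defaultdict(int)
    for pairs in mapped:
        for word, n in pairs:
            counts[word] += n
    return dict(counts)

if __name__ == "__main__":
    splits = ["big data big storage", "data growth", "big growth"]
    with Pool(3) as pool:          # one "node" per split
        mapped = pool.map(map_phase, splits)
    result = reduce_phase(mapped)
    print(result["big"])   # 3
    print(result["data"])  # 2
```

Real Hadoop distributes the splits across machines and shuffles pairs to reducers by key, but the map-then-reduce contract is the same.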

hardware investments to establish a scalable and flexible enterprise compliance and security foundation,” says Swaminathan. Most of the data management vendors InformationWeek spoke with draw their key customers from verticals such as defense, BFSI, telecom and the government. There are two main reasons for this: first, these verticals are data-intensive; second, they are required to comply with standard operating procedures and best practices mandated by the government and the RBI.

THE ROAD AHEAD

With CIOs looking to manage their data more efficiently, vendors are investing heavily in data management solutions, including Big Data solutions. Going forward, companies will give more importance to data governance and compliance with norms such as HIPAA. There is a need for agile data warehousing. Cloud and database virtualization will grow, as business requirements warrant an agile ecosystem. Vendors are seeing good traction in data management technologies and, more importantly, in data management processes, from various organizations. “We are witnessing rising interest in solutions which can help in analyzing the data once it is managed well. We believe that data management should be treated more as a continuous process and not as a one-time event. This ensures that data, which is the foundation of the decision-making process, is complete, consistent and correct,” says Sudipta K Sen, Regional Director – South East Asia, CEO & MD, SAS Institute India.

Ayushman Baruah

ayushman.baruah@ubm.com



Interview

‘Machine-to-machine data poses greater challenge’

Are you confronted with the challenge of data deluge? If so, what are the various sources of data?

For an organization like ours, data is generated at an exponential rate. This huge volume of data comes from various functions of the business, such as customer service, sales, accounting operations and HR. The kind of data generated from these functions is a challenge for us. An even bigger challenge is unstructured data, which is emerging from social media. Today, there is a dire need to know the customer better, as today’s customers are more tech-savvy. To retain existing customers and woo new ones, we need to collaborate with them through these mediums. In our endeavor to know our customers better and service them efficiently, we need to manage this huge data. Social media tools and technologies have to be implemented in the right manner to gain insights into customer behavior. Data has to be collected with respect to customer interest areas, expected features and expected functionalities. We have to mine this data intelligently and leverage the insights to decide our future business strategy. By capturing and analyzing this data, we can benefit from insights and take real-time decisions to improve the quality of our products. These insights can also be leveraged to tap any opportunity or address any issue. This enables organizations like us to take proactive decisions and actions and improve our products.

What are the data related challenges with respect to


Daya Prakash, CIO, LG Electronics, shares his views on how social media and machine-to-machine communications are taking the challenge of data management to a different level. Excerpts from an interview with Jasmine Kohli


machine-to-machine communications?

Apart from structured and unstructured data, there is a lot of data being generated from machine-to-machine interfaces. For example, in manufacturing, we have integration of ERP with the Manufacturing Execution System (MES), where a lot of data is generated. In terms of payments and collections, we have tightly integrated systems with the various banks we operate with, and huge volumes of data get generated between the two systems. We have goods moving out, accompanied by a huge amount of scanning activity. At any point in time, machine-to-machine communication is manifold. The amount and variety of data through this medium poses a great challenge for us. Our industry is driven more by a push-sale strategy than a pull-sale strategy, i.e., we need to push our goods to the trade partners. So, while forecasting demand, it becomes important for us to have a holistic view of each trade partner’s inventory levels. For example, say we push sales to a particular trade partner, but a particular model is more in demand in another place and there is a shortage of supply of that model there. This will definitely impact our sales. To overcome such issues, we need a holistic view of inventory with trade partners. Thus, gaining insights from this machine-to-machine data is very important. With the ability to better understand where a particular model would sell, the demand-supply ratio can be adjusted accordingly by tweaking forecasting and other things. This in turn helps us improve our sales.

Please brief us about the solutions that you have adopted within your organization to address this issue.

To address this issue, we have

an ERP in place, which takes care of our end-to-end, tightly integrated operations. Our operations are not limited to organizational boundaries: we have integration with our trade partners, with third-party service providers and with suppliers. All this works in cohesion. Apart from operation-critical data, other non-business-critical data gets shared between various systems to give a holistic view of what is happening across the various channels we operate with, be it trade partners, suppliers or service providers. ERP is our backbone, and we have business intelligence solutions in place, which give management the power to take real-time decisions. This also helps us periodically monitor our KPIs and make adjustments in any operational area, if needed. Hence, we have ERP, a data warehouse, and a BI dashboard to manage the data deluge.

Brief us about the storage initiatives taken at LG.

Our storage services help us in multiple ways — data backup, data deduplication and data retention are taken care of by the Hitachi storage solution that we have deployed. The main chunk of data that we collect pertains to customers — personal information, bank details, credit card details, mobile numbers and demographics. This has various privacy and confidentiality issues attached to it. There is a challenge to maintain this confidential information too. It has become mandatory for organizations to comply with the IT Act 2000, and those who fail may face legal implications. Hence, the foremost challenge for CIOs is to prevent data leakage.

“In our endeavor to know our customers better and service them efficiently, we need to manage the huge unstructured data emerging from social media”

Jasmine Kohli
jasmine.kohli@ubm.com



Interview

‘Data management is all about enhancing customer expectations’

According to you, how important is data management from a business point of view?

Banks are custodians of the personal information of customers. Banks utilize this information to gain insights into customer buying habits. When we analyze information with respect to a number of customers, we understand the unique needs of each customer, and customize our products accordingly. Data is hence extremely important for us from a competitive point of view. This data, however, is in a raw form, which we then transform into information using business intelligence. To extract meaningful information from the data, we have to ensure that the data is clean and updated. The result of any transformation of data into information depends on the quality of the data. We need to ensure that there is no duplication of data. We also have to ensure that pertinent data is stored according to the current context. Data management is all about enhancing customer expectations and customer retention by intelligently using data.

Can you give us an example to explain this?

From the business operation point of view, banks need to understand profitability, customer segmentation, and profitability vis-à-vis the cost of the product. Data plays a big role in taking these decisions. A case in point is Dharavi, a lower-middle-class area with a largely unbanked population. To sell our banking products in this area, we need to assess whether our products are in line with residents’ needs. Accordingly, we collect data through ancillary sources and then map this data to the products we


Data management is a science to manage customers; harness technology to be competitive, increase customer retention, and respond better to regulations, says Ravikiran Mankikar, GM - IT Department at Shamrao Vithal Co-op Bank (SVCB), in an interview with Jasmine Kohli from InformationWeek


have. Depending on the business requirement, we take decisions to modify our products to cater to this segment. The masses in these areas hold a huge potential for us in the urban co-operative space, as these people would not go and open bank accounts with large banks. Hence, data management is extremely important.

Please elaborate on how regulations play a role in overall data management?

Storage of data in the sector is mainly driven by regulations. It is mandated by the Reserve Bank of India (RBI) that data be stored for a period of 12 years, of which it is kept online for three years; later, it is archived and kept offline. Storing data for 12 years is a big challenge. We also need to ensure that retrieval and storage are in sync. It should not happen that we start storing on a device and the device becomes obsolete within two to three years. For this, archival of data has to take place, as the velocity and variety of data is huge. Thus, storage, retrieval and archival are interconnected. As a part of our BCP and DR, we conduct tests every six months. We have replicated the data at an offsite location. We then run the operations out of that location and sync them back. At SVCB, we have a defined library and periodicity for storing data. This enables us to access this information as and when required by the regulatory authorities. Data management, in my perspective, should be seen in a holistic view. One should consider customer focus, business operation focus, and lastly, storage and retrieval.

What are the problems faced in managing structured and unstructured data? What kind of data management challenges are you facing within your bank?

The challenge of managing unstructured data is not significant, as we do not have unstructured data from social media. Co-operative banks are not into social media due to RBI regulations. So, the unstructured data for us is mostly from e-mails. Structured data originating from within the organization is also not a challenge, as it is periodically stored and archived. However, we face a challenge in removing the multiple copies of files that users create on their own desktops. Data in any form is important to every individual, and this typically results in one file being saved ‘n’ number of times. We are confronted with 2x and 3x files, which are mostly copies of the same file. Users should be cognizant of this, try to limit their copies, and take backups of the most critical data or files. From an IT perspective, we have allocated users a specific area where they can store their critical backups. When these users take the overall backup, these files too get backed up. This has reduced multiple copies of files and backups on desktops, which in turn has freed a significant amount of backup space.

Can you give us an estimate of the scale of data that your bank handles?

We are a growing organization. As business grows progressively, data grows geometrically, and in a geometric progression there is huge multiplication of data. The size of data we handle is more than one terabyte. However, even if you hypothetically plan around a one-terabyte figure, you will not know when you will reach it. This is because, other than business applications, we have backups, storage and many other things that keep increasing. Hence, it is a real task to sit and remove unwanted data.

Data management should be seen in a holistic view. One should consider customer focus, business operation focus, and lastly, storage and retrieval

Jasmine Kohli
jasmine.kohli@ubm.com



Case Study

DLP solution helps Hitachi Consulting ensure confidentiality of data

While the outsourcing industry has matured, customers are still wary about the way their intellectual property is handled. To address this issue, Hitachi Consulting Services deployed a comprehensive DLP solution

By Srikanth RP

In the highly competitive world of offshoring, customers are highly paranoid about the way their data and intellectual property are handled. Like several other organizations, Hitachi Consulting Services, the offshore arm of Hitachi Consulting, was finding it difficult to predict user behavior that could compromise company data. Although no major data disaster had occurred in the past, a pre-emptive measure to avoid one in the future seemed


necessary. “We noticed that our customers were getting increasingly apprehensive about the way their data and intellectual property was being handled. And we needed to assure them that we had a robust system in place to deal with the huge volumes of data we handle every day,” states Sesanka Pemmaraju, IT Director and CISO, Hitachi Consulting Software Services India. While it was extremely important for the firm to safeguard business data

and intellectual property, it was also equally critical to ensure control of information, especially with the frequent use of mobile devices within and outside the corporate network. Data thefts often have the potential to destroy a company’s image, compliance, competitive advantage, finances and customer trust. Hitachi Consulting Services hence decided that it needed to take proactive steps to manage its data across all levels. Bringing in a data loss prevention



mechanism had become a matter of high urgency and needed immediate action. For organizations like Hitachi Consulting that routinely handle extremely sensitive information, it is crucial to build a dynamic and exhaustive data loss prevention solution. Hitachi Consulting required a dependable solution that would help it effectively manage data breaches and the risks associated with them. “We wanted to reap measurable benefits and consistent results. We explored various resources that offered solutions for data loss prevention and decided to adopt Symantec Data Loss Prevention (DLP), as it offered a multifaceted capability to significantly increase our firm’s ability to manage risks and become a highly efficient, compliant and secure working environment,” says Pemmaraju. Symantec advised a DLP deployment that included endpoint-prevent and endpoint-protect solutions to secure critical information. The Symantec DLP solution reduces the proliferation of confidential information across enterprise data centers, client systems, remote offices and end-user machines, across networks and endpoints. In addition, it supports a bevy of standards and predefined rules based on the industry segment that Hitachi Consulting works in. This DLP solution was highly

Benefits

• Ability to secure endpoints inside or outside the corporate network
• Reduced data breach risk has enabled the company to acquire more customers
• Ability to flag and notify policy violations and fix broken business processes
• Intangible savings on account of eliminating the risk of legal and compliance issues

“We have accrued a significant amount of intangible savings by mitigating data breach risks using the DLP tool”

Sesanka Pemmaraju

IT Director and CISO, Hitachi Consulting Software Services India

comprehensive, and programmed to discover, monitor and protect critical information. “As data traverses networks, there is a high possibility of it getting compromised. Data thefts are seldom predictable, as hackers always design newer methods to steal and destroy critical data. A pre-emptive and proactive approach to managing security issues is therefore a necessity,” explains Anand Naik, Managing Director - Sales, India and SAARC, Symantec. A three-pronged procedure, in the form of Discover, Monitor and Protect, and Manage, provides complete protection of information. In the Discover phase, confidential data is identified, and an inventory of critical, sensitive data is created and managed automatically. In the Monitor and Protect phase, information is monitored across locations, whether the user is accessing it over the corporate network or not, and how this data is being used at various endpoints is analyzed. Security policies relevant to securing confidential data are automatically enforced, and proactive measures are taken to prevent leakage and loss. In the Manage phase, universal policies are defined to be followed across the organization. Any unauthorized incidents are detected, reported and remediated instantly.
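At its core, the Discover phase described above amounts to scanning content against patterns for sensitive data. A deliberately simplified illustration (the patterns and flagging policy below are assumptions for the sketch; a commercial DLP engine such as the one described adds checksum validation, data fingerprinting and far richer policies):

```python
import re

# Illustrative patterns only: a real DLP engine also validates
# matches (e.g. a Luhn check for card numbers) and fingerprints files.
PATTERNS = {
    "card_number": re.compile(r"\b(?:\d[ -]?){15}\d\b"),  # 16 digits, optional separators
    "email": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
}

def discover(text):
    """Return the names of sensitive-data patterns found in text."""
    return sorted(name for name, rx in PATTERNS.items() if rx.search(text))

print(discover("Contact jane@example.com, card 4111 1111 1111 1111"))
# ['card_number', 'email']
```

The Monitor phase would run a scan like this on data leaving each channel (e-mail, web, removable media) and enforce policy on any hits.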

End-to-end protection

Today, the DLP agent monitors computers for data going out through different channels like e-mail, IM, web or FTP; removable media such as compact flash, SD and CD/DVD; print-screen captures; and printing. Patterns for code in different languages like C, C++, .NET, Java and Oracle, along with patterns like headers and footers, are collected and fed into the system. Making use of specific standards, patterns, phrases and protocols applicable to Hitachi Consulting, the team established total control over the traffic moving in and out of the corporate network. The solution has helped in protecting business-critical data and intellectual property, improved the firm’s compliance posture, helped in identifying broken business processes, and safeguarded confidential data on mobile devices both within and outside the corporate network. “Data breaches are a risk to our overall business achievements, and the DLP solution has provided us significant assurance and helped us prevent data breach incidents, educate our workforce, flag and notify policy violations and fix broken business processes. We have accrued a significant amount of intangible savings by mitigating these risks through the DLP tool, thus safeguarding company and customer confidential data,” states Pemmaraju. By taking a proactive approach, Hitachi Consulting Services today enjoys its customers’ utmost trust in its data management capabilities, as the DLP solution addresses the key IT risks that threaten customer trust and reputation.

Srikanth RP
srikanth.rp@ubm.com



Case Study

Automated backup helps Royal Sundaram Alliance Insurance slash storage space

By using an automated backup solution from Druva Software, Royal Sundaram Alliance Insurance has reduced its storage costs by a significant percentage

By Srikanth RP

As a fast-growing insurance firm with more than 100 branches across the country, Royal Sundaram Alliance Insurance (RSAI) was facing a problem managing its backup needs. Typically, employees stored multiple copies of their data for redundancy purposes. However, this created problems for backup, as the same data in multiple forms was backed up multiple times. With approximately 1,500 desktops and 300 laptops, the amount of duplicate data getting backed up was huge, and backing it up was time-consuming. Elaborating on the need for an enterprise automated backup solution, Sam Abraham, AVP - IT, Royal Sundaram Alliance Insurance Company, explains, “We were looking for efficient and easy-to-use enterprise backup software that would back up PC or laptop files incrementally. Full backups were rapidly becoming a thing of the past, and differential backup was the need of the day.” Abraham’s team evaluated several solutions but found that they could not handle applications like Lotus Notes mailboxes. After due diligence, the firm zeroed in on Druva’s inSync solution, which had an application-aware deduplication capability. The team started by purchasing two licenses to evaluate the software for a few months for client backup, and it worked without a hitch. Moreover, the load on the server was not particularly heavy. Post evaluation, the first deployment was done at the corporate office. When it worked without problems, the team expanded


“Our backup window time has considerably reduced, and we have cut down on overhead, maintenance and deployment-related costs”

Sam Abraham, AVP - IT, Royal Sundaram Alliance Insurance Company

the solution to its regional offices. As RSAI's local servers run in Calcutta, Mumbai and Delhi, the firm used inSync to back up nearly 750 PCs and laptops. All its backups are done over the network.

Road to deployment

Royal Sundaram deployed the solution for business-critical users so that enterprise data on their endpoints was protected and readily available for restore. First, RSAI identified the business-critical PCs and laptops that needed to be backed up. Once this number was determined, the client software was deployed on these endpoints. Next, the firm defined the endpoint data (drives, files and folders) that needed to be backed up. At the server end, it created backup profiles and assigned the required profile to each user, depending on the directory structure to be backed up, the backup schedule and other relevant details.

The software performs differential backup for all kinds of files and applications, so backups are faster and lighter because only the files that have changed are backed up. End users can keep a reasonable number of backups (versions) so that they can restore files from any one of them without IT involvement. In case of a system crash or similar incident, the IT team can recreate the system and restore the data for the user.
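The differential approach described here — back up only files whose contents changed since the last run — can be sketched in a few lines of Python. This is an illustrative sketch, not Druva inSync's actual implementation; the manifest file name and directory path are hypothetical.

```python
import hashlib
import json
import os

MANIFEST = "backup_manifest.json"  # hypothetical state file, not Druva's format

def file_digest(path):
    """Hash file contents so unchanged files can be skipped."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def changed_files(root, manifest):
    """Yield only files whose contents differ from the last backup."""
    for dirpath, _, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            digest = file_digest(path)
            if manifest.get(path) != digest:
                manifest[path] = digest
                yield path

# Usage: back up only what changed since the previous run.
manifest = json.load(open(MANIFEST)) if os.path.exists(MANIFEST) else {}
to_backup = list(changed_files("/data/users", manifest))  # hypothetical path
json.dump(manifest, open(MANIFEST, "w"))
```

A real client would also chunk large files and deduplicate chunks server-side; the essential idea of comparing digests against a stored manifest is the same.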

Saving space

Due to the automated backup facility, today RSAI has not only cut down its storage needs, but can also identify the backup status of each user, so the IT team can take the required steps in case of any issues. "Our backup window time has considerably reduced, and we have cut down on a lot of overhead, maintenance and deployment-related costs. We have even cut down our storage needs by a significant percentage," says Abraham. Additionally, features like differential backup and application-aware deduplication have ensured that users save a lot of storage space and enjoy faster backups. Multiple restore options ensure users always have quick access to their data.

Srikanth RP, srikanth.rp@ubm.com



Opinion

The value of data deduplication

George Crump

Data deduplication can pay great dividends if used in the right situation. Consider it for backup, primary data storage, and wherever flash storage is used

Data deduplication has become a ubiquitous term, commonly used to describe a process that identifies redundancy within a data set and stores only the unique segments. The value of investing in deduplication depends on three variables: the percentage of redundant data, the speed at which data can be transferred to the device, and the cost of the media on which the data is stored. These variables explain why backup data was the first and most obvious place to apply the technology. While the backup media was relatively inexpensive, it was still more expensive than the competitor it was attempting to replace: tape. Backup also had a high level of redundant data, so the effective capacity delivered by eliminating that redundancy was high. Finally, while performance mattered, the technology was competing with tape, so performance was less important than in other environments.

The same variables explain why deduplication has less value in disk-based archive appliances, and why those systems have struggled to gain acceptance. Most archives store the last copy of data (maybe two), not the multitude of copies that backup keeps. If a backup system gets 20x data efficiency, archives will get 3x to 5x. In fact, a compression system may have more value here, since compression optimizes every file instead of just redundant ones. In the final analysis, tape may still be the ideal long-term retention area for archive, especially when augmented by a disk front end.

The big focus for data deduplication vendors lately is primary storage. The challenge with

primary storage is that the physical media is more expensive and the data has a low level of redundancy. The good news is that the redundancy level may be higher than in the archive use case. Performance is a bigger concern on primary storage: the deduplication system has to deliver the storage efficiency while having little or no impact on performance.

Another area getting increasing attention from vendors is applying deduplication to solid state storage. While there is greater risk of a performance impact, the benefits of efficient capacity utilization on flash-based solid state systems can be large. The cost of solid state media is at such a premium that even a relatively modest efficiency improvement (5x) is well worth the investment in, and potential overhead of, deduplication. And deduplication can be applied to all-flash systems and still deliver significantly better performance than a hard disk alternative.

Deduplication, like every other storage technology, can pay great dividends if used in the right situation. It clearly makes sense in backup, may make sense in primary data storage, and I think especially makes sense wherever flash storage is used. Not all deduplication techniques are created equal, though, and some may perform better than others in given situations. As always, it is important to ask a lot of questions and do your own testing on your own data.
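Crump's three variables lend themselves to a back-of-the-envelope calculation. The ratios and per-GB prices below are illustrative assumptions, not quoted market figures; the point is how media cost and the achievable deduplication ratio interact across the use cases discussed above.

```python
# Effective media cost per GB once deduplication is applied.
# Prices and ratios are illustrative assumptions, not vendor figures.

def effective_cost_per_gb(raw_cost_per_gb, dedupe_ratio):
    """A 20x ratio means 20 GB of logical data per 1 GB actually stored."""
    return raw_cost_per_gb / dedupe_ratio

scenarios = {
    "backup to disk (20x)": (0.10, 20),  # high redundancy, cheap media
    "disk archive (4x)":    (0.10, 4),   # mostly last-copy data
    "primary storage (5x)": (0.50, 5),   # pricier media, some redundancy
    "flash / SSD (5x)":     (2.00, 5),   # premium media, big absolute savings
}

for name, (cost, ratio) in scenarios.items():
    print(f"{name}: ${effective_cost_per_gb(cost, ratio):.3f}/GB effective")
```

Note how the flash case wins the most in absolute dollars even at a modest ratio, which is exactly the argument Crump makes for dedupe on solid state.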

George Crump is Lead Analyst of Storage Switzerland, an IT analyst firm focused on the storage and virtualization segments



Opinion

How data deduplication technology can enhance disaster recovery

Sandeep Kejriwal

According to Gartner, 50 percent of tape backups fail to restore. As the corporate world moves from terabytes to petabytes, it's time for organizations to consider a disk-based backup solution

A study done by the research firm Vanson Bourne shows that 80 percent of Indian organizations have experienced data loss or downtime in the last 12 months. Historically, businesses have used tape-based backup solutions and stored the tapes physically in offsite locations. However, according to Gartner, 50 percent of all tape backups fail to restore.

A few months ago, a large number of computers in the Maharashtra Secretariat were damaged in a fire. The government's failure to have a proper disaster recovery (DR) plan drew widespread mockery and criticism. But this issue is not limited to government offices in India. In Asia Pacific and Japan, 44 percent of companies still rely on tape backups, and Boston Computing Network's Data Loss Statistics show that 77 percent of companies that tested their tape backups found failures. Are companies planning wisely? The answer is a shocking "No". Research shows that 52 percent of companies review backup and recovery plans only after a disaster strikes.

Looking back: A lesson from the Japan crisis

The crisis that followed the massive earthquake in Japan last year highlighted the need for companies to strengthen their IT disaster readiness. Many major roads were damaged and had to be closed, restricting transportation. For companies using tapes, this was a major issue affecting tape transportation and tape-based recovery. Another significant challenge during the Japan crisis was power shortage. When the disaster occurred, the existing supply was not enough to meet demand, so power had to be rationed. Because companies weren't prepared for this outage, they had to contend with intermittent power at both their primary and disaster recovery sites.

A robust disaster recovery plan should therefore be able to consolidate and automate data protection and recovery processes across the entire organization. It is extremely important for businesses to ensure minimal impact to their data in times of disaster, when business continuity is at stake.

Regulatory environment in India

India has a fast-changing regulatory environment, where compliance requirements related to data protection and security are becoming tighter. Corporate acts and taxation laws typically determine data retention schedules based on the type of records. For example, financial records must be retained for at least eight years, while tax records must be kept for seven years. The Reserve Bank of India (RBI) has released business continuity planning (BCP) guidelines for the banking sector. The RBI specifies technology aspects of BCP, including high availability and fault tolerance for mission-critical applications and services; Recovery Time Objective (RTO) and Recovery Point Objective (RPO) metrics; and testing, auditing and reporting guidelines. The RBI also suggests a near-site disaster recovery architecture to enable quick recovery and continuity of critical business operations.

India adopted new data protection measures last year, called the Information Technology (Reasonable Security Practices and Procedures and Sensitive Personal Data or Information) Rules, 2011. These are designed to protect sensitive personal data and information across all industries. The new rules require that organizations inform individuals whenever their personal information



is collected by e-mail, fax or letter. Furthermore, they demand that organizations take steps to secure personal data and offer a dispute resolution process for issues that arise around the collection and use of personal information. The law applies to all companies in India that collect information from anywhere, which may also have implications for foreign companies that outsource services to India.
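A retention schedule like the one described above is easy to encode and check programmatically. The sketch below uses only the two retention periods the article mentions; real schedules depend on the specific statute and record type, so treat the table as an assumption to be verified against current law.

```python
from datetime import date

# Illustrative retention schedule drawn from the article's examples;
# actual periods vary by statute and record type.
RETENTION_YEARS = {"financial": 8, "tax": 7}

def earliest_purge_date(record_type, created):
    """First date a record of this type may be deleted under the schedule."""
    years = RETENTION_YEARS[record_type]
    return created.replace(year=created.year + years)

# A financial record created in December 2012 must be kept into 2020.
print(earliest_purge_date("financial", date(2012, 12, 1)))
```

Automating this check is what lets an archive tier safely expire data instead of keeping everything forever.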

The changing technology landscape

Tapes are archaic, unreliable and inflexible. In this traditional architecture, both backup and restore are too slow, and tapes require manual intervention in the case of disaster recovery. In 2005, a prominent U.S.-based financial institution reported the loss of backup tapes in transit containing financial information on more than 1.2 million customers, including 60 U.S. senators.

Data storage approaches currently vary widely. According to a study done by Forrester Research across 550 companies in the Asia Pacific, disks are viewed most favorably in mature IT markets like Australia, Japan, Korea, New Zealand and Singapore, while internally managed offsite locations are preferred in growth markets like China, India, Indonesia, Malaysia and the Philippines. Of late, there has been an increase in the adoption of data deduplication technology, as it enables companies to restore critical data or applications in a matter of minutes, rather than the hours or days required by traditional tape-based recovery.

What is Data Deduplication?

Deduplication (often called "intelligent compression" or "single-instance storage") is a method in which one unique instance of the data is retained on the storage media, while redundant data is replaced with a pointer to the unique copy. For example, a typical e-mail system might contain 100 instances of the same one megabyte (MB) file attachment. If the e-mail platform is backed up or archived, all 100 instances are saved, requiring 100 MB of storage space. With data deduplication, only one instance of the attachment is actually stored; each subsequent instance is just referenced back to the one saved copy. In this example, a 100 MB storage demand is reduced to just 1 MB. Data deduplication also reduces the data that must be sent across a WAN for remote backups, replication, and disaster recovery.

Data deduplication is a groundbreaking technology that changes the economics of disk-based backup and recovery in this era of data proliferation. It has helped companies reduce the data stored by up to 30x, shorten the backup window from 11 hours to 3 hours, and cut the average restore time from 17 hours to 2 hours, while improving RPO and RTO. According to IDC, customers that use data deduplication will significantly reduce the overall data footprint under management, enabling more effective and greener data center operations with reduced manpower, space and energy requirements. Deduplication decreases the overall cost per GB for disk, driving disk costs down to equal or less than tape costs. The average payback period for this technology is seven months. The economic advantages of deduplication are truly overwhelming if one also considers the improved recovery over LAN/WAN connections without requiring offsite tape retrieval.

The road ahead

Trends such as cloud computing, social technologies and mobility are driving major changes in the amount of data being generated, as well as the types of data being stored (both structured and unstructured). At the same time, there has also been an increase in government and industry-specific regulations surrounding the privacy, accessibility, and retention of information. These trends are creating new challenges for backup and recovery processes. Hence, storing, accessing and leveraging business-critical data is becoming a strategic imperative for organizations of all sizes.

To enhance data storage processes, CFOs must partner with their CIO counterparts to implement technology that ensures data can be retrieved when required. Lack of adequate technology infrastructure and governance can cause irreparable damage to a company's reputation, apart from attracting financial penalties and legal action. Companies must also revisit their storage, backup and archival approaches every 12-18 months. This is essential not only for ensuring compliance, but also for containing the cost of managing growing data volumes and meeting the service-level expectations of the business. As the corporate world moves from terabytes to petabytes, it's time to move beyond tape and think about smart disk-based backup solutions.
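The single-instance mechanism behind the e-mail attachment example can be illustrated with a toy content-addressed store. This is a conceptual sketch, not any vendor's implementation; the class and method names are hypothetical.

```python
import hashlib

class SingleInstanceStore:
    """Toy single-instance store: identical content is kept once,
    and later copies become pointers to the first."""

    def __init__(self):
        self.blocks = {}    # content digest -> bytes (stored once)
        self.pointers = {}  # logical name   -> content digest

    def put(self, name, content):
        digest = hashlib.sha256(content).hexdigest()
        self.blocks.setdefault(digest, content)  # store only if new
        self.pointers[name] = digest             # everyone else points here

    def get(self, name):
        return self.blocks[self.pointers[name]]

    def stored_bytes(self):
        return sum(len(c) for c in self.blocks.values())

# The article's example: 100 mailboxes holding the same 1 MB attachment.
store = SingleInstanceStore()
attachment = b"x" * (1024 * 1024)
for i in range(100):
    store.put(f"mailbox-{i}/report.pdf", attachment)

print(store.stored_bytes())  # 1 MB actually kept, not 100 MB
```

Production systems deduplicate at the block or variable-length-chunk level rather than per file, but the hash-and-pointer idea is the same.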

Sandeep Kejriwal is CFO, EMC Corporation, India Center of Excellence



Feature

Big Data means big storage choices

It's tough to keep up with what Big Data you'd like to store, especially when much of the data is unstructured text from outside — perhaps from blogs, wikis, surveys, social networks, and manufacturing systems

By Kevin Fogarty

Big Data can improve the operational efficiency of companies using it by as much as 26 percent, according to a report released this month by Capgemini North America. That's a huge leap, and it will grow even larger, to 41 percent, within three years, if the opinions of the 600 C-level executives and senior IT people Capgemini surveyed are to be believed. Two-thirds of respondents said Big Data will be an important factor in business decisions, and will accelerate decision-making processes that have been slowed by excessive, inefficiently managed data. Eighty-four percent said the goal is to analyze Big Data in real time and act on it immediately to keep on top of changes in the market.

So why hasn't Big Data taken over the market for customer-behavior analysis and marketing? It has, at least to the extent most companies can manage it, according to analysts. Big Data, like cloud computing, is a technology category only by fiat; there is no "Big Data" SKU an IT department can order to get into Big Data management. There's not even a common definition. Any data a CIO can manage that directly affects top-line revenue is valuable, no matter its size, said Forrester analyst Vanessa Alvarez. "Big Data means big value," she


said at the May Interop show in Las Vegas. The problem with Big Data isn't defining what it is; the problem is keeping up with what you'd like to store, especially when most of the data that becomes "big" is unstructured text from outside the company, from blogs, wikis, surveys, social networks, and other sites, as well as operational data coming in from intelligent monitors built into manufacturing and transportation systems, said Alvarez.

However valuable the insight from Big Data, every project comes with a major downside: the cost of "big storage". Traditional databases don't write or process data fast enough to handle giant pools of data, which is why the open-source database Hadoop has become so popular, according to John Bantleman, CEO of Big Data database developer RainStor, in an article for Forbes. An average Hadoop cluster requires between 125 and 250 nodes and costs about a million dollars, Bantleman wrote. Data warehouses cost in the tens or hundreds of millions, so Hadoop delivers the goods at a huge discount. When you're talking about data sets such as the 200 petabytes Yahoo spreads across 50,000 network nodes, you get into real money.

In March, IDC released the first projection of the worldwide market for Big Data. It predicted the market would grow 40 percent per year — about seven times as fast as the rest of the IT industry. Most of that cost, or at least the biggest part, will come from infrastructure-investment-caliber storage projects that will drive spending in the storage market to growth rates above 61 percent through 2015, according to IDC analyst Benjamin Woo.

The data sets themselves are growing as well. Though most Big Data



deduplication can reduce the amount of storage required by almost a third, and data tiering can reduce per-unit costs by putting data in low demand on low-cost media such as DVDs or tape. The most effective way large companies deal with out-of-control data growth, however, is with scale-out NAS deployments, whose costs rise much more slowly than those of more sophisticated storage area networks, whose costs rise linearly with the volume of data stored, Aberdeen concluded.

Many companies, especially mid-size ones, have avoided some Big Data projects because the high-end, high-performance storage they specify as standard for those projects costs half a million dollars to store 20 to 40 terabytes, according to an interview consultancy Sandhill Partners did with Fred Gallagher, General Manager of Big Data cloud developer Actian. Actian's main product, Vectorwise, scales more efficiently as users add processing cores than it does by simply adding more servers. That approach — making relatively inexpensive storage hardware perform up to the level of its more expensive relatives — is the more effective way to scale storage networks to keep up with data that gets bigger and bigger ad infinitum, Gallagher said. Scale-out NAS boxes do much the same thing: they make Big Data projects with budget-busting levels of growth slightly more palatable, or at least more affordable.

Dell launched a Big Data storage package on July 23 — a rack of products based on Apache Hadoop that starts with 2 TB of storage and ranges up into petabytes. The product includes Cloudera data-management software, Apache Hadoop, Force10 networking, and Dell PowerEdge servers. It also includes data compression technology capable of shrinking data at ratios of 40:1. This frees up disk space, reduces the number of units needed for a Big Data project, and saves customers money — but they still must spend vast amounts on storage for data that, in most cases, has yet to prove its worth.
It will, predict Alvarez and other analysts. Massive amounts of data, and the ability to analyze it quickly enough that the results are still useful, are so important as decision-making tools that data forms the fourth paradigm of computer science — a whole new way of considering, analyzing, and making use of data, a vision associated with the late computer scientist Jim Gray and explored in the essay collection The Fourth Paradigm: Data-Intensive Scientific Discovery.

The first three paradigms, according to the British computer scientist Amnon Eden, differed radically in the assumptions with which they approached computers. The first treated computers as a branch of mathematics in which applications were formulae designed to produce a practical result. The second treated computer science as an engineering discipline and programs as data. The third, the scientific paradigm, treats applications as processes on a par with those of the human mind, an approach that assumes programs will eventually develop their own intelligence. The idea is a little abstract for IT, but each paradigm brought with it a new way of analyzing problems: first by observation, then by theory, and finally by simulation.

Big Data goes beyond all of those by promising to deliver insights so integrally concealed in massive amounts of data that direct observation and analysis by humans could never coax them out. Finding those answers requires enough data to make indirect correlations clear, however. Having enough data to mine for indirect correlations requires having enough storage hardware to house all that data and access it quickly. And having that much storage hardware — there's no other way to say it, according to Alvarez and Woo — means spending a lot more money on storage, no matter how efficiently it can be made to run or how cheaply it can be bought.

Source: InformationWeek USA



Feature

Six lies about Big Data

Our 2013 Big Data Survey shows we're not lacking facts, figures, or tools to wrangle them. So why do just 9 percent of respondents rate themselves as extremely effective users of data?

By Michael Healey

To paraphrase an old saying, if you torture data long enough, it'll tell you what you want to hear. And putting Big Data through that torture only lets us tell bigger lies. Marketing can justify crazy ad campaigns: "Sentiment analytics shows our latest campaign is actually a huge hit with the under-25 urban-vegan demo!" The supply chain team can use it to get more funding: "Our geolocation analysis shows if we invest in robotic warehouse automation, we'll reduce costs by 15 percent." Sales can explain why it missed its numbers: "We don't have an iOS app, and smartphone data shows that's what 87.4 percent of customers use. It's not our fault."

Don't get us wrong. The ability to collect and analyze data is a core IT value proposition. Companies such as Wal-Mart, FedEx, and Southwest Airlines gained strategic advantage by digging into their core business data long before it was labeled "big." And there's no question that more data is available than ever before, especially information from the Web and smart mobile devices. Our beef, though, is that most businesses aren't good at using the data they have now. What are the odds they'll get better at analysis by adding volume without changing their strategies?

Our InformationWeek 2013 Big Data Survey shows that some companies are making progress. For example, most have built the required infrastructure and support a range of primary data users; about one-third say they encourage wide access to information for business users. However, when it comes to


data acquisition and use models, the wheels start to fall off. There are major gaps in data analysis, even for the most common types of information: transaction data, system logs, e-mail, CRM, and web analytics. Worse, fewer than 10 percent of respondents say that ideas for promising new data points are primarily driven by a collaborative or cross-functional team within their companies. The stats we gleaned from our survey suggest this percentage should be much higher: nearly half of respondents have 500 terabytes of data or more under management; 13 percent have more than 10 petabytes. Surely there are untapped riches.

IT organizations clearly know there's a problem, as only 9 percent of respondents rate their companies as extremely effective users of the data they have. However, just 4 percent admit they stink at putting their data to its best use. Fact is, many organizations are deluding themselves into thinking they're empowering their businesses. So before you buy more storage, upgrade your warehouse platform, or spin up a massive Hadoop instance, let's take a reality check. Here are six Big Data lies organizations tell themselves. How many have you heard lately?

Lie 1: We understand how much data we have today

We asked in our survey which of seven key data sources are actively managed, hoping to see respondents widen their view beyond servers, storage arrays, and archives. Unfortunately, only 30 percent of respondents factor in their organization's cloud data, and just 11 percent include supply chain information. All that information zipping around on mobile devices? It's considered by just 35 percent of survey respondents. If you don't include dynamic data sets, you're setting your analysis up for failure. How can you do vendor performance reviews without details on how well suppliers do at getting the right goods to you at the right time for a competitive price? Likewise, if you're studying customer behavior, how can you get a true picture without web or cloud-based CRM data?

Lie 2: The data we have is good

We'll bet money that most respondents' data sets are inaccurate, incomplete, and/or misaligned with one another. Do you really have a single source of truth? Do different groups slice data in different ways? Are you making decisions based on inaccurate or incomplete data?

Case in point: 19 percent of companies in our survey use geolocation as part of their analysis strategy, pulling information from smart devices and web visitors to understand behavior. However, web location tracking is notoriously inaccurate when it comes to enterprise and institutional traffic, because most companies and government agencies work in private clouds with a limited number of egress points. If you're using web location data to track the success of your sales and marketing programs by region, you're likely basing decisions on bad information. That big block of traffic from Boston may actually come from an enterprise with offices in the Midwest.

Who's checking data quality? Just one in four respondents identified a dedicated business analyst group as one of the top two users of data within their company. It's amazing how many reports and graphs we see without sampling or accuracy notes. For example, almost every company does customer surveys, yet very few indicate confidence levels or weight the results. Got 25,000 customers? Your customer service survey should have 1,843 respondents if you want a 99 percent confidence level with a plus or minus 3 percent margin of error. Furthermore, results should be weighted by revenue level. The reality is, we just don't see that done with any type of data.
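That 1,843 figure is the standard sample-size formula at a 99 percent confidence level and a 3 percent margin of error, evaluated without regard to population size. The sketch below reproduces it and adds the finite-population correction, which trims the requirement somewhat for a known customer base of 25,000.

```python
import math

def sample_size(population, z, margin, p=0.5):
    """Minimum survey sample for a given margin of error.
    p=0.5 is the worst-case proportion; the finite population
    correction shrinks the requirement for smaller populations."""
    n0 = (z ** 2) * p * (1 - p) / margin ** 2
    return math.ceil(n0 / (1 + (n0 - 1) / population))

z_99 = 2.576  # z-score for 99 percent confidence
n0 = z_99 ** 2 * 0.25 / 0.03 ** 2
print(round(n0))                         # ~1,843, the figure quoted above
print(sample_size(25_000, z_99, 0.03))   # ~1,717 with the correction
```

Either number makes the article's point: a few hundred self-selected responses are nowhere near enough to quote results with confidence.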

Lie 3: Everything will be OK if we can just get more tools

A quarter of respondents plan to use more Big Data tools over the next 12 months. Now, we like Hadoop, NoSQL, Splunk, and the plethora of other Big Data options out there, but we recommend looking at what data sets are sitting idle before cutting a check. Given the low levels of use of the 20 internal and external data sets we asked about, it's clear the problem is related more to staffing than systems. Unfortunately, fewer respondents plan to invest in Big Data staff than in technology. Only 33 percent plan to grow their training and development programs; 9 percent are cutting back. Net new hiring ranked at the bottom of our list, with 17 percent growing staffing levels compared with 14 percent cutting.

Nowhere is this "tools over people" focus more evident than in healthcare. The federal government's electronic records incentives have driven the industry to a new level of data collection and reporting. But now that healthcare providers have all this data, they're trying to figure out how to use it. "There is big money to be made in healthcare Big Data, so everyone and their brother was throwing up solutions," says Bill Gillis, CIO of Beth Israel Deaconess Physician Organization. But it's important to work with people who understand healthcare organizations and the complexity of their data, Gillis says.

Who are the primary users of your company's data?

Department-level analysts: 39%
Senior business management: 32%
A wide array of business users; we encourage wide access: 31%
Information technology: 30%
Dedicated business analyst group: 25%
Business users dedicated to analysts, though not full time: 22%

Data: InformationWeek 2013 Big Data survey of 257 business technology professionals at companies with 50 or more employees, September 2012

“The business need should drive the process,” he says. “The tool alone will not change much. Finding a skilled hand that can effectively wield that tool will.”

Lie 4: There’s an expertise shortage

Speaking of staff, an oft-quoted McKinsey & Co. study estimates a shortfall of 140,000 to 190,000 people in "Big Data staffing" by 2018. Our own InformationWeek Staffing Survey shows that 18 percent of respondents focused on Big Data want to increase staff in this area by more than 30 percent in the next two years, but 53 percent say it will be difficult to find people with the right skills. Roll the clock back a few years and substitute the words "virtualization engineer" or "COBOL programmer" or even "webmaster" for "Big Data specialist" and you'll find people predicting similar doom. Don't get sucked in again. You already have much of this talent within your organization; you just need to set it free.

Consider that 39 percent of respondent organizations have department-level analysts as the main users of their information. Break those people out of their department silos and move them toward a more holistic view of data. For example, a U.S. retailer we worked with always had separate data teams aligned with various departments. The strongest team was within the catalog group, banging out circulation plans, catalog yields, conversion rates, even profit per page. Great stuff, but that team was limited to using catalog and financial data. Siloed from the web team, it was missing the transformation happening within the customer base. Separate departments, separate views of the truth.

Don't blame the analysts. The company built the structure, and IT wasn't involved enough to identify the information gaps between departments. Our point is, IT not only has to understand the data itself, but it must also become an integral part of identifying and growing the centralized talent pool. That means



putting more emphasis on training and talent development. Competitors will steal some of your more talented Big Data pros if you don't give them a reason to stick around. We scanned the online listings of the major hiring sites and found that salary levels for data analysts still range from about USD 55,000 to the low six figures. However, add "Big Data" to the standard titles and the average salary doubles. Expect everyone's LinkedIn titles to change in the next 12 months.

Lie 5: We know what data we need

In our survey, we asked about 10 internal and nine external data types. Internal sources include financial accounting applications, detailed sales and product data, CRM data, unstructured network data such as Office files and images, and unstructured data stored on end user devices. External sources include government statistics and other public records, geolocation data, data collected from sensors on company products and services, social network data (Facebook, Twitter), and unstructured data stored in the cloud (Office365, Google Docs). Clearly, there’s a lot of information out there. But when we asked who’s driving data analysis ideas, we were surprised to find that only 5 percent of respondents have a centralized team to drive Big Data strategy; an additional 3 percent use a looser collaborative effort.

We're not the biggest fans of committees, but given that the users of your data are likely spread far and wide, it makes sense to create a cross-functional group to identify new sources or elevate the importance of an existing stream. It's staggering to see some of the great data that's all but untouched. Take CRM, phone, e-mail, and web analytics. These four sources cover most of the communications relationship with your clients. Tying them together isn't rocket science, especially if you have decent baseline customer data to start with. Not only can you determine the number of conversations your company typically has with customers, but you can also understand how e-mail relates to phone calls and web traffic. If you have an outside sales force looping in, your CRM data gives you a profiling capability to model everything from product rollouts to customer service problems. All that intelligence exists today, yet few companies have this level of analysis integrated into their Big Data strategies. While 35 percent of survey respondents say their IT organizations include CRM in their integrated plans, only 29 percent include e-mail, 22 percent web analytics, and 14 percent phone logs.
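As a rough illustration of that kind of tie-up, the sketch below joins hypothetical CRM, e-mail and phone-log extracts on a shared customer ID. Every field name and record here is invented for illustration; a real integration would run against warehouse tables rather than in-memory lists:

```python
from collections import defaultdict

# Hypothetical extracts from three siloed systems, keyed by customer ID.
crm = [{"cust": 1, "segment": "catalog"}, {"cust": 2, "segment": "web"}]
emails = [{"cust": 1}, {"cust": 1}, {"cust": 2}]   # one row per e-mail sent
calls = [{"cust": 1}, {"cust": 2}, {"cust": 2}]    # one row per phone call

def touch_counts(crm, emails, calls):
    """Join contact events onto the CRM master records by customer ID."""
    counts = defaultdict(lambda: {"emails": 0, "calls": 0})
    for e in emails:
        counts[e["cust"]]["emails"] += 1
    for c in calls:
        counts[c["cust"]]["calls"] += 1
    return [
        {**rec,
         **counts[rec["cust"]],
         "total_touches": sum(counts[rec["cust"]].values())}
        for rec in crm
    ]

profile = touch_counts(crm, emails, calls)
# Each profile row now relates e-mail volume to call volume per customer,
# a first step toward the holistic view described above.
```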

Lie 6: We do something with our analysis

There’s nothing more frustrating for an analyst than to work for days or weeks

Eye on Critical Data
How effective is your company at identifying critical data and using it to make decisions?
Not at all effective: 4%
Slightly effective: 9%
Moderately effective: 38%
Very effective: 33%
Extremely effective: 16%

on a project, present the findings, have a great meeting with execs, then watch those recommendations die on the vine. Everyone focuses on the positive aspects of data analysis — helping find new customers or discover more productive logistics routes. But the reality is that Big Data analysis will find some negative things — about your sales team’s effectiveness, your online presence, your true costs of operations. The slow economy of the last four years has weakened multiple parts of most companies. Adding data sources and a more holistic analysis will help find and prioritize the problems you need to fix.

IT Truth Tellers

Want to raise IT's profile as a business enabler? Step in and assume responsibility for data quality across the company. Here's a quick check of items IT should review today:
- Is there a centralized data quality team? If not, set one up ASAP.
- Does the team do regular or, at minimum, spot audits of various analyses? Does it regularly look to add new data sources?
- Are critical external events annotated within your data warehouse or as part of your reporting process? For example, think about major system upgrades that would change the underlying data related to order flow.
- Do you require statistical notes, including sampling statements?
- When it comes to customer or vendor surveys, are sample sizes validated against your customer base or total market size to ensure accuracy?
- Do you run regular "stress tests" of current data sets with cross-functional teams, challenging assumptions and sacred cows?
- Do you look outward? Most respondents, 75 percent, report some public cloud use. Yet many companies aren't capturing the associated data: think WebEx conferencing for customer behavior analysis or Google Analytics for sales tracking.
Source: InformationWeek USA
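A centralized team's spot audits can start very simply. The sketch below flags missing required fields, duplicate IDs and out-of-range numeric values in a batch of records; the field names and thresholds are invented for illustration:

```python
def spot_audit(rows, required_fields, numeric_ranges):
    """Return a list of (row_index, problem) pairs for a quick data audit."""
    issues, seen_ids = [], set()
    for i, row in enumerate(rows):
        # Missing required fields (None or empty string).
        for field in required_fields:
            if row.get(field) in (None, ""):
                issues.append((i, "missing " + field))
        # Duplicate primary keys.
        row_id = row.get("id")
        if row_id in seen_ids:
            issues.append((i, "duplicate id %s" % row_id))
        seen_ids.add(row_id)
        # Numeric values outside their expected range.
        for field, (low, high) in numeric_ranges.items():
            value = row.get(field)
            if value is not None and not (low <= value <= high):
                issues.append((i, "%s out of range: %s" % (field, value)))
    return issues

orders = [
    {"id": 1, "customer": "A", "amount": 120.0},
    {"id": 1, "customer": "",  "amount": 99.0},   # duplicate id, no customer
    {"id": 2, "customer": "B", "amount": -5.0},   # negative amount
]
problems = spot_audit(orders, ["customer"], {"amount": (0, 100000)})
```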



Feature

Marketing analytics: How to start without data scientists You don’t need a team of highly paid math whizzes to get started with data analytics, says one marketing analytics expert By Jeff Bertolucci

The looming data scientist shortage, whether real or perceived, has become a hot topic recently. Some industry analysts predict companies will have a hard time finding qualified people to pull insights from their growing stockpiles of data. The McKinsey Global Institute, for instance, says the U.S. could face a shortage of up to 190,000 data scientists by 2018. There's good news, however, particularly for businesses interested in marketing analytics. In a phone interview with InformationWeek, Anil Kaul, CEO of AbsolutData, an analytics and research firm based in Alameda, Calif., said the data scientist shortage is real, but added that companies shouldn't panic if they don't already have a squad of statistical whizzes on staff. In addition, there's no reason to invest heavily in a sophisticated data science platform, at least not when you're getting the operation off the ground. "I've seen this happen at many organizations, where people say, 'I need Ph.D.'s to do this. Where am I going to find them?' Because there aren't too many of those people around," Kaul says. Often, indecision leads companies to simply do nothing. Wrong move. "When you start in analytics, in your first six months you should not worry about investing tremendous amounts of money on technology, or worry about hiring a big team," says Kaul, who has a Ph.D. in marketing analytics from Cornell University

and more than 17 years of experience in the analytics field. AbsolutData provides analytics support for major Fortune 500 companies, Kaul says, either on a project basis or increasingly on an ongoing "team-support basis," where the firm develops analytics teams for businesses. "The real impact of analytics comes when you're doing simple, basic analytics. You don't need huge tools. You don't need a Ph.D. to do that," Kaul says. More sophisticated analysis, however, requires people with serious skills in a variety of technical disciplines, such as computer science, analytics, math, modeling, and statistics. Furthermore, a data scientist must be a good communicator who is capable of understanding a business problem, transforming that problem into an analytics plan, executing the plan, and then devising a business solution. "It's a fairly skilled role that crosses two different areas," says Kaul. "It takes a particular type of person who can do that very well."

This rare combination of skills is a major reason why there's a shortage of data scientists. But things may be improving. More business schools are starting to teach analytics courses, says Kaul, which may mean more university graduates will soon have a basic set of data science skills. "Traditionally, most people have shunned analytics as a career because it meant being good at math," he says. The need for marketing analytics will grow significantly in the future, as the media landscape becomes increasingly fragmented. Thirty years ago, for instance, marketers had far fewer national outlets at their disposal, including three major television networks, fledgling cable TV channels, radio, and a select number of magazines and newspapers. Today, of course, a marketer's options may seem endless; in addition to traditional media, there are search engines, social networks, and sundry other Internet options. "The decision itself has become so complex that you need analytical support," Kaul says. "People who have analytical skills are suddenly in demand, and that's part of what is driving the shortage." And that development means that analytics as a profession will have a very bright future, Kaul believes. "This is something that will play a critical role in creating the next wave of companies," he says. "Companies will win because of the way they use data and analytics to get insights." Source: InformationWeek USA



Feature

How can enterprises identify a case for Big Data?

Big Data has grown in acceptance and adoption in the past year. Technologies have been announced and marketed in various flavours that address the three major dimensions of Big Data, i.e. volume, velocity and variety of data. One interesting thing to note is that all the products address two basic paradigms: distributed data storage, and parallel data processing and querying. These paradigms were pioneered by Google, whose work on distributed storage and MapReduce inspired the open-source Hadoop platform, which provides distributed storage of data and parallel distributed processing using MapReduce. It is a great technology for churning through and analyzing data at the petabyte scale that Google handles. The biggest question is whether this technology can be a game changer for enterprises. How can enterprises identify a case for Big Data? These questions can be projected in different ways:
1) Are enterprises facing constraints, pains or limitations in existing data management and analytics technologies, in terms of achieving results in the desired time and at the desired cost?
2) Are enterprises missing complex analytics that could go beyond traditional analytics on larger data sets, gaining more valuable insights at lower cost and in less time?
3) Can the newer Big Data technologies optimize existing technology processes to deliver faster and at a more affordable price point?
These questions form the basis for potential Big Data discovery.

BIG DATA - USE CASE PATTERNS

Big Data technologies primarily address two major technical areas: distributed database storage, which can scale without limits, and a distributed data processing and analytics framework, which allows processing and analytics across the distributed data. With that premise, the following use case patterns can be identified:


Data Storage Pattern – This pattern leverages large-scale distributed storage via Big Data databases. Historically, enterprises have archived a lot of data to offline tapes based on timelines, losing the opportunity to analyze and correlate the information over longer time periods. The Data Storage pattern enables an "Intelligent Online Archive": the cost of storage has dropped over the years, and Big Data technologies allow querying and analyzing the large distributed data held in this low-cost storage.

Transformational Use Case Pattern – This pattern is the lowest-hanging fruit for adoption. It is implemented predominantly using an analytics framework such as MapReduce, which allows transformational logic to be coded in MapReduce programs, for example for data cleaning. Data cleaning and transformation is one of the most prominent steps prior to data warehouse loading. This pattern can therefore be leveraged in data warehousing solutions to replace costly ETL tools and to achieve higher performance through distributed processing.

Analytics Use Case Pattern – The analytics pattern is the core of Big Data technologies. As part of distributed data processing across multiple nodes, mathematical algorithms can be plugged in to identify patterns and trends of various forms. Its significance grows with the volume of data processed, since more data points become available to the algorithm, yielding more meaningful results. This is the biggest constraint of existing RDBMS technology, which cannot efficiently process and analyze petabytes of data in a reasonable time at an affordable price point.

Real-Time Analytics Pattern – Processing large data volumes and producing situational analytics in real time is another critical use case. Many products allow real-time processing of large data. For example, in the capture and processing of sensor data, where the volume, velocity and variety of data are extremely high, products like GridGain and Storm provide near-real-time, in-memory Big Data solutions.

Data Caching Pattern – Big Data databases are primarily NoSQL databases, i.e. they are not RDBMSs supporting the SQL language. The structure of a NoSQL database allows it to be very fast compared to a traditional RDBMS. One example is memcached, which allows in-memory caching of data for very fast querying. Enterprises are exploring options for caching RDBMS data in in-memory NoSQL caches for faster querying.

These five use case patterns help map situations and adopt Big Data in similar scenarios. Big Data has great potential in enterprises, but it needs to be evaluated carefully to ensure it delivers big value in data processing and analytics, at an affordable price point and at greater speed. In the coming years, it will coexist with traditional BI systems and complement them.
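As a toy illustration of the Transformational pattern, the sketch below mimics the two MapReduce phases in plain Python: a map step that normalizes raw records into (key, value) pairs, and a reduce step that deduplicates them by key before warehouse loading. The record format and values are invented for illustration; a real job would run distributed on a Hadoop cluster rather than over a local list:

```python
def map_clean(record):
    """Map phase: normalize one raw comma-separated record into (key, value)."""
    cust_id, name, city = (field.strip() for field in record.split(","))
    return cust_id, (name.title(), city.upper())

def reduce_dedupe(pairs):
    """Reduce phase: keep one cleaned value per key."""
    cleaned = {}
    for key, value in pairs:
        cleaned[key] = value
    return cleaned

raw_rows = ["101, alice , pune", "102,bob,delhi ", "101,ALICE,Pune"]
cleaned = reduce_dedupe(map_clean(row) for row in raw_rows)
# cleaned now holds one normalized record per customer, ready for loading.
```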

Viswanathan N is Head – Practice & Alliance (Big Data, Cloud Computing & Legacy Transformation), L&T Infotech



Interview

‘Analytics helps you see your customer as a person’

For the past seven years, the business and IT communities from across the country have come together at a conclave called IBM Software Universe, to discover the next technology trends and how businesses can leverage these trends to stay ahead of the competition. Now in its eighth edition, the theme for this year's conclave in Mumbai on November 6 was 'How software can transform the way we live, work and play', with the tagline 'Meet Possible'. The overarching technical theme this year was Big Data and analytics, but there was a lot of talk about how the focus has shifted from the back end to the front end, with customer-facing technologies like social and mobility. Brian Pereira of InformationWeek


caught up with three IBM experts — Prashant Tewari, Country Manager - Business Analytics, IBM India, Deepak Advani, VP, Business Analytics Products and Solutions, IBM, and Katrina Troughton, VP, Business Analytics, IBM Software Growth Markets — to discuss the application and use of Big Data and analytics in business and government: Can you update us on last year’s announcement about IBM embracing Hadoop and working with Cloudera? Deepak: We’ve got a product called InfoSphere BigInsights that’s based on Hadoop. It is increasingly being used for social media analytics. With 34,000 tweets and 600 blog posts being generated every minute, that’s a lot of data to analyze and to look for


patterns. That's what Hadoop can do. We also have a product called Cognos Consumer Insight (CCI), which builds on Hadoop and BigInsights: it analyzes social media data using the MapReduce architecture, does customer sentiment analysis, and does social network analysis to determine how influential certain people are on the network. And then there's InfoSphere Streams to analyze data in real time; you don't have to store all that data in order to analyze it. Can you give us some examples of how companies, perhaps your customers, are actually benefitting from Big Data




and analytics? Deepak: XO Communications (a telco) used our predictive analytics solution and reduced customer churn by 35 percent. The way to go about doing that is to look at the historical data, going back a year or two. You need to look at all the clients and the touchpoints, and all the actions that customers took. You look for patterns with the available customer demographic information. So you take all the different data points and build a model, based on mining your historical patterns. When you start to see similar behavior for different clients, the models can tell from the patterns that a person is likely to leave. The models can also suggest what type of offer you can make to reduce the chances of a customer leaving. Another customer in Europe reduced customer churn from 20 percent to 2 percent. They integrated all the different data they had on their clients, from demographic to interaction to transactional, and looked for patterns to run those models in real time. Lots of people are now running these models in a call center. So when someone calls in you have all their details on the screen, and can identify their propensity to churn or their lifetime value. Based on what the person is saying, that text is now getting analyzed in real time, and the offers that you make can also change in real time. It's important to know what type of offer is likely to resonate with this person, because not everyone wants the same offer. Katrina: It is a shift that we see, about needing to know a client as a person or an individual. So it is about how to respond to this person, and not to someone who sits in this generic group. In a recent CEO study that IBM did, Indian CEOs saw this approach as one of the most important drivers for business growth. Which are the verticals where you are seeing a lot of use cases for Big Data and analytics? Deepak: It is across the board. Certain industries are more aggressive in deploying analytics than others.
More companies are asking what the telcos have been doing to reduce churn,


because they have been doing that for decades. We also see Market Basket Analysis used extensively in retail, and now manufacturing companies want to learn from retail. While the telecom industry is using analytics to reduce churn, banking is using it to acquire the right type of customers, those who can give the best lifetime value. So they will offer certain services to high-net-worth individuals. Manufacturing is focusing a lot on predictive maintenance; companies in this sector want to predict components that might fail and act before they do. For instance, BMW can alert the driver that, based on all the data from the engine, there's a 70 percent chance that in the next 40 kilometers the car is going to break down, and it advises the driver to take it for service. So statistical algorithms are examining all the data, looking for patterns and predicting failure. This notion of predictive maintenance is becoming big, and now energy and utility companies, and oil companies, are also asking for this. In retail, they want to predict which products people will want to buy, so that they can stock their shelves in advance. Katrina: Energy and retail companies are also looking at predictive analytics. For instance, a clean energy company might want to know the best places to install a wind turbine. You need to analyze large quantities of Big Data to make that decision. Weather prediction, flood alerts and natural disasters are other application areas. Can you touch upon some use cases that are specific to India? Prashant: We are doing work around taxation and inclusion (data gathering and analysis); we work with financial inclusion authorities to provide them with analytical inputs. A large financial inclusion authority uses our BI solution for analyzing collections and disbursements on a weekly basis. Micropayments in India are not tightly regulated.
So our solution helps gather data on a weekly basis and shows it on an executive dashboard, to enable management decisions. We also work with many state

departments to assist them in the collection of information. For example, the Ministry of Statistics and the state Departments of Economics and Statistics collect data for GDP calculation, and to figure out the deficit in terms of the index of industrial production. So these ministries and departments are tasked with carrying out statistical and data mining analysis. The third area is direct and indirect taxation: we work with both departments and provide them with tools and techniques for carrying out in-depth analysis. Katrina: We have also worked with the education industry and universities to help improve (technical) skills around analytics, by working closely with state governments, providing courseware, and helping design the curriculum. It is also about providing products and solutions that address a broader range of problems, and which are simpler to use. Can you elaborate more on what you are doing to address the demand-supply gap for data scientists? Which universities and institutions are you working with in India? Prashant: There are two sides to this: technical and management education. The SPSS solution, which is for statistics and modeling, is taught in almost every engineering college in the country. Practically all colleges use the SPSS solution for statistical analysis, and it is part of the course content. In fact, we have an academic team that provides courseware to colleges. And for management education we have some marquee names, such as IIM-Ranchi, which has included analytics in its course curriculum. Management institutions (and graduates) are beginning to realize that just having a marketing or finance degree does not make one unique or immediately employable. They are looking at analytics as an enabler to help students become employable. And we at IBM have an ambitious target to certify more people on our technologies. Brian Pereira brian.pereira@ubm.com
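The churn-scoring workflow described in this interview (mine historical patterns, fit a model, score each customer's propensity to leave) can be sketched minimally in Python. The features and training data below are invented for illustration, and the from-scratch logistic regression stands in for the production-grade tooling a vendor solution would provide:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def train(samples, labels, lr=0.1, epochs=2000):
    """Fit logistic-regression weights by stochastic gradient descent."""
    weights = [0.0] * len(samples[0])
    bias = 0.0
    for _ in range(epochs):
        for x, y in zip(samples, labels):
            p = sigmoid(sum(w * xi for w, xi in zip(weights, x)) + bias)
            error = p - y
            weights = [w - lr * error * xi for w, xi in zip(weights, x)]
            bias -= lr * error
    return weights, bias

def churn_probability(weights, bias, x):
    """Score one customer's propensity to leave."""
    return sigmoid(sum(w * xi for w, xi in zip(weights, x)) + bias)

# Invented history: (months since last order, complaints logged) -> churned?
history = [(1, 0), (2, 0), (1, 1), (8, 3), (9, 2), (7, 4)]
churned = [0, 0, 0, 1, 1, 1]
weights, bias = train(history, churned)
```

Scoring a live caller with `churn_probability(weights, bias, (9, 3))` is the kind of propensity signal that, in the scenario Deepak describes, would drive a retention offer on the call-center screen.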



Case Study

How ICICI Bank is redefining the future of banking with its social strategy India’s largest private sector bank is leading the way in showing how banks can use social media to their advantage by using the social medium to promote financial literacy, smart banking tips and safe-banking practices in addition to product information By Srikanth RP

Always the first to explore new technology mediums, ICICI Bank has been at the forefront of using technology to maintain its competitive edge. So, when the Bank announced this year that it had created a Facebook application that allows customers to see their bank statements while on the social media platform, it was another feather in the business technology cap of India's largest private sector lender. Like other technology-savvy organizations, ICICI Bank realizes the immense potential of the platform. Facebook already has more than 65 million active users in India, and what better way to engage with end customers than to be where customers already are. Facebook gives ICICI Bank one more channel to engage in real conversations with fans and prospective customers. The Bank realizes the importance of social media for interacting with customers and managing reputation by understanding or influencing individuals. The end objective is to build affinity towards the brand, especially among customers in the younger age group. "We believe it is imperative to be present on social media platforms and also offer banking services that no other banking institution does. Our objective is to provide various touchpoints for customers to engage with the brand at their convenience," says NS Kannan,

Executive Director and Chief Financial Officer, ICICI Bank. Using the social platform, the Bank provides its customers with a rich offering of apps, product information, and utilizes this platform to increase financial literacy. The Bank also utilizes this space to manage its online reputation. This involves monitoring and generating online feedback and reviews, responding to customer complaints, resolving queries, taking corrective action on incorrect information and using online feedback to guide improvements in product development for enhancing customer experience.

Presence on social media

ICICI Bank's social media strategy is based on a simple principle, to engage and interact, and its presence is spread over various platforms. The Bank has established its presence on Twitter by creating a robust infrastructure to "listen" to its customers. This has helped it to effectively and efficiently address the queries and concerns of its end customers. ICICI Bank launched its Facebook page early this year. The Bank wanted the launch to be not just another medium to connect with the audience but also to provide value-added services on the platform. "We launched a one-of-its-kind application that allows a registered user to check his account details while on the social media platform and also avail a host of services. We have always encouraged users to follow safe banking practices. Indeed, we have received a good response to our safe banking tips. We have built a 9.5-lakh-strong community on Facebook within a span of 10 months," states Kannan. On Facebook, ICICI Bank engages with its fans not only by providing them information on the convenience of banking but also by educating and increasing financial literacy among its young fan base. The Bank encourages feedback from its fans and makes the conversation two-way. The Bank's YouTube channel is

“We have increased positive perceptions of the brand by creating opportunities to listen to and engage with customers”

NS Kannan

Executive Director and CFO, ICICI Bank



a library to view its advertisements, watch interviews with its senior management, view product videos and share them. The Bank's channel has garnered over 1 lakh views in less than a year and has helped it disseminate video information on other platforms such as Facebook and Twitter. It is also increasingly looking at ways to increase its video inventory to further engage with its online-savvy customers.

Engagement with customers using social media

Over a period of 10 months, the Bank has created a bouquet of rich offerings on Facebook: a bank account app, a money personality app, a deal-of-the-day offers section, offers on banking products/services and a mechanism for customers to write to it. The Bank interacts with fans in a variety of ways, covering financial literacy, smart banking tips and safe-banking practices in addition to product information. Furthermore, it has a fully integrated method of handling customers' queries on social media, with a dedicated team handling these queries.

The Bank has also gone beyond just disseminating information and limiting its conversation to banking. It has created games, fun applications and contests for its fans to enhance interaction. The social media platform also offers the Bank a great opportunity to manage customer expectations and experience. All customer grievances are integrated at the back end in a CRM module built especially for Twitter and Facebook. The Bank has created a separate form to capture queries from the social media platform. A real-time online dashboard helps it keep track of all the comments about the Bank. If the Bank notices that there are unhappy customers, it reaches out to them and tries to resolve their issues. This has been endorsed by many customers whose problems have been resolved, and is in line with its brand philosophy of "Khayaal Aapka".

Using social media as listening posts

By using social media as listening posts, the Bank is building better relationships with its end customers. One of the important

Social Banking @ ICICI Bank:
- Over a period of 10 months, the Bank has created a bouquet of rich offerings on Facebook: a bank account app, a money personality app, a deal-of-the-day offers section, offers on banking products/services and a mechanism for customers to write to it
- The Bank interacts with fans in a variety of ways: financial literacy, smart banking tips and safe-banking practices, in addition to product information
- The Bank has built a 9.5-lakh-strong community on Facebook within a span of 10 months
- All customer grievances are integrated at the back end in a CRM module built especially for Twitter and Facebook. The Bank has created a separate form to capture queries from the social media platform. A real-time online dashboard helps it keep track of all comments about the Bank

reasons why the Bank decided to be in the social media space is to create conversations and stay highly engaged with its customers and fans in their day-to-day lives. As a result, the Bank has been the most-engaged Indian banking brand on Facebook. "The insights that we have gained from consumers about our products and services have helped us in improving customer satisfaction. We have also increased positive perceptions of the brand by creating opportunities to listen to and engage with customers. The social media platform not only allows a bank to connect with customers but also helps customers to interact amongst themselves. Social media platforms also help people share experiences. It is well known that affinity towards a brand, product or service increases once someone recommends or shares their experience of a particular brand," states Kannan, on how social media is helping ICICI Bank enhance its relations with its end customers. Trust is vital in the banking environment. ICICI Bank realizes the importance of trust and has accordingly taken measures to ensure security. "We have indigenously developed the Facebook banking applications. Our technology team has taken into consideration the Facebook environment, safety and convenience while developing the application," explains Kannan. Unlike other banks, which use Facebook only for promoting their products, ICICI Bank is looking at social mediums such as Facebook as engagement platforms. The basic objective is to take convenience to a new level by allowing customers to avail a host of banking services on Facebook itself while socializing. With a fan following of over 9.5 lakh within 10 months and a couple of nice innovations on the social platform, India's most technology-savvy bank is showing other banks and companies how relationships can truly be built using the new world of social media. Srikanth RP srikanth.rp@ubm.com



Case Study

ERP solution helps DIMTS slash operational costs by 30 percent By deploying an ERP solution in the form of Microsoft Dynamics AX, the Delhi Integrated Multi-Modal Transit System (DIMTS), has gained complete visibility into all aspects of the business such as trends, routes, and revenues By Srikanth RP

An urban transport and infrastructure services company, Delhi Integrated Multi-Modal Transit System (DIMTS) has been implementing strategic projects focused on multiple transport modes using integrated systems. The firm had a vision of developing and delivering world-class infrastructure to the citizens of Delhi. However, unwanted delays, manual processes and duplication of tasks due to multiple disparate applications were proving to be major stumbling blocks to implementing this vision. For example, DIMTS was using multiple disparate bespoke applications for leave and attendance management and for managing timesheets. It was also using standalone financial accounting software and Microsoft Office Excel for lead management. In addition, DIMTS relied on Microsoft Office Outlook for all e-mail communications. DIMTS was also struggling to deal with large amounts of communication and data flow, which was paper-based and had to be filed, taped, and moved across departments to transfer information. The firm had a few redundant and time-consuming transactions, such as the movement of papers and files across departments, which resulted in duplication of tasks. Since the firm operated from two locations, reports on project costing and employee utilization were generated manually, and each took up a substantial number of hours per day. The most significant aspect was the closing of accounts, a tedious process leading to unwanted delays.

“We wanted to reduce our time for reconciliation of accounts. Typically, our finance team worked for 3–4 days during quarter closures to reconcile and cross-check all the accounts,” states Bhaskar Basak, Vice President, DIMTS. Leads were maintained and updated manually in an Excel sheet. Project managers allocated tasks, responsibilities and schedules based on personal interaction — and had to assess skills themselves. This often resulted in over- or under-utilization of resources and, at times, tasks being allocated to personnel ill-suited to them, leading to delays and ineffectiveness.

Automating operations

Keeping in mind its growth prospects, the firm decided to implement an ERP system to centralize information and improve the efficiency and scalability of its operations. After scrutinizing a range of ERP solutions, the firm decided to implement Microsoft Dynamics AX. Initially, the firm deployed the solution for 52 concurrent users at the company’s head office in Delhi. Modules deployed included general ledger, accounts receivable, accounts payable, bank, CRM, and projects. The implementation followed the standard software development life cycle phases, from planning, analysis, design, and development to implementation and testing. The in-house team later extended the Dynamics AX solution to a number of functional and operational areas by implementing many workflow-related enhancements. Several unique requirements pertaining to workflow enhancements, such as sending a record to multiple users for discussion or clarification; asking queries of multiple users even

“Operational analysis, profit-loss ratio study and forecasting have helped in strategic planning and generating meaningful and accurate reports”

Bhaskar Basak

VP, Delhi Integrated Multi-Modal Transit System

at the time of final approval; displaying workflow history in a user-friendly format, etc., were addressed during the implementation stage. Additionally, the DIMTS deployment team customized the solution to meet its specific needs. Some of these unique requirements included leave and attendance management, and project and department budgets. The team also integrated Dynamics AX with systems such as the Driving License Issuance System implemented earlier



for the Government of Delhi. Dynamics AX also managed the revenue generated by issuing driving licenses across multiple offices in Delhi. DIMTS also manages and monitors the operations of public buses in Delhi for the Government of Delhi. Accordingly, DIMTS has implemented an automatic vehicle location system, a passenger information system, and an electronic ticketing system. The ERP solution is customized to integrate with the bus management and tracking system. It records data such as the number of buses, schedules, the actual distance travelled by each bus, and passengers on board. Based on this information, the revenue generated by each bus is calculated. At the same time, missed-trip details, details of infractions, etc., are also linked directly to the ERP solution. This has helped improve efficiency, as well as track the revenue generated on a daily basis.
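The per-bus revenue roll-up described above amounts to a simple aggregation over trip records. A minimal sketch of the idea; the field names and figures below are hypothetical illustrations, not DIMTS's actual schema:

```python
from collections import defaultdict

def summarize(trips):
    """Aggregate revenue, distance, and missed trips per bus.

    Each trip record carries the bus id, scheduled vs. completed trip
    counts, distance run, and ticketing revenue, mirroring the kind of
    data the tracking systems feed into the ERP.
    """
    summary = defaultdict(lambda: {"revenue": 0.0, "km": 0.0, "missed": 0})
    for t in trips:
        s = summary[t["bus"]]
        s["revenue"] += t["revenue"]
        s["km"] += t["km"]
        s["missed"] += t["scheduled"] - t["completed"]
    return dict(summary)

# Hypothetical sample: two day-records for bus B1, one for B2
trips = [
    {"bus": "B1", "scheduled": 4, "completed": 4, "km": 100.0, "revenue": 5000.0},
    {"bus": "B1", "scheduled": 4, "completed": 3, "km": 80.0, "revenue": 3500.0},
    {"bus": "B2", "scheduled": 4, "completed": 4, "km": 120.0, "revenue": 6000.0},
]
```

A nightly job over such records would yield the daily revenue tracking the case study mentions.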

Improved process efficiency

The solution has also eliminated the need for consolidating data from multiple disparate systems. The ERP solution has helped standardize financial processes across the organization, allowing it to reduce reporting time scales, errors, and rectification costs, while increasing reporting accuracy. In addition, the system tracks accounts payable and receivables on a daily basis. Previously, almost all business operations were paper-based. The documents had to move physically from desk to desk for approvals. Apart from the delays, the paper consumption was very high. Now, all files are managed in a central location electronically, and are available to all

relevant personnel. For example, the approval for payments made to vendors is done via the ERP. Vendor invoices and the related purchase approval documents are electronically attached to the payment approval record in the ERP, thereby eliminating paperwork. This has decreased dependency on paper, and workflow-based approvals have sped up processes significantly. The most significant benefit of the ERP is the firm’s ability to take decisive actions quickly. “With online processing, we are able to administer revenues and profits, and also keep a tab on the entire team to restore efficiency. We are also able to take a decision on the spot for accepting or rejecting bids by taking into account all factors, with adequate checks and balances,” states Basak. The new ERP solution has already eliminated redundant data entries and other manual tasks, leading to an improvement in the overall process. With automated processes and workflows, associated personnel can now make a bigger difference in driving the company’s growth. “We have been able to re-allocate the same resources to multiple other tasks, saving thousands of people hours. Overall, we have saved time and operational costs by up to 30 percent,” says Basak. Automation of processes has also helped in gaining visibility into all aspects of the business, such as trends, routes, and revenues. Operational analysis, profit-loss ratio study and forecasting have helped in strategic planning and in generating meaningful and accurate reports. With transparency and immediate access to information, the management is able to exert better control across departments to ensure smooth and efficient functioning.

The ERP solution has helped DIMTS standardize financial processes across the organization, allowing it to reduce reporting time scales, errors, and rectification costs, while increasing reporting accuracy

ERP advantage

• The solution records data related to the number of buses, schedules, actual distance travelled by each bus, and passengers on board, which enables DIMTS to calculate the revenue generated by each bus

• It manages all the files in a central location electronically, which has decreased dependency on paperwork; workflow-based approvals have increased process efficiency


u Srikanth RP srikanth.rp@ubm.com



Global CIO

Maruti Suzuki CIO’s next goal: Training 100,000 people Rajesh Uppal, Executive Director (IT) and CIO, Maruti Suzuki India, established the Maruti Suzuki Training Academy earlier this year to bridge skill gaps across job roles. The Academy has the herculean task of training everyone in the company’s 100,000-strong value chain. Here’s the plan: By Brian Pereira

The USD 7 billion Maruti Suzuki India manufactures about half the cars produced in India every year. With a current market share of 38.3 percent in the Indian automobile industry, and an average share of 45-50 percent, it rolls out 400-500 cars a day — that’s a million Maruti Suzuki cars hitting Indian roads every year. It expects to double output next year, and then produce 2.5 million cars by 2016. Its customers have a choice of 15 models with 150 variants. Ever since its inception in 1985, IT has been treated as a business function/vertical at Maruti Suzuki, something rare even for large enterprises back then. The company has been very focused on IT, and has been able to use IT effectively for its functioning. It continues to invest in IT, and even has a separate multi-million dollar budget for pilots. Undoubtedly, IT is an integral part of all aspects of its business. Rajesh Uppal, Executive Director (IT) and CIO, Maruti Suzuki India, has been with the company since it was founded. Apart from overseeing IT functions, he has also been involved in other roles such as sales and dispatch, the Gurgaon CSR initiative, and logistics. Uppal’s most recent initiative is a training academy that bridges skill gaps across job roles. He established the Maruti Suzuki Training Academy earlier this year. The Academy has the herculean task of training everyone in the company’s 100,000-strong value chain. And the learning framework depends heavily on technology. Speaking to

InformationWeek on the sidelines of INTEROP Mumbai in October, Uppal said, “I believe technology is now playing a key role in learning. Earlier we had classroom training and residential courses. But it is becoming difficult to manage that, especially when people are dispersed. So at the board level we decided to set up a learning academy that is technology oriented.” The Maruti Suzuki Training Academy has an LMS (Learning Management System). The LMS helps the institution understand each person’s level of competency, the level at which they need instruction, and the skills required for their role, and creates a learning plan based on those competencies. Maruti Suzuki is creating its own IPR for courses (its own course content). It is building a library of digital course content. There are online and offline courses, supplemented by a number of case studies. After each case study there is a checkpoint to review one’s level. The Academy already has 100 hours of learning content, its own IPR. This is mostly technical in nature but also addresses topics like negotiation skills. Maruti Suzuki Training Academy is also building a learning infrastructure. There is a broadcast studio in Uppal’s office, with 32 learning centers across the country. Sessions are broadcast to the learning centers, and students can ask questions during the session. Uppal says 400-500 persons can be trained at a time. “We had set up this Academy about nine months ago. And we plan to set up learning centers at the dealer offices within two years. The

sessions vary and cover subjects such as new product training, diagnostics tools, and skills,” adds Uppal. For now, the Academy is partnering with universities to develop course content. It is also talking to private companies like Skillsoft for course material and is hiring trainers for certain courses, though it has its own in-house trainers. Long-term, Uppal is keen to have his own IPR for course material and his own teachers. The culture of learning will also grow stronger as more people are trained at Maruti Suzuki. With this company investing heavily in training, it reinforces the belief that people are a company’s biggest asset, and that it is important for employees to stay abreast of the latest developments through continuous learning. We are confident that Uppal will succeed in his latest project, and wish him the best! u Brian Pereira brian.pereira@ubm.com



Feature

How Dell is helping businesses shed legacy flab to accelerate transformation Acquires Clerity Solutions and Make Technologies to help customers port applications to new and open architectures; focus shifts to tightly integrated, converged systems By Brian Pereira

A discussion with C-suite executives these days is likely to include talk of business transformation. While the exact definition of transformation is contextual to the nature of the business, almost everyone agrees that technology plays a leading role in their transformation plans. At its annual Indian IT Executive Summit in early November 2012 in Macau, Dell shared its own transformation story, of how it went from a PC company to a complete end-to-end enterprise solutions provider. Naturally, the theme of this symposium was ‘Transformation.’ The Symposium was attended by nearly 200 top IT leaders from key enterprises across India, Dell senior leadership, global CIOs, analysts and prolific speakers like Thom Singer and Peter Leyden (Wired). They all came together to learn how to address key issues in the transformational journey, and to master the key forces shaping IT today. InformationWeek spoke to Philip A. Davis, Vice President, Enterprise Solutions Group, Commercial Business, Dell Asia Pacific & Japan Region, about the challenges that enterprises in the region are facing today, and how Dell is addressing these concerns. “In these uncertain economic times, there is a lot of pressure on companies to be more efficient. From my discussions with customers in the region, I see two trends: they want to reduce IT spending, and they want to shift the spends towards innovation, spending less on maintenance costs. For instance, one customer I met in India has a lot of legacy Unix systems and is spending much of his IT budget on maintenance contracts, just to


keep it going,” said Davis. Davis said customers are attached to their “trusted” legacy systems and are in no hurry to dump them, because doing so involves migrating all the applications to the new platform, a costly and time-consuming affair. A bigger challenge is doing that without disrupting the business. “The applications are tied to legacy architectures and this prevents companies from moving to open architectures. In difficult economic times, people don’t have the budget to pay for both application migration and application maintenance,” added Davis. Dell sees this as an opportunity and wants to help customers migrate applications to new platforms. It recently acquired two companies: Make Technologies and Clerity Solutions. Make offers application modernization software and services, while Clerity is a provider of application modernization and rehosting solutions. Dell believes the product and services portfolio from these two companies can help its customers reduce the cost of transitioning business-critical applications and data from legacy computing systems to more modern architectures like the cloud. For instance, Make’s solutions help customers migrate legacy Unix applications to Linux. The Clerity portfolio is for mainframe users; it takes mainframe applications and rehosts them in an x86 environment. According to an IDC report released in Q2 2012, the Unix server market saw revenue decline 20.3 percent year-over-year to USD 2.3 billion. Mainframes are still being used for core banking applications in certain

“In these uncertain economic times, there is a lot of pressure on companies to be more efficient. From my discussions with customers, I see two trends: they want to reduce IT spending, and they want to shift the spends towards innovation, spending less on maintenance costs”

Philip A. Davis

VP, Enterprise Solutions Group, Commercial Business, Dell Asia Pacific & Japan Region



geographies like Japan. The Indian market is dominated by Unix and x86 systems, and to a much lesser extent, mainframes. The other area that enterprises are thinking about is agility and efficiency. Technologies like the cloud can help companies become more agile, through rapid, low-cost, self-service technology deployment. “CIOs, particularly in large companies, want public cloud agility and efficiency, but they want it in their own data centers, behind firewalls. They want more control. But the gap between where they are today and getting to that kind of efficiency is big. Part of the problem is legacy systems, proprietary networking technology, and legacy applications. So the journey from where they are today to the private cloud can take several years. As of now only 30 percent of server workloads are virtualized,” said Davis. Server consolidation and virtualization are the first steps towards the cloud. Legacy and proprietary systems slow a company’s journey to the cloud. The attachment to traditional and legacy applications, tied to older architectures and platforms, is also a stifling factor. But that hasn’t kept companies away from the cloud. They are using cloud for burst capacity, or running certain SaaS applications in the cloud. This makes for hybrid environments, posing integration challenges. Davis said almost all CIOs have problems with storage. Users exchange e-mails and documents regardless of economic conditions, posing a perennial storage problem. One CIO we spoke to said he is moving away from tape storage, towards storage arrays that have built-in technologies like data de-duplication. BYOD is also a current concern among customers, said Davis. Dell will address mobile device management through its acquisition of KACE Systems. It also has some IP from its acquisitions of Quest Software (VDI capabilities) and Wyse (PocketCloud). Dell also works with vendors like Citrix and VMware to provide client virtualization solutions.
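The data de-duplication that CIO is betting on stores each unique block of data only once and keeps a recipe of block references to rebuild files. A minimal sketch of the idea, assuming fixed-size blocks; real arrays use variable-size chunking and far more engineering:

```python
import hashlib

def dedupe_blocks(data: bytes, block_size: int = 4096):
    """Split data into fixed-size blocks and store each unique block once.

    Returns (store, recipe): store maps a SHA-256 digest to the block's
    bytes; recipe is the ordered list of digests needed to rebuild data.
    """
    store, recipe = {}, []
    for i in range(0, len(data), block_size):
        block = data[i:i + block_size]
        digest = hashlib.sha256(block).hexdigest()
        store.setdefault(digest, block)  # duplicate blocks are stored once
        recipe.append(digest)
    return store, recipe

def rebuild(store, recipe) -> bytes:
    """Reassemble the original byte stream from the block store."""
    return b"".join(store[d] for d in recipe)
```

With repetitive data (think many copies of the same e-mail attachment), the store holds far fewer blocks than the recipe references, which is where the capacity savings come from.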

CONVERGENCE

System management has also been a challenge for sys admins, who had to use various management tools for different subsystems. Fortunately, it is all coming together, with new system management tools offering a single view of all systems — except that now, it is about managing both physical and virtual machines. But for this to happen there has to be a close partnership between vendors. Dell, for instance, has strategic partnerships with Citrix, VMware and Microsoft. Dell’s OpenManage solution is now integrated with Microsoft System Center and VMware’s vCenter. The integration is also happening on the hardware side, with vendors like HP and Cisco offering converged systems. These are tightly integrated systems with storage, compute and networking pre-integrated, pre-provisioned, pre-configured, and pre-racked, with a layer of management software on top. In recent months, Dell and IBM have also made the move to converged systems, with IBM offering its PureSystems range and Dell offering vStart (very recently renamed and re-launched as Active Systems). Essentially, these are bladed systems for all three elements — compute, networking, and storage. Blade architecture makes it easy to deploy, manage and scale compute resources. These systems are also designed for rapid virtualization (loaded up to the hypervisor level), keeping in mind a faster and simpler journey to private clouds. “According to IDC, blade servers are the fastest growing segment, and this is a proof point that people are moving to converged systems,” said Davis. Dell launched its vStart systems in the U.S. over a year ago. It has just launched Active System 800, with an integrated management console for server, storage, and networking; vStart systems had separate consoles for each. Active Systems come pre-configured for deploying 50–1,000 virtual machines, and for Microsoft or VMware hypervisors. These systems can scale up using a “pod-based approach”.

In difficult economic times, people don’t have the budget to pay for both, application migration and application maintenance. Dell sees this as an opportunity and wants to help customers migrate applications to new platforms

u Brian Pereira brian.pereira@ubm.com



Outlook 2013

Top 5 security predictions for 2013 In 2013, attackers will use more professional ransom screens, up the emotional stakes to motivate their victims, and use methods that make it harder to recover once compromised By Anand Naik

A recap of the year 2012 shows spammers taking advantage of major calendar events (the New Year, Valentine’s Day, the Olympic Games and Diwali) to target potential victims. Globally, information services, banking and e-commerce were the top three sectors consistently targeted by phishing attacks through the year. In May 2012, all phishing attacks on Indian brands targeted the banking sector — with 1 in 4 using a .IN domain. Another trend was emerging cities facing the risk of cyber attacks, with a sizeable 25 percent of bot-infected computers coming from cities like Chandigarh, Bhubaneshwar, Surat, Cochin, Jaipur, Vishakhapatnam and Indore. Here are the top 5 predictions for 2013:

1

Cyber conflict becomes the norm

In 2013 and beyond, conflicts between nations, organizations, and individuals will play a key role in the cyber world. Espionage can be successful and easily deniable when conducted online. Any nation state that doubts this has been given many examples in the last two years. Nations or organized groups of individuals will continue to use cyber tactics in an attempt to damage or destroy the secure information or funds of their targets. In 2013, we will see the cyber equivalent of


“As fake antivirus begins to fade as a criminal enterprise, a new and harsher model called ransomware will continue to emerge”

Anand Naik

Managing Director - Sales, India and SAARC, Symantec

saber rattling, where nation states, organizations, and even groups of individuals use cyber-attacks to show their strength and “send a message.” Additionally, we expect more attacks on individuals and non-government organizations, such as supporters of political issues and members of minority groups in conflict. This type of targeting is currently seen when hacktivist groups are aggravated by an individual or a company.

2

Ransomware is the new scareware

As fake antivirus begins to fade as a criminal enterprise, a new and harsher model called ransomware will continue to emerge. Ransomware goes beyond attempting to fool its victims; it attempts to intimidate and bully them. While this “business model” has been tried before, it suffered from the same limitation as real-life kidnapping: there was never a good way to collect the money. Cybercriminals have now discovered a solution to this problem: online payment methods. They can now use force instead of flimflam to steal from their targets. As it is no longer necessary to con people into handing over their money, we can expect the extortion methods to get harsher and more destructive. In 2013, attackers will use more professional ransom screens, up the emotional stakes to motivate their victims, and use methods that make it harder to recover once compromised.

3

Madware adds to the insanity

Mobile adware, or “madware,” is a nuisance that disrupts the user experience and can potentially expose location details, contact information, and device identifiers to cybercriminals. Madware — which sneaks onto a user device when they download an app — often sends pop-up alerts to the notification bar, adds icons, changes browser settings, and gathers personal information. In just the past nine months, the number of apps including the most aggressive forms of madware has increased by 210 percent. Because location and device information can be legitimately collected by advertising networks, as it helps them target users with appropriate advertising, we expect increased use of madware as more companies seek to drive revenue growth through mobile ads. This includes a more aggressive and potentially malicious approach towards the monetization of “free” mobile apps.
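The madware pattern described here, an ad library bundled with permissions that expose location, device identifiers, or contacts, lends itself to a crude scoring heuristic. A sketch only: the library package names below are invented, and the permission names are standard Android ones used purely for illustration:

```python
# Hypothetical ad-library package names; a real scanner would use a
# curated list of known aggressive ad networks.
AD_LIBRARIES = {"com.example.adskit", "com.example.pushads"}

# Android permissions that let bundled ad code gather location, device
# identifiers, or contact information.
RISKY_PERMISSIONS = {"ACCESS_FINE_LOCATION", "READ_PHONE_STATE", "READ_CONTACTS"}

def madware_score(app):
    """Score an app: 0 if no known ad library is bundled; otherwise 1,
    plus one point per risky permission the ad code could piggyback on."""
    has_ads = bool(AD_LIBRARIES & set(app["libraries"]))
    if not has_ads:
        return 0
    risky = RISKY_PERMISSIONS & set(app["permissions"])
    return 1 + len(risky)
```

An app with no ad libraries scores zero regardless of permissions; the combination of ads plus data-exposing permissions is what raises the score.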

4

Monetization of social networks introduces new dangers

As consumers, we place a high level of trust in social media, using it for everything from sharing personal details and spending money on game credits to gifting items to friends. As these networks find new ways to monetize their platforms by allowing members to buy and send real gifts, the growing social spending trend also gives cybercriminals new ways to lay the groundwork for attacks. We anticipate an increase in malware attacks that steal payment credentials on social networks or trick users into providing payment details, and other personal and potentially valuable information, to fake social networks. This may include fake gift notifications and e-mail messages requesting home addresses and other personal information. While providing non-financial information might seem innocuous, cybercriminals sell and trade this information with one another and combine it with information they already have about you, helping them create a profile they can use to gain access to your other accounts.

5

As users shift to mobile and cloud, so will attackers

Attackers will go where users go, and

this continues to be mobile devices and the cloud. It should come as no surprise that mobile platforms and cloud services will be likely targets for attacks and breaches in 2013. The rapid rise of Android malware in 2012 confirms this. According to the India findings of the 2012 State of Mobility Survey, there is an uptake in mobile applications across organizations, with half of Indian enterprises at least discussing deploying custom mobile applications and one-third currently implementing or having already implemented them. As unmanaged mobile devices continue to enter and exit corporate networks and pick up data that later ends up stored in other clouds, there is an increased risk of breaches and targeted attacks on mobile device data. Also, as users add applications to their phones, they will pick up malware. In fact, according to the same survey, more than half (53 percent) of respondents said that mobility is extremely challenging, and a further 40 percent identified mobile devices as one of their top three IT risks. Some mobile malware duplicates old threats, like stealing information from devices. But it also creates new twists on old malware. In 2013, you can be sure mobile technology will continue to advance, creating new opportunities for cybercriminals. The cloud comes with its own share of security issues. According to the India findings of the 2011 State of the Cloud survey, Indian organizations are conflicted about security, rating it both as a top goal and as a top concern in moving to the cloud. Potential risks include mass malware outbreaks, hacker-based theft and loss of confidential data.

As unmanaged mobile devices continue to enter and exit corporate networks and pick up data that later tends to become stored in other clouds, there is an increased risk of breaches and targeted attacks

u Anand Naik is MD-Sales, India and SAARC, Symantec



SUMMIT SRI LANKA 2012

Event

Top tech leaders converge at Sri Lanka CIO Summit 2012 The third edition of this exclusive event was attended by 86 top IT heads from Sri Lanka

The third edition of the Sri Lanka CIO Summit, held at Hotel Galadari on 17th Oct, 2012, received a phenomenal response from tech leaders across Sri Lanka and was attended by 86 top IT heads. The event kicked off with an interesting keynote session delivered by Sujiva Dewaraja, Chairman of SLASSCOM. Dewaraja gave deep insights into the Sri Lankan IT market and explained the vast scope and opportunity with real-world examples. The keynote was followed by technology sessions from sponsors, who detailed their various product offerings in the space. Amol Ranade, GM – Data Center, Schneider Electric talked about the data center life cycle, and Praveen Kandikuppa, Head of System Engineering, India, Fluke Networks spoke about application performance management. A technology session by Checkpoint gave deep insights into how organizations can mitigate advanced persistent threats (APTs). In addition, an insightful session on the interesting subject of visual collaboration was delivered by Alok Anand, Head-Marketing, India & SAARC, Polycom. The knowledge and expertise of the speakers gave technology decision makers a deep understanding of how good IT implementation benefits the overall health of a company.

The sessions were followed by a networking cocktail and dinner, which gave everybody an opportunity to connect and collaborate. At this knowledge-cum-networking evening, attendees got a chance to interact with internationally recognized thought leaders, as well as network with peers who are experts in the IT field. The event attracted top IT decision makers from major companies like Dialog Axiata, Hayleys, Cargills, Unilever and HNB, among others.

Sujiva Dewaraja, Chairman, SLASSCOM



SUMMIT BANGLADESH 2012

Event

High-powered summit brings together IT heads from Bangladesh Bangladesh CIO Summit, in its second edition, received a phenomenal response and was attended by 100 IT heads

The second edition of the Bangladesh CIO Summit, held at The Ruposhi Bangla in Dhaka on 8th Nov, 2012, provided a great platform for IT heads to connect and collaborate. The exclusive event was attended by 100 top IT heads from Bangladesh. The summit started with an interesting session by Munir Hasan, Secretary General, BDSON & Sr. Consultant, Ministry of ICT, who gave an overall perspective on the great IT potential in Bangladesh. This was followed by another knowledgeable and insightful session by the highly experienced IT professional Sumon Ahmed Shabir, Vice President, Internet Service Provider Association of Bangladesh & Chief Security Officer, Fiber @ Home. The summit also featured a session by T.I.M Nurul Kabir, Secretary General, Association of Mobile Telecom Operator Bangladesh. Kabir spoke about the state of the telecom industry in Bangladesh. In his session, he also spoke about the regulations governing this sector — a high-concern area for telecom operators. This was followed by a highly insightful session by Praveen Kanikuppa, Head-SE India, Fluke Networks, who touched upon the interesting subject of application performance management. The summit also featured a session by Kapil Awasti, Security Consultant, Checkpoint, who gave insights on how organizations can mitigate the

risks related to advanced persistent threats. Apart from this, Alok Anand, Head-Marketing, India & SAARC, Polycom detailed the evolution of visual collaboration. The sessions were followed by a networking cocktail and dinner, which gave attendees an opportunity to interact with internationally recognized thought leaders, as well as network with peers who are experts in the IT field. The event attracted top IT decision makers from major companies like BRAC Bank, Bangladesh Bank, HSBC Bank, Teletalk, Bangla Tel and Chevron.

T.I.M Nurul Kabir, Secretary General, Association of Mobile Telecom Operator Bangladesh (AMTOB)



CIO Profile

our transactions today happen through mobile. Also, over 25,000 customers track the market and view their portfolios on mobile every day.

Career Track

How long at the current company? I have been working with Geojit BNP Paribas Financial Service for the past 15 years.

A Balakrishnan CTO, Geojit BNP Paribas Financial Service

IT effectiveness can be measured by taking into account customer satisfaction index, improvement in business processes and end-user ease

Most important career influencer: Bill Gates, the business magnate and philanthropist. His views on technology have always inspired me. To quote him: “The first rule of any technology used in a business is that automation applied to an efficient operation will magnify the efficiency. The second is that automation applied to an inefficient operation will magnify the inefficiency.” I always think of this rule, before deploying any technology in my organization. Decision I wish I could do over: In the early stages, project failures and attrition disappointed me. It took me some time to analyze lessons learned and make backup plans. I later realized that it is important to improve process areas and ensure proper documentation so as to curtail the risk of attrition.

Vision

The next big thing for my industry will be… I strongly believe that the way we do transaction management and customer service in the retail capital market will change. Multimedia communication and content will play a major role in driving this change. Also, the legacy that the industry and regulators are carrying will go. Advice for future CIOs: Be an integral part of the business, especially in the areas of product formulation and enriching the customer experience.

On The Job

Top three initiatives (in your entire career, in detail)

• Mobility: I implemented and popularized trading through mobile in my organization. Today, we have applications available on Android, iPhone and Windows Phone. Nearly 7 percent of

64

informationweek december 2012

- Automation of Risk Management Transactions: Another initiative that I undertook was automating risk management transactions. Capital markets are risky, and due to volatility and liquidity issues, customers sometimes end up in risky positions. To address this challenge, I ensured adequate controls were in place in the system. This has drastically reduced the risk of positions, especially in the derivative segment, because of automated alerts and actions (margin call, square off, etc.). With automation, we are now able to create and manage combination orders to suit customer requirements.
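The kind of automated margin check described above can be sketched roughly as follows. This is a hypothetical illustration, not Geojit’s actual system; the function name, thresholds, and action labels are all assumptions made for the sake of the example.

```python
# Hypothetical sketch of automated risk controls for leveraged positions.
# The thresholds and action names are illustrative assumptions, not the
# actual rules of any brokerage's risk management system.

def risk_action(position_value, margin_available,
                margin_call_ratio=0.3, square_off_ratio=0.2):
    """Return the automated action for a position given available margin.

    The margin ratio is margin_available / position_value:
    - above margin_call_ratio: no action needed
    - between square_off_ratio and margin_call_ratio: alert the
      customer to top up margin (a margin call)
    - below square_off_ratio: close the position automatically
      (a forced square-off)
    """
    ratio = margin_available / position_value
    if ratio < square_off_ratio:
        return "SQUARE_OFF"
    if ratio < margin_call_ratio:
        return "MARGIN_CALL"
    return "OK"

# As a volatile move erodes margin cover, the system escalates
# automatically instead of waiting for a human risk desk.
print(risk_action(100_000, 40_000))  # healthy margin cover
print(risk_action(100_000, 25_000))  # alert: margin call
print(risk_action(100_000, 15_000))  # forced square-off
```

The point of automating this logic, as the profile notes, is that the alert and the square-off fire without manual intervention, which matters most in the derivative segment where losses compound quickly.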

- Open Source: I have stressed the use of open source technologies in my organization, which has resulted in significant reduction in IT cost and improved efficiencies.

How I measure IT effectiveness

IT effectiveness can be measured by considering revenue and participation of customers in IT-enabled channels. It can also be measured by taking into account customer satisfaction index, improvement in business processes and end-user ease.

Personal

Leisure activities: Reading books across genres, especially history. Best book read recently: Chanakya’s Chant by Ashwin Sanghi. Unknown talents (singing, painting etc.): Not yet discovered. If I weren’t a CIO, I’d be... A mathematics teacher.

As told to Jasmine Kohli

www.informationweek.in


Analyst Angle

Mobility integration: A major challenge for IT

Saurabh Sharma

Given the diverse and complex nature of mobility integration needs, IT is struggling to find a suitable approach to enable such interactions in a time-efficient and cost-effective manner

Mobility integration is a multi-faceted problem: in addition to achieving seamless interaction between disparate applications, IT needs to provide a robust data security and governance framework for securing and managing data flow across a wide range of services, systems, and business processes. Then there is the need to provide a compelling user experience to different users through a wide range of access channels. Bring Your Own Device (BYOD) is driving changes in the way organizations operate and what IT is expected to deliver to support such requirements. While many unified communications and collaboration (UCC) vendors provide middleware for the development of communications-enabled business processes (CEBPs), there are still many scenarios where custom-code development is the only viable option. Given the diverse and complex nature of mobility integration needs, IT is struggling to find a suitable approach to enable such interactions in a time-efficient and cost-effective manner.

Mobile-social app integration calls for a different approach

On the web: Making peace with cloud and BYOD

Organizations have used both traditional approaches (such as SOA and custom-code development), and cloud-based integration approaches (for example, integration PaaS) to achieve SaaS-to-on-premise, on-premise-to-on-premise, SaaS-to-SaaS, and B2B integration. However, mobile and social application integration calls for a different approach. Most organizations use web applications for internal and external engagements. “Mobile-enablement” of web applications is a commonly used approach for achieving interaction between mobile and web applications.

Given that this approach involves a significant amount of rework, many organizations are looking for alternatives that can deliver faster time-to-value and reduce the overall expenditure on application integration. Mobile and social applications are lightweight in nature and capable of meeting the requirements of heterogeneous user experiences. Most on-premise and SaaS applications do not provide different user experiences through different access channels. Another difference is the nature of the traffic associated with transactions performed through mobile and social applications. The “bursty” nature of the traffic associated with mobile and social applications often leads to conditions where a greater capacity is required to handle a sudden rise in traffic. IT is struggling to find a suitable solution for such issues.

Mobility integration has many facets

Mobility has had a profound impact on the way organizations operate. Organizations are using mobility solutions to interact with partners and customers on an as- and whenneeded basis and through a wide range of devices. Mobility is increasing the efficiency and effectiveness of business engagements, as well as providing other benefits such as compelling user experience across a wide range of devices and seamless access to data and applications on a 24×7 basis. The rapid proliferation of mobility solutions is creating complex integration issues that need new solutions. Lightweight mobile middleware is gaining ground as a suitable approach for integration of mobility solutions with enterprise IT systems and business processes. With the implementation of BYOD policies, organizations are allowing employees to bring in a wide range




of devices, operating systems, and applications. IT is under pressure to provide a robust data security, privacy, and governance framework encompassing a wide range of disparate applications, systems, and business processes. Many organizations have enforced policies that define the boundaries of interaction between different devices and applications and are using new service gateways for enabling, securing, and monitoring such interactions. Another interesting trend is the integration of BYOD policies with UCC solutions to provide an identical experience to different collaborating parties. Such initiatives focus on enabling collaboration across a wide range of devices, platforms, and applications deployed both within and outside the enterprise. Integration of BYOD policies and UCC solutions is no easy task; IT needs to ensure that UCC solutions provide compelling user experiences through user interfaces of different devices. IT also needs to ensure that different devices covered under the BYOD program support such interactions, be it smartphones and tablets or IP-PBX systems. Hosted UCC solutions are fast emerging as a suitable option for organizations that have embraced BYOD programs, providing access to various tools, applications, and systems via the cloud and ensuring that users do not need to worry about installing software on individual devices. These solutions support a


wide range of mobility integration needs and provide many benefits such as cost savings, easier software upgrades, faster deployment, and flexible pricing models. Hosted UCC solutions provide the necessary resources for controlling access to different applications and take care of device management and monitoring needs. While most hosted UCC solutions work well with smartphones, a VoIP client is required for accessing different applications and systems through tablets. Many UCC vendors provide middleware that enables integration between communication systems and business applications through a single configuration interface. Vendors such as Cisco, Avaya, and Microsoft offer packaged solutions that provide real-time communication between UCC solutions and a range of IT systems and business processes, especially for development of CEBPs. Such solutions also provide the capability to develop customized communications and collaboration solutions by using web APIs. However, such solutions do not cover all scenarios, and in many cases, custom-code development is the only viable solution.

Integration of BYOD policies and UCC solutions is no easy task for IT as it needs to ensure that UCC solutions provide compelling user experiences on different devices. IT also needs to ensure that devices covered under the BYOD program support such interactions

Saurabh Sharma is a Senior Analyst with Ovum IT and is a member of Ovum’s IT Solutions team. Saurabh focuses on service-oriented architecture (SOA) and integration. His research covers different approaches to application, B2B, and cloud service integration.



Blog

Why Thom Singer won’t accept your LinkedIn invite

In early November, I was privileged to travel with almost 200 IT executives from India to the exotic destination of Macau. We were there to attend the annual Dell India IT Executive Summit. The execution of this event was superb, and the sessions were thought provoking. Speakers such as the CIO of Hong Kong, the ex-CIO of London, the founding editor of Wired magazine, and of course, the leadership team from Dell India and Asia, made this event worthwhile. But what impressed me most was what Conference Catalyst Thom Singer was doing right through the event. Thom is a prolific speaker who has authored several books; you can check the list here: http://amzn.to/TV7Dp9. He has more than 20 years of marketing and business development experience with Fortune 500 companies. And he’s trained more than 25,000 professionals in the art of building professional contacts that lead to increased business. As Conference Catalyst, Thom’s mandate was to get delegates to engage and network as much as possible. He deployed a mole in the audience to check who was most active at creating discussion groups during the breaks. Those who were good at making conversation and engaging with others were fortunate to receive a copy of Thom’s book, titled The ABC’s of Networking. And the grand prize was a copy of this book, re-published in Hindi, signed by Thom. Prodded by Thom (or perhaps the incentive), even the shy executives made an attempt to make conversation. The discussions ranged from gambling in Macau, to Indian culture, Indian food, arranged marriages in India, relationships with in-laws, politics, and of course, technology. IT executives were busy exchanging business cards, often running out of cards. At one point, someone suggested

that business cards will soon become obsolete. In the near future, we’d just be touching our hand phones to exchange vCard and contact information — with Near Field Communication (NFC) technology (you can do this today with certain smartphones). While speaking to Thom in a group during lunch, I asked him what he thought about professional networking site LinkedIn. Thom said he has a personal rule to never accept LinkedIn invites — unless he has met that person at least once for beer, lunch or dinner. In Thom’s own words, “You will never remember a person, the event where you met, and the topic discussed unless you meet him/her several times.” And in his book, The ABC’s of Networking, he writes: To have someone become a valuable part of your network, you need to have seven to 10 personal interactions. Anything less than that makes them just a person you know. For each letter in the alphabet, the book offers a nugget of advice. And towards the end, as we get to the letters X, Y and Z, Thom says “there is no end to networking”. You’ve put in a lot

of effort to build and cultivate business and personal relationships, but keep at it. Thom writes: If you fail to nurture existing relationships, your network will just wither away. So do call up your friends whenever you are visiting their area and make it a point to meet over coffee after you’ve completed all your meetings for the day. Relationships are like plants — you must continue to tend to them to see them live. During his presentation Thom shared an experience with his 90-year-old father. One day while playing cards with his father, Thom kept checking e-mail messages on his phone, much to his father’s annoyance. His father insisted that Thom give him his 100 percent attention so that they could fully appreciate the game. Thom tried to explain that it was a more connected world now, and one had to keep up with e-mail. But that doctrine was flatly rejected by Thom’s father, who lived in a generation where people had to get to a phone to be contacted — and e-mail was not present in the commercial world. I think we can all learn something from this encounter. In a hyperconnected world filled with gadgets to simplify our lives, some principles just don’t change. The human element and human contact trump any technology or device. Age-old customs of meeting people, shaking hands, and exchanging business cards aren’t going to be replaced by LinkedIn invites anytime soon. And when you do go for group or one-on-one meetings, switch off your BlackBerrys and tablets and give the other person your 100 percent attention. Make eye contact, listen attentively, and take notes. Is that too much to ask for a fruitful relationship?

Brian Pereira blogs regularly at www.informationweek.in



Global CIO

How do you survive the innovation hamster wheel?

Chris Murphy

This is how two tech leaders keep the innovation wheel spinning

Chris Murphy blogs at InformationWeek.

It really isn’t ever going to stop, is it? You had that great idea. You aligned, crowdsourced, socialized, got buy-in, managed change, transformed ... Name it, and you did what was needed to push from brainstorm idea to business success. Time to do it again. Now what have you got? We talk a lot about tech-driven innovation, but we should acknowledge that what IT and other business leaders want to do just isn’t natural. Yes, everyone wants to work on new and exciting projects. But then most normal people want to settle back into some steady state. People resist the notion of constant innovation, of constant disruption and change. It’s why CIOs need some process for innovation, some formal way not only to keep disruptive ideas coming, but to guide them safely through the gauntlet of ways companies can kill even great ideas.

Richard Thomas, the CIO of Quintiles, which provides services to pharmaceutical and biotech companies to help them run clinical trials, has a somewhat unusual way of keeping innovation in the pipeline. “We passionately believe in giving our ideas away,” he said. Thomas pushes his IT team to come up with what he calls “game changing” ideas: IT-powered new products that drive revenue, or process improvements that reduce costs significantly. Once IT has that great idea, however, Thomas looks for the right business unit to partner with and lets that business unit champion it and carry it forward. For example, Quintiles has worked for three years on a project called Semio, with partner and customer Eli Lilly. Semio is a collaborative environment that pharma companies can use to examine healthcare data to design the best approach to a clinical trial, balancing the cost, time and chance for success. To develop that project, Thomas teamed his IT pros with Quintiles business unit experts and Eli Lilly experts; those business experts led three of the four technical development teams.

Giving away ideas isn’t always easy for ambitious IT folks to swallow. But Thomas doesn’t want to build a fiefdom of products that he and his IT team run on an ongoing basis. If IT hands off projects once they’re built, that keeps a certain pressure on his team to come up with the next game-changing idea. “New IT is definitely not for everyone,” Thomas warned. “It’s not certain what it will look like when it’s finished, especially with some of these game-changing types of initiatives.”

Allstate has its own way of keeping the innovation wheel spinning. For the past five years, an innovation team run out of the IT organization has worked with business units to figure out the company’s biggest problems and then rally employees to generate ideas to solve them. It started out with plenty of doubters. Matt Manzella, Allstate’s Director of Technology Innovation, remembers how the innovation team wanted chairs that were a different color from those used elsewhere at Allstate, to send a small visual cue to visitors that they’re doing something different in that area. Allstate runs two or three “innovation blitzes” a month — online brainstorming sessions aimed at a specific problem, using software from Spigit to vote ideas up and down and marshal comments. The innovation team works with business units to frame those problems, so when good ideas come in there’s a receptive audience.

What matters isn’t the exact approach. What does matter is having a system for innovation that the company recognizes and fits the company culture. Having such an effort acknowledges that continuous, disruptive innovation, while an unnatural act, is nonetheless critical to keeping companies and products from stagnating.

Chris Murphy is Editor of InformationWeek. Write to Chris at cjmurphy@techweb.com



Practical Analysis

What’s killing APM?

Art Wittmann

App performance management is seen as less important than it was two years ago, partly because vendors haven’t kept up

Art Wittmann blogs at InformationWeek.

It’s not precisely clear that the cloud is killing application performance management, but something is. It could be that cash-strapped IT teams whose application portfolios are changing rapidly can’t give APM the time and resources to do it right, so they’re using APM, at least as we classically have seen it, less and less. Then there are the tools themselves, which are notoriously hard to set up. And there’s the rise of software-as-a-service and apps running from public or private cloud infrastructures. If you had a working APM methodology a few years ago, it has been broken by the use of cloud apps. Whatever the cause, our October 2012 survey on APM shows the tech is now seen as less important than it was in our August 2010 survey. Asked about the importance of APM in 2010, 50 percent of respondents said it was crucially important. That percentage is down to 42 percent. At the same time, survey respondents say their users are more tolerant of outages than they used to be. They have to be: Those experiencing outages on a daily basis went from 8 percent in 2010 to 10 percent, and those experiencing monthly outages went from 24 percent in 2010 to 27 percent now. This isn’t a result most app managers would be proud of.

So, why is APM falling out of favor? In 2010, the reasons given for not implementing APM were: insufficient expertise to use the product (50 percent), high cost (41 percent), and it takes too much staff time to do it right (32 percent). Those are still the top three reasons for not doing APM, but the order has changed. Now a lack of staff time heads the list, followed by a dearth of expertise and the cost. As app environments become more dynamic and lifecycles for apps shorten, the substantial effort required for APM isn’t worth the already iffy results it provides. And as teams slack on deployments, they get even poorer results, further devaluing APM tools. In 2010, 18 percent of survey respondents said their

APM tool exceeded their expectations, down to 10 percent today. Today, there’s a five-point rise in those saying APM products are falling below expectations. The dissatisfaction with executive dashboards is even more acute. In 2010, 21 percent said they were extremely satisfied and 41 percent somewhat satisfied with such dashboards; that’s down to 12 percent and 32 percent, respectively. It’s tempting to blame the cloud, but the fraction of apps reported to be running “in the public cloud” is about the same as it was in 2010. It appears that APM simply doesn’t provide the bang for the buck, and so not using it isn’t seen as an impediment to adopting applications of any sort. When asked whether the lack of APM tools is a barrier to wider use of cloud apps, 49 percent said it wasn’t a barrier in 2010; 61 percent say it isn’t a barrier today.

One area where APM tools definitely have let IT pros down is in keeping up with complexity. As apps take a more service-oriented design, the task of setting up an APM tool to give anything close to meaningful information is much harder than it used to be. As one survey respondent put it: “Unfortunately, current APM tools do not work well in dynamic web services, and public and private cloud-based infrastructures, since they depend on statically defined relationships. By the time these relationships are defined to the APM tool, they will in all probability be obsolete, thereby negating the value and relevance of the APM tool.” The failure of APM software vendors to keep up with user needs is breathtaking. Because the nature of app life cycles has changed so profoundly, APM as a third-party product has outlived its usefulness for most environments. Service component deployments with their own self-health reporting capability should be preferred.

Art Wittmann is Director of InformationWeek Analytics, a portfolio of decision-support tools and analyst reports. You can write to him at awittmann@techweb.com.



Down to Business

What CIOs want in their successors

Rob Preston

Are you an aspiring CIO? Listen to what these leading CIOs say it will take for you to reach that coveted position

Rob Preston blogs at InformationWeek.

So you aspire to become a CIO, or at least to move well up the chain of command. Just understanding the nuances of “the business” isn’t enough — it’s long past time for CIOs to move beyond the “alignment” rhetoric and get on with creating customer-focused business opportunities powered by technology. I recently participated in a panel session at INTEROP Mumbai in which four leading Indian CIOs discussed the attributes they’re looking for in a successor. In preparation for that session, I talked with four CIOs in the U.S. about the same subject. Here’s what they say they’re looking for in their top people.

Multidimensional leaders. “They may be brilliant technologists, but without the business engagement and leadership skills, they will not make it to the higher level,” says Dave Bent, CIO of United Stationers, who advises would-be CIOs to gain experience in a variety of business and IT roles. Jerry Johnson, CIO of Pacific Northwest National Laboratories, a research lab under the auspices of the Department of Energy, puts the emphasis on breadth of IT experience: software development, infrastructure, operations, architecture, project management. “Consequently, I encourage — but don’t force — lateral movement within the organization,” he says.

Customer-focused product and brand champions. By “customer,” we’re not talking about the company’s IT users. We’re talking about the people who buy your company’s products. Customer skills are particularly important for CIOs at technology companies, where the CIO often doubles as a dog-food-eating product spokesman. But a customer orientation is critical for all CIOs, says Kent Kushar, CIO of E. & J. Gallo Winery, who’s as comfortable at wine-tasting events discussing vintages and palate taste zones as he is at board meetings explaining analytics and supply chain management.

Players. The best CIOs get to know, on both a professional and personal level, the senior line execs responsible for delivering their companies’ core business results. Bent wants CIO candidates to have had exposure at the board level. “The CIO has to have the same broad leadership characteristics as any other C-level position — listening, communicating, as well as leading,” he says. Rajesh Uppal, CIO of Indian carmaker Maruti Suzuki, says tomorrow’s IT leaders must “empathize with and understand their users and then come back and offer some value.”

Battle-tested warriors. CIO candidates must show demonstrable wins on projects, IT and otherwise. InformationWeek’s Secret CIO, who works for a USD 1 billion-plus company, relates the time he asked one of his direct reports to improve customer satisfaction with the company’s phone system. “She established call-handling benchmarks. She interviewed VPs and LOB managers to understand how they measured customer satisfaction. She talked with customers. She put a technology project in place as well, and when process changes were completed, she demonstrated, using the same metrics, that satisfaction levels had increased significantly.”

Self-starters and go-getters. Arun Gupta, CIO of Indian pharmaceutical company Cipla, is looking to add 35 IT specialists to the 17 people in the company’s core IT group. “Are they going to put their neck on the block, irrespective of whether I agree or disagree?” Gupta says. V. Subramaniam, Asia-Pacific CIO of Otis Elevator, says he wants people with “fire in the belly and fire in the eyes.” He also emphasizes “the discipline of the execution. They have to make things happen, without excuses.”

Rob Preston is VP and Editor-in-Chief of InformationWeek. You can write to Rob at rpreston@techweb.com.


