Edit
Can good analytics tools guarantee good decisions?
Let me begin by wishing all our readers and partners a very happy and successful New Year 2013.

In our December 2012 issue, 'The Data Management Conundrum', we talked about data and all the challenges associated with it. We also told you about the data management strategies adopted by leading companies in India. So I believe this issue is a logical extension of that story: once you have collected the data, you get down to analyzing it and deriving meaningful insights from it to drive business.

The lead story, titled 'Advanced Analytics', is written by Doug Henschen, a renowned analyst at InformationWeek USA. Advanced analytics is all about statistical analysis and predictive modeling: being able to see what's coming ahead of time, and preparing for it before it hits you.

When it comes to mining Big Data and using advanced analytics, there's no better example than the telecom sector. Since their inception, telcos have been analyzing call detail records to understand usage patterns, services availed, and other factors, and using that data to up-sell and cross-sell products and services. For instance, MTS India has tapped analytics to offer an innovative rewards program called m-Bonus, which dynamically computes what the consumer needs and offers the best-fit deal. This initiative helped MTS devise extremely targeted campaigns. The result: revenue grew by 2 percent month on month, and operational costs fell by approximately 20 percent. Read more about this in Amrita Premrajan's interview with MTS CIO Rajeev Batra.

Organizations in other sectors are also investing in analytics solutions. In the manufacturing sector, we have Hero MotoCorp, for instance, using analytics to achieve business objectives. Vijay Sethi, CIO, Hero MotoCorp tells us that spending on business analytics and predictive analytics is a key priority in 2013.
Read our interview with Vijay to find out more.

But what's the use of investing in expensive analytics solutions if your people don't have the skills to analyze data and to use it to make good decisions? That's a topic debated in the article 'Good Data Won't Guarantee Good Decisions' in Harvard Business Review (HBR), April 2012. I also recommend the October 2012 issue of the same magazine, with the cover story titled 'Getting Control of Big Data'. In the HBR article, the writer, Shvetank Shah, declares that investment in analytics can be useless, even harmful, unless employees can incorporate that data into complex decision making.

Even if some of your employees majored in statistics, you still want to ensure that they are capable of making good business decisions. You want to evaluate and hone analytical skills, balanced judgment, and data literacy. And then you want to give these workers the right tools to infer meaningful business insight from Big Data. Some organizations prefer to outsource the task of data analysis. Others hire a new breed of professionals who are in high demand these days: Data Scientists. So which approach will you take?
What’s the use of investing in expensive analytics solutions if your people don’t have the skills to analyze data and to use it to make good decisions?
Follow me on Twitter
@brian9p
informationweek january 2013
u Brian Pereira is Editor of InformationWeek India. brian.pereira@ubm.com
www.informationweek.in
Contents | Volume 2 | Issue 03 | January 2013
20 Cover Story Advanced Analytics Predictive analysis is getting faster, more accurate and more accessible. Combined with Big Data, it’s driving a new age of experimentation
28 Big Data, Big Questions With organizations increasingly adopting a more mature approach to data management, vendors are aggressively pursuing the opportunity
32 13 Big Data vendors to watch in 2013 From Amazon to Splunk, here's a look at the Big Data innovators that are now pushing Hadoop, NoSQL and Big Data analytics to the next level
Case Study
37 Analytics helps Usha International respond to dynamic market needs By implementing the SAP HANA database along with the SAP NetWeaver Business Warehouse component, Usha International is able to quickly churn out real-time business-critical information and rapidly respond to dynamic market needs
Interview
41 Rajeev Batra, CIO, MTS India
Cover Design: Deepjyoti Bhowmik
Opinion
Interview
38 'Organizations need to cultivate an analytic-driven culture' Maneesh Sharma, Head, Database & Technology, SAP India
40 'Business analytics is a key priority for us in 2013' Vijay Sethi, Vice President and CIO, Hero MotoCorp
Do you Twitter? Follow us at http://www.twitter.com/iweekindia
‘Data mining and BI can make all the difference between success and failure of a business model’
43 On-demand Big Data analytics: An emerging trend The emerging trend of on-demand analytics is enabling businesses of all sizes to avail supercomputing power on a pay-per-use model and analyze complex data sets to derive business-critical information
44 Big Data, bigger opportunities for insurance companies Technology will ignite post-financial crisis opportunity and growth for the insurance industry
46 Big Data driving the role of Customer Experience Officer With companies investing in the technologies that capture and analyze Big Data, the Customer Experience Officer can be the conduit for ensuring that relevant information is shared throughout an organization

Find us on Facebook at http://www.facebook.com/informationweekindia
If you're on LinkedIn, reach us at http://www.linkedin.com/groups?gid=2249272
THE BUSINESS VALUE OF TECHNOLOGY
interview 47 ‘Public-Private Partnership is key to enhance security’ Kamlesh Bajaj, CEO, DSCI
interview 50 ‘There’s never been a better time to be in IT’ Sanjay Mirchandani, CIO and COO, Global Centers of Excellence, EMC Corporation
52 'Our success in the industry clearly points to an industry transition' David Yen, Senior Vice President and General Manager, Data Center Technology Group, Cisco
54 Outlook 2013 Industry leaders predict key trends across various technology segments that are set to influence enterprise IT in 2013
News
CloudMunch eyes billion-dollar PaaS market
38 percent of information in Indian organizations is duplicate
L&T Infotech targets SMBs with App Shop
71 percent of organizations using SaaS for less than three years: Gartner survey
Less than 1 percent of world's data is being analyzed: Study
Dell SonicWALL aims to tap the enterprise market
Checking smartphone is part of morning ritual for Gen Y Indians
Citrix desktop virtualization solution helps Fareportal slash power and real estate costs

Departments
Editorial ......................................... 4
Index ............................................. 8
Chirpings from Twitterati ................ 10
Social Sphere ................................ 11
The Month in Technology ................ 12
News Analysis ................................ 17
Industry Interface .......................... 61
Analyst Angle ................................ 67
Global CIO .................................... 68
Practical Analysis .......................... 69
Down to Business ........................... 70
Imprint
Volume 2 | No. 03 | January 2013
print online newsletters events research
Managing Director: Sanjeev Khaira
Printer & Publisher: Kailash Pandurang Shirodkar
Associate Publisher & Director: Anees Ahmed
Editor-in-Chief: Brian Pereira
Executive Editor: Srikanth RP
Principal Correspondent: Jasmine Kohli
Principal Correspondent: Ayushman Baruah (Bengaluru)
Senior Correspondent: Amrita Premrajan (New Delhi)
Copy Editor: Shweta Nanda

Design
Art Director: Deepjyoti Bhowmik
Senior Visualiser: Yogesh Naik
Senior Graphic Designer: Shailesh Vaidya
Graphic Designer: Jinal Chheda
Designer: Sameer Surve

Marketing
Deputy Manager: Sanket Karode
Deputy Manager, Management Service: Jagruti Kudalkar

Online
Manager, Product Dev. & Mktg.: Viraj Mehta
Deputy Manager, Online: Nilesh Mungekar
Web Designer: Nitin Lahare
Sr. User Interface Designer: Aditi Kanade

Operations
Head, Finance: Yogesh Mudras
Director, Operations & Administration: Satyendra Mehra

Sales
Mumbai: Marvin Dalmeida, Manager, Sales, marvin.dalmeida@ubm.com (M) +91 8898022365
Bengaluru: Kangkan Mahanta, Manager, Sales, kangkan.mahanta@ubm.com (M) +91 89712 32344
Delhi: Rajeev Chauhan, Manager, Sales, rajeev.chauhan@ubm.com (M) +91 98118 20301
Head Office UBM India Pvt Ltd, 1st floor, 119, Sagar Tech Plaza A, Andheri-Kurla Road, Saki Naka Junction, Andheri (E), Mumbai 400072, India. Tel: 022 6769 2400; Fax: 022 6769 2426
Production
Production Manager: Prakash (Sanjay) Adsul

Circulation & Logistics
Deputy Manager: Bajrang Shinde

Subscriptions & Database
Senior Manager, Database: Manoj Ambardekar, manoj.ambardekar@ubm.com
Assistant Manager: Deepanjali Chaurasia, deepanjali.chaurasia@ubm.com
Printed and Published by Kailash Pandurang Shirodkar on behalf of UBM India Pvt Ltd, 6th floor, 615-617, Sagar Tech Plaza A, Andheri-Kurla Road, Saki Naka Junction, Andheri (E), Mumbai 400072, India. Editor: Brian Pereira, Printed at Indigo Press (India) Pvt Ltd, Plot No 1c/716, Off Dadaji Konddeo Cross Road, Byculla (E), Mumbai 400027.
Associate Office, Pune
Jagdish Khaladkar, Sahayog Apartment, 508 Narayan Peth, Patrya Maruti Chowk, Pune 411 030. Tel: 91 (020) 2445 1574, (M) 98230 38315, e-mail: jagdishk@vsnl.com

International Associate Offices
USA: Huson International Media. (West) Tiffany DeBie, Tiffany.debie@husonmedia.com, Tel: +1 408 879 6666, Fax: +1 408 879 6669. (East) Dan Manioci, dan.manioci@husonmedia.com, Tel: +1 212 268 3344, Fax: +1 212 268 3355
EMEA: Huson International Media. Gerry Rhoades Brown, gerry.rhoadesbrown@husonmedia.com, Tel: +44 19325 64999, Fax: +44 19325 64998
Japan: Pacific Business (PBI). Shigenori Nagatomo, nagatomo-pbi@gol.com, Tel: +81 3366 16138, Fax: +81 3366 16139
South Korea: Young Media. Young Baek, ymedia@chol.com, Tel: +82 2227 34819, Fax: +82 2227 34866
RNI NO. MAH ENG/2011/39874
Editorial Index: Person & Organization
A Balakrishnan, Geojit BNP Paribas Financial Services ... 21
Abhay Chitnis, L&T Infotech ... 14
Amit Singh, Dell SonicWALL ... 15
Anurag Shah, Omnitech InfoSolutions ... 21
David Yen, Cisco ... 52
Daya Prakash, LG Electronics ... 20
Dr Matt Wood, Amazon Web Services ... 43
Hu Yoshida, Hitachi Data Systems ... 55
K P Unnikrishnan, Brocade ... 58
Kamlesh Bajaj, DSCI ... 47
KT Rajan, Allergan ... 22
Linda Price, Gartner ... 67
Maneesh Sharma, SAP India ... 38
Mobeen Khan, AT&T Business ... 57
Parvinder Singh, Max Life Insurance ... 22
Pradeep Prabhu, CloudMunch ... 13
Prashant Gupta, Verizon Enterprise Solutions ... 44
Rajeev Batra, MTS India ... 41
Rajesh Shetty, Cisco ... 17
Ravikiran Mankikar, Shamrao Vithal Bank ... 20
Ryan Hollenbeck, Verint Systems ... 46
Sanjay Deshmukh, Citrix ... 60
Sanjay Mirchandani, EMC Corporation ... 50
Sanjay Sharma, IDBI Intech ... 20
Sid Nair, Dell Services ... 18
Subodh Dubey, Usha International ... 37
Suresh Kumar, Seven Hill e-health ... 21
Udayan Banerjee, NIIT Technologies ... 22
Vijay Sethi, Hero MotoCorp ... 40

ADVERTISERS' INDEX
Company name | Page No. | Website | Sales Contact
Ctrl S | 2 | www.ctrls.in/mumbai-data-center | marketing@ctrls.in
CloudConnect | 3 | www.cloudconnectevent.in | salil.warior@ubm.com
Interop | 9 | www.interop.in | salil.warior@ubm.com
Virtual Interop | 9 | www.interop.in | salil.warior@ubm.com
Symantec | 71 | www.symantec.com | sheraz_hasan@symantec.com
Microsoft | 72 | www.windowsserver2012.in; microsoft.in/readynow |
Important
Every effort has been taken to avoid errors or omissions in this magazine. In spite of this, errors may creep in. Any mistake, error or discrepancy noted may be brought to our notice immediately. It is notified that neither the publisher, the editor nor the seller will be responsible in respect of anything and the consequence of anything done or omitted to be done by any person in reliance upon the content herein. This disclaimer applies to all, whether subscriber to the magazine or not. For binding mistakes, misprints, missing pages, etc., the publisher's liability is limited to replacement within one month of purchase.

© All rights are reserved. No part of this magazine may be reproduced or copied in any form or by any means without the prior written permission of the publisher. All disputes are subject to the exclusive jurisdiction of competent courts and forums in Mumbai only.

Whilst care is taken prior to acceptance of advertising copy, it is not possible to verify its contents. UBM India Pvt Ltd cannot be held responsible for such contents, nor for any loss or damages incurred as a result of transactions with companies, associations or individuals advertising in its newspapers or publications. We therefore recommend that readers make necessary inquiries before sending any monies or entering into any agreements with advertisers or otherwise acting on an advertisement in any manner whatsoever.
‘There’s never been a better time to be in IT’: EMC CIO Sanjay Mirchandani, CIO and COO, Global Centers of Excellence, EMC Corporation talks about the company’s business transformation and its adoption of cloud and Big Data technologies. He also tells us how the future of business will be impacted by analytics http://bit.ly/10is6YZ
Top 8 predictions for business mobility in 2013: AT&T In 2013, we will see many more examples of businesses, industries, and eco-systems transforming through the innovative use of mobility http://bit.ly/V8QeK1
Joe Radomsky@jradomsk tweeted:
Great article by @mobeenkhan2 on #mobile trends for business to watch for in 2013. http://goo.gl/2Nif7
Martin Mc Cormack@realmmc tweeted:
My sentiments exactly, ‘There’s never been a better time to be in IT’: EMC CIO http://t.co/bhZ7IX1m via @twttimes
Betsy Bennett, Ph.D.@drbetsyb tweeted: InformationWeek – Mobile > Top 8 predictions for business mobility in 2013: AT&T http://ow.ly/ghPqZ
James Stanbridge@stanbridge1 tweeted:
EMC transformed itself from storage company to a provider of cloud computing solutions #li #cloud http://t.co/oLIpaOYD
Michael Phelan@MPhelan1111 tweeted:
Top 8 predictions for business mobility in 2013: AT&T http://bit.ly/V8QeK1
Shane Tickell@shanetickell tweeted:
“There’s never been a better time to be in IT” Couldn’t agree more! http://t.co/QGsBLDqR #NHS #healthIT #cloud
‘BYOD era needs diligent monitoring of security incidents’
Cloud and mobile: A nightmare for security?
In an exclusive interview, Sanjay Poonen, President and Corporate Officer, Global Solutions and Head of Mobility Divisions, SAP shares how enterprises can ride the BYOD wave, while ensuring security. He also talks about the opportunities emerging in the mobility space http://bit.ly/W1Gxtg
A panel discussion at NASSCOM-DSCI's Annual Information Security Summit 2012 highlighted the security threats emerging from the heightened use of cloud and mobile ("Clobile") by organizations http://bit.ly/12EVEQ2
Perimeter E-Security@perimeternews tweeted: Today's #mobile workforce expects flexibility
http://t.co/8rqZt6u6 Your #BYOD plan must balance convenience & security http://t.co/iwRrWiDx
Chip Tsantes@chip_tsantes tweeted:
InformationWeek – Security > Cloud and mobile: A nightmare for security? http://t.co/O6H8RaWZ via @iweekindia & vendors that put u in cloud
Sweekriti Pradhan@sweeekriti tweeted:
‘BYOD era needs diligent monitoring of security incidents’ http://t.co/SYiQPM0h via @InformationWeek #TPRTech
Michele Pagliuzzi@sekuremike tweeted:
‘BYOD era needs diligent monitoring of security incidents’ http://t.co/m4CPCuFr
Follow us on Twitter @iweekindia to get news on the latest trends and happenings in the world of IT & Technology with specific focus on India.
Social Sphere
Join us on Facebook: facebook.com/informationweekindia
InformationWeek India: By and large, enterprises are still unclear about their cloud strategy. To know more, click here: http://www.informationweek.in/cloud_computing/12-12-05/harnessing_the_power_of_hybrid_cloud.aspx
Subramanian Ls: Enterprises need to understand first what the cloud can do for them, only then can a strategy follow! (December 5 at 1:16 pm)
InformationWeek India: IBM unveiled the seventh annual "IBM 5 in 5", a list of innovations that have the potential to change the way people work, live and interact during the next five years. These include:
Touch: You will be able to touch through your phone
Sight: A pixel will be worth a thousand words
Hearing: Computers will hear what matters
Taste: Digital taste buds will help you to eat smarter
Smell: Computers will have a sense of smell
Rakesha Harish, Rahul Pratap Singh, Harish Gowda and 3 others like this.
Shubhendu Jha: Luv to c the change
Fan of December
Like, Tag and Share us on Facebook
Jobsin Joseph is the Facebook fan for the month of December 2012. He has completed his education from Anna University, Chennai. He is an avid follower of InformationWeek India stories and shares the updates with his friends.
Get links to the latest news and information from the world of IT & Technology with specific focus on India. Read analysis, technology trends, interviews, case studies, whitepapers, blogs and much more…
Participate in quizzes and contests and win prizes!
News
The Month in Technology
HTC prepping Windows 8 rival to iPad Mini?
Taiwanese computer maker HTC next year plans to release a pair of tablets running Microsoft's Windows 8 operating system, including a 7-inch model that could be aimed squarely at the iPad Mini and Amazon's Kindle Fire, according to a published report.
Bloomberg, citing an unnamed source, reported that both models will run Windows RT, a pared-down version of Windows 8 that only runs apps preinstalled by Microsoft (including Office 13) or those downloaded from the company's online Windows Store.
All Windows RT tablets are powered by chips that use ARM's mobile processor reference design, which aims for energy efficiency and long battery life. HTC plans to release the devices next fall, Bloomberg reported, adding that both will be powered by ARM-based chips manufactured by Qualcomm.

HTC was excluded from Microsoft's original list of vendors authorized to build Windows 8 tablets and hybrids. If Microsoft has now extended a license to HTC, it may be a sign that Redmond is looking to broaden distribution for its new platform, sales of which are said to be below expectations.

There are other signs that Microsoft is ready to pull out all the stops to get Windows 8 before as many eyeballs as possible. Redmond originally planned to sell Surface only through its company-owned online and brick-and-mortar stores. But last week it reversed course, releasing Surface RT for sale at Best Buy and Staples.
—InformationWeek USA
Red Hat to acquire ManageIQ for USD 104 million
Red Hat recently entered into a definitive agreement to acquire ManageIQ, a leading provider of enterprise cloud management and automation solutions that enable organizations to deploy, manage and optimize private clouds, virtualized infrastructures and virtual desktops. With the addition of ManageIQ technologies to its portfolio, Red Hat will expand the reach of its hybrid cloud management solutions for enterprises. Red Hat has acquired ManageIQ, a privately-held company, for approximately USD 104.0 million in cash. As an existing member of the Red Hat Enterprise Virtualization Certified Partner program, ManageIQ has
worked closely with Red Hat to provide customers with unified monitoring, management and automation solutions. Paul Cormier, President, Products and Technologies, Red Hat said, “Industry and customer response to Red Hat’s vision for the open hybrid cloud has been overwhelmingly positive. For enterprise cloud initiatives, effective cloud management is critical. ManageIQ offers robust features, including orchestration, policy, workflow, monitoring and chargeback, that deepen Red Hat’s cloud management capabilities and bring the promise of open hybrid cloud a step closer for the industry.”
Hughes India bags contracts worth Rs 200 crore for Ministry of Finance ATM project
Hughes Communications India Ltd. (HCIL), a subsidiary of Hughes Network Systems, LLC (Hughes), a provider of broadband satellite networks and services, recently announced that it has been contracted to connect 27,000 offsite ATMs with a secure broadband satellite network over the next two years and provide managed network services for various public sector banks. The contracts have a total value of Rs 200 crore (approx USD 36 million) over an eight-year period, and were awarded by eight Managed Service Providers (MSPs), namely AccuraInfotech, AGS Transact Technologies, EPS, FIS Global, Financial Software and Systems, MphasiS, NCR, and Tata Communications Banking Infra Solutions.

The win is part of one of the largest outsourcing deals in the financial sector, under which the Ministry of Finance and a consortium of all the public sector banks of India have contracted with nine different MSPs to install and manage a total of 63,000 off-site and on-site ATMs across urban and rural India. The MSPs were chosen through a series of reverse auctions for 16 different circles, conducted by six lead banks, namely State Bank of India, Bank of India, Punjab National Bank, Bank of Baroda, Union Bank of India, and Canara Bank. India currently has approximately 100,000 ATMs and this initiative is expected to increase the ATM density by over 60 percent.
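The headline figures in this story lend themselves to a quick sanity check. A minimal back-of-the-envelope sketch in Python, using only the numbers quoted above (the Hughes-share line is our own illustrative calculation, not a figure from the announcement):

```python
# Back-of-the-envelope checks on the Ministry of Finance ATM project figures.

existing_atms = 100_000   # ATMs in India today, per the story
new_atms_total = 63_000   # off-site and on-site ATMs in the full initiative
hughes_atms = 27_000      # off-site ATMs Hughes will connect

# The story claims the initiative will raise ATM density "by over 60 percent":
density_increase = new_atms_total / existing_atms * 100
print(f"ATM density increase: {density_increase:.0f}%")  # 63%

# Hughes's share of the new ATMs (illustrative, not from the report):
share = hughes_atms / new_atms_total * 100
print(f"Hughes share of new ATMs: {share:.1f}%")  # ~42.9%
```

The 63 percent result is consistent with the "over 60 percent" density claim.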
www.informationweek.in
Cloud Computing
CloudMunch eyes billion-dollar PaaS market
Software developers now have a chance to focus on what they do best, coding, and leave the rest to Seattle-incorporated and Bangalore-based startup CloudMunch. The 15-member startup offers developers a cloud-based platform that, it claims, takes care of the entire software development lifecycle, enabling them to build, test, run and manage their applications in an automated environment. In the words of the company's founders, developers can now "Focus on the code, CloudMunch the rest." Pradeep Prabhu and Prasanna Raghavendra, both long-term Infosys executives, started CloudMunch in September 2011 with the aim of delivering better software faster in the cloud. "We do so by providing early and instant feedback; automating the entire process with one click; and enabling 'dev' and 'run' seamlessly in the cloud," Raghavendra, the company's Co-founder and CTO, said. The platform-as-a-service (PaaS)
offering from CloudMunch comes at a time when there is substantial uptick in the PaaS market globally. According to technology research firm Gartner, worldwide PaaS revenue is expected to reach USD 1.7 billion by 2015, up from just USD 512 million in 2010. The biggest components here will be application PaaS and integration PaaS.

By continuously compiling, validating and testing code, CloudMunch enables "continuous delivery" of applications, which is a faster and better way to build, test, and run cloud applications. The company believes "continuous delivery" of applications in the cloud is soon going to be the new normal and has hence built its model ground-up to take advantage of these shifts.

Commenting on the startup ecosystem in India, Prabhu, Founder and CEO of CloudMunch, told InformationWeek that the Indian ecosystem today is more conducive to startups than ever before. "The cost is minimum and the reach maximum with models such as the cloud. The industry has matured today, enabling people to utilize their long industry experience to build sustainable startups. Finally, there is a greater acceptance globally for products and services from India."

Asked if the company has set itself any revenue target and timeline, Prabhu said that they are currently focused on customer adoption and "delighting their customers." To begin with, the company is mainly targeting product and Internet-based companies.
—Ayushman Baruah

(Photo: Pradeep Prabhu, Founder and CEO, CloudMunch)
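Gartner's forecast of USD 512 million in 2010 growing to USD 1.7 billion by 2015 implies a steep compound annual growth rate. A small sketch of that implied-CAGR calculation (our own arithmetic on the two figures quoted above, not a number Gartner publishes here):

```python
# Implied compound annual growth rate (CAGR) behind the Gartner PaaS
# forecast cited in the story: USD 512 million (2010) to USD 1.7 billion (2015).

rev_2010 = 512e6   # worldwide PaaS revenue, 2010
rev_2015 = 1.7e9   # Gartner's 2015 forecast
years = 5

# CAGR = (end / start) ** (1 / years) - 1
cagr = (rev_2015 / rev_2010) ** (1 / years) - 1
print(f"Implied CAGR: {cagr:.1%}")  # 27.1%
```

In other words, the forecast assumes the PaaS market grows at roughly 27 percent per year over the five-year span.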
Storage
38 percent of info in Indian organizations is duplicate
More than a third (38 percent) of business information in Indian organizations is duplicate, contributing to one of the top challenges associated with information management, according to the India results of the second chapter of Symantec's State of Information Survey, Information Mayhem. The survey further revealed that more than 39 percent of organizations aren't aware of how important the information they are storing is to the business, or even whether it's business or personal information. More than 43 percent don't know how old information is, and 40 percent don't know who owns it. This lack of information understanding is resulting in businesses spending more than necessary on storage: globally, 60 percent of enterprises spend more than USD 100,000 per year on their storage infrastructure. And the complexity is significantly reducing productivity.

"As our appetite for information grows, our ability to intelligently manage it must keep pace," says Anand Naik, Managing Director, Sales, Symantec in India. "The vortex of information challenges means that 78 percent of companies have missed compliance requirements in the past year. Information governance, and identifying ways to help IT focus their efforts on the information that actually matters, holds the key to fixing these issues."

As per the survey, Indian businesses are facing huge challenges when trying to manage their information, with two thirds (66 percent) saying that there was no 'single version of truth' when it came to information. In addition, 63 percent of information is hard to find, further contributing to the sea of information that is challenging businesses today.

The report recommended that organizations should develop an effective information governance program, designed to provide top-down support for the intelligent maintenance of the company's data requirements. For that, organizations should establish the C-level owner of information governance, concentrate on focused projects, and manage information according to its importance to the business.
—InformationWeek News Network
Cloud Computing
L&T Infotech targets SMBs with App Shop
Research firm Zinnov estimates that with over 50 million units, India has the largest number of SMBs in the world. While more than one-third of India's GDP is attributed to the SMB sector, most SMBs are still struggling to compete with large players as they do not adequately leverage IT. A large number of SMBs have still not looked at using IT as a catalyst for driving growth, as they have traditional challenges such as skill set availability and affordable and customized IT products. This market is up for grabs as ambitious SMBs are now looking at using technology to improve their efficiencies and productivity.

Not surprisingly, a host of vendors, system integrators and pure play cloud startups have started tapping this attractive market by offering customized solutions. The latest one to announce plans for tapping this huge market is L&T Infotech. The firm has ambitious plans for making a mark in this segment in the next one year. Explaining the positioning, Abhay Chitnis, VP & CTO, L&T Infotech, says, "Ours is a unique platform which will provide services right from the lowest level of provisioning of infrastructure (IaaS) to external SaaS applications, up to the highest level of maturity in terms of business process orchestration. We will provide not just SaaS applications but a complete surrounding ecosystem portal of office, e-mail, collaboration, event notification and BI reporting." The goal is to create a platform which can support multiple domain-based communities like discrete manufacturing suppliers, pharma distributors, healthcare and educational institutes.

The firm has also smartly decided to target the extended arms of companies, which are huge in number, but are not mature in terms of processes and technology. This includes suppliers and distributors of large companies. With this objective, the firm has launched L&T Infotech App Shop (called LTI App Shop). The LTI App Shop currently has apps such as Payfast (a cloud-based payroll system), Shiksha Cloud (an ERP platform for the education sector), Planning and Procurement, cEMS (a request-based system to manage admin processes) and Sapphire (social media analytics), along with value-added services like e-mail, helpdesk, and collaboration apps. The firm will also host apps from other partner vendors, backed by a unified billing and single sign-on feature.

With a well-targeted strategy and a portfolio of need-based apps for SMBs, L&T Infotech is well positioned to carve out its own unique identity in a crowded cloud market.
— Srikanth RP

(Photo: Abhay Chitnis, VP & CTO, L&T Infotech)
Cloud Computing
71 percent of organizations using SaaS for less than three years: Gartner survey
Adoption of software as a service (SaaS) has grown dramatically among users of enterprise software solutions, but it varies widely within markets, according to Gartner. A recent Gartner survey showed 71 percent of organizations have been using SaaS for less than three years. The results indicate that interest in the SaaS deployment model remains strong and continues to expand with late adopters.

Implementing net new solutions or replacing existing solutions is now the primary driver for using SaaS, according to the survey. Worldwide, there is a shift in SaaS adoption from primarily extensions to existing apps to net new deployments or replacements of existing on-premises apps. "Although nearly half of respondents in Asia/Pacific indicated the primary adoption driver of SaaS was net new deployments, the U.S. and European respondents indicated their strongest driver was to replace existing on-premises apps," said Charles Eschinger, Research VP, Gartner. "It's not surprising that SaaS is being deployed as net new deployments in Asia/Pacific since many of the users are relatively new businesses with few legacy systems."

According to the survey, investments in SaaS are expected to increase across all regions. Seventy-seven percent of respondents expected to increase spending on SaaS, while 17 percent plan to keep spending the same. More than 80 percent of respondents in Brazil and Asia/Pacific indicated more spending on SaaS apps over the next two years. The U.S. and European countries were not far behind, with 73 percent of U.S. respondents and 71 percent of European respondents intending to increase spending on SaaS.

The majority of organizations are deploying CRM and enterprise content management (ECM) apps. Supply chain management, web conferencing, teaming platforms and social were the applications picked most as replacements for on-premises solutions.
— InformationWeek News Network
www.informationweek.in
Big Data
Less than 1 percent of world’s data is being analyzed: Study
Despite the unprecedented expansion of the digital universe due to the massive amounts of data being generated daily by people and machines, only 0.5 percent of the world’s data is being analyzed, as per the EMC-sponsored IDC Digital Universe study, ‘Big Data, Bigger Digital Shadows, and Biggest Growth in the Far East’. The proliferation of devices such as PCs and smartphones worldwide, increased Internet access within emerging markets and the boost in data from machines such as surveillance cameras or smart meters have contributed to the doubling of the digital universe within the past two years alone to a mammoth 2.8 ZB. IDC projects that the digital universe will reach 40 ZB by 2020, an amount that exceeds previous forecasts by 14 percent. As per the study, the digital universe will double every two years between now and 2020. There will be approximately 5,247 GB of data for every man, woman and child on earth in 2020. A major factor behind the expansion of the digital universe is the growth of machine-generated data, increasing from 11 percent of the digital universe in 2005 to over 40 percent in 2020. The study revealed that a large quantity of useful data is getting lost, as the majority of new data is largely untagged, file-based and unstructured. In 2012, 23 percent of the digital universe would have been useful for Big Data if tagged and analyzed. However, currently only 3 percent of the potentially useful data is tagged, and even less is analyzed. The study further revealed that by 2020, 33 percent of the digital universe (13,000+ exabytes) will have Big Data value if it is tagged and analyzed. The amount of data that requires protection is growing faster than the digital universe itself. As per the study, less than a third of the digital universe required data protection in 2010, but that proportion is expected to exceed 40 percent by 2020. In 2012, while about 35 percent of the information required some type of data protection, less than 20 percent of the digital universe actually had these protections. — InformationWeek News Network
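The study’s growth figures are easy to sanity-check with simple arithmetic. A minimal sketch in Python (the 7.6-billion world-population figure for 2020 is my assumption, not from the study):

```python
# Back-of-the-envelope check of the IDC projections cited above.
# Assumption (not from the study): world population of ~7.6 billion in 2020.

def doublings(start_year: int, end_year: int, period_years: int = 2) -> float:
    """Number of doubling periods between two years."""
    return (end_year - start_year) / period_years

# 2.8 ZB in 2012, doubling every two years:
projected_zb = 2.8 * 2 ** doublings(2012, 2020)  # ~44.8 ZB, in the ballpark of IDC's 40 ZB

# Per-capita share of a 40 ZB digital universe (1 ZB = 1e12 GB):
gb_per_person = 40 * 1e12 / 7.6e9                # ~5,263 GB, close to the 5,247 GB cited

print(round(projected_zb, 1), round(gb_per_person))
```

The doubling rule slightly overshoots IDC’s headline number, and the per-capita figure lands near the cited 5,247 GB only under the assumed population, which suggests the study used a similar population estimate.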
Software
Dell SonicWALL aims to tap the enterprise market
Post acquisition by Dell, SonicWALL, now part of Dell’s software group, is looking to acquire customers in the enterprise segment, building on Dell’s existing relationships in this space. During the past four years, Dell has globally invested more than USD 10 billion to deliver innovative, high-performing end-to-end solutions to better support the evolving needs of its customers, the company said at the 2nd Annual Dell World conference. More than 95 percent of the Fortune 500 companies rely on Dell for IT solutions and services. “We want to make inroads into the large enterprise as we have the complete reach now. Earlier, it might have taken us 3-6 months to reach a big customer, but now we have immediate accessibility and reach because of Dell,” Amit Singh, the newly appointed Country Head of Dell SonicWALL, told InformationWeek. In terms of the business split, Singh says he would like to have a balance between SMEs (small and medium enterprises) and large enterprises. In fact, he has already put in place dedicated sales teams for each of the two markets. Given the strict regulatory and compliance requirements, Dell SonicWALL, like most other security companies, counts BFSI and government as two of its most important industry verticals. SonicWALL’s acquisition by Dell has been welcomed by its customers, Singh said. “Customers are now looking forward to the added benefit of Dell’s
security capabilities along with Dell’s existing integration capabilities and services.” SonicWALL India has about 300 employees in its R&D center in Bangalore and the team is a significant contributor to the global engineering team supporting their unified threat management (UTM) and Aventail products. SonicWALL acquired Aventail Corporation, a provider of SSL VPN secure remote access solutions, to complement its security portfolio in 2007. From a partner perspective, Singh reinforces that they would continue with their 100 percent indirect business model. “All our businesses continue to get executed through a channel partner,” he said. —Ayushman Baruah
News Mobile
Checking smartphone is part of morning ritual for Gen Y Indians
90 percent of Gen Y surveyed worldwide said they check their smartphones for updates in e-mails, texts and social media sites, often before they get out of bed, according to the 2012 Cisco Connected World Technology Report (CCWTR). In India, 96 percent of those who have smartphones check for updates as part of the morning routine, and 70 percent of respondents compulsively check their smartphones for e-mails, texts or social media updates. Of those, 42 percent said they would feel anxious if they couldn’t check their smartphones, and 71 percent wish they didn’t feel so compelled, but they like to stay connected. Over 84 percent of respondents use their smartphones in bed; in fact, many check their smartphones in the morning before they get out of bed, and for some, it’s the last thing they do at night, the report said. Nearly 22 percent of Indian respondents admitted to using smartphones in the bathroom, compared to the global average of 33 percent. More than half of Indian respondents (56 percent) use smartphones to check e-mail and social media during meals with friends and family, compared to the global average of 46 percent.
Nearly 84 percent of the Gen Y respondents in India said mobile applications are important to their daily lives, compared to the global average of 70 percent. More than half (62 percent) of respondents from India said they mainly use mobile applications for games and entertainment, while 33 percent mainly use them for work. In India, a significant number of Gen Y respondents admitted they spend more time online with friends than socializing in person — at 56 percent, the figure is above the global average of 40 percent. Two-thirds of global respondents said they spend an equal amount of or more time socializing online with friends than they do in person; in India, this percentage is 83. The report further revealed that, in order to stay connected, the Gen Y workforce is also breaking company security policies. In India, 41 percent of respondents said their company’s policy forbids them to use company-issued devices for non-work activities, and 56 percent say they do not follow those rules all the time. However, 58 percent of IT professionals believe employees follow the rules. — InformationWeek News Network
Virtualization
Citrix desktop virtualization solution helps Fareportal slash power and real estate costs
Fareportal, the seventh largest online travel group in the U.S., had been facing a major business challenge. The increasing size and scale of the company was necessitating an increase in office space, plans for which were marred by rising real estate prices. The company was looking for a powerful technology that would help it reutilize its real estate resources. The IT team zeroed in on desktop virtualization as the key technology that could resolve this problem — and after weighing different models like pay-as-you-go and subscription models, it chose Citrix XenDesktop. After a successful Proof of Concept (PoC), Fareportal migrated almost 30 percent of its employees onto the Citrix
XenDesktop virtualized environment, and the remaining 70 percent would be migrated in the near future. “With Citrix virtual desktops we have helped Fareportal improve the seat utilization
of employees working in different shifts. This increased real estate utilization has helped Fareportal optimize cost,” said Sanjay Deshmukh, Area Vice President, India Subcontinent, Citrix.
Now the company can accommodate more people in the same space, and the challenge of buying expensive real estate has been resolved. The company has thereby saved almost USD 400,000 per annum on real estate costs and 25-30 percent on power costs. As the company is PCI compliant, the deployment has also helped it secure data directly at the server side rather than securing individual endpoints. Fareportal is currently in the process of transitioning its development environment onto Citrix XenDesktop and will move the entire call center onto the virtualized environment in the near future. — InformationWeek News Network
News Analysis
Cisco makes deep inroads into Indian SMB sector
The SMB sector is one of the biggest revenue generators for Cisco India and over the last three years, it has been growing at a CAGR of 45 percent
By Srikanth RP
A partner-led approach is helping Cisco increase its footprint impressively in the fast-growing SMB market in India. The networking major has been aggressively expanding its SMB portfolio over the past two years, and is seeing good results from its strategy. Cisco has been meticulously and systematically developing the SMB market. This includes having a dedicated R&D team to build products only for small customers, and making an investment of USD 100 million to develop products for small businesses. “Adoption of technology among SMBs is directly dependent on bringing down the total cost of ownership and increasing ease of use. We have made great efforts to study the exact needs of SMBs and have developed products based on customer feedback,” explains Rajesh Shetty, Vice President, Cisco India and SAARC. Shetty says that the focus is on reducing complexity and making the products easy for SMBs to configure. To tap into the ever-growing opportunity in the SMB market, which Gartner estimates to be around 30 percent of the USD 79.8 billion IT spending market in India, Cisco has launched multiple marketing initiatives for its partner network and also introduced a specific product, BE 3000. Developed in India specifically for Indian SMBs, BE 3000 is a Unified Communication tool that gives an SMB organization the ability to seamlessly connect with employees, anywhere and anytime. Cisco is backing this push with a range of products in categories like routers and switches, security, wireless, video and
conferencing and network storage. The firm is also seeing significant revenue coming in from emerging areas such as cloud, video, security and mobility. In terms of sectors, the manufacturing vertical, where technology penetration has traditionally been low, is witnessing significant traction. In the IT sector, Cisco is seeing interest in its products related to video conferencing, virtual desktops and unified communications. Healthcare and education also hold great potential. The results of this strategy can be seen from the fact that the SMB sector has turned out to be one of the biggest revenue generators for Cisco India, and over the last three years it has been growing at 45 percent CAGR. Cisco has signed up names such as Delhi Duty Free Services, Manipal University and MSN Laboratories in Hyderabad for its virtual workspace solution; Geometric Software for virtual desktop infrastructure; and Bapatla College of Engineering in Guntur, Andhra Pradesh for its Digital Media System. Cisco believes that it has just scratched the surface of a big market on the verge of explosive growth. “A large majority of this market is still untapped, especially in tier II and upcountry markets. The significant addressable market for Cisco outside the top six metros is estimated at approximately USD 400 million. We are looking beyond the top 8-10 cities to another 30 cities where there is a market opportunity for our products and services,” says Shetty. With rising mobile and Internet penetration, coupled with smart positioning of products, Cisco has a huge opportunity to network beyond its established base.
“A majority of the SMB market is still untapped, especially in tier II and upcountry markets. Significant addressable market for Cisco outside the top six metros is estimated at nearly USD 400 million” — Rajesh Shetty, Vice President, Cisco India and SAARC
— Srikanth RP (srikanth.rp@ubm.com)
News Analysis
Dell sees big potential in Indian healthcare sector
With healthcare providers in India slowly waking up to the transformational potential of using IT, Dell is betting big on its global healthcare expertise to make a mark in India
By Srikanth RP
A big domain player in the global healthcare segment, Dell believes that it has the potential to repeat its success in the Indian market too. With a growing Indian middle class and rising disposable incomes, patients are demanding better quality healthcare. Healthcare service providers are looking at investing in IT to automate paper-based processes and improve customer satisfaction. “The Indian healthcare segment is highly fragmented in nature, and has relatively low exposure to IT. As compared to other countries, the percentage of IT spending on healthcare with respect to GDP is quite low. Hence, the potential to transform existing processes using IT is huge,” says Sid Nair, VP and Global GM, Healthcare and Life Sciences, Dell Services. Independent market research reports corroborate this huge potential. Market research firm Zinnov estimates that annual IT spending within existing hospitals in India will reach USD 1.5 billion by 2020. Larger hospital chains specifically are looking at automating paper-based processes by deploying solutions for automating back-office functions and digitizing patient records through Electronic Medical Records (EMR). A case in point is the deal Max Healthcare signed with Dell Services. Nair believes that over the next few years, hospitals will significantly upscale their IT investment.
Can cloud aid healthcare? Dell also sees a huge role for services related to cloud computing in the segment. For example, a study by
Zinnov estimates the total addressable opportunity for cloud solutions in the sector to be approximately USD 600 million by 2020. Zinnov also believes that the cloud can potentially address close to 40 percent of the total annual healthcare IT spending in India. “In a country where there are a large number of individual practitioners and mid-sized hospitals, the cloud is a natural fit. Usage of the cloud can eliminate paper-based processes, help save costs, enable best practices and improve efficiencies,” states Nair. For example, with the growing usage of digital records, storage and archival needs are growing at a fast clip for hospitals. Hospitals perform thousands of radiology tests every year, which require enormous storage capacities. Electronic Health Records are also growing at an equally fast pace. This is creating a unique market, where hospitals need to share image information to better coordinate patient care. However, with images lying in isolated systems, physicians and hospitals struggle to share this information. Dell wants to use its experience in global markets to provide solutions to teething problems like storage. Nair gives the example of Dell’s Unified Clinical Archive (UCA), an ideal alternative for hospitals that prefer not to build and maintain their own archive. By consolidating and moving archiving to the cloud with flexible per-study pricing, hospitals can reduce data retention costs. The huge success of the solution can be seen from the fact that the Dell UCA now manages more than 68 million clinical studies archived in the U.S. and nearly 4.8 billion diagnostic imaging objects, and supports more than 800 clinical sites in one of the world’s largest cloud-based clinical archives. Nair believes that India offers a similar opportunity, primarily because of a large network of entrepreneur-driven clinics and hospitals that are looking at being at par with large hospitals, but at lower costs.
“The percentage of IT spending on healthcare with respect to GDP is quite low in India. Hence, the potential to transform existing processes using IT is huge” — Sid Nair, VP and Global GM, Healthcare and Life Sciences, Dell Services
— Srikanth RP (srikanth.rp@ubm.com)
In this issue
- Big Data, Big Questions
- 13 Big Data vendors to watch in 2013
- Analytics helps Usha International respond to dynamic market needs
- ‘Business analytics is a key priority for us in 2013’
- ‘Data mining and BI can make all the difference between success and failure of a business model’
- Big Data, bigger opportunities for insurance companies
Cover Story
Advanced Analytics
Predictive analysis is getting faster, more accurate and more accessible. Combined with Big Data, it’s driving a new age of experimentation By Doug Henschen
Five years ago, companies were standardizing on one or a couple of business intelligence products. Broad interest in advanced analytics, especially the predictive kind, was just emerging. Today, companies of all sizes and industries are experimenting with and using analytics, and veteran users are going for new levels of sophistication, according to our new InformationWeek Analytics, Business Intelligence and Information Management Survey. Companies are embracing analytics to optimize operations, identify risks and spot new business opportunities. Advanced analytics is all about statistical analysis and predictive modeling — being able to see what’s coming and take action before it’s too late, rather than just reacting to what has already happened. That
latter practice, derisively known as “rearview-mirror reporting,” is associated with conventional BI. The more data companies use, the more accurate their predictions become. But the Big Data movement isn’t just about using more data. It’s also about taking advantage of new data types, such as social media conversations, clickstreams and log files, sensor information and other real-time feeds. Experienced practitioners are taking cutting-edge approaches, including in-database analytics, text mining and sentiment analysis. In each of the past six years, respondents to our analytics and BI survey have rated their interest in 10 leading-edge technologies, and advanced analytics has always been the No. 1 choice. Advanced data visualization is No. 2 this year, up
from being ranked third in 2009. Last year we added “Big Data analysis” to the list of cutting-edge pursuits, and this year it ranked No. 4 along with collaborative BI. We also see clear evidence that companies are investing in software, people and advanced techniques. For starters, this year we added “in-database analysis for predictive or statistical modeling” to our list of leading-edge technologies, and respondents rated their interest higher than for more-established categories such as mobile BI and cloud-based BI. With in-database analysis, statistical and predictive algorithms are rewritten to operate inside databases that run on massively parallel processing (MPP) platforms. In-database analysis is faster than the old approach to data mining, where
Expert Voice
“By using BI and predictive analytics, we are able to devise new ways to enhance customer experience. This helps us formulate strategies to retain and acquire new customers” — Ravikiran Mankikar, GM, IT Department, Shamrao Vithal Bank
“Organizations need to evolve BI to gain more accurate and actionable insights on a real-time basis. This will help organizations derive business value and ensure more efficient and effective growth” — Sanjay Sharma, MD & CEO, IDBI Intech
“Predictive analytics is a subset of advanced analytics and demands a dynamic, robust BI infrastructure to handle the unfathomable data being generated at an atomic level” — Daya Prakash, CIO, LG Electronics
analysts moved data sets from data warehouses into specialized analytic servers to create and test predictive models. Data movement delays plagued the old approach, and the analytic servers were underpowered. As data sets grew, time and power constraints restricted work to small data samples rather than all available information, limiting the accuracy of the resulting models. Businesses that have embraced in-database approaches say they can develop models in less time for more precisely targeted segments, whether they’re trying to predict customer behavior, product performance, business risks or other variables. What’s more, MPP power lets them crunch through massive data sets, so they can use all available data and deliver far more accurate models.
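The contrast between the two approaches can be sketched in miniature. The example below is purely illustrative: the table, column names and model coefficients are all invented, and SQLite stands in for the warehouse. A real MPP platform would also parallelize the SQL across nodes, but the core idea is the same: push the scoring arithmetic to the data instead of pulling the data to the model.

```python
import sqlite3

# Toy "warehouse" with a customers table (all names and data are hypothetical).
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE customers (id INTEGER, tenure REAL, monthly_spend REAL)")
db.executemany("INSERT INTO customers VALUES (?, ?, ?)",
               [(1, 2.0, 80.0), (2, 24.0, 35.0), (3, 6.0, 60.0)])

# A pre-trained linear scoring model (coefficients are illustrative only).
COEF = {"tenure": -0.05, "monthly_spend": 0.02}
INTERCEPT = 0.5

# Old approach: extract every row, then score on a separate analytic server
# (here, plain Python stands in for that server).
rows = db.execute("SELECT id, tenure, monthly_spend FROM customers").fetchall()
scores_extracted = {
    rid: INTERCEPT + COEF["tenure"] * tenure + COEF["monthly_spend"] * spend
    for rid, tenure, spend in rows
}

# In-database approach: push the same arithmetic into SQL, so only the
# finished scores leave the database.
sql = ("SELECT id, {i} + {t} * tenure + {s} * monthly_spend AS score "
       "FROM customers").format(i=INTERCEPT, t=COEF["tenure"],
                                s=COEF["monthly_spend"])
scores_in_db = dict(db.execute(sql).fetchall())

# Same model, same results, no bulk data movement.
assert all(abs(scores_extracted[k] - scores_in_db[k]) < 1e-9
           for k in scores_extracted)
```

On three rows the difference is invisible; at warehouse scale, skipping the extract step is exactly the delay the in-database vendors are selling against.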
In-Database-Enabled Text Mining
In-database approaches are maturing and breaking into new areas. Text mining, an advanced analytics technique applied to unstructured text, has become popular for social media analysis, according to our survey respondents. Areas include competitive intelligence gathering, customer behavior analysis, and brand and product reputation analysis. Health insurer UnitedHealthcare, for example, is testing text mining using SAS’s in-database High Performance Analytics software, which runs on Teradata and EMC Greenplum
platforms, as well as commodity grids. Working in partnership with SAS and EMC, UnitedHealthcare is using HPA to develop deeper, more accurate predictive models to analyze customers’ call center communications. In the past 18 months, UnitedHealthcare has collected more than 160 million rows of unstructured information from its call center notes and call recording transcriptions. It uses SAS’s text mining software to look for indications that customers didn’t have their questions or concerns resolved during a call, e-mail or postal mail interaction. The software’s predictive models identify suspect customer interactions that require follow-up calls or messages. The scale of all of this unstructured information, coupled with the comparatively meagre performance of conventional symmetric multiprocessor servers running the analytics software, previously made this sort of analysis time consuming and inaccurate. UnitedHealthcare used a sample of historical data to “train” the predictive models on the data to be analyzed. The more data used, the more accurate the model becomes. Before, it used about 50,000 rows of data as a training set, but with SAS’s HPA in-database approach, UnitedHealthcare can use 50 million rows and develop the model in the same amount of time, says Mark Pitts, director of data science, solutions and
strategy for the insurer. “Our data scientists are more productive, and with more training data, our models are more discriminating,” Pitts says. That means UnitedHealthcare is finding more calls that deserve follow-up and getting fewer false positives. Bottom line: customer satisfaction stands to improve.
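The flagging step UnitedHealthcare describes can be illustrated with a toy text classifier. This is not SAS’s software: it is a minimal naive Bayes sketch over invented call notes, just to show how labeled examples train a model that flags notes for follow-up, and why more training rows (50,000 versus 50 million) make the word statistics more reliable.

```python
import math
from collections import Counter

# Hypothetical, hand-labeled call notes (1 = needs follow-up, 0 = resolved).
# A real deployment would train on millions of rows, not four.
TRAIN = [
    ("claim denied and member still upset will call again", 1),
    ("question about deductible answered member satisfied", 0),
    ("could not resolve billing issue promised escalation", 1),
    ("address updated as requested no further action", 0),
]

def train(labeled_notes):
    """Count word frequencies per class for a tiny naive Bayes model."""
    counts = {0: Counter(), 1: Counter()}
    priors = Counter()
    for text, label in labeled_notes:
        priors[label] += 1
        counts[label].update(text.split())
    return counts, priors

def needs_followup(text, counts, priors, alpha=1.0):
    """Return True if the note looks like an unresolved interaction."""
    vocab = set(counts[0]) | set(counts[1])
    scores = {}
    for label in (0, 1):
        total = sum(counts[label].values())
        score = math.log(priors[label] / sum(priors.values()))
        for word in text.split():
            # Laplace smoothing so unseen words don't zero out a class.
            score += math.log((counts[label][word] + alpha) /
                              (total + alpha * len(vocab)))
        scores[label] = score
    return scores[1] > scores[0]

counts, priors = train(TRAIN)
print(needs_followup("member upset about denied claim", counts, priors))  # True
```

With only four notes, one shared word can flip a prediction; as the training set grows, each per-word probability is estimated from many examples, which is the “more discriminating” effect Pitts describes.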
Customer Retention 101
The University of Kentucky ventured into advanced analytics this year to improve student retention. Since the 2007 financial crisis, many colleges and universities across the country are seeing a higher percentage of their students drop out before they complete their degrees. Lower retention rates can lead to shrinking enrollment and contraction of courses, degree programs and faculties. The baseline measures universities use to predict whether a student will fit in and succeed include high school grade point average and ACT and SAT test scores, of course, but UK also factors in ethnicity, socio-economic background, alumni status of parents and other information in deciding whether to accept applicants. Once students are enrolled, colleges and universities collect all sorts of new data that shows how they’re faring. How does UK predict who’s in danger of dropping out so that it can intervene? It analyzes student record data going back to 1988 to model
Expert Voice
“Analytics is very promising in the healthcare sector as hospitals can mine intelligence from the massive amount of data emerging from patient records and medical images, and put in place the required care forecasted” — Suresh Kumar, CIO, Seven Hill e-health
“Using well-organized analytics, decision makers obtain new insights from past performance, exceptions and deviations. These observations enable management to develop efficient strategies” — A Balakrishnan, CTO, Geojit BNP Paribas Financial Services
“Data is the new-age gold and analytics is a key to mine that gold. The real challenge for enterprises is to identify the right questions that they need to ask so that analytics can be of real value” — Anurag Shah, COO & Global Operations Head, Omnitech InfoSolutions
the characteristics of students who did and didn’t drop out. In addition to taking into account preadmission variables, the university can tap information on scholarships, parental participation, courses of study, instructors, degree programs, dorms students lived in, extracurricular activities and, of course, grades and attendance. Given the fact that UK has about 22,000 undergrads and 6,000 to 7,000 grad students enrolled at any one time, that’s a lot of data. UK began a proof-of-concept project in March, using the past six years of student data. Within two months, it had a baseline predictive model using high school GPA and college entrance exam scores, and since May a statistician hired from another university has been adding variables to the model from the university’s Blackboard learning management system. UK thinks retention is closely related to the level of student engagement, so it’s adding data on the number of times students log in to their class web pages, check syllabuses, download homework assignments, collaborate online with classmates and turn in homework assignments on time. The university is using SAP’s new Predictive Analysis software in large part because it already uses SAP’s ERP system and BusinessObjects BI software. UK is doing regression analyses and statistical modeling using the R programming language, deployed on SAP’s Hana in-memory database so it can support fast, ad hoc analysis across a wide range of variables. Hana handles querying in RAM without the need to constantly build new multidimensional cubes and data aggregations. “If we have people on campus who want to know what the prediction scores are based on certain socioeconomic backgrounds or certain combinations of any number of 85 variables, we can quickly ask those questions and adjust our model, as opposed to waiting for nightly [batch] loading jobs,” says Adam Recktenwald, UK’s enterprise architect. UK built an iPad-based application that lets the university’s business analysts drill down on data, selecting the dimensions they’re interested in. They can look at data across an entire cohort, or graduating class, examining segments and microsegments based on those 85 variables. “Right now, the retention analytics that most large schools are doing tend to look at students in large swaths, such as high-performing students and low-performing students,” Recktenwald says. “We want to bring this down to microsegments, so we can better understand the needs of individual students.” Once the retention program is proven, UK plans to develop models that will help it predict revenue and physical facility needs.
The social factor
What factors are driving, or would drive, your company’s interest in using social media and social media analysis technologies?
- Competitive intelligence: 41%
- Customer behavior analysis (spotting influencers and churn threats): 38%
- Brand/product/reputation management (including “voice of the customer” apps): 34%
- Customer service (including product and service quality and warranty apps): 27%
- Customer segmentation (for up-sell and cross-sell): 25%
- Compliance (financial services, insurance risk management and fraud detection): 24%
- Social media and social network analysis technologies aren’t a priority for my company: 27%
Data: InformationWeek 2013 Analytics, Business Intelligence and Information Management Survey of 417 business technology pros at companies using or planning to deploy data analytics, BI or statistical analysis software, October 2012
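To make the kind of model UK describes concrete, here is a hypothetical logistic dropout-risk score over variables like those the story names (GPA, LMS logins, on-time homework). Every coefficient below is invented for illustration and has nothing to do with the university’s actual model:

```python
import math

# Illustrative dropout-risk model in the spirit of UK's approach.
# All weights are invented; a real model would be fit by regression
# over historical student records.
WEIGHTS = {
    "hs_gpa": -1.2,           # higher high school GPA lowers risk
    "logins_per_week": -0.15, # LMS engagement lowers risk
    "on_time_homework": -0.8, # fraction of assignments turned in on time
}
INTERCEPT = 4.0

def dropout_risk(student: dict) -> float:
    """Logistic model: probability in (0, 1) that a student drops out."""
    z = INTERCEPT + sum(WEIGHTS[k] * student[k] for k in WEIGHTS)
    return 1.0 / (1.0 + math.exp(-z))

engaged = {"hs_gpa": 3.8, "logins_per_week": 10, "on_time_homework": 0.9}
disengaged = {"hs_gpa": 2.4, "logins_per_week": 1, "on_time_homework": 0.3}

assert dropout_risk(engaged) < dropout_risk(disengaged)
```

The point of the sketch is the shape of the model, not the numbers: adding engagement variables such as `logins_per_week` to a baseline GPA-and-test-scores model is exactly the step UK’s statistician is taking, and a per-student probability is what lets advisers intervene with individuals rather than broad swaths.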
Crowdsourcing Analysis
As corporate data stores grow in volume, variety and complexity, they
Expert Voice
“Social media explosion has opened up new possibilities of understanding customer behavior. To achieve that, it is necessary to master the technique of analyzing large amounts of unstructured data” — Udayan Banerjee, CTO, NIIT Technologies
“Variety of data types, ubiquitous analysis tools, and high potential computing power at reasonable costs are some of the drivers that catapult advanced analytics to hold promise for the future” — KT Rajan, Director - Operations, IS, India & South Asia, Allergan
“We have been using BI and analytics for more than four years. Though the initiative was kicked off to support marketing campaigns, we are now in the process of rolling it across the enterprise” — Parvinder Singh, Corporate VP, Head - IT Services, Max Life Insurance
provide advanced analytics practitioners with more fodder for predictive models. But they aren’t always sure they’re making full use of all that information. Property and casualty insurer Allstate came to that conclusion when it started working with Kaggle, a Big Data crowdsourcing firm. With more than 16 million customers, Allstate has no shortage of data. It also has no shortage of analytics experts — a 50-person predictive modeling team analyzes that data for the company’s product research group, helping guide coverage and rates. Allstate’s analytics gurus are passionate about what they do, so much so that some of them participate in Kaggle competitions in their spare time. Founded in 2010, Kaggle organizes competitions in which businesses, government agencies and researchers present data sets and problems. Kaggle has had some 61,700 participants submit more than 171,915 entries to more than 65 competitions. Sponsors pay prize money in exchange for the intellectual property behind the winning model. In early 2011, Allstate’s VP of product research, Eric Huls, overheard some of his employees talking about a Kaggle competition and was captivated by the idea. Within weeks, Allstate hosted a competition for which it supplied three years of auto policy data and asked competitors to beat Allstate’s baseline model for predicting which covered autos would be involved in bodily injury claims and how much those claims would cost. The models were created and trained using two years of data and then tested for accuracy against the third year of data. All personally identifiable information was stripped from the data, but still included were plenty of details about the make, model, horsepower, weight and length of the vehicles, along with the cost of any bodily injury claims. In the vast majority of cases, there were no bodily injury claims, so the cost figure was zero. Several Kaggle competitors handily beat Allstate’s baseline, and the
winner beat Allstate’s model by 270 percent, meaning it was that much more accurate in predicting which vehicles would be involved in accidents and how much related bodily injury claims would cost. The insurer has since incorporated some of the techniques used in that winning model, Huls says. The appealing thing about Kaggle and crowdsourcing, Huls says, is that the competitors approach problems from many different angles. Kaggle’s top competitors include astronomers, hedge fund quants, statisticians, physicists, economists and mathematicians, and they don’t have preconceived “we do things this way” notions about how to solve a problem. “By putting some of these problems out there for others to work on, it prevents us from being more satisfied with the job that we’re doing than we should be,” Huls says.

Allstate started another Kaggle competition in September to study customer churn, but this time it’s using an invitation-only format, whereby a select group of 15 competitors was invited to join the commercially sensitive project. To participate, contestants had to sign non-disclosure agreements so that Allstate doesn’t have to worry about giving away its business secrets and it can share more data. There’s still no personally identifiable information involved but Allstate has included details on prices paid for policies, coverage plans and other products purchased, and whether those customers stuck with Allstate or switched to another insurer. Allstate also offered contestants more context, and detail on the methods it uses internally, and insight into what has and hasn’t worked in the past, Huls says.

“Advanced analytics is all about statistical analysis and predictive modeling — being able to see what’s coming and take action before it’s too late”

Where analytics and BI fit in
How does your company deploy, or plan to deploy, analytics and BI technologies?
We don’t plan on deploying: 3%
We have many products scattered throughout departments, operations and locations: 17%
We deploy them as part of other technology initiatives: 20%
We have standardized on one or a few analytics and BI products company-wide: 30%
We deploy them on a project-by-project basis: 30%
Data: InformationWeek 2013 Analytics, Business Intelligence and Information Management Survey of 417 business technology pros at companies using or planning to deploy data analytics, BI or statistical analysis software, October 2012
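The mechanics of a competition like Allstate’s can be sketched in a few lines: fit on two years of policy data, hold out the third year for scoring. Everything below — the field names, claim rates, and the simple bucket model — is a made-up illustration of the evaluation setup, not Allstate’s (or any winner’s) actual method.

```python
import random
from statistics import mean

random.seed(0)

def make_policy(year):
    # Synthetic stand-ins for the anonymized vehicle attributes in the
    # competition data (make, horsepower, weight, etc.) -- illustrative only.
    horsepower = random.randint(70, 400)
    # Most covered autos have no bodily-injury claim, so cost is usually zero.
    had_claim = random.random() < 0.03 + horsepower / 4000.0
    cost = random.uniform(1000, 50000) if had_claim else 0.0
    return {"year": year, "horsepower": horsepower, "cost": cost}

# Three years of policies: fit on the first two, score on the held-out third,
# mirroring the competition's train/test split.
policies = [make_policy(y) for y in (2008, 2009, 2010) for _ in range(2000)]
train = [p for p in policies if p["year"] < 2010]
test = [p for p in policies if p["year"] == 2010]

# Baseline model: predict the overall mean claim cost for every policy.
global_mean = mean(p["cost"] for p in train)

# A slightly richer model: condition the mean on a horsepower bucket.
buckets = {}
for p in train:
    buckets.setdefault(p["horsepower"] // 100, []).append(p["cost"])
bucket_mean = {b: mean(costs) for b, costs in buckets.items()}

def mae(predict):
    # Mean absolute error on the held-out third year.
    return mean(abs(predict(p) - p["cost"]) for p in test)

baseline_err = mae(lambda p: global_mean)
bucket_err = mae(lambda p: bucket_mean.get(p["horsepower"] // 100, global_mean))
print(f"baseline MAE {baseline_err:.0f}, bucket MAE {bucket_err:.0f}")
```

The point of the sketch is the harness, not the model: competitors could plug in any technique, but everyone was scored the same way against the same held-out year.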
From standardization to experimentation
Veteran analytics users like Allstate that are trying out crowdsourcing and other advanced techniques have come
a long way from 2009, when 47 percent of respondents to our survey said they had “standardized on one or a few BI tools deployed throughout the company.” That percentage has since declined to 30 percent of this year’s 417 survey respondents using or planning to use data analytics, BI or statistical analysis software. What’s more, fewer respondents now report their companies have a “standard BI platform.”

Why the reversal? For starters, the standardization movement started around the time of a significant consolidation of BI vendors that culminated in 2007 when SAP acquired Business Objects, Oracle acquired Hyperion and IBM acquired Cognos. Microsoft was moving aggressively into BI at that time, and all four of those companies were pushing customers to consolidate BI investments around their big, broadly capable BI suites. Market-share gains followed for the four vendors. Before they were acquired, BusinessObjects, Hyperion and Cognos had snapped up many smaller, innovative companies. A new wave of nimble and innovative BI companies — including QlikTech, Tableau Software and Tibco Spotfire — has since emerged, and they are now the fastest-growing BI vendors. IBM, Microsoft, Oracle and SAP hold more than half (53.2 percent) of the total BI software tools market, in terms of 2011 revenue, according to IDC. Their BI sales are also growing at double-digit rates, but they’re not growing as quickly as smaller, innovative rivals focusing on advanced data visualization and self-service BI (requiring minimal IT support).

SAS and IBM’s SPSS unit have long dominated the advanced analytics software category, with 35.2 percent and 16.8 percent of the 2011 market, respectively, according to IDC. Microsoft is third with just a 2.5 percent share. Dozens of smaller competitors each have less than 1 percent. Open source offerings such as the R project for statistical programming are also popular.
R is gaining commercial adoption, and that uptake hasn’t escaped the notice of BI vendors that
have since introduced R-based analytics modules tied to their BI suites. Information Builders was among the first to make this move, in 2009, with the release of its RStat module. Tibco Spotfire followed, along with Oracle and SAP.

What’s spurring interest in Big Data analysis?
What data sources or challenges are driving, or would drive, your company’s interest in using big data analysis?
Finding correlations across multiple, disparate data sources (clickstreams, geospatial, transactions, etc.): 52%
Predicting customer behaviour: 45%
Predicting product or service sales: 34%
Predicting fraud or financial risk: 28%
Identifying computer security risks: 23%
High-scale machine data from sensors, Web logs, etc.: 22%
Social network comments for consumer sentiments: 18%
Web clickstreams: 11%
My company isn’t interested in big data analytics: 15%
Data: InformationWeek 2013 Analytics, Business Intelligence and Information Management Survey of 417 business technology pros at companies using or planning to deploy data analytics, BI or statistical analysis software, October 2012
Big Data Ambitions
Where Allstate’s Kaggle competitions have involved highly structured data in subterabyte quantities, many Kaggle competitions involve Big Data — tens to hundreds of terabytes of highly variable data with inconsistent structures, complex data such as click streams and log files, and minimally structured data such as text and social network data. This is the fast-growing world of Big Data that many analytics and BI practitioners are eager to explore. What are the top reasons our survey respondents are interested in Big Data? To find correlations across disparate data sources, predict customer behavior, and predict product or service sales. Their biggest concerns? Scarcity of expertise, expense of platforms and lack of a clear business case. Hand in hand with the Big Data trend is a debate about the future of data warehousing, analytics and BI,
sectors where relational databases and the SQL query language have held sway for more than 30 years. Companies are now experimenting with Hadoop and NoSQL databases, which let them work with unstructured, variable and complex data without all of the data modeling steps that relational databases require. Many of those adopting these new platforms are trying out new tools as well as new vendors, such as Datameer and Karmasphere. They’re also raising concerns about finding people with the skills to use these tools. As companies embrace new platforms such as Hadoop and NoSQL databases, today’s relational platforms will likely be focused even more on specialized analytical tasks and applications. Relational databases don’t adapt quickly or well in places where new data types, such as complex data and varied data, are constantly showing up. And where data volumes are extreme, Hadoop’s lower costs will make it a winner. That means there will be a big opportunity for new analytics and BI tools built on Hadoop. These changes are likely to take another five years, but they’re coming. Source: InformationWeek USA
Cover Story
Big Data, Big Questions
Sears is embracing Hadoop to get a grip on its Big Data. Can emerging tech fit in a legacy infrastructure and help Sears compete? By Doug Henschen
Like many retailers, Sears Holdings, the parent of Sears and Kmart, is trying to get closer to its customers. At Sears’ scale, that requires big-time data analysis capabilities, but three years ago, Sears’ IT wasn’t really up to the task. “We wanted to personalize marketing campaigns, coupons, and offers down to the individual customer, but our legacy systems were incapable of supporting that,” says Phil Shelley, Sears’ executive VP and CTO, in a meeting with InformationWeek editors and his team at company headquarters in suburban Chicago.

Improving customer loyalty, and with it sales and profitability, is desperately important to Sears as it faces fierce competition from Wal-Mart and Target, as well as online retailers such as Amazon.com. While revenue at Sears has declined, from USD 50 billion in 2008 to USD 42 billion in 2011, big-box rivals Wal-Mart and Target have grown steadily, and they’re far more profitable. Meantime, Amazon has gone from USD 19 billion in revenue in 2008 to USD 48 billion last year, passing Sears for the first time.

A Shop Your Way Rewards membership program started by Sears in 2011 is part of a five-part strategy to get the company back on track. Behind the scenes is a cutting-edge implementation of Apache Hadoop, the
high-scale, open source data processing platform driving the Big Data trend. Despite Sears’ less-than-cutting-edge reputation as a retailer, the company has been an innovator in using Big Data. In fact, Shelley is leading a Sears subsidiary, MetaScale, that’s pitching services to help companies outside retail use Hadoop. But will companies be interested in buying Big Data cloud and consulting services from Sears? And can Sears’ own Big Data efforts help the company regain its footing in the retail industry?
Fast and Agile
Sears’ process for analyzing marketing campaigns for loyalty club members used to take six weeks on mainframe, Teradata, and SAS servers. The new process running on Hadoop can be completed weekly, Shelley says. For certain online and mobile commerce scenarios, Sears can now perform daily analyses. What’s more, targeting is more granular, in some cases down to the individual customer. Whereas the old models made use of 10 percent of available data, the new models run on 100 percent. “The Holy Grail in data warehousing has always been to have all your data in one place so you can do big models on large data sets, but that hasn’t been feasible either economically or in terms of technical capabilities,” Shelley says, noting that Sears previously kept data
anywhere from 90 days to two years. “With Hadoop, we can keep everything, which is crucial because we don’t want to archive or delete meaningful data.” Sears is still the largest appliance retailer and appliance service provider in the U.S., for example, so it’s in a strong position to understand customer needs, service trends, warranty problems, and more. But Sears has only been scratching the surface of using available data. Enter Hadoop, an open source data processing platform gaining adoption on the strength of two promises: ultrahigh scalability and low cost compared with conventional relational databases. Hadoop systems at 200 terabytes cost about one-third of 200-TB relational platforms, and the differential grows as scale increases into the petabytes, according to Sears. With Hadoop’s massively parallel processing power, Sears sees little more than one minute’s difference between processing 100 million records and 2 billion records. The downside of Hadoop is that it’s an immature platform, perplexing to many IT shops, and Hadoop talent is scarce. Sears learned Hadoop the hard way, by trial and error. It had few outside experts available to guide its work when it embraced the platform in early 2010. The company is now in the enviable position of having Big Data experience among its employees in the U.S. and India. MetaScale will leverage Sears’ data center capacity in Chicago and
Detroit, just as Amazon Web Services takes advantage of Amazon’s massive e-commerce compute capacity.
Open Source Moves In
Sears’ embrace of an open source stack began at the operating system level, with Linux. Sears routinely replaces legacy Unix systems with Linux rather than upgrade them, Shelley says, and it has retired most of its Sun and HP-UX servers. Microsoft server and development technologies are also on the way out.

Moving up the stack, Sears is consolidating its databases to MySQL, InfoBright, and Teradata — EMC Greenplum, Microsoft SQL Server, and Oracle (including four Exadata boxes) are on their way out, Shelley says. Hadoop’s power comes from dividing workloads across many commodity Intel x86 servers, each with multiple CPUs and each CPU with multiple processor cores.

Since early 2010, Sears has been moving batch data processing off its mainframes and into Hadoop. Cost is the big motivator, as mainframe MIPS cost anywhere from USD 3,000 to USD 7,000 per year, Shelley says, while Hadoop costs are a small fraction of that. Sears says it has surpassed its initial target to reduce mainframe costs by USD 500,000 per year, while also delivering “at least 20, sometimes 50, up to 100 times better performance on batch times,” Shelley says. Eliminating all of the mainframes in use would enable it to save “tens of millions” of dollars, he says.
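A batch job of the kind Sears describes — say, aggregating point-of-sale records per customer — can be sketched in the Hadoop Streaming style, where the mapper and reducer are plain scripts that read and write key-value lines. The record layout below is a hypothetical example; on a real cluster, Hadoop performs the sort/shuffle between the two functions across many machines.

```python
from itertools import groupby

def mapper(lines):
    # Emit (customer_id, amount) for each point-of-sale record.
    # Hypothetical field layout: customer_id,store_id,amount
    for line in lines:
        customer, _store, amount = line.strip().split(",")
        yield customer, float(amount)

def reducer(pairs):
    # Hadoop delivers mapper output grouped by key after the shuffle;
    # sorting locally stands in for that step here.
    for customer, group in groupby(sorted(pairs), key=lambda kv: kv[0]):
        yield customer, sum(amount for _, amount in group)

sample = ["c1,s1,10.0", "c2,s1,5.5", "c1,s2,2.5"]
totals = dict(reducer(mapper(sample)))
print(totals)  # {'c1': 12.5, 'c2': 5.5}
```

Because each mapper and reducer sees only its own slice of the data, the same two functions scale from three sample lines to billions of records simply by adding nodes — which is where the cost advantage over mainframe batch jobs comes from.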
‘ETL Must Die’
Sears’ move to Hadoop began as an experiment using a single node running on a netbook computer — the netbook that still sits on Shelley’s office desk. Sears deployed its first production cluster of 20 to 30 nodes in early 2010. A major Big Data processing bottleneck then was extract, transform, and load processing, and Shelley has become a zealot about eliminating ETL. “ETL is an antiquated technique, and for large companies it’s inefficient and wasteful because you create multiple copies of data,” he says. “Everybody used ETL because they couldn’t put everything in one place, but that has changed with Hadoop, and now we copy data, as a matter of principle, only when we absolutely have to copy.”

Sears can’t eliminate ETL overnight, so it has been moving the slowest and most processing-intensive steps within ETL jobs into Hadoop. Shelley cites an ETL process that took 20 hours to run using IBM DataStage software on a cluster of distributed servers. One step that took 10 hours to run in DataStage now can run in 17 minutes on Hadoop, he says. One downside: It takes 90 minutes to FTP the job to Hadoop and then bring results back to the ETL servers. That FTP time is a trade-off in Sears’ approach of picking off one ETL step at a time. Shelley intends to keep moving steps in that process until the entire data transformation workload is on Hadoop. “The reason we do it this way is you get a very big hit quickly,” he says, noting it takes less than two weeks to get each step into production. Shelley vows to get rid of ETL eventually, “but you do it in a very nondisruptive, nonscary way for the business.”

SEARS AT A GLANCE
Bricks And Clicks: Sears Holdings includes Sears and Kmart stores (more than 2,600 stores) and Lands’ End. This year, it spun off Sears Hometown and Outlet stores.
Digital Ties: Its Shop Your Way loyalty program offers points and promotions.
IT Operations: IT staff in the U.S. and India, with data centers in Chicago and Detroit areas.
Hadoop Startup: MetaScale will run Hadoop clusters — in Sears’ data center or remotely — on subscription basis, plus do consulting.

Shelley’s “ETL must die” view has its doubters. Coming to the defense of
ETL, Mike Olson, CEO of Cloudera, the leading Hadoop software distributor, recently told InformationWeek, “Almost without exception, when we see Hadoop in real customer deployments, it is stood up next to existing infrastructure that’s aimed at existing business problems.” Shelley sees Hadoop as part of a larger IT ecosystem, too, and says systems such as Teradata will continue to have an important, focused role at Sears. But he’s on the far end of the spectrum in terms of how much of the legacy environment Hadoop might replace. Countering Shelley’s sometimes sweeping predictions of legacy system replacement, Olson says: “It’s unlikely that a brand-new entrant to the market [like Hadoop] is going to displace tools for established workloads.”
Scaling Out
Sears’ main Hadoop cluster has nearly 300 nodes, and it’s populated with 2 PB of data — mostly structured data such as customer transaction, point-of-sale, and supply chain records. (Hadoop keeps two additional copies of the data by default, so the total environment is 6 PB.) To give a sense of how early Sears was to Hadoop development, Wal-Mart divulged early this year that it was scaling out an experimental 10-node Hadoop cluster for e-commerce analysis. Sears passed that size in 2010.

Sears now keeps all of its data down to individual transactions (rather than aggregates) and years of history (rather than imposing quarterly windows on certain data, as it did previously). That’s raw data, which Shelley says Sears can refactor and combine as needed quickly and efficiently within Hadoop. Hadoop isn’t a science project at Sears — critical reports run on the platform, including financial analyses; SEC reporting; logistics planning; and analyses of supply chains, products, and customer data.

For ad hoc query and analysis, Sears uses Datameer, a spreadsheet-style tool that supports data exploration and visualization directly on Hadoop, without copying or moving data. Using Datameer, Sears can develop in three days interactive reports that used to take IT six to 12
weeks, Shelley says. The old approach required intensive IT support for ETL, data cubing, and associated testing. Now line-of-business power users are developing most of the new reports.
The MetaScale Mission
Shelley is still CTO of Sears, but if his portrayal of all the things Hadoop can do sounds a bit rosy, keep in mind that he’s also now CEO of MetaScale, a division that Sears is hoping will make money from the company’s specialized big data expertise. The rarest commodity that MetaScale offers is Sears’ experience in bringing mainframe data into the Hadoop world. Old-school Cobol programmers at Sears were initially Hadoop skeptics, Shelley says, but many turned out to be eager and highly skilled adopters of the Pig language for running MapReduce jobs on Hadoop. Tasks that required
3,000 to 5,000 lines of Cobol can be reproduced with a few hundred lines of Pig, he says. The company learned how to load data from IMS (mainframe) databases into Hadoop and bring result sets back into mainframe apps. That’s not trivial work because it involves a variety of compressed data format transformations, and packing and unpacking of data. MetaScale’s business model is to run Hadoop clusters for other companies as a subscription cloud service in Sears’ data center. Or Sears will remotely manage clusters in a customer’s data center, a setup that two early customers, one in healthcare and the other in financial services, both want for regulatory reasons. Monthly fees are based on the volume of terabytes supported, and customers can buy out deployments if they want to take them over and run them themselves. MetaScale also offers data
architecture, modeling, and management services and consulting. The big idea behind Hadoop is to bring in as much data as possible while keeping data structures simple. “People want to overcomplicate things by representing data and dividing things up into separate files,” says Scott LaCosse, Director of Data Management at Sears and MetaScale. “The object is not to save space, it’s to eliminate joins, denormalize the data, and put it all in one big file where you can analyze it.”
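LaCosse’s point can be illustrated with a toy sketch (table and field names invented): instead of joining lookup tables at query time, copy their attributes onto every transaction up front, producing one wide record set that can be scanned without joins.

```python
# Hypothetical lookup tables that would each be a separate relational table.
customers = {"c1": {"segment": "loyalty"}, "c2": {"segment": "walk-in"}}
products = {"p9": {"category": "appliance"}}

transactions = [
    {"customer": "c1", "product": "p9", "amount": 499.0},
    {"customer": "c2", "product": "p9", "amount": 349.0},
]

# Denormalize: copy the lookup attributes onto every transaction so the
# resulting records can be scanned in one pass, with no joins at read time.
wide = [
    {**t,
     "segment": customers[t["customer"]]["segment"],
     "category": products[t["product"]]["category"]}
    for t in transactions
]
print(wide[0]["segment"], wide[0]["category"])  # loyalty appliance
```

The wide file is bigger than the normalized tables, which is exactly the trade LaCosse describes: on Hadoop, storage is cheap and the goal is eliminating joins, not saving space.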
Counterintuitive Tactic
It’s an approach that’s counterintuitive for a SQL veteran, so a big part of MetaScale’s work is to help customers change their thinking: You apply schema as you pull data out to use it, rather than take the relational database approach of imposing a schema on data before it’s loaded onto the platform. Hadoop holds data in its raw form, giving users
the flexibility to combine and examine the data in many ways over time. “If in three years you come up with a new query or analysis, it doesn’t matter because there’s no schema,” Shelley says. “You just go get the raw data and transform it into any format you need.” For all of Shelley’s boldness about replacing legacy systems, he’s careful to describe Hadoop as part of an ecosystem. Sears still uses Teradata and InfoBright, for example, when applications call for fast analysis. But Hadoop is the center of Sears’ data management strategy, handling the large-scale heavy lifting, while relational tools take tactical roles. So where should Hadoop adopters begin? “You have to go fast and be bold without taking stupid risks,” Shelley says. Start with a business need “that causes enough pain that people will notice and they’ll see tangible benefits.”

Sears itself still has a lot to prove with its own use of Hadoop to solve huge business problems, such as offering customers personalized promotions. Shelley cites plenty of conceptual uses of Hadoop, and he sprinkles in details on speed-and-feed gains, but he doesn’t offer clear cases of tangible benefits the retailer has realized. The company is well along in adopting Hadoop and in developing specialized expertise that might benefit MetaScale customers — particularly those using mainframes — but will Hadoop really help turn Sears around? Sears’ latest results for the quarter ended July 28 show that earnings before interest, taxes, depreciation, and amortization were up 163 percent, to USD 153 million, from USD 58 million in the year-earlier quarter. But same-store sales were down 2.9 percent at Sears and 4.7 percent at Kmart. Sears’ spin is that it’s selling fewer items more profitably, which could be in part because of smarter targeting and promotion. But Sears can’t shrink its way back to greatness. As Wal-Mart and Target gain share, their buying power and ability to press Sears on margins only grows.

Would-be MetaScale customers in other industries will face different challenges as they consider embracing Hadoop. Could quick analytical access to an entire decade of medical record data change how doctors diagnose and treat patients? Could faster processing spot financial services fraud more effectively? Companies are focused on choosing and building out the next-generation platforms that will handle those Big Data jobs. Will Hadoop be that platform, and will Hadoop help turn MetaScale into a successful pioneer? That’s a story that has yet to unfold.

Source: InformationWeek USA

COMMENTARY
Is Sears’ Big Data startup a distraction?

Given all of the headwinds Sears faces, should its CTO be spending much of his time building a startup? Sears Holdings had operating losses four of the last five quarters. Sears Chairman Eddie Lampert, whose hedge fund owns more than 60 percent of the company, started his chairman’s letter back in February with this assessment: “Our poor financial results in 2011, culminating in a very poor fourth quarter, underscore the need to accelerate the transformation of Sears Holdings.” In that environment, launching a technology startup is risky, given the potential for distraction from meeting the IT needs of a USD 42 billion-a-year retail business that includes Sears and Kmart.

With the MetaScale venture, CTO Phil Shelley is looking to take advantage of Sears’ broad experience with using the Hadoop Big Data platform. MetaScale sells subscription services to manage large data sets using Hadoop, and it offers Big Data consulting. Shelley doesn’t see such a startup as risky. Lampert, he says, sets the expectation that executives need to make such moves. “It’s a very innovative environment,” Shelley says. “The concept of generating new business, a new business model, a whole new business, is very much encouraged.”

Sears is doing cutting-edge work when it comes to Hadoop and Big Data management. Some of the most practical work — work that other big companies might buy as a service — is moving big batch processing data loads off of mainframes, cutting hours of processing time. Applied to Sears’ own business, eliminating mainframes could save tens of millions of dollars. But when it comes to applying Big Data tactics to change Sears, it feels like Sears could be doing more and moving faster. One of the most promising areas is for customized marketing promotions, using Sears’ growing loyalty card program to know what customers bought in the past and what they’re now buying, and to give them an intensely relevant offer to get them to buy more. Sears is just beginning to do that kind of personalization at scale. “You’re starting to see that much more personal, targeted, digital engagement,” Shelley says. “It’s a big company to change, so it will take awhile, but it is changing.”

A critical advantage of Hadoop, Shelley says, is its ability to let companies keep and analyze all of their data. Whereas Sears used to analyze 10 percent of data on customers to figure out which promotions might work, now it can analyze all data on them. Because it’s cheaper and faster to keep and analyze data, he says, it’s collecting more of it — such as data coming into Sears’ call center about which appliances are breaking and how often. But Shelley doesn’t offer a clear example of how Sears is putting that kind of data to profitable use.

One argument in favor of Sears doing a startup like MetaScale is that Amazon.com, the most feared company in retail today, is doing its own tech startups. The e-commerce giant’s Amazon Web Services arm pioneered the sale of commodity infrastructure-as-a-service, letting companies buy computing capacity by the hour with only a credit card. Sears is attacking a niche in the cloud computing market: high-end, specialized Hadoop workloads. A more powerful argument in favor of MetaScale is that it forces Sears to stay on the leading edge of Big Data management and analytics, and lets it learn from big companies that are innovating in other industries while also driving some revenue. Shelley won’t say how many clients MetaScale has, but he refers to a major healthcare company and another in financial services.

Retail Must Change

You can’t walk into a Sears store today and really feel how Shelley’s Big Data efforts have changed the shopping experience. But Sears isn’t alone — up against this challenge is every single big retailer: Best Buy, J.C. Penney, Wal-Mart, Target, Home Depot, Lowe’s. Every one of them needs to figure out how to make in-store shopping so appealing that customers come to their stores to make purchases rather than to just look around and then buy from discount competitors online. It’s no exaggeration to say that Sears’ survival hinges on its ability to figure out how to serve customers across store, Web, and mobile channels. Lampert, in his chairman’s letter, listed the five pillars of Sears’ business. One is “reinventing the company continuously through technology and innovation.” Lampert said he spends more of his time on that pillar than any other. He realizes that people will soon, instinctively, reach for their smartphones as they shop in stores. “How people shop today is changing, and it isn’t just the younger generation that is benefiting from iPads, Facebook, and online retail,” Lampert wrote.

Retailers have yet to take truly daring steps to create this cross-channel experience. But imagine stores geofenced by Wi-Fi, so that loyalty card customers’ smartphones automatically notify the store when they walk in. Sounds creepy at first, but that’s exactly what many people have set up on Amazon. Give people a compelling reason to set that kind of functionality up in a physical store — say, to receive customized offers on their phones — and they will. How about changing prices multiple times a day? Sears is building the data analytics necessary to make those kind of dynamic price and inventory decisions; like other retailers, it’s also experimenting with digital price signs in stores that would make such changes feasible. Again, dynamic pricing would be a dramatic change, but perhaps one needed to compete with online retailing.

Retailers’ survival depends on these kinds of changes in the mobile e-commerce era. If Sears’ MetaScale work helps it figure out the omni-channel shopper, its startup will succeed. If it doesn’t, or it distracts Sears from this mission, it will have failed because there won’t be a Sears or Kmart left.

— Chris Murphy
Cover Story
From Amazon to Splunk, here’s a look at the Big Data innovators that are now pushing Hadoop, NoSQL and Big Data analytics to the next level By Doug Henschen
There are leaders and there are followers in the Big Data movement. This collection comprises a baker’s dozen leaders. Some, like Amazon, Cloudera and 10Gen, were there at the dawn of the Hadoop and NoSQL movements. Others, like Hortonworks and Platfora, are newcomers, but draw on deep experience.

The three big themes you’ll find in this collection are Hadoop maturation, NoSQL innovation and analytic discovery. The Hadoop crowd includes Cloudera, Hortonworks and MapR, each of which is focused entirely on bringing this Big Data platform to a broader base of users by improving reliability, manageability and performance. Cloudera and Hortonworks are improving access to data with their Impala and HCatalog initiatives, respectively, while MapR’s latest push is improving HBase performance. The NoSQL set is led by 10Gen, Amazon, CouchBase, DataStax and Neo Technologies. These are the developers and support providers behind MongoDB, DynamoDB, CouchBase, Cassandra and Neo4j, respectively, which are the leading document, cloud, key-value, column and graph databases. Big Data analytic discovery is still in the process of being invented, and the leaders here include Datameer, Hadapt, Karmasphere, Platfora and Splunk. The first four have competing visions of how we’ll analyze data in Hadoop, while the last specializes in machine-data analysis.

What you won’t find here are old-guard vendors from the relational database world. Sure, some of those big-name companies have been fast followers. Some even have software distributions and have added important capabilities. But are their hearts really in it? In some cases, you get the sense that their efforts are window dressing. There are vested interests — namely license revenue — in sticking with the status quo, so you just don’t see them out there aggressively selling something that just might displace their cash cows. In other
cases, their ubiquitous connectors to Hadoop seem like desperate ploys for some Big Data cachet. For many users, the key issues include flexibility, speed and ease of use. And it isn’t clear that any single product or service can offer all of those capabilities at the moment.

We’re still in the very early days of the Big Data movement, and as the saying goes, the pioneers might get the arrows while the settlers get the land. In our eyes, first movers like Amazon and Cloudera already look like settlers, and more than a few others on this list seem to have solid foundations in place. As we’ve seen before, acquisitions could change the Big Data landscape very quickly. But as of now, these are 13 Big Data pioneers that we’re keeping our eyes on in 2013.
1. 10Gen
10Gen is the developer and commercial support provider behind open source MongoDB. Among six NoSQL databases highlighted in this roundup (along with DynamoDB, Cassandra, HBase, CouchBase and Neo Technologies), MongoDB is distinguished as the leading document-oriented database. As such it can handle semi-structured information encoded in JSON (JavaScript Object Notation), XML or other document formats. The big attraction is flexibility, speed and ease of use, as you can quickly embrace new data without the rigid schemas and data transformations required by relational databases.

MongoDB is not the scalability champion of the NoSQL set, but 10Gen is working on that. In 2012 it introduced MongoDB 2.2, which added a real-time aggregation framework, new sharding and replication features for multi-data center deployments, and improved performance and database concurrency for high-scale deployments. The data aggregation framework fills an analytics void by letting users directly query data within MongoDB without using complicated batch-oriented MapReduce jobs. CouchBase plans to step up competition with MongoDB by way of JSON support, but we’re sure 10Gen and the MongoDB community will step up to improve scalability and performance in 2013.

“We’re still in the very early days of the Big Data movement. First movers like Amazon and Cloudera already look like settlers, and seem to have solid foundations in place”

2. Amazon

Amazon is about as big a Big Data practitioner as you can get. It’s also the leading Big Data services provider. For starters, it introduced Elastic MapReduce (EMR) more than three years ago. Based on Hadoop, EMR isn’t just a service for MapReduce sandboxes; it’s being used for day-to-day high-scale production data processing by businesses including Ticketmaster and DNA researcher Ion Flux. Amazon Web Services upped the Big Data ante in 2012 with two new services: Amazon DynamoDB, a NoSQL database service, and Amazon
Redshift, a scalable data warehousing service now in preview and set for release early next year. DynamoDB, the service, is based on Dynamo, the NoSQL database that Amazon developed and deployed in 2007 to run big parts of its massive consumer website. Needless to say, it’s proven at high scale. Redshift has yet to be generally available, but Amazon is promising 10 times faster performance than conventional relational databases at one-tenth the cost of on-premises data warehouses.
january 2013 i n f o r m at i o n w e e k 33
With costs as low as USD 1,000 per terabyte, per year, there’s no doubt Redshift will see adoption. These three services are cornerstones for exploiting Big Data, and don’t forget Amazon’s scalable S3 storage, EC2 compute capacity and myriad integration and connection options for corporate data centers. In short, Amazon has been a Big Data pioneer, and its services appeal to more than just startups, SMBs and Internet businesses.
3
Cloudera
Cloudera is the number one provider of Hadoop software, training and commercial support. From this position of strength, Cloudera has sought to advance the manageability, reliability and usability of the platform. During 2012, the discussion turned from convincing the broad corporate market that Hadoop is a viable platform to convincing people that they can gain value from the masses of data on a cluster. But to do that, we’ll need to get past one of Hadoop’s biggest flaws: the slow, batch-oriented nature of MapReduce processing. Tackling the problem head on, Cloudera has introduced Impala, an interactive-speed SQL query engine that runs on the existing Hadoop infrastructure. Two years in development and now in beta, Impala promises to make all the data in the Hadoop Distributed File System (HDFS) and Apache HBase database tables accessible for real-time querying. Unlike Apache Hive, which offers a degree of SQL querying of Hadoop, Impala is not dependent on MapReduce processing, so it should be much faster. There’s a lot riding on Impala. What’s not yet clear is whether it will mostly work with conventional relational tools or whether it will cut many of them out of the picture. Thus, all eyes will be on Cloudera in 2013.
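To see why batch-oriented MapReduce makes interactive analysis painful, and why engines like Impala bypass it, consider the classic word-count job modeled below in plain Python. This is an illustrative sketch of the MapReduce programming pattern, not Hadoop code: every record flows through a map phase, all intermediate output is grouped in a shuffle step, and only then can the reduce phase produce results, which is why ad hoc queries pay a heavy latency cost.

```python
from collections import defaultdict

def map_phase(records):
    # Emit (key, value) pairs for every input record.
    for line in records:
        for word in line.split():
            yield (word.lower(), 1)

def shuffle(pairs):
    # Group all intermediate values by key. On a real cluster this is
    # the disk- and network-heavy step between mappers and reducers.
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    # Aggregate each key's values; this runs only after the *entire*
    # map output has been grouped, hence the batch latency.
    return {key: sum(values) for key, values in groups.items()}

def word_count(records):
    return reduce_phase(shuffle(map_phase(records)))

counts = word_count(["big data big plans", "big query"])
```

On a real cluster the shuffle step spans many machines, which is where most of the batch latency comes from, and which is exactly the stage an interactive query engine avoids.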
4
Couchbase
A top contender in the NoSQL movement, Couchbase is a key-value store that is chosen for its scalability, reliability and high performance.
As such, it’s used by Internet giants (Orbitz), gaming companies (Zynga), and a growing flock of brick-and-mortar companies (Starbucks). These and other customers need to scale up much more quickly and affordably than is possible with conventional relational databases. Couchbase is the developer and commercial support provider behind the open-source database of the same name. Key-value stores tend to be simple, offering little beyond record storage. With Couchbase 2.0, set for release in mid-December, Couchbase is looking to bridge the gap between key-value store and document database, the latter being MongoDB’s domain. Couchbase 2.0 adds support for storing JSON (JavaScript Object Notation) documents, and it adds tools to build indexes and support querying. These basics may not wow MongoDB fans used to myriad developer-friendly features, but Couchbase thinks scalability and performance will win the day. Look for a pitched battle in 2013.
5
Datameer
Having lots of data is one thing. Storing it all in one scalable place, like Hadoop, is better. But the real value in Big Data is being able to structure, explore and make use of that data without delay. That’s where Datameer comes in. Datameer’s platform for analytics on Hadoop provides modules for data integration (with relational databases, mainframes, social network sources and so on), a spreadsheet-style data analysis environment and a development-and-authoring environment for creating dashboards and data visualizations. The big draw is the spreadsheet-driven data analysis environment, which provides more than 200 analytic functions, from simple joins to predictive analytics. Datameer customer Sears Holdings reports that it can develop in three days interactive reports that would take six to 12 weeks to develop using conventional OLAP tools. What’s more, the spreadsheet-style interface gives business users a point-and-click tool for analyzing data within Hadoop. Through a recent partnership with Workday, Datameer is poised to embed its capabilities into that cloud vendor’s enterprise applications. We’ll be watching for breakthrough results.
6
DataStax
Apache Cassandra is an open-source, columngroup style NoSQL database that was developed by Facebook and inspired by Amazon’s Dynamo database. DataStax is a software and commercial support provider that can implement Cassandra as a standalone database, in conjunction with Hadoop (on the same infrastructure) or with Solr, which offers full-text-search capabilities from Apache Lucene. The combination of Cassandra and Hadoop on the same cluster is attractive. There are some performance tradeoffs in the bargain, but Cassandra as implemented by DataStax offers a few scalable and cost-effective options. A big appeal with this NoSQL database is CQL (Cassandra Query Language) and the JDBC driver for CQL, which provide SQL-like querying and ODBC-like data access, respectively. Implemented in combination with Hadoop, you can also use MapReduce, Hive, Pig and Sqoop. Use of Solr is separate from Hadoop, but capabilities include full-text search, hit highlighting, faceted search, and geospatial search.
The two biggest threats to Cassandra, and thus to DataStax, are HBase (now used by Facebook) and DynamoDB, Amazon’s cloud-based service based on Dynamo. The bigger threat appears to be HBase, as the entire Hadoop community is working on maturing that Hadoop component into a stable, high-performance, easy-to-manage NoSQL database that’s available as part of the same platform. Success will likely take some of the wind out of Cassandra’s sails (and out of DataStax’s sales). For now, HBase is still perceived as green, while DataStax customers like Constant Contact, Morningstar and Netflix attest to stability, scalability and performance on Cassandra today.
7
Hadapt
Hadapt was hip to the need for business intelligence and analytics on top of Hadoop before its first round of funding in early 2011. Hive, the Apache data warehousing component that runs on top of Hadoop, relies on slow, batch-oriented MapReduce processing. Hadapt works around that delay by adding a hybrid storage layer to Hadoop that provides relational data access. From there you can do SQL-based analysis of massive data sets using SQL-like Hadapt Interactive Query. The software automatically splits query execution between the Hadoop and relational database layers, delivering the speed of relational tools with the scalability of Hadoop. There’s also a development kit for creating custom analytics, and you can work with popular, relational-world tools such as Tableau software. Hadapt is in good company, with Cloudera (Impala), Datameer, Karmasphere, Platfora and others all working on various ways to meet the same analytics-on-Hadoop challenge. It remains to be seen which of these vendors will be a breakout success in 2013.
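Stripped to its essence, the analytics-on-Hadoop challenge these vendors are tackling is running relational-style operations (joins, filters, aggregations) over data stored in HDFS. The plain-Python sketch below shows the relational side of that workload, a hash join followed by a group-by rollup, on invented sample data; products like Hadapt push exactly this kind of work down to where the data lives.

```python
def inner_join(left, right, key):
    # Hash join: index the right-hand rows by key, then match left rows.
    index = {}
    for row in right:
        index.setdefault(row[key], []).append(row)
    joined = []
    for row in left:
        for match in index.get(row[key], []):
            merged = dict(row)
            merged.update(match)
            joined.append(merged)
    return joined

# Invented sample data: order rows joined to store metadata.
orders = [
    {"store": "S1", "amount": 120.0},
    {"store": "S1", "amount": 80.0},
    {"store": "S2", "amount": 200.0},
]
stores = [
    {"store": "S1", "region": "North"},
    {"store": "S2", "region": "South"},
]

# Group-by rollup: total order value per region.
totals = {}
for row in inner_join(orders, stores, "store"):
    totals[row["region"]] = totals.get(row["region"], 0.0) + row["amount"]
```

At Big Data scale the hard part is not the logic but executing it across a cluster, which is why each vendor’s split between the Hadoop layer and the relational layer matters so much.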
8
Hortonworks
Hortonworks is the youngest provider of Hadoop software and commercial support, but it’s an old hand when it comes to working with the platform. The company is a 2011 spinoff of Yahoo, which remains one of the world’s largest users of Hadoop. In fact, Hadoop was essentially invented at Yahoo, and Hortonworks retained a team of nearly 50 of its earliest and most prolific contributors to Hadoop. Hortonworks released its first product, Hortonworks Data Platform (HDP) 1.0, in June. Unlike those from rivals Cloudera and MapR, Hortonworks’ distribution is composed entirely of open source Apache Hadoop software. And while Hortonworks’ rivals claim higher performance (MapR) or are shipping components that are not yet sanctified by Apache (Cloudera), Hortonworks says its platform is proven and enterprise-ready. Hortonworks isn’t leaving it up to others to innovate. The company led the development of the HCatalog table management service, which is aimed at the problem of doing analytics against the data in Hadoop. Teradata is an early adopter of HCatalog and a major partner for Hortonworks. Microsoft is another important partner, and it tapped Hortonworks to create a version of Hadoop (since contributed to open source) that runs on Windows. With partners like these and its influential team of contributors, there’s little doubt Hortonworks will be a big part of Hadoop’s future.
9
Karmasphere
Karmasphere provides a reporting, analysis and data-visualization platform for Hadoop. The company has been helping data professionals mine and analyze web, mobile, sensor and social media data in Hadoop since 2010. The software also is available as a service on Amazon Web Services for use in conjunction with Elastic MapReduce. Karmasphere uses Hive, the data warehousing component built on top of Hadoop. The company concedes that Hive has its flaws, like lack of speed tied to MapReduce batch processing. But Karmasphere is integrating its software with the Cloudera Impala real-time query framework as one way around those flaws. “Impala dramatically improves speed-to-insight by enabling users to perform real-time, interactive analysis directly on source data stored in Hadoop,” stated Karmasphere in an October announcement about the partnership. We’ll see how quickly Impala will mature from private beta testing to proven production use, but if it delivers as promised, Karmasphere and others will see a huge leap forward in low-latency Big Data analysis.
10
MapR
MapR’s guiding principles are practicality and performance, so it didn’t think twice about chucking the Hadoop Distributed File System out of its Hadoop software distribution. HDFS had (and still has, MapR argues) reliability and availability flaws, so MapR uses the proven Network File System (NFS) instead. In the bargain, MapR claims to get “twice the speed with half the required hardware.” The NFS choice also enabled MapR to support near-real-time data streaming using messaging software from Informatica. MapR competitors Cloudera and Hortonworks can’t stream data because HDFS is an append-only system. MapR’s latest quest for better performance (regardless of open source consequences) is the M7 software distribution, which the vendor says delivers high-performance Hadoop and HBase in one deployment. Many users have high hopes for HBase because it’s the NoSQL database native to the Apache Hadoop platform (promising database access to all the data on Hadoop). But HBase is immature and still suffers from flaws, including instability and cumbersome administration. M7 delivers two times faster performance than HBase running on standard Hadoop architectures, says MapR, because the distribution does away with region servers, table splits and merges and data compaction steps. MapR also uses its proprietary infrastructure to support snapshotting, high availability and system recovery for HBase. If you’re an open source purist swayed by arguments about system portability, MapR may not be the vendor for you. But we’ve talked to high-scale customers who have chosen MapR for better performance. Want to give it a try? MapR is available both on Amazon Web Services and the Google Compute Engine.
11
Neo Technologies
“Social applications and graph databases go together like peanut butter and jelly,” says graph database consultant Max De Marzi. That’s the big reason Neo4j, the open source graph database developed and supported by Neo Technologies, has a unique place in the NoSQL world. Neo4j is used to model and query highly complex interconnected networks with an ease that’s not possible with relational databases or other NoSQL products. Other NoSQL databases may excel in dealing with ever-changing data, but graph databases shine in dealing with ever-evolving relationships. In social network applications you can model and query the ever-changing social graph. In IT and telecom network scenarios you can quickly resolve secure access challenges. In master data management applications you can see changing relationships among data. And in recommendation-engine apps you can figure out what people want before they know they want it. Neo4j is a general-purpose graph database that can handle transaction processing or analytics, and it’s compatible with leading development platforms including Java, Ruby, Python, Groovy and others. Neo Technology is working on scale-out architecture, but even without that, Neo4j can manage and reveal billions of relationships. As social and network applications multiply, Neo Technology is in a prime position to manage the future.
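A toy example makes the graph database appeal concrete. The plain-Python sketch below models a tiny social graph as adjacency sets and answers a “friends of friends” recommendation query; the names and data are invented, and in Neo4j the same question would be a short traversal query rather than hand-written code.

```python
# Toy social graph as adjacency sets. In Neo4j these would be nodes
# and relationships, queried with its traversal language.
graph = {
    "asha":   {"bhavin", "chetan"},
    "bhavin": {"asha", "divya"},
    "chetan": {"asha", "divya", "esha"},
    "divya":  {"bhavin", "chetan"},
    "esha":   {"chetan"},
}

def friend_suggestions(graph, person):
    # People two hops away who are not already direct friends.
    direct = graph[person]
    candidates = set()
    for friend in direct:
        candidates |= graph[friend]
    return candidates - direct - {person}

suggested = friend_suggestions(graph, "asha")
```

In a relational database this two-hop query is already a self-join, and each additional hop adds another; a graph database walks the relationships directly, which is the whole point.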
12
Platfora
Platfora is one of those startups offering a “new-breed” analytics platform built to run on top of Hadoop. The software creates a data catalog that enumerates the data sets available on a Hadoop Distributed File System. When you want to do an analysis, you use a shopping-cart metaphor interface to pick and choose the dimensions of data you want to explore. Behind the scenes, Platfora’s software generates and executes the MapReduce jobs required to bring all the requested data into a “data lens.” Once the data lens is ready — a process that takes a few hours, according to Platfora — business users can create and explore intuitive, interactive data visualizations. You get sub-second query response times because the data lens runs in memory. Need to add new data types or change dimensions? That takes minutes or hours, says Platfora, versus the days or weeks it might take to rebuild a conventional data warehouse. Platfora is the newest of the new-breed Big Data analysis companies, but it has a who’s who list of venture capital backers and an experienced management team. In short, this is one to watch in 2013.
13
Splunk
Splunk got its start offering an IT tool designed to help data center managers spot and solve problems with servers, messaging queues, websites and other systems. But as the Big Data trend started gathering steam, Splunk recognized that its technology could also answer all sorts of questions tied to high-scale machine data (a big factor in the company’s successful 2012 IPO). Splunk employs a unique language and its core tools are geared to IT types, but those power users can set up metrics and dashboards that business users can tap to better understand e-commerce traffic, search results, ad campaign effectiveness and other machine-data-related business conditions. There’s an overlap with Hadoop in that Splunk has its own proprietary machine-data repository, but database expert Curt Monash says Splunk is working on ways to work with Hadoop that go beyond the two-way integrations currently available. That would presumably leave Splunk free to pursue analytics while diminishing the need for redundant infrastructure. We’ll be watching for that important release.
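To picture the kind of work Splunk automates, the sketch below parses hypothetical web-server log lines and rolls them up into a request count per HTTP status code, the raw material of a dashboard metric. This is a conceptual sketch in plain Python, not Splunk’s search language, and the log format is invented.

```python
import re

# Hypothetical web-server log lines of the kind Splunk indexes.
LOG = """\
10.0.0.5 GET /checkout 200 123ms
10.0.0.9 GET /search 200 45ms
10.0.0.5 POST /checkout 500 980ms
10.0.0.7 GET /search 404 12ms
"""

# Fields: client, method, path, HTTP status, latency.
LINE = re.compile(r"(\S+) (\S+) (\S+) (\d{3}) (\d+)ms")

def status_counts(log_text):
    # Roll raw machine data up into a dashboard-ready metric:
    # request count per HTTP status code.
    counts = {}
    for match in LINE.finditer(log_text):
        status = match.group(4)
        counts[status] = counts.get(status, 0) + 1
    return counts

metrics = status_counts(LOG)
```

The value Splunk adds on top of this idea is indexing such data at massive scale, across many formats, without the user writing parsers by hand.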
Source: InformationWeek USA
Case Study
Analytics helps Usha International respond to dynamic market needs
By implementing SAP HANA Database along with the SAP NetWeaver Business Warehouse component, Usha International is able to quickly churn out real-time business-critical information and rapidly respond to dynamic market needs
By Amrita Premrajan
A household name in India, Usha International Limited (UIL) is a multi-product consumer durables manufacturing, marketing and distribution company with more than 33 warehouses, 70 company-owned retail stores, and a distribution network of more than 14,000 dealers. Operating in a highly competitive market, the company’s key business need was to be among the first to respond to changing market needs and capture the resulting profits. To achieve this, UIL first wanted to gain granular visibility across its widespread supply chain, sales and inventory data. Second, the company wanted to be able to take quick and informed decisions by churning out real-time business-critical insights from the enterprise’s business data. For example, UIL wanted real-time data on cash flow, daily sales and inventory to be available for operations review at the click of a button. After extensive evaluation and brainstorming, UIL decided to implement the SAP High-Performance Analytic Appliance (HANA) database, which uses in-memory technology to deliver quick results from even complex data queries. Traditionally, a majority of data storage, calculation and aggregation happened on-disk, which substantially impacted the end-user experience. With SAP HANA, a complete in-memory database combines transactional data processing, analytical data processing and application logic processing, all in memory.
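The shift described above can be pictured with a deliberately tiny sketch. This is illustrative Python, not SAP HANA code, and the sample rows are invented: when the fact table is held in memory, a report such as daily sales is just an on-demand scan, with no pre-built on-disk aggregates to maintain.

```python
# Illustrative only -- not SAP HANA code; SKUs and figures invented.
# The point: with the fact table resident in memory, "daily sales at
# the click of a button" is an on-demand scan, not a pre-built report.
sales = [
    {"day": "2012-03-01", "sku": "FAN-01", "qty": 4, "value": 6000},
    {"day": "2012-03-01", "sku": "SEW-02", "qty": 1, "value": 9500},
    {"day": "2012-03-02", "sku": "FAN-01", "qty": 2, "value": 3000},
]

def daily_sales(rows):
    # Aggregate revenue per day over the in-memory table.
    totals = {}
    for row in rows:
        totals[row["day"]] = totals.get(row["day"], 0) + row["value"]
    return totals

report = daily_sales(sales)
```

An in-memory column store applies the same idea at enterprise scale, with compression and parallel scans standing in for this toy loop.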
“SAP HANA on Business Warehouse provides us visibility into live information. This helps us make right decisions and respond faster to consumer demands”
Subodh Dubey
Group CIO, Usha International Limited
In February 2012, UIL started the implementation of the SAP HANA database, over which the SAP NetWeaver Business Warehouse (the Business Intelligence and analytics piece) was to run. For this implementation, UIL worked closely with a dedicated SAP Ramp-Up coach and SAP Enterprise Support services and successfully completed the entire project within just four weeks. By March 2012, the project was live and had been rolled out to its 16 branch offices, 30 depots, and four manufacturing plants. Talking about the benefits SAP HANA has brought to the company, Subodh Dubey, Group CIO, UIL says, “To take full advantage of our business data, we needed high-performance business analytics. That means faster data loads and faster query response times. Implementing SAP HANA has improved our performance tremendously with faster data loads and better storage compression, and has given a whole new life to our BI solution.” One of the major benefits UIL registered due to the transparency brought by the implementation was a 50 percent reduction in inventory levels and the elimination of out-of-date inventory. This in turn led to lower inventory carrying costs and better cash flow. UIL has also achieved sales visibility down to the individual Stock Keeping Unit (SKU) and territory. “SAP HANA on Business Warehouse provides us visibility into live information with minimal time-lag. This helps us make the right decisions at the right time and respond faster to consumer demands,” adds Dubey. UIL recently won the SAP ACE Award for Customer Excellence 2012 for being the first customer ever to do real-time analysis of Big Data using the new SAP HANA for Business Warehouse.
Key Benefits
- 50 percent reduction in inventory levels
- 50 times faster query performance
- About 75 percent improvement in reporting and analytics performance
- 8 times faster data loads
Amrita Premrajan
amrita.premrajan@ubm.com
Interview
‘Organizations need to cultivate an analytic-driven culture’
IT-based decision making has been a continuously evolving process. Maneesh Sharma, Head - Database & Technology, SAP India, discusses the challenges organizations face in gaining business insights from huge amounts of data, and how analytics helps enterprises take more fact-based decisions
What are some of the biggest challenges organizations face while making technology-based business decisions?
Technology-based decision-making by businesses in India has been a continuously evolving process. Over the years, organizations have automated their business processes by using software solutions like ERP, core banking, billing, CRM, etc. This ‘data-in’ process was followed by business users’ need for ‘data-out’, which led to the adoption of Business Intelligence (akin to MIS) and data warehousing solutions. There has been a steady and silent growth of data within organizations, both in the form of structured data (what we call query-able transactional data) and unstructured data (content in files, social feeds, network feeds, etc.). With new-age applications driving mobile usage along with social media, the very real explosion of data has caught organizations off-guard. With our always-on and access-anywhere paradigm, real-time visibility into processes and data is the need of the hour for informed decision making. Getting business insights from this data is impeded by two common challenges: (a) quality and sanctity of data is always suspect, impacting adoption of technology solutions; and (b) the sheer volume of data makes it difficult to deliver insights at the time they are needed most. According to Gartner, a majority of organizations in India are still struggling to use the output of analytical tools as input into strategy. Organizations need to cultivate an analytic-driven culture to keep pace with changing business scenarios.
How can they use advanced analytical tools to support these decisions?
Decision-making in India has historically been based either on “gut feelings” or on the business experience of managers. Business Intelligence/analytics allows enterprises to make more fact-based decisions. However, there is a shift in the usage of these tools, with business users asking for actionable ‘insights’ rather than ‘reports’ from BI platforms. These platforms are now providing industry-focused analytical solutions that deliver actionable insights to users in real time. End customers are no longer interested in reports that merely depict what has already happened. They are demanding solutions to problems and situations that can be ‘predicted’, so that they can plan for them. A retailer would like to know which promotion to run based on historical analysis of what has or hasn’t worked. A customer loyalty relationship manager would like to provide spot offers to users based on buying behavior and personal preferences. The planning division of a manufacturer can do real-time simulation of its manufacturing forecast based on in-hand inventory and past demand. An e-commerce site can help online users find the right product and price based on usage patterns, and so on.
What are your latest offerings in the analytics space?
SAP’s real-time database platform and SAP BusinessObjects platform solutions provide comprehensive analytical functionality that can empower users to make effective, informed decisions based on solid data and analysis in real time. All users, from the high-end analyst to the casual business user, have access to the information they need with minimal dependence on IT resources and developers. Some of the latest offerings from SAP in the analytics space are:
- SAP 360 Customer Solution: Powered by the SAP HANA platform, it harnesses the power of in-memory computing, cloud, enterprise mobility and collaboration to allow organizations to revolutionize the way they engage with their customers, beyond traditional customer relationship management (CRM).
- SAP Predictive Analysis Software: An advanced analysis solution that helps organizations gain a competitive edge by uncovering and predicting trends in their business. SAP Predictive Analysis can be deployed stand-alone or with other SAP software, including the SAP HANA platform, to unleash powerful predictive capabilities.
- Analytics Edition: A new edition that bundles SAP BusinessObjects BI solutions and the SAP Sybase IQ server into a high-performance data foundation designed specifically for analytics. The solutions help organizations of all sizes gain better insights into Big Data with advanced data visualization, self-service data discovery, mobile BI, and predictive capabilities. Analytics editions of SAP BusinessObjects BI and SAP Crystal solutions help companies manage Big Data and transform it into intelligent data.
Where do you see analytics in the years to come?
According to Gartner, Analytic Applications and Performance Management is projected to grow to USD 27.8 million by the end of 2016 from USD 14.2 million in 2012. BI, with estimated growth to USD 111.2 million by the end of 2016, will continue to be one of the fastest growing software markets, despite sluggish economic growth in some regions, as organizations continue to turn to BI as a vital tool for smarter, more agile and efficient business. Advanced analytics will play a vital role in determining the success of BI efforts in organizations. While BI serves a distinct purpose for sharing, summarizing and manually exploring data and metrics, more advanced analytics can aid and even automate decision-making. In the coming years, next-generation analytics will expand beyond measuring and describing the past to predicting what is likely to happen and optimizing what should happen, based on an increasingly varied set of data sources and types. The addition of mobile, social and collaborative technologies to advanced analytics tools will give a broader set of users insights for decision making based on their requirements.
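The simplest form of “predicting what is likely to happen” can be sketched in a few lines: fit a linear trend to past figures with ordinary least squares and extrapolate one period ahead. The monthly values below are hypothetical, and real predictive analytics products use far richer models, but the principle is the same.

```python
def fit_line(ys):
    # Ordinary least squares for y = a + b*x over x = 0, 1, 2, ...
    n = len(ys)
    xs = range(n)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    b = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) \
        / sum((x - mean_x) ** 2 for x in xs)
    a = mean_y - b * mean_x
    return a, b

# Hypothetical monthly sales; predict the next month from the trend.
history = [100.0, 110.0, 120.0, 130.0]
a, b = fit_line(history)
forecast = a + b * len(history)
```

The leap from here to production predictive analytics is handling many variables, messy data and model validation, which is what the packaged tools discussed above sell.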
Ayushman Baruah
ayushman.baruah@ubm.com
Interview
‘Business analytics is a key priority for us in 2013’
Vijay Sethi, Vice President and CIO, Hero MotoCorp, talks in depth about the role of analytics tools in making business-critical decisions. He elaborates on how the new breed of business analytics technologies, like social media analytics, sentiment analytics and predictive analytics, is increasingly becoming relevant for businesses
Please tell us about Hero MotoCorp’s experience with Business Intelligence and analytics. How has the company benefitted from these technologies?
We extensively use business analytics across various levels, and it helps the organization take faster decisions. Investment in business analytics tools is considered strategic in nature at Hero MotoCorp, as it empowers our business to take the right decisions at the right time, rather than being justified by quantitative ROI calculations. The top two benefits I see of using these tools are:
a) Using information analysis to achieve business objectives: With the implementation of an analytical tool, a lot of advanced analytical capabilities get added that enable the business to analyze situations from various perspectives, thus enabling better and speedier decisions. I have seen people getting almost ‘addicted’ to analytics, and since we also do a lot of ‘pushing’ of dashboards and analytics to individual mailboxes, one does get reminders if anything is delayed. Now, as a next step, we are getting into predictive analytics.
b) One version of the truth: The data that comes out of analytical tools becomes the single window for all the information needs of the organization.
Will spending on business analytics solutions — especially predictive analytics — be a priority for you in 2013?
Yes, spending on both business analytics and predictive analytics is a key priority for us in 2013. At Hero MotoCorp, all our processes today are IT enabled and there is a major focus on business analytics. We run Business Objects and have dashboards at various levels, right from the leadership team down to the shop floor. Also, our major customer segment is youth. With today’s youth being very active on social media, social media analytics, sentiment analysis, etc., are also becoming key focus areas for us. Going forward, we need to cater not just to structured data but also to unstructured data to gain more meaningful insights.
What are some of the key challenges that CIOs might face while implementing analytics tools, and how should these be tackled?
In my view, if done properly, Business Analytics (BA) can bring great value to the company. One could start with a pilot program. The objective of a pilot program, in my view, is not to ‘test’ the technology but to ensure that the BA tool works in your environment. Business Analytics is a proven technology and can be used for any type of analysis. However, the success of a BA project is not dependent just on technology; there are other factors — people, processes (of data collation, extraction, review, etc.), and the culture of the organization. Based on the learning derived from the pilot, complete rollouts should be undertaken. User education and change management are among the key areas a CIO will have to focus on while moving to analytics tools. For example, most of the time, people do not differentiate between reporting tools and analytical tools. They would like to see the same Excel format and analysis on a tool, whereas analytical tools have many more capabilities. So, users need to be educated on various algorithms to extract maximum value from an analytical tool. Once users start using the analytics tool, and the solution starts demonstrating its capabilities, the demand for information will keep increasing. Where initially managers tend to ask for transactional information, over time the demand moves from reporting to intelligence to analytics. Both IT and users undergo an evolution in this respect. So considerable change management effort is required along the way to get users to adopt these kinds of systems.
Amrita Premrajan
amrita.premrajan@ubm.com
Interview
‘Data mining and BI can make all the difference between success and failure of a business model’

Rajeev Batra, CIO, MTS India, elaborates on how leveraging business analytics tools is becoming increasingly vital for businesses seeking a competitive advantage. He also shares how MTS India is deriving concrete business benefits by using BI and analytics solutions to target customers with customized campaigns. Excerpts from an exclusive interview with Amrita Premrajan of InformationWeek:

How can BI and analytics tools help an organization gain an edge over its competitors? How has your organization benefitted from the deployment of these technologies?

In a networked world, the use of analytics has become a key basis of competition and growth. This is especially true in service industries such as telecom, banking and hospitality, where data volumes are substantial. In most industries, established competitors and new entrants alike will leverage data-driven strategies to innovate, compete and capture value from deep analytics and real-time information. Telecom operators, for example, continuously generate customer information in terms of usage patterns, behavior, etc. Such data can be put to use to provide customers innovative products and services, which may increase the age on network and reduce churn. Given the explosion of potential data sources and the sheer scale of global business today, data mining and subsequent business intelligence can make all the difference between the success and failure of a business model. At MTS India, we adopted the basic concepts of analytics much earlier than the competition. MTS uses its analytics platform to gain customer insights for improving the quality of its products and services, as well as for generating up-selling and cross-selling opportunities.

Can you give us an example of a customer-facing program at MTS, where analytics tools were utilized to churn out business-critical information, which was then tapped to derive business benefits?

MTS India has come up with an innovative reward program, m-Bonus, which dynamically computes what the consumer needs and offers the best-fit deal. For example, if a subscriber has made five calls to Chennai on a particular day, he could get 30 percent off his next call to Chennai on the same day. In this program, the customer is no longer part of a market segment; he is a market segment in himself. It is this personalization in the reward program that makes m-Bonus unique, not only in India but anywhere in the world. The m-Bonus program is powered by a complex yet simple-to-use system, an amalgamation of real-time business intelligence, a flexible campaign management system and a dynamic service provisioning system. The system provides the end-to-end functionality of a campaign manager: from customer selection and automating communication for promoting the campaign, to creating rules for providing benefits to customers based on certain qualification criteria, to campaign feedback. The solution has been designed in such a way that within a few hours of a customer using a mobile phone, the call detail records are made available to the m-Bonus platform, so that campaign qualification can happen on a near real-time basis. This has empowered the business to create its own campaigns with minimal or no IT intervention. Because we are able to target customers with customized campaigns, this initiative has helped us increase our revenue by 2 percent on a month-on-month basis and enabled cost savings of approximately 20 percent in operational costs.

Is spending on analytics solutions a top priority for you?

Our analytics platform enables the business to create micro-strategies and segmentations of our customer base, which is instrumental in developing service offerings tailored to customer needs. With insights on customer behavior, channel behavior and vendor behavior, the impact of analytics can be felt where it matters most: the top line and the bottom line. Hence analytics is a priority area for us in 2013. The icing on the cake is that, with the right processes and architecture in place, BI also gives business teams an opportunity to react faster.
january 2013 informationweek 41
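As an illustration of the kind of rule the m-Bonus platform evaluates (five calls to one city in a day qualifying the subscriber for a discount on the next call), here is a minimal sketch in Python. The CDR layout, subscriber IDs and thresholds are invented for illustration; MTS's actual system is far richer, with real-time feeds and a full campaign manager.

```python
from collections import Counter
from datetime import date

# Hypothetical call detail records (CDRs) as (subscriber, destination, call_date).
# Subscriber IDs, the field layout and the thresholds are invented for illustration.
CDRS = [
    ("sub-001", "Chennai", date(2013, 1, 7)),
    ("sub-001", "Chennai", date(2013, 1, 7)),
    ("sub-001", "Chennai", date(2013, 1, 7)),
    ("sub-001", "Chennai", date(2013, 1, 7)),
    ("sub-001", "Chennai", date(2013, 1, 7)),
    ("sub-002", "Mumbai", date(2013, 1, 7)),
]

def qualifying_offers(cdrs, min_calls=5, discount_pct=30):
    """Apply the 'five calls to the same city in a day' rule: subscribers who
    reach min_calls to one destination on one day qualify for a discount."""
    counts = Counter((sub, dest, day) for sub, dest, day in cdrs)
    return {
        (sub, dest): f"{discount_pct}% off next call to {dest} today"
        for (sub, dest, day), n in counts.items()
        if n >= min_calls
    }

print(qualifying_offers(CDRS))
# {('sub-001', 'Chennai'): '30% off next call to Chennai today'}
```

In a production system this qualification would run continuously against the CDR feed; the sketch only shows the counting-and-threshold core of such a rule.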
With BI, everything becomes meaningful, from a comment posted on a blog by a customer to feedback on your latest advertisement.

How are you looking at tapping Big Data to derive business benefits for the company?

One of the areas telecom operators are focusing on is analytics around Big Data, covering both structured and unstructured information, and deriving useful business insights from it. As discussed earlier, with the m-Bonus initiative we are using a massive, columnar analytical database; that is a step toward leveraging Big Data, though more on the structured-data side. We are currently using Big Data primarily for marketing campaigns, churn management and business intelligence on the operational database. We are building streams for unstructured data as well, gathering information from social networking sites to create pertinent products that are relevant to different customer segments. We can say with conviction that, at this point in time, telcos are geared up to handle Big Data better than any other segment.

What are your views on embracing open standards like Hadoop?

I feel that adopting common standards will accelerate growth and the pace of innovation, while lowering costs in the telecom industry. The industry is struggling with traffic volumes and costs that are rising faster than revenues. Interoperability and easier integration between diverse systems and elements, through a rugged open-standards implementation, will enable all parties to gain in the medium term. The ability to provide new communication services and features to consumers faster and at lower costs is the key advantage of implementing open standards. With open standards, we can more easily integrate components to build a best-of-breed communication solution, or develop our own uniquely customized solution.
Amrita Premrajan
amrita.premrajan@ubm.com
Opinion
On-demand Big Data analytics: An emerging trend
Dr Matt Wood

The emerging trend of on-demand analytics is enabling businesses of all sizes to avail supercomputing power on a pay-per-use model and analyze complex data sets to derive business-critical information

On the web: ‘Analytics helps you see your customer as a person’. Read the article at: http://www.

Organizations have known about the benefits of analyzing data for some time. However, with the increase in availability of data within an organization, and beyond, traditional, static methods of capturing, storing, analyzing and collaborating around that data have acted as barriers to unlocking those potential benefits. Organizations are now moving from the undifferentiated heavy lifting associated with racking and stacking disks and servers to being able to focus on the questions they want to ask of their data, their own business insights, and how best to serve their customers. Data analytics is relevant to virtually every industry, and the availability of on-demand computing gives businesses of all sizes access to supercomputing power to explore complex data sets while paying only for what they use. This helps in identifying new business opportunities, improving existing ones, and significantly increasing the level of innovation within an organization.

Ten years ago, managing and collaborating around a single genome (about 3 GB of data) was relatively straightforward. Today, we face the challenge of managing and sharing data for the thousands of genomes being sequenced regularly under the 1000 Genomes Project, an international collaboration to produce an extensive public catalog of human genetic variation by sequencing the genomes of a large number of people. Amazon Web Services (AWS) has helped in this area by making the data of the 1000 Genomes Project freely available on Amazon Simple Storage Service (Amazon S3).

Let's look at some companies that are benefitting from Amazon S3. Take the case of Yelp, a U.S.-based consumer review website, which had more than 39 million unique visitors to its site as of November 2010. To give users a better experience on the web, the company had been including features on the website such as ‘People Who Viewed This Also Viewed’, review highlights, auto-complete as you type, top searches, etc. All the data that powers these site features was being kept as logs. The challenge for the company was the ever-increasing volume of traffic on the site, which directly affected its ability to provide these features easily. To tackle this, Yelp availed Amazon S3 and started storing around 100 GB of logs per day. The company also started using Amazon Elastic MapReduce, a managed Hadoop service, to power approximately 20 separate batch scripts and make the mentioned features available. Using Amazon Elastic MapReduce, the company was able to save USD 55,000 in upfront hardware costs.

Another interesting case is that of Razorfish, a U.S.-based digital advertising and marketing firm, which targets online advertisements based on data from browsing sessions. A common issue Razorfish faced was the need to process gigantic data sets resulting from holiday shopping traffic on a retail website, or sudden dramatic growth on a media or social networking site. Normally, crunching these numbers took two days or more. With Amazon Elastic MapReduce, the company has been able to drop its processing time from two days to just eight hours. The company is now equipped to offer multi-million dollar client service programs on a small-business budget, which has helped it increase its return on ad spend by 500 percent.

On-demand analytics is increasingly helping customers explore their data and drive greater business benefits by removing the constraints imposed by traditional infrastructure and accelerating the data analytics pipeline.

Dr Matt Wood is Chief Data Scientist, Amazon Web Services
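The Yelp pattern described above, daily logs fed through batch MapReduce jobs to power features such as top searches, can be sketched in miniature. This is an illustrative toy, not Yelp's actual code; the log format and field names are assumed, and a real Elastic MapReduce job would distribute the same map and reduce steps across a Hadoop cluster.

```python
from collections import Counter
from itertools import chain

# Hypothetical daily search-log lines; the format ("q=<query>") is assumed.
LOGS = [
    "2013-01-07 user=17 q=pizza",
    "2013-01-07 user=23 q=sushi",
    "2013-01-07 user=17 q=pizza",
    "2013-01-07 user=42 q=coffee",
    "2013-01-07 user=23 q=pizza",
]

def map_phase(line):
    """Map step: emit a (query, 1) pair for each search query in a log line."""
    for field in line.split():
        if field.startswith("q="):
            yield (field[2:], 1)

def reduce_phase(pairs):
    """Reduce step: sum the counts emitted for each query."""
    totals = Counter()
    for query, n in pairs:
        totals[query] += n
    return totals

top = reduce_phase(chain.from_iterable(map_phase(line) for line in LOGS))
print(top.most_common(1))  # [('pizza', 3)]
```

The point of the split into map and reduce phases is that each phase can be parallelized independently over log shards, which is exactly what makes the batch-script-over-daily-logs approach scale on a managed Hadoop service.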
Opinion
Big Data, bigger opportunities for insurance companies
Prashant Gupta

Technology will ignite post-financial crisis opportunity and growth for the insurance industry

On the web: ‘Does Big Data mean big security issues?’. Read the article at: http://www.

The global financial crisis of 2008 continues to have far-reaching implications, from the health of global markets to shifting consumer fiscal attitudes. Although not as drastically impacted as the financial industry, many insurers found themselves on unfamiliar ground, and on the wrong side of risk management. Though the industry has long been considered recession-proof, large insurance companies are finding themselves at the center of a storm. Even those that maintained conservative, risk-averse strategies are facing plenty of tough new challenges. For example, consumers are shopping for discount insurance, making it increasingly difficult to acquire and keep low-risk customers. Decreased economic activity is leading to fewer high-premium policy sales, and fewer policies in turn mean fewer funds available to pay claims. Apart from this, rightsizing of employees and reduced investment in technology is resulting in less focus on innovative solutions.

Despite recent miscues, the insurance industry tends to be conservative and risk-averse. It is a mindset that often extends across the enterprise, so technology adoption tends to be slower than in other industries. And that needs to change. Technology will be central to the industry's ability to respond to these and other post-financial crisis challenges, and to develop new, more profitable operational models. A platform that integrates cloud services, machine-to-machine (M2M) communication, data mining, and analytics will help companies gain a clearer view into risk, optimize insurance pools, deliver differentiated services, connect with and better serve tech-savvy customers, and expand into emerging markets. The challenges facing today's insurance companies are considerable, but the opportunities enabled by emerging technologies are even greater.
INFORMATION DRIVES POOL OPTIMIZATION
In every branch of insurance, optimizing the insurance pool is a central challenge. Profitability depends not only on identifying and drawing in low-risk customers, but also on accurately matching premium rates to risk levels. Any insurance pool will contain both high- and low-risk customers. But pools can become unbalanced, and revenue lost, if the competition lures away all of the low-risk customers with more competitive premium rates. And those customers, empowered by a wealth of online information, are shopping around, looking to be rewarded with the lowest possible premiums. To create and maintain better balanced and more profitable pools, insurance companies need to adopt technologies that can help them make more informed underwriting decisions, because the financial risk of getting stuck with a poorly balanced pool far outweighs that of investing in new technology.
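The pool-balancing argument can be made concrete with a toy pricing sketch: a premium that tracks each customer's expected claims cost keeps low-risk customers from being overcharged and lured away by competitors. All figures, including the loading factor, are invented for illustration.

```python
# Toy risk-based pricing. All figures (probabilities, severities, loading) are invented.
customers = [
    {"name": "A", "claim_prob": 0.02, "avg_claim": 10_000},  # low-risk customer
    {"name": "B", "claim_prob": 0.10, "avg_claim": 10_000},  # high-risk customer
]

LOADING = 1.25  # assumed markup over expected loss to cover expenses and margin

def risk_based_premium(customer):
    """Premium proportional to expected claims cost: probability * severity * loading."""
    return round(customer["claim_prob"] * customer["avg_claim"] * LOADING, 2)

for c in customers:
    print(c["name"], risk_based_premium(c))
# A 250.0
# B 1250.0
```

Under a flat premium of, say, 750 for both customers, A would be overcharged by 500 and would have every incentive to defect to a competitor, leaving the pool holding only the high-risk B; risk-matched pricing is what keeps the pool balanced.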
INNOVATION EMPOWERS DIFFERENTIATION
As the insurance industry becomes more technology-based, cloud solutions will provide the foundation for new data-driven underwriting models and pool optimization. Cloud computing plays a crucial role in the aggregation and storage of the vast amounts of valuable data that can now be collected from mobile devices and M2M applications. That information can be enhanced, mapped, and combined with data from other sources, such as actuarial models, to calculate risk and possibly even predict events.

Usage-based insurance (UBI) systems provide near real-time insight into customers' driving habits and risk levels. UBI systems include a small device containing a cellular radio, an accelerometer, a GPS unit, and a processor; this device is installed in the customer's vehicle to provide the carrier with information on driving habits, based on how the vehicle performs. This information is augmented with additional proprietary information in a cloud environment and then analyzed to help insurers set rates tailored to the individual's level of risk. Health insurers are envisioning similar scenarios, such as proactive healthcare monitoring. Down the road, health insurance customers may voluntarily monitor their vital signs to help insurers write truly personalized policies.

All this data is opening the door to cloud-based analytics that will empower insurers to develop new predictive risk models, and could lead to a more proactive, less reactionary approach to underwriting. Take identity theft. The current model for identity theft insurance is wholly reactionary: once a person's identity has been stolen, the policy pays compensation and provides legal representation to untangle the multitude of issues associated with re-establishing identity and reputation. In the near future, the entire discussion will move to predictive analysis. Insurers could use real-time monitoring, universal identity models, and contextual identity proofing to model a customer's identity, including how it interacts with retailers, service providers and other facets of life. By applying predictive analytics, the insurer could then develop anticipatory intelligence about when and where theft might occur, and take measures to prevent it. This could be a major shift, one that leads to a whole new spectrum of personalized and differentiated offerings that move far beyond cost incentives.
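The UBI loop described above, device events in and a risk-adjusted rate out, can be sketched as follows. The event types, weights, surcharge rate and cap are assumptions made for illustration only, not any carrier's actual model.

```python
# Hypothetical telematics events reported by the in-vehicle UBI device.
EVENTS = ["hard_brake", "speeding", "night_drive", "hard_brake"]

# Assumed risk weight per event type; unknown event types count as zero.
WEIGHTS = {"hard_brake": 3, "speeding": 5, "night_drive": 1}

BASE_ANNUAL_PREMIUM = 12_000  # illustrative base rate

def risk_score(events):
    """Fold the event stream into a simple additive risk score."""
    return sum(WEIGHTS.get(e, 0) for e in events)

def adjusted_premium(events, base=BASE_ANNUAL_PREMIUM):
    """Surcharge the base premium by an assumed 2% per risk point, capped at +50%."""
    surcharge = min(0.02 * risk_score(events), 0.50)
    return round(base * (1 + surcharge), 2)

print(risk_score(EVENTS))        # 12
print(adjusted_premium(EVENTS))  # 14880.0
```

A real carrier would feed months of device data through actuarial models in the cloud rather than a fixed weight table; the sketch only shows the shape of the events-to-rate pipeline.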
GLOBALIZATION PROPELS EXPANSION
Another new, technology-enabled frontier for insurers, and a major opportunity, will be the emerging markets. It is estimated that the emerging six (E6) economies will contribute 47 percent of global GDP growth between 2006 and 2020, while the global six (G6) will contribute less than 24 percent during the same period. The potential market for insurance in developing economies is estimated at between 1.5 and 3 billion. To serve customers in emerging markets, insurers will need to partner with a network provider that can deliver seamless, reliable, highly available access to people, systems, and data around the planet. With a truly global technology infrastructure, insurers can expand operations and service offerings and, at the same time, become more agile and efficient. As consumption in these countries increases, demand for insurance products will not only give rise to countless startup agencies, but also propel many existing enterprises into the global market.
COLLABORATION ACCELERATES SUCCESS
For today's insurance companies, the real risk is in doing nothing, because the companies that move quickly to develop new technology-enabled models will be the ones with the most balanced pools and the most unique, varied and individualized offerings. Forward-looking insurers should begin building their platforms now. Critical for the next three to five years will be:

- Building technology platforms that securely integrate mobility, M2M, cloud, analytics and security
- Developing a service-centric architecture that supports flexible, responsive, and agile business models and global capabilities
- Creating social platforms to drive business intelligence and create new customer channels, and the employees to support them
- Partnering with trusted global technology provider(s)

Tomorrow's underwriting and operational model will be radically different from today's. It will require a whole new mindset, and new skill sets. But it will also attract a new generation of young, tech-savvy employees, who will bring their experience and insight into mobile, social, network, and data-driven technologies to re-energize the industry.
Prashant Gupta is Head of Solutions, India, Verizon Enterprise Solutions
Opinion
Big Data driving the role of Customer Experience Officer
Ryan Hollenbeck

With companies investing in the technologies that capture and analyze Big Data, the Customer Experience Officer can be the conduit for ensuring that relevant information is shared throughout an organization

On the web: ‘6 lies about Big Data’. Read the article at: http://www.

As we near the end of 2012, it is amazing how much the topic of Big Data has monopolized the discussion in IT this year, and we are not likely to see it end anytime soon. At the C-level, the discussion is transitioning from managing Big Data to making Big Data actionable. How do we use Big Data to improve operations and fulfillment, marketing, sales and support? Even more importantly, how do we make Big Data actionable to improve customer experience, satisfaction and loyalty? To complete our Big Data story for the year, let us explore Big Data as the driver for the role of a Customer Experience Officer (CEO). This is what will move Big Data from a "managing" exercise to an "actionable" event, where all the work and investment organizations have made in leveraging Big Data will be personified through an "Office of the Customer Experience."

The movement for a formalized, senior-level, customer-centric title is well underway. We have seen elevated titles associated with customer operations, customer experience, customer success, customer feedback, and customer advocacy across the industry and our customer organizations. When you add Big Data to the equation, and all of the areas where Big Data comes into play in representing the Voice of the Customer (VoC), you understand my point about Big Data accelerating the concept of the Customer Experience Officer. Big Data and VoC intersect across the entire corporation: customer service departments gather input based on transactional interactions; marketing departments analyze interactions on websites and in marketing campaigns; product management examines input to assess feature requirements; and more. In all these cases, each business unit gathers information via its own methodologies and for its own use, stranding critical information in silos and depriving companies of the benefit of inter-group communication.

With companies investing in the technologies that capture and analyze Big Data, the Customer Experience Officer can be the conduit for ensuring that all the relevant information is shared throughout an organization. Only when the sharing of information becomes part of the corporate design and philosophy can informational insights shift from informal and sporadic to purposeful and ongoing. The Intelligent Enterprise insists on collaboration, not functional silos.

Creating a specific title and charter for the role of the Customer Experience Officer will lend executive endorsement and send the signal that an organization has completely shifted to a customer-centric model. This is reinforced in a 2012 Strativity whitepaper titled ‘The Path to Customer Experience Success’. The firm states: "Having strong technology support for the new customer-centric business strategy enables companies to engage their complex, multichannel customer in meaningful dialog-driven conversation while at the same time driving rich analytics reflective of the complete view of the customer experience." Furthering this point, a 2012 Capgemini study indicates that 85 percent of respondents say the issue is not the volume of data but the ability to analyze and act on data in real time. Another report, sponsored by Oracle in 2012, indicates that 93 percent of executives believe their organization is losing revenue, on average 14 percent annually, as a result of not being able to fully leverage the information they collect. A dedicated and formalized role is required to leverage the value of Big Data in Voice of the Customer initiatives. Big Data and the Customer Experience Officer: it seems they should arrive hand-in-hand.

Ryan Hollenbeck is Senior Vice President, Marketing, Verint Systems
Interview
‘Public-Private Partnership is key to enhance security’

With more and more critical infrastructure owned and operated by the private sector, public-private partnership is increasingly important for strengthening national security, says Kamlesh Bajaj, CEO, DSCI, in an interview with Srikanth RP of InformationWeek. He also gives an update on initiatives taken by DSCI to improve national security policies.

Is cyber security perhaps our nation's greatest threat? How vulnerable is India?

Cyber security may not be the greatest threat India faces today; however, as our nation advances and makes more strategic use of technology, especially in critical sectors, it may become the greatest threat in future. Cases in point are developed countries such as the US, whose critical infrastructure is highly dependent on technology. In this new era, countries are using state and/or non-state actors to launch attacks in cyberspace. From a national security perspective, security of critical information infrastructure is becoming a top priority. Over the years, targeted attacks on the critical information infrastructure of nations, intended to disrupt and impact normal functioning with wide economic consequences, have been observed. Attacks on power grids, oil rigs, national ICT assets and other critical infrastructure, causing heavy outages, have made digital nations realize the significance of securing their critical assets, given their increasing interdependency in the digital arena. Additionally, nations including India are highly vulnerable to cyber espionage; both government and commercial establishments worldwide have been victims in the recent past. Cyberspace has emerged as the fifth domain, after land, air, sea and outer space, which nations need to defend. Being battle-ready in cyberspace is the responsibility of the concerned agencies.

There was one highly publicized incident where thousands of Indian citizens in places like Bangalore abandoned their places of work due to a highly targeted and coordinated misinformation campaign on social media. How do you think we could have prevented such a thing from happening? What could have been done?

Technology is neutral, but like everything else, it can be put to uses that serve the purposes of interested parties. We have seen the way technology has been used selectively in Syria and Libya, in the so-called Arab Spring, and, nearer home, the way the Anna Hazare movement has used it. But there is nothing infallible about the way it
is used. The state can also master it. What we saw in this case is part of the larger picture called cyber security, which in turn is critical for national security. Misuse of social media is but one dimension of cyber security. While the intelligence agencies have to master its use, cyberspace has no boundaries, even though nations are trying to police and control activities within their jurisdictions. Traditional dividing lines between defence and security, civilian and defence, military solutions and law enforcement, and the public and private sectors are breaking down. No single ministry can handle all facets of cyber security; they need to coordinate. Lead agencies have to be appointed and empowered. Government agencies need to share intelligence over secure networks. But ultimately it is human intelligence at the top that cannot afford to fail. This calls for a cultural change in outlook.

From a policy point of view, what are the steps or policies that the Indian government can adopt or implement to have stronger cyber security?

In April this year, NASSCOM and DSCI gave a report to the government. We recommended 10 things that the government needs to do:

1. Create a national structure for cyber security that is positioned at the highest level within the government. We had recommended that its head report to the National Security Advisor and to the Prime Minister.
2. Design and implement a competency framework to build an adequate and competent cyber security workforce.
3. Create and maintain an inventory of critical information infrastructure, so that in case of a cyber-attack or crisis, this inventory can determine the possible impact on various information infrastructures and contain the attack.
4. Establish a center for best practices in cyber security that will help organizations focus on real threats in their environment instead of creating extensive documentation.
5. Establish a National Threat Intelligence Center for early watch and warning.
6. Build capacity in law enforcement agencies in cybercrime forensics and cyber forensics.
7. Build legally sound interception capabilities to balance national security and economic growth. We badly need a national center for research in encryption and cryptanalysis.
8. Establish a center of excellence in cyber security research.
9. Set up testing labs for accreditation of ICT (Information and Communication Technology) products to manage ICT supply chains.
10. Establish a cyber command to defend Indian cyberspace.

This report, ‘Securing Our Cyber Frontiers’, generated a positive response from the government. The National Security Advisor (NSA) constituted a Joint Working Group (JWG) comprising representatives from the government and industry, under the leadership of the Deputy NSA. The JWG was mandated by the government to formulate recommendations for strengthening public-private partnership in cyber security. The JWG report was released by the NSA in October this year, highlighting its key recommendations across institutional frameworks, capacity building, security standards and audits, and testing and certifications. The pilot projects identified to be executed in PPP mode are:

- Setting up a pilot testing lab
- Conducting a test audit
- Studying vulnerabilities in a sample critical information infrastructure
- Establishing a multidisciplinary Center of Excellence (COE)

This is a very good policy initiative, as the government and industry can overcome the challenges of cyber security only by working together.

What role can private players play in strengthening national security?

Public-Private Partnership is the key to enhancing cyber security, as more and more critical infrastructure is owned and operated by the private sector. The government has a larger role in leading such initiatives from the front, since national security is involved. The policy challenges of incentivizing the private sector to spend more on security than the business case would justify have to be addressed. On the other hand, the industry needs to be more proactive in engaging with the government on cyber policy issues. For instance, the IT industry, through NASSCOM, has taken proactive steps such as establishing the Data Security Council of India (DSCI) to work dedicatedly on cyber security and data protection. The industry has to take security seriously by raising it to the board level and giving security leaders more authority and support. The JWG report is a good start, and its implementation will bring out new partnership models between the government and industry. The role of the private sector and government has also been detailed in the ten recommendations made by the
‘Securing Our Cyber Frontiers’ report.

As more government services go online, what in your view are the key gaps in security? What can be done?

The Electronic Service Delivery Bill seeks to provide electronic delivery of public services by the Center, the States and all public authorities under them to all individuals, to bring about transparency, efficiency, accountability, accessibility and reliability. While ensuring availability, the government is also working to ascertain that the services delivered are secure. The National e-Governance Service Delivery Gateway (NSDG) is the messaging middleware component of the service-delivery network, as specified in India's National e-Governance Plan (NeGP), with the purpose of facilitating interoperability among technologies running on heterogeneous platforms and the exchange of data throughout the network. Adequate measures are being taken to ensure end-to-end secure communication between the service seeker and the service provider. The National e-Governance Division (NeGD), under the Department of Electronics and Information Technology (DeitY), is the Program Management Office of NeGP. Among its various activities, including facilitating implementation of NeGP by various ministries and state governments, the agency is also responsible for issuing cyber security and data security standards and guidelines for all e-governance projects under NeGP. For securing e-governance projects, the Standardization Testing and Quality Certification (STQC) Directorate has developed the e-Governance Security Assurance Framework (e-SAFE), which provides a list of security controls based on the risk categorization of particular assets. Most threats now target Layer 7, the Application layer of the OSI model. Rigorous and periodic application testing is a must to ensure that weak links are rectified before any zero-day threat surfaces. Moreover, there is a need to issue security guidelines for these projects that are more practice-oriented.

Please elaborate on the key initiatives taken by DSCI that have led to an improvement in national security policies.

- The NASSCOM-DSCI report ‘Securing Our Cyber Frontiers’ generated a positive response from the government. The National Security Advisor (NSA) constituted a Joint Working Group (JWG) comprising representatives from the government and industry, under the leadership of the Deputy NSA. The JWG was mandated by the government to formulate recommendations for strengthening public-private partnership in cyber security. The JWG report was released by the NSA in October this year, highlighting its key recommendations across institutional frameworks, capacity building, security standards and audits, and testing and certifications. Pilot projects have also been identified in PPP mode.
- DSCI has been actively involved in the making of rules under the IT (Amendment) Act, 2008, and also provided inputs, post industry consultation, on the draft National Cyber Security Policy released by DeitY in 2011.
- DSCI is running eight cyber labs to provide training in cyber crime investigation and cyber forensics; four of these are supported by DeitY. Over 20,000 law enforcement officers, members of the judiciary, military, public prosecutors and PSU staff have been trained so far. Cyber labs have supported law enforcement in investigating over 220 cyber crime cases. DSCI is the knowledge partner to the Ministry of Home Affairs for capacity building in cyber forensics under the Cyber Crime Investigation Program. DSCI developed standard operating procedures in the form of a cyber crime investigation manual and standardized the cyber crime training material for investigators.
- DSCI actively contributes to cyber security policies and frameworks as part of the Council for Security Cooperation in the Asia Pacific (CSCAP).
- DSCI has developed the DSCI Security Framework (DSF) and the DSCI Assessment Framework for Security to enhance the maturity of security practices of organizations, especially those operating critical infrastructure.
- DSCI has been creating awareness among government organizations, businesses and citizens through its various outreach programs, including conferences, workshops and trainings, and has been recognizing organizations, law enforcement agencies and individuals who have done outstanding work in security, privacy and cyber crime through the DSCI Excellence Awards.
Srikanth RP, srikanth.rp@ubm.com
january 2013 informationweek 49
Interview
‘There’s never been a better time to be in IT’
In the last few years, EMC Corporation has transformed itself from a storage company into a provider of data management and cloud computing solutions. It is also an early adopter of Big Data analytics. To do this, EMC had to make some bold decisions, virtualize its IT assets, and then move its global infrastructure to a private cloud. The IT organization within EMC delivered a functional cloud to EMC Corporation within five years, with savings of more than USD 100 million in equipment, power, space, and operational expenditures. This massive effort was led by Sanjay Mirchandani, CIO and COO, Global Centers of Excellence, EMC Corporation. Mirchandani's efforts were widely acclaimed and he received Drew University's prestigious Achievement in Business award in September 2011. He is also one of Boston Business Journal's CIOs of the Year 2012. Excerpts from an exclusive interview with Brian Pereira of InformationWeek India:

In the context of what you offer and what you do, how would you define business transformation?
First let me talk about the trends and then how these are impacting businesses. Cloud computing is transforming IT. And we have been building a capability and infrastructure, a set of functionalities that allows a company to transform the way it delivers IT. Big Data is transforming business. The business is asking for predictive analytical applications, real-time or real-enough time. Trust is enabling the cloud, so you have to ensure that the cloud is trustworthy and trusted. And all of this is transforming IT. The one thing that the business wants from me is agility (speed). There's also cost, because we've optimized the infrastructure and we've got standards. In some cases, such as manufacturing, you need perfection. But in a lot of cases of business-supporting applications and business-supporting capabilities,
it is really about good enough, but we keep working together (to perfect the solution). This is how the need for agility is transforming the business.

Businesses acknowledge that technology is essential for competitive advantage. So now we see faster adoption of the latest technology in business. Is this due to the increased expectations of tech-savvy customers/consumers or is it because of the rapid pace at which technology is advancing?
I think it is both. We've never seen more disruption in the world of technology than in the past five–six years. In my opinion, what took five–six years for the cloud took only two years for Big Data. There is a lot of compression in adoption and understanding. Things like scale-out capabilities and Hadoop are changing the dynamics of the volume, velocity and variety of information that is being produced. So, technology is clearly an enabler and a catalyst in the process. Customers are far more technology savvy and more receptive to online and connected ways of doing business than ever before. So the two things are coming together: technology making itself more pervasive, and consumers being more sophisticated, more savvy about technology, and more connected. You do not have to explain the basics or deal with the mundane. We've thankfully elevated ourselves to a point where the business and the consumers are talking the same language. Regardless of what kind of business you are, customers want to be connected to the business. The enterprise has become very social; the way you engage with your customers is extremely connected.

You have been with EMC since 2006, and ever since we've seen the organization evolve from a storage company to one offering data management and cloud solutions. So how have you led
In an exclusive interview with InformationWeek India, Sanjay Mirchandani, CIO and COO, Global Centers of Excellence, EMC Corporation talks about EMC's business transformation and its adoption of cloud and Big Data technologies. He also tells us how the future of business will be impacted by analytics.
the IT transformation within the company?
Our mission is to enable our customers' journey to the cloud. As a company we are focused on a combination of cloud technologies, Big Data and trust — and bringing those elements together for the benefit of our customers and partners. Over the years we have made very focused R&D investments averaging 11–12 percent a year. We've also made multi-billion dollar acquisitions (70–80 companies) and this has enabled us to build the most robust portfolio of products in the space of cloud, trust and Big Data in the last few years. With regard to Big Data, three years ago, I'd give myself and my team kudos for bringing that (volume) of data down. You want to have some control on that data growth. This is not an issue with structured data, but when you look at semi-structured data — things like log files, tweets, Facebook posts, Hadoop — it's no longer about managing it, but harnessing it. And the business asks: what competitive advantage do I get with this information we've gathered? How is this Big Data helping the business? Thus, you need cloud infrastructure to be able to manage this Big Data and to give you that global reach. And all of that needs to be wrapped in trust. As a company we've evolved our strategy around it, we've built our products and capabilities around it, and we are taking it to our customers around the world.

Can you tell us about EMC's own journey to the cloud? What were the challenges and how did you overcome these?
Internally, as the IT organization for a USD 20 billion enterprise (EMC), we are on a journey to the cloud. In fact, we are 95 percent virtualized; we run a private cloud infrastructure and that story is well documented. We are early adopters of Big Data technologies and our lines of business run mission-critical applications. We are breaking new ground; we've hired data scientists and we're creating all kinds of capabilities around Big Data.
What’s your advice for organizations that hesitate to start their journey to the cloud? We are no longer in a hype cycle. According to a global survey, most companies in the world want to do cloud. It took us four–five years to get to the cloud. We took three steps or phases in our journey to the cloud. The first was IT production. We were out of allocated space and were not optimally utilizing what we had. Back then virtualization technology was new, the ecosystem was young, and there was a shortage of skills, but we embraced it and made some big decisions around technology — we went with VMware and x86. That took us to 35 percent virtualization. Today, we run 8,000 production servers in our enterprise. And of those 8,000 servers, 95 percent are virtualized. Then we went to the business production phase to virtualize the business platforms. It was expensive to run all these physical stacks that were single points of failure. And we said we would do it with the best Quality of Service available today. So, that took us two years to reach 75 percent virtualization (for all the business apps). The third phase that we are in is everything-as-a-service, or truly running IT as a Service. This means I can deliver infrastructure, platforms, applications, user interfaces as a service, off my private cloud. The advice is that you’ve got to make some decisions and you’ve got to be bold about them. You’ve got to be committed to it. One of the boldest decisions we made is to run a multibillion dollar organization on x86 architecture. We learnt how to port applications from Unix and Sparc on to x86, and Linux and Windows platforms — wrapping it around vSphere. You also have to ensure that you get the support you want. The second piece of advice is you should invest in skills, early. Thirdly, commit all the way. Drive to a virtual (environment) 100 percent and get your cloud benefits. And don’t live in two worlds (physical and virtual) too long. 
The faster you get to where you want to be, the less stressful it is on your people. The process and the
methodologies need to quickly evolve to the virtual space. Lastly, benchmark. Start out knowing what your costs (OPEX and CAPEX) are, try to figure them out upfront, and then benchmark yourself as you go, because you have to show the business the benefits you are driving. Measure all the metrics maniacally; some of the metrics are percentage of virtualization, cost per server saved, number of tools removed (due to standardization), upgrading a proprietary architecture to a virtual architecture, etc. Document all this to show the cost savings.

How will the future of business be driven by Big Data analytics?
The business wants agility. The more we can give them by way of insights across different aspects of the business, the more competitive advantage or value we add to the business. (Using analytics) can we predict a failure or an outage, or something that can save us and our customers money, and drive up satisfaction ahead of time? A lot of it is historical information, some of it supply chain information, there's analytics on what's happening with our products — and if you can bring these dissimilar things together and start analyzing them using smart people like Data Scientists, you've got a winner; you've got insights that give you competitive advantage. If you use tools for your sales force or support organization that give them next-generation analytics information, you are enabling a different capability for the business. And it does not have to be real-time, but real-enough time. There's a certain element of cost with real-time. There's never been a better time to be in IT. You are putting IT back in the word 'opportunity'. Now we get to do the things that we've always wanted to do, in terms of adding value to the business. Previously, technology made it hard for us to do these things. Today, the business is also more receptive to the technology.
Brian Pereira, brian.pereira@ubm.com
Interview
Our success in the industry clearly points to an industry transition: Cisco Data Center SVP David Yen

For a company that has been traditionally known for its networking expertise, how has Cisco fared in the data center market with UCS?
Cisco prides itself on identifying market transitions before they take place. Our entry into the data center through UCS is also an example of this. Cisco started to recognize the convergence of networking, computing and storage in 2004. In 2005, we started a project internally called 'Project California', which later came to be known as the Unified Computing System (UCS). The timing was perfect because it also coincided with the phenomenal rise of cloud — the UCS architecture is extremely suitable for the cloud. We formally entered the market in 2008. Within this short span, Cisco UCS has achieved over 15,000 customers and a top-tier ranking. There was some initial
skepticism when UCS was launched, on whether a new computing technology could take hold in this highly competitive market, but the cost and business benefits proved compelling across all industries and all workloads. Our competitors have a huge installed base and have been nurturing this space for over two decades. However, our success in the industry clearly points to an industry transition away from rigid platforms and toward the more flexible, integrated, and virtualized environments that Cisco UCS offers. Going forward, we expect to see continued growth in UCS, which is one of the fastest growing segments for Cisco.

How different is the company's data center strategy from that of its established competitors?
Data center professionals are being challenged to address major market,
globalization, and technology trends that will drastically change the way data center infrastructures are deployed and operated over the next decade. As a result, the data center architecture is evolving and is increasingly moving to a shared and virtualized model. This transformation is more heavily reliant on the network than ever before. Cisco's strategy and vision is to leverage the network as the underlying platform to transform the data center into a flexible, scalable and virtualized model. It combines an innovative architecture with an integrated technology roadmap, best practices, and services, designed to help customers transform their data center infrastructures, processes, and organizations with maximum effectiveness and minimum risk, cost, or disruption. A successful path to cloud requires a partnership with a data center infrastructure vendor that
Data center virtualization is one of the fastest growing business units for Cisco, globally. Cisco entered the data center market in 2009 and in just three years, has achieved phenomenal growth. In a detailed interview with InformationWeek's Srikanth RP, David Yen, Senior Vice President and General Manager, Data Center Technology Group, Cisco, shares his views on Cisco's data center strategy, emerging technologies such as SDN and the new technologies that will define the data center of tomorrow
can deliver proven solutions and help customers on their journey to deliver IT-as-a-service. Cisco continues to invest in innovation to pave the way to the world of many clouds, and is uniquely positioned to unify the data center by enabling physical and virtual, public and private, commodity and value-add technologies to work together. The Cisco portfolio is structured around three main pillars that address the needs of three key buyers in the data center (the server buyer, the network buyer and the system administrator), while bridging the gaps between the three areas and unifying the architecture in a more holistic view. Through our Unified Data Center platform, we and our strong ecosystem of partners deliver a fabric-based, holistic architecture that changes the economics of IT by simplifying the IT operations required to manage the complexity of the landscape, enabling the business agility companies require to succeed, and containing the infrastructure costs required to deliver high-performance solutions. A wide range of case studies of real customer implementations have shown that Cisco solutions deliver a 30 percent improvement in performance with a 30 percent reduction in infrastructure costs, a 60 percent reduction in power and cooling costs, and a 90 percent reduction in deployment time, while being able to manage twice the capacity with the same level of staffing.

What is your strategy with respect to SDN? Please elaborate on the importance of SDN for the future of data centers.
At Cisco, we have found that our customers have different definitions and expectations for SDN; there is not one size that fits all. The name 'application-driven networks' is perhaps more representative of the change customers are looking for than 'software-defined networks'. As a result, Cisco's strategy involves more than just SDN. It includes network virtualization and programmability via APIs.
Software-defined networking and OpenFlow are integral aspects of the Cisco Open Network Environment (ONE) strategy that was unveiled earlier this year.
Cisco has the following point of view when it comes to the interplay of software and hardware in the data center:
• Software for operational management plays a critical role in the data center, but does not define it.
• Applications play an increasingly important role in defining the infrastructure requirements (as opposed to the virtualization or management software doing it), and that's why we work very closely with our ecosystem of ISV partners to deliver an application-aware infrastructure, built on a combination of industry-standard components, performance-boosting ASICs and Intelligent Network Services.
• Innovation in the physical infrastructure drives the ability to deliver superior application performance, as well as shaping the ability to manage and automate data center operations in a cost-effective way.
• Management software has become one of the largest components of data center spend, because the legacy infrastructure wasn't designed to support a virtualized and automated environment.
That's what we have addressed with the three pillars of our Unified Data Center platform: Unified Computing, Unified Fabric and Unified Management. Cisco's Unified Data Center platform combines hardware and software to enable manageability, orchestration, automation and application performance, thus providing the ideal foundation for cloud and mission-critical apps, while reducing TCO for customers.

In October 2012, Cisco released the Cisco Global Cloud Index Forecast. It stated that global data center traffic will grow nearly four-fold from 2011 to 2016, primarily due to cloud computing. Does this mean that we will see a shift from traditional workloads?
Cloud computing will definitely drive a move from traditional workloads in the future. At Cisco, we see four categories of cloud currently in the marketplace
or emerging in the near future: public cloud, private cloud, virtual private cloud, and eventually inter-cloud. Since every organization will have its unique needs for network, computing, and storage quality, the overall IT complexity differs from agency to agency. Therefore, there is no one-size-fits-all cloud solution.

Your views on some exciting and emerging technologies that will transform the data center of tomorrow?
As we gear up for greater adoption of different models of cloud computing, whether public, private or hybrid, data center technologies for compute, storage and networking are all adapting in this direction. One major impact of cloud is that the industry, from both the application developer and user sides, is demanding a way to abstract the underlying infrastructure to make it more manageable. Therefore, the most significant technology challenge from a data center provider perspective is that we need to find ways to enable this. Cisco has been working to address this issue from both a compute and a switching perspective. Basically, we find that the focus is moving from command language to a level higher. Cisco intentionally takes an architectural approach, rather than a suboptimal technology approach. Automation of service delivery is the high bar, requiring policy-aware management software to manage risks and grow the business. One must have the architecture with smart software through this evolution. With the convergence of compute functions, data management, storage and networking capabilities, the unified approach creates a complete architecture that incorporates increased intelligence for automation and context in the network. Unified Fabric is how we use software to bring the Unified Data Center to life. Data centers will be heterogeneous and support multiple stacks, multiple hypervisors and networking stacks. Cisco will support all of them and is positioned to do so.

Srikanth RP, srikanth.rp@ubm.com
Outlook 2013
Industry leaders predict key trends across various technology segments that are set to influence enterprise IT in 2013
Top 10 IT industry trends: Hitachi Data Systems CTO
Hu Yoshida

Moving into 2013, Big Data will continue to be the primary concern for the IT industry. For example, exabytes will enter into planning discussions and petabytes will be the new norm for large data stores. A lot of attention will be focused on secondary data that is generated for copies and backups. Here are the top 10 IT industry trends to watch out for in 2013:

1. The emergence of enterprise flash controllers: The use of high-performance flash solid state drives (SSDs) in the enterprise has been slow due to their high price and limited durability compared to hard disk drives. The year 2013 will see the introduction of flash controllers with advanced processors that are built specifically for enterprise storage systems and increase the durability, performance and capacity of flash memory.

2. New consumption models: Instead of buying all their storage today and spreading CAPEX over the next 4 to 5 years, organizations will buy what they need when they need it. To do this, organizations need to leverage technologies and capabilities like dynamic storage provisioning, virtualization and non-disruptive data migration. Storage vendors may also provide managed services to help organizations convert CAPEX to OPEX.

3. Managing the explosion of data replication: Replication multiplies data growth, and backups are the biggest driver of data replication. Object stores will help solve the issue by reducing the need to back up and replicate unchanged data.

4. Dramatic changes in OPEX and CAPEX: Over the past 10 years the total cost of storage has been increasing by about 7 percent a year. The increase has been mainly due to operational costs (OPEX), while the cost of hardware (CAPEX) has been relatively flat. CAPEX will trend upwards in 2013 and become a greater share of total cost of ownership because of the increase of functionality in hardware and the demand for storage capacity.

5. New requirements for entry enterprise storage systems: The increasing use of hypervisors like VMware and applications such as VDI have changed the requirements for midrange storage systems. The gap between enterprise and midrange storage architectures is narrowing as the industry begins to demand entry enterprise storage systems. These systems can scale up with increasing workloads by adding more processors, ports and cache and still offer a midrange price point.

6. The need for object-based file systems: The growth of unstructured data will require larger, more scalable file systems. Standard file systems will need to be replaced by object-based file systems to meet this growing demand. Managing file system data and metadata as objects enables fast file system restores, allows high-performance file access, and provides automated file tiering.

7. Accelerating use of content platforms for data archiving and data sharing: Storage virtualization enables applications to share storage resources; however, the application data remains locked in separate silos. In 2013, the use of content platforms for data archives and data sharing will accelerate as users try to correlate information from different applications.

8. Hardware assist controllers to satisfy increasingly complex workloads: Storage controllers will be equipped with advanced processors and hardware assist ASICs to address increasingly complex workloads and higher throughput.

9. Creating a secure platform for the adoption of mobile devices: The adoption of mobile devices increases productivity and innovation, but it also creates a nightmare for corporate data centers. The year 2013 will see the emergence of secure platforms for data sharing that will minimize the security threat of mobile devices and further increase the productivity of mobile workers.

10. More tightly integrated converged solutions: Certified, pre-configured and pre-tested converged infrastructure solutions are gaining traction. In 2013, we will see the growing acceptance of unified compute platforms where the management and orchestration of server, storage, and network resources will be done through a single pane of glass.

Hu Yoshida is VP and CTO at Hitachi Data Systems
Top 7 storage trends: Symantec

1. Software-defined data center becomes the buzzword: Software-defined data centers will take on cloud computing to become the new industry buzzword. In 2013, software-defined storage will have a significant impact on cloud, appliance and SSD/flash storage implementations. Also, commodity hardware, appliances and cloud will become increasingly reliant on smart software that will define and drive the future of data center computing.

2. Increased adoption of flash storage: 2013 will see an increase in solid state/flash storage adoption. Flash storage will be trialed at nearly every Fortune 1000 enterprise, and will begin to replace tier-one storage.

3. Rising interest in integrated backup appliances: Deduplication at the storage target will increasingly be replaced by deduplication closer to the source. Also, purpose-built deduplication appliances will be replaced by integrated backup appliances that combine source and target deduplication, backup software, replication, security and cloud integration in a single appliance.

4. Big Data investments stay small but risks increase dramatically: Big Data projects will increase dramatically, but the monetary investments by brand-name vendors will stay small. New high-price infrastructures will be prohibitive, so organizations will turn to more targeted do-it-yourself, project-based investments using open source Hadoop. The risks associated with Big Data will significantly increase.

5. Cloud outages become more prevalent: There will be a significant increase in cloud outages in 2013, resulting in millions of dollars lost. However, companies will continue to pour resources into cloud offerings. Infrastructures that have scaled quickly with handwritten code and utilize inefficient shared resources will result in major outages. However, backup and disaster recovery appliances and cloud service providers will begin to innovate efficient recovery of data and apps.

6. Hypervisor market share will dramatically change: The market share of hypervisor vendors will begin to balance out between the largest vendors, with each taking close to an equal market share. Organizations of all sizes will evaluate and adopt multiple hypervisors into their virtualization and computing environments. This diversity will cause specific hypervisor point tools to be ripped out and replaced by platforms with more capabilities that support multiple hypervisors, physical, virtual, snapshot and cloud-based infrastructures for backup, recovery and management. As a result, more SMBs will become 100 percent virtualized and use multiple hypervisors in both testing and production environments.

7. Defensible deletion and predictive coding become information governance mainstays: Defensible deletion will emerge as the central approach to overcoming the costs and legal risks associated with managing data proliferation. Predictive coding will become mainstream and overturn the traditional thinking that manual, linear document review is the gold standard and meets legal hold requirements.
8 predictions for business mobility: AT&T

Before we close out the year with holiday feasts and family fun, let's look at predictions for 2013 around mobility. The major trend I see running through this list is that mobile strategy in the enterprise is no longer a side-stage project. It has become the main event. We've reached a point where many businesses already have basic mobility in place: smartphones, data plans, some security, some device protection, compliance policies, and mobile e-mail access for employees. And, more importantly, almost everyone in the organization — IT, marketing, sales and operations — is thinking about what to do next. It's all about the applications that improve productivity, the "Internet of Things" that connects a company's assets and the security services that keep this information protected. 2013 is going to be the year when we will see many more examples of businesses, industries, and eco-systems transforming through the innovative use of mobility.

Here are the eight ways I see mobility evolving in the coming year:

1. 2013 will be a tipping point for machine-to-machine (M2M) communications. With greater standardization throughout the industry and falling costs of key components, I believe that M2M is poised for much more rapid adoption in the coming months.

2. Customers will heighten their focus on end-to-end mobile security, including devices, data, and networks, in light of increased sensitivity to major breaches that may be looming. Everyone realizes that real innovation will only come if we adequately address security and policy issues.

3. Business mobile users will continue to diversify their mobile operating systems. I think that the Windows mobile platform has a chance to be a surprise hit in the business world.

4. Adoption of mobile applications in business will spread across multiple platforms. Native applications are currently the standard for enterprise apps, but we expect HTML5 and mixed-mode applications to gain more traction in 2013.

5. Businesses will start to layer more "platforms" on top of their basic mobility services. Think solutions like Near Field Communications, push-to-talk, and biometrics.

6. Enterprise app stores (or downloadable enterprise apps) will become more prominent in 2013. And they'll be integrated with application management, blessed by IT.

7. We'll start to see more "mobile first" applications. Right now, most companies mobilize apps that already exist as desktop or web-based software. In 2013, we expect to see a shift toward apps designed originally for mobile platforms. Some potential examples: collaboration software; context-, location- or persona-aware access; and mobile-based assistants in business-to-consumer industries.

8. Enterprises will start developing mobile centers of excellence and mobile IT teams to handle the creation and management of mobile strategies across different lines of business.

We will all be surprised at this time next year by what the industry as a whole was able to accomplish in 2013!

Mobeen Khan is Executive Director, Mobility Marketing, AT&T Business
5 key enterprise technology trends for 2013: Brocade
T
he year 2012 will be remembered for many remarkable events, including the Summer Olympics in London that not only showcased the best in human achievement and global collaboration, but also illustrated how technologyobsessed the world has become. More than 4.8 billion people watched the event, with digital viewers outnumbering traditional television viewers for the first time in history. While 2012 was a breakthrough year for broadcasters, enterprises and consumers, let’s see what 2013 holds:
Software-Defined Networking deployments begin: The Olympics proved that
1
consumer demand for information accessibility shows no sign of abating. However, as service providers try to meet said demand, and juggle the complexities of running a profitable business – for instance, to reduce CAPEX (capital expenditure) and OPEX
(operating expenditure) and increase service responsiveness — they are looking for technology alternatives that will streamline service creation and foster innovative applications and services. Software-Defined Networking (SDN) offers a means of doing this. SDN links networks and applications, enabling direct programmatic control of both networks and orchestration layers in line with end-user application needs, rather than programming around the network, as is done today. IDC predicts that by 2016, the market will be worth USD 2 billion a year, up from just USD 168 million today. With the promise of SDN architectures radically decreasing total cost of ownership (TCO), and vendor innovation/ support continuing to increase. We’ll see pockets of actual SDN service deployments across the globe, primarily in the U.S. and Japan.
2. Death of the “transaction-based” user: A “transaction-based user” is an individual who connects to the Internet to conduct an activity (such as making a purchase or streaming content) and then logs off. The premise is that such a user consumes bandwidth sparingly and does not rely on connectivity for every aspect of life. With the roll-out of higher-performing networks (such as 4G and LTE) and devices that offer seamless connectivity, I predict that 2013 will be the year we see the decline of the “transaction-based” user and the rise of the “always-connected” user. Operators will vie for consumer loyalty by offering attractive pricing models, fuelling the trend for 24/7 connectivity. Businesses will leverage this phenomenon and increasingly turn to social media and communities to host a larger portion of their customer experience and support processes. While this will transform engagement models, to be successful, operators and businesses will need to ensure that their back-end networks can meet user expectations. In situations such as this, even one service disruption could be fatal to the customer relationship.

january 2013 informationweek 57
3. Cloud under the microscope: Last year, Brocade predicted that non-IT organizations would move towards “Cloud Service Provisioning” in order to uncover new ways of optimizing and monetizing cloud and service provider networks. Over the next 12 months, I see this trend continuing, but in an evolved state. First, businesses will scrutinize the impact of the cloud, its benefits, usage and ROI more than ever before. For example, are the deployments delivering the agility and cost savings predicted? Are users benefitting? How can one measure cloud ROI when, by design, the assets are not owned by the organization? With this in mind, I predict that IT organizations will attempt to take back control of their own assets and budgets, and that the deployment of private cloud architectures will accelerate during the second half of the year. IT organizations will also challenge the new breed of service provider by offering competing hosted services. This strategy will help them bolster revenue opportunities from a market that will be worth USD 73 billion by 2015. This will be good news for users, but will need to be accompanied by a thorough evaluation strategy to ensure that performance, resilience and cost models meet organizational expectations.
4. Customers bite back on vendor lock-in: As consumers, we don’t like being shackled. More generally, “product de-siloing” is a clear macro-trend. Whether it is the flexibility to personalize apps on our phones or to select the optional extras on our vehicles, choice is paramount. Within the networking space, choice is even more imperative. As alliances ebb and flow, and integrated offerings continue to break into the mainstream, open architectures and multivendor solutions will become more important in 2013. Trust is essential when building a network to support mission-critical applications, and enterprises will turn to trusted vendors to deploy flexible and scalable solutions. As such, vendors unable to coexist in multivendor environments will struggle in this more demanding and competitive landscape. Only the agile and collaborative will prosper, and I expect at least one major vendor casualty in the coming year.
5. Technology will begin to overcome human shortcomings: I remember when desktop icons did not exist; a time when a user had to manually type the name of an application in a shell-like text window to access it. This was time-consuming, error-prone and frustrating, but the desktop icon revolutionized the user experience. Innovations such as this overcame human shortcomings and simplified the user experience, and I predict more such innovations over the next 12 months. Think of voice-command devices, and the fact that modern smartphones can deliver communications, content and compute services in a single form factor. This kind of multitasking would once not have been possible, but technology has advanced to such a degree that it is now second nature to even the most basic user. I expect more breakthrough advances this year.
K P Unnikrishnan is APAC Marketing Director, Brocade
Top 10 strategic technology trends for 2013: Gartner

1. Mobile device battles
By 2013, mobile phones will overtake PCs as the most common web access device worldwide. By 2015, over 80 percent of the handsets sold in mature markets will be smartphones; however, only 20 percent of those handsets are likely to be Windows phones. By 2015, media tablet shipments will reach around 50 percent of laptop shipments, and Windows 8 will likely be in third place behind Google’s Android and Apple’s iOS operating systems. The implication for IT is that the era of PC dominance with Windows as the single platform will be replaced by a post-PC era in which Windows is just one of a variety of environments IT will need to support.
2. Mobile apps and HTML5
The market for tools to create consumer- and enterprise-facing apps is complex, with well over 100 potential tools vendors. Currently, Gartner separates mobile development tools into several categories. For the next few years, no single tool will be optimal for all types of mobile application, so expect to employ several. Six mobile architectures (native, special, hybrid, HTML5, message and no client) will remain popular. However, there will be a long-term shift away from native apps to web apps as HTML5 becomes more capable. Nevertheless, native apps won't disappear, and will always offer the best user experiences and most sophisticated features. Developers will also need to develop new design skills to deliver touch-optimized mobile applications that operate across a range of devices in a coordinated fashion.
3. Personal cloud
The personal cloud will gradually replace the PC as the location where individuals keep their personal content, access their services and personal preferences, and center their digital lives. It will entail the unique collection of services, web destinations and connectivity that becomes the home of a user's computing and communication activities. Users will see it as a portable, always-available place they go to for all their digital needs. In this world, no single platform, form factor, technology or vendor will dominate, and managed diversity and mobile device management will be imperative. The personal cloud shifts the focus from the client device to cloud-based services delivered across devices.
4. Enterprise app stores
Enterprises face a complex app store future, as some vendors will limit their stores to specific devices and types of apps, forcing the enterprise to deal with multiple stores, multiple payment processes and multiple sets of licensing terms. By 2014, many organizations will deliver mobile apps to workers through private app stores. With enterprise app stores, the role of IT shifts from centralized planner to market manager, providing governance and brokerage services to users and potentially an ecosystem to support "apptrepreneurs".
5. The Internet of Things
The Internet of Things (IoT) describes how the Internet will expand as physical items, such as consumer devices and physical assets, are connected to it. Key elements of the IoT being embedded in a variety of mobile devices include embedded sensors, image recognition technologies and NFC payment. As a result, mobile no longer refers only to the use of cellular handsets or tablets. Smartphones and other intelligent devices don't just use the cellular network; they communicate via NFC, Bluetooth LE and Wi-Fi with a wide range of devices and peripherals, such as wristwatch displays, healthcare sensors, smart posters, and home entertainment systems. The IoT will enable a wide range of new applications and services while raising many new challenges.
6. Hybrid IT and cloud computing
As staffs are asked to do more with less, IT departments must play multiple roles in coordinating IT-related activities, and cloud computing is now pushing that change to another level. A recent Gartner IT services survey revealed that the internal cloud services brokerage (CSB) role is emerging as IT organizations realize they have a responsibility to improve the provisioning and consumption of inherently distributed, heterogeneous and often complex cloud services for their internal users and external business partners. The internal CSB role represents a means for the IT organization to retain and build influence inside its organization, and to become a value center in the face of challenging new requirements arising from the increasing adoption of cloud as an approach to IT consumption.
7. Strategic Big Data
Big Data is moving from a focus on individual projects to an influence on enterprises’ strategic information architecture. Dealing with data volume, variety, velocity and complexity is forcing changes to many traditional approaches. This realization is leading organizations to abandon the concept of a single enterprise data warehouse containing all the information needed for decisions. Instead, they are moving towards multiple systems, including content management, data warehouses, data marts and specialized file systems, tied together with data services and metadata, which will become the "logical" enterprise data warehouse.

8. Actionable analytics
Analytics is increasingly delivered to users at the point of action and in context. With improving performance and falling costs, IT leaders can afford to perform analytics and simulation for every action taken in the business. The mobile client, linked to cloud-based analytic engines and Big Data repositories, potentially enables the use of optimization and simulation everywhere and every time. This new step provides simulation, prediction, optimization and other analytics to empower even more decision flexibility at the time and place of every business process action.
9. In-memory computing
In-memory computing (IMC) can also provide transformational opportunities. Millions of events can be scanned in a matter of a few tens of milliseconds to detect correlations and patterns pointing at emerging opportunities and threats "as things happen." The possibility of concurrently running transactional and analytical apps against the same dataset opens unexplored possibilities for business innovation. Numerous vendors will deliver in-memory-based solutions over the next two years, driving this approach into mainstream use.

10. Integrated ecosystems
The market is undergoing a shift to more integrated systems and ecosystems, and away from loosely coupled, heterogeneous approaches. Driving this trend on the user side is the desire for lower cost, simplicity and more assured security; driving it for vendors is the ability to exert more control over the solution stack and to offer a complete stack in a controlled environment, without the need to provide any actual hardware. The trend is manifested at three levels. Appliances combine hardware and software, and software and services are packaged to address an infrastructure or application workload. Cloud-based marketplaces and brokerages facilitate the purchase, consumption and/or use of capabilities from multiple vendors, and may provide a foundation for ISV development and app runtime. In the mobile world, vendors including Apple, Google and Microsoft exert varying degrees of control across an end-to-end ecosystem extending from the client through apps.
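The “transactional and analytical apps against the same dataset” point can be illustrated with SQLite’s in-memory mode, standing in here for a real IMC platform (which it is not): the same in-memory store accepts transactional inserts and immediately answers an analytical scan, with no batch export to a separate warehouse. The table, threshold and data are invented for the example.

```python
# Illustration of the in-memory computing idea: transactional writes and an
# analytical scan run against the same live, in-memory dataset, with no ETL
# hop to a separate analytics system. SQLite's :memory: mode is a stand-in.
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE events (ts INTEGER, account TEXT, amount REAL)")

# Transactional side: events are committed as they arrive
events = [(1, "A", 120.0), (2, "B", 80.0), (3, "A", 9500.0), (4, "A", 40.0)]
db.executemany("INSERT INTO events VALUES (?, ?, ?)", events)
db.commit()

# Analytical side: scan the same data for an emerging pattern "as things
# happen" -- here, accounts whose total spend exceeds a threshold
rows = db.execute(
    "SELECT account, SUM(amount) AS total FROM events "
    "GROUP BY account HAVING total > 1000"
).fetchall()

print(rows)   # [('A', 9660.0)]
```

A production IMC platform does this at far larger scale and lower latency, but the architectural point is the same: one copy of the data serves both workloads.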
Outlook 2013
Top 6 trends for 2013: Citrix
Sanjay Deshmukh

1. Driving business transformation emerges as top priority: Given that the business environment is expected to remain volatile and uncertain, IT service providers and practitioners will be expected to drive business transformation initiatives that help organizations optimize cost and increase productivity. IT will drive programs like work-from-home, or move work to a more optimized location and cut real estate costs. IT will enable employees to work with people, apps and data from any device and any location, thus increasing employee productivity.

2. The role of the IT organization is changing: In the old PC era, companies locked down computers, allowing employees to use only monolithic products over a wired network in an office environment. Today's workers are driving change because of the empowerment they get from consumer devices and self-service cloud apps. In effect, the set of assumptions for IT has been turned on its head, and IT must get into service delivery: the role of IT is now to respond to the way people need to consume services, apps, data and information. The only thing that will solve this problem is a holistic solution with many piece parts working together well. You can't just solve a single sign-on problem or a mobile device problem as an island; these narrowly defined problems are deeply interrelated and need to be solved together. IT organizations need to think of themselves as "internal service providers", and have to "compete" against commercially available apps and products. The "IT-as-a-Service" trend extends to service providers, who are able to provide Windows apps and desktops more efficiently than the IT organization itself, and IT shops are compelled to source desktops from outside their walls. IT's security and compliance models shift from the "locked-down" device to the virtually issued and controlled device.

3. Consumerization and mobility will continue to change: The levels of complexity will get even deeper, with a plethora of heterogeneous form factors, platforms and devices coming into the workplace. The core challenge is delivering and securing the app and the data, not the device. It will become impossible for IT to find one solution, or set of solutions, to secure all these devices. To add to this, the heterogeneity is not just about the device; it extends to the worker, the app and the location as well. The standard office-based worker has morphed into office-based, temporary, remote, flextime and mobile workers. What used to be primarily Windows apps has become a mix of Windows, mobile, web and SaaS apps. Workers need secure access to all their apps regardless of where they are "located", and location itself becomes heterogeneous: apps may be delivered from the data center, from the web, or natively on a mobile device, and they will be delivered to many more locations than before.
4. Mobility is a lot more than just devices or apps: The purview and definition of mobility will change as organizations consider how to deliver all these apps to any user at any time, regardless of the device. Mobility will be not just about user mobility but about app mobility: an application may be delivered from two different locations to a user who could be anywhere. In the context of these newer realities, mobility won't really be about the device that enables it, but about being able to do what we want, where we want and when we want.

5. BYOD hits mainstream: The move by IT to allow BYOD hits the mainstream; the corporate-issued laptop begins to become a bygone piece of equipment. With BYOD mainstream, draconian mobile device management measures will cease to be effective, and the contours of MDM will be redefined. Companies will not want a blanket kill pill, but will need to adapt to the changing devices coming into the enterprise. Whether as a powerful feature of app delivery or as a feature of data protection solutions (in the OS, app delivery or platform), mobile device management will become one component of a much broader enterprise mobility strategy.

6. App explosion and the shift of application architectures: Existing enterprise apps have always been about the local interface, but with anywhere, anytime access, enterprise apps will quickly need to be rebuilt with cloud context and a cloud interface: not just web-enabled but deeply cloud-integrated. Apps without cloud integration won't survive. The other facet is that enterprise apps are no longer the only standard: micro apps and alternative apps (like Gmail and Google Docs) are rising in enterprises. Windows apps for viewing and note-taking are transforming into micro apps and cloud-based services. Web-ifying and mobilizing apps is the future, so that apps are synced and can be used on any device.
Sanjay Deshmukh is Area Vice President, India subcontinent, Citrix
Industry Interface
DSCI: Cyber attacks on national infrastructure increasing
The NASSCOM-Data Security Council of India (DSCI) conducted its Annual Information Security Summit in Mumbai on December 11-12. The theme for the 7th edition of the summit was ‘India meets for Security’, and the focus was on the national and enterprise aspects of cyber security. The Annual Summit is the flagship event of DSCI, where security professionals from industry verticals and government meet to exchange ideas, share knowledge, and discuss and deliberate on common issues faced by the security community globally. The summit was inaugurated by N. Chandrasekaran, Chairman, NASSCOM and CEO, TCS. The special guests in attendance were Jack Christin Jr., Associate General Counsel, Global Asset Protection, eBay, and Latha Reddy, Deputy NSA.

Speaking at the inauguration of the summit, Dr. Kamlesh Bajaj, CEO, DSCI said, “The global and national landscape for data protection has witnessed many changes. And we have modified our work plan to stay in tune with these changes so as to deliver value to the industry, user organizations, public sector and the government.”

Alluding to some of the changes in the landscape, Dr. Bajaj said individuals now face a greater privacy risk, especially on social networks, and there is a need for individuals to control their data. Countries in the EU and the U.S. are looking to revise existing laws; he mentioned the U.S. Consumer Bill of Rights and the EU’s Data Protection Regulation. “Cyber threats are on the rise globally. There’s a rise in cyber crimes, cyber espionage, and attacks on critical information infrastructure. It is not just the financial frauds, identity thefts, and copyright and trademark violations, but attacks on critical information infrastructure that are catching attention and linking cyber security to national security. The global losses
(due to cyber attacks) are estimated to be between USD 3.7 trillion and USD 6.9 trillion, with losses in the banking sector of up to USD 1.3 trillion,” informed Dr. Bajaj. He pointed out that attacks on high-value institutions and highly secured organizations are on the rise; even security companies are being attacked these days. Espionage and theft of intellectual property and state secrets are also increasing, with IPR losses estimated at about USD 1 trillion every year. Attacks on social media platforms and applications have grown too, and social networks are being misused to spread rumours and arouse feelings of hatred and contempt between communities. Being a global hub for outsourcing, India also faces ICT supply chain risks. All this calls for more awareness and regulation from both government and industry, and for modification of Indian laws to address new technologies like cloud computing and social networks. In fact, the Government of India has already set up a working group to formulate a cloud policy. On the industry side, Microsoft, for instance, is about to launch a radio campaign targeted at college students, aimed at creating awareness about using social networks in a responsible manner.

The summit was spread over two days, during which a number of keynotes, roundtables and panels around specific security and privacy themes were held to promote security approaches and solutions. One of the panels this year, ‘Women in security’, raised some interesting points about gender equality, and there was a general session on careers in security too. In other sessions, experts from industry, academia and government deliberated on a range of issues, from cyber security and Internet governance to engagement with the EU on India’s ‘adequacy’ under the Data Protection Directive for transborder data flows.
DSCI UPDATE
Dr. Bajaj also updated the congregation on DSCI’s accomplishments over the past year. DSCI set up a task force called the Cyber Security Advisory Group, under the Chairman of NASSCOM. The Group released a report in April 2012 titled ‘Securing our Cyber Frontiers’. The report, which was presented to the Home Minister and the National Security Advisor, has 10 key recommendations. “The government now realizes that it cannot secure cyberspace alone; private sector support is essential. DSCI and NASSCOM have had several discussions with the government. NASSCOM member companies have contributed to those discussions at various levels,” said Dr. Bajaj.

He informed us that the government has set up a permanent joint working group, under the chairmanship of the Deputy NSA, to oversee and coordinate implementation. The sub-groups focus on areas such as capacity building, testing and certification in the context of the ICT supply chain, standards and audits, critical infrastructure protection, centres of excellence, and international cooperation and advocacy.

“In the context of privacy, we have held a number of discussions with the European Union on data transfer and the adequacy of India. We made presentations about how India has a strong data protection regime and how the privacy principles under section 43 of the IT Act correspond to what the Europeans have for data protection. It is an ongoing engagement,” informed Dr. Bajaj.

With regard to privacy developments in India, it was felt that, apart from the IT Act 2008, there was a need for a horizontal law covering all areas of government and the private sector. The government is implementing many projects, such as NATGRID and UID/Aadhaar, which impact the privacy of individuals. DSCI and NASSCOM are now involved in the establishment of a new international standard, ISO 27036, for security in supplier relationships. The standard, currently under development, will impact the outsourcing industry. DSCI has also been called upon by the Government of India to express its stand on Internet governance; there has been an international debate on how ICANN, a U.S. organization, governs Internet controls, and on the need for more international participation in the technical and security controls of the Internet. DSCI also released reports on thought leadership and assessment frameworks for privacy and security during the summit.

N. Chandrasekaran, Chairman, NASSCOM and CEO, TCS
Cloud and mobile: A nightmare for security?
Brian Pereira, brian.pereira@ubm.com
Jasmine Kohli, jasmine.kohli@ubm.com
Som Mittal, President, NASSCOM
With organizations increasingly going mobile and taking the route to the cloud, a number of challenges are emerging on the security front. To enable organizations to address these security concerns and reap the benefits of these technologies, NASSCOM-DSCI (Data Security Council of India) organized a panel discussion on the topic ‘Revolution named Clobile, Nightmare for Security?’ at its Annual Information Security Summit 2012. The panel discussion, held on day one of the two-day security event, highlighted the security threats emerging from the heightened use of cloud and mobile (‘Clobile’) by organizations. The panel was chaired by Som Mittal, President, NASSCOM, and the panelists included Pratap Gharge, Executive VP & CIO, Bajaj Electricals; N Nataraj, CIO, Hexaware; Jagdish Mahapatra, MD, McAfee India and SAARC; and Sajan K Paul, CTO, Juniper.

The panelists unanimously agreed that cloud and enterprise mobility are revolutionary trends with the potential to transform business. They added that it is imperative for organizations to address the information security risks emerging from the clobile ecosystem (which includes mobile companies, cloud providers, app developers, growing front-end/back-end services and financial intermediaries) in order to realize the full potential of these technologies. The panelists noted that today’s young, gadget-savvy workforce expects the flexibility to access data anywhere, any place and on any device. The trend promises benefits such as better work-life balance and improved productivity, but it also increases the pressure on IT to protect data, and to manage and secure these devices. “With today’s generation being gadget-savvy, we have to include a BYOD policy. The security challenge with respect to BYOD is great. Security, be it in relation to gadgets or the cloud, is a challenge. Efforts to keep data and information safe within the corporation should be continuous, not a one-time action,” said Gharge.
Echoing the same thought, Nataraj said, “The consumerization of IT is compelling enterprises to adopt next-generation technologies. The emerging use of clobile requires a new and focused approach to data protection. We need a multi-pronged strategy to tackle the risks associated with the clobile ecosystem.” He further said that to combat security issues, organizations need to deploy new technological solutions, take a holistic view of the entire security landscape, and frame organizational policies that address the new security requirements. Mittal stressed the importance of employee training in preventing data breaches and tackling security challenges: “Human acts are more responsible for security breaches than technology flaws,” he said.
DSCI honors organizations, individuals for undertaking excellent security initiatives

The DSCI Excellence Awards, now in their second year, were given away by Latha Reddy, Deputy National Security Advisor and Secretary, NSCS, during the summit, and the ceremony was inaugurated by N. Chandrasekaran, Chairman, NASSCOM and CEO, TCS. Speaking at the award ceremony, Dr. Kamlesh Bajaj, CEO, DSCI said, “Last year, we decided that it was important to recognize, honor and reward organizations and individuals who have implemented strong, effective and resilient security programs that help the organization address real risks, build resilience, increase trustworthiness and create a conducive environment for doing business. Therefore, we instituted the DSCI Excellence Awards. We received an overwhelming response; for the corporate awards we received nearly 60 nominations representing 52 organizations. This year, the response has been even bigger, and we received 78 nominations.”

This year, DSCI invited nominations for the corporate segment in sectors like banking, telecom, e-governance, IT services (large), IT services (SME), global BPO and captive BPO. Three new categories were introduced in the corporate segment: e-governance, privacy, and emerging information security product company. In the Law Enforcement Agency (LEA) segment, there were two new awards: Cyber Cop of the Year and Capacity Building of Law Enforcement Agencies. The latter was for a state police force or police agency that has done its best to train its police officers and has set up facilities for tackling cybercrime; the award was presented for the best investigated case. DSCI received 17 nominations for the two awards in the LEA segment.

The jury for the corporate awards included: Dr Ganesh Natrajan, Vice Chairman and CEO of Zensar; S Sambamurthy, Director, Institute for Development and Research in Banking Technology; Kersi Tavadia, CIO, BSE Ltd;
Dr. Kamlesh Bajaj, CEO, DSCI; Dr. Triveni Singh, DSP, Cyber Crime Cell, Noida, UP Police (award winner); Latha Reddy, Deputy NSA; Som Mittal, President, NASSCOM
BK Syngal, Ex Chairman and Managing Director, VSNL; Prof. (Dr.) Asim K Pal, Management of Information Systems Department, IIM Calcutta; and Prof MP Gupta, Department of Management Studies, IIT Delhi. The jury for the law enforcement awards included: Loknath Behera, IGP, NIA; Pratap Reddy, IGP, Western Range, Karnataka; Nandkumar Saravade, Citi Security and Investigative Services, South Asia; and Vakul Sharma, Advocate, Supreme Court. PricewaterhouseCoopers acted as the process partner of DSCI for a transparent process in shortlisting the finalists, while the jury decided the winners. In the corporate segment, DSCI
Excellence Award 2012 for Security in Bank was given to HDFC Bank; Security in Telecom was bagged by Aircel; Security in E-Governance was awarded to the Central Board of Excise & Customs (CBEC); Security in IT Services (Large) was given to Tech Mahindra; Security in IT Services (SME) was awarded to Ugam Solutions; Security in Global BPO was given to Genpact; and Security in Captive BPO was bagged by ADP Pvt Ltd. Apart from this, the DSCI Excellence Award 2012 for Privacy was bagged by IBM Daksh Business Process Services, and the DSCI Excellence Award 2012 for Emerging Information Security Product Company was won by Pawaa Software. Col. Arun Kumar Anand, NIIT Technologies; Nandita Mahajan, IBM Daksh Business Process Services; Pankaj Agrawal, Aircel; and Vishal Salvi, HDFC Bank were each named Security Leader of the Year. In the LEA category, the DSCI Excellence Award 2012 India Cyber Cop of the Year was won by Dr. Triveni Singh, DSP, Cyber Crime Cell, Noida, UP Police, and the DSCI Excellence Award 2012 for Capacity Building of Law Enforcement Agencies was bagged by Karnataka Police.
Dr. Kamlesh Bajaj, CEO, DSCI
Brian Pereira, brian.pereira@ubm.com
Event
Industry experts discuss pros and cons of cloud computing
InformationWeek conducted a roundtable on ‘Leveraging the cloud for strategic business advantage’ in Bangalore and Delhi. Moderated by InformationWeek editors Brian Pereira and Srikanth RP, the panel discussion was held in association with CA. The theme of the roundtable was how the cloud can help businesses gain competitive advantage, with specific emphasis on how it can make business more nimble and enable companies to respond quickly to sudden changes in the business environment.

The Bangalore event was attended by N Gajapathy, CEO & vCIO, VG Solutions & Services; Philu Thomas C, Director, ETS, Manhattan Associates; and Udayan Banerjee, CTO, NIIT Technologies. In Delhi, the panel discussion had participants from industries such as manufacturing, healthcare, oil and gas, and insurance. The distinguished panelists who spoke at the event included Vijay Sethi, CIO, Hero MotoCorp; Daya Prakash, CIO, LG Electronics; Pertisth Mankotia, Head-IT, Sheela Foam; Mathew C George, Chief Manager, Indian Oil Corporation; M Thyagaraj, President, Ardee Resources; Ajay Jassal, Senior VP, Business Process & IT, Centre For Sight; Harnath Babu, VP-IT, Aviva Life Insurance Company;
Satish Mahajan, DGM, IT South Asia, VFS Global Services; Pradeep Saha, Sr. VP-IT, Max Healthcare Institute; C G Prasad, Director, Information Systems, Premier Inn India; Mrigank Tripathi, Founder, Applied Mobile Labs; and Surinder Kapur, Consultant-IT, Delhi State Cancer Institute.

The panelists shared how the cloud has enabled organizations to roll out products and services to market faster. M Thyagaraj, President, Ardee Resources and former CIO, ONGC, explained how the cloud can help organizations save on CAPEX-related costs, summing it up with an analogy: “If you want milk, why buy a cow?” Vijay Sethi from Hero MotoCorp shared his perspective on how his firm uses the cloud for competitive advantage, specifically in the area of dealer management. Daya Prakash from LG Electronics spoke on early experiences of using the cloud, while George Mathew from Indian Oil Corporation spoke on the importance of standardization for the cloud. Other CIOs, like Harnath Babu from Aviva Life Insurance, spoke on the tremendous time and scalability advantages that cloud computing technologies can enable. The cloud can also help in implementing standard best practices across the globe, as CG Prasad from Premier Inn explained.
Daya Prakash, CIO, LG Electronics; Subramaniyan Venkataraman, VP, Software Engineering, CA Technologies; and Vijay Sethi, CIO, Hero MotoCorp
64
informationweek january 2013
Udayan Banerjee, CTO, NIIT Technologies
While every CIO agreed that cloud computing as a concept has truly arrived, the common view was that organizations must initially start with small applications, and later expand to core enterprise applications.
Pertisth Mankotia, Head-IT, Sheela Foam; Ajay Jassal, Senior VP, Business Process & IT, Centre For Sight; and Pradeep Saha, Sr. VP- IT, Max Healthcare Institute
www.informationweek.in
Panelists discuss security risks in financial industry
In a roundtable organized by InformationWeek on the topic ‘Managing risks, frauds and privileged identities in the financial world’ in Mumbai, eminent CIOs and industry leaders discussed various security challenges affecting the community in the BFSI space. Held in association with CA, the session was moderated by InformationWeek’s Executive Editor, Srikanth RP. The panel saw participation from prominent industry experts from various micro-segments of the BFSI sector, such as banks, insurance companies, and credit companies. Some of the panelists included
V Subramanian, CISO, IDBI Bank Ltd; N D Kundu, Chief Manager - IT Security, Bank of Baroda; Onkar Nath, CISO, Central Bank of India; Sunder Krishnan, Executive VP & Chief Risk Officer, Reliance Life Insurance; Hiren Shah, AVP – Technology, ICICI Lombard General Insurance Co; Sagar Karan, Information Security Officer, Fullerton India Credit Company; Atanu Bhaumik, Chief Manager – IT, Swadhaar FinServe; Sunny K P, CIO, Dy General Manager, Federal Bank; Manoj Nanda, VP – Projects, HDFC Securities; Kersi Tavadia, CIO, BSE Ltd; Prem Kumar Gurnani, Assistant GM, State Bank of India; and Narendra Misra, Dy GM (ISO), State Bank of India.
Kumar Gurnani, Assistant GM, SBI; Atanu Bhaumik, Chief Manager – IT, Swadhaar FinServe; and Manoj Nanda, VP– Projects, HDFC Securities
Onkar Nath, CISO, Central Bank; ND Kundu, Chief Manager, IT Security, Bank of Baroda; and Narendra K Misra, Dy. GM ISO, SBI
The panelists discussed current security risks encountered by firms in the BFSI sector. They also highlighted new-age threats emerging in the sector, with next-generation banks adding delivery channels such as mobile and social media. The panelists also discussed in detail the importance of securing data and maintaining confidentiality in the sector, keeping in view the guidelines given by regulatory bodies like RBI, SEBI and IRDA. The panelists shared best practices to manage risks and frauds across various channels and discussed steps that can be taken to ensure security.
Sunder Krishnan, Executive VP & Chief Risk Officer, Reliance Life Insurance and V Subramanian, CISO, IDBI Bank
Kersi Tavadia, CIO, BSE and Sagar Karan, CISO, Fullerton India
Event
Analytics in business transformation
CIOs get together to discuss challenges with managing Big Data and the use of analytics in the BFSI vertical
Business transformation depends on the quality of decisions made for the business. But to make smart decisions you need deep insights into business data. Organizations need to spot trends, patterns and anomalies to predict opportunities and threats. The challenge is that there is so much data to analyze, and it’s coming in at an ever faster clip. In 2011, 5 exabytes of data were generated every day globally (an exabyte is equal to one billion gigabytes). In fact, 90 percent of all the data in the world was generated in the last two years alone.
Decisions cannot be made only on operational or transactional data. You also need to look at unstructured data, like customer feedback and the details in contracts and agreements. In fact, 80 percent of data is unstructured, so you need the right tools to help you analyze it. To discuss this important topic, InformationWeek and IBM conducted a symposium titled Transformational Analytics in Mumbai on 29th November 2012. The event included a panel discussion where the panelists explained what transformation meant in the context of their business, and
The panelists (L-R): Sunil Rawlani, Business Transformation Advisor and Founding Partner, MetamorphosYs; Sankarson Banerjee, Former CIO, India Infoline; Sameer Ratolikar, Global CISO, Bank of India; Nikhil Gundale, VP Systems and Technology, Lowe Lintas India; Prashant Tewari, IBM Software Group, ISA region
Pallav Nuwal, Business Manager - Predictive Analytics India-South Asia, IBM makes a presentation titled Predictive Analytics for your Business
Rajesh Shewani, Country Manager - Technical Sales, IBM Business Analytics, India/South Asia makes a presentation titled Business Intelligence for Better
how they contributed to business transformation. The discussion eventually veered to Big Data, and the panelists shared how they managed Big Data and derived meaningful insights from it. They also discussed how they use BI and analytics tools, and the challenges they currently face. One of the panelists was Prashant Tewari, Business Unit Executive, Business Analytics brand, IBM Software Group, India South Asia region. Tewari shared some use cases for Big Data and business analytics in certain verticals. He also talked about the top applications for predictive analytics in the BFSI sector.
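The idea of spotting trends and sentiment in unstructured text, as discussed at the symposium, can be illustrated with a minimal sketch. The word lists and scoring rule below are purely illustrative assumptions, not the approach of any panelist or vendor mentioned; a production system would use a proper NLP library or a commercial text-analytics product.

```python
from collections import Counter
import re

# Tiny illustrative lexicons -- real systems use far richer models.
POSITIVE = {"good", "great", "excellent", "fast", "helpful"}
NEGATIVE = {"bad", "slow", "poor", "broken", "rude"}

def score_feedback(comments):
    """Return (sentiment_score, top_terms) for a list of free-text comments.

    sentiment_score is (positive - negative) / matched words;
    top_terms are the most frequent non-trivial words -- a crude
    stand-in for spotting trends in unstructured data.
    """
    words = [w for c in comments for w in re.findall(r"[a-z']+", c.lower())]
    pos = sum(w in POSITIVE for w in words)
    neg = sum(w in NEGATIVE for w in words)
    matched = pos + neg
    score = (pos - neg) / matched if matched else 0.0
    top = Counter(w for w in words if len(w) > 3).most_common(3)
    return score, top

feedback = [
    "Great service, very fast delivery",
    "App is slow and support was rude",
    "Excellent product, helpful staff",
]
score, top = score_feedback(feedback)
print(round(score, 2), top)
```

Even this toy version shows why the panelists stressed tooling: lexicons, language handling and scale all become hard problems long before the analysis yields business insight.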
Networking activity at the symposium
The audience listens attentively
Analyst Angle
The power of a great marketing-IT relationship
Linda Price
Seeing the CMO and CIO as wanting different things is an old way of thinking. Now the enterprise benefits when CMOs and CIOs combine their strengths
On the web: IT and marketing: How digital media’s changing the relationship
There is a significant transformation starting to take place as digitization continues to change the business landscape. We are seeing the control of an organization’s IT budget slowly slipping away from the CIO, with many vendors now selling around the IT department and new disruptors presenting alternative solutions. Gartner predicts that 35 percent of enterprise IT expenditure for most organizations will be managed outside the IT department’s budget by 2015. Five years from now, we believe that an organization’s chief marketing officer (CMO) may have a bigger slice of the IT budget than the CIO. As the business landscape continues to change and digital marketing increasingly drives business growth, marketing is becoming a fundamental driver of IT purchasing. The race to the customer has been fuelled by tough economic times and the realization that a focused business strategy creates better results for the enterprise. It’s simply not an option to treat the customer as a commodity. In this environment, the relationship between marketing (the custodian of the customer) and IT (the custodian of technology) becomes more central to success. This is good timing, because technology is ready as never before. Clearly, it is time for the head of marketing and the head of IT to build better and more strategic partnerships. A powerful CMO-CIO relationship creates economic and strategic value in four areas: customer engagement, product innovation, integrated business processes and market/customer/competitive intelligence. By joining forces, marketing and technology can present a clearer vision of what is achievable and produce a road map that the enterprise will readily follow. Historically, a natural tension has existed between marketing and IT. Marketing has always had a need for speed, and technology has always needed
enduring systems. Until recently, getting technical infrastructure to work commanded most of a CIO’s attention. Most CIOs now have infrastructure under control, thanks in part to alternate delivery models such as the cloud. This not only frees CIOs from direct involvement with infrastructure; it also means IT has the core strength to help marketing generate revenue. Today, they have more in common than ever, which underscores the importance of working together. Seeing the CMO and CIO as wanting different things is an old way of thinking. Now the enterprise benefits when CMOs and CIOs combine their strengths. Marketing and IT are a good match on another level. IT has the systems and technology skills to deliver marketing initiatives, and marketing has customer relationships that IT can help strengthen. Three inescapable facts confront today’s CMOs and CIOs:
- Enterprises that create a seamless experience for consumers and business customers will win.
- Enterprises that anticipate what consumers and business customers are going to buy next will win.
- Enterprises with a strong marketing-IT relationship will win.
Where, then, should a marketing-oriented CIO focus these days? The logical choice is the front office, where the money is made. Moreover, as the pace of change in the marketplace picks up, no organization wants to be left behind. This is also an exciting and personally rewarding place for a CIO to make a mark. Put simply, a strong marketing-IT relationship helps bring more business results to fruition. Now is the time for enterprises to invest in this relationship. The benefits to the organization in general, and to the CMO and CIO in particular, are too big to pass up.
Linda Price is Group Vice President at Gartner
Global CIO
A proposal for IT: Set just one goal for 2013
Chris Murphy
Be measurably more relevant to your customers
Blogs: Chris Murphy blogs at InformationWeek.
This time, rather than creating a short list of priorities, IT leaders might consider setting just one goal. And if it’s only one, that goal should be one that makes you at least a little uncomfortable to think about. So here’s what I propose for 2013: Make IT measurably more relevant to your customers. I’m talking about end customers, the people or companies that buy our stuff — not IT’s internal customers, a.k.a. employees. Following are three avenues to do that. Understanding customers: Companies have always craved information about their customers. What’s different today is the potential to predict consumer behavior with greater accuracy. Better analytics and the ability to more affordably crunch huge amounts of data create this opportunity. This “understanding customers” category might sound like the safest, easiest way for IT to get closer to customers — just watching and analyzing. But it doesn’t stop there. Yes, customer data helps you run the business internally, but would Big Data analysis help the customer even more if you gave it away? This is the classic FedEx breakthrough — package tracking was an essential internal tool, but sharing package status transformed customer relationships in the industry. Sharing your analysis with customers can get uncomfortable. I talked with a CIO whose company shared usage data with one customer, only to have that company conclude it should buy less. The customer was underutilizing what it had. Not only might customers not thank you for the data, they might ask why you sold them that last unit and how long you’ve had these insights. In consumer products, predicting behavior based on social media sentiment is a new opportunity. Can social media predict the success of a new product launch, and even help steer production levels and reduce stockouts? Can social media analysis spot new product ideas
or get customers involved in product development? 2013 will be a year of experiments, embarrassing failures and early successes in social media analysis. Promotions for customers: A big reason marketing is increasing spending on technology is to make promotions more data-driven and efficient. But there’s a long way to go in customizing them. Just because people sign up for a loyalty card doesn’t mean they want a stream of promotions. I love my bike and skis and wouldn’t mind hearing more from the companies that make them, but I never need to hear from my coffee company. Does your data make those distinctions? Sears is an innovator in using Hadoop to mine Big Data for customer insights, and it’s in the early stages of applying that to promotions. Big-box retailers need to test more customized, real-time promotions to compete with online stores. But when I wrote about Sears’ efforts, this note from a reader warned of the risks: “I think that retailers may have forgotten that the customer is a person that does not always make logical decisions, but will hold grudges.” Products for customers: The ultimate customer-facing role is to help drive technology that the customer actually uses. Building customer-facing products often takes higher quality and faster development than IT is used to for internal projects, so don’t underestimate the cultural change required. Mobile apps have accelerated this trend, giving companies a new way to interact with customers. Those apps also create a new data source, which brings us full circle to understanding customers. We’re living through a historic shift that makes technology more important — in fact, indispensable — to building close customer ties. IT leaders can seize the moment by ruthlessly focusing 2013’s goals on the customer who buys their products.
Chris Murphy is Editor of InformationWeek. Write to Chris at cjmurphy@techweb.com
Practical Analysis
HP’s identity crisis continues
Art Wittmann
HP’s problems go much deeper than poor sales in core businesses and some alleged accounting mischief by Autonomy
Blogs: Art Wittmann blogs at InformationWeek.
I’m not sure what’s worse: the HP earnings calls where Meg Whitman just reports progress on HP’s slow slide into becoming Dell, or the ones where she drops bombshells like saying Autonomy messed with its books before HP purchased it. Either way, it’s a melodrama that mostly confirms the opinion many investors have long held: The road to profitability will lead to a much smaller company — and one that hopefully has some focus. And not just any focus, either. A smaller HP must find a niche to own, much as IBM hung its hat on high-end services and consulting. But that type of deliberation doesn’t seem to be happening. At one time, HP talked about secure, cloud-based information management. That effort would be led by Autonomy, bolstered by technology from ArcSight, Fortify and TippingPoint, and further enabled by HP services, networking, servers and storage. It wasn’t a bad story. Except, that is, that it relied almost completely on software — something HP has never been known to do well. The way around that is pretty simple: Buy good software companies and do your level best to leave them alone culturally. The road to a cohesive product portfolio is certainly longer than if you have a history of good software lifecycle management, but not as long as attempting to grow that competence from the ground up. To that end, the acquisitions of ArcSight, Fortify and TippingPoint (part of HP’s 3Com purchase) all happened in 2010, followed by Vertica and Autonomy in 2011. The good news is that HP’s software business is now in fact doing pretty well, growing 14 percent year over year for the last quarter with an operating margin of 27 percent. Software services revenue was way up, with a 48 percent jump. Networking sales improved a bit too. Unfortunately, that’s about all the good news. Printing revenue is down, on both the consumer and commercial sides. PC sales are slumping in both markets, too. Yes, these are tough times
for all PC vendors, but even so, HP’s losses here outpace the average by about 50 percent. Server and storage sales are also down. But it was the business-critical computing, or BCC, systems unit that took the worst hit. In terms of a strategy, HP will still focus on enterprise software and wait a while to see what happens with Windows 8 — which is what HP management appears to be doing both for servers and desktops/laptops. There seems to be great hope that the HP/Microsoft partnership will bear sufficient fruit to show that HP should still be in the business of making end user devices. The Autonomy debacle does bring one significant question to the fore, however. After a string of bad purchases and worse staffing decisions, why is Ray Lane still there as board chairman? Though Lane wasn’t board chairman at the time of the Palm acquisition, he certainly played out every bizarre possibility with the company, including attempts to put the PalmOS into the public domain, one assumes in hopes that it would become another Linux. Poor handling of Mark Hurd’s firing, a board spying scandal and the choice of Leo Apotheker was another bizarre string of actions, given that Apotheker had no experience on the consumer or hardware side of the business. Prior to 2012, HP typically made three to six major acquisitions per year. This year, it’s made none. It’s doubtful that the company has all the building blocks for whatever management’s vision might be. It’s also doubtful that growing competency and products internally is the right move, so it’s hard to see why it isn’t making some acquisitions. Is it the timidity that comes from being insanely wrong on some of its largest purchases, or is it internal disagreements? I have no idea. But if I were a shareholder, I might want a new chairman.
Art Wittmann is Director of InformationWeek Analytics, a portfolio of decision-support tools and analyst reports. You can write to him at awittmann@techweb.com.
Down to Business
Are we giving CIOs an inferiority complex?
Rob Preston
CIOs need to be technical, without apology, just as chief medical officers need not apologize for their grounding in medicine
Blogs: Rob Preston blogs at InformationWeek.
Are people losing respect for the CIO profession, or have they just lost their perspective? While other C-level executives command authority and are lauded for IT savvy if they know how to buy a cloud service, CIOs are nitpicked for spending too much time on their core technology competency and not enough time parked in other parts of the business. In a recent column, I wrote that CIOs need to form stronger bonds with their C-level peers and take on formal responsibilities outside of the IT organization. One former CIO, CEO and COO wrote to me to object. Why do other C-level execs think they can assume the IT function, he wrote, and why do CIOs appear to have an “inferiority complex” about their technical capabilities? The writer, Steven Poole, whose career spanned senior executive positions in the public and private sectors in Canada, raised other valid points. Some clarifications are in order. CIOs need to be technical, without apology, just as chief medical officers need not apologize for their grounding in medicine. CIOs must have experience managing and developing applications, systems, projects and architectures. There are exceptions to this rule: the HR or customer service exec who steps in and runs a first-class IT organization. But those execs are usually placed in the CIO position to fix a dysfunctional organization, institute cost discipline, bring silos together or instill a customer focus. And then they’re rotated out. Rarely does the nontechnical CIO thrive in that position long term. But that doesn’t mean CIOs should rest on their technical laurels either. While Poole noted that other chiefs (HR, marketing, finance, etc.) “are generally quite secure in their seat in the executive boardroom” without feeling pressure to move outside their core competencies, CIOs sit in a different place. Because they’re building systems
for sales, marketing, logistics and other departments, CIOs must understand those areas far better than the average exec. And don’t think for a minute that other executives aren’t called on to expand their expertise. Consider the CFO position. In a 2005 report titled “The Activist CFO,” CFO Research Services and Booz Allen Hamilton urged CFOs “to take on an expanded and increasingly activist role within their companies ... not just supporting the business with information and analysis, but also ensuring that the entire enterprise delivers on its commitments.” Likewise, we’re not asking “activist CIOs” to become superheroes, but we’re urging them to get more experience in operational and customer-facing roles. When I recently asked one leading CIO what he’s looking for in a successor, he immediately talked about breadth of experience, and not just in IT. That’s not because he has an inferiority complex about the importance of technical acumen. It’s because he understands that when your position is intertwined with so many lines of business, it’s essential to truly understand their processes, challenges and opportunities. In the end, I don’t think Poole’s view is all that different from mine. “Any CIO who only manages IT operations is clearly not contributing sufficiently. This is no different from the CFO who is really just an accountant or the [chief human resources officer] who is merely a recruiter,” he wrote. Poole continued: “A good CIO knows how to leverage IT to enable the business in a manner that is evident to the executive team. The principle is the same for all C-level executives. CIOs simply need to take their seat at the C-level table.”
Rob Preston is VP and Editor-in-Chief of InformationWeek. You can write to Rob at rpreston@techweb.com.