CTO FORUM
Technology for Growth and Governance
May | 07 | 2012 | 50 Volume 07 | Issue 18
SEVEN STEPS TO AVOID FAILURE PANIC | ASSESSING MOBILE APPLICATIONS | MANAGING YOUR PRIVATE CLOUD
BIG DATA CHALLENGE OR OPPORTUNITY? Page 24
Volume 07 | Issue 18
BEST OF BREED: It's Time to Rethink Outsourcing PAGE 18
NEXT HORIZONS: Is Siri Smarter than Google? PAGE 42
TECH FOR GOVERNANCE: Firewalls, Anti-Virus Aren't Dead. Should they be? PAGE 36
A 9.9 Media Publication
EDITORIAL YASHVENDRA SINGH | yashvendra.singh@9dot9.in
Big Data, Big Hype? While vendors are aggressively pushing Big Data solutions, do you actually need them?
Just like Cloud Computing, Social Media and BYOD, Big Data has fast emerged as one of the most popular IT terms of today. But does the expression (used to describe the explosion in the growth of data, its availability and its usage) have any significance for Indian CIOs, or is it just big hype? To put things straight, Big Data is not significant to a large number of CIOs. At least not yet. While critical data and information are being generated at a mind-boggling pace in enterprises, they still cannot be dubbed Big Data — massive volumes that are growing beyond the performance capacity of traditional database management systems and data warehouses. Most CIOs already have the storage infrastructure to meet their requirements.
There are certain verticals for which Big Data is critical. BFSI and retail are among the sectors that not only generate humongous amounts of data but also need to run this data through analytics for continuous growth and performance. It is no longer a subject of debate that Big Data enables enterprises to become more productive. It helps corporates become smarter by exploiting data in a hitherto unavailable manner, thereby presenting newer growth opportunities.
At the same time, however, technology leaders in most sectors need to evaluate carefully whether their businesses actually demand Big Data solutions. They should cautiously assess vendors pushing such solutions. For one, these solutions are expensive. Second, they affect traditional approaches to Enterprise Architecture (EA). According to Gartner, Big Data disrupts traditional information architectures — shifting the focus from data warehousing (data storage and compression) toward data pooling (flows, links and information shareability).
While Big Data (from both a management and an implementation perspective) can be a challenge, it is also an opportunity for technology leaders. Not only can they harness data to their company's advantage, they also have a golden chance to come up with new business models. For instance, as part of its turnaround exercise, one enterprise garnered such insights from analysing its information that today it provides consultancy services to other firms. We have tried to present both aspects — the opportunities and the challenges — associated with Big Data in this issue's cover story. We hope it will clear the air around the subject.
EDITOR'S PICK
24 | Big Data: CTO Forum investigates if Big Data is a challenge or an opportunity for enterprises
MAY 12 CONTENTS
THECTOFORUM.COM
COVER STORY
24 | Big Data: CTO Forum tries to investigate if big data is a challenge or an opportunity for enterprises

COLUMNS
4 | I BELIEVE: RELATIONSHIPS ARE KEY TO SUCCESS IN ANY ORGANISATION BY VIJAY SETHI
48 | VIEW POINT: THURSDAY THOUGHTS ON FRIDAY. Mostly because I forgot to hit send before I headed out last night. BY STEVE DUPLESSIE
Please Recycle This Magazine And Remove Inserts Before Recycling
COPYRIGHT, All rights reserved: Reproduction in whole or in part without written permission from Nine Dot Nine Interactive Pvt Ltd. is prohibited. Printed and published by Kanak Ghosh for Nine Dot Nine Interactive Pvt Ltd, C/o Kakson House, Plot Printed at Tara Art Printers Pvt Ltd. A-46-47, Sector-5, NOIDA (U.P.) 201301
FEATURES
18 | BEST OF BREED: It's Time to Rethink Outsourcing
A revolution is taking shape in business services
COVER DESIGN PETERSON
www.thectoforum.com
Managing Director: Dr Pramath Raj Sinha | Printer & Publisher: Kanak Ghosh | Publishing Director: Anuradha Das Mathur
EDITORIAL: Executive Editor: Yashvendra Singh | Consulting Editor: Sanjay Gupta | Assistant Editor: Varun Aggarwal | Assistant Editor: Ankush Sohoni
DESIGN: Sr Creative Director: Jayan K Narayanan | Art Director: Anil VK | Associate Art Director: Atul Deshmukh | Sr Visualiser: Manav Sachdev | Visualisers: Prasanth TR, Anil T & Shokeen Saifi | Sr Designers: Sristi Maurya & NV Baiju | Designers: Suneesh K, Shigil N, Charu Dwivedi, Raj Verma, Prince Antony, Peterson, Prameesh Purushothaman C & Midhun Mohan | Chief Photographer: Subhojit Paul | Sr Photographer: Jiten Gandhi
ADVISORY PANEL: Anil Garg, CIO, Dabur | David Briskman, CIO, Ranbaxy | Mani Mulki, VP-IT, ICICI Bank | Manish Gupta, Director, Enterprise Solutions AMEA, PepsiCo India Foods & Beverages, PepsiCo | Raghu Raman, CEO, National Intelligence Grid, Govt. of India | S R Mallela, Former CTO, AFL | Santrupt Misra, Director, Aditya Birla Group | Sushil Prakash, Sr Consultant, NMEICT (National Mission on Education through Information and Communication Technology) | Vijay Sethi, CIO, Hero MotoCorp | Vishal Salvi, CISO, HDFC Bank | Deepak B Phatak, Subharao M Nilekani Chair Professor and Head, KReSIT, IIT - Bombay
A QUESTION OF ANSWERS
14 | Mobile is Driving Cloud Testing: David Taylor, President of Asia Pacific & Japan, Micro Focus, talks about the mainframe business in India and how COBOL is no longer restricted to the platform

REGULARS
01 | EDITORIAL
06 | LETTERS
08 | ENTERPRISE ROUND-UP

42 | NEXT HORIZONS: IS SIRI SMARTER THAN GOOGLE? As the CIO, you need to determine how your company can use its UIEA in various functions
38 | TECH FOR GOVERNANCE: SEVEN STEPS TO AVOID FAILURE PANIC. These steps ensure you are not only prepared, but also past the panic stage when bad things happen

ADVERTISERS' INDEX
IBM (IFC), Schneider (5), Cisco (7), Datacard (11), SAS (13), CTRL S (23), Dell (IBC), Microsoft (BC)
SALES & MARKETING: National Manager – Events and Special Projects: Mahantesh Godi (+91 98804 36623) | National Sales Manager: Vinodh K (+91 97407 14817) | Assistant General Manager Sales (South): Ashish Kumar Singh (+91 97407 61921) | Senior Sales Manager (North): Aveek Bhose (+91 98998 86986) | Product Manager – CSO Forum and Strategic Sales: Seema Menon (+91 97403 94000) | Brand Manager: Gagandeep S Kaiser (+91 99999 01218)
PRODUCTION & LOGISTICS: Sr. GM Operations: Shivshankar M Hiremath | Manager Operations: Rakesh Upadhyay | Asst. Manager – Logistics: Vijay Menon | Executive Logistics: Nilesh Shiravadekar | Production Executive: Vilas Mhatre | Logistics: MP Singh & Mohd. Ansari
OFFICE ADDRESS: Published, Printed and Owned by Nine Dot Nine Interactive Pvt Ltd. Published and printed on their behalf by Kanak Ghosh. Published at Bungalow No. 725, Sector - 1, Shirvane, Nerul, Navi Mumbai - 400706. Printed at Tara Art Printers Pvt Ltd, A-46-47, Sector-5, NOIDA (U.P.) 201301. Editor: Anuradha Das Mathur
For any customer queries and assistance please contact help@9dot9.in
This index is provided as an additional service. The publisher does not assume any liability for errors or omissions.
THE AUTHOR HAS more than 22 years of experience in industry and consulting environments. Prior to Hero MotoCorp, he worked with Ranbaxy and TCS.
PHOTO BY SUBHOJIT PAUL
I BELIEVE
BY VIJAY SETHI VP & CIO, Hero MotoCorp
Relationships are the Key to Success in Any Organisation
Integrity, honesty and knowledge of your subject are a must for becoming a true leader
ONCE you are part of the management team, you have to learn to disassociate yourself from the thinking that you are the head of a particular function. Every part of the business becomes integral, and the function you currently handle is not your only responsibility.
CURRENT CHALLENGE: Leaders should be able to communicate their thoughts in the right way.
As a leader, one has to understand that the most important thing is to gain the trust and faith of one's peers and the other members of the company. This can be achieved only by delivering. If I have told the management that I will get a particular task done, I have to ensure that I do it with conviction and that the results are visible. Once this is done, the management understands that whatever I say and do, I mean business. This is the only way one can gain trust and be a true leader in the organisation.
Moreover, as a CIO, one has to understand that once he or she becomes part of the management team, he or she should never talk technology with the management. Each and every discussion should be business-oriented, with IT only as an enabler. IT should help in terms of ease of use or in terms of cutting down costs. It seems quite difficult at the beginning, but slowly and steadily one gets used to it. Most of the technology talk happens within the team, not with the management.
Another important aspect for a CIO is to invite feedback from employees. This is a good way to know what is bothering them. Once you get the feedback, you have to work on it so that the problem gets solved. By doing this, I ensure that employees get a convenient environment to work in, and this also enhances respect and trust.
To be a good leader, some of the key things required are integrity, honesty, knowledge of your subject and the way you communicate. Another important thing is relationship building within the organisation, within your team as well as with other teams. I believe professional relationships are even more important because they are key to becoming a respected leader.
LETTERS

COVER STORY: CONSUMERISATION OF IT (reader responses to the 07 April 2012 issue)

1/ Do you view consumerisation of IT as a risk?
"No, I do not consider it a risk. Rather it is an opportunity. However, a careful approach is required."
— Subhash Mittal, SrED (MS&IT) & Group CTO, IFFCO

2/ Is consumerisation of IT driving unrealistic expectations from IT departments?
"Yes, it is driving unrealistic expectations. There is a huge investment both in managing and providing mobile devices to employees. This scenario is putting a lot of burden on the IT department."
— Saradindu Paul, Associate Vice President, Corp IT, Electrosteel Group

CTOForum LinkedIn Group
Join over 900 CIOs on the CTO Forum LinkedIn group for the latest news and hot enterprise technology discussions. Share your thoughts, participate in discussions and win prizes for the most valuable contribution. You can join The CTOForum group at:
www.linkedin.com/groups?mostPopular=&gid=2580450

Some of the hot discussions on the group are:

Work is so much more fun than before!

Yes, you can do it!

Open Source vs Proprietary Software: Practically, how many of you feel open-source/free software is a better solution than any proprietary software?

Are CTOs more interested in satisfying the CFO & Board rather than the consumer?
If the CTO is aligned to the CFO and the Board in that order, the CTO will also have to be good at resume writing, as he will not last too long. But then the question arises: is the CFO aligned to the consumer? If he is not, then even he may be in hot water sooner or later.

On the open source discussion: "I would rather mention that your call should depend on the criticality of the application to serve the enterprise business requirement, as open-source applications can have security breaches and lack of support in a worst-case scenario."
— Vishal Anand Gupta, Interim CIO & Joint Project Director HiMS at The Calcutta Medical Research Institute
CTOF Connect
With the increasing number of DDoS attacks, Brad Rinklin, CMO, Akamai, talks about some of the ways in which they can be mitigated:
http://www.thectoforum.com/content/ddos-attacks-haveincreased-20-timeslast-3-years-0

OPINION
ARUN GUPTA, Group CIO, Shoppers' Stop
BUSINESS READY TO INVEST: When I saw the news about business wanting to spend on IT, it was like an oasis. The progression of CEOs showed the IT investment trend line going north. To read the full story go to:
http://www.thectoforum.com/content/businessready-invest-0 ARUN GUPTA, CIO, CIPLA

WRITE TO US: The CTOForum values your feedback. We want to know what you think about the magazine and how to make it a better read for you. Our endeavour continues to be a work in progress, and your comments will go a long way in making it the preferred publication of the CIO community.
Send your comments, compliments, complaints or questions about the magazine to editor@thectoforum.com
Introducing the industry's only integrated data centre physical infrastructure
Flexible, agile, easy-to-deploy, integrated Schneider Electric data centres: the only integrated infrastructure that moves with your business
Schneider Electric™ has redefined today's data centres. We've uniquely bridged facilities and IT by providing the industry's only end-to-end supporting architecture and 'all-in-one' management software needed to ensure the highest availability and energy efficiency.
Why Schneider Electric data centres?
> Reduced design and deployment time from months to just weeks
> Out-of-the-box self discovery and configuration via integrated software
> Applied expertise, industry relationships, thought leadership, and life cycle services from a single company
Business-wise, Future-driven.TM
We call this holistic system 'data centre physical infrastructure'. Not only has it revolutionised data centres, it has also transformed data centre managers' day-to-day responsibilities. It's faster and easier to deploy, and it's just as simple to manage via software that gives you integrated visibility from rack to row to room to building. And, most important, it's agile enough to adapt to your business needs — today and tomorrow.
APC™ by Schneider Electric is the pioneer of modular data centre infrastructure and innovative cooling technology. Its products and solutions, including InfraStruxure™, are an integral part of the Schneider Electric IT portfolio.
Tap the business value of your data centre. Download White Paper #107 and learn how ‘Data Centre Infrastructure Management Software Improves Planning and Cuts Operational Costs’. Visit www.SEreply.com Key Code 17930p Call 1800-4254-272 ©2012 Schneider Electric. All Rights Reserved. All trademarks are owned by Schneider Electric Industries SAS or its affiliated companies. Schneider Electric India Pvt. Ltd., 9th Floor, DLF Building No. 10, Tower C, DLF Cyber City, Phase II, Gurgaon - 122 002, Haryana, India. Tel. 1800-4254-877/272 • 998-4729_IN-GB
ENTERPRISE ROUND-UP
FEATURE INSIDE: UTM market crossed $1bn mark in 2011 Pg 10
IMAGE BY PHOTOS.COM
Top 5 Indian ITeS Firms Grew 23.8% in 2011
Cognizant displaced Wipro to become the third-largest Indian IT services provider
THE top five India-based IT services providers grew 23.8 percent in 2011, compared with 7.7 percent growth for the global IT services market as a whole, according to Gartner Incorporated. Cognizant displaced Wipro to become the third-largest Indian IT services provider, and it experienced the highest growth rate of 33.3 percent amongst the top five providers in 2011. "2011 signaled a change in the mind-set of European buyers, particularly Continental Europe, for offshore services," said Arup Roy, principal research analyst at Gartner. "Indian providers have historically
found it more difficult to gain market share in the Western Europe IT services market than in the US market, but, as a group, the top five increased market share in the region from 2.3 percent in 2010 to 2.8 percent in 2011.” On average, the Tier 1 providers dramatically outperformed the growth rates of Tier 2 and Tier 3 providers, despite the consolidation and acquisitions among some of the smaller firms. There were some standouts, however, with Genpact up 27 percent and Syntel up 21 percent. Smaller providers were charged with creating a more compelling marketing message that went beyond labor arbitrage.
DATA BRIEFING
43% GROWTH IN INDIA’S DIGITAL PROJECTOR MARKET IN 2011
THEY SAID IT: VILASRAO DESHMUKH
IMAGE BY PHOTOS.COM
India has announced a $5 million contribution to a joint initiative with a US agency to identify and invest in innovations by civil society, and bring them to scale. This was announced after an interaction with US Secretary of State Hillary Clinton.
IBM Acquires Big Data Software Provider
Financial terms were not disclosed
IBM announced that it would acquire Vivisimo, a provider of federated discovery and navigation software for big data. Vivisimo is a privately held company based in Pittsburgh, Pennsylvania. Financial terms were not disclosed. Vivisimo's software automates the discovery of structured and unstructured data and helps to navigate it with a single view. "Navigating big data to uncover the right information is a key challenge for all industries," said Arvind Krishna, general manager, Information Management, IBM Software Group. "The winners in the era of big data will be those who unlock their information assets to drive innovation, make real-time decisions, and gain actionable insights to be more competitive." The combination of IBM's big data analytics capabilities with Vivisimo software will automate the flow of data into business analytics applications, helping in understanding consumer behaviour, managing customer churn, network performance, detecting fraud in real-time, and performing data-intensive marketing campaigns, the company said in a release. "Businesses need a faster and more accurate way to discover and navigate big data for analysis," said John Kealey, chief executive officer, Vivisimo.
QUICK BYTE ON MOBILITY
“I am happy to announce the DST’s contribution of $5 million through the Technology Development Board to USAID Millennium Alliance. We expect to leverage creativity of US to maximise quality and Indian strength in optimising resources.” — Vilasrao Deshmukh, Science and Technology Minister
Over 41 million mobile subscribers in India have submitted requests till the end of March 2012 to port their number to a different network through mobile number portability (MNP) with 4.76 million requests made in a single month, official data showed —Source: TRAI
IMAGING BY SHOKEEN SAIFI
UTM market crossed $1bn mark in 2011
North America was the largest market for unified threat management products
WORLDWIDE unified threat management (UTM) revenue reached $1.2 billion in 2011, a 19.6 percent increase from 2010 revenue of $972 million, according to Gartner Incorporated. "The UTM market is in the midst of a transition of its customers from older technologies, such as stateful firewall inspection, to the latest next-generation firewall technology supporting application control capabilities," said Lawrence Pingree, research director at Gartner. "Many UTM vendors delivered new products during the last several years with some
vendors performing product refresh efforts to their UTM portfolios while others worked to expand their small or midsize business (SMB) offerings and wireless UTM offerings,” he said. Fortinet remained the No. 1 vendor in revenue in 2011, accounting for 19.6 percent of the market. SECUI showed the strongest growth, increasing its revenue 59 percent year over year. SECUI predominately focuses its sales in the Asia/Pacific market, and it has goals of expanding globally with initiatives emerging to target the US market. North America was the largest market for UTM products, totaling $431
million in 2011, an increase of 15.5 percent from 2010 revenue of $373 million. North America was driven by strong payment card industry demand requiring both firewalls and intrusion prevention technologies in midsize businesses with higher credit card transaction volumes and greater interest to protect against network-based attacks. Western Europe was the second-largest market for UTM products with revenue reaching $310 million in 2011, up 16.7 percent from 2010 revenue of $266 million. The growth was driven by the same macro consolidation and regulatory and threat-based product acquisition themes as in North America, but with an emphasis on EU data protection mandates. Consolidation into virtualised environments, as well as IT cost-savings measures related to regional recessionary forces, also drove stronger growth in the combined UTM appliances in 2011. Eastern Europe UTM revenue totaled $113 million in 2011, a 28.1 percent increase from 2010 revenue of $88 million. Eastern Europe's broad array of SMBs provided relatively strong growth for UTM products. One driver for this growth is the adoption of UTM for many managed security services in Europe and the use of UTM to support growing managed security services providers' business models. Middle East and Africa UTM revenue reached $28 million in 2011, up 13.5 percent from 2010 revenue of $25 million. Growth in this region was driven by a mixture of transformational government projects, high-speed Internet connectivity adoption and a heavier reliance of regional SMBs using Internet access to conduct business. In Japan, significant UTM market impacts were seen in 2011 after the earthquake and tsunami events. UTM revenue in Japan totaled $32 million in 2011, a 4.7 percent increase from 2010 revenue of $30 million. Government debt increases and tax reform concerns further dampened spending in the region as organisations continued to assess the Japanese economic situation. However, the region is still growing and this slow growth is believed to be a temporary condition. Asia/Pacific had the strongest growth of any region in 2011, growing 31.8 percent in 2011. UTM revenue reached $204 million in 2011, up from $155 million in 2010.
GLOBAL TRACKER | Mobility: Telecom players such as Bharti Airtel, Idea Cellular and Vodafone owe BSNL Rs 550.88 crore as interconnect usage charges. SOURCE: TRAI
Another Innovation from DATACARD®: Reliable & Robust Printer
DATACARD® SP30 PLUS
Quality and convenience in a low cost Card Solution
High productivity: The SP30 Plus card printer prints up to 750 monochrome cards per hour.
High quality, low-cost supplies: Exclusive SP30 Plus ribbons and cleaning supplies are available in convenient, low cost bulk packaging.
High quality images: Advanced Imaging Technology dramatically improves quality and sharpness of photos, graphics and logos.
Easy to operate: The printer driver provides message prompts, recovery instructions, color image preview and online user help.
Easy to integrate: Enhanced product features that leverage our exclusive Intelligent Supplies Technology™, including automatic ribbon identification, ribbon saver and ribbon low warning.
Small footprint: Compact printer easily fits into any workspace.
Specifications: Physical Dimensions L 16.5 in x W 7.8 in x H 9.0 in (41.25 cm x 19.5 cm x 22.5 cm). Weight: 8 to 9 lbs (3.6 kg to 4.1 kg) depending on options
Attractive Price Offer: The Datacard® SP30 Plus card printer and exclusive supplies offer low cost-per-card, high yield printing for on-demand issuance.
Full color printing: Up to 160 cards per hour; ymcKT short panel color print ribbon with inline topcoat, 650 images (sold in cases of 16).
Contact us at Datacard India Private Limited, B-302, Flexcel Park, S. V. Road, Next to 24 Karat Multiplex, Jogeshwari (West), Mumbai 400 102, India. Tel: +91 22 6177 0300 Email: India_Sales@datacard.com
IMAGE BY PHOTOS.COM
India is 5th most cyber-crime affected country
The report is compiled by McAfee and SDA
INDIA stands fifth in the worldwide ranking of countries affected by cyber-crime, and much of the vulnerability in India is the result of widespread computer illiteracy and pirated machines, according to a new report, 'Cyber-security: The Vexed Question of Global Rules', released by McAfee and the Security and Defence Agenda (SDA). The report, which dwells on India's IT security scenario, identifies that the main challenge for India is to train and equip its law enforcement agencies and judiciary, particularly outside big cities. According to the report, the premium on Internet privacy in India is low, and data control therefore tends to be neglected. This is another reason for the success of phishing and other scams.
"India is acutely aware that cyber-crime is bad for its reputation as a country where foreign investors can do business and has been investing heavily in cyber-security. But it still lacks a single operator to control the Internet, telecoms and power sectors, and even if CERT-In is the official coordinating authority, a multiplicity of other agencies are still involved. As more and more financial service companies set up their back office operations in India, the authorities know the problem of controlling cyber-crime has to be addressed urgently."
On the plus side, India has developed valuable experience in dealing with compliance regulations from around the world, with the IT Amendments Act of 2008 that established strong data protection. Other key findings from the SDA report include the following:
• Need to address expected shortage of cyber-workforce: More than half (56 percent) of the respondents highlight a coming skills shortage.
• Low level of preparedness for cyber-attacks: China, Russia, Italy and Poland fall behind Finland, Israel, Sweden, Denmark, Estonia, France, Germany, Netherlands, UK, Spain and the United States.
• Cyber-security exercises are not receiving strong participation from industry: Although almost everyone believes that exercises are important, only 20 percent of those surveyed in the private sector have taken part in such exercises.
FACT TICKER
Fake Instagram app infects Android devices
Infected Androids are now effectively part of a botnet, under the control of malicious hackers
IT SECURITY and data protection firm Sophos is warning Android users about malware being distributed disguised as the popular photo-sharing app Instagram. Cybercriminals have created fake versions of the Instagram Android app, designed to earn money from unsuspecting users. Cybercriminals
have played on the popularity of the Instagram app — which has millions of users around the world, and was recently acquired by Facebook for a staggering $1 billion. If Android owners download the app from unapproved sources, they run the risk of infecting their smartphone. Once installed, the app will
send SMS messages to premium rate services earning its creators revenue. Sophos products detect the malware, which has been distributed on a Russian website purporting to be an official Instagram site, as Andr/Boxer-F. Graham Cluley, Senior Technology Consultant at Sophos said “Android malware is becoming a bigger problem. Last week, we saw a bogus edition of the Angry Birds Space game and it’s quite likely that whoever is behind this malware are also using the names and images of other popular smartphone apps as bait.”
SOCIAL MEDIA
Being connected to social networks through mobile devices has become quite a necessity for the majority of mobile users. But this connection is fast turning out to be lucrative for cyber-criminals as a way to spread malware, according to the Q1 2012 Community Powered Threat Report released by AVG. The report highlights that the hot target is particularly devices running the Android operating system. Michael McKinnon, Security Advisor at AVG (AU/NZ), told CSO, "AVG detected a big increase in the use of social networks such as Facebook and Twitter to target Android users. Cyber criminals are finding it very convenient to distribute their malware straight to a mobile device via these networks. The growth of the Android platform has been phenomenal, which has not gone unnoticed by cyber criminals, who have discovered it to be a target for their malware. In 2011, Google had to remove over 100 malicious apps from the official Android market, Google Play." Social networks have become a key source of information and communication. Twitter now has more than 140 million active users, and Facebook has over 845 million users, with some analysts expecting that figure to reach 1 billion this year. The result: targeting those who use Facebook is like targeting around 14 percent of the world's population, or approximately 43 percent of internet users.
A QUESTION OF ANSWERS
Move towards mobile: Organisations are increasingly looking at porting their apps to the mobile platform
DAVID TAYLOR | MICRO FOCUS
Mobile is Driving Cloud Testing
In a conversation with CTO Forum, David Taylor, President of Asia Pacific & Japan, Micro Focus, talks about the mainframe business in India and how COBOL is no longer restricted to the platform.

Do you see new COBOL deployments in India, apart from upgradation and migration?
Yes, but a lot of the time it gets done by our partners. For example, TCS' banking application, which is written on Micro Focus COBOL, is earning new customers not only in India but globally, and customers are running on the COBOL platform without even realising it. We are seeing a lot of work where we are upgrading customers from all versions of COBOL to new versions. So it isn't dying. It is mainly expanding, but through other applications. And at the end of the day those customers in most cases never realise they are running COBOL, nor do they really need to know.

How big is cloud testing for you and how is the market shaping up, especially in India?
We have a solution called CloudBurst where we can basically test extremely large-volume virtual servers globally. That service is doing very well for us. The growth in that is coming from the growing use of mobile devices in the enterprise. Many of our customers are trying to offer their users mobile access to enterprise applications and, therefore, they need to be able to test those applications effectively. In India we are seeing a lot of development work being done on mobile applications. I think companies will allow mobile access to applications where it makes sense.
However, I am not sure that they will run them on the cloud. At this stage, I see there is a lot of talk around cloud adoption in India but I don't know how many companies are really running mission-critical applications in the cloud. I am not familiar with any. What we are saying is that we work with other applications to be able to provide services to the end-users. We always have discussions with our customers, and we put a lot of focus on their ability to modernise their applications to be able to do what they want, and we are seeing a lot of value there. Cloud really, at the end of the day, is just another platform, another deployment. So, from a Micro Focus perspective, if I migrate from the Mainframe to Windows, I am not
seeing a lot of demand to do that for critical applications, whereas we do see mobile applications coming in from an enterprise perspective. We see a lot of banks already providing their customers with mobile applications that can be used to view account details, transfer money and so on. They develop a mobile application which ultimately connects to customer data that lies on the bank's mainframe. The customer doesn't have to worry whether it is delivered through a mainframe or a cloud that is using a mainframe.

Considering most of your customers in India are in the banking and financial sector, what specific applications do you see moving onto mobile and to the cloud?
I do believe in mobile applications being deployed over the cloud. If I think of myself as a customer, I need to access my account on the go. So what customers are now doing is tailoring those applications so that they perform much better on mobile devices. And we think a lot of that is taking place in many organisations. One of the important things happening now involves insurance data applications. Cloud has a lot of advantages, as it allows individuals to work on the go and not be stuck at their desktops. In terms of security, there has been a lot of work happening, but a lot still needs to be done so that people can get confident about migrating to the cloud. In terms of financial institutions running their applications in the cloud, I am not sure much of this is happening in India, but we think that things will change in the coming years as people eventually shift to the cloud environment.

According to you, which model of cloud is best for banking customers and
how do they get the optimum benefit from that?
It is quite difficult to answer that because different banks have different approaches. When we have to provide any kind of installation, we have to first visit the location and understand the structure of the organisation. There can be a lot of complexities which we need to address, and we have to customise our technologies for different banks. I think private cloud is getting recognition in different banking organisations as it is easy to deploy, and there are many companies which are offering competitive services to financial organisations. Cost is the primary thing that comes to mind when you think of migrating to the cloud. Today, storage of data has become critical for financial organisations, and I think people will have no option but to migrate to the cloud if they would like to keep their recurring costs on the lower side. Eventually, we will witness different models of cloud being adopted. The type of adoption will vary from customer to customer according to the needs and the assets of the company.

How do you see the economics of cloud working for smaller banks in India that cannot afford to set up their own data centres and host their own applications?
Yes, the economics of it absolutely makes sense. However, at the end of the day, from a broad perspective, that is going to be a risk. It then boils down to the risk appetite of the bank and whether or not they want to migrate to the cloud environment. I do see risk associated with migrating to the cloud. But the other thing is, if you just think about migration, wouldn't the next logical extension be to sell those services outside of India? But then you have another issue in terms of storing the data outside of
a country. Now, for example, we did a migration at Standard Chartered Bank in Singapore recently; we did it because they were processing accounts from Hong Kong, and the Chinese government introduced legislation that said the financial data of a Chinese citizen cannot be stored outside of China. So the company had a choice: buy another mainframe, or run the application on a different platform, which obviously is what we did. So you are going to get a lot of government regulatory compliance implications coming in when you start looking outside of borders. I don't know what India's rules are in that area right now; you may have to store data outside of India for Indian companies, and that is one of the tricky things.

THINGS I BELIEVE IN
Cloud allows an individual to work on the go
The mainframe business is not big in India as there are not that many accounts
Micro Focus could run any mainframe application on different platforms, including Linux and Windows
"We see a lot of banks already providing their customers with mobile applications that can be used to view account details, transfer money and so on. They develop a mobile application which connects to customer data that lies on the bank's mainframe"

What percentage of your COBOL install base in India, or even globally, is on non-mainframe environments?
It is huge. It is massive. We would probably have to do some research. Our COBOL does not run on the IBM mainframe. We have developed a COBOL compiler which is basically perfectly compatible with the IBM mainframe COBOL platform. So what that enabled us to do is build COBOL compilers for different platforms. For example, we could run any mainframe application on different platforms, including Linux and Windows. Windows and Linux are the most common platforms today, apart from some of the proprietary UNIX environments. I use the word proprietary because they are also on a different base from the hardware vendor. So, we have never run on the IBM mainframe itself, but we can. We can do development on these platforms because our COBOL compiler is very compatible, and that has enabled us to offer different types of development. And when you send the applications back to the mainframe, they just run perfectly; you don't have to do anything.
Growth for our business comes from two main areas. The first is our migration business. Migration means that we take an application from the mainframe and run it in the distributed world. So we expand the footprint of COBOL as a result of that. The second area is the people who have developed their applications in and around the COBOL platform and run them on a variety of different platforms. TCS' banking application is an example, Oracle's PeopleSoft is written on our platform, and literally there are thousands and thousands of applications that have been written in Micro Focus COBOL and sold around the world, and that is what expands the footprint of COBOL.
What are your plans for India when it comes to the migration business as well as cloud testing?
With the business, we are seeing a lot of growth. We are seeing more companies developing on our platforms to take products to market. The mainframe business, as you know, is not big in India as there are not that many accounts. But what we are doing, instead of talking to those customers about unnecessary migrations, is trying to help them lower their cost of ownership of mainframe applications. We are doing a lot of work with the system integrators around their mainframe capabilities. India dominates that market globally, as you know. And so what we have is an analytical technology which enables a person sitting at their business to do their job perfectly. When there is a person sitting at a desk answering multiple support and maintenance issues from multiple customers, he needs technological solutions to make his task easier, and we are offering just that, which makes us acceptable to the customers. We have technologies that can very quickly analyse the code that any person is working on, and that makes the job of that person easier. The testing business in India is very strong, providing services globally. We interact directly with the end-users, and our business model is always to work with our partners. At the end of the day we are a software company, and our partners are the ones who provide the services to the customers, to the end-users. We have a very strong technical team here to provide high-end technical support to assist our partners in delivering their services. So while we are not going to double our business next year, I am confident we will have a very good year in terms of growth, and our business will keep on increasing as we have a very strong focus on India as a country. We also do specific promotional activities which ensure that we add new customers and also retain the existing ones.
BEST OF BREED
FEATURES INSIDE: Managing Your Private Cloud Pg 21
ILLUSTRATION BY SHIGIL N
It's Time to Rethink Outsourcing
DATA BRIEFING: $2bn WILL BE THE WORTH OF THE GLOBAL CARRIER WIFI EQUIPMENT MARKET BY 2016
A revolution is taking shape in business services that disregards the traditional outsourcing paradigms BY CLIFF JUSTICE
It's amazing to me that, according to industry research, nine out of 10 organisations are using some form of shared services and outsourcing to deliver their information technology and business processes. When I talk to our clients who have been outsourcing for
some time, I’m amazed at how advanced and intertwined many of these relationships have become. Despite this success, the market is shifting and it’s clear that the outsourcing and business services industry is about to go through a change like never before.
Labour arbitrage, India and the new customer
In the past 20 years as a consultant and business operator, I've been fortunate to work with some of the world's leading companies and participate in their transformation journeys. Starting with the ITO mega-deals of the 1990s to the expansion of offshoring and BPO in the 2000s, companies have consistently sought ways to use sourcing strategies to reduce the cost of back office services. The improvements in productivity and technology have been significant, but there has been no better way to achieve cost reduction than through global labour arbitrage, i.e. offshoring. The easiest way for most companies to get there has been to outsource it to a third party, often based in India.
However, the past five years have introduced a number of significant changes which have begun to transform the traditional underpinnings of business service delivery in the Western world. For example, cloud technology and social media are ubiquitous. They are changing not just how we connect with family or store music, but how we do business, collect data and deliver technology. These are more than new technologies; they represent a change in behaviour in how the customer and business agree to interact, share information and conduct trade.
Perhaps as significant a change is who the customer is. Our traditional low-cost arbitrage markets have been India, China and parts of Asia. However, the success of outsourcing and global manufacturing has spawned a rapidly growing middle class in these regions, which is both increasing the cost of labour and broadening the potential customer base for many companies. As this success causes the benefits of labour arbitrage to disappear, how do
companies effectively serve new markets and where is the next level of back office savings? I think most companies would opt for a few hundred million new consumers over 20 percent additional savings on IT and F&A. The competitive advantage will go to the companies that can both connect with new customers and do business effectively in these new markets with lower costs, better data and market insights, and operational flexibility.
When you think about SG&A functions for your business — HR, IT, purchasing or accounting, etc. — do you view these support services as a tactical necessity or a strategic weapon? Are they a cost center or a competitive advantage? While it's not an either/or question (or answer), that's the degree of contrast we see in the strategies and objectives of new business services models.
Traditional outsourcing model The traditional models of delivering support services are still alive and well. Outsourcing and shared services have continually improved as service delivery vehicles, and
have consistently become more valuable to the organisation. Outsourcing has evolved considerably in the past decade. When part of a comprehensive strategy, it has proven to be a transformation catalyst that has helped companies implement new processes and technologies, reduce costs, access a global talent pool and change their overall business through the use of partners. Outsourcing is going through change, however. The average deal size today is smaller, performance expectations are higher and many providers are delivering more complex services with more industry knowledge and business orientation. While cost is still key, success in a mature relationship is more often determined by its contribution to the business as a result of the partnership than by cost savings alone. Shared services concepts have also steadily evolved from the days of accounts payable and data entry. In many companies, shared services have moved up the value chain to provide a wider range of more complex services and they are establishing an internal brand. Multi-functional captive delivery
centers are an example of the success of the global shared services concept. Many companies have monetized the asset and sold off their captives to become commercial service providers with specialties in an industry and function.
EGE: A more holistic approach
There is, however, a revolution taking shape in business services that disregards the traditional shared services and outsourcing paradigms and centers the design of business services on the needs and priorities of the enterprise as a whole. This design has given companies the ability to enter new markets more easily, integrate new acquisitions faster, rapidly adopt new processes, and access and analyse a wider range of data which, most importantly, serves their customer better.
While many of these new business services organisations have different objectives, most share common characteristics. They are centrally managed and involve an integrated portfolio of capabilities, usually made up of a combination of external service providers (outsourcers), internal specialist capabilities, and internal shared services. Most of these organisations are enabled by common technologies and governed by common processes. These companies have the characteristic of a delivery concept we call the extended global enterprise, or "EGE." Companies are using this and similar concepts to transform not only the back office but also the overall business.
The EGE is not a specific model or set delivery structure, but rather a paradigm for delivering business services that is based on the concepts of end-to-end processes, internal and outsourced service providers, high value services, and strong central governance. Instead of relying on resources within the four walls, the EGE leverages a global pool of internal and outsourced resources to deliver a service that is nimble, aligned to the business and connected with customers, employees and suppliers. Think of how just-in-time (JIT) changed manufacturing, or how cloud is changing the way we buy and consume technology. The EGE concept is similar in its differences from the traditional models.
EGE has four key attributes:
First, the overall goal of the EGE is to increase value to the business and help achieve competitive advantage. Beyond meeting service levels and cost benchmarks, it also has the ability to be flexible, quickly adopt new processes, assemble talent, deliver new technology, and centrally collect and analyse relevant data.
Second, the design of the EGE is influenced more by the customer need and business strategy than by the traditional organisational structure.
Third, the EGE consists of a balanced portfolio of services and processes that spans across functions and deploys the most appropriate capabilities, whether they are internal, globally sourced or technology enabled.
Fourth, the EGE is governed by an empowered organisation that has a charter to support the business and manage the delivery model. It is measured by the value it creates.
Alignment: There's that word again
That's what I mean by thinking differently. It's about alignment of global capabilities, processes and governance to deliver services in a way that doesn't just support the business but also advances it. It's about using the knowledge and capabilities of
service providers to make your organisation more successful. Does this approach always work? Not for everyone. But our recent surveys tell us that some companies that have designed services on similar concepts have reported cost savings of 15 percent to 20 percent above and beyond the traditional shared services and outsourcing or decentralised models. They’re reducing facilities costs by 10 percent to 15 percent, for example. And, thanks to improved processes, they’re getting eight percent to 13 percent savings on indirect goods and services.
Overcoming inertia
While the business impact is positive to the bottom line and to the objectives of the business, adopting this paradigm represents a disruptive change that is almost always met with resistance. For example, because end-to-end processes are blind to organisational structure, this model impacts the traditional concept of functional empowerment. That's a hard change management discussion. Other companies face inertia — ironically — because of their initial success. Some functional managers may have used outsourcing and shared services to reduce support costs by up to 30 percent and, justifiably, they looked like rock stars. So now, when you ask these people to revisit their model and take it to the next level, there is naturally some resistance.
So pick the spots where you want to improve, based on finances, culture and appetite for risk. You might focus on developing a commercial orientation, while going slower on the relocation of some services. Or you might put your efforts into governance and improving customer value.
So what can you do at your organisation to get into action? Here are some steps you can take: First, objectively review where you are and consider where you want to go with your support services. From there, take a look at current outsourcing relationships and shared services capabilities. Next, consider service delivery. What kind of strategy do you have? Are you aligning services with corporate objectives? Consider geography. Are all of your
company's current markets being served? Are you prepared for expansion into future markets? Look at your technology. Consider whether your platform supports the integration of processes and providers. Don't forget culture. Will your culture support a governance structure that provides comprehensive oversight of services — including performance metrics?
$12bn ESTIMATED VALUE OF SAAS ADOPTION GLOBALLY
By answering these kinds of questions, you can identify gaps and opportunities and then develop a roadmap for improvement. The Extended Global Enterprise is a great compass to guide the design of your back office and prepare for the future and the market today is ripe for it.
—The views and opinions are those of the author and do not necessarily represent the views and opinions of KPMG LLP.
—Cliff Justice, based in the firm's Houston office, leads KPMG's U.S. Shared Services and Outsourcing Advisory practice, one of the largest and most comprehensive sourcing and business services advisory practices in the world. Cliff is an established and recognized leader in shared services and outsourcing with more than 20 years of experience in industry operations, outsourcing, offshoring and enterprise services transformation.
Managing Your Private Cloud
The true value of developing a private cloud environment lies in an ongoing communication between IT and business units BY STEVE PELLETIER
Assembling the servers, storage, networks and backup that are the underlying technology for your private cloud environment is the easy part. The infrastructure for a private cloud can be defined in the familiar language of specs and reqs. A server is what it is, even a virtual one.
Getting the full potential out of the cloud
ILLUSTRATION BY SHIGIL N
To realise the full potential of your private cloud environment, however, you will also need to address several key concepts that are not so easily defined. These involve squishy terms like "management," "automation" and "orchestration" that need to be defined in terms of where you are today and where you want to be tomorrow ... however far off tomorrow may be. By itself, your private cloud infrastructure is a proverbial tabula rasa, i.e., before your private cloud environment
can live up to its potential, you have to teach it to think. Management, automation and orchestration provide the tools you will need to write on the blank slate that is your private cloud's brain. Most IT departments have some of the tools and some of the skills they need to create an intelligent cloud environment; at least enough to get started. One of the benefits of developing a private cloud, in fact, is that it provides the opportunity, however challenging, to define exactly what you are trying to do with IT in your organisation as you go forward. The key to management is monitoring. You can't manage what you can't see. Most IT departments are able to monitor their hardware, storage and networks to some level. Once you start spinning up and tearing down virtual servers, sharing resources and providing self provisioning, however, you will need to be able to see what's happening at a virtualisation level and an application level, as well. The Information Technology Infrastructure Library (ITIL) provides best practices that can guide development of the comprehensive IT service management (ITSM) strategy that you will need to develop to ensure that your cloud is serving the needs of your business users. ITSM components you will need to define, if you haven't already, are a service catalogue; Incident, Problem and Change procedures; and a configuration management database (CMDB).
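As a rough illustration of the ITSM building blocks just listed, the sketch below is not drawn from the article or from ITIL itself; every class name, field and limit is invented for the example. It simply shows how a service catalogue entry and a CMDB record might be modelled so that a self-service request can be checked against agreed parameters and then recorded.

```python
# Illustrative sketch only: field names, limits and the approval rule are
# hypothetical and are not taken from ITIL or from the article.
from dataclasses import dataclass, field

@dataclass
class CatalogueItem:                 # one entry in the service catalogue
    name: str
    max_vcpus: int
    max_ram_gb: int
    requires_change_approval: bool

@dataclass
class ConfigurationItem:             # one record in the CMDB
    ci_id: str
    service: str
    owner: str
    attributes: dict = field(default_factory=dict)

def within_limits(item: CatalogueItem, vcpus: int, ram_gb: int) -> bool:
    """Check a self-service request against the catalogue limits."""
    return vcpus <= item.max_vcpus and ram_gb <= item.max_ram_gb

catalogue = {
    "standard-web-vm": CatalogueItem("standard-web-vm", max_vcpus=4,
                                     max_ram_gb=16,
                                     requires_change_approval=False),
}
cmdb = []                            # the configuration management database

request = {"item": "standard-web-vm", "vcpus": 2, "ram_gb": 8, "owner": "marketing"}
item = catalogue[request["item"]]

if within_limits(item, request["vcpus"], request["ram_gb"]):
    # Record the newly provisioned virtual server as a configuration item.
    cmdb.append(ConfigurationItem(ci_id="CI-0001", service=item.name,
                                  owner=request["owner"],
                                  attributes={"vcpus": request["vcpus"],
                                              "ram_gb": request["ram_gb"]}))
    print("Provisioned and recorded in CMDB:", cmdb[-1])
else:
    print("Request exceeds catalogue limits; raise a Change request instead.")
```

Real ITSM suites obviously do far more than this, but keeping catalogue limits and CMDB records in one consistent structure is what later lets automation and orchestration apply those parameters without human intervention.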
Orchestration defines the workflow to link a variety of automated tasks and IT resources to complete a process like provisioning an entire new service
The discipline of developing an ITSM strategy will help you determine the parameters you want your cloud to live by. And, once they have been defined, an effective way to apply them consistently is through automation and orchestration.

Automation and orchestration
Automation streamlines repetitive data center processes such as server operating system and application deployment and other administrative procedures. There are many automation tools available, and most IT departments have automation procedures for many of these tasks already defined.
Orchestration is a more complex undertaking. It defines the workflow to link a variety of automated tasks and IT resources to complete a process like provisioning an entire new service. Orchestration is often combined with a portal to designated resource users. Ideally, the combination of automation and orchestration allows IT deployment to match the speed of business, providing rapid time to market for new initiatives. Orchestration needs to be able to interact with both your monitoring tools and your configuration management database to ensure that the approval procedures and technical parameters you have established are met. It goes without saying that you need to establish the parameters and approval procedures before you can orchestrate them. As "smart" as technology has become, it still cannot think for itself.

Self-provisioning — compelling and scary
Many of what are considered the attributes of cloud technology are both compelling and scary. Self-service provisioning, for example, is compelling because it enables IT and designated users to spin up capacity as needed. It is the cloud feature most directly linked to providing the Holy Grail of business agility. Self-service provisioning is scary for the same reason.
The runaway proliferation of virtual machines that virtualisation unleashed in many data centers is a good example of what can happen when IT has easy access to resources. The thought of users getting easy access to resources sends shivers through many in IT. To ensure that self-service does not get translated as "instant gratification" by your users, you need to have things like chargeback and showback so you can demonstrate what it costs to serve yourself these corporate assets. Without some way to allocate cost, self-service provisioning is like the Wild West all over again. The good news is that there are tools available today that will do just about everything from orchestrating workflow rules involving complex interactions to calculating the total cost per month of spinning up a single server.

$30bn: Total market for semiconductors in mobile handsets globally in 2011

The IT MBA
Knowing which tools you need, and being able to tell them what you want them to do, however, requires a set of skills and a knowledge of business processes that can leave the IT professional feeling like a computer science major who suddenly got beamed into the MBA programme. This is where the cloud hype about aligning business and technology gets real. Unfortunately, developing and managing a private cloud environment is typically an IT initiative. To realise its potential, a private cloud needs to be more business- and process-driven than most IT departments can implement on their own. IT is obviously critical to building a cloud but, if it's an IT-only project, your private cloud is not going to be as good as it could be. The true value of developing a private cloud environment, in fact, may well be that it requires the kind of open, ongoing communication between IT and business units that your organisation will need to succeed in the future.
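To make the interplay between orchestration, pre-defined approval parameters and chargeback a little more concrete, here is a minimal, hypothetical sketch. The policy limits, cost rates and workflow steps are invented for illustration and are not taken from any particular orchestration product.

```python
# Hypothetical self-service provisioning workflow: orchestration enforces the
# parameters and approvals defined up front, then hands off to automation,
# and a chargeback figure is recorded for the requester.
POLICY = {"max_vcpus": 8, "max_gb_ram": 32, "approval_needed_above_monthly": 500.0}
RATES = {"vcpu": 25.0, "gb_ram": 5.0, "gb_disk": 0.10}   # invented monthly rates

def monthly_cost(vcpus: int, gb_ram: int, gb_disk: int) -> float:
    return vcpus * RATES["vcpu"] + gb_ram * RATES["gb_ram"] + gb_disk * RATES["gb_disk"]

def provision(requester: str, vcpus: int, gb_ram: int, gb_disk: int, approved: bool = False):
    # Step 1: enforce the technical parameters IT defined beforehand.
    if vcpus > POLICY["max_vcpus"] or gb_ram > POLICY["max_gb_ram"]:
        raise ValueError("request exceeds the technical parameters set by IT")

    # Step 2: enforce the approval procedure for costly requests (showback).
    cost = monthly_cost(vcpus, gb_ram, gb_disk)
    if cost > POLICY["approval_needed_above_monthly"] and not approved:
        return {"status": "pending approval", "estimated_monthly_cost": cost}

    # Step 3: hand off to automation (placeholders for the real tasks:
    # deploy the VM, install the OS, register the asset in the CMDB).
    return {"status": "provisioned", "requester": requester, "chargeback_per_month": cost}

print(provision("finance-team", vcpus=4, gb_ram=16, gb_disk=200))
```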
—Steve Pelletier is a solution architect for Logicalis, an international provider of integrated information and communications technology solutions and services. He is responsible for designing both public and private cloud strategies for Logicalis' clients.
—The article has been reprinted with permission from CIO Update. To see more articles regarding IT management best practices, please visit www.cioupdate.com
COVER STORY
BIG DATA: CHALLENGE OR OPPORTUNITY?
As data within enterprises grows at an exponential pace, the need to store, manage and analyse it is becoming a challenge. CTO Forum investigates whether this big data problem is just a challenge or an opportunity for the enterprise BY VARUN AGGARWAL
BIG DATA CHALLENGES
At a time when storage was expensive and people rarely sent email attachments larger than 1 MB, Gmail, launched in 2004, not only let users send and receive large attachments but also encouraged them to keep their inbox without ever deleting an email. The 1 GB of storage Gmail offered was considered quite a lot in those days, as most users never came close to the limit. Today, however, things have changed. Gmail has increased its quota to 10 GB, and it has fostered a culture in which people keep every document they ever create, even documents that hold no value. Cheap storage has added to this cultural change: today, a $600 hard drive can store all of the world's music. From social networks to pictures, music and videos on smartphones, people are creating content at an unimaginable pace.
What all this leads to is a deluge of data that organisations have realised cannot be managed using traditional databases. Every day, or rather every minute, trillions of bytes of information are collected by enterprises around the world about their customers, suppliers and operations, and millions of networked sensors embedded in the physical world, in devices such as mobile phones and automobiles, are sensing, creating and communicating data.
Take, for example, the New York Stock Exchange. NYSE Euronext, the company managing the exchange, says this one exchange alone generates 2 TB of data every day. Loading that volume of data into a typical database would take hours, perhaps even more than a day, making it impossible for such an organisation to handle data using traditional methods. Big data technologies thus become the only way forward.
McKinsey defines big data as datasets whose size is beyond the ability of typical database software tools to capture, store, manage and analyse. The definition is intentionally subjective and incorporates a moving definition of how big a dataset needs to be in order to be considered big data; that is, McKinsey does not define big data in terms of being larger than a certain number of terabytes. As technology advances over time, the size of datasets that qualify as big data will also increase. The definition can also vary by sector, depending on what kinds of software tools are commonly available and what size of datasets is common in a particular industry. With those caveats, big data in many sectors today ranges from a few dozen terabytes to multiple petabytes.
According to Gartner, going into 2012 there is an increase in the amount of information available to organisations, but it is a challenge for them to understand it. Given the shifts in control of systems that IT organisations are facing, the loss of the ability to guarantee the consistency and effectiveness of data will leave many struggling to prevent their organisations from missing key opportunities or from using questionable information for strategic decisions. Gartner says it sees no regulatory help on the near horizon, leaving each business to decide for itself how to handle the introduction of big data.
This voluminous data is growing everywhere, including in some of the largest Indian enterprises. Take the Reliance group. Reliance has been growing consistently, and its data management practices are in place and benchmarked to international standards. The company is adding new customers every minute and is looking to grow faster than ever. That growth has also translated into massive growth in data. "Our data is also growing enormously due to subscriber growth and consumption of our services. We are applying innovative methods, system and process changes to manage and make best use of the resulting data," said Alpana Doshi, Group CIO, Reliance. "We have voluminous usable data which is 70 percent structured and 30 percent unstructured. This data has shown a growth of 20 percent in transactional data and 32 percent in analytical data," she added.
This kind of growth is not restricted to companies the size of Reliance. Take Xentrix, one of the most promising studios, founded in Bangalore in June 2010. In less than two years of operations, the company's data grew to about 24 TB. "Data was stored in different media and it became a time-consuming task just to access that data," says the company's CEO, Jai Natarajan. The challenges for Xentrix did not stop there. Multiple creative personnel were unable to access the data concurrently; this limited collaborative work, which in turn affected productivity and turnaround times. Finally, there was the issue of scalability: technology changes were rapid, and the company did not deem it necessary to replace technology every few years.
For both Reliance and Xentrix, the only solution was to adopt big data technologies. The deployments, however, went beyond just solving the problems at hand and helped the companies create deep value from big data.
There are, however, further challenges in the deployment of big data. As Sid Deshpande, Senior Research Analyst, Gartner, puts it, "Staffing could be one of the biggest challenges for big data deployments. For a large-scale deployment, enterprises would need to invest in training the staff on big data technologies. Moreover, cultural mindsets need to change to allow use of open source technologies, as many big data tools are open source." However, he cautions, "cost-reduction should not be the reason to select open source products, since the cost savings can be offset by more expensive professionals and the need to change business processes."
BIG DATA OPPORTUNITIES
Big data can turn into opportunity if handled well. Reliance took a highly comprehensive approach towards big data. Reliance uses a data management system which monitors data growth as desired. The data is segregated into three buckets:
Customer-centric data – required for customer services
Business data – required for analytics, trend analysis, business forecasting and so on
Legal data – managed for regulatory requirements
The company has configured its data storage system based upon these three criteria. This helps it analyse and manage data performance requirements and TCO. "Our transition data goes into high-speed enterprise-class storage, which has a very fast response time. The analytical kind of data is segregated into low-cost enterprise storage capable of high analytical performance. Multiple parallel processing capable storage/application, with compression enabled, is used for legal data. This does not warrant high-end storage," Doshi explained.
Reliance uses the compression features of its RDBMS and storage applications to control data growth. The company has been actively involved in big data solutions due to the sheer size of its databases. "We have adopted a multi parallel processing DB to store CDRs and unstructured data and perform analytics on it. We have implemented storage virtualisation. We have also implemented unified storage (SAN, NAS) in a single box," Doshi said.
Doshi believes that the early adoption of big data analytics will lead to faster rollout of many customer-facing services, and that by applying analytics one can really change the game in the market. "Analytics plays a major role in making the business enlightened on the power of information that can be carved out of the big data mart. This has to be driven from corporate, and all relevant stakeholders (business, finance, IT etc) should work hand-in-hand to reap the results."
The big data deployment at Xentrix studio was similarly driven by the management, and it could not have been a success without a close understanding between business and IT. Xentrix decided to go with EMC's big data storage solution, the Isilon X-Series. "By deploying Isilon X-Series, we unified disparate silos of 3D content into a single, highly scalable, easy-to-use, shared pool of storage, dramatically accelerating the content lifecycle while significantly reducing costs," said Surendra Kamath, IT Manager, Xentrix. This especially made sense since the company bagged a major project eight months back and the storage requirements for that project have been on the rise. "By bringing post-production in-house and using Isilon IQ scale-out NAS, we were able to reduce our costs dramatically while only taking one-tenth the time previously required to get the post-production work done — saving us countless hours," said Natarajan.
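The policy-driven tiering Reliance describes, with each class of data mapped to a different grade of storage, can be pictured with a small sketch. The example below is hypothetical and greatly simplified: the class names, tier names and figures are invented for illustration and are not taken from Reliance's systems.

```python
# Hypothetical policy mapping data classes to storage tiers, in the spirit of
# segregating customer-centric, business and legal data described above.
from dataclasses import dataclass

TIER_FOR_CLASS = {
    "customer": "high-speed enterprise storage",   # fast response for customer services
    "business": "low-cost analytical storage",     # optimised for analytics throughput
    "legal":    "compressed archival storage",     # cheap, with compression enabled
}

@dataclass
class Dataset:
    name: str
    data_class: str   # "customer", "business" or "legal"
    size_tb: float

def place(dataset: Dataset) -> str:
    tier = TIER_FOR_CLASS.get(dataset.data_class)
    if tier is None:
        raise ValueError(f"unknown data class: {dataset.data_class}")
    return f"{dataset.name} ({dataset.size_tb} TB) -> {tier}"

for ds in [Dataset("billing records", "customer", 12.0),
           Dataset("usage trends", "business", 40.0),
           Dataset("regulatory archive", "legal", 200.0)]:
    print(place(ds))
```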
CONCLUSION
In the last six or seven years, advancements in Hadoop and other big data technologies have considerably improved analytics on extremely large data sets. In India, however, big data deployments are mainly restricted to MNCs, as the concept is still at a very nascent stage in the country. "India has a great opportunity to adopt big data technologies as the country already has a fair amount of developers and professionals who can work on big data, and this is only on the rise," Deshpande said.
According to Deshpande, "Even though there are some companies in India who have already deployed big data technologies, the deployments are mostly for specific workloads and projects instead of organisation-wide adoption. Large big data projects usually require enterprise re-architecture. What this means is that enterprises need to revisit how data in their company is getting created and how it is being stored. Storage tiering is required to get optimal levels of performance before adopting big data analytics. However, enterprises also need to ensure that all this makes financial sense for them. They need to make sure what they have decided to store and manage is really useful for the organisation and not some stale data."
That said, it is not just large companies that can benefit from big data deployments. As in the case of Xentrix studio, companies running HPC environments have no option but to go for big data. "The success rate of a big data deployment does not depend on the scale of the deployment; rather, it is more to do with the alignment of IT and business. The value of a big data deployment can be measured in terms of the accuracy of analysis of data. It can also be measured in terms of business efficiency improvement and the insights that it offers," Deshpande concluded.

LESSONS FROM ABROAD
O'Reilly needed a powerful, scalable data warehouse to collect and analyse millions of IT job posts to glean insights into key trends in the technology industry. The company uses this data to come up with topics for its books and conferences, to offer consulting and custom research, and for its own internal business development. O'Reilly's goal was to tap into one of the largest sources of technology trend data available: Simply Hired. The site serves as an aggregator of millions of available jobs in the technology industry, painting a picture of the technologies garnering the most attention, investment and focus at any given moment. O'Reilly needed to consolidate hundreds of millions of job postings from Simply Hired into a data warehouse for trend analysis. Greenplum Database is designed to manage very large amounts of data very quickly, and its massively parallel architecture is well suited to loading and querying petabytes of data, making it an ideal solution for O'Reilly's needs.
O'Reilly had been collecting IT job data from Simply Hired for several years to see which categories of jobs were increasing in number, which were diminishing and which were emerging for the first time. O'Reilly had data on about 30 million jobs in an open source database and manually analysed query results. But the database continued to grow, and the analysis process was slow and cumbersome, taking an average of 10 hours to run a single query. O'Reilly needed a faster, more robust data warehouse solution to capture and analyse the growing job data.
After implementing Greenplum, O'Reilly immediately began to increase the speed of queries — from an average of one every 10 hours to one every six minutes, a 100x gain in performance. With Greenplum Database, O'Reilly was able to effectively analyse data even as the database grew to over 170 million job posts. What's more, Greenplum's massively parallel structure allowed O'Reilly to effortlessly run more complex queries — for example, searching for all jobs that contain the word "Java" but filtering out those related to coffee shops or travel agencies. With this massive increase in data processing power, O'Reilly has exponentially increased the depth of its knowledge of technology trends.
O'Reilly originally implemented Greenplum to quickly and efficiently query data from 30 million job posts. Today, the system handles 170 million job posts and continues to grow. With Greenplum, O'Reilly is able to analyse what has become a massive data store with alacrity, successfully performing large-scale analysis against huge data sets. Now, O'Reilly believes the sky is the limit; Greenplum Database's massively parallel architecture is on track to analyse 500 million job postings or more, allowing O'Reilly to draw richer conclusions about the technology industry and determine up-to-the-minute topics for books and conferences. Greenplum has helped O'Reilly retain its reputation as a trend spotter — and to better serve its role of educating the IT community on the latest developments in technology.
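The kind of query the case study mentions, finding job posts that mention "Java" while excluding coffee-shop and travel listings, is conceptually simple. The sketch below shows the idea on a tiny in-memory sample; the data and keyword lists are invented, and this is plain Python rather than Greenplum's SQL interface.

```python
# Toy illustration of filtering job posts for "Java" the language while
# excluding coffee-shop and travel-agency listings. A real deployment would
# express this as SQL over hundreds of millions of rows.
posts = [
    {"id": 1, "title": "Senior Java Developer", "body": "Build JVM services"},
    {"id": 2, "title": "Barista", "body": "Serve java and espresso at our coffee shop"},
    {"id": 3, "title": "Travel consultant", "body": "Plan trips to Java, Indonesia"},
    {"id": 4, "title": "Java/Android engineer", "body": "Mobile development role"},
]

EXCLUDE = ("coffee", "espresso", "barista", "travel", "indonesia")

def is_java_tech_job(post: dict) -> bool:
    text = (post["title"] + " " + post["body"]).lower()
    return "java" in text and not any(word in text for word in EXCLUDE)

print([p["id"] for p in posts if is_java_tech_job(p)])   # -> [1, 4]
```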
BEST PRACTICES FOR INCORPORATING BIG DATA INTO ENTERPRISE INFORMATION ARCHITECTURE BY MITESH AGARWAL
Researchers estimate that enterprise data grows nearly 60 percent per year (90 percent of it unstructured) and that the average amount of data stored per company is 200 terabytes. This growth is triggered by the increasing channels of data in today's world. Examples include, but are not limited to, user-generated content through social media, web and software logs, and cameras and intelligent sensors embedded in physical devices that can sense, create and communicate data. Companies are realising that now is the time to put this data to work. However, several obstacles limit their ability to turn this massive amount of unstructured data, often termed Big Data, into profit. The most prominent obstacle is a lack of understanding of how to add Big Data capabilities to the overall information architecture in order to build an all-pervasive Big Data architecture. Companies have to understand that planning a Big Data architecture is not just about understanding what is different; it is also about understanding how to integrate what is new with what is already in place. Here are a few general guidelines for building a successful big data architecture foundation:
#01 ALIGN BIG DATA INITIATIVE WITH SPECIFIC BUSINESS GOALS
One of the key characteristics of big data is value: value extracted from low-density, high-volume data. As we sort through the mountains of low-value-density Big Data and look for the gold nugget, we should not lose sight of why we are doing this. Follow an enterprise architecture approach. Focus on the value it provides to the business. How does it support and enable the business objectives? Properly align and prioritise the big data implementation with the business drivers. This is critical to ensure sponsorship and funding for the long run.

#02 ENSURE CENTRALISED IT STRATEGY FOR STANDARDS AND GOVERNANCE
Some recent analyst surveys indicate that one of the biggest obstacles for Big Data is a skills shortage; a 60 percent skills shortfall is predicted by 2018. Addressing this challenge through IT governance, to increase the skill level, to select and enforce standards, and to reduce the overall risks and training cost, would be the ideal situation. Another strategy to consider is to implement appliances that will give enterprises a jumpstart and quicken the pace while they develop their in-house expertise.

#03 USE A CENTER OF EXCELLENCE TO MINIMISE TRAINING AND RISK
Establishing a Center of Excellence (CoE) to share solution knowledge, plan artifacts and ensure oversight for projects can help minimise mistakes. Whether Big Data is a new or an expanding investment, the soft and hard costs can be shared across the enterprise. Another benefit of the CoE approach is that it will continue to drive the Big Data and overall information architecture maturity in a more structured and systematic way.

#04 EMBED BIG DATA INSIGHTS INTO BUSINESS APPLICATIONS
To make big data operationally feasible, the business should invest in solutions that enable the embedding of insights generated from big data into front-end user applications. For example, a bank has to put the insights it has gained from Big Data into its call centre app, its web banking app and potentially its core banking app as well. Solutions like Oracle Real-Time Decisions (RTD) and Oracle Complex Event Processing (CEP) can provide a readymade layer that abstracts the insights, making life simpler for the business by putting the rules into a single engine from where they can be consumed by the various front-end user applications in a uniform fashion.

#05 PROVIDE HIGH PERFORMANCE AND SCALABLE ANALYTICAL SANDBOXES
When a problem occurs, humans can solve it through a process of exclusion. And often, when we do not know what we are looking for, enterprise IT needs to support this "lack of direction" or "lack of clear requirement". We need to be able to provide a flexible environment for our end users to explore and find answers. Such sandbox environments also need to be highly optimised for performance and must be properly governed.

#06 RESHAPE IT OPERATING MODEL
The new requirements from Big Data will bring certain changes to the enterprise IT operating model. Provisioning of new environments will need to be more timely and user-driven. Resource management also needs to be more dynamic in nature. A well-planned cloud strategy plays an integral role in supporting those changing requirements.
#07 CORRELATE BIG DATA WITH STRUCTURED DATA
Enterprises should establish new capabilities constantly and leverage their prior investments in infrastructure, platform, Business Intelligence and Data Warehouse, rather than throwing them away. Investing in integration capabilities can enable knowledge workers to correlate different types and sources of data, to make associations and to make meaningful discoveries.

#08 ENSURE SECURITY FOR BIG DATA
Enterprises worldwide have ranked IT security as one of their top priorities as increasingly sophisticated attacks, new data protection regulations and, most recently, insider fraud and data breaches threaten to disrupt and irreparably damage their businesses. Security in the Big Data world is even more crucial. Enterprises should invest in integrated security solutions to ensure that the big data insights generated from the integration of old-world structured data and new-world unstructured data are not compromised in any way.

#09 LOOK AT BIG DATA AS AN EXTENSION OF EXISTING INFORMATION ARCHITECTURE
Enterprises should look at big data investment as an extension to their existing information architecture, not a replacement and not a standalone island. Invest in solutions that integrate open source and enterprise technologies. Enterprises should look for economic advantages in their architecture: through simplifying and standardising IT operations, through reduced IT investment, through building their infrastructure once for enterprise scale and resilience, through unified development paradigms, and through sharing metadata at the enterprise level for integration and analytics.

#10 INVEST IN SHAREABLE ARCHITECTURES TO EXTEND ANALYTICAL CAPABILITIES
A lack of enterprise-ready statistical analytical tools prevents the kind of analysis necessary to spot trends. Enterprises should invest in shareable architectures to extend their analytical capabilities in visualisation, semantics, spatial and statistical analysis.

—The author is CTO, Oracle India
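Tip #04 above describes pushing big data insights into a single decision layer that every front-end channel consumes. The fragment below is a minimal, hypothetical sketch of that pattern; the rule names, customer fields and scoring logic are invented for illustration and are not based on Oracle RTD or CEP APIs.

```python
# Hypothetical shared decision layer: rules live in one engine, and the call
# centre, web banking and mobile apps all ask it the same question.
RULES = [
    # (rule name, predicate over a customer profile, action to recommend)
    ("dormant_saver", lambda c: c["balance"] > 10000 and c["logins_90d"] == 0,
     "offer fixed-deposit upgrade"),
    ("credit_risk",   lambda c: c["missed_payments"] >= 2,
     "route to collections specialist"),
]

def next_best_action(customer: dict, channel: str) -> dict:
    """Evaluate the shared rules once; every channel gets the same answer."""
    for name, predicate, action in RULES:
        if predicate(customer):
            return {"channel": channel, "rule": name, "action": action}
    return {"channel": channel, "rule": None, "action": "no action"}

profile = {"balance": 25000, "logins_90d": 0, "missed_payments": 0}
for channel in ("call-centre", "web-banking", "mobile"):
    print(next_best_action(profile, channel))
```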
VENDOR ROUNDUP
BY PAM BAKER
The latest hot trend is Big Data. But you knew that already, having heard the hype, oh, just about everywhere. So you're ready now to take a look at the vendor lineup and hopefully pick a winner. Let's break the contenders down then, take a look at who they are and what they've got, and get on with it.
As is always the case with any tech, define what you have on hand and what you want to achieve before you go vendor shopping. "Now if you reckon you have some decent quality data already in place, and a business outcome defined, this allows you to at least now focus on specific vendors who are specialised in this space," advised Zane Moi, co-founder and CEO of TreeCrunch, a Big Data analytics and conversation platform for brands.
Why is this important? Because Big Data, like the Cloud before it, is ill-defined, making it incredibly easy to buy a "pig in a poke" and still find yourself pitifully short on bacon. "Big Data is a very broad term, which encompasses a broad series of, sometimes, mutually exclusive activities," cautioned Moi. "So the first two things the CIO has to think about is, what are the business outcomes that they want to achieve and where are they currently, from a master data management perspective, to be able to actually achieve this business vision."
BIG DATA MARKETSCAPE TAKES SHAPE
According to Wikibon, the Big Data market stands at a little over $5 billion and will grow beyond $50 billion in the next five years. Wikibon.org is a professional, open source community that works at solving technology and business problems through sharing free advisory knowledge. It was founded by former IDC, Meta and Gartner analysts. A Wikibon report by Jeff Kelly with David Vellante and David Floyer says that only five percent of the current vendors are pure-plays; the usual big players in enterprise tech, such as IBM, Intel, HP, Oracle, Teradata, Fujitsu and others, account for 95 percent of the overall revenue.
There are two things to immediately think about here. The mega vendors are going to be hard pressed to be as agile and innovative as the smaller pure-plays. But the pure-plays are facing plenty of pressure, too. "It is incumbent upon Hadoop-focused pure-plays, however, to establish a profitable business model for commercialising the open source framework and related software, which to date has been elusive," said Kelly in the Wikibon report.
This creates the usual conundrum for tech buyers: go with the more innovative smaller players, who may crash and burn financially at some point, or go with the financially secure mega vendors, whose products may prove ineffectual and inefficient in the end. Because this market is booming, many new pure-plays will also enter the field and many mega vendors will be on the prowl to acquire the best of the lot. Thus turmoil will be the order of the day for a while yet, until the market matures and settles down.
PURE-PLAY VENDOR BREAKDOWN
"When looking to solve a Big Data challenge in your company, it is first essential to clearly define the challenge faced, whether it be Big Data storage, Big Data analysis, or Big Data speed," said Kearnan. "This analysis will put you in a good position to then map the identified challenge to the growing number of Big Data vendors. Companies should diligently analyse the growing Big Data solution provider landscape to better understand players and their approaches to Big Data."
That said, here are the general defining points of the four leading players: Vertica, Teradata Aster, Greenplum and Splunk. The first three are "upending the traditional enterprise data warehouse market with massively parallel, columnar analytic databases that deliver lightning-fast data loading and near real-time query capabilities," according to Kelly. The three were pioneering Big Data products long before Hadoop emerged as the mainstream Big Data play. Interestingly, all three offered Hadoop connectivity anyway. Vertica, Aster Data and Greenplum were the three leading independent next-generation data warehouse vendors up until recently.
Vertica defines itself as "high-speed, self-tuning column-oriented SQL database management software for data warehousing and business intelligence." Among its more remarkable features, most profoundly evident in its latest Vertica Analytic Platform iteration, are "new elasticity capabilities to easily expand or contract deployments and a slew of new in-database analytic functions," said Kelly. Vertica 5.1 includes a revamped client framework for easier integration with third-party BI, ETL, analytics, and other ecosystem solutions such as Hadoop distributions based on Apache Commons Release 1.0.0, including Hortonworks Data Platform v1.
Teradata Aster "has pioneered a novel SQL-MapReduce framework, combining the best of both data processing approaches," said Kelly. Specifically, the Teradata Aster MapReduce Platform combines MapReduce, the language of Big Data analytics, with SQL, the language of business analytics. This makes it easier to analyse large volumes of complex data such as web logs, machine data and text, while also making it easier to perform richer analysis than is possible with traditional SQL technology alone. Aster Database 5.0 offers greater development flexibility and includes prebuilt MapReduce modules for behavioural click-stream interpretation, marketing attribution, decision tree analysis and other analysis.
Greenplum is now owned by EMC. "Greenplum's unique collaborative analytic platform, Chorus, provides a social environment for Data Scientists to experiment with Big Data," said Kelly.
Indeed, Chorus resembles Facebook in that it is designed specifically for social collaboration, except between data scientists rather than ordinary folks, but it differs from Facebook and industry competitors by pairing that collaboration with Big Data analytics and processes.
Splunk, ranking third in Wikibon's ranking of these top four vendors, "specialises in processing and analysing log file data to allow administrators to monitor IT infrastructure performance and identify bottlenecks and other disruptions to service," said Kelly. The company recently went public and immediately soared to a $3 billion market value. Erik Swan, the company's CEO and co-founder, describes Splunk as "Google for machine data" in an interview with Dell. That sounds deceptively simple until you realise that Splunk identifies and tracks machine data ranging from under-the-hood, often automated machine workings to patterns in human use of these machines.
From these four vendors alone one can easily see that the approaches to Big Data are as varied as the vendors doing the approaching. It is vital, then, to understand exactly where you want to end up and exactly how each vendor would get you to that destination before you decide to commit to the journey. And that's just on the data side. Networking vendors are jumping on this bandwagon in a big way because, without the underlying network architecture to support all of this data analysis and warehousing, what do you have? A big pile of Big Data that causes a big problem and little else.
—A prolific and versatile writer, Pam Baker writes about technology, science, business, and finance for leading print and online publications including ReadWriteWeb, CIO and CIO.com
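The SQL-plus-MapReduce combination attributed to Teradata Aster above is easier to grasp with a toy example. The sketch below is purely illustrative Python, not Aster's actual SQL-MapReduce interface: a map step emits key-value pairs from raw web-log lines and a reduce step aggregates them, which is the kind of result a SQL GROUP BY would otherwise produce.

```python
# Toy map/reduce over raw web-log lines: count requests per page.
from collections import defaultdict

logs = [
    "10.0.0.1 GET /home", "10.0.0.2 GET /pricing",
    "10.0.0.3 GET /home", "10.0.0.1 GET /docs/api",
]

def map_step(line: str):
    # emit (page, 1) for each request line
    _ip, _method, page = line.split()
    yield page, 1

def reduce_step(pairs):
    totals = defaultdict(int)
    for page, count in pairs:
        totals[page] += count
    return dict(totals)

pairs = (pair for line in logs for pair in map_step(line))
print(reduce_step(pairs))   # {'/home': 2, '/pricing': 1, '/docs/api': 1}
```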
TECH FOR GOVERNANCE
SECURITY
5 POINTS
Over the last several years, firewalls and antivirus have been losing effectiveness
Malware has become much better at avoiding detection
Anti-virus software still relies on signature-based detection
Firms need to ensure that they have a good firewall management programme
Shift to more effective ways to secure an organisation
FIREWALLS, ANTIVIRUS AREN’T DEAD SHOULD THEY BE? To secure an organisation, reduce the amount of money and resources spent on these technologies. Shift to more effective ways. BY BEAU WOODS
Over the last several years, firewalls and anti-virus have been losing effectiveness. Many in the information security community have recognised this; unfortunately, many of the business and operations people haven't. The threats that these technologies (tools to assist in a solution, not the solution themselves) were designed to solve have changed. That's not to say that they do nothing — they can still be useful — but your organisation needs to know what they're meant to counter and how to use them properly.
I was inspired to finally write this down by a story Wendy Nather contributed to Infosec Island, entitled Why We Still Need Firewalls and AV. While I agree with her general premise, I think the article doesn't get to the real heart of the issue. When firewalls and anti-virus were all we had, and they effectively countered the threats we faced, they tended to be used more as they were designed. But now firewalls and anti-virus don't counter the majority of the threats and aren't used very well.
Firewalls were invented a couple of decades ago to keep Internet-borne threats out. The firewall has its roots in the early 1990s, a time when commerce was prohibited on the Internet and most companies didn't have any presence there. As computer networking grew in popularity, connecting to the Internet was a way to share information across organisations, as well as internally. However, within a decade, Internet attacks were prevalent and organisations needed a way to protect the devices on their network. The firewall was popularised as a way to enforce a hard separation between the outside and the inside. The major advantage of this approach was that it was much cheaper than securing every single device. And at the time it was just as effective, since most devices had no need to communicate over the Internet, and so only a small set of connections were allowed to pass through the firewall.
The Internet landscape has changed drastically since then, and with it the Internet threats. Modern business processes are highly dependent upon and thoroughly integrated with the Internet. Organisations invite masses of Internet devices into their network to deliver web pages and email content, to support mobile devices and for dozens of other reasons. At the same time, devices within the network routinely initiate communications to the Internet and pass data back and forth. Firewalls have gotten better, but they simply can't handle the new ways in which organisations work on the Internet, nor the more sophisticated threats. They still have a use as a tool to protect networks, but more tools are needed.
Similarly, anti-virus was first developed to detect, prevent and remove individual viruses. These software packages were simplistic, identifying malicious programmes and files by looking for indicators or "signatures" that were unique to each virus. This was, again, before the Internet was widely used, and most virus transmission was very slow. The anti-virus industry was easily able to keep up with new viruses and variants of existing viruses. This was a time when the number of specimens was very small and they didn't change very often. Updating the signatures was a task done once a year or so, and in fact when the subscription-based licensing model for anti-virus was initially launched it was widely viewed as somewhat of a betrayal of trust — paying continually for the same software. It was a different time. But today's situation is vastly different from what anti-virus was designed to deal with.
87mn: 4G mobile devices expected to ship in 2012
Because of the proliferation of Internet connectivity, malicious software spreads very quickly. Instead of taking months to spread to thousands of systems, it takes seconds to spread to millions. And the malware itself has become much better at avoiding detection, taking steps to hide its signature. Most viruses today are obfuscated a number of times and checked to make sure no anti-virus software can detect them — all in a matter of seconds, and all before they are sent out to their victims. Viruses are often created and distributed in such a way that anti-virus vendors don't get a copy before the malicious software finds its victim. And when a virus does infect a device, it frequently disables any anti-virus software and hides in such a way that it can't be detected except by sophisticated, usually manual, techniques. Anti-virus software largely still relies on signature-based detection, which is increasingly ineffective and often slows down system performance.
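The weakness of pure signature matching is easier to see with a small sketch. The following is a simplified, hypothetical illustration, not how any particular anti-virus product works: it treats a hash of the file as the "signature", so even a one-byte change to the sample produces a miss.

```python
import hashlib

# Hypothetical signature database: hash of a known sample -> detection name.
KNOWN_BAD = {}

def scan(payload: bytes):
    """Return a detection name if the payload's hash matches a known signature."""
    return KNOWN_BAD.get(hashlib.sha256(payload).hexdigest())

sample = b"...pretend this is a malware sample..."
KNOWN_BAD[hashlib.sha256(sample).hexdigest()] = "Example.Trojan.A"

print(scan(sample))             # matches: 'Example.Trojan.A'
print(scan(sample + b"\x00"))   # one added byte defeats the signature: None
```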
Further decreasing the effectiveness of firewalls and anti-virus in organisations is the way they're used. Because of the massive number of connections in and out of a network, definitions of what is and is not allowed, and of exactly how to allow or deny network connections, have become a sprawling mess. And underneath all this complexity, many organisations don't even do the basics right — properly configuring and managing these tools. Administering anti-virus often means running daily reports of issues and sending a technician onsite to manually investigate what's gone wrong.
Firewalls and anti-virus cost many organisations millions of dollars a year and are failing to do what they should. So why should we keep these things around? In the case of firewalls, they do exactly what they are supposed to do and do it quite well. Firms just need to get smarter about using them. That means limiting firewalls' purpose to what they do well and handing off other duties to other tools. In addition, organisations need to make sure they have a good firewall management programme — even small organisations.
And anti-virus should be re-understood as a broader concept of endpoint protection. This includes securing configurations
and access, restricting software to that which is known to be safe and putting tools in place to detect anomalous behaviour. Anti-virus software packages can help fulfill the last piece — telling systems administrators that a known threat has been detected or that suspicious activity has been happening. But one thing I think we as security professionals should be
advocating is reducing the amount of money and resources spent on these technologies. Instead, shift to more effective ways to secure an organisation: for example, by providing better training to IT staff on using the existing tools and technologies, or by improving security awareness programmes so that viruses (not to mention many other types of attacks)
are less likely to be effective. In the end, this will allow an organisation to maintain the same level of security at a lower cost, or to increase security at the same cost.
—The article is printed with prior permission from www.infosecisland.com. For more features and opinions on information security and risk management, please refer to Infosec Island
Seven Steps to Avoid Failure Panic Seven steps to make sure you are not only strategically prepared, but also past the panic stage when bad things happen BY RAFAL LOS
It is widely acknowledged that a security incident, whether it ends up being a massive breach or not, will happen to virtually every organisation that has an Internet presence. Even those without a presence on the Internet can still expect to have some sort of issue involving security, simply because there is almost no such thing as a disconnected organisation — I mean completely disconnected — any more. Once you get past the notion that it's not a question of if but rather when a breach or incident occurs — another issue starts to hit you. What does a breach mean for your security programme?
To many organisations, a security breach means a catastrophic failure in security. A breach signifies a breakdown in the protective mechanisms installed to keep the organisation secure, and by its very nature represents failure. The problem with this situation is that it represents two failures. The first failure in a situation like this is in planning and organisational strategy; more on this in a moment. The second failure is in the organisation's face to customers, investors and the public. Failures of this nature are difficult to recover from in both perception and reality — but it doesn't have to be this way.
Let's take an example from morning traffic radio. If, before you leave your house for that big meeting in the morning, you hear something like "The inbound I-90 lanes into Chicago are wide open,
it's going to be 30 minutes from the airport into downtown" and you get on the highway only to find yourself in an unexpected backup, you curse the traffic announcer, even though it's rarely their fault that traffic backed up. However, if you hear "The inbound I-90 lanes into Chicago look clear for now, with travel times currently at about 30 minutes, but allow yourself extra time because you can never predict Chicago traffic" and you get into that same traffic jam and miss your meeting, you blame yourself rather than the traffic person... right? The difference is that one of those announcements told you that although things look good now, you can't predict the future. This is an important lesson for information security professionals.
As I mentioned above, the primary failure in a situation where an organisation is breached and the resulting event is seen as a catastrophic failure is in strategy and planning. Failing to account for a breach in your strategic information security and risk management planning is a failure all in itself. It's at the very least arrogant to think you can ever reach some mythical state of absolute security in your organisation, and in many situations it's irresponsible. How many CISOs are out there telling their leadership they need more budget to "secure the company", only to completely leave out the "we will never actually reach a level of complete assurance" that should accompany the request? The tragedy is that if leadership doesn't understand
risk management well enough on the technical end of things to know better, they're telling the board, or even their customers, that your security is Fort Knox. When Fort Knox is breached and the keep is pillaged... then what? At that point you've completely and utterly failed, and there is no coming back from that. While many organisations that don't plan to fail (experience a breach) recover just fine in the market and with their customers, oftentimes the sacrificial lambs wouldn't agree. Many CISOs and security staff lose their jobs as a result of a breach because their organisations are expecting security, and when that security fails, the employee or programme failed and people must be held accountable.
Speaking of accountability, think about how you as a customer feel when someone promises you something and then comes back to tell you that they screwed up, and they're sorry. Your reaction is typically to get angry, point to previous statements about guarantees, say that there was never any mention of 'oops' planning, and demand accountability. Rightfully so. If your vendor told you that they make a best effort to maintain the privacy of your data but have a strategy which includes failure response, you'd at least understand and not expect perfection. Am I right? Let's relate this to what we all do, and specifically what I'm focused on, which is the security of things like cloud environments. Many of my customers are providers of cloud services themselves... whether that's to an internal customer through a shared service model or an external customer in the case of a small-market cloud provider. When they promise security to their customers, it's important for me to help them work through what that actually means.
Here are the 7 steps I work people through to make sure they're not only strategically prepared, but also past the panic stage when bad things happen.
1. There are no absolutes — what you're providing is a level of risk reduction to an agreeable, acceptable level for the task
2. Define tolerances to failure — make sure you've clearly defined your tolerances for everything from downtime to security incidents, so that you understand clearly when a failure happens
3. Strategies embrace failure — a good security and risk strategy embraces failure as one of the realistic outcomes, and plans for it, providing a pre-defined next step in those failure cases
4. Define failure modes — incidents and breaches happen, but they're not all created equal; make sure you have this pre-defined before you find yourself in the middle of a failure mode asking "how do we explain how bad it is?"
5. Define a recovery — when planning for a potential failure scenario, define clearly what your steps are to recover from the various failure modes you've already defined, and execute
6. Test recovery methods — there is nothing worse than having a well-defined plan for recovery that you find out is non-workable in the heat of the moment; test, test, test and validate your strategic recovery methods
7. Validate sentiment — make sure that what you consider acceptable and reasonable, and a sound strategy which includes failure, is acceptable to those that matter to you, namely customers, investors and the public
You've heard the phrase "Failing to plan is planning to fail", right? Let's modify it slightly to say "Failing to plan for failure is planning to fail catastrophically."
$15bn: Expected worth of the IT services market in India by 2014
—The article is printed with prior permission from www.infosecisland.com. For more features and opinions on information security and risk management, please refer to Infosec Island
Assessing Mobile Applications A look at three areas that need to be tested in every mobile application BY TOM ESTON
Having spoken both at the SANS Mobile Device Security Summit and at OWASP AppSec DC recently about testing mobile applications, I've found that, as the old saying goes, "there are many ways to skin a cat." There are also many ways to assess a mobile application. I've seen very detailed testing methodologies, not-so-detailed ones and everything in between. I've also heard other security professionals say that testing a mobile application is just like testing a web application. This is simply a wrong and inaccurate statement. Mobile applications are fairly complex, and assessing just the application layer gives only a small look into the overall security of a mobile application. While the OWASP Mobile Security Project will help define a complete mobile application testing methodology (which is in process), here are three areas that need to be tested in every mobile application.
1. The Mobile File System How’s the application storing data and where is it being stored? You’d be surprised how much information is being stored in files, SQLite databases, system logs and more. If you’re lucky you will sometimes find private keys and hardcoded passwords. As a great example, the mobile Facebook application suffers from a file system vulnerability as I write this. The author likes to call this a “plist hijack attack”. Simply move the plist file to another mobile device and you are logged in as that user. As for tools to use when looking for file system vulnerabilities you should really check out the forensic approach that John Sawyer from InGuardians has developed. It’s my preferred method for seeing how the app writes to the file system and saves lots of time over creating a dd image.
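A simple way to start the file-system checks described above is to walk an extracted copy of the app's data directory and flag files that commonly leak secrets. The sketch below is a hypothetical starting point only; the keyword list and directory path are invented, and a real assessment would follow the forensic approach mentioned above.

```python
# Hypothetical scan of an app's extracted data directory for files and
# SQLite databases that often contain secrets (plists, logs, credentials).
import os

SUSPECT_EXTENSIONS = (".plist", ".sqlite", ".db", ".log", ".json")
SUSPECT_KEYWORDS = (b"password", b"secret", b"api_key", b"token", b"PRIVATE KEY")

def scan_app_dump(root: str):
    findings = []
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            if not name.lower().endswith(SUSPECT_EXTENSIONS):
                continue
            try:
                with open(path, "rb") as fh:
                    data = fh.read(1_000_000)   # first 1 MB is enough for a triage pass
            except OSError:
                continue
            hits = [kw.decode() for kw in SUSPECT_KEYWORDS if kw in data]
            if hits:
                findings.append((path, hits))
    return findings

for path, hits in scan_app_dump("./app-filesystem-dump"):
    print(path, "->", hits)
```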
2. The Application Layer
How is the application communicating over HTTP? How are web services being used and how are they configured? Important things such as authorisation and authentication need to be reviewed, as well as session handling, business logic, input validation and crypto functions. Business logic needs to be reviewed just as you would in a web application assessment, to find flaws in the way critical functions (like shopping cart checkout processes) were developed. Remember never to underestimate the criticality of web services! For reference and context, check out the presentation that Josh Abraham, Kevin Johnson and I gave at Black Hat USA last year.
Something else worth mentioning is that you can't rely on traditional web proxies like Burp Suite to test the application layer on a mobile app. I've encountered applications that are configured to bypass device proxy settings! You need to use a tool like Mallory, which is a fantastic TCP and UDP proxy. Mallory sees all traffic and allows you to manipulate and fuzz it. There are other ways to do this as well, but regardless, you need to have a way to see all the traffic the mobile app may generate.
The application layer is also where you need to look for issues specific to mobile applications, like UDID usage in iOS. The UDID is currently used by many applications for unique device identification. However, the use of the UDID is becoming an increasing concern from a privacy perspective. Not to mention, Apple is cracking down on UDID usage by now denying such applications in the Apple App Store.

3. The Transport Layer
How does the application communicate over TCP? How are custom protocols and third-party APIs used? Does the application use SSL? At OWASP AppSec DC we talked about the LinkedIn mobile application that was vulnerable to "sidejacking", better known as HTTP session hijacking. This is where an attacker can pull out the session cookie in clear text and replay it so the attacker can log in as the user. The popular "Firesheep" tool released in 2010 demonstrated this nicely. The good news is that the release of the LinkedIn app (version 5.0) fixes the sidejacking issue. Unfortunately, using SSL for just the login process and defaulting back to HTTP is an issue many mobile and web applications still have.
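The sidejacking weakness described above usually comes down to an app falling back to plain HTTP after login, so the session cookie travels in clear text. The snippet below is a hypothetical triage check, not a test of any specific app: given a capture of the requests an app made, it flags endpoints reached over HTTP and cookies sent without the Secure flag.

```python
# Hypothetical triage of captured traffic: flag any request that went over
# plain HTTP, and any session cookie that could be replayed from clear text.
from urllib.parse import urlparse

captured = [
    {"url": "https://example-app.com/login", "cookies": {"session": {"secure": True}}},
    {"url": "http://example-app.com/feed",   "cookies": {"session": {"secure": False}}},
]

def review(requests):
    issues = []
    for req in requests:
        if urlparse(req["url"]).scheme != "https":
            issues.append(f"clear-text request: {req['url']}")
        for name, attrs in req.get("cookies", {}).items():
            if not attrs.get("secure"):
                issues.append(f"cookie '{name}' sent without Secure flag on {req['url']}")
    return issues

for issue in review(captured):
    print(issue)
```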
Mobile application testing is something that will evolve as mobile apps get more complex and the business drives more towards mobile solutions. If you're deploying mobile apps for your business, it's more important than ever to have testing done on these three areas at a minimum.
—The article is printed with prior permission from www.infosecisland.com. For more features and opinions on information security and risk management, please refer to Infosec Island
NEXT HORIZONS
FEATURES INSIDE
PowerPlay or Meltdown? Pg 45
Ensuring Private Cloud Security Pg 46
Is Siri Smarter than Google? As the CIO, you need to determine how your company can use its UIEA in the various functions BY DANIEL BURRUS
Over the past thirty years I have been writing, speaking and consulting about technology-driven trends that are coming but are difficult for most people to see. Back in 2000, I wrote about one such trend that would hit about now, and here we are, on the brink of experiencing a technology that will provide new opportunities for IT to add strategic value and competitive advantage: ultra-intelligent electronic agents.
Ever since our first digital search we've all spent increasing amounts of time on the Web looking for the information we need. Since most of us are in a hurry, we've used various search sites and mega portals over the years, from early players like AOL and Excite to today's leaders such as Google and Bing. You know the process: you enter a keyword or phrase to find what you are looking for, and then you manually scan the results (which are usually staggeringly useless in depth) looking for what you really want. The good news is that the Web has provided us with a world of information at our fingertips. The bad news is that the world of information we have access to is getting
much bigger by the day. As a result, we are all spending way too much time looking for the information we really want.
Help is on the way
We are now on the brink of having access to a powerful new tool that will do much of the search and sorting work for us, with far more intelligence and personalisation than we have had in the past. Very soon you will find yourself using, on a daily basis, an emerging new technology called an ultra-intelligent electronic agent (UIEA).
Most of us remember using Microsoft Office 97 or 2000, where you experienced the beginnings of an electronic help agent — the paper clip guy that popped out to give you help when you wanted to perform a task. The problem was, it was not an intelligent agent. It was a basic help agent. If it had any intelligence, the first thing it would have shown us was how to disable it! Yet, with all of its many shortcomings, it did give us a very early and primitive preview of how an intelligent agent might work.
Enter Siri
Actually, the first generation of intelligent agents is here, and her name is Siri. Siri, which Apple calls its intelligent personal assistant, is very different from the Google app on your smart phone, where you ask for directions or a restaurant and it provides search results. While Google search is intelligent and works very well, Siri gives you an actual agent to interact with. Siri has a woman's voice; it has a personality; it can even give you some humour. Essentially, it's an audio avatar. And if we look to the future a little further out, it's obvious that soon we'll be able to see Siri's face (or visual representation) on a smartphone, tablet, computer, or even TV screen. Of course, Siri was just the beginning. In no time at all we saw an Android version of Siri, and as you already know, there will be many others.
So what makes Siri an UIEA versus the Google app many of you use on your smart phone? Siri and her competitors are linked to a super-computer in the cloud that can tap into all of the world's databases and news feeds. It has access to increasing amounts of information coming from everywhere. This is about machines talking to machines and sensors all communicating through the internet. In addition, it's connected to our personal computing devices with access, granted by you, to your calendar, contacts, and more. All the data goes to a super-computer that feeds into our ultra-intelligent agent, which can then give us the actionable knowledge that's pertinent to us.

CIOs take heed
Most CIOs, IT departments, or even the general public don't realise the impact an ultra-intelligent agent will have, much less how it will transform companies the world over. Think about it. If you have an UIEA that can give you exactly the information you want, do you need to take the time to personally go to a website to get it? Did you do a traditional search? Did you issue the request for information, analyse the information, or even physically place the order for a product or service? The answer to all of these questions is "no." The agent did the search, gathered the information, issued the request, and in some cases even analysed the results and placed the order.
For a CIO, this is something powerful and disruptive, and it's not an "if" or a "maybe." We can see already with Siri and some of the Siri competitors emerging how this technology is taking hold. And because bandwidth, storage, and processing power are growing exponentially, we'll see more advanced versions of the UIEA coming onboard very quickly. We are now on the brink of having access to a powerful new tool that will do much of the search and sorting work for us, with far more intelligence and personalisation than we have had in the past.
When you know what's going to happen before it happens, you have the upper hand. Therefore, as a CIO you have to start looking at how you can both control and use this technology in the enterprise rather than waiting for your competitors to use it, which puts you in the position of having to play catch-up. And just like the BYOD phenomenon that seemingly (unless you were looking, of course) came out of nowhere and caught many CIOs off guard, UIEAs will probably need to be supported by IT in much the same fashion and, potentially, on the same time scales.
Remember, when the iPad first came out, the press and business community didn't pay much attention to it. The consensus was, "Wow, it's just an iPod Touch, but bigger." It was viewed as a novelty. At the time, I asserted that they were wrong, that the iPad would be a big game changer for business. Now we can see that what I predicted occurred: the iPad has become transformational for business. People are now making the same mistake with Siri. They're viewing it as a fun toy. I'm here to say that it's way more than that. In fact, it's an even bigger transformation for business than the iPad was.

BYOD, UIEA and working smarter
In addition to helping your customers, your company's UIEA will be able to help your employees work smarter. Chances are you have many employees who don't always have access to a computer screen but still need information. This could pertain to employees who are on the road, such as salespeople, as well as those in the field, such as repair and maintenance people or engineers.
These people can pull out their smart phone or tablet and ask their intelligent agent for detailed information. For example, suppose you have a maintenance person fixing an air conditioner. He can pull out his smartphone and ask his agent, "Do I have this part in my truck?" And the agent can reply, "No, you don't have that part in your truck." He can then ask, "Well, do we have it back at the shop?" As he asks and gets his answers, he's still working and doing maintenance, essentially multiplying his time. Instead of having to go back to a laptop in the truck or type search terms into a smart phone, employees simply ask a question and have access to all of the information they need, including diagrams and videos for just-in-time training. These UIEAs are the way to help employees do more with less.

"People are viewing Siri as a fun toy, but it is way more than that."

As the CIO, you need to determine how your company can use its UIEA in the various functions. Just like mobility is driving a transformation of almost every business process, including purchasing, logistics, supply chain and more, we can do the same with an UIEA.

Daniel Burrus is considered one of the world's leading technology forecasters and business strategists, and is the founder and CEO of Burrus Research, a research and consulting firm that monitors global advancements in technology-driven trends to help clients better understand how technological, social and business forces are converging to create enormous, untapped opportunities.
Power Play or Meltdown?
It's time to take a hard look at Mobile Fusion and see what's really there for IT
BY PAM BAKER
Research in Motion (RIM) hopes to be the comeback kid in the mobile battles for king of the BYOD hill. At the very least, they hope to secure their position firmly on the side of enterprise IT. That's no small feat considering Blackberry has been covered in mud and sliding down the hill for quite some time now. From last year's service outages across the U.S., Europe, the Middle East and Africa to constantly befuddled co-CEOs and continuously slipping device popularity, there's been little to spark a glint of interest from IT leaders, let alone solicit a look of admiration. But the company hopes to retake the hill (and IT's patronage) with a three-part strategy.
The first is Mobile Fusion, a mobile device management (MDM) play launched in April. The second and third are the Blackberry 10 platform and smartphone, unveiled Tuesday. While the reception of all three has been mostly positive, the underlying question remains: Can RIM deliver? It's time to take a hard look at Mobile Fusion and see what's really there for IT.
While MDM is far from a mature industry, several major players long ago launched into the space and made offerings that enterprise IT finds largely helpful. This means RIM is already entering a competitive field with a late product. So, how does RIM measure up?
"Overall I was hoping for more in the new offering," said Alex Bratton, president and CEO of Lextech Global Services, a global app developer for mobile workforces. "This launch seems on par or behind most mobile device management solutions already on the market."

One dashboard, two platforms
Mobile Fusion combines two platforms, one for Blackberry and one for non-Blackberries, i.e. Apple and Android, into a single dashboard. There is no doubt that such visibility and control is helpful to IT, but is it enough?
"The primary benefits to IT will be the ability to apply knowledge of managing a BES server to working with other brands of mobile devices," said Bratton. "But the implementation of a BB Mobile Fusion server isn't simple, per the manuals, and will take more learning on the part of IT."
And there are other drawbacks to the dual-platform approach to consider. "How current will RIM keep the non-Blackberry platform?" asked Dan Croft, CEO of Mission Critical Wireless (MCW), an enterprise mobile management firm whose client list includes RIM. "Considering MobileIron releases new versions every quarter, will RIM keep a comparable pace on the non-Blackberry platform? If not, that could be a problem." Overall, Croft says RIM's universal device service (UDS) is "somewhat thin compared to other solutions."
Shooting at their foot?
"The flip side to this will be speeding the adoption of other, non-RIM mobile handsets in the enterprise," said Bratton. If enterprise IT does adopt Mobile Fusion in appreciable numbers, and RIM cranks out a steady stream of updates and new versions of its non-Blackberry platform, then IT may be more likely to accept a larger number of devices from a broader sweep of brands. Ultimately, this could prove detrimental to the Blackberry brand. Enterprise is, after all, where Blackberry has its strongest influence. It must at the very least hold its ground in the enterprise market and grow a larger consumer market share to survive.
But without devices that appeal to the consumer market, the BYOD market is a terribly difficult incline to scale. Consumers, i.e. employees, are far more likely to choose the more popular Apple and Android phones if IT gives them the choice. Which is why, of course, RIM followed the launch of Mobile Fusion with the unveiling of Blackberry 10. But those phones will not be available for purchase until later this year, and that could very well be too late to change the tide for RIM. On the other hand, if RIM fails to keep its dual platforms working smartly and on top of device changes, Blackberry could lose its tenuous hold on enterprise altogether. Neither scenario is a good one for RIM.
However, if RIM were to steel its nerves, boldly jump ahead of trends and truly innovate MDM, then its products could see a strong and sustainable upswing. "RIM has the opportunity to really innovate and solve some of the hard problems, like they tried with BB Balance for separating personal and corporate data. Let's see that same thing for other platforms," said Bratton.
RIM is no stranger to overcoming seemingly immovable obstacles to success. In its earliest days, the company's idea for a Blackberry was almost aborted by carriers too dim-witted to see the future of text and mobile email and too greedy to give the technology a chance before profit was assured. Yet RIM found a way to survive and prosper despite strong carrier resistance. RIM needs to summon strength from its innovative and resilient roots to pull off a comeback now.
"Writing RIM's obituary is overplaying the situation," said Croft. "Do they have troubles? Yes. Are they going out of business? No, I don't think so. They're not going to be gone next week, next month, or anytime soon."
RIM will live to fight another day. But will they make it to king of the BYOD hill? It all depends on how well they deliver over the next 12 months and whether the company is willing to step beyond its fears and truly innovate again.
— A prolific and versatile writer, Pam Baker writes about technology, science, business, and finance for leading print and online publications including ReadWriteWeb, CIO and CIO.com, Institutional Investor, Fierce Markets Network, I Six Sigma magazine, CIO Update, E-Commerce Times, and many others.
Ensuring Private Cloud Security
BY VON WILLIAMS
It is somewhat ironic that one of the main reasons IT directors give for preferring a private cloud in their data center over leveraging a public cloud is their concern for security. There appears to be some projection going on here. Unless an organisation is in a regulated industry that requires proof of security (PCI DSS, HIPAA, FERPA, FISMA, ITAR and the like), the level of security in many data centers today could best be characterised as "not so much." Admittedly, the overriding demands of keeping everything up and running leave little time in many IT shops for diligent security. Where they exist at all, security policies are often incomplete, outdated and untested.
The commitment to implement a private cloud environment in many data centers, as a result, provides a strategic opportunity to develop a comprehensive security policy from scratch. A security initiative needs to be a detailed, disciplined process, but it doesn't have to be overwhelming. Once you set a few processes in motion, developing a dependable security policy becomes a very logical, even mechanical, undertaking that proceeds by extending the right building blocks throughout the IT infrastructure.
Technologies and procedures exist today that can ensure that your private cloud complies with your security policy. But you have to have a security policy to apply in the first place. A best-practices approach to upgrading and/or creating a security policy that is appropriate for your organisation focuses on five basic security components:
risk management, data ownership, data classification, auditing and monitoring, and incident response.
Risk management
The first step in the development of a security programme needs to be a risk management policy that establishes how much risk your organisation is willing to accept and identifies the security and privacy requirements needed to ensure compliance with the federal and state regulations that oversee your industry. To be authoritative, a risk management policy needs to have the input and buy-in of key stakeholders, including finance, HR and all the C-level people in the executive suite.
Go as high as you can get. You'll need top-down support to implement your security policy throughout your organisation. The advantage of a thorough and thoughtful risk management policy is that it takes the mystery out of privacy and security requirements and replaces a nagging uncertainty with clearly delineated rules that can be extended confidently throughout your IT infrastructure as it grows to include private, public and hybrid clouds.
Data ownership
The champions and enforcers of the risk management policy are the designated data owners in specific business units who are
ultimately responsible for the protection and use of a specific subset of information. This is where the buck stops. The data owner decides the classification of the data that is to be maintained and is held responsible for any corruption of or unauthorised access to the data in his or her charge. Data owners are also responsible for ensuring that the necessary security controls are in place and proper access rights are being used, defining security requirements and backup requirements, approving any disclosure activities, and defining user access criteria.
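One way to make those responsibilities auditable is to record them per data set in a simple ownership register that reviews can check. The sketch below is only a minimal illustration; the field names and example values are assumptions, not a format prescribed by the author or by Logicalis:

    # Minimal sketch of a data-ownership register entry; the fields mirror
    # the responsibilities listed above. Names and values are assumptions.
    from dataclasses import dataclass
    from typing import List

    @dataclass
    class OwnershipRecord:
        data_set: str                 # e.g. "payroll database"
        owner: str                    # accountable person in the business unit
        classification: str           # private, confidential or public
        security_controls: List[str]  # controls the owner has signed off on
        backup_requirement: str       # e.g. "daily, 35-day retention"
        disclosure_approver: str      # who approves any disclosure activity
        access_criteria: str          # who may be granted access, and why

    payroll = OwnershipRecord(
        data_set="payroll database",
        owner="HR business unit lead",
        classification="confidential",
        security_controls=["encryption at rest", "role-based access"],
        backup_requirement="daily, 35-day retention",
        disclosure_approver="HR business unit lead",
        access_criteria="payroll team members only",
    )
    print(payroll.owner, "owns", payroll.data_set)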
Data classification
A primary role of a data owner is the thorough classification of all data under his purview into three typical groups: private, confidential and public. A spreadsheet with financial data, for example, would be private and confidential. A memo about the company picnic would be public. Once the data owner has made the classification, the level of security that needs to be associated with specific types of data becomes obvious. The classification determines who needs to access specific data, as well as who should not. Access controls and levels of encryption can then be set accordingly. The data classification policy also serves as a record of truth that you need to make available to everyone so that the appropriate security is consistently applied.
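As a rough illustration of how a classification, once assigned, can drive access controls and encryption automatically, the sketch below maps the three groups named above to minimum controls. The control values and the strict default for unlabelled data are assumptions made for the example, not requirements taken from the article:

    # Illustrative mapping from data classification to minimum controls.
    # The three labels follow the article; the control values are assumptions.
    CONTROLS = {
        "private": {
            "encryption": "required at rest and in transit",
            "access": "data owner approval, named individuals only",
        },
        "confidential": {
            "encryption": "required at rest",
            "access": "business-unit role membership",
        },
        "public": {
            "encryption": "not required",
            "access": "any authenticated employee",
        },
    }

    def required_controls(classification: str) -> dict:
        """Return minimum controls for a label; unknown or missing labels
        default to 'private' so unclassified data gets the strictest handling."""
        return CONTROLS.get(classification.lower(), CONTROLS["private"])

    print(required_controls("confidential"))   # business-unit role membership
    print(required_controls("picnic memo"))    # not a label, so treated as private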
Auditing and monitoring
Once you have the controls in place, you have to keep an unblinking eye on how well they work. This function is generally provided by a security information and event management (SIEM) system that records successful and failed login attempts into key systems, configuration changes and system activities. SIEM lets you see who is accessing your systems and how. Ideally, logs of who is getting into your systems should accurately reflect the access policies that you have in place. Importantly, SIEM also includes log correlation between various security components, enabling you to reconstruct the series of events that led up to a security incident.
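To show the kind of correlation being described, the sketch below scans a stream of login events and flags an account that fails repeatedly and then succeeds within a short window. The event format, account names and thresholds are invented for the example and do not come from any particular SIEM product:

    # Minimal sketch of SIEM-style log correlation: flag an account that
    # records several failed logins and then a success inside one window.
    from collections import defaultdict
    from datetime import datetime, timedelta

    events = [  # (timestamp, account, outcome) stands in for collected logs
        (datetime(2012, 5, 7, 9, 0, 5),  "svc-backup", "failure"),
        (datetime(2012, 5, 7, 9, 0, 9),  "svc-backup", "failure"),
        (datetime(2012, 5, 7, 9, 0, 14), "svc-backup", "failure"),
        (datetime(2012, 5, 7, 9, 0, 20), "svc-backup", "success"),
        (datetime(2012, 5, 7, 9, 5, 0),  "j.smith",    "success"),
    ]

    def correlate(events, max_failures=3, window=timedelta(minutes=5)):
        """Yield (account, time) when a success follows max_failures
        failed attempts for that account within the window."""
        failures = defaultdict(list)
        for ts, account, outcome in sorted(events):
            if outcome == "failure":
                recent = [t for t in failures[account] if ts - t <= window]
                failures[account] = recent + [ts]
            elif outcome == "success" and len(failures[account]) >= max_failures:
                yield account, ts

    for account, ts in correlate(events):
        print(f"ALERT: {account} succeeded at {ts} after repeated failures")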
Incident response
Your incident response policy covers what to do when your SIEM tool says, "Hey! We've got suspicious activity going on here. Do something." Exactly what to do is outlined in detail in the incident response policy. As a general rule of thumb, the more lax a company is on security, the better and faster its incident response needs to be, because it will not see an attack coming as early as it would with better security controls around its systems.

Now you see 'em…
Although the policies and rules that make up an appropriate security programme for your organisation in a conventional infrastructure can be extended to a private cloud environment, applying security policies in a virtualised environment does add another dimension to everything. As a function of abstracting software and data from the underlying hardware, data moves around in a virtualised environment. VMware, for example, makes it possible to move workloads dynamically within pools of resources. Strict security policies can be applied to data in virtualised pools of resources, but applying them consistently requires close coordination between the virtualisation administrators intent on moving workloads around to optimise performance and the security team determined to lock things up in safe places. The two groups apply different logic in their respective roles and, if they are not talking to each other, it's entirely possible that the best intentions of one can pre-empt the best intentions of the other.
Technology is emerging that will make it possible to embed security requirements in metadata that travels with corporate data in a virtualised environment and ensures it stays within the appropriate security zones. In the meantime, the only way to ensure that no HIPAA data, for example, ends up on a non-encrypted storage device is for virtualisation administrators to manually check data classifications before they allow data to move to a new location.
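Until metadata-borne policies of that kind mature, the manual check can at least be scripted as a gate in the change workflow. The sketch below is a minimal illustration; the datastore names, attributes and tagging scheme are assumptions, not features of any particular hypervisor product:

    # Illustrative pre-migration guard: block moving a workload whose data
    # classification requires encryption onto an unencrypted datastore.
    # Datastore names, attributes and labels are assumptions for the sketch.
    REQUIRES_ENCRYPTION = {"private", "confidential"}   # e.g. HIPAA-scoped data

    DATASTORES = {
        "ds-encrypted-01": {"encrypted": True},
        "ds-bulk-02":      {"encrypted": False},
    }

    def move_allowed(classification: str, target_datastore: str) -> bool:
        """Return True only if the target datastore satisfies the workload's
        classification; otherwise the move should be escalated for review."""
        needs_encryption = classification.lower() in REQUIRES_ENCRYPTION
        return DATASTORES[target_datastore]["encrypted"] or not needs_encryption

    print(move_allowed("confidential", "ds-bulk-02"))   # False: hold the move
    print(move_allowed("public", "ds-bulk-02"))         # True: safe to proceed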
Injecting security
Fortunately, the hypervisor, the same technology that turns security into a fast-paced game of hide-and-seek, also provides an ideal place to apply security to specific environments. By introducing anti-virus software at the hypervisor level, for example, all the servers running within it are protected. Security zones can also be stipulated at the hypervisor level. The bottom line is that it is possible to apply as tight a security policy to a private cloud as you can to a conventional infrastructure. And, once you have identified what it takes to keep your data secure in your private cloud, you will also have identified the security requirements that need to be met by a public cloud provider should you decide to extend your infrastructure outside your data center.

High-level security policies
The following is a high-level list of the policies Logicalis has in place at its public enterprise cloud facilities. Each high-level policy has branches of sub-policies within it. These include: acceptable use policy; information security policy; exception policy; policy terminology definition policy; change control policy; data classification policy; information security risk management policy; access control policy; data retention, archiving and disposal policy; media handling policy; firewall and router security administration policy; system configuration policy; anti-virus and malicious code policy; data backup policy; cryptography and encryption policy; software development policy; mobile computing policy; security logging and monitoring policy; security incident management policy; and business continuity and disaster recovery policy.
Von Williams is director of Information Security and Governance for Logicalis, an international provider of integrated information and communications technology solutions and services, where he is responsible for providing security advice and delivering solutions to meet customers' security needs. Mr. Williams holds multiple certifications: CISSP, CISM, CISA and CRISC. Before joining Logicalis in 2010, he worked as a security expert for FirstGroup, Convergys, and Sallie Mae.
VIEWPOINT STEVE DUPLESSIE | steve.duplessie@esg-global.com
Thursday Thoughts on Friday
Mostly because I forgot to hit send before I headed out last night

I AM psyched for my presentation at EMC World. Not only because they never have anyone like me on stage (I do love to be a pioneer), and not only because they really have not even once tried to manipulate me (many years together, maybe they gave up!), but because I've made 2012 the year of my Vanity Tour. I'm only doing huge rooms now, no more club gigs! Nothing gets the juices flowing like thousands of people staring at you, waiting for you to make an ass of yourself. I have a few surprises in store for this since it's such a big to-do. Coming soon. Let's just say I've gone above and beyond and convinced some very famous people to do some absurdly silly promotion for me. Love that. YouTube, here we come. By the way, my presentation will be setting the world straight on all this flash business. What's what, what's not, and what to care about.
Two, tonight is big Vince Wilfork's annual charity fundraiser for diabetes. He raises a good pile of money, and you should help out if you can (www.vincewilfork75.com). If you are a New England local,
come to Pinz in Milford, MA, and bowl with the big man. There are always a bunch of other Pats around, so bring the kids. They sign a million autographs.
If you were one of the zillion people who made it out, thanks. Vince raised about $200k last night! If you didn't make it, you should donate if you can. Diabetes blows. Vince and others signed everything and anything, took pictures all night, and were just awesome with the kids. He is a very humble man when it comes to this stuff. Truly appreciates the help. Send a donation and he'll call you out on Twitter.
Remember FalconStor? Seems they aren't ready to give up just yet. I just met with CEO Jim McNeil and crew, and suffice it to say that after a tough year cleaning up a tough series of messes, they have a plan that looks not only doable, but pretty darned impressive. I love a realist, and Jim is nothing if not that. Clean up the crap, focus on what's important, and move ahead. Tough not to root for them. Early indications are things are working. Had a few customer calls, and they seem genuinely psyched with the quality
improvements in the products and the general direction of the firm.
I just interviewed a video dude with the best name ever: Octavius Horn. Pretty much have to hire him regardless, with a name like that. I'm no idiot. Video guy named Octavius comes to you, you hire him. Life lesson.
HP has met the enemy, and it is them. Change is hard in a mongo-big company. What's interesting is that I think they realise it, at least the powers that be. They have a really good team up top these days, who recognise that they don't need to focus too much on what other competitors are doing. They need to worry about themselves. Break old, bad habits and develop new good ones. That's how they turn that big ship. I think it's true of all mongo-big firms. At some point, you become so big that the barriers to your success come from within, not from some competitor beating on you. No one would ever draw up a big tech company to function the way that they do. They can only happen organically. For example, you would never, ever decide to structure IBM, Dell, HP, Cisco or others.

ABOUT THE AUTHOR: Steve Duplessie is the founder of and Senior Analyst at the Enterprise Strategy Group. Recognised worldwide as the leading independent authority on enterprise storage, Steve has also consistently been ranked as one of the most influential IT analysts. You can track Steve's blog at http://www.thebiggertruth.com