CTO FORUM
Technology for Growth and Governance
June | 07 | 2012 | 50 Volume 07 | Issue 20
HOW TO MAKE USER EXPERIENCE COUNT | WORKING WITH HYBRID CLOUDS | ADAPTING LEGACY NETWORKS TO THE CLOUD
I BELIEVE
Reduce Risk Exposure Using Technology PAGE 04
BEST OF BREED
Teaching Best Practices PAGE 18
A QUESTION OF ANSWERS
Moving from Products to Services PAGE 14
A 9.9 Media Publication
EDITORIAL YASHVENDRA SINGH | yashvendra.singh@9dot9.in
Lead By Example: A CIO himself has to be social media savvy if he wants internal users to be social themselves

When SAP's global CIO, Oliver Bussmann, decided to put a mobile strategy in place, he was absolutely clear on one thing. He knew the IT department had to be in total control for the strategy to be successful. "The IT organisation has to be in the driver's seat. If the CIO doesn't embrace the mobile trend, then the business organisation bypasses the IT organisation, and that's not a good thing. Then it is being done without control and security, and that can have an impact potentially on the company," he said. Bussmann not only devised a mobile strategy but implemented it marvellously. SAP today deploys 14,000 iPads, 17,000 BlackBerrys and 8,000 iPhones worldwide. The result has been heartening for the company, giving a big boost to its service support and productivity.

EDITOR'S PICK, PAGE 28: Recharge Your Enterprise. The rise of mobility in the enterprise is charging up both the employee and the enterprise.

There are several technology decision-makers in India who have realised there is no way to escape this emerging megatrend of mobility within enterprises. However, there are many more who are yet to realise its potential and impact. For those of you weighing the options of a mobility plan for your organisation, securing critical corporate information residing on mobile devices is the biggest challenge. Today there are innovative mobile device management tools available that store crucial information in a secure section within the mobile device. If the device is stolen or lost, this important data can be remotely erased while retaining the user's personal information.

We at CTO Forum believe that the success of any mobile strategy starts and ends with the CIO. A CIO himself has to be social media savvy if he wants internal users to be social themselves. Without embracing social media, a CIO will not be able to fathom the ways in which it could benefit his enterprise. Interestingly, Bussmann recently topped the list of the most socially active CIOs. The survey of Fortune 250 and Global 250 CIOs was conducted by software vendor harmon.ie. Bussmann (9,824 points) beat Google's Benjamin Fried (7,758 points) into second place by a wide margin.

In this issue's cover story, we have thrown light on enterprise mobility and its various aspects. We hope it will aid you in shaping your mobile strategy. Do write to us about your experiences and challenges with enterprise mobility.
THE CHIEF TECHNOLOGY OFFICER FORUM
CTO FORUM 07 JUNE 2012
JUNE 12 CONTENTS
THECTOFORUM.COM
COVER STORY
28 | Recharge Your Enterprise: The rise of mobility is charging up both the employee and the enterprise.

COLUMNS
4 | I BELIEVE: REDUCE RISK EXPOSURE USING TECHNOLOGY BY AMITABH MISRA
52 | VIEW POINT: CLOUD COMPUTING: What's a Service? Who Are SPs? BY KEN OESTREICH
COVER DESIGN SUNEESH K
Please Recycle This Magazine And Remove Inserts Before Recycling
COPYRIGHT, All rights reserved: Reproduction in whole or in part without written permission from Nine Dot Nine Interactive Pvt Ltd. is prohibited. Printed and published by Kanak Ghosh for Nine Dot Nine Interactive Pvt Ltd, C/o Kakson House, Plot Printed at Tara Art Printers Pvt Ltd. A-46-47, Sector-5, NOIDA (U.P.) 201301
FEATURES
18 | BEST OF BREED: TEACHING BEST PRACTICES Design a process that fits your firm’s existing structure
www.thectoforum.com
Managing Director: Dr Pramath Raj Sinha
Printer & Publisher: Kanak Ghosh
Publishing Director: Anuradha Das Mathur

EDITORIAL
Executive Editor: Yashvendra Singh
Consulting Editor: Atanu Kumar Das
Assistant Editor: Varun Aggarwal
Assistant Editor: Ankush Sohoni

DESIGN
Sr Creative Director: Jayan K Narayanan
Art Director: Anil VK
Associate Art Director: Atul Deshmukh
Sr Visualiser: Manav Sachdev
Visualisers: Prasanth TR, Anil T & Shokeen Saifi
Sr Designers: Sristi Maurya & NV Baiju
Designers: Suneesh K, Shigil N, Charu Dwivedi, Raj Verma, Prince Antony, Peterson Prameesh Purushothaman C & Midhun Mohan
Chief Photographer: Subhojit Paul
Sr Photographer: Jiten Gandhi
Contributor Designer: Sameer Kishore
ADVISORY PANEL
Anil Garg, CIO, Dabur
David Briskman, CIO, Ranbaxy
Mani Mulki, VP-IT, ICICI Bank
Manish Gupta, Director, Enterprise Solutions AMEA, PepsiCo India Foods & Beverages, PepsiCo
Raghu Raman, CEO, National Intelligence Grid, Govt. of India
S R Mallela, Former CTO, AFL
Santrupt Misra, Director, Aditya Birla Group
Sushil Prakash, Sr Consultant, NMEICT (National Mission on Education through Information and Communication Technology)
Vijay Sethi, CIO, Hero MotoCorp
Vishal Salvi, CISO, HDFC Bank
Deepak B Phatak, Subharao M Nilekani Chair Professor and Head, KReSIT, IIT - Bombay
A QUESTION OF ANSWERS
14 | Moving from Products to Services: Vishal Awal, Executive Director, Services, Xerox South Asia talks about the transformation of Xerox from a products company to a services firm
REGULARS
01 | EDITORIAL
06 | LETTERS
08 | ENTERPRISE ROUND-UP

ADVERTISERS' INDEX
38 | NEXT HORIZONS: WORKING WITH HYBRID CLOUDS The most important consideration is what cloud technology means to you
46 | TECH FOR GOVERNANCE: WHY APPSEC WON’T ALWAYS BAIL YOU OUT AppSec is a process that should be invoked right at the inception of SDLC
IBM: IFC
CTRL S: 5
HP PSG: 7
Datacard: 11
Airtel: 13
SAS Institute: 25
Hitachi Data Systems: 27
PID Pvt Ltd: 37
Nokia: IBC
Microsoft: BC
SALES & MARKETING
National Manager – Events and Special Projects: Mahantesh Godi (+91 98804 36623)
National Sales Manager: Vinodh K (+91 97407 14817)
Assistant General Manager Sales (South): Ashish Kumar Singh (+91 97407 61921)
Senior Sales Manager (North): Aveek Bhose (+91 98998 86986)
Product Manager – CSO Forum and Strategic Sales: Seema Menon (+91 97403 94000)
Brand Manager: Gagandeep S Kaiser (+91 99999 01218)

PRODUCTION & LOGISTICS
Sr. GM Operations: Shivshankar M Hiremath
Manager Operations: Rakesh Upadhyay
Asst. Manager – Logistics: Vijay Menon
Executive Logistics: Nilesh Shiravadekar
Production Executive: Vilas Mhatre
Logistics: MP Singh & Mohd. Ansari

OFFICE ADDRESS
Published, Printed and Owned by Nine Dot Nine Interactive Pvt Ltd. Published and printed on their behalf by Kanak Ghosh. Published at Bungalow No. 725, Sector - 1, Shirvane, Nerul, Navi Mumbai - 400706. Printed at Tara Art Printers Pvt Ltd, A-46-47, Sector-5, NOIDA (U.P.) 201301.
Editor: Anuradha Das Mathur
For any customer queries and assistance please contact help@9dot9.in
This index is provided as an additional service. The publisher does not assume any liability for errors or omissions.
I BELIEVE
AMITABH MISRA, Vice President, Engineering, Snapdeal. THE AUTHOR has worked in the IT domain for over 16 years, with MNCs such as Wells Fargo Bank and Verisign
Reduce Risk Exposure Using Technology
Efficient use of technology can not only help mitigate risk but also enable business growth

CURRENT CHALLENGE: REDUCING IT AND BUSINESS RISKS WITHOUT CREATING COMPLICATIONS FOR THE USERS

THERE is a lot that a CIO can do to contain the risk exposure of the company, while at the same time ensuring a good customer experience. Handling frauds and cyber attacks is among the biggest risks for any online retailer. There are risks associated with Cash on Delivery, wherein many users with mala fide intentions order multiple high-value items and either refuse to take the delivery or give an incorrect address. This impacts the inventory and therefore the business. To curtail such risks, we have set up a comprehensive fraud and risk management system. With its help, we can create a customer credit rating based on previous transactions. If a customer has refused to take delivery during Cash on Delivery, we put separate checks in place to ensure that the same is not repeated. With this system, we also mark pincodes that have a higher probability of frauds. This directly impacts the bottom line for the business. At the same time, we believe that controls shouldn't be a deterrent for users and need to be applied appropriately.

The system doesn't stop at just that. We also use it to ensure no employee can misuse customer information. We segregate customer data from transaction data and ensure that anyone who has access to customer data doesn't have access to the transaction data, and vice versa. We also track all activities on production servers, and an alert is triggered as soon as an anomaly is observed.

There are various other things that we do with the help of technology to improve customer satisfaction. For example, we noticed a transaction failure rate of over 40 percent in external payment gateways. Instead of making it cumbersome for the user, our system selects the best performing gateway for that particular moment. Even after doing that, if the transaction doesn't go through, our system alerts the user, followed by a call from our customer service department requesting the user to retry. This has reduced the drop rate from 40 percent to a mere 5 percent. We are working to reduce it further.
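The gateway-selection step described above can be sketched in a few lines. This is a hypothetical illustration, not Snapdeal's actual system: the gateway names, the sliding-window success tracking and the routing rule are all assumptions.

```python
from collections import deque

class GatewayRouter:
    """Route each payment to the gateway with the best recent success rate.

    Success rates are computed over a sliding window of recent attempts,
    so a gateway that starts failing is quickly demoted.
    """

    def __init__(self, gateways, window=100):
        self.gateways = list(gateways)
        self.history = {g: deque(maxlen=window) for g in self.gateways}

    def record(self, gateway, success):
        # Remember the outcome of one payment attempt on this gateway.
        self.history[gateway].append(bool(success))

    def success_rate(self, gateway):
        h = self.history[gateway]
        # Unproven gateways get the benefit of the doubt.
        return sum(h) / len(h) if h else 1.0

    def best_gateway(self):
        return max(self.gateways, key=self.success_rate)

router = GatewayRouter(["gateway_a", "gateway_b"])
for ok in [True, False, False, False]:   # gateway_a failing often
    router.record("gateway_a", ok)
for ok in [True, True, True, False]:     # gateway_b mostly succeeding
    router.record("gateway_b", ok)
print(router.best_gateway())  # gateway_b
```

A real deployment would also need per-gateway timeouts and the retry path the article describes, falling back to the next-best gateway when a transaction still fails.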
LETTERS

CTOForum LinkedIn Group
Join over 900 CIOs on the CTO Forum LinkedIn group for latest news and hot enterprise technology discussions. Share your thoughts, participate in discussions and win prizes for the most valuable contribution. You can join The CTOForum group at:
www.linkedin.com/groups?mostPopular=&gid=2580450

COVER STORY OF THE MAY 07, 2012 ISSUE (Volume 07, Issue 18): BIG DATA: CHALLENGE OR OPPORTUNITY? Page 24. As data within enterprises grows at an exponential pace, the need to store, manage and analyse it is becoming a challenge. CTO Forum tries to investigate if this big data problem is just a challenge or an opportunity for the enterprise. BY VARUN AGGARWAL
(Also on the May cover: Seven Steps to Avoid Failure; Assessing Mobile Applications; Managing Your Private Cloud; It's Time to Rethink Outsourcing; Next Horizons: Is Siri Smarter than Google? Page 42; Tech for Governance, Page 36; Best of Breed, Page 18)

Some of the hot discussions on the group are:

OPEN SOURCE VS PROPRIETARY SOFTWARE
Practically, how many of you feel open source or free software is a better solution than any proprietary software?
I would rather say that your call should depend on the criticality of the application to the enterprise business requirement, as open source applications can have security breaches and lack support in the worst-case scenario.
—Vishal Anand Gupta, Interim CIO & Joint Project Director, HiMS at The Calcutta Medical Research Institute

ARE CTOS MORE INTERESTED IN SATISFYING THE CFO & BOARD RATHER THAN THE CONSUMER?
If the CTO is aligned to the CFO and the Board, in that order, the CTO will also have to be good at resume writing, as he will not last too long. But then the question arises: is the CFO aligned to the consumer? If he is not, then even he may be in hot water sooner or later.
—Arun Gupta, Group CIO, Shoppers' Stop

CTOF Connect
With an increasing number of DDoS attacks, Brad Rinklin, CMO, Akamai, talks about some of the ways in which they can be mitigated. To read the full story go to:
http://www.thectoforum.com/content/ddos-attacks-have-increased-20-times-last-3-years-0

BUSINESS READY TO INVEST
When I saw the news about business wanting to spend on IT, it was like an oasis. Progression of CEOs showed the IT investment trend line going north. To read the full story go to:
http://www.thectoforum.com/content/business-ready-invest-0
—Arun Gupta, CIO, Cipla

WRITE TO US: The CTOForum values your feedback. We want to know what you think about the magazine and how to make it a better read for you. Our endeavour continues to be a work in progress, and your comments will go a long way in making it the preferred publication of the CIO community. Send your comments, compliments, complaints or questions about the magazine to editor@thectoforum.com
FEATURE INSIDE: Global Mobile Payments Cross $170 Bn Mark Pg 10

ENTERPRISE ROUND-UP
Only 17% of Global PCs Without AntiVirus: Study A McAfee survey reveals 83% of PCs have basic security software
MCAFEE has conducted a global study, analyzing data from voluntary scans of an average of 27-28 million PCs per month, to determine a global estimate of the number of consumers who have basic security software installed. The study found that 83 percent of consumers had working basic security protection, and 17 percent of PCs scanned either had no anti-virus installed, or had the software installed but disabled. "It's gratifying to see that the majority of consumers have gotten the message that at the very least they need to have basic security protection installed," said
Todd Gebhart, co-president of McAfee. “Protecting digital devices against cybercrime from malware not only benefits each of us personally, but also serves to discourage illicit activity and preserve the integrity of the Internet, which benefits the greater good.” Globally, the study found that 83 percent of consumers had working basic security protection. 17 percent of PCs scanned either had no anti-virus software installed or it was installed but had expired. The country with the highest percentage of PCs with basic security protection is Finland with 90.3 percent of its consumers protected and 9.67 percent unprotected.
DATA BRIEFING
$18.3 Billion
Worldwide IT Ops Mgmt software market in 2011
THEY SAID IT: BILL GATES
Bill Gates spoke to NDTV recently on varied topics, and when quizzed about Mark Zuckerberg, he said, "Mark's been great at reaching out to a lot of people, including myself, about what we might be able to say. Mark's in a unique situation. We have a few things in common. We both dropped out of Harvard."
30% of IT Projects Driven by Compliance: Firms' risk and compliance priorities are changing
"I gave a lecture in Harvard where I talked about... 'Hey, if you've got a great idea, don't worry about dropping out.' He attended that lecture, so it's a funny connection there." —Bill Gates, Chairman, Microsoft
MCAFEE has announced findings from its annual study that highlights how IT decision-makers view and address the challenges of risk and compliance management in a highly regulated and increasingly complex global business environment. The report, titled Risk and Compliance Outlook: 2012, found that Database Security and Security Information and Event Management (SIEM) were among the top priorities due to increased advanced persistent threats (APTs). Database security has been an ongoing concern for organizations due to highly publicized data breaches and growing regulatory compliance demands. The largest portion of an enterprise's most sensitive and valuable information resides in databases. When asked about sensitive database breaches, over one quarter of respondents had either had a breach or did not have the visibility to detect one. In addition, respondents listed databases as the top challenge in meeting regulatory mandates. The other top concern was SIEM, with the study finding that most organizations rely on legacy systems that do not meet their current needs. Ever-changing threats, data breaches and IT complexity add to the burden of being able to monitor security events, detect attacks, and assess real and potential risk.
QUICK BYTE ON MOBILITY
Worldwide mobile payment transaction values will surpass $171.5 billion in 2012, a 61.9 percent increase from 2011 values of $105.9 billion, according to Gartner, Inc. The number of mobile payment users will reach 212.2 million in 2012, up from 160.5 million in 2011. —Gartner
Global Mobile Payments Cross $170 Bn Mark
Fragmented service offerings to be seen in the short term

WORLDWIDE mobile payment transaction values will surpass $171.5 billion in 2012, a 61.9 percent increase from 2011 values of $105.9 billion, according to Gartner, Inc. The number of mobile payment users will reach 212.2 million in 2012, up from 160.5 million in 2011.

"We expect global mobile transaction volume and value to average 42 percent annual growth between 2011 and 2016, and we are forecasting a market worth $617 billion with 448 million users by 2016," said Sandy Shen, research director at Gartner. "This will bring opportunities for service and solution providers who will need to cater to the local demand patterns to customise their offerings."

GLOBAL TRACKER: PC growth in India. The combined PC market in India totalled 2.8 million units in Q1 2012, a 6.6% increase over Q1 2011.

The mobile payments market will experience fragmented services and solutions for the next two years. Technology providers will have to tailor their solutions to local markets that will be using different access technologies, business models and partners, under different regulatory conditions. "There will be a few global players that have the scale and resources to serve large customers and the mass market, whose requirements can be readily satisfied by standard solutions," Shen said. "However, there will always be segments that cannot be sufficiently served by the global players. The demand of these segments can only be satisfied by specialised or local players who can better understand the segment and have specific solutions to meet the unique challenges."

SMS remains the dominant access technology in developing markets because of the constraints of mobile devices and the ubiquity of SMS. Web/WAP is the preferred access technology in North America and Western Europe, where mobile Internet is commonly available and activated on user devices. Gartner expects Web/WAP access to account for about 88 percent of total transactions in North America and about 80 percent in Western Europe by 2016.

Near Field Communication (NFC) transactions will remain relatively low through 2015, although growth will start to pick up from 2016. "NFC payment involves a change in user behavior and requires collaboration among stakeholders that includes banks, mobile carriers, card networks and merchants," said Shen. "It takes time for both to happen, so we don't expect NFC payments to come into the mass market before 2015. In the meantime, ticketing, rather than retail payment, will drive NFC transactions."

Merchandise purchases will drive transactions in North America and Western Europe. These will include e-commerce purchases, where users buy online, as well as in-store purchases. Major e-tailers such as Amazon and eBay have developed strong mobile storefronts and have seen significant growth from the mobile channel.
For in-store purchases, Starbucks' Card Mobile app is now being rolled out nationwide in the U.S., following a successful pilot program, and Gartner expects a large number of merchants to introduce their own mobile payment services, trying to emulate Starbucks' success.
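As a quick sanity check on Gartner's figures: compounding the 2011 base of $105.9 billion at the quoted 42 percent average annual rate for five years lands close to the $617 billion forecast for 2016. The small gap is expected, since an average growth rate need not be constant year to year.

```python
base_2011 = 105.9   # 2011 transaction value in $ billions (Gartner)
growth = 1.42       # 42 percent average annual growth

# Five years of compounding takes the 2011 base to the 2016 estimate.
value_2016 = base_2011 * growth ** 5
print(round(value_2016, 1))  # 611.4, in line with the ~$617 bn forecast
```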
SAP Picks Up Ariba for $4.3 Bn
Will help SAP to stay ahead in the cloud era
SAP AG and Ariba, Inc. have announced that SAP's subsidiary, SAP America, Inc., has entered into an agreement to acquire Ariba, a leading cloud-based business commerce network, for $45.00 per share, representing an enterprise value of approximately $4.3 billion. The acquisition will combine Ariba's buyer-seller collaboration network with SAP's broad customer base and deep business process expertise to create new models for business-to-business collaboration in the cloud. The Ariba board of directors has unanimously approved the transaction. The per share purchase price represents a 20 percent premium over the May 21 closing price and a 19 percent premium over the one-month volume-weighted average price per share. The transaction will be funded from SAP's free cash and a €2.4 billion term loan facility. The transaction is expected to close in the third quarter of calendar year 2012, subject to Ariba stockholder approval, clearances by relevant regulatory authorities and other customary closing conditions.

The move positions SAP in a fast-growing segment as buyers and sellers across the globe connect in new ways through the cloud. SAP's entry into the inter-enterprise business network space significantly expands its growth opportunities and accelerates its momentum in the cloud. Last week, SAP announced the roadmap for its cloud applications business (Software-as-a-Service), focusing on managing customers, suppliers, employees and financials, in addition to its cloud suite offerings SAP Business ByDesign and SAP Business One. The acquisition will also significantly boost SAP's cloud applications portfolio with the addition of Ariba's leading cloud-based procurement solutions.

Headquartered in Sunnyvale, California, Ariba has approximately 2,600 employees. The company is the leader in cloud-based collaborative commerce applications and the second-largest cloud vendor by revenue. Ariba combines industry-leading technology with a web-based trading community to help companies discover, connect and collaborate with a global network of partners, all in a cloud-based environment. With $444 million in total revenue, Ariba experienced 38.5 percent annual growth in 2011.
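The premium arithmetic in the announcement can be checked directly: a $45.00 offer at a 20 percent premium implies a May 21 closing price of $37.50, and a 19 percent premium over the one-month volume-weighted average implies a VWAP of about $37.82.

```python
offer = 45.00  # SAP's per-share offer for Ariba

implied_close = offer / 1.20  # 20 percent premium over the May 21 close
implied_vwap = offer / 1.19   # 19 percent premium over the one-month VWAP

print(round(implied_close, 2))  # 37.5
print(round(implied_vwap, 2))   # 37.82
```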
FACT TICKER
Indian Enterprise IT Spending to Reach ₹1.9 Lakh Crore in 2012
Govt, BFSI sectors driving growth

INDIAN enterprise IT spending across all industry markets is forecast to reach 1,910 billion rupees in 2012, a 16.4 percent increase from 2011 spending of 1,640 billion rupees, according to Gartner, Inc. Some of this growth is attributable to the relative strength of the Rupee against a
weakened US dollar, and growth is expected to be moderate through 2016 and settle in at the range of eight to nine percent. "Indian IT providers continue to experience strong growth, and there are more Indian companies in general taking a place on the global stage," said Derry
Finkeldey, principal research analyst at Gartner. “There are currently 25 Indian companies in the Fortune 500, and there is a corresponding uptake of ICT in the domestic market, as organizations seek globally competitive practices. The ICT market continues to outpace most markets in Asia/Pacific.” "Government investment in the revamp of education is driving high growth in the relatively small education ICT market," said Finkeldey.
SECURITY
McAfee has released the McAfee Threats Report: First Quarter 2012, which exposes an increase in malware across all platforms. The report shows that in Q1, PC malware reached its highest levels in four years, as well as a steep increase in malware targeting the Android platform. Mac malware was also on the rise, indicating that total malware could reach the 100 million mark within the year. "In the first quarter of 2012, we have already detected 8 million new malware samples, showing that malware authors are continuing their unrelenting development of new malware," said Vincent Weafer, senior vice president of McAfee Labs. "The same skills and techniques that were sharpened on the PC platform are increasingly being extended to other platforms, such as mobile and Mac; and as more homes and businesses use these platforms the attacks will spread, which is why all users, no matter their platform, should take security and online safety precautions." Mobile malware raced up a significant incline during Q1 2012, with 8,000 total mobile malware samples collected. This large increase was due in part to McAfee Labs' advancements in the detection and accumulation of mobile malware samples.
Another Innovation from Datacard®
Datacard® MX2000™ Card Issuance System

Affordable advanced technology
The MX2000 system offers the following modules: System Controller, Card Input, Card Cleaning, Magnetic Stripe Encoding, OCR-B/Bar Code Scanner, Contact Smart Card Personalization, Contactless Smart Card Personalization, Graphics Printing, Basic Topcoat, CardGard UV-Curing Topcoat, Embossing, Topping, Label Affixing, Card Output, Card Flipper.

Modular, Field Upgradeable, Scalable
Flexible module design: invest as you need, buy as you require. The MX2000 Card Issuance system is modular and can be configured with new modules as your needs evolve. The system can be configured with up to seven personalization modules. Start with the capabilities you need now and upgrade as required.

Robust security features: The MX2000 uses a familiar Microsoft operating system that enables administrators to authenticate operators and restrict access to data, applications and jobs.

Footprint: 2.6 metres x 0.9 metres. Weight: 450-500 kgs (depending upon the configuration).

Types of cards: Debit Cards, Credit Cards, Government Identity Cards, Loyalty/Gift Cards, Prepaid Cards, Corporate Identity Cards.
Applications: Financial/Banking, Government, Loyalty.

Contact us at: Datacard India Private Limited, B-302, Flexcel Park, S.V. Road, Next to 24 Karat Multiplex, Jogeshwari (West), Mumbai 400 102, India. Tel: +91 22 6177 0300. Email us at: India_Sales@datacard.com
A QUESTION OF ANSWERS
VISHAL AWAL
Focusing on Services: More than half of Xerox's revenues come from services and it plans to take this progressively higher.
VISHAL AWAL | XEROX, SOUTH ASIA
Moving from Products to Services
In a conversation with Varun Aggarwal, Vishal Awal, Executive Director, Services, Xerox South Asia talks about the transformation of Xerox from a products company to a services firm.

How are you trying to drive a services approach at Xerox from the traditional product-centric approach?
Xerox is a services-led, technology-driven company. We bought ACS for $8 billion almost two years back. Among the reasons why we acquired ACS was that in India there is a huge opportunity to build a sustainable services business for Xerox Corp. ACS is an IT & BPO outsourcing company based in the US. It is a strong player in the US, Latin America and Europe, with around 8,500 employees. India handles back office jobs for a whole lot of contracts that ACS has in the US. They are big in transportation, financial services and HR systems. This merger between Xerox and ACS has been a game changer. The move was hailed by the investor community as a huge step in the right direction.

A whole lot of the wheel has already turned. This has positioned Xerox firmly as possibly the most potent entity on this globe that has the entire spectrum of business process and document management services. Unlike other players who offer compartmentalised solutions in different domains of BPO and DMS, Xerox has the capability to take over the whole nine yards of this business and keep adding sustained business value to an enterprise's operations. For instance, one can start by optimising the office print infrastructure via MPS; then one can take it a notch above via document supply chain management activities such as marketing collaterals, brochures, the customer onboarding process, loan application processes, basically all kinds of document-related things. This entire process can be centralised by Xerox, as we provide a single point of accountability. Enterprises' core business isn't managing documents, but documents are essential to managing their core business. Hence, they would like to have their documents managed by someone who is the master in this space. This will also help them free up their precious human resources. Hence, it is a very strong value proposition.

But these services that you are talking about were already existent in Xerox?
The services that I have discussed so far were already existent in Xerox. Now, with the combined strengths of the organisational assets of Xerox and erstwhile ACS, the capability has become multimode. So you do MPS, DMS, cross media communications
services, BPO and IT outsourcing. In a way, Xerox is in a unique position to be able to offer a one-stop-shop solution covering the whole gamut. Eventually, that is the wanted position. I am not saying that the merger has happened in such a way that we are ready to service the whole spectrum, but that's where it is heading.

So basically, instead of offering just MPS, Xerox is now handling complete IT outsourcing?
I would say BPO and DMS outsourcing; that's the wanted position and that's where the company is gearing up to go. We have already taken some big steps in that direction. Earlier, Xerox was primarily an equipment provider. That model changed about three years back. Now, both the box link services and standalone services are being brought and given as an offering to the target market. This approach has helped customers in several ways, some of which are as follows:
● Helps clients improve business processes and significantly enhance ROI
● Ensures cost savings, on average, on outsourcing document management needs
● Helps them manage customer experience much better
● Brand consistency
● All the services being done by a single point of accountability
● Uniformity in the documents and the look-and-feel factor
● Standardisation, imaging and digitalisation of documents

While MPS comprises several services, in India it is typically looked at as print outsourcing and infrastructure management. Do you see this perception changing?
We did a lot of research on the sizing of the market. We got Boston Consulting Group to do some work for us with respect to market size
and how it will grow from 2009 till 2015. The findings were eye-popping. In 2009, the overall addressable market size for Xerox was $3 billion ($250 mn was DMS, $1.7 bn was equipment and $1 bn was ACS). By 2015, the MPS+DMS+CMS market size grows to $3 bn; the market for equipment grows at 8-9% CAGR, while the services CAGR is almost 51%. The ACS pie is growing from $1 bn to $6 bn at a CAGR of 35%. Hence, the market transformation is quite significant. The structure of the market is transforming from CAPEX-driven to OPEX-driven. As part of our rebranding exercise earlier this year, we have fully integrated ACS into Xerox. The estimated size for our kind of services could be $12-15 billion. This itself could be a good data point as to why the company decided to transform at a global level into a services-driven company. More than half of our revenues are now coming from
THE CHIEF TECHNOLOGY OFFICER FORUM
THINGS I BELIEVE IN
The merger between Xerox and ACS has been a game changer.
For Xerox, BFSI and Telecom are the key focus verticals presently.
A lot of documentation will be digitalised, but paper cannot be completely eliminated.
services and our vision is to take this progressively higher. The market is right, the organisational assets are aligned and the timing is right. This journey will keep accelerating as we go ahead.

Xerox has traditionally been an enterprise player, especially in the MPS market. What are you doing in the SMB sector?
We want to deliver superior-value services to all segments of enterprises. When it comes to MPS, we consistently appear among the leaders; Xerox has been recognised as the leader in this area by leading research companies such as Gartner. We provide solutions that help clients all the way from commercialising to customising. Some of the initiatives we have undertaken to empower SMBs are as below:
- Reaching out to SMBs and companies through channel partners (XPS, Xerox Print Services, enables Xerox to service non-Xerox partners)
- Developing the channel partner base (XPPS, Xerox Partner Print Services, in a standardised, cost-efficient manner)
- Positioning ACS operations and increasing revenue

VISHAL AWAL
"Paperwork will be an imperative part for enterprises and cannot be eliminated in the near future."

Xerox GDO provides the benefits of security, cost saving, web access, back office, expense system, and validation and extraction of consolidated data. For Xerox, BFSI and Telecom are the key focus verticals presently. Xerox is planning to tap other potential verticals such as healthcare, consumer goods, retail and manufacturing; the government sector is also a big space which cannot be ignored.

What are your plans for the next one year?
Xerox streamlines clients' document-intensive business processes by offering outsourced services and a services delivery platform rather than capital-intensive procurement; this helps customers optimise capital infusion and select a pay-per-service mode as a viable alternative for ROI. We have some of the leading BFSI and Telecom clients in India. Xerox has major plans for its document
management group in the coming years. Our key strategies include building the right capabilities to become a services-led company; streamlining business models; expanding market reach via alliances, channels and go-to-market strategy; and investing in Xerox infrastructure. CTOs and CFOs are an integral part of decision making when it comes to MPS for larger enterprises. In smaller enterprises, these decisions are generally taken by the facility management divisions. Xerox believes in driving its revenues by driving and accelerating its customers' success.

What is your opinion on the paperless office?
We do believe that in the near future a lot of documentation will be digitalised, which is a big opportunity area for us. However, paperwork will remain an imperative part of enterprise life and cannot be completely eliminated, at least not in the near future.
FEATURES INSIDE
How to Make User Experience Count Pg 20
BEST OF BREED
CIOs and Securing Data with Analytics Pg 22 Extracting Info From the Sea of Big Data Pg 24 + More
For most organisations, the ability to pinpoint and replicate superior practices is a vital competitive advantage. No matter the industry, reusing successfully demonstrated practices can lead to shorter cycle times, faster ramp-ups, higher customer satisfaction, better decisions, and lower costs, all of which can mean the difference between success and failure in an increasingly competitive global economy.
ILLUSTRATION BY SHIGIL N
Best practices and instructional knowledge go hand in hand
Teaching Best Practices Keep your strategic drivers in mind and design a process that fits your firm’s existing structure BY LAUREN TREES
Best practices are a particularly powerful form of institutional knowledge in that they take information and data and put it in the context of real people and experiences. By learning what works best in other parts of the organisation, employees get theory, evidence, and expertise all wrapped into one. Best practices also help cut through employees’ natural resistance to change; it’s harder to ignore or dispute a new idea when it already is being used elsewhere to achieve superior results. But despite these advantages, many organisations fail to recognise and duplicate the processes and methods that consistently get the best results. Why? As you might imagine, the internal transfer of best practices is not as simple as picking up a wiring diagram or process flow chart and faxing it to another location. It takes a formal strategy, a clear business focus, and an understanding of the variables that encourage and impede transfer.
MANAGEMENT
Formalising best practices transfer
The reasons why organisations formalise the transfer of best practices vary, but common drivers include the need to standardise after a merger or acquisition, efforts to minimise year-over-year costs while maintaining quality, and initiatives to upgrade IT and reporting systems. Business models that emphasise cost control or that require a standardised approach in multiple markets are particularly good candidates for transfer programmes. An effective transfer process must include steps to:
1. Identify outstanding practices;
2. Capture or document knowledge related to the practices and what makes them effective;
3. Review, evaluate, or validate the practices;
4. Communicate and share the practices, possibly by enabling exchanges between the source and potential recipients; and
5. Support recipients over time as they adopt and adapt the practices.
While there is no one-size-fits-all solution for best practices transfer, APQC has observed three distinct models that work. The first involves the direct transfer of a centrally identified practice to multiple recipient locations. Often implemented as part of a Six Sigma or process improvement project, this model begins when a process improvement team identifies or creates a new best practice. Once experts review and validate the practice, it is pushed to relevant teams and locations, which usually are required to adopt it. In the second model, transfer is facilitated through discipline-based communities or networks. Unlike in the first model, no central group is responsible for the creation of practices. Instead, teams identify their own candidate practices and bring them to the relevant communities, where they are evaluated and shared. The communities advocate the use of best practices and provide support, but adoption tends to be voluntary.
The third model involves a self-assessment process in which teams invite assessors to evaluate their current opportunities against pre-established criteria. The teams then design improvement plans to address the gaps revealed by the assessments and, once implementation is complete, document and share their new leading practices. While the
assessments are voluntary, management pressure for continuous improvement creates a pull to participate. Regardless of the model used, teams implementing best practices are often tempted to start by adapting practices to their own operations. However, according to our research, firms experience the best results when they require that best practices be adopted “as is” (at least as much as possible). Then, once a team sees the practice in action, it can tweak it to suit the particular situation. If a team adapts a practice before implementing it, it may unwittingly eliminate the very elements that make the practice “best,” thus defeating the purpose of the transfer.
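The five-step process and the adopt-before-adapt guidance above can be sketched as a small state machine. This is purely an illustrative tracking structure, not an APQC artifact; the class, stage, and practice names are all invented for the example.

```python
from enum import Enum, auto

class Stage(Enum):
    """The five steps of a formal best-practices transfer process."""
    IDENTIFIED = auto()   # outstanding practice spotted
    CAPTURED = auto()     # related knowledge documented
    VALIDATED = auto()    # reviewed and validated by experts
    SHARED = auto()       # communicated to potential recipients
    SUPPORTED = auto()    # recipients adopting, with ongoing support

class PracticeTransfer:
    """Tracks a single practice through the transfer pipeline.

    Enforces the 'adopt as is first' guidance: adaptations may only
    be recorded once the practice has reached the SUPPORTED stage.
    """
    def __init__(self, name: str):
        self.name = name
        self.stage = Stage.IDENTIFIED
        self.adaptations: list[str] = []

    def advance(self) -> Stage:
        # Move to the next stage, stopping at SUPPORTED.
        order = list(Stage)
        idx = order.index(self.stage)
        if idx < len(order) - 1:
            self.stage = order[idx + 1]
        return self.stage

    def adapt(self, note: str) -> None:
        # Tweaking before adoption risks removing what made it "best".
        if self.stage is not Stage.SUPPORTED:
            raise ValueError("Adopt the practice as-is before adapting it")
        self.adaptations.append(note)
```

A team would advance a practice through all five stages and only then record local tweaks, mirroring the "adopt, then adapt" sequencing the research recommends.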
Addressing logistical, structural, and cultural hurdles
In addition to designing an effective process, organisations must address the logistical, structural, and cultural hurdles that impede transfer. For example, the organisational structure may promote silo thinking in which locations or divisions focus on maximising their own accomplishments and rewards, instead of supporting the success of the overall organisation. Similarly, managers may not allocate time for employees to learn from and help one another, or they may not sufficiently reward them for doing so. Other barriers are even harder to break down, such as employees' reluctance to change the way they work based on advice from colleagues they don't know and may never meet.
The most important factor in overcoming these barriers is the support and involvement of senior leaders. Securing employee buy-in is easier when management is committed and enthusiastic. Other factors that support change include:
- The allocation of full-time resources to support the transfer process;
- Clear, accessible documentation to explain transfer and what is expected of participants;
- Extensive "face time" and workshops to help participants assimilate the need for change and the alternative practices they might adopt;
- Robust communications and visible success stories;
- Compliance scorecards and visible reporting of adoption; and
- The inclusion of best practice transfer in employee performance expectations.
Below is a description of aluminum refiner Alcoa World Alumina's transfer programme, which is a version of the second type of transfer, the community-based model. In addition to creating an effective transfer process, Alcoa has taken significant steps to transform its culture and encourage employee participation.

How Alcoa transfers its best practices
With more than 17 locations in six countries, Alcoa views standardisation and improvement as vital concerns. Over the years, refining locations had introduced variation into their activities, with both positive and negative effects. The organisation wanted a way to eliminate the ineffective or negative variations while allowing improvements to be disseminated and implemented across locations. In 2004, Alcoa established a network of virtual, discipline-based communities to discover, refine, and implement improvements to the mining and refining processes and then distribute these best practices to other locations across the organisation. The communities have identified more than 150 best practices to date and have been instrumental in bringing all the refining and
mining processes to a standard level of high performance.
In addition to communities, two other initiatives also support the goal of best practice transfer: focus plant initiatives and global virtual teams. Focus plant initiatives are events where experts gather in one location to plan large-scale improvements and identify collections of best practices that can be used to address systemic problems. Global virtual teams are similar in that they develop best practices in response to specific challenges, but they work virtually instead of face-to-face. Alcoa refers to communities, focus plant initiatives, and global virtual teams collectively as "three strategies, one goal" because all three support the organisation's pursuit of standardised, global best practices.
Alcoa tries to ensure that participation in its transfer programme is inherently valuable for employees. For example, community members can leverage the information, best practices, and expertise in their communities to help solve issues that arise at their locations. To further promote participation, the organisation incorporates goals for best practice transfer in employees' performance objectives. It also altered certain job descriptions to explicitly outline activities such as best practice transfer and community membership.

16%: growth of Indian information technology spend in 2012

Design the process that works for you
There are many different ways to transfer best practices between teams or locations. This is true even inside a single organisation, as evidenced by Alcoa's "three strategies, one goal" approach. The most
important thing is to keep your strategic drivers and objectives in mind and design a process that fits your organisation’s existing structure and culture. A programme that supports the objectives of senior leadership and allows employees to exchange best practices in the course of their normal work is most likely to achieve widespread adoption and popularity. — Lauren Trees is a knowledge specialist at APQC, which is a member-based nonprofit and one of the leading proponents of benchmarking and best practice business research. Working with more than 500 organizations worldwide in all industries, APQC focuses on providing organisations with the information they need to work smarter, faster, and with confidence. — The article has been reprinted with permission from CIO Update. To see more articles regarding IT management best practices, please visit www.cioupdate.com.
How to Make User Experience Count
UEM holds the potential to be the lightning rod in aligning IT to business objectives BY DENNIS DROGSETH
We live in an industry that seems to be governed by acronyms, which, though often confusing, represent terms and ideas which are themselves confusing and often misleading. Trying to clean up the mess would probably be only a tad easier than publicly legitimising every CIA action since the beginning of the Cold War.
Using UEM to align IT
User experience management (UEM), also known as end-user experience (EUE) management or real user monitoring (RUM), is a case in point. As the next-generation acronym to replace quality of experience (QoE), reflecting what goes on when real flesh-and-blood IT consumers are taken into account, it is also central to what I would call the "humanisation of IT": a richer and, in my opinion, more complete vision than the one captured in the phrase the "consumerisation of IT."
I would argue that UEM holds the potential to be the single most powerful lightning rod in aligning IT to business objectives once it's understood in its full context. UEM is also becoming a guiding light in assimilating the benefits of cloud computing, including public and private cloud manifestations. So the first thing I have to say is: when you're approaching an investment in a UEM strategy, don't fall prey to the erroneous idea that it's merely a subset of application performance management (APM), which is itself a subset of business service management (BSM). This is a destructive and rather pointless misconstruction of the facts, and it puts a drastic straitjacket around an interrelated set of disciplines that, collectively, may hold more potential to transform IT into an extroverted, value-aware organisation than any other.
Okay, so what am I talking about when I talk about UEM? Consider the following:
- Monitor and optimise the effective delivery of business services to their end-consumers in terms of performance and security concerns (latency, transactional efficiency);
- Monitor and optimise the effective design and content of business services for their end-consumers (navigation, relevance);
- Monitor and optimise end-user interaction with business services, including the efficiency with which the service is utilised in support of business processes (for training, compliance and help desk support);
- Monitor the frequency and other usage patterns with which users (by type or group) leverage IT-delivered business services; and
- Monitor and optimise the business outcomes of IT-delivered business services based on user interactions.
These may seem like quite a lot of things to bundle into one idea or acronym, but they're all fundamentally related. The answer as to "why?" goes back to the idea that the core value of any IT service, from SAP and Exchange to VoIP or HR self-service, is to extend the range and capabilities of the human consumer in support of business-relevant objectives. And who is that consumer? That can be as important to take into account as the network, system and application architectures underneath your service. I'll take just three examples. For the first, let's picture a secretary who, suffering from irrational demands, chain smokes in response. In this case, she's a college grad stuck in an interim job who, just to stay sane, needs simple, speedy, efficient and consistent access to the IT services that she depends on every minute of every working day. Latencies not only damage her output; they may also damage her health and her emotional well-being. Next let's imagine her cousin. She's a superstar in developing a global real estate consortium that depends on IT services to navigate properties, give tours of houses, and provide fast and efficient search capabilities to match prospective buyer requests with selected properties.
Her needs as an IT consumer are far more complex, and more likely to change in a shorter period of time. She will want all the efficiencies of her cousin, but orders of magnitude more access to information, with far less predictable habits. Now let's picture a third figure: a network administrator faced with a hodgepodge of point tools and "suites" or "platform tools" that promise the world but never seem to work. The very tools that claim to make his life simpler are doing just the opposite. And moreover, he's expected to make changes in the way he works based on edicts from on high, all without anyone really taking the time to explain why or what's in it for him.
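One practical way to act on these very different consumer footprints is to baseline performance per user class rather than globally, so the secretary's latency pain is not averaged away by other groups. Below is a minimal sketch of that idea; the event format, function name, and group labels are illustrative, not taken from any particular UEM product.

```python
from collections import defaultdict
from statistics import quantiles

def p95_by_group(events):
    """Compute the 95th-percentile latency per user class.

    events: iterable of (user_group, latency_ms) pairs, e.g. from a
    real-user-monitoring feed. Judging each group against its own
    distribution keeps one class's pain from hiding in a global average.
    """
    by_group = defaultdict(list)
    for group, latency_ms in events:
        by_group[group].append(latency_ms)
    return {
        group: quantiles(samples, n=20, method="inclusive")[-1]  # last cut = p95
        for group, samples in by_group.items()
        if len(samples) >= 2  # quantiles() needs at least two samples
    }
```

With per-group percentiles in hand, alerting thresholds can be set per consumer class instead of one-size-fits-all.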
Understanding your consumer is key to IT
I would argue that understanding these and other consumer footprints is the place to start. They are not after-thoughts or bolt-ons. And, if understanding these consumer profiles doesn't belong in the "business of IT," then where does it belong? This seemingly radical approach to the business of IT is actually nothing new. If you ran a restaurant, you'd want to know the habits, preferences, and impacts on your clientele, whether it's a great night out or a fast but healthy chance to eat on the road. If you sold vacuum cleaners, you'd be very interested in your various buyers, how they used your device with other devices, and how effective that made them in caring for the home. But IT has been historically introverted, with the classic Dilbert cartoon still holding sway. The introversion has created a cultural focus on details and processes that often become obsolete as new technologies, new options such as cloud, or new business environments such as Web and Web 2.0 ecosystems emerge. IT executives wedded to that past introversion are not likely to stay afloat too much longer, especially as lines-of-business owners (erroneously or not) feel that they can just as easily go shopping in that great shopping mall in the sky called "cloud." UEM is probably the single strongest beacon out of this mess. Instrumentation can support performance and triage, user interactions and efficiencies, consumer preferences and priorities in using IT services, and competitive industry insights. Even reverse-engineered business outcomes can be of incredible value in navigating the real pressures technology imposes and the opportunities it presents. UEM can both help to identify user "classes" and inform on pre-established and often erroneous myths about what the consumers of IT services like or don't like. EMA is about to embark on some research to update our work from three years ago in this arena.
We will be looking at technologies, organisational owners, processes, obstacles, drivers and achieved benefits for UEM deployments in 2012.
ILLUSTRATION BY RAJ VERMA
Monitor the business outcomes based on user interactions
— Dennis Drogseth is VP of Boulder, Colo.-based Enterprise Management Associates, an industry research firm focused on IT management. Dennis can be reached at ddrogseth@enterprisemanagement.com.
— The article has been reprinted with permission from CIO Update. To see more articles regarding IT management best practices, please visit www.cioupdate.com.
BIG DATA
CIOs and Securing Data with Analytics
No enterprise, no matter how small or benign, will ever be safe from attack in the future
BY BILL GERNEGLIA
PHOTO BY PHOTOS.COM
Cyber attacks are becoming increasingly common across the globe. Many Fortune 2000 companies as well as government agencies around the world come under frequent cyber attack on their core systems and services. Cybercrime is a generic term for the illegal incursion into, and disruption of, both cyber and physical assets at the national, enterprise and community level. Cybercrime is a relatively new phenomenon, but because of its recent scale and game-changing implications for both government and industry it is rapidly becoming the dominant risk theme of the 21st century. The opportunity for cyber attacks grows daily as corporations and governments continue to amass information about individuals in complex networks across the Web. At the same time, new generations of cyber activists, some motivated purely by money and others by the desire to expose and destabilise corporations and governments, continue to hack into organisational secrets. No enterprise, no matter how small or benign, will ever be safe from attack in the future, with an estimated 250,000 site breaches reported in the last few years. The challenge for the CIO is that larger organisations need to monitor hundreds of millions of events per day, including some activities that are happening around the periphery of their business and outside the data center. There is just no way humans can analyse and process that amount of data in search of a potentially significant cyber crime event.
This expanding rate of potential threats calls for a new approach to corporate data security, one based on intelligence and BI tools. Security intelligence applies advanced analytics and automation technology to the collection of information from hundreds of sources across an organisation.
By sifting through data from global networks, corporate applications, user activity and mobile usage, business analytics applications can help firms establish a baseline of normal behaviour. Analytics can then help the CISO flag abnormal events more quickly and clearly, to predict, prevent and minimise their impact. In considering their BI initiatives, CIOs consistently cite five IT trends that are shaping decisions on how, where, and when they can deliver the required analytics for their organisations. These include:
1. The growth of Big Data systems;
2. Technologies enabling faster and more secure processing;
3. The declining cost of IT commodities in the cloud;
4. The proliferation of mobile devices; and
5. Leveraging social media outlets for their organisations.
CIOs today in their global organisations are focused on building secure controls around their data centers to protect their assets, information and intellectual property. This is done through a series of initiatives such as implementing a layered security approach to data, using advanced data encryption techniques (hardware or software based) to protect data assets from hacking, and constant diligence in monitoring firewalls and building secure access channels to important data assets. Thinking about information security today means thinking about the world events around your organisation and how they interact with organisational data.
— This article is printed with prior permission from www.infosecisland.com. For more features and opinions on information security and risk management, please visit Infosec Island.
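The baseline-then-flag approach described above can be reduced to a toy example: learn the mean and spread of a historical event count, then flag observations that deviate sharply from that learned "normal." Real security-intelligence platforms apply this across hundreds of signal sources at once; the function name and the three-sigma threshold here are arbitrary illustrations, not any vendor's method.

```python
from statistics import mean, stdev

def flag_abnormal(history, today, threshold=3.0):
    """Flag today's event count if it sits more than `threshold`
    standard deviations from the historical baseline.

    history: past daily event counts (the learned normal behaviour)
    today:   the count to evaluate
    Returns (is_abnormal, z_score).
    """
    mu = mean(history)
    sigma = stdev(history)
    if sigma == 0:
        # A perfectly flat history: any deviation at all is abnormal.
        return (today != mu, float("inf") if today != mu else 0.0)
    z = (today - mu) / sigma
    return (abs(z) > threshold, z)
```

For example, a login-failure count of 120 against a history hovering around 100 would be flagged, while 101 would pass quietly; the point is that "abnormal" is defined relative to the organisation's own baseline, not a fixed number.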
More Data, More Problems
Five tips to help you make sense of the exponentially expanding volume of data we're all facing BY LARRY BONFANTE
I have memories of my mother telling me when I was a kid not to bite off more than I could chew. That seems to be the biggest challenge with big data. There is so much of it, we can't seem to figure out how to capture it, store it, search it, analyse it or visualise it. It is simply overwhelming. What are we supposed to do? We certainly can't wait for all the technology to be in place to address this issue. That's putting the cart before the horse. I would suggest that while we're waiting for technology to evolve, we have to make some hard decisions regarding what we do — and don't — try to analyse. Too much of a good thing can indeed become a bad thing. Here are five suggestions:
1. Decide what data matters most and is worth analysing and reporting on. If you already know how your consumers in a certain market segment are reacting to your efforts, limit analysis to new market segments or new product offerings.
2. Seed a few comments about your services on social media sites and see what type of reactions you receive. While immediate visceral reactions aren't always statistically meaningful, they will give you a flavor for the kind of "energy" people have toward your organisation and its offerings.
3. Talk to people you trust and respect. I call it the hallway metric. What will people you trust tell you that others won't?
4. Get out of your office and visit your consumers. In our case, that means talking to the people who play in the United States Tennis Association leagues and tournaments. What do they think about our new products? What challenges are they experiencing that we can't envision while sitting behind a desk?
5. Work with your clients to pare down the mountain of data to the handful of data elements that they feel are most germane to determining trends, progress and challenges. Don't presuppose the answers. Ask them what they think matters most.
The suggestions listed above certainly won't solve all the problems of big data. It will take us some time and some more innovation to do so. Meanwhile, we still have businesses to run. Let's focus on that for the time being.
— Larry Bonfante is CIO of the United States Tennis Association and founder of CIO Bench Coach, LLC, an executive coaching practice for IT executives. He is also author of Lessons in IT Transformation, published by John Wiley & Sons. He can be reached at Larry@CIOBenchCoach.com.
— This article was first published in CIO Insight. For more stories, please visit www.cioinsight.com.
Extracting Info From the Sea of Big Data Cloud, Big Data, semantic and metadata technologies are causing a rethink of feasibility and risk BY IAN ROWLANDS
A convergence of technologies is shifting the balance of attention in the information management world. From the beginning of business IT, which, tellingly, started out in many organisations as "data processing," the focus has been on so-called "structured" data. Yet there has long been recognition that enterprises have far more "unstructured" data than structured, matched by a perception that mining it would cost more than the effort would justify.
PHOTO BY PHOTOS.COM
The cloud, Big Data, semantic and metadata technologies are causing a rethink of feasibility, just as the value of customer interactions, clickstream data and other sources is being re-evaluated, and the risks inherent in email, contracts, and other documents reassessed. Put all of this together with the emergence of cool new consumer apps driving expectations, and suddenly the sea of unstructured data is full of sunken treasures. Information overload. It's not a new idea. In fact, Alvin Toffler probably first put the notion into general parlance in his book "Future Shock," published back in 1970. Even then, before the Internet, email, or social media, there was a perception that too much information was making decisions harder, not easier, to make. At around the same time, Peter Drucker was coining the term "knowledge worker" to identify the shift from manual and service
work to knowledge work as a main driver of business value. So here we are about 40 years after Drucker and Toffler identified the seminal social and business shifts implied by the information explosion and we are swimming in an ocean of information and data. In fact, according to well respected research firm
IDC, the amount of information created and replicated in 2011 will have passed 1.8 zettabytes (a zettabyte is a trillion gigabytes) and it’s anticipated that the pile is going to keep doubling roughly every two years. How are we going to cope, now? So far, it’s not looking good. According to another distinguished research house, Forrester,
firms use only five percent of the data available to them, while created data is growing at 40 to 50 percent annually and only 25 to 30 percent of that total is being captured. That, of course, is data, not information. And therein lies another ugly truth: information technology started as data processing. Its business was dealing with a special kind of information, information that resided in fixed locations in defined data structures. Over time, this increasingly has come to mean tabular data, with rows and columns, and keys to support analysis. Increasingly sophisticated, high-performing tools have been developed to capture, process and analyse this kind of information. The bad news is that the majority of information no longer fits that satisfyingly simple model. The majority (I've seen estimates as high as 80 to 90 percent!) is what is now lumped together as "unstructured data."
Unstructured data is a misnomer Unstructured data is really a shorthand
term which lumps together many different things. There's no such thing as truly unstructured data (or information): no sharing is possible without some agreement as to structure. The unstructured moniker is really a hangover from when information was stored in ways that made it impossible (or extremely difficult) to get at with data processing capabilities. It's probably better now for us to think about three kinds of information:
- Information structured and stored in ways that support business processes and analysis, like traditional structured data in databases;
- Information that can be massaged to support business processes and analysis, such as clickstream data, emails, and documents, which has typically been thought of as "unstructured data"; and
- Digital assets: information that cannot be processed or analysed unless structured information has been attached. This includes things like movies, sound files and pictures, which have generally not been thought of as part of the "data" world at all, but which new technologies may make increasingly accessible to search.
What's the appropriate response to the rising tide of information? Standing on the edge and trying to push it back, King Canute style, isn't going to work. What's more, it ignores the treasures to be found in the sea of information. The trick is finding ways of identifying and responding to the significant information, while ignoring and discarding the less valuable flotsam and jetsam. What we might call "riding the unstructured data wave." The key skill is going to be surfing, not fishing!
— As Senior Director of Product Management, Ian Rowlands is responsible for the direction of ASG's applications, service support and metadata management technologies, including ASG-MetaCMDB, ASG-Rochade and ASG-Manager Products. He has also served as vice president of ASG's repository development organisation.
— This article has been reprinted with prior permission from CIO Update. To see more articles regarding IT management best practices, please visit www.cioupdate.com.
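The third category, digital assets that only become searchable once structured information is attached, can be illustrated with a toy index. The field names below are invented for the example; real systems attach standardised metadata such as Dublin Core descriptors or EXIF fields, but the principle is the same: the binary payload offers nothing for a keyword search to grip onto until structure is added.

```python
from dataclasses import dataclass, field

@dataclass
class DigitalAsset:
    """A binary asset made searchable by attaching structured metadata."""
    path: str                      # e.g. "promo/launch.mp4"
    media_type: str                # "video", "audio", "image", ...
    tags: set[str] = field(default_factory=set)

def search(assets, keyword):
    """Return assets whose attached metadata mentions the keyword.

    The search never inspects the asset's bytes; it works entirely
    on the structured information bolted onto each asset.
    """
    kw = keyword.lower()
    return [a for a in assets if kw in (t.lower() for t in a.tags)]
```

Tagging a video with "launch" and "CEO" makes it discoverable by those words; the same file with an empty tag set is invisible to any keyword query.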
COVER STORY
MOBILITY
CTO FORUM 07 JUNE 2012
THE CHIEF TECHNOLOGY OFFICER FORUM
Recharge Your Enterprise
The rise of mobility in the enterprise is charging up both the employee and the enterprise.
IMAGING BY SUNEESH K
It is not uncommon today for someone to imagine sending emails, Word documents or even full-fledged presentations from their mobile devices. In today's device-dominated age, where device prices rival those of jewellery, people are more empowered than they ever were to get things done in as close to real time as possible.

Most enterprises today are seeing a significant number of their employees using mobile devices to access information on the enterprise network. While this scares a few CIOs, many see the need to accept this change and move forward in finding solutions that allow secure and complete access. Several CIOs believe that mobile devices and their user-friendly paradigm will be a driving force for enterprise mobility adoption in the next few years.

With mobile phone sales reaching an all-time high of approximately 419 million units globally, it is no surprise that more and more of your employees are going to use one device or another to access their information, be it emails, documents or forms. Enterprises today need to focus on managing devices, or building their infrastructure in a way that can provide users with what they need in a secure manner, without heavy-handed control. Today's workforce has a choice, and it will exercise that choice when it comes to devices.

Our story touches upon how CIOs must shape themselves into more open individuals who allow their employees their freedom while at the same time ensuring a safe computing environment. In the Indian context, CIOs are still learning what works and what doesn't, but the key question of whether there will be a unified mobile 'middleware' as such remains unanswered, even globally. Today, CIOs must learn how to best manage the devices owned within the enterprise so as to open up an ecosystem that will drive the way employees work tomorrow.
The last few years have seen the rise of the BlackBerry platform, which really helped organise and standardise enterprise mobility for companies. Today, enterprises are trying to cater to much more than just one operating system with a single set of capabilities. CIOs are confronted with iOS, Android (and its numerous flavours), Windows Mobile and the BlackBerry OS. Given this scenario, most CIOs believe that policy is the only way to make sure you can provide secure yet unrestricted access. However, there are still many loose ends, which makes it even more important to understand how CIOs are innovating in this space. This is what we hope to cover in the rest of this feature.
THE MOBILE WORKFORCE
IT teams do not have a choice to restrict device proliferation. Efficient management is, therefore, the only way out.
The last few years have seen unprecedented evolution in terms of the development of the iPhone, the release of Android and the advent of tablets, all of which have taken mobility beyond the fortress of BlackBerry into the enterprise. Today, devices are not just feature rich; they are more powerful than most three-year-old computers. However, given these new devices and the means by which they process information, the IT departments of most companies have been pulling their hair out trying to understand how to deliver information to these devices in a way that doesn't compromise the security of the organisation. With employees (end users) being the biggest proponents of this change, it is apparent that IT teams do not have a choice when it comes to restricting this device proliferation. It has become increasingly important to evolve with this trend and provision for it.
THE SCENARIO
"For us, BYOD has been more of a natural evolution than a choice. In the past, we used to be the touch point for devices, considering there were only a few (or one) that could actually deliver on the enterprise front. However, today the devices in the hands of people are better than the ones we provide, so we have been more or less pushed into an open scenario where we try to avoid restrictions on device purchases and use more innovative means of control," explains Sankarson Banerjee, CIO, India Infoline.

Arun Gupta, CIO, Cipla, explains that enterprise mobility attempts to address a few paradigms: how can I deliver information to my workforce on the move so they can make decisions whenever they need to; how can mobility improve a process; and how does it allow you to engage with your customers on mobile devices?

"The idea is to have more information available at all available touch points securely, so as to make the life of the end user easier. Employees today are getting their own devices, and it would not be fair to disable access. Applications should be secure enough, and policies should be in place to control what is flowing in and out of these devices," says Mehriar Patel, CIO, Globus.
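The policy-driven control Patel describes can be sketched as a simple rule set that an MDM-style gateway might apply before a device touches corporate data. This is a minimal illustration only: every field name, rule and threshold below is a hypothetical choice, not any vendor's actual policy schema.

```python
# Illustrative MDM-style access policy. All attribute names and rules
# are hypothetical, for sketching the idea only.

def evaluate_device(device: dict) -> str:
    """Return 'allow', 'quarantine' or 'wipe' for a device profile."""
    # A device reported stolen gets its corporate partition wiped.
    if device.get("reported_stolen"):
        return "wipe"
    # Jailbroken/rooted devices never touch corporate data.
    if device.get("jailbroken"):
        return "quarantine"
    # Baseline hygiene: passcode set and storage encryption enabled.
    if not (device.get("passcode_set") and device.get("encrypted")):
        return "quarantine"
    return "allow"

if __name__ == "__main__":
    byod_phone = {"os": "android", "passcode_set": True,
                  "encrypted": True, "jailbroken": False}
    print(evaluate_device(byod_phone))  # allow
```

The point of the sketch is that control attaches to the data-access decision, not to who purchased the device, which is what makes an open BYOD scenario governable.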
CHALLENGES
A change in philosophy is one thing; however, that change does not come easy. Talk of change in technology circles is usually about making sure employees can deal with the change, but here it's the other way around: employees are the driving force behind the change.

"One of the biggest challenges, in my opinion, is the sheer variety of devices. When everyone was using BlackBerry or Nokia, these were issued as identical devices, so you knew how you could manage them. There is also the fact that you choose devices that are generally easier to manage, as opposed to having three or four different flavours at the same time. Here you don't have such control," explains Banerjee.

Arun Gupta goes further to explain that the platform approach is yet to mature: there is a lot of promise in the market but no capability to deliver across platforms, considering there are different form factors that would render applications differently. Even within Android there are so many flavours that it becomes difficult to zero in on one platform. "These challenges have to be addressed with a common denominator, which could take away a lot of the rich functionality that you might want your users to have access to," explains Gupta.

"You don't have the choice of choosing the management options and then getting the technology in today's day and age. You are currently choiceless, dealing with devices that haven't matured completely for an enterprise setup. The challenge is to keep up with the options available. No one really declares what device they have, and we need to stay abreast of the phones or devices people are using and, on the basis of that, form a strategy," says Banerjee.

Banerjee continues: "Paradoxically, it makes us think of what controls are useful to us. Historically, companies really safeguarded their emails, which would not reside on any device and would not be allowed outside the firewall. You could keep them in your own network, but as the BYOD movement came, it became difficult to keep the information inside."

"You will have to support multiple devices and platforms, and within those you can standardise using a couple of options. This works internally, not for your customers. It is going to remain a challenge whether you go the standard applications way or the custom applications way; after a certain level the market will evolve to create some frameworks that we can use. Initially, in Shoppers Stop, we wanted to enable our app on BlackBerry and iOS, but with Android it was tough to get it working across all devices," explains Gupta, who was the CIO of Shoppers Stop before joining Cipla.

According to Mehriar Patel, "The governance aspect of this is extremely important, because you cannot extend too many restrictions on users. Today, the phones available to the common user are so feature rich and capable. Governance should be such that control always stays with you, but you create an open ecosystem that does not depend on the devices themselves."
BEST PRACTICES
Although the mobility segment is fairly disparate right now, with many devices and no single platform or middleware of sorts for cross-platform management, people are still experimenting, be it in India or globally. CIOs today are trying their best to accommodate more and more devices so as to deliver maximum functionality. In order to exercise control, the trend is towards managing the data as opposed to managing the device. CIOs are looking at securing the servers and applications that feed data to the end points, and are looking to ensure that data is delivered without actually being stored on the device. This is where technologies like virtual desktops also come into play.

"We started thinking of what policies would fit this scenario. In the past, we had a dictatorship of sorts as to who would own what device; today, with the open environment, it becomes difficult to exercise the same controls. So we started re-looking at choices and recasting our security policies. We allowed people to use things they were previously not allowed to, so we moved from prevention to monitoring. We would use user behaviour to our benefit to track who is abusing the system," explains Banerjee.

When it comes down to best practices, Banerjee mentions that on the policy side, in this time of large change, people try to make extremely complex choices, and policy writers also make mistakes. The second thing is that it leads to a poor trade-off on convenience, so you might have to sacrifice security for people's convenience. In this scenario, simplicity is better in most cases, especially where technology is so consumer friendly. On the security side, technology alone is not the answer. It must be tied to business processes to succeed, explains Banerjee.

"Currently there are no standardised solutions available. This is still a developing subset of consumption, and we are waiting for it to stabilise, at which point we will have a good set of tools. Having said this, monitoring has become quite mature; it's pretty easy to monitor everything. So the monitor-and-correct approach has become more mature
as opposed to the draconian control approach," he continues. When asked about his platform of preference, Banerjee mentions that on the enterprise side BlackBerry continues to have the best enterprise manageability but the least user convenience, while iOS and Android are still building up their enterprise capabilities.

Patel adds, "Your IT policy should be robust enough to allow BYOD. It should not only be able to handle BYOD, but also the other paradigms that are taking shape. Continue exploiting IT to give your users the best. Controls should be there, but they should encourage exploring new ways of doing things and innovating. Users today are moving away from the old style of working and are certainly empowered by these devices. They should be encouraged to ask for new technology and applications. Mobility has to be provided in the right manner."

He adds, "First of all, all the devices that must be connected have to be identified. You should have a set of policies that will apply to these mobile devices. Keep checking your applications and servers to make sure there is no vulnerability introduced by BYOD. You have to make sure that although you have allowed BYOD, you are able to exercise control over your infrastructure. No one should be able to play with the corporate infrastructure."
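The monitor-and-correct approach Banerjee describes — using user behaviour to spot who is abusing the system — can be sketched as a simple baseline check: flag any user whose activity today is far above their own history. The statistic (mean plus three standard deviations) and the data shape are illustrative assumptions, not a description of any real monitoring product.

```python
# Sketch of "monitor and correct": flag users whose daily data
# transfer is far above their own historical baseline.
from statistics import mean, stdev

def flag_abusers(history, today, k=3.0):
    """Return users whose transfer today exceeds mean + k * stdev."""
    flagged = []
    for user, past in history.items():
        if len(past) < 2:
            continue  # not enough history to build a baseline
        limit = mean(past) + k * stdev(past)
        if today.get(user, 0.0) > limit:
            flagged.append(user)
    return flagged

if __name__ == "__main__":
    history = {"asha": [40, 55, 50, 45], "ravi": [300, 280, 320, 310]}
    today = {"asha": 400, "ravi": 315}  # MB transferred today
    print(flag_abusers(history, today))  # ['asha']
```

The design choice mirrors the article's point: rather than blocking activity up front, the system permits it and corrects afterwards, which suits an open BYOD environment.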
CONCLUSION
Gupta mentions that it's too early to say anything about where the industry is moving. "Look at large IT developers like Sybase, for example, who came in with so many promises; they don't even have their own application across platforms. So, to have SAP mobile across all kinds of devices, you can't do it at this point. The market is still in the infancy stages of getting ready for real delivery," he says.

"My simple recommendation to anyone looking at mobility is to start experimenting. Don't wait for the evolution, as it might not address all your problems. If your solution works, it makes sense to go ahead, and then wait and watch what happens. It's a challenge to impose choice onto a user, but users have their own minds when it comes to purchasing devices. BYOD is going to be a phenomenon where if you adopt it you are damned, and if you don't, you are damned too," explains Gupta.
DEFINE YOUR MOBILITY GOALS
In a conversation with Ankush Sohoni, Vishal Tripathi, Principal Analyst, Gartner, talks about some of the key trends in enterprise mobility.
Device proliferation is at an all-time high in today's day and age, and more and more organisational employees are looking at these devices to see what they can do with them and how they can improve the way they work. On the other hand, the CIOs of various organisations are looking at new ways of managing these devices. As the information vanguards of the enterprise, CIOs are under increasing pressure to bring the most out of these devices from a productivity standpoint, to secure the data being accessed by them, and to ensure delivery at the same time. This leaves the CIO in quite a tough spot. In light of this, Ankush Sohoni spoke to Vishal Tripathi, Principal Analyst, Gartner, to find out where this trend is going and how CIOs can secure the end points.
Vishal Tripathi Principal Analyst Gartner
What are the key trends that you see in the enterprise mobility space?
As far as enterprise mobility is concerned, at this point of time BYOD is the popular contender, given that companies are still trying to manage the influx of devices into their organisations. Today, users have access to a number of devices, which all come with their own operating systems. This paints a very complex scenario. Last year, BYOD was a fairly new concept, and although CIOs knew that it would need to be handled, it was not a necessity. We were still in a world dominated by BlackBerry strategies. However, things have changed, and today there are a number of new devices, like Android phones, iPhones and other legacy devices, which need to be supported, and that makes for a very complicated environment.

Today, users have choices. Earlier, enterprises were quite rigid when it came to devices, considering that at that time the ball was in their court. The past was dominated by enterprise-led device distribution, which worked well considering that the enterprise had enough control and clarity as far as policies and data treatment went. However, today, supporting just one platform is not even a remote possibility. End users today have disposable incomes, and they are empowered in the sense that they will bring home a tablet, a smartphone and so on.

Companies are in the process of setting up policies and infrastructure. So when you talk about BYOD, you need a lot of support and need to see which applications can be distributed and how. Policies can govern functions like remote wipe, data access and so on. A lot of companies are also setting up VDI, with applications distributed on it. Another thing they are doing is setting up policies to make sure company data cannot be stored locally on the device. Mobile Device Management (MDM) then allows you to exercise control on devices. A lot of companies are also building security at the application level; they are saying they will build their own security. In light of these developments, more and more companies are opening up to the concept of BYOD.
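"Building security at the application level" means the app itself refuses to hand over corporate data unless the request is authenticated, regardless of which device it runs on. A minimal sketch of that idea, using a signed token checked with an HMAC — the token format, secret handling and function names here are all illustrative assumptions, not any company's actual implementation.

```python
# Sketch of application-level security: the app verifies a signed
# token before returning data. Illustrative only; real deployments
# would provision keys per enrolment and add expiry, TLS, etc.
import hashlib
import hmac

SECRET = b"demo-only-secret"  # hypothetical shared key

def issue_token(user: str) -> str:
    """Sign the user name so the app can later verify it."""
    sig = hmac.new(SECRET, user.encode(), hashlib.sha256).hexdigest()
    return f"{user}:{sig}"

def fetch_record(token: str) -> str:
    """Serve data only if the token's signature checks out."""
    user, _, sig = token.partition(":")
    expected = hmac.new(SECRET, user.encode(), hashlib.sha256).hexdigest()
    if not hmac.compare_digest(sig, expected):
        raise PermissionError("invalid token; refusing to serve data")
    return f"confidential record for {user}"

if __name__ == "__main__":
    token = issue_token("asha")
    print(fetch_record(token))  # confidential record for asha
    # fetch_record("asha:forged") would raise PermissionError
```

Because the check lives in the application, it travels with the app to any device, which is why companies see it as a complement to device-level MDM controls.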
What are some of the common challenges that CIOs encounter when faced with this scenario?
The biggest challenge is managing these devices. You cannot tell people what to buy. CIOs are therefore trying to standardise. Some are taking a stand and supporting only a specific set of devices. One has to also understand that these things depend on how important enterprise mobility is to an organisation. Sometimes the IT organisation is not under any kind of obligation to support these devices; it may or may not be a KRA for them. In this sort of scenario, we are encountering situations where the IT guys are saying, this is the level to which we support your device, and we will not do anything else. So it's not as if they will give the end user every single thing he wants. A lot of them are using virtual desktops as a solution, so when someone logs into the computer network, they log in using a VPN, and even if data is stored, it is not stored locally.

There are broadly two goals with BYOD, which could be social or financial goals. The social goals are to do with employee satisfaction, retention policies, or empowerment: users are made to feel that they work in an environment that is open and accepting of change and new technologies. Contrary to this, the financial goals address whether bringing in mobile devices or a mobility strategy will generate cost savings. IT organisations can also have a charge-back mechanism where they are responsible for servicing their users' issues with devices.

Do you see an opportunity for third-party service providers to come into the picture and help take the load off the enterprise in terms of manageability?
Yes, there is a big opportunity for third-party companies to come into the picture and take the load off enterprises. The service provider will have the capability to exclusively service the devices present in a company and allow easier manageability. It then becomes an internal-vendor versus external-vendor call that has to be taken by the enterprise. Obviously, managing devices within the control of the company is better than doing so outside it, but a third-party provider can help a lot as a vendor, diagnosing problems and creating opportunities.

What would you recommend to CIOs getting onto the mobility path?
Well, the first thing that CIOs need to do is define why they want to do this, as per the goals specified above. The reasons could range from social to financial, or a combination of both, and can also include improvements in productivity. Secondly, it is important that things be measured. There will be a cost, but the question is whether this cost can be justified. CIOs need to ensure that they know why they are doing this, as opposed to merely following industry trends.
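The charge-back mechanism Tripathi mentions is, at bottom, simple arithmetic: IT recovers its mobility costs by billing each business unit for the devices it enrols. A toy sketch, with entirely illustrative rates:

```python
# Sketch of an IT charge-back mechanism for mobility: each business
# unit is billed for the devices it enrols. Rates are illustrative.
RATES = {"smartphone": 12.0, "tablet": 18.0}  # monthly cost per device

def charge_back(enrolments):
    """Map each business unit to its monthly mobility bill."""
    return {
        unit: sum(RATES[kind] * count for kind, count in devices.items())
        for unit, devices in enrolments.items()
    }

if __name__ == "__main__":
    enrolments = {"sales": {"smartphone": 40, "tablet": 10},
                  "finance": {"smartphone": 15}}
    print(charge_back(enrolments))
    # {'sales': 660.0, 'finance': 180.0}
```

Making the per-unit bill visible is what turns BYOD support from an unfunded IT obligation into a measurable service, which ties directly to Tripathi's advice that mobility costs must be justified.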
ESSAR: THE EARLY ADOPTER
Essar has put in place an extensive strategy to manage the mobile devices used by its internal employees.
Given the extent of mobile device proliferation within enterprises, it has become an utmost priority for CIOs to outline a policy-driven, secure approach to managing all these devices and making sure there are few to no data leaks.
THE CHALLENGE
Essar is one of the companies that have extensive strategies at work to manage the mobile devices their internal employees are using. Not, however, before facing their own set of challenges. In order to organise this, Jayantha Prabhu, Group CTO, Essar, put in solutions that could help in creating policies and securing access to his applications while remaining device agnostic.

"Essar has been an early adopter of mobility as part of the information strategy, with the adoption of 6,000+ BlackBerry smartphones and 500+ Apple tablets for senior and middle management productivity enhancement," says Prabhu. The company is well on its way in the implementation of BYOD for employees and has strategically made investments in technologies that will allow it to reach the goal of BYOD implementation across the entire group.

"The device landscape when it comes to mobile, as you may understand, is quite complex. Although the largest-selling platforms today may be iOS, Android and BlackBerry, we still have instances of other operating systems, like Symbian, to support," says Prabhu.

Fundamental questions came into focus, such as how to create a secure and robust architecture for enabling information delivery to various platforms, including mobile devices, while supporting multiple operating systems. Prabhu mentions that the security of the enterprise network, mobile devices and applications is of paramount importance. In addition to security, organisations need to consider the tools and functions to be used to extend their security policies to the mobile network. Data synchronisation and the integration of mobile technologies and applications with other enterprise resources and systems are also critical to ensuring smooth delivery.

Other concerns encompassed supporting different mobile devices, like BlackBerry, iPhone, Android and others, which add complexity to the enterprise; and limiting the use of unauthorised and unsupported tools and applications on the mobile devices of enterprise executives, along with the policies needed to do so. One also has to consider the development of management tools and the building of internal developer expertise to create an environment where support does not become a problem.
THE SOLUTION
"We have enhanced our IT infrastructure capability to address the risks and concerns that would arise from the mobility strategy for the organisation. There are a number of solutions that we have implemented which fit into the complete security landscape of the organisation," explains Prabhu. Essar has BlackBerry Enterprise Server and BlackBerry Enterprise Server Express; SAP Afaria mobile device management; the SAP SUP mobile platform for application portability on mobile devices; Exchange 2010 and its device-binding capabilities for restricting the devices on which ActiveSync is allowed; anti-spam solutions; data leak prevention solutions; the Citrix desktop virtualisation solution; the Juniper SSL VPN solution; and the Seclore rights management solution.
Together, these solutions give Prabhu and his team immense management capabilities for the device ecosystem at Essar, allowing users to enjoy new technologies that improve productivity at all levels. All this, combined with a policy-driven approach, assures secure access for each user.
THE BENEFITS
As a result of their efforts, Essar workers now enjoy anywhere, anytime, secure access to all the corporate data they need, in the form of mail or a virtual desktop of streamed applications. "You can take the mobile devices to the point of service and input data on the move. That's good news for some job categories, such as technology and sales representatives. As is obvious to anyone that has tried, laptops were never intended to be used this way," says Prabhu.

Corporates can be assured that Mobile Device Management tools minimise the risk of theft of the data on mobile devices, thanks to the remote wipe capabilities of these tools, says Prabhu. He also claims that employee efficiency and effectiveness have increased due to the ability to access information and respond while on the run.

ANTICIPATED MOBILE TECHNOLOGY OVER THE NEXT FEW YEARS

Mobile Devices
1. Hardware performance improvements and chipset consolidation are likely to continue, leading to more powerful, thinner, lighter and energy-efficient devices with desktop-like performance, and support for more powerful business applications, consumer applications and games.
2. Carriers are likely to continue to support several hardware vendors to benefit from supplier competition.
3. Carriers and device manufacturers are likely to continue to offer devices in different styles and at different price points to appeal to different market segments and tastes.

Embedded Hardware
1. GPS is likely to be standard, enabling widespread use of location-aware applications.
2. Cameras are likely to be standard, enabling real-time photo sharing, videoconferencing, image projection, bar code scanning and augmented reality overlay.
3. Near Field Communications (NFC) is likely to be standard, enabling widespread mobile payment.
4. Sensors are likely to be increasingly present in devices, creating new data collection opportunities.

Wireless Technology
1. Carriers are likely to continue investing in spectrum, towers and wireless technology, leading to wider high-speed coverage, more reliable connections and faster connection speeds.
2. Mobile devices are likely to support multiple wireless technologies and transparently transition across indoor, outdoor, urban, suburban and rural locations; embedded Wi-Fi enables indoor coverage, cellular offload and tariff avoidance.

Mobile Operating System
1. Rapid release cycles are likely to continue as vendors fight for market share through innovation.
2. Continuous feature additions are likely to force continuous user interface experimentation.
3. Mobile applications are likely to increasingly leverage GPS and sensor data provided by the operating system (OS).
4. OS consolidation is likely to be limited due to carrier desire for competition.

Mobile devices are likely to become more powerful and sensor rich. The hardware and mobile OS environment is likely to remain heterogeneous.

Social Media: Pervasive social networking features in web sites, business applications, games, etc.; micro-blogging on corporate intranets as an unstructured knowledge-sharing vehicle.

Mobile Marketing: Location-based services and customer engagement tools, such as couponing, loyalty programs, awards, proximity marketing, geo-tagged purchase history, etc.; crowdsourcing and consumer apps that provide instant notification of problems.

Mobile Payment: Near Field Communications (NFC) and alternative technologies will likely streamline peer-to-peer and conventional retail financial transactions.

Augmented Reality: Real-time view of the physical environment overlaid with virtual, computer-generated images and information.

Personal Productivity: Personal medical data monitoring and analytics, text-to-translated-speech, remote house control, remote car monitoring, etc.

Enterprise Applications: Instant access to any business application, workflow or transaction; faster approvals; users able to seamlessly roam between devices while accessing cloud-based services; increasing use of sensor data (geo-tagged transactions, transaction photos, etc.).

Mobile Analytics: Instant access to business dashboards from any location, with extensive drill-down capability.

Business Impact/Number of Mobile Apps: Software innovation is likely to continue and result in new applications that enhance our personal and professional lives.

Stage 1, Purchase Mobile Devices: Mobile access to existing apps; no mobile app development. Result: poor user experience (UX) and negligible productivity, CRM and revenue gains.

Stage 2, Mobilise Existing Applications: Develop new graphical user interfaces (GUIs) on top of existing business logic. Result: acceptable UX and noticeable productivity, CRM and revenue gains.

Stage 3, Mobility-Centric Innovation: Develop completely new apps that leverage mobility benefits. Result: user-centred UX and new productivity, CRM and revenue opportunities.
NEXT HORIZONS

FEATURES INSIDE
Adapting Legacy Networks to the Cloud Pg 40
Reasons Why vSphere 5 is Coming Out on Top Pg 41
ILLUSTRATION BY MANAV SACHDEV

Working with Hybrid Clouds
While making a commitment to cloud, the only relevant consideration you need to worry about is what cloud technology means to you. BY MIKE MARTIN

A reluctance to make a commitment to a public cloud is one of the reasons that many organisations decide to develop a private cloud in the reassuringly familiar surroundings of their own data centers. Ironically, the winding path that leads to the development of a robust private cloud environment inevitably leads right back to public clouds.

The many benefits of private cloud
There are tremendous benefits to building a private cloud. Some of the benefits of cloud computing, however, are simply not available to you without leaving home. The most obvious one is that you don't have to pay for capacity in a public cloud out of your own capital expense budget. It's there whenever you want it, and you pay for it only when you use it.

But you don't have to choose either a private cloud or a public cloud. That's old thinking. The good news is that those organisations that focused on systematically developing a private cloud have completed all the steps they need to leverage all the benefits of public cloud computing as well. The IT service management (ITSM) protocols based on IT Infrastructure Library (ITIL) best practices that you put in place, as well as the lessons you learn setting up and using your private cloud, give you the skills and confidence to expand your IT universe beyond your data center. The future role of your private cloud may very well be to act as the broker that controls how you interact with and control other cloud environments.
Point of proof: You are using public cloud now
Many organisations have been using public cloud capacity in some form for years. Payroll was one of the first applications to be allowed out of the data center. Email and other utilities that are not core competencies of the organisation have accelerated the trend. Salesforce.com has been like a universal proof of concept for software as a service (SaaS). The almost instant availability and pay-as-you-go cost structure of SaaS have since opened the floodgates.

The availability and appeal of SaaS have also dramatised the need to implement a comprehensive cloud strategy. Many IT departments are finding that business units are getting away with a little ad hoc self-provisioning by signing up for SaaS outside of the purview of the IT department. The hybrid cloud model promises to provide the coordinated monitoring and management that will make it possible for the IT department to realise a high level of scalability and operational flexibility by tapping a variety of cloud resources on demand, without losing control of corporate data or breaching security.
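The pay-as-you-go argument is ultimately a break-even calculation: below a certain utilisation, renting public-cloud capacity beats paying for owned capacity out of the capital budget. A back-of-envelope sketch, where all prices are hypothetical illustrations:

```python
# Back-of-envelope capex vs pay-as-you-go trade-off. All figures
# are illustrative, not real cloud or hardware prices.

def breakeven_hours(capex_per_month: float, rate_per_hour: float) -> float:
    """Hours of use per month above which owned capacity is cheaper."""
    return capex_per_month / rate_per_hour

if __name__ == "__main__":
    # Hypothetical numbers: a server amortised at $400/month versus an
    # equivalent cloud instance rented at $1.25/hour.
    hours = breakeven_hours(400.0, 1.25)
    print(hours)        # 320.0
    print(hours / 720)  # ~0.44: renting wins below ~44% utilisation
```

The same arithmetic is what makes bursty or seasonal workloads the natural first candidates for the public half of a hybrid cloud.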
40 years of paradigm changes
As it has been for every IT paradigm changer in the last 40 years, the weight of conflicting standards sits heavily on the coattails of hybrid cloud technology. You obviously get to pick the architecture of your private cloud, including the OS, hypervisor, APIs and management tools. Someone else, however, gets to pick the architecture of public cloud environments and, until all the choices become interoperable, if you anticipate tapping the public cloud in the future, you need to begin aligning your private environment with the intended public environment. Your users are going to want to create applications, or move existing applications between the clouds in a hybrid cloud environment, without having to change anything serious like networking, security policies, operational processes or management/monitoring tools.

Compelling but not ready to fly
It's a very compelling model which, frankly, doesn't actually fly just yet. All the various devils in all the details of an expanded cloud environment have to be addressed, defined, tamed and systematised before the hybrid cloud can live up to its potential. The concept of the hybrid cloud as a controlled environment that can be monitored and managed behind a single pane of glass does point the way for the evolution of cloud computing. You obviously don't need to have a hybrid cloud to begin taking advantage of the benefits of both private and public cloud computing. As IT departments begin the long journey into cloud computing, however, they need to know enough about hybrid clouds to not inadvertently make it more difficult to implement one when the time comes... because the time will come.

Some of the promises of cloud computing have been around so long there's a tendency to assume they must be real. That would be wrong. Take bursting. The concept is huge: the ability to "burst" to a public cloud with your virtual machines (VMs), applications and data and grab whatever you need whenever you need it. Cool! It is in fact hard to do. You can make arrangements with a partner and specify all the protocols for you to grab resource time from a public cloud, but the reality is so constrained that it makes the term "bursting" seem inappropriate. The reality is more like negotiating a pre-nuptial agreement between a skittish bride and groom. It can
be done, but it's not going to be easy.

There are also pesky laws of physics to consider. Electrons only travel so fast, and moving a workload to a public cloud data center over a significant distance can cause latency, poor performance and other adverse side effects. Many of these issues can be addressed with appropriate application design, but they still need to be considered.

There are some open standards, like VMware's vCloud API and Red Hat's open Deltacloud API, as well as new entities such as the Distributed Management Task Force (DMTF), the Open Grid Forum (OGF), the Storage Networking Industry Association (SNIA), and the Open Cloud Consortium (OCC). As more open standards are established, you won't have to worry that when you check into a public cloud environment you'll find yourself in a Hotel California from which you can't check out. But agreeing on all standards is going to take a while.

In the meantime, you need to keep one eye on the future, but the other eye focused clearly on where you are today. Becoming aware of and understanding every aspect of your current IT environment is the best guide to developing a comprehensive cloud strategy that will work for you today and give you the easiest access to the blue-sky potential of cloud computing tomorrow. Don't be distracted by all the dazzling promises, emerging standards and other baffling changes that swirl constantly around cloud computing technology. The most important consideration is what cloud technology means to you.

— Mike Martin is VP, cloud solution group for Logicalis, an international provider of integrated information and communications technology solutions and services.
— This article has been reprinted with permission from CIO Update. To see more articles regarding IT management best practices, please visit www.cioupdate.com.
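As a footnote to the laws-of-physics point above: the latency floor set by distance is easy to estimate. Here is a minimal sketch in Python; the sample distances and the two-thirds-of-c propagation figure for optical fibre are illustrative assumptions, not numbers from the article:

```python
# Estimate the minimum network round-trip time (RTT) imposed by physics
# when a workload bursts to a distant public cloud data center.
# Assumption: signals in optical fibre propagate at roughly 2/3 the speed
# of light in a vacuum; real RTTs are higher (routing, queuing, serialisation).

SPEED_OF_LIGHT_KM_S = 299_792  # km per second, in a vacuum
FIBRE_FACTOR = 2 / 3           # common rule of thumb for fibre propagation

def min_rtt_ms(distance_km: float) -> float:
    """Lower bound on round-trip latency over a fibre path, in milliseconds."""
    one_way_s = distance_km / (SPEED_OF_LIGHT_KM_S * FIBRE_FACTOR)
    return 2 * one_way_s * 1000

for km in (100, 1_000, 5_000):
    print(f"{km:>5} km -> at least {min_rtt_ms(km):.1f} ms RTT")
```

Real round-trip times are higher still once routing, queuing and serialisation are added, which is why long-distance "bursting" demands latency-tolerant application design.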
THE CHIEF TECHNOLOGY OFFICER FORUM
CTO FORUM 07 JUNE 2012
39
NEXT HORIZONS | CLOUD
Adapting Legacy Networks to the Cloud
Legacy infrastructure can play a large role in this new world, but it will have to change to keep up with the times BY ARTHUR COLE
ILLUSTRATION BY MANAV SACHDEV
What is the best way to design network architectures for the cloud? That is the million-dollar question as cloud computing quickly evolves into the new normal for IT.

No network at all?
Many of the new Flash storage proponents say the best network is no network at all. Simply load servers or near-line arrays with high-capacity solid state drives (SSDs) and run it all through the PCIe interface. You get a high-speed storage infrastructure at a fraction of the cost of a full-blown SAN, without all those switches, adapters and lengthy cable runs. However, most enterprises have poured a lot into their existing storage infrastructure and wouldn't mind just a little more return as the cloud era gets underway. But as the Wicked Witch once said, "These things must be done delicately … or you hurt the spell." When it comes to traditional Ethernet SANs, the holy grail would be an infrastructure built on commodity hardware but packed with enough specialised software to enable the kind of flexibility and scalability that cloud computing requires. A start-up called Coraid is moving in this direction with the EtherDrive, a connectionless parallel storage array that delivers up to 1.8 gigabytes per second of throughput using raw ATA-over-Ethernet architectures. With each EtherDrive utilising its own processor and providing addressability to any host on the network, the system can easily scale to more than 65,000 shelves, providing a mix of SSD, SATA and SAS connectivity plus multiple RAID configurations. And the company says it can deliver a much lower price point than traditional Fibre Channel, iSCSI or FCoE designs.

Fibre and cloud: fire and ice?
Of course, don't try and tell companies like Brocade that Fibre Channel has no place in the cloud. The company recently unveiled a new addition to its suite of cloud-optimised 16 Gbps FC solutions. The entry-level 6505 switch is available in 12- or 24-port configurations in a 1 RU footprint and adheres to the company's inter-chassis link design, which allows for flatter data fabrics when joined with higher-end systems like the 6510 switch and the DCX 8510 backbone system. Brocade and others have long made the argument that as virtual and cloud architectures place greater demands on storage and storage networking, the need for higher-end technologies like Fibre Channel will grow, not diminish.

Still others argue that since the cloud rests on flexibility and the provision of dynamic data environments, the time has finally come for a true open source revolution. Organisations like the Open Networking Foundation, developers of the OpenFlow protocol, say that proprietary architectures simply can't keep up with the demands of cloud users. After all, it's no coincidence that early cloud providers like Google and Amazon devised their own network architectures rather than pull off-the-shelf solutions. At the same time, some of the most notorious proprietary developers on the planet are starting to make nice with the open source community now that the cloud is dangling real dollar signs before their eyes. Witness Microsoft's recent support of the Remote Direct Memory Access (RDMA) protocol in Windows Server 2012. RDMA has received strong backing from the OpenFabrics Alliance, where it provides the framework for handling large file-based workloads in virtual and cloud environments built on Ethernet and InfiniBand networks. As part of the deal, Microsoft will enable RDMA for remote file access in server message block v.3 (SMB3), as well as a kernel-bypass RDMA API that should enhance third-party development of OpenFabrics software (OFS) applications.

As prior generations of enterprise technicians will tell you, however, open source does not necessarily mean cheaper, better or more flexible. It all depends on the level of cooperation among supporters and the degree to which your existing infrastructure can accommodate multi-vendor environments. By nature, cloud is intended to provide a melting pot of solutions from which enterprises can draw the most efficient and effective options. Legacy infrastructure can play a very large role in this new world, but it will have to change to keep up with the times.

— Arthur Cole covers networking and the data center for IT Business Edge. He has served as editor of numerous publications covering everything from audio/video production and distribution, multimedia and the Internet to video gaming.
— This article has been reprinted with permission from CIO Update. To see more articles regarding IT management best practices, please visit www.cioupdate.com.
Reasons Why vSphere 5 is Coming Out on Top
Tips for exploiting some of the best features of vSphere BY PAM BAKER
The market heavily anticipated vSphere 5, and its arrival disappointed none. However, it comes packed with nearly 200 new and enhanced capabilities, and that can make it a little difficult for some to figure out which features are best to exploit in any given scenario. The tips listed below will get you started in understanding which may mean the most to you and your company.
200 enhancements and solid performance but...
ILLUSTRATION BY MANAV SACHDEV
By all accounts, VMware's latest release of its virtualisation platform is a solid product, with much to be hailed from the 200 new enhancements added to a mature and broad ecosystem. VMware vSphere 5 is "the most user-friendly virtualisation solution on the market today," said Janusz Bak, CTO of Open-E, an enterprise-class storage management vendor. The new version brings plenty of technological improvements "which are
mostly transparent to the user," he added, and the GUI in version 5 "provides a few nice but not really significant improvements from the user perspective."

However, the release also packs a few reasons for a concerned frown. Among these are concerns over the broader cloud strategy, which includes more than just vSphere and involves "orchestration, automation and portal tools that are not as mature," said Eric Chiu, president and founder of HyTrust, a preferred VMware partner. High costs are another concern, he said, as are the "lack of security and compliance controls to ensure adherence with industry and legislative mandates, as well as corporate governance." But it is the "new licensing based on virtual RAM usage" that gives Bryan Semple pause. "It will negate much of the value of consolidation, pushing CIOs to consider alternatives such as Hyper-V for anything but the most mission-critical applications," said Semple, who is CMO at VKernel, a subsidiary of Quest Software.

Nevertheless, the point is made even, if not especially, in the criticisms: vSphere 5 is still the prime cut, and the drawbacks do not dull that claim. As with any major software upgrade or investment, it is best to know as much as possible before undertaking any deployment. To help with that, Logicalis, an international IT solutions and managed services provider, has outlined eight checklist items CIOs should know before taking the plunge:

1. Increased size of the virtual machine (VM): Before vSphere 5, it was typical to find eight processors and 256 gigabytes of memory assigned to a VM, while the hardware had 24 processing cores or more. When virtualising the most resource-intensive applications, that meant some CIOs were not able to virtualise large databases and application servers. Now, with vSphere 5, 32 processors and as much as a terabyte of memory can be assigned to a VM. That means IT pros' hands are untied. They can now virtualise business-critical applications or computing workloads with confidence.

2. Enhanced distributed resource scheduling: vSphere 5 gives IT managers the ability to use additional metrics, such as storage I/O performance, to automatically distribute
workloads. If the storage requirements of a particular VM are too intense, resources can be moved to other VMware servers, giving IT pros the ability to get more servers virtualised on less hardware.

3. Oversubscribe with confidence: vSphere 5 is aware of solid state drives (SSDs), so IT pros who have been accustomed to overcommitting their memory resources, betting on the fact that not all memory will be used at any given time, can now allocate their resources with confidence. With strategic placement of SSDs, memory constraints can become a thing of the past.

4. Enhanced security: With the VMware vShield 5 product family, users get the opportunity to simplify security and eliminate the need for multiple software agents and security policies. vShield App is a hypervisor-based, application-aware firewall that installs on each vSphere host to create a "trust zone" of logical and dynamic application boundaries rather than the physical boundaries associated with traditional security offerings. In addition, vShield App includes a network Layer 2 firewall optimised to support intrusion prevention systems (IPS) from VMware security partners.

5. Improved availability: Typically, if a company is running multiple VMs on a single piece of physical hardware and that server crashes, the business has lost all of those VMs instead of just the one physical server. But in vSphere 5, if a company is running multiple physical servers and one goes down, vSphere can automatically take all of the virtual machines that were running on the one that crashed and bring them up on the remaining intact hardware. The servers stop at the crash, but are rebooted as they are moved to the remaining physical servers. While this technology is not new, vSphere 5 brings significant enhancements that allow it to happen faster.

6. VMware vMotion over higher-latency networks: This almost sounds like magic, but it's one of the most incredible vSphere features of all. vMotion technology allows IT pros to move a VM from one physical server to another while it is still running, without downtime. Now, with vSphere 5, this can be done even on higher-latency networks, so instead of just moving from one local server to another, the VM can be moved to a server in a completely different data center, even over a high-latency campus network.

7. Management changes: With vSphere 5, there is now the ability to manage a vSphere virtualised network through a Web browser from anywhere in the world. Another appealing new management feature is the new switch port analyser, a mechanism that supports network diagnostics and NetFlow, giving IT pros the ability to listen to traffic from five ports at a time, which increases monitoring capabilities and decreases troubleshooting time dramatically.

8. Auto Deploy: Now IT departments can automatically deploy servers "on the fly" and cut down the time it takes to deploy a data center with 40 servers, for example, from 20 hours to 10 minutes. Once the servers are up and running, Auto Deploy also automates the patching process, making it possible to instantly apply patches to many servers at once. With the LAN-based booting of VMware, CIOs can also configure vSphere to be automatically installed over the network in a diskless environment, with each server communicating with a central server and downloading the needed info as soon as it's powered on.
— A prolific writer, Pam Baker covers technology, science, business, and finance for leading print and online publications.
— This article has been reprinted with permission from CIO Update. To see more articles regarding IT management best practices, please visit www.cioupdate.com.
Now in its 13th year
CTO Forum
Celebrates
Ever-growing numbers, an enviable engagement, superior quality, best-in-class content, unique destinations, trend-setting formats, and in a new avatar as...
cio&leader
To live the CIO&Leader credo, 120 CIOs will assemble at the...
Gold Partners: Curious to know what's in store? Watch out for more information in the forthcoming issue of CIO&Leader.
TECH FOR GOVERNANCE | MANAGEMENT
5 POINTS
THE CXO FUNCTION expects air-tight security within the application
APPSEC PROFESSIONALS ARE often expected to perform miracles and mitigate flaws
APPSEC CAN never be held responsible for processes that are offline
AUDITING FOR password management is always a tricky situation
APPSEC CANNOT address the problem of password sharing
ILLUSTRATION BY ANIL T
WHY APPSEC WON'T ALWAYS BAIL YOU OUT
AppSec is a process that should ideally be invoked right at the inception of the application's SDLC (software development life cycle) BY DHANANJAY ROKDE
It is very typical of organisations to perform web application (WebApp) security assessments before the go-live of newer applications, or periodic assessments of their existing applications. These assessments are known by all sorts of aliases, like application penetration testing (App PenTest), ethical application hacking and so on. Companies lacking the internal core competency in AppSec often outsource this activity to competent third-party players in the market.

What does the CxO function expect post an AppSec assessment?
The AppSec assessment is often treated as an additional or ancillary investment on top of the core development expenditure. The CxO function expects air-tight security within the application after such an assessment. Once the development teams start mitigating actions, one can often hear statements filled with hyper-expectations like "the application should now become unhackable" or "no one can break the application now and it can go public."

So why are AppSec-tested applications still not secure?
In most cases applications undergo assessments when they are either almost ready for production or already in production. This is against the spirit of AppSec to begin with, as AppSec is a process that should ideally be invoked right at the inception of the application's SDLC (software development life cycle). Very rarely are AppSec resources involved during the requirement analysis or the finalisation of the design. The assessment that happens (post development) is therefore more of a corrective activity than a proactive one. Flaws and vulnerabilities that could have been killed right at the beginning are most often patched (with quick hacks, not actual AppSec best practices) after the application is already in production.

AppSec professionals are often expected to perform miracles and mitigate flaws that are often connected with the lifelines of the application. While business pressure will always compel the teams to have applications up and running, it is never an easy situation for any CISO to let such applications fly without the proper checks and balances. Here are a few crucial factors that every CISO needs to consider before signing off applications and eliminating blind reliance on AppSec assessments. Although AppSec assessments are vital, they can never address the people, processes and technology completely:

Lack of STP (Straight-Through Processing) and manual hand-offs
AppSec can never be held responsible for processes that are offline or performed manually. While AppSec testers can test for data validation, they can never test for business rules. It is a common practice in several organisations to have online workflows that detach themselves into (smaller or multiple) manual tasks. These could include physical verification/inspection, offline approvals or matching records with another system. Whenever there is a manual hand-off, the application has to rely on the validation of the incoming data. This data can never be tested by AppSec resources. This is simply because applications only control the use of resources granted to them, and not which resources are granted to them.

Intentional disruption of the maker-checker mechanism
One of the most observed practices in corporations is the dissolution of the maker-checker mechanism in the name of ease of use and time-saving. While such business rules may save some time, this is definitely the worst practice to adopt. A typical request-approval workflow works on the basis of the requestor (the maker) posting a request and some approver (the checker) taking a decision to approve, reject or hold the request. This workflow is generally disrupted by adding functionalities like the checker being able to modify the request or the checker being able to delete the request. In such a scenario there is no validation or approval of the action taken by the checker, and the very essence of the maker-checker mechanism is lost. AppSec can only detect flaws (if any) in the transfer of control from the maker to the checker; it can never challenge the business rules or the excess privileges assigned to the checker.

Password management
Auditing for password management is always a tricky situation for AppSec professionals. While AppSec can always verify password strength and secure password storage and transmission, it can never dictate terms on the hard-coding of passwords into application frameworks. The most commonly found password management lacunae are hard-coding passwords into macros and stored procedures, and using a uniform password across the application framework. Because these passwords are hard-coded and difficult to change, application development and infrastructure teams often seek exceptions to "never change the passwords of the target systems or databases." Besides this, AppSec can also not address the problem of password sharing among the application development teams.

Excessive super-user privilege abuse
Singular administrative user credentials being used by an entire team for local/remote administration (running backup scripts, routine batch jobs, or updating and patching) is one of the worst enemies of AppSec. While AppSec assessments revolve around the application components residing on the infrastructure, having multiple super-user identities or sharing credentials of administrative users completely defeats the
purpose of implementing AppSec controls. Allowing too many user identities to directly access the application backend makes access auditing very complicated, and it also makes change and incident control very challenging. Questions like "who did what, and when?" become very difficult to answer. It is therefore extremely essential to audit and restrict unnecessary access to the infrastructure that hosts the application.

Unauthorised migration of environments
Developers often start development in a sandbox environment (colloquially known as the "Dev" environment). As they progress through their (software) builds/releases, they often do not port the changes into the UAT (user acceptance testing) or QA (quality assurance) environments. This is a very common blunder made by many development teams under the pretext of meeting stringent timelines, and for lack of a migration strategy. It causes the same build/release to mature on the Dev environment itself, and that environment eventually lands up in "production mode."

Before actually starting the AppSec assessment, internal teams must ensure that a clone environment is ready at hand alongside production. This decreases the chances of the application becoming unavailable due to unforeseen effects of the assessment. Sometimes the AppSec testers run intrusive checks which have the potential to bring down essential services within the application. Besides this, a clone QA or UAT environment helps to expedite the vulnerability mitigation process without any negative business impact.

Excessive dependency on automated scanning tools and services
Most organisations looking to build up their internal competency in AppSec procure some sort of automated scanning tool or service. These services are also offered as pay-as-you-use, on-demand cloud services. One of the key aspects here is that these tools and services are completely black-box. These tools do NOT have the ability to:
1. Understand business rules and workflows.
2. Detect and interpret "logical" vulnerabilities.
3. Perform "deep crawling" in sophisticated applications that do not expose all their links.
4. Support JavaScript- and Flash-based vulnerabilities.
Most often, several of the vulnerabilities reported by these tools are false positives (and worse, sometimes false negatives, too). A great amount of human effort is required to fine-tune these scanners. Automated scanning can never replace human AppSec professionals; these tools only help to facilitate the assessment.

Why is a NetSec (network security) assessment often better at annihilating WebApps?
This is purely based on my personal experience, after I saw some interesting results posted on some internal NetSec assessments. The approach of NetSec professionals is very different from that of the AppSec folks. NetSec pros concentrate on the attack surface (server infrastructure and communication equipment) rather than getting into the application itself. AppSec and NetSec are both hot skills in the market, and good resources are very hard to find; this is in no way a comparison of intellect or of the level of difficulty of either discipline.

Let me illustrate my point with a simple scenario that persuaded me to give equal importance to NetSec while assessing WebApps. When an AppSec tester is able to manually verify a privilege escalation, he or she would generally note down the affected module (piece of the application) and rank the risk based on the data that became visible as a result of running the test. However, this escalation may not necessarily take him any further and could be a dead end. A flaw, nevertheless, but one that doesn't result in someone taking over the complete application. While the AppSec test will conclude in that manner, NetSec pros take this to the next level. They generally don't rest until they have struck the application really hard. They will pursue this till they find some serious information leakage, an SQLi (SQL injection) that reveals some fascinating data, or any general platform flaw that lets them "own" the entire system.

The key difference that I observed here is that while AppSec folks will generally not venture beyond assessing and testing, NetSec pros take the application environment to its breaking point. This clearly indicates the distinct ideology of the two skill sets. The argument here is not whether an assessment is better than a full-blown PenTest; it is that sometimes AppSec professionals get mental blinders, and they should freely consult with their NetSec peers to help them perform successful PenTests.

— This article is printed with prior permission from www.infosecisland.com. For more features and opinions on information security and risk management, please refer to Infosec Island.
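The maker-checker disruption described in this article can be made concrete with a short sketch. This is an illustrative model only (Python; the `Request` class and field names are hypothetical, not taken from any real workflow product). The control holds only while the checker is limited to approving or rejecting exactly what the maker submitted:

```python
# Minimal maker-checker workflow. The control works only if the checker can
# merely approve or reject; letting the checker edit the request removes the
# four-eyes validation the mechanism exists to provide.
from typing import Optional

class Request:
    def __init__(self, maker: str, payload: dict):
        self.maker = maker
        self.payload = dict(payload)   # what the maker actually asked for
        self.status = "PENDING"

class MakerCheckerError(Exception):
    pass

def decide(request: Request, checker: str, approve: bool,
           edited_payload: Optional[dict] = None) -> Request:
    if checker == request.maker:
        raise MakerCheckerError("maker cannot approve their own request")
    if edited_payload is not None and edited_payload != request.payload:
        # The "checker can modify the request" feature: an edited payload
        # has been reviewed by nobody, so the control is defeated.
        raise MakerCheckerError("checker may not modify the request")
    request.status = "APPROVED" if approve else "REJECTED"
    return request

req = decide(Request("alice", {"amount": 500}), checker="bob", approve=True)
print(req.status)  # APPROVED
```

The point the article makes falls out directly: once `decide` is relaxed to accept an `edited_payload`, whatever the checker changes reaches production without any second pair of eyes.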
Patch Management: Accuracy & Automation
An in-depth look at the two options of patch management: replacement or update of active workloads or virtual machines BY RAFAL LOS
Let's talk about consistency with respect to patch management as a key aspect of your cloud computing strategy. If you've chosen wisely, your environments across your public and private clouds are consistent. Often this means you're looking at the same technical implementation of the same infrastructure and software to run your clouds, right down to the patch level of each component. Consistency also refers to the internally consistent state of your cloud with respect to your virtual workloads and servers/systems. This is where the discussion of patch management becomes quite interesting. After doing some digging on Google for interesting sources on patch management (a la consistency) for the cloud, I've come up relatively empty for anything that addresses this particular perspective. The big question is: how do we keep our environments consistent in the face of security requirements to push patches? The answers rely very heavily on automation and policy. The main point of decision is around the policy direction you're planning on taking. There are two separate thoughts on how patches can be pushed to keep an environment consistent. Patch management can either be done via replacement or update of active workloads or virtual machines. Let's look at these two options.
ILLUSTRATION BY RAJ VERMA
In-Place Update
One way to push patches is the same way we've always done it. The way we've always done it involves taking an existing machine, albeit now virtual, and updating the patch level of either the operating system, applications, or something else to the latest and greatest after extensive testing in a non-production environment. What started out as an exercise done by hand many, many moons
ago now carries a healthy obsession with automation to drive pushing patches in an automated way, and monitoring whether they "take" in the environment. Automation creates a great deal of leverage, and for cloud environments at least, this means consistency can be achieved at much larger levels of scalability. All seems right with the world... if you've deployed multiple clouds for non-production and production environments, you have the ability to use automation to push patches to a non-production environment, watch the result and gather metrics, learn lessons, and then, when you're ready, do the same to your production environment in some off-hours time or when traffic/workload is low. I mention a healthy obsession with automation because that's what is required for every step of this to work right. You've got to have a fanatical attachment to your technology and deployment
strategy to know that the non-production environment is an exact duplicate of your production environment - and that your automation technology (and policies/procedures) are designed such that they account for the vast numbers of failure modes you can possibly encounter. What if one system works and another fails? There are thousands of what-ifs that must be accounted for an, in themselves, tested before you can trust this works. There are other issues as well... but let's leave those for another post. To sum things up, in-place updates are all about an obsession with automation and understanding failure modes, accounting for them all while keeping consistency of the environment top-of-mind.
In-Place Swap

There is another alternative to patching in-place... swapping in-place. Rather than testing your patches in a non-production environment and then rolling them out to production using mass automation while monitoring the push, you can forego that entire process: simply rebuild all the machines at the updated patch level (which arguably isn't very hard), deploy the applications or workloads to those updated machines, and cut the traffic over from the old (non-patched) to the new (fully patched) virtual machines. Just like the in-place update, this requires a healthy obsession with automation, but the automation is less about actual patch management and more about the cloud management framework(s).

Think about it this way... if you have an application running on server A at patch level 10 and you need to update it to patch level 11 quickly, you can either roll the dice and update that machine, or you can update the base virtual machine and deploy it to the cloud as an updated base image. Then you take that base image, re-deploy your application to it (hello, automation!), and use the cloud management framework to cut traffic over to the new deployment a little bit at a time, monitoring for any adverse reactions (does the patch break the application?). If nothing breaks, you slowly move the entirety of the traffic over to the new environment and decommission the old. Like a wave of the magician's wand: done. You can keep that old environment around, in a down state, for a while, just in case you find some odd bug that only triggers at this new patch level every other Tuesday at midnight or something weird... but for the most part you're happily re-deployed. The in-place swap is a great way to showcase the power of the cloud.

Which is right?

Rather than asking which strategy of patch management is right across the board, let's ask which one is right for you. That's the only thing that matters. There are a number of factors to consider, including the type of release cycle your organisation works off of (for example DevOps, NoOps, etc.) and whether you have a single public (or private) cloud, or you've built your own consistent multi-cloud (public, private, hybrid) converged clouds across several vendors or physical deployments.

— This article is printed with prior permission from www.infosecisland.com. For more features and opinions on information security and risk management, please refer to Infosec Island.
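The gradual traffic cutover at the heart of the in-place swap can be sketched as a small ramp-and-check loop. These names (`route`, `cutover`, `check_healthy`) are illustrative, not any real cloud framework's API:

```python
import random

# A minimal sketch of the in-place swap's cutover: shift a routing weight
# from the old (patch-level-10) pool to the new (patch-level-11) pool in
# steps, checking health after each step, rolling back on trouble.

def route(weight_new: float) -> str:
    """Send a request to 'new' with probability weight_new, else 'old'."""
    return "new" if random.random() < weight_new else "old"

def cutover(check_healthy, steps=(0.1, 0.25, 0.5, 1.0)):
    """Ramp traffic toward the new pool; abort and roll back if unhealthy."""
    weight = 0.0
    for step in steps:
        weight = step
        if not check_healthy():   # e.g. error-rate or latency checks
            return 0.0            # snap all traffic back to the old pool
    return weight                 # 1.0 means the old pool can be retired

assert cutover(lambda: True) == 1.0   # healthy: full cutover
assert cutover(lambda: False) == 0.0  # unhealthy: rolled back
```

The "keep the old environment around in a down state" advice maps to not deleting the old pool the moment the weight hits 1.0.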
Root Cause Analysis (RCA): A Critical Skill Why don't more analysts, penetration testers, and others have that skill or the baseline knowledge?
BY: RAFAL LOS
Recently at TakeDownCon in Dallas, I brought up a term during my offense keynote that I thought everyone in the audience would, and should, be familiar with. The concept of Root Cause Analysis (RCA) should be a familiar principle to you if you've ever worked the defensive side of information
security (or warfare), or if you've ever done any software reverse-engineering or hacking. RCA knowledge isn't limited to hackers, though: anyone doing any sort of incident response should be familiar with performing a root cause analysis to identify the root of a failure, unmask its source, and figure out how to keep it from happening again.
If you think about what someone fuzzing an application's input format does, RCA is critical to figuring out why something crashed, and then how to exploit it in a meaningful way. Unfortunately, when I asked who was familiar with root cause analysis, only a few hands in the whole room went up. This was a bit distressing.
RCA is a critical skill for security professionals, no doubt, so why don't more analysts, penetration testers, and others have that skill, or at least the baseline knowledge? There may be several reasons for this condition, one of which is the desperate need for information security professionals in today's workforce. Whereas in the past many of the best minds in security "worked their way up" through the ranks of IT, many today go straight from college, or a non-related profession, directly into information security. These folks never get the opportunity to accumulate the baseline skills that make identifying failures and creating mitigations a core competency of information security professionals. It's not about pointing fingers or laying blame; it's about education, and about making sure the base level of knowledge needed to be a good security professional is consistent throughout our industry.

I definitely consider the ability to understand and perform an effective RCA part of a security professional's toolkit. It's baseline knowledge you should have if you want to be effective in information security. I further think it is skills like the ability to perform a thorough root cause analysis that separate the "hackers" from the "button-pushers" in the industry.

What makes a good root cause analysis, anyway? An RCA is an analysis of a failure to determine the first (or root) failure that caused the ultimate condition in which the system finds itself. If you're looking at an application crash, you should be thinking: "why did it crash this way?" Your job in performing an RCA is to keep asking the inquisitive "Why?" until you run out of room for questions and are faced with the problem at the root of the situation.
Let's take, for example, an application that had its database pilfered by hackers. The ultimate failure you may be investigating is the exfiltration of consumers' private data, but SQL Injection isn't what caused the failure. Why did the SQL Injection happen? Was the root of the problem that the responsible developer simply didn't follow the corporate policy for building SQL queries? Or was the issue much more subtle, such as a failure to implement something like the OWASP ESAPI in the appropriate manner? Or maybe the cause was a vulnerable open-source piece of code that was incorporated into the corporate application without passing through the full source code lifecycle process? Your job when performing an RCA is to figure this out.

On the offensive side of the coin, you should be able to figure out the root causes of the failures you generate. "Because the tool broke the app" doesn't cut it. I've often heard "I piped garbage into the application inputs until it crashed" from people fuzzing applications. When asked to repeat the failure, they often can't duplicate the condition because they didn't adequately identify the root cause of the initial failure, so the problem often goes unresolved.

RCA is super-critical in the software security world. When you're looking at a web application from a dynamic perspective (as many of you do when you test web apps), you often find several parts of the application vulnerable to something like cross-site scripting (XSS). Several, sometimes dozens, of form fields in various parts of the application appear to fail at filtering and handling input appropriately. If you hand this finding to a developer to fix, you're in for a fight, because all you're telling them is where in the application the issues lie, not where in the code. In fact, I recently saw a web application with upwards of 50 cross-site scripting vulnerabilities that all shared the same root cause. A root cause analysis (which HP's web application security testing technology can now do in a mostly automated fashion) can link those 50+ XSS issues to a single line of code in the application's input handler. This sort of efficiency and pin-point accuracy allows not only a more collaborative conversation with development, but also a faster time-to-fix, which is the grail of software security. Think about that the next time you're testing an application and you find dozens of SQL Injection or cross-site scripting vulnerabilities... are they all just a single vulnerability that manifests itself in many parts of the application? Wouldn't you want to know that?
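The "many findings, one root cause" idea (dozens of XSS reports collapsing to a single fix) can be sketched as a simple grouping of findings by the code location, or sink, that mishandles the input. All file names and paths here are hypothetical:

```python
from collections import defaultdict

# Hypothetical scanner output: (page where the issue was observed,
# code location that actually mishandled the input).
findings = [
    ("/search",   "input_handler.py:42"),
    ("/profile",  "input_handler.py:42"),
    ("/comments", "input_handler.py:42"),
    ("/login",    "legacy_form.py:7"),
]

# Group by sink: each group is one root cause, hence one fix.
by_sink = defaultdict(list)
for page, sink in findings:
    by_sink[sink].append(page)

for sink, pages in by_sink.items():
    print(f"{sink}: {len(pages)} finding(s), one fix covers {pages}")
```

Four surface findings reduce to two fixes: that is the conversation you want to have with development, rather than handing over a list of vulnerable URLs.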
— This article is printed with prior permission from www.infosecisland.com. For more features and opinions on information security and risk management, please refer to Infosec Island.
VIEWPOINT
KEN OESTREICH
Cloud Computing
What's a Service? Who Are SPs?

MY PERSPECTIVE on Cloud Computing has been rocked a bit. I assumed I knew what IaaS, PaaS and SaaS meant, and I thought I knew who the logical public providers of these services would be. However, a number of interesting new services, and providers, show signs of breaking the stereotypes. In an increasing number of cases, I am seeing these services sourced from providers that are not the traditional hosting or SaaS providers you might expect. This leads me to believe the cloud makes possible a number of business models we've not yet considered, and that these models will come from startups and established companies alike.

The first set of services that got me thinking comes from PayPal: X.commerce. Somewhat analogously to Amazon, eBay has observed that its mainstay businesses of auctioning and payments could be extended by making infrastructure and component services available separately to online merchants. X.commerce is their attempt to make service products such as credit card payment processing, cataloging, auctioning, etc. available as a la carte APIs outside of the eBay/PayPal site. We all think of eBay as an online "SaaS" platform, but it now needs to be re-thought of as a cloud services provider as well.

Wall Street Exchange, or Cloud Service Provider? I'm referring to the NYSE Capital Markets Community Platform that I blogged about back in June. In an attempt to capture more customers such as hedge funds, NYSE took quite a step and launched its own special-purpose cloud-computing (IaaS) platform, right across the Hudson from Wall Street. Not only is it intended to host high-frequency trading apps, it also houses a Big Data repository of historic exchange trading ticks, against which trading algorithms can test their effectiveness. Much like the example above, we have a business that turns itself inside-out to expose internal services, creating an entirely new business model.

Next, I tripped over ifttt (If This, Then That). On the surface, this looks like a simple hosted scripting site. Anybody can set up a simple decision/action... i.e. IF I'm mentioned on Twitter, THEN text me; IF I'm tagged on Facebook, THEN post a message... and so on. Ifttt goes further by allowing others to create combined "recipes" that you can use. In some sense, this is an ultra-simple service mash-up that's hosted on the web for you. And if you adhere to the theory that very simple actions can be combined and nested to form more complex actions, there might be some pretty sophisticated personal automation features that Ifttt will enable in the future. Ifttt is a pure online decision service. I'm not sure what I'd call them, but this is a frame-breaker.

It's a Store! It's a Streaming Service Provider! Yes, even Walmart Entertainment is arguably now in the cloud services game, with Disc-to-Digital, which (using Vudu, a video-streaming acquisition) allows users to stream videos from the web, assuming they've already purchased a DVD (read: digital rights) in the store.

I'll be keeping my eyes out for other examples of new forms of IP disguised as cloud services. I'm sure I'll keep being surprised.

ABOUT THE AUTHOR: Ken Oestreich is a marketing and product management veteran in the enterprise IT and data centre space, with a career spanning start-ups to established vendors.
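The IF-this-THEN-that pattern behind those recipes can be sketched as a tiny trigger/action pairing. These names are illustrative only, not Ifttt's actual API:

```python
# A minimal sketch of an ifttt-style "recipe": a trigger predicate
# paired with an action, evaluated against incoming events.

def make_recipe(trigger, action):
    """IF trigger(event) THEN action(event), else do nothing."""
    def run(event):
        if trigger(event):
            return action(event)
        return None
    return run

# "IF I'm mentioned on Twitter, THEN text me."
mention_to_text = make_recipe(
    trigger=lambda e: e["type"] == "twitter_mention",
    action=lambda e: f"text me: mentioned by {e['who']}",
)

print(mention_to_text({"type": "twitter_mention", "who": "@alice"}))
print(mention_to_text({"type": "facebook_tag", "who": "@bob"}))  # no match -> None
```

Nesting and chaining such recipes (feeding one recipe's action into another's trigger) is the composition the column speculates could grow into sophisticated personal automation.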