Nov / Dec 2009
| THE INDEPENDENT RESOURCE FOR IT EXECUTIVES
Lisa Erickson-Harris, Enterprise Management Associates (EMA) · Roy Illsley, Butler Group · Clive Longbottom, Quocirca · Michael Lock, Aberdeen Group
Join the IT revolution Prepare for the future
ETM ■ CONTENTS PAGE
7 Editor and contributors page
8 Welcome to the new look ETM
10 The service desk: cost-saver and driver
The service desk is central to key process development activities and a target for IT executives in search of cost-saving measures. LISA ERICKSON-HARRIS (ENTERPRISE MANAGEMENT ASSOCIATES) says that the service desk is, no doubt, experiencing a resurgence in the market.
14 Data intelligence
David Hatch (ABERDEEN GROUP) moderates a panel discussion on the challenges and business performance effects that data management has on business intelligence initiatives. He is joined by ANDREW DE ROZAIRO (SYBASE) and SANDY STEIER (1010DATA).
26 The paradigm shift
The main message that ESM conveys is about the need for greater transparency of total asset usage. ROY ILLSLEY (BUTLER GROUP) tells us that ESM, with its cross business unit visibility, provides the information on what areas need to be addressed.
32 Deep dive into SIEM 2.0
CHRIS PETERSEN (LOGRHYTHM) tells ETM’s ALI KLAVER that the limitations of SIEM 1.0 have been holding companies back for years, and why SIEM 2.0 is definitely worth the upgrade.
36 Working hand in hand
Finding and implementing a DLP solution that works across your infrastructure is key to staying secure in the face of new technologies. ETM’s ALI KLAVER talks to KATIE CURTIN-MESTRE and ANDREW MOLONEY (RSA, THE SECURITY DIVISION OF EMC) about the role of content awareness in successful DLP implementation.
41 ITSM—payback time?
Responding to change quickly and effectively while keeping an eye on the bottom line is proving to be problematic for a lot of organizations. CLIVE LONGBOTTOM (QUOCIRCA) tells us why ITSM is great for business.
45 The season for change
The management of IT change can make or break effective IT service management. SCOTT CRAWFORD (ENTERPRISE MANAGEMENT ASSOCIATES) moderates a discussion with GEORGE GERCHOW and JOHN MURNANE (EMC) who say that change management is central to assuring the quality of IT.
50 Future focus
Current market forces are changing the way we see the future, and the roles of IT service management and the service desk are only becoming more important for the running of the business. LISA ERICKSON-HARRIS (ENTERPRISE MANAGEMENT ASSOCIATES) moderates a discussion with three industry experts: CHRIS WILLIAMS (BMC), TIM ROCHTE (CA) and MATT FRENCH (SERVICE-NOW.COM).
60 It’s called business intelligence for a reason
How do you make the data you already have work for you? BILL DUNN (DUNN SOLUTIONS GROUP) says it’s easy if you have the right strategy and the right tools. Interview by ETM’s ALI KLAVER.
80 Career path
82 Events and features
64 Security and business continuity
With an impressive list of partners and clients, NETASQ is fast becoming the security solution provider. FRANÇOIS LAVASTE and DOMINIQUE MEURISSE (NETASQ) join ETM’s ALI KLAVER to discuss exactly what they can do for you.
68 Secrets to success
Dedicated action is needed to get the best out of small and medium sized businesses. MICHAEL LOCK (ABERDEEN GROUP) tells us how to slash cost and empower the business user.
72 Security at its best
Astaro are so concerned about your security that they’re going to give you their product—free. GERT HANSEN (ASTARO) talks to ETM’s ALI KLAVER about why SMBs will triumph, the future of security, and why they’re giving away free solutions for everyone.
76 Proactive information management
ETM’s ALI KLAVER talks to SIMON TAYLOR (COMMVAULT) about changing the way organizations think about and manage their information into a top-down approach that aligns both IT and business perspectives.
Spend less
Improve Access Control
BHOLD Enterprise Authorization Management
• Preventive authorization management: 60% to 80% automated authorizing. Let people work, not wait for access.
• Minimize Identity Management (IdM) implementation risks: a solution within weeks, with complete integration with Microsoft IdM solutions.
• Reduce the cost of compliance: cut IT audit effort on access by more than 50%, and prevent unauthorized access and fraud.
• Make IT responsive to business change: adopt reorganizations in days. IT doesn’t need to be an impediment to change.
www.bholdcompany.com info@bholdcompany.com
EDITOR’S PAGE ■ ETM
Founder/Publisher: Amir Nikaein
Managing Editor: Ali Klaver
Art Director: Ariel Liu
Head of Digital: Xiao Gang Lu
The IT revolution
This issue heralds something fresh, new and exciting for ETM. The team has been hard at work developing a new website in order to give you, our audience, everything you need at the touch of a button. For a snapshot check out page 8, or why not jump onto www.globaletm.com and have a look around for yourself.

We’re also very excited about the new poll function on our website. You can vote on a new poll each week, both fun and serious, and the results and the comments so far have been surprising. Plus, don’t forget to register while you’re there. It’s free, and you’ll have hundreds of whitepapers, podcasts, information and tools at your fingertips.

Our bumper issue is full of up-to-date research and fantastic guidance to help you make the right decisions for your organization. As well as our many and varied podcasts, Roy Illsley from the Butler Group writes about how Enterprise Service Management gives visibility across the entire business, and Clive Longbottom from Quocirca tells us that ITSM allows you to respond to change quickly and effectively while keeping an eye on the bottom line. Michael Lock focuses on small and medium sized businesses and how to slash cost while empowering the business user, while Lisa Erickson-Harris, as well as moderating one of our panel podcasts on the service desk, also writes about how it is central to key process development activities and to those in search of cost-saving measures.

If you’re an avid reader of ETM, you’ll notice a new page at the end of this issue about the career and company of one of our favourite analyst partners. This page will give you a snapshot view of what a company (and a particular person) can do for you, and we’d like to thank Martin Kuppinger of Kuppinger Cole for being our first “Career path” expert.

I hope that we’ve provided some handy updates and essential solutions in this bumper issue, and that they help in your day-to-day business processes.
Thank you for reading, and if you would like to contribute to any future issues of ETM, please feel free to contact us at www.globaletm.com or via email at editor@ enterpriseimi.com
Finance Director: Michael Nguyen
Podcast/Sound Editor: Mark Kendrick
Associate Editors: Mary Wright, Ann Read
Account Executives: Joe Miranda, Sandino Suresh
Marketing Executive: Michael Le

Contributors
Lisa Erickson-Harris, Research Director, Enterprise Management Associates (EMA)
Roy Illsley, Senior Research Analyst, Butler Group
Clive Longbottom, Service Director, Business Process Facilitation, Quocirca
Michael Lock, Research Analyst, Business Intelligence, Aberdeen Group

How to contact the editor
We welcome your letters, questions, comments, complaints, and compliments. Please send them to Informed Market Intelligence, marked to the Editor, Studio F7, Battersea Studios, 80 Silverthorne Road, London, SW8 3HE or email editor@enterpriseimi.com
PR submissions
All submissions for editorial consideration should be emailed to editor@enterpriseimi.com
Reprints
For reprints of articles published in ETM magazine, contact sales@enterpriseimi.com. All material copyright Informed Market Intelligence. This publication may not be reproduced or transmitted in any form, in whole or in part, without the express written consent of the publisher.
Enterprise Technology Management is published by Informed Market Intelligence
Headquarters Informed Market Intelligence (IMI)
Ali Klaver Managing Editor
IMI Ltd, Battersea Studios, 80 Silverthorne Road London, SW8 3HE, United Kingdom
+44 207 148 4444
Tokyo: 1602 Itabashi View Tower, 1-53-12 Itabashi, Itabashi-Ku 173-0004, Japan
Dubai (UAE): 4th Floor, Office No: 510, Building No. 2 (CNN Building), Dubai Media City, Dubai
FOCUS ON ETM ■ REBRANDING
ETM’s new look | THE INDEPENDENT RESOURCE FOR IT EXECUTIVES
NEW CONCEPT NEW IDENTITY
Enterprise Technology Management: three words, or one company. They are conceptually separate, but can operate together. They are individual, but corporate. ETM has created a brand new concept for the business of enterprise technology management, presented through the new logo. This extensive but unified image, in bright, modern technical blue and silver grey, heralds a new vision of quality products and services for the future of our targeted IT market.
NEW WEBSITE NEW STRATEGY

The ETM website is always the first face we show to our audience and clients. ETM has launched a brand new dynamic website with a fresh, clean and stylish appearance to enhance our professional products and services. The user-friendly interface and logical flow of operation help users to quickly and easily find the up-to-date information they need. The development of the interactive digital magazine showcase and ETM video sections attracts and retains attention from the very beginning.

Audience and client participation and interaction are a strong function of the website, which features polls, the latest news and industry-leading blogs to continually pull our dedicated audience back to the ETM website. They want to stay current, competitive, and interact with expert IT commentary, all of which they find at ETM. We are, in effect, building a virtual networking environment for current and potential audiences to exchange, and thereby strengthen, their IT knowledge and experience. This strategy will drive large amounts of traffic to the website, boost the visiting rate, and secure the future inbound rate. ETM is not only an IT management information website, but also the leading IT social networking stage.

WE HOPE YOU ENJOY THE NEW-LOOK ETM

Minimizing cost and achieving maximum return is the key to ETM’s rebranding.
Map out a course to IT Management Success with EMA

EMA coverage areas include: ITIL • CMDB • Service Desk • Service Catalog • Security & Risk Management • IT Governance • Network Management • SLM/BSM • Storage • Systems Management • Virtualization • Application Management • Green IT • and more!

Register for an EMA website account to access free resources, including: Webinars • White Papers • Podcasts • EMA Solutions Center

Register today: www.enterprisemanagement.com/registeretm
Enterprise Management Associates (EMA) is a leading industry analyst firm dedicated 100% to IT management technologies. Find out how EMA can help you successfully map out IT Management solutions in support of your business goals. Visit www.enterprisemanagement.com
ANALYST FEATURE ■ IT SERVICE DESK
The service desk:
cost-saver and driver
The service desk is central to key process development activities and a target for IT executives in search of cost-saving measures. LISA ERICKSON-HARRIS (ENTERPRISE MANAGEMENT ASSOCIATES) says that the service desk is, no doubt, experiencing a resurgence in the market.
IT executives in charge of leading IT service management (ITSM) initiatives should stand up and take note of the service desk. While adoption of help desk technologies began more than 20 years ago, the rudimentary beginnings of the early trouble-ticketing systems have changed. Service desks are at the center of ITSM activity both operationally as well as strategically. The IT service desk is now at the heart of asset management, change management, knowledge management, the Configuration Management System (CMS), and more (see Figure 1).
Figure 1: The Modern Service Desk. The service desk sits at the center of knowledge management, mobile support, self-service, governance/compliance/audit, varying license models, the ITIL domains, the service model, the service catalog, SLM, and security/identity. © 2008 Enterprise Management Associates, Inc.
The purpose of the early help desk was to ensure the operability of technology within the enterprise. Users experiencing difficulty would be assisted by help desk personnel to smooth the way for technology to support all job functions—trouble-tickets were generated, issues escalated, and users were put back to work as issues were resolved. This was a simple concept with far-reaching results when implemented successfully. Fast-forward 20 years and the service desk is pivotal to the ITSM movement that is well underway. Similarly, the service desk represents low-hanging fruit for cost-saving efforts in IT. In this article, we’ll take a look at both dynamics—assessing why the service desk is so important for broader ITSM strategies and how it can be used to realize short-term cost-savings for corporations.
SERVICE DESK AS A DRIVER OF ITSM INITIATIVES

Enterprise Management Associates (EMA) began noting increased interest in the service desk in 2006/2007, coincident with increased investment in IT Infrastructure Library (ITIL) best practice adoption. In 2009, EMA conducted a research study called The Aging Help Desk: Migrating to a Modern Service Desk to gain insight into the increased investment in the service desk. Over 150 participants in this study took a web-based survey, and 16 additional companies were interviewed. This research captured IT executive sentiments toward the function of the service desk in its operations and its goals and priorities for investment. High-ranking management goals for the service desk were to improve customer satisfaction (82%), process deployment (72%), and cost reduction (70%). Research also shows that there is a need for multi-language support and a smaller, but growing, interest in financial metrics for the service desk (44%) (see Figure 2).
Figure 2: Enterprise Priorities for the Service Desk
“What are your organization’s larger management goals for the service desk?” (% valid cases, mentions/valid cases)

Improving customer satisfaction: 82%
New or improved process deployment: 72%
Reducing costs: 70%
Improve integration with external systems: 46%
Develop financial measurement metrics for the service desk: 44%
Assess return-on-investment in service desk software and services: 37%
Replace current help desk/service desk software solution: 33%
Other (please specify): 3%

© 2008 Enterprise Management Associates, Inc.
Sixty-two percent of participants in this research are either already declaring the service desk to be part of the company’s ITSM strategy or planning to do so. This sentiment supports anecdotal insights that have been gleaned from corporate IT by EMA. One of the big challenges for companies moving down the path of an ITIL deployment strategy is that of determining the best starting point for its efforts. Is it best to start by designing a CMS system? Or more appropriate to define services and their associated service level agreements (SLAs)? Enterprises have clear choices to make. The most appropriate direction is not always so obvious. The service desk represents one of several good “starting points” for ITIL initiatives. The benefit of the service desk as a starting point is that it builds on a discipline that is mature and very familiar to most organizations. Additionally, companies can tackle problem, incident and change management together— and often do just that. Sixty-four percent of respondents have deployed or are planning to deploy ITILv3. Incident, problem and change management are the most frequently-deployed ITIL disciplines and remain the most critical for the service desk.
EMA interviewed a large, financial services firm that expressed its best practices viewpoint—one that it shares with many of its peers: “We really got started in April of 2006. The initial spark was getting our arms around the infrastructure we had—what is out there—from an operational perspective. The idea had always been to leverage the ITIL framework, but we never assumed that the chief objective was to be ITIL compliant, if there even is such a thing. Rather we would harness ITIL insights to improve our resources and processes for incident, change, problem, and configuration management.” In addition to problem, incident and change management, a number of other priorities are on the radar screen of enterprise IT. The service catalog, connectivity with the CMS, process development and knowledge management all surfaced as important goals for the service desk—all indicative of the key role the service desk has in ITSM initiatives. From a business point-of-view, service desk leaders are focused on executive dashboards, SLM for the service desk, and customer surveys. The service desk clearly understands the need to align the service support function with the business.
TRIMMING IT EXPENSE IN THE SERVICE DESK

Some may be amazed that the service desk can play such a pivotal role in ITSM initiatives and at the same time be an area where measurable cost saving can be achieved. The reality is that the service desk is an increasingly smart investment for advancing service management that, at the same time, cuts operational expenses. This dynamic exists due to ageing technology that often carries with it a high cost of ownership, mergers and acquisitions over a period of time that have created multiple service support operations, and a relatively low level of maturity in the area of self-service. These conditions offer good news to IT executives. IT organizations typically have a number of choices with respect to how to save IT expense in the service desk. Some options:
REPLACEMENT OF AGEING TECHNOLOGIES

Many help desk deployments have far outlived their usefulness in the organization. Maintenance costs can be very expensive, as can the investment in continuing
Figure 3: Importance of Best Practices
“Which of the following best practice related capabilities are important to your organization’s help/service desk solution?” (% valid cases, mentions/valid cases)

Incident management: 90%
Problem management: 86%
Change management: 79%
Service level management: 77%
Service catalog: 58%
ITIL configuration management system: 53%
Support for ITIL v3: 46%
Support for ITIL v2: 30%
Other (please specify): 1%

© 2008 Enterprise Management Associates, Inc.
customizations that may have been needed at one point in time, but are no longer necessary. While spending IT budget to replace an existing system can sometimes feel like a luxury, the result for the help/service desk domain can mean a very quick return on investment while the new solution helps to expand the ITSM footprint of the historical help desk.

SERVICE DESK CONSOLIDATION

Operating multiple service desks is expensive. Companies can find themselves in a position where mergers and acquisitions have taken place and resulted in the addition of service desk operations with disparate toolsets, now all serving the same company. Careful consolidation of some of these support teams can reduce staff requirements, eliminate maintenance contracts on multiple products, and improve workflows as processes are refined during the course of consolidation.

OUTSOURCING

Outsourcing often creates anxiety for IT professionals, as the industry has experienced the competitive strain that it has created over the past five to 10 years. That said, outsourcing front-line support for routine applications can be a low-cost solution that frees support staff to focus on more strategic activities.

“… the service desk is an increasingly smart investment for advancing service management, and at the same time, cuts operational expenses.”

EXPANDING THE SERVICE DESK FOOTPRINT

Organizations fortunate enough to have purchased a strong, ITIL-focused service desk solution may not be taking advantage of its breadth of capabilities. Many, if not most, service desk solutions now incorporate several ITIL disciplines in a single solution. Implementing even one or two additional ITIL functions serves to expand the service desk’s reach and demonstrate additional value to the organization for the same investment.

BETTER USE OF SELF-SERVICE AND KNOWLEDGE MANAGEMENT

Most service desk organizations have yet to deploy self-service and knowledge management in any significant way. Toolsets have these capabilities, so IT support managers can wisely take advantage of them to move some of the workload to the source of the problem. This often results in happier users who feel in control of their issues.

Cost reduction and more effective use of the service desk does not by any means equate to abandoning the traditional service desk. It is quite evident that there is no need to stifle the evolution of the service desk even in
times of economic distress. Continuing to move forward today with streamlining workflow, consolidation of operations, incremental implementation of additional best practice components, and even planning for consolidating customer service with the IT help desk are all potential actions that can be taken in addition to product replacements. Any and all of them will position IT for greater growth when budget dollars are easier to come by, and the groundwork will be in place to take advantage of it.

PULLING ITSM AND COST SAVING TOGETHER

These two dynamics—a central role in ITSM combined with strong opportunities for cost savings—are a good news story for IT and enterprises. Not often does IT, or any business discipline for that matter, identify an area of investment that offers this much value to the organization while at the same time representing a domain for managing costs. The best advice for IT is to choose carefully and make those choices most relevant to your organization. Each company will want to apply the possibilities in a slightly different way. For instance, if your company has a help desk product that was developed and deployed prior to ITIL acceptance and adoption, it is very likely that the requirements for service support exceed the capabilities of your current solution. This situation dictates replacement of that product just to meet basic requirements. The maintenance costs are likely to be significant depending upon which solution and whether
or not it was customized. The return value in this example has two different perspectives. This company will realize significant value in deploying a solution that supports ITIL best practices, improving operations and the customer experience. At the same time, it will incur near-term expenses as it invests in both the new technology and best practice education for its staff. Measurement of value will come from reduced cost of ownership in the new product suite, as well as positive relationships with users across the company. The net result will be an upside gain in a relatively short payback window.

Some organizations may not be in a position to expend this initial capital due to economic conditions in their market segment. Recognizing that reality, it is important to consider all the ways in which cost saving and organizational value can be achieved with the service desk.

The service desk will continue to play a pivotal role in any service support and delivery strategy. Wise CIOs and other IT strategists will capitalize on the momentum that the service desk is gaining to deliver IT service management. Many enterprises will choose to deploy only part of ITIL best practices over the long haul, and others will tackle it all. Incident, problem and change management will clearly be included for all. The service desk is inherently right in the middle of this dynamic. It also plays a strong role in building corporate goodwill and customer perceptions for organizations where the service desk is externally facing. It is an exciting time for service desk leaders, who are now enjoying a new respect for their function and professional staff, and at the same time have the opportunity to influence the direction of the IT service support and delivery engine.
Lisa Erickson-Harris
RESEARCH DIRECTOR, Enterprise Management Associates (EMA). Lisa has over 18 years’ experience in the computer industry, having served in a variety of technical, marketing and managerial roles. She focuses on service level management, business process management, small-to-medium business infrastructure management needs, and partnership strategies for channels and strategic relationships. Prior to joining EMA, Lisa was responsible for the SPECTRUM Partners program for Cabletron Systems (now Aprisma). She frequently writes as a guest columnist for Network World Fusion and contributes articles to slm-info.org. Lisa is also co-author of SLM Solutions: A Buyer’s Guide, now in its third edition.
EXECUTIVE PANEL ■ BI AND DATA MANAGEMENT
Data intelligence
DAVID HATCH (ABERDEEN GROUP) moderates a panel discussion on the challenges and business performance effects that data management has on business intelligence initiatives. He is joined by ANDREW DE ROZAIRO (SYBASE) and SANDY STEIER (1010DATA).
DH: ABERDEEN RESEARCH CONDUCTED IN THE PAST FOUR WEEKS HAS SHOWN THAT ONE OF THE TOP CAUSES OF FAILED BI PROJECTS IS THE BARRIERS AND CHALLENGES THAT COMPANIES FACE WITH DATA MANAGEMENT AND THE PREPARATION OF DATA FOR REPORTING AND ANALYTICS. LET ME JUMP STRAIGHT INTO THE FIRST QUESTION FOR SANDY AT 1010DATA. SANDY, DATA MANAGEMENT DISCIPLINES AND CAPABILITIES HAVE BECOME A FOCAL POINT FOR BUSINESS INTELLIGENCE PROJECTS. WHAT ARE SOME OF THE TOP BUSINESS PRESSURES THAT YOUR CUSTOMERS ARE EXPERIENCING REGARDING THE NEED FOR MANAGING DATA FOR THEIR BI APPLICATIONS?
SS:
In today’s world, competition is becoming keener and more important, and companies need to be smarter about how they do things; in many cases that means they need to analyze more data. There is, in fact, more data available nowadays, so in theory they can do that, but obviously they need to overcome some significant hurdles. Another issue is that their competitors are going to be analyzing more data, so if they don’t, they’ll fall behind. Management knows this and is demanding more sophisticated reporting and analysis, and they want it faster. So basically the challenge for most shops is that not only is the volume of data so much bigger, the way that the data needs to be analyzed doesn’t work with traditional architectures.

One example that we’re seeing a lot of is the need for retailers to analyze a lot of point of sale data, which is very voluminous. Another very common example is in the financial services industry, where people need to analyze years of payments and millions of loans, and we all know what happens when the proper analysis is not done on loans like mortgages; the whole world has found that out already.
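The kind of ad hoc analysis Sandy describes is hard to serve from fixed summaries. As an editorial aside (not from the panel; the data and field names below are invented for illustration), a toy sketch of the difference between a pre-aggregated cube and a scan over raw rows:

```python
# Toy illustration: fixed pre-aggregation answers only anticipated
# questions; an unanticipated question forces a scan of the raw rows.
from collections import defaultdict

# Raw point-of-sale rows: (store, product, day, amount). All invented.
sales = [
    ("NY", "widget", 1, 10.0),
    ("NY", "gadget", 1, 25.0),
    ("SF", "widget", 2, 10.0),
    ("SF", "widget", 3, 30.0),
]

# Cube-style pre-aggregation along the anticipated (store, product) axes.
cube = defaultdict(float)
for store, product, day, amount in sales:
    cube[(store, product)] += amount

# Anticipated question: answered instantly from the cube.
print(cube[("SF", "widget")])  # 40.0

# Ad hoc question ("sales after day 1, by store") was never
# pre-aggregated, so the cube cannot answer it; the raw rows can.
by_store = defaultdict(float)
for store, product, day, amount in sales:
    if day > 1:
        by_store[store] += amount
print(dict(by_store))  # {'SF': 40.0}
```

The point of the sketch is only that every question the cube was not built for falls back to the raw data, which is why the volume and shape of that raw data dominate the problem.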
“... volume, velocity and value... are coming together to put incredible amounts of pressure on BI infrastructure...”

Another example is in the pharmaceutical industry, where companies do various kinds of studies on drug purchase patterns. All these require fairly ad hoc manipulation of very large amounts of raw data, and they can’t be accomplished with the usual OLAP-type pre-aggregation, which is much more fixed. Also, there is a time series component to the analysis which traditional database structures don’t support very well. So the challenge is really in managing tremendous amounts of data, in a form that can easily be analyzed, in the way that it needs to be analyzed.

DH: YOU BRING UP THAT NEXUS WHERE THE DATA VOLUMES AND THE RESPONSE TIME NEEDED ARE REALLY WHERE THE BATTLEGROUND IS NOW, AND COMPANIES ARE REALIZING THAT THEY NEED TO ANALYZE MORE DATA, BUT THEY NEED TO DO IT FASTER. LET’S MOVE ON TO SYBASE. ANDREW, SLOW RESPONSE TIME TO CRITICAL BUSINESS INFORMATION CAN REALLY BE THE DIFFERENCE BETWEEN SUCCESS AND FAILURE. IN FACT, OUR RESEARCH FINDINGS OVER THE PAST FOUR WEEKS HAVE SHOWN THAT THERE IS A DIRECT EFFECT ON BUSINESS PERFORMANCE WHEN MANAGEMENT CAN’T GET AT THE INFORMATION THEY NEED WITHIN THE DECISION WINDOW THAT’S NECESSARY. HOW SHOULD COMPANIES VIEW RESPONSE TIME, AND WHERE ARE THE FIRST PLACES WHERE COMPANIES SHOULD LOOK TO IMPROVE RESPONSE TIME WHEN ACCESSING HUGE VOLUMES OF DATA?
ADR:
Let’s pick up those three issues. First, let’s look at where we should improve response time, and then address query response, which is what most people think of when we talk about improving response times. Then I’d like to look at another dimension of it, which is more the agility side: the ability to respond to change.

The first thing is, when you’re looking for a response time improvement, it’s important to realize that most of the performance of analytics is down to the preparation and the structuring of the data. Throwing in additional hardware or memory provides a short-term fix, but it’s expensive and it doesn’t address the root cause of the problem. If we accept that the data architecture is the single biggest factor affecting performance, then using a database platform that was built from the ground up for analytics is fundamental. And since analytics is performed on columns, a column-based architecture, with a query processor tuned for columns, is critical to getting the fastest response.

So, how does this architectural approach translate into response time? If we look at query response first, business users have become conditioned by Google to be able to ask any question pointed at the net and get a response in seconds. And it’s hard for these business users to accept that a query against a much smaller set of data inside company walls should take any longer. So there’s extreme pressure on getting a quick response for ad hoc queries. At Fortis Bank, we have users firing 115,000 ad hoc queries against the Sybase IQ data warehouse, and 57% of these queries are now responded to within a second. Predictability is the other component that’s very important in terms of response time. At Barclays Global Investors, where risk reports used to take between two and 30 minutes, there’s now no single report that takes over one minute.

Finally, let’s turn to the agility question.
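Before the discussion moves to agility, the column-orientation point above can be sketched in a few lines. This is an editorial aside (not from the panel; the layout and figures are invented): an aggregate over one field needs only that field in a columnar layout, whereas a row layout carries every field of every record along with it.

```python
# Toy contrast of row-oriented vs column-oriented storage for analytics.

# Row store: each record kept whole; summing one field still walks
# every record (in a real system, reads every page of every record).
rows = [
    {"id": 1, "region": "EU", "revenue": 100.0, "notes": "..."},
    {"id": 2, "region": "US", "revenue": 250.0, "notes": "..."},
    {"id": 3, "region": "EU", "revenue": 175.0, "notes": "..."},
]
total_rowwise = sum(r["revenue"] for r in rows)

# Column store: each field kept contiguously; the same aggregate
# touches only the one column it needs, leaving the rest untouched.
columns = {
    "id": [1, 2, 3],
    "region": ["EU", "US", "EU"],
    "revenue": [100.0, 250.0, 175.0],
    "notes": ["...", "...", "..."],
}
total_colwise = sum(columns["revenue"])

assert total_rowwise == total_colwise == 525.0
```

Both layouts give the same answer; the difference, at warehouse scale, is how much data must be read to get it, which is the architectural point Andrew is making.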
The addition of a new set of data in most BI architectures involves lots of decisions about how to structure it, how to
“... new technologies are much more flexible in the sense that they allow combinations of data from various sources...”
prepare it, whether this includes rebuilding indexes, and then choosing where the data needs to be stored for optimal results, and so on. And this slows down the change process considerably. We have a BI partner in the UK called Anari, who’s used a combination of Sybase IQ and our modelling and metadata management solution to deliver a new BI project for a major airline. They were able to deliver it in half the time, partly because they could cut out the steps of thinking about indexes and aggregates, but also because they were able to test against full volumes of data because the performance was so good. So the customer ends up paying 50% less consultancy time, but the other huge benefit is that they bring the product to market in half the time. So I think both types of response time are really important, and they’re both under pressure. Having a BI infrastructure that allows you to deliver fast performance and agility is a real competitive advantage.

DH: THAT’S A NICE SEGUE INTO ANOTHER AREA THAT I’D LIKE TO TALK ABOUT. SANDY, ANDREW JUST MENTIONED THE IMPORTANCE OF PREPARATION, STRUCTURING AND OVERALL DATA MANAGEMENT NECESSARY FOR ANALYTICS, ALL SO THAT THE QUERY RESPONSE TIME IS FAST. AS ANDREW SAID, GOOGLE HAS EXEMPLIFIED THE IMPORTANCE OF PREDICTABLE RESPONSE TIME, AND WHAT WE’VE FOUND IN OUR RESEARCH IS THAT WHEN RESPONSE TIME IS NOT PREDICTABLE THIS LEADS TO LOWER ADOPTION OF BI ASSETS THAN IS EXPECTED. IN FACT, LOW ADOPTION OF BI ASSETS IS OFTEN CITED AS THE CAUSE FOR THE LACK OF ROI IN BI INVESTMENTS. OUR RESEARCH HAS FOUND THAT ADOPTION ISSUES ARISE FROM SEVERAL CAUSES: EASE OF USE, LACK OF DETAILED DATA REQUIRED BY THE END USER, LONG TRAINING CYCLES ETC. BUT THE MAIN ISSUE THAT CONTINUES TO BE IDENTIFIED AS THE TOP CHALLENGE IS QUERY RESPONSE TIME AS A RESULT OF POOR DATA INTEGRATION, DATA COLLECTION AND DATA AGGREGATION, LEADING TO QUERY PERFORMANCE ISSUES. ARE YOU SEEING THIS IN THE FIELD AS WELL, AND WHAT NEW APPROACHES ARE OUT THERE THAT WILL HELP ORGANIZATIONS DEAL WITH THIS?
SS:
We are absolutely seeing increased demands for data integration—I agree 100%. Earlier I mentioned the financial services industry. It has become
EXECUTIVE PANEL ■ BI AND DATA MANAGEMENT
standard operating procedure in that industry to build analysis on multiple disparate data sets that come from all different sources; let's say payment histories, housing price indexes, employment statistics, credit histories—a whole range of things. People need to take all of this data that comes from different places and in different formats, put it all together and come out with some integrated analysis, and everyone is doing this. Major banks like Goldman Sachs, JP Morgan, Credit Suisse, UBS and Bank of America use 1010data to do exactly this sort of analysis. What we find is that not only do they need to do this all the time, but the types of data that come into play increase over time, so this gets to Andrew's point about changes. One day they may be working with six data sets and the next day they need to work with seven, and that needs to be integrated smoothly. And it is quite a challenge for most people to put this all together, given that any one of these data sets by itself could be a challenge, and when you put them all together it compounds the problem. Retail is another good example. Data comes from many different groups within a company, even from several businesses like bricks and mortar stores, catalogues, websites and so on.

To put it into perspective, there are really two challenges here. The first is getting the metadata straight, or in other words, figuring out what all that data is. When you're collecting data from multiple sources, the application designers or the people doing the analysis aren't necessarily familiar with this data; it's new to them, so there needs to be a way of exploring the data and figuring it out—that's one challenge. The second challenge is managing all that very voluminous data, and it has to be done at a reasonable price.
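The first challenge, figuring out what unfamiliar data actually is, typically starts with simple profiling. A minimal sketch of that first pass, assuming nothing more than a delimited file; the type rules, column names and sample data below are invented for illustration and are not any vendor's product:

```python
import csv
import io

def infer_column_types(csv_text, sample_rows=100):
    """First-pass metadata discovery: guess each column's type
    from a sample of rows in a delimited file."""
    reader = csv.DictReader(io.StringIO(csv_text))
    # Start optimistic (int) and demote a column as values contradict it.
    guesses = {name: "int" for name in reader.fieldnames}
    for i, row in enumerate(reader):
        if i >= sample_rows:
            break
        for name, value in row.items():
            if guesses[name] == "int":
                try:
                    int(value)
                    continue
                except ValueError:
                    guesses[name] = "float"
            if guesses[name] == "float":
                try:
                    float(value)
                    continue
                except ValueError:
                    guesses[name] = "text"
    return guesses

sample = "account,balance,region\n42,1050.25,NY\n7,89.00,CA\n"
print(infer_column_types(sample))
# {'account': 'int', 'balance': 'float', 'region': 'text'}
```

A real platform would go much further (null handling, date detection, cardinality estimates), but the shape of the exploration step is the same.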
Fortunately, in terms of new developments, technologies like ours and perhaps others help in that—and this comes back to a point that Andrew made that I completely agree with—the new technologies require much less design and set-up time. The typical formal process by which one figures out exactly what the database should look like, how it's going to be accessed and so on, can be compressed dramatically, and in some cases eliminated. Therefore changes can be implemented more quickly and databases can be put up more quickly, including data from various sources.

DH: YOU MENTIONED SOMETHING VERY INTERESTING THERE WHEN YOU IDENTIFIED THE TWO CHALLENGES. THE FIRST ONE IS GETTING THE METADATA STRAIGHT, AND YOU MENTIONED THAT A LOT OF THE TIME THIS IS DATA THAT IS NOT FAMILIAR TO THE USER, WHETHER IT'S THE IT USER OR DEVELOPER BUILDING THE BI APPLICATIONS OR THE BUSINESS USER OR BUSINESS ANALYST. IN FACT, WE RECEIVED AN OPEN TEXT RESPONSE IN ONE OF OUR SURVEYS THAT SAID: "DATA ACCESS IS RESTRICTED TO IT STAFF—PERIOD." AND WHEN WE DELVED INTO THAT AND HAD SOME CONVERSATIONS WITH OUR RESPONDENTS WE FOUND THAT THERE ARE SOME ISSUES WITH THE INABILITY OF BUSINESS MANAGEMENT TO BUILD THEIR OWN REPORTS, DATA VIEWS AND ANALYTICAL MODELS BECAUSE OF THAT LACK OF FAMILIARITY WITH THE METADATA YOU JUST MENTIONED. THERE ARE SOME NEW TECHNOLOGIES OUT THERE THAT ARE SPEEDING UP AND AUTOMATING THE PROCESS AND MAKING MULTIPLE AND DISPARATE SOURCES MORE ACCESSIBLE, BUT IS THIS REALLY CHANGING THE SITUATION FOR THAT END USER, ANDREW? AND IF SO, HOW IS IT CHANGING?
“... competition is becoming more and more keen and important and companies need to be smarter about how they do things...”
ADR:
One of the barriers that we've seen to a wider adoption of BI outside the IT environment has been the reporting layer. The reporting layer in the past often wasn't business-friendly enough and required quite a bit of understanding about data structures and data management principles, which is absolutely in line with what Sandy was saying earlier on. We've started to see some improvement from the BI reporting vendors, who are improving their interfaces, but the other side that's really held people back has been the technology infrastructure, and there have been two critical areas that have restricted this wider adoption.
One is restricted visibility of the entire data set, and the second big limiting factor has been performance in a mixed workload environment. If we look at the limited visibility issue first, we find that in a lot of traditional environments user access is limited to a subset of the data: it's either the data available in cubes, or in other architectures data is isolated depending on which server you're accessing it through. Now, with a column-based architecture, all information is visible and accessible to all users, limited only by security privileges, which is obviously very important. So this is one of the ways in which we've made this democratic access to the information much easier.

The second bottleneck has been performance. Even when systems are performing really well in general, there is this great fear of the unexpected query—and this goes back to the point that Sandy made before. In the past, BI architecture was such that you really had to know a little bit about data management and data structure in order to be able to ask the right type of question. When you open this up to a larger user base, you'll find that queries are sent to your business intelligence infrastructure which may be completely valid and valuable to the business, but the data is not well prepared and not well structured for that kind of query. In a traditional environment that query can slow down all other BI users on the system, and this meant that IT felt they had to control these queries: either writing the queries themselves, filtering them, or, as a lot of customers were finding, maintaining teams of query tuners. This works with small groups of people, but it's not scalable to a large user base. We've seen a number of large projects that started off small, everything was working fine, then they scaled out and hit this unexpected query issue, slowing down all the BI users.
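Why a column-based layout tolerates the ad hoc, unexpected query better can be shown with a toy model. A row store keeps whole records together, so an aggregate over one field still touches every field of every record; a column store keeps each field contiguous, so the same aggregate scans only the column it needs. This is a deliberately simplified sketch, not a description of how Sybase IQ or any real column store is implemented:

```python
# Toy contrast between row-oriented and column-oriented storage.
rows = [
    {"customer": "A", "region": "EMEA", "revenue": 120},
    {"customer": "B", "region": "APAC", "revenue": 300},
    {"customer": "C", "region": "EMEA", "revenue": 80},
]

# Column store: one contiguous array per field, built once at load time.
columns = {field: [r[field] for r in rows] for field in rows[0]}

# Row-store style aggregate: iterates over every full record.
total_row_store = sum(r["revenue"] for r in rows)

# Column-store style aggregate: scans a single array, nothing else.
total_column_store = sum(columns["revenue"])

assert total_row_store == total_column_store == 500
```

The same effect is why an unplanned query over unindexed fields hurts a row store far more: every record must be read in full, while a column store pays only for the columns the query mentions.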
They actually scaled back to a small number of users who understood data management and who wrote the right kinds of queries. So we see these as the two infrastructure issues that have really limited the adoption of BI outside IT staff. Most BI projects in the past have been limited to tens, or at most hundreds, of users. Once customers can remove these restrictions, partly around data visibility but also through more tolerance for unexpected queries, you can really open up to and support thousands of concurrent users. We have thousands of concurrent users at places like Superfarm in Israel, a leading retailer, and Experian here in the UK. What we see is definitely a move towards self-service inside business intelligence, so we want to make this data available to a much wider group of people. A good example is Nielsen Media Research, who wanted to cut a CD
Andrew De Rozairo BUSINESS DEVELOPMENT MANAGER EMEA Sybase
Andrew has focused on delivering business value through technology for the past 23 years. An Electrical Engineering and Computer Science degree from the Massachusetts Institute of Technology with an MBA from INSEAD, Europe's leading business school, provides him with a unique mix of technical and business perspective. Andrew has over 16 years in management roles in the data management industry, including as CEO of a VC-backed encryption start-up and European Managing Director of an American data monitoring software company. In his current role as Business Development Manager EMEA for Sybase, Andrew works with strategic clients and partners to develop and deliver business value propositions based on leading-edge Sybase modeling, metadata management and analytics technology solutions.
every day and send it out to their different customers. Now what they've done is provided this access directly to the customers, who can build their own reports, and it's very often this move to self-service that forces companies to think about their fundamental data architecture and ask: "Is this the right approach? Is the architecture that I have going to scale out so that we can do self-service successfully?"

DH: THAT LEADS ME TO THE FINAL QUESTION, ADDRESSED TO BOTH OF YOU. ANDREW JUST REFERRED TO WHAT WE SEE HAPPENING THROUGH OUR RESEARCH AT THE ABERDEEN GROUP, WHICH IS A MAJOR SHIFT IN TECHNOLOGICAL DEVELOPMENTS THAT ARE DRIVING ACCESS TO DATA AND DELIVERY OF DATA IN A SELF-SERVICE ENVIRONMENT TO PEOPLE WHO NEED TO MAKE BUSINESS DECISIONS, NOT SPEND TIME ON TECHNOLOGY TASKS. WE ARE SEEING SELF-SERVICE, AS ANDREW JUST MENTIONED, AS SOMETHING THAT CAN ALSO BE CUSTOMER-
FACT FILE_ Sybase
HISTORY
> Sybase has a rich 25-year history as a technology leader, starting from its creation in 1984 by Mark Hoffman and Bob Epstein in California.
> Sybase has consistently created technology that enables the Unwired Enterprise by delivering enterprise and mobile infrastructure, development and integration software solutions.
PRODUCTS
> Sybase products range from database to government solutions.
> Database management software: Best-fit infrastructure for managing data within multiple distributed environments and for a variety of purposes.
> Business continuity software: Reduces the cost of remote data recovery while reducing business risk and ensuring data integrity.
> Business intelligence and analytics software: Delivers high-performance enterprise analytics and business intelligence without blowing the budget or abandoning investments in technology and knowledge resources.
> Mobile commerce: Delivers mobile services from mobile messaging interoperability to mobile content delivery and mobile commerce services.
> Government solutions: Select information technology, management and mobile solutions for government agencies.
> Healthcare solutions: Provides the healthcare industry with timely and secure access to vital medical information.
FACING. IT’S NOT JUST INTERNALLY FOCUSED ACCESS TO DATA, BUT DATA THAT CAN BE LEVERAGED AND USED TO HELP FIRM-UP CUSTOMER RELATIONSHIPS. ARE YOU BOTH SEEING THAT SAME THING, AND WHAT ARE THE TECHNOLOGIES AND THE TECHNOLOGICAL ADVANCEMENTS BEHIND THAT? HOW IS SELF-SERVICE BI GOING TO HELP ORGANIZATIONS IMPROVE THEIR OVERALL BUSINESS PERFORMANCE?
SS:
I should start by saying that I agree with what Andrew has said. In fact, I’ve just written a whitepaper, which hasn’t yet gotten out to all of our customers, which talks about what I call transparent databases— transparent in the sense that they are visible to the user community, which is exactly the point that Andrew was making. In terms of the general direction, I think we’ve hit on the fact that it’s about making databases more transparent and more agile so that they can change quickly and deal with more data from more sources more easily; that they can do more sophisticated types of analysis without having to be carefully designed and so on. But there is one additional point that I think builds off of that in terms of how a business approaches a project, and that is that, if in fact new technologies like ours enable people to put up databases much more quickly without the usual formal requirements and design phases, organizations can try out different services and products before making a commitment. This takes a tremendous amount of risk out of the entire process. In fact, we’ve seen in many cases that, in a pilot project, if an organization is looking at multiple competing vendors, for example, the actual final data warehouse can in fact be constructed as part of the pilot. And so it’s not that the pilot is done on a subset of data and some small mock-ups of the ultimate deliverable, but the ultimate deliverable is what is created during the pilot project. We did something like that for Dollar General for instance—in fact most of our customers use that approach. Their pilots basically are the entire project. If you think about it, this means that the company can really see how the competing products work in a real production-like system, and it means that there’s no risk in it because they don’t have to decide on a certain technology, and then maybe a year or two later it will work or maybe it will fail. 
In this case, they know it works, and so when they decide to go with
some combination of vendors they know that not only will it work, but it already has worked, which is a very powerful paradigm shift in the way these projects get done.

DH: ANDREW, SANDY HAS JUST MADE A VERY IMPORTANT POINT, AND THAT IS THAT A RAPID ITERATION APPROACH TO BUILDING DATABASES HAS OPENED UP DOORS FOR COMPANIES TO 'TRY BEFORE THEY BUY', TO REALIZE THE REAL BUSINESS IMPACT (OR NOT) BEFORE THEY MOVE FORWARD, WITHOUT RISKING A MULTI-YEAR PROJECT IN THE PROCESS—IN FACT OUR RESEARCH WOULD AGREE WITH THIS BEING AN IMPORTANT CAPABILITY. WHAT ELSE IS IMPORTANT FOR LISTENERS TO UNDERSTAND IN TERMS OF THE LATEST TECHNOLOGICAL ADVANCEMENTS?
ADR:
I guess we see an almost perfect storm hitting business intelligence today, where it's really volume, velocity and value that are coming together to put incredible amounts of pressure on BI infrastructure. Let's look at volume first. For years we've been talking about BI delivering improved operations from the board room to the mail room. If we're going to deliver on that, we have to manage the simultaneous explosions of data and the user population. We've already built a data warehouse, we already have thousands of concurrent users, and that pressure is going to keep building, so one of the key developments is the ability to independently scale processing, memory and storage to allow companies to grow incrementally. I agree 100% with Sandy—companies don't want to risk making big lump-sum investments, they want to grow as their needs grow, and I think the introduction of grid technology to be able to do this independently and gradually as your needs grow is an important one. The second point is velocity. We talked before about the fact that the world is getting much more impatient and that there is less and less time to return responses. We see lots of vendors in the BI industry talking about real-time BI data warehouses where they trickle-feed information in order to provide fast response times. We provide that as well, but we're seeing some critical applications, the most demanding applications, where we have to make decisions before the data even hits the database. The other area that we're seeing draw lots of pressure is around predictive
David Hatch— Moderator SENIOR VICE PRESIDENT AND GENERAL MANAGER Aberdeen Group
David is responsible for all Aberdeen Research programs and overall operations of the company. In his research capacity, David focuses on the delivery of actionable information to the enterprise. This encompasses both traditional business intelligence (BI) and emerging methods (client/server, web applications, SOA-enabled applications, On-Demand via SaaS, Hosting/ASP, and appliances). David also has expertise in the application of BI within specific vertical market environments including healthcare, supply chain, manufacturing, publishing, agriculture, wholesale/distribution and insurance. David holds a BA, Communications degree from the University of Massachusetts, has completed seminar work in Project Management Excellence at Boston University, and Marketing Innovative Technologies at Harvard University.
analytics and its demands on response time. Here the business is looking for insight rather than information; they're looking at the root cause of problems rather than just the symptoms. This requires complex analysis against really granular data at a speed that encourages multiple iterations. So you ask a question, it comes back, you ask it again, and in order to do that you really need fast response time. One of the hurdles in the past has been that in order to do these calculations you need to pull data out of the data warehouse, perform your calculations, your statistical functions and your analytics against it, and then load it back into the data warehouse. So one of the things we see as a big step forward is in-database analytics, which removes that delay or latency in the process. We've talked about volume and we've talked about velocity, and I think the third big factor is value. In the past we've had a growing economy for a long time, so even though the volume and velocity challenges have been there, there hasn't been the same economic pressure on us. In
this environment we’re all under cost constraints and we’re expected to contribute to the savings. Sybase IQ has always been a leader in terms of data volume reductions which reduce your storage costs and so on, but one of the things that we’ve added on to that is information lifecycle management so you can set up data retention policies that automatically move data from higher cost storage down to lower and lower cost as the access frequency decreases. We think that’s the backdrop against which people are working today. So, the additional challenge of having to deliver value while responding to volume and velocity pressures has stopped people throwing hardware at a problem, and forced them to fundamentally look at whether the basic approach of what they’re doing is sound. We’ve got over 3000 customer installations and we’re seeing more and more vendors taking a column-based approach to solving this problem—to us that’s very strong feedback; that a column-based architecture is helping them to weather the storm.
Sandy Steier VICE PRESIDENT AND CO-FOUNDER 1010data
With more than a quarter century of industry experience, Sandy Steier is recognized as an innovator behind the adoption of advanced analytic technologies by financial services institutions. Before co-founding 1010data, Sandy was a Vice President and manager of research and technology at UBS North America, where he supported several trading desks and contributed significantly to the evolution of the firm into a leader in the use of advanced technologies. Previously, as Senior Vice President at Lehman Brothers, Sandy led the effort to migrate mortgage-backed securities analytical programming from mainframes to workstations. Earlier in his career, while a Vice President at Morgan Stanley, Sandy was responsible for product development and analysis of fixed-income securities and the supporting technology effort.
FACT FILE_ 1010data
HISTORY
> 1010data was founded in 2000 by pioneers of large-scale data systems on Wall Street. Drawing on experience and new technologies, the company developed a web-based service and underlying software.
> For almost a decade, 1010data has provided analytics, business intelligence and data publishing and warehousing services to top tier companies in many sectors.
PRODUCTS
> 1010data’s data management architecture is, in theory, almost infinitely scalable and is architected to handle multi-terabyte databases at a fraction of the cost and with much higher performance than other data management approaches.
> Financial services: Combines the power of a high-performance back-end database with a web-based, front-end user interface, empowering financial institutions with the tools they need to analyze, manage and present data.
> Retail and consumer packaged goods: This analytical data platform combines a powerful back-end database with a flexible front-end tool, enabling fast, reliable queries of detailed data.
> Data warehousing: Provides a unique platform that combines front-end usability with back-end functionality. Delivered as a managed solution, 1010data allows analysts to quickly construct complex and sophisticated queries on very large datasets and get results in seconds.
ANALYST FEATURE ■ ENTERPRISE SERVICE MANAGEMENT
The paradigm shift

The main message that ESM conveys is about the need for greater transparency of total asset usage. ROY ILLSLEY (BUTLER GROUP) tells us that ESM, with its cross-business unit visibility, provides the information on what areas need to be addressed.
Enterprise Service Management (ESM) has been used in a very targeted and focused manner by those organizations that have implemented it. ESM lacks the hype that other technologies have achieved in recent years, but Butler Group believes that ESM offers a potential paradigm shift for organizations, especially if it is used to deliver maximum organizational value, and as such could create significant changes in how IT is used and perceived in the organization. The natural extension of ESM could be to consider the employee as an asset; however, this degree of intrusion into an individual's personal life may be considered a step too far. ESM recognizes the fact that IT assets are, for the most part, monitored and managed by many organizations, but devices and equipment beyond the IT world are Somebody Else's Problem (SEP). This approach has restricted the value that organizations can obtain from investments in plant, machinery and other equipment, because these devices hold valuable information that is not combined with other information currently held in IT data silos. Therefore, by extending the reach of asset management and making it industry-specific, operational processes can be developed so that all of an organization's assets are monitored and managed.

“... the current economic environment provides the ideal backdrop for a renewed ideology on how to tackle some of the most difficult business challenges...”

Some prime examples
The leading industrial sectors of this ESM movement are healthcare and utilities, and by understanding how these sectors in particular believe ESM helps to improve operational efficiencies while also reducing the environmental impact on the planet, they provide other organizations with some idea of how ESM principles can operate in their business sectors. Firstly, the business opportunities that ESM presents may not always be obvious, particularly in organizations where IT is not represented at the C-level, as much of the value of ESM is contained in its ability to provide a platform for cross-business unit collaboration. Many organizations are considering how real business process automation can be achieved, while currently the processes being automated represent only a part of the entire business process. For example, consider the simplified mobile sales process. In IT business process terms this is modelled as a sales person with a laptop, complete with all relevant information, a method of connecting to corporate systems, and the ability to generate a sales invoice. However, in real business terms the mobile sales process also relies upon the car used to get to the meetings, the mobile phone used to arrange them, and the traffic conditions and distance the sales person has to travel (with a secondary impact on their stress level and therefore sales performance). All of these elements are currently unmonitored, and by implication cannot be optimized as part of the entire business process. Therefore, ESM can help by capturing information on company car location and routes taken, mobile telephone usage and signal strength, and modelling the total time allocated to one sales opportunity.

What are the benefits?
The benefit to the organization of obtaining information in a format that can be combined with other organizational data is that it enables true business process optimization to be performed. In our example, the sales person may be planning their routes based on the value of the potential sale; if their routes were instead planned based on a true cost of sale model, then the routes may be different and the overall profit margin for the department increased. Another benefit that ESM provides is that it enables an organization to understand the real value of any investment in assets, again in our example of
what the business value of the company car is. This particular question has other aspects that contribute to the equation, such as how many organizations understand the real impact on profit margin of the sales person driving that particular make/model of car. This becomes even more contentious when compared to the overall business significance of the product/service being sold. As can be seen, this level of knowledge and analysis opens up the proverbial can of worms, in that it challenges some accepted working practices that have previously been considered a “cost of doing business” and not specifically allocated to the task at hand. This level of knowledge and management may appear excessive, but as the economic recession deepens, more and more organizations are looking to make their operations as cost-effective as possible. Fully understanding the impact an asset has, and the use that asset is put to, requires organizations to adopt an enterprise-wide approach to all of their assets, resources and procedural activities. The example of the company car can then be put into context. If, for example, its contribution to cost of sale is small when compared to the manufacturing costs, and the evidence demonstrates that the biggest other costs (excluding manufacturing) are incurred in the logistics of moving stock from factory to store, then informed decisions can be made on what action needs to be taken.
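The route-planning point can be made concrete with a small sketch: rank sales visits by headline sale value alone, then by margin after a true cost of sale. Every figure and the cost model below are invented purely for illustration; a real ESM system would feed this from captured car, phone and time data:

```python
# Hypothetical opportunities: potential sale value and round-trip distance.
opportunities = [
    {"client": "X", "sale_value": 10000, "distance_km": 400},
    {"client": "Y", "sale_value": 8000, "distance_km": 50},
    {"client": "Z", "sale_value": 9000, "distance_km": 300},
]

# Invented all-in rate: fuel, vehicle wear and the sales person's time per km.
COST_PER_KM = 6.0

def margin(opp):
    """Sale value minus the true cost of making the visit."""
    return opp["sale_value"] - opp["distance_km"] * COST_PER_KM

# Ranking by headline sale value alone...
by_value = sorted(opportunities, key=lambda o: o["sale_value"], reverse=True)

# ...versus ranking by margin after the true cost of sale.
by_margin = sorted(opportunities, key=margin, reverse=True)

print([o["client"] for o in by_value])   # ['X', 'Z', 'Y']
print([o["client"] for o in by_margin])  # ['Y', 'X', 'Z']
```

The two orderings differ: the nearby, mid-value client Y jumps to the top once travel cost is counted, which is exactly the kind of decision the article argues becomes possible when asset data is combined with sales data.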
For IT, this business process perspective provides the platform needed to support the transition towards the “everything is a service” approach to IT delivery. The extension of the asset knowledge database, and by implication the assets’ business relevance and how these are part of any business services being consumed, can be mapped to the IT services being provided. While this “everything is a service” approach may be a few years away from being widely adopted, ESM provides an enabling layer for when organizations are ready to move to the new model for IT delivery.
Working solutions
The benefits alluded to in the last section may appear theoretical, but two industry sectors are actively working on the adoption of ESM capabilities: the utilities and healthcare sectors. The utilities sector has been driven to ESM by the demand from central governments to meet the reduced CO2 emissions targets agreed at the Kyoto summit. For many Western economies the power infrastructure requires significant investment to modernize its distribution and generation facilities. The UK and US are committed to introducing the “smart meter”, which is capable of being managed remotely and reports its status in terms of power consumption in near real-time.
Figure: Enterprise Service Management and its relationship to business processes (Source: Butler Group)
These devices, when installed in households and businesses, will enable the consumer to see the cost of their energy usage, but the real value of these devices will be how the information can be collected and used to help the power distribution companies ensure energy is available only when needed, and is provided from supplies that reduce the transmission loss to a minimum. The extension of the “smart meters” will be that organizations can invest in manufacturing plants that operate at optimum performance levels by understanding the demands of production, the availability of energy supply, and the value/ cost that increased production at that particular time represents. The healthcare industry has approached the use of ESM from a different angle. It has the challenge of tracking equipment, people (both staff and patients) and data (prescribed medication for example). The primary purpose is to ensure that it can maximize its investment in the assets it has and to do this it needs more information on when expensive assets are being used, where they are and who is using them. For example, in Accident and Emergency (A&E) departments the
use of a specialist piece of equipment may only be required occasionally, but in order to ensure it can provide the required level of care, this equipment may need to be permanently located in the A&E department. However, if the hospital knew the location of all such devices, and their current status in terms of importance to the patients' wellbeing, then these could be re-allocated for use on a priority-needs basis. Another complication that hospitals face is ensuring that patients receive the correct medication, which involves knowing not only who the patient is, but also their clinical history. This level of information is very sensitive and covered by many different data protection laws in different countries. Therefore, the challenge is to ensure that when a doctor visits a patient they can access the information from a central repository, with confidence that they have the correct information and that data protection laws are not broken.
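The reallocation logic, knowing where every device is and how critical its current use is, can be sketched as a simple priority lookup over a tracked asset registry. Device names, locations and the priority scale below are invented for illustration only:

```python
# Hypothetical tracked assets: current location and how critical the current
# use is to patient wellbeing (lower number = more critical, so priority 1
# equipment should not be moved).
assets = [
    {"device": "portable_xray_1", "location": "A&E", "priority": 1},
    {"device": "portable_xray_2", "location": "Ward 4", "priority": 3},
    {"device": "ultrasound_1", "location": "Ward 2", "priority": 2},
]

def best_candidate(device_type, assets):
    """Find the least critically used instance of a device type:
    the one that could be reallocated on a priority-needs basis."""
    matching = [a for a in assets if a["device"].startswith(device_type)]
    return max(matching, key=lambda a: a["priority"]) if matching else None

# A&E suddenly needs another portable X-ray: take the least critical one.
candidate = best_candidate("portable_xray", assets)
print(candidate["device"], "from", candidate["location"])
# portable_xray_2 from Ward 4
```

The value is not in the lookup itself but in having live location and usage-criticality data at all, which is the ESM point: without tracking, the safe default is to pin expensive equipment in place.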
“... ESM provides a potential paradigm shift for organizations...”
Looking ahead The future for ESM is not a simple case of raising awareness through a marketing campaign. The message that ESM conveys is one of the need
Roy Illsley
SENIOR RESEARCH ANALYST, Butler Group

Roy has over 23 years of IT experience, working for a variety of consultancy and end-user companies with experience in the defense, utilities, automotive, retail, and Fast Moving Consumer Goods (FMCG) industries. Roy has delivered keynote speeches at Butler Group Strategy Briefings, Master Classes and at external trade events. Roy is quoted regularly in the computing press and is recognised as Butler Group's expert on Infrastructure and Systems Management, with a secondary area of IT Strategy and Policy.
30
for greater transparency of total asset usage. The utilities and healthcare sectors have embraced ESM because of legislation forcing change in the case of utilities, and the need to improve a public service by better utilizing its existing assets in an environment of increased government scrutiny for the healthcare industry. The drivers are clearly linked to the pressure to do things differently within an environment where service is a key factor in the public’s perception. However, we believe that any wider use of ESM is likely to be driven by external pressures—sales departments are unlikely to volunteer to return their company cars without some valid evidence to demonstrate the impact in terms of a product’s profit margin. Therefore, it is our contention that the current economic environment provides the ideal backdrop for a renewed ideology on how to tackle some of the most difficult challenges—dealing with a retail environment and its last 100 meters of the supply chain, for example—and ESM with its cross-business unit visibility, can provide the information on what areas should be addressed. Finally, focusing on ESM from an IT department’s perspective, then the challenge of technologies such as cloud computing, SOA, and virtualization, when coupled with the growth in mobile devices—estimates state that 15 billion mobile internet enabled devices are expected to be in use by 2015, and 30 billion Radio Frequency Identification (RFID) tags in use by 2010— represents a major challenge to ensure that organizations obtain value through matching these devices with other assets and processes, and therefore understanding where waste and value co-exist.
IN THE HOTSEAT ■ SECURITY INFORMATION AND EVENT MANAGEMENT
Deep dive into SIEM 2.0
http://www.GlobalETM.com
CHRIS PETERSEN (LOGRHYTHM) tells ETM’s ALI KLAVER that the limitations of SIEM 1.0 have held companies back for years, and why SIEM 2.0 is definitely worth the upgrade.
AK: WE KNOW THERE ARE USERS WITH FIRST-GEN SIEMS LOOKING AT UPGRADING FOR MORE FUNCTIONALITY, AND THERE ARE CERTAIN LIMITATIONS OF SIEM 1.0 PRODUCTS—CAN YOU RUN OUR AUDIENCE THROUGH THE FIVE THAT LOGRHYTHM HAS IDENTIFIED?
CP:
If I were to boil it down to what I consider to be the top five limitations of SIEM 1.0 products, the first would be a security-centric focus. When this market was being developed 10 years ago, the primary use cases were around security—specifically data reduction. So the resulting architecture and product use really had a security-centric focus and didn’t provide much in the way of operations, compliance and audit use cases. We’ve realized, since then, that log data is very relevant and useful in a variety of areas, especially operations and compliance in addition to security. So I think this is one significant limitation of SIEM 1.0 products. Another limitation is that too much attention was focused on data reduction: the primary use case for first-generation SIEM products was delivering only high-quality “events”. The problem was, they didn’t collect and retain a lot of valuable forensic information—primarily log data—which meant analysts never had access to this information when they needed it to support day-to-day analytics and forensics across the business.
Another key weakness of SIEM 1.0 is limited forensics and contextualization. That comes down to not making immediately available a rich set of data around the event that helps an analyst understand what they’re looking at—whether it’s a security issue, a compliance issue or an operations issue. This type of contextualization allows the analyst to corroborate information around an event to better understand what happened, and the resultant impact of the event. The fourth point is that there was a lot of promise around correlation early on in SIEM 1.0 development. I think we’ve come to realize that while it’s certainly valuable, and a core component of SIEM products, there may be too much reliance on correlation as the be-all-and-end-all. We not only need correlation but also other types of analytics combined with more forensic information to enable machines to make better decisions, provide better information to users and support the incident response process. Lastly, a key weakness is that SIEM 1.0 is overly complex and expensive. In terms of making the technology useful, the sophistication and the complexity far exceed the capabilities and budget constraints of a lot of organizations. In many cases there’s a high up-front ticket price, but an even more significant expense is required on the deployment, care and feeding of a very complex beast, resulting in a high total cost of ownership.
AK: WHAT DO YOU THINK ARE THE MAIN CHALLENGES FOR SIEM 1.0 USERS?
CP:
One key challenge is that SIEM 1.0 users don’t have forensic infrastructure in place because they have an event management solution. What you really want is a log and event management solution that spans the entire IT infrastructure. So all logs can be viewed across all the different layers—networking, device, host, application, database—and events can be identified while maintaining access to all the raw log information, when needed, to help support an investigation. For example, log data doesn’t always have the right context—a log might contain two IP addresses. So which was the attacker and which was the target? It might contain two user names, but which user actually performed the action? We need that contextualization and enrichment in order to make that data more valuable. Another challenge for SIEM 1.0 users is environmental awareness—bringing in more information around the events in question and understanding what is actually happening at the network layer, collecting net flow data, and being able to correlate network traffic activity with IDS events and alarms. This is a big challenge in terms of enriching the overall experience of what information is available to help make better decisions, or to better enable automated analysis capabilities. There is also a lot of focus on where events took place. We not only want to know where activity occurred in terms of devices and hosts, but we also want to know who did it. There needs to be as much information on who performed the action as where it occurred. Lastly, I want to bring up scalable, robust monitoring, analysis and reporting. I’ve heard horror stories from early SIEM products where loading 10,000 events would cause the console to crash. That is not scalable analytics—we’re talking about collecting millions or hundreds of millions of logs per day in a typical installation that may need to be translated and boiled down to a million or more events per day.
So the analysis layer needs to scale in order to bring huge amounts of data into a single analysis interface and allow “drill-down” in a very rich, highperformance manner.
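Petersen’s two-IP example can be made concrete. The sketch below is a simplified illustration only—not LogRhythm’s enrichment logic—and the internal address ranges are hypothetical. It uses flow direction against a list of internal networks to label which address is the likely attacker and which the target:

```python
import ipaddress

# Hypothetical internal ranges; a real SIEM would load these from its
# network model rather than hard-coding them.
INTERNAL_NETS = [ipaddress.ip_network("10.0.0.0/8"),
                 ipaddress.ip_network("192.168.0.0/16")]

def is_internal(ip: str) -> bool:
    addr = ipaddress.ip_address(ip)
    return any(addr in net for net in INTERNAL_NETS)

def enrich(event: dict) -> dict:
    """Add direction and attacker/target labels to a raw two-IP event."""
    src, dst = event["src_ip"], event["dst_ip"]
    if not is_internal(src) and is_internal(dst):
        # External host initiated a connection to an internal one:
        # treat the initiator as the likely attacker.
        event.update(direction="inbound", attacker=src, target=dst)
    elif is_internal(src) and not is_internal(dst):
        # Internal to external: possible exfiltration by the source host.
        event.update(direction="outbound", attacker=src, target=dst)
    else:
        event.update(direction="internal", attacker=src, target=dst)
    return event

print(enrich({"src_ip": "203.0.113.7", "dst_ip": "10.1.2.3"}))
```

The same skeleton extends naturally to the user-name case: a directory lookup would resolve which account actually initiated the session.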
AK: SIEMS HAVE TRADITIONALLY PLAYED A ROLE IN THE CORRELATION OF NETWORK OR INFRASTRUCTURE-LEVEL INFORMATION—HAVE SIEMS EVOLVED IN THE LAST FEW YEARS?
CP:
This is where the “state-of-the-art” is evolving and will continue to evolve. Certainly, there is impressive capability in the market in terms of correlating network and security level information, but it needs to go beyond correlation—which is typically pattern-based. This means that we’re looking for something familiar, which we’re pretty capable of doing. But we’re not so capable of detecting those things that are only somewhat known and, even worse, those things that are completely unknown. An example of that might be a rogue user—an internal user gone rogue who begins to slowly and meticulously scan the network and environment, looking for sensitive data that might serve to better themselves financially. These are very difficult things to detect inside networks because they’re unknown. There’s no specific attack pattern and the activity occurs over a long period of time. I think we need to look at behaviour-based techniques for detecting this type of activity. Because we now have platforms with SIEM 2.0 that are collecting log, forensic and network information, we can begin to build much more sophisticated models around which behaviours can be defined, and can then detect more anomalous and unknown activity that could be very high risk.
AK: SO CHRIS, ARE YOU SAYING THAT YOU NEED MORE DATA, PARTICULARLY APPLICATION LAYER DATA, TO AUGMENT THE OPERATIONAL AND SITUATIONAL PICTURE?
CP:
Yes, absolutely. And we’re back to that second key point from the first question, where SIEM 1.0 products only bring in highly-filtered event data to correlate against, build a behavioural profile against, or perform statistical analysis against. The more data that is available to these analytic engines, the more capability there is in terms of the type of correlative rules that can be created and applied to that data. So we need lots of information to understand what’s going on and to feed these machine-learning techniques that will detect more sophisticated activity. It needs to span from the physical realm (like physical badge readers) to network flow data and everything in between. The more information we have, the more hope we have of understanding what normal and abnormal is.
“... we need to feed the human capability with information in a form that can be more easily digested...”
AK: WITH THIS VAST AMOUNT OF DATA AND INFORMATION, IT CERTAINLY MUST BE A DAUNTING TASK FOR ANALYSIS?
CP:
It is, and that is why automated analysis is the end goal and where I think we will continue to advance for the foreseeable future. We’ll see more sophisticated methods for looking across all of this data and pulling out the needle in the haystack—identifying those things that are highest risk and should be looked at immediately. We need to get to that point and we’re making a lot of progress, but there’s still a lot of evolution left. In the meantime, and even after, we still need to analyze a lot of information, because in the end the human brain/eye is still the best analysis engine available today. We need to feed the human capability with information in a form that can be more easily digested through things like aggregation, and where trends can be more easily spotted through effective visualization techniques, so that I can actually look at millions of logs or events in a single pane of glass and get to a root issue or cause very quickly. That is part of what we call total performance with our solution. SIEM 2.0 products not only meet scalability requirements in terms of collection, but they present that information effectively to users. That is total performance.
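Petersen’s point about building behavioural models from baseline activity can be reduced to a toy example. The sketch below is our own simplification, not a vendor algorithm: it baselines a user’s historical daily event counts and flags a day that deviates sharply from the norm, the way a slow network scan would eventually show up as a burst of activity:

```python
import statistics

def build_baseline(daily_counts):
    """Summarize a user's historical daily event counts as (mean, spread)."""
    mean = statistics.mean(daily_counts)
    spread = statistics.pstdev(daily_counts) or 1.0  # guard against zero spread
    return mean, spread

def is_anomalous(count, baseline, threshold=3.0):
    """Flag a day whose count sits more than `threshold` deviations above normal."""
    mean, spread = baseline
    return (count - mean) / spread > threshold

# A week of ordinary activity, then a sudden scanning burst.
baseline = build_baseline([40, 52, 47, 45, 50, 44, 48])
print(is_anomalous(49, baseline), is_anomalous(400, baseline))  # False True
```

Real products model far more dimensions (ports touched, hosts contacted, time of day), but the principle—learn normal, then score deviation—is the same.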
AK: YOU’VE GONE THROUGH THE CHALLENGES OF SIEM 1.0 AND TOUCHED ON WHAT’S NEEDED IN AN UPGRADE. SO NOW LET’S STEP UP TO SIEM 2.0. DO YOU THINK THE LIMITATIONS THAT YOU JUST DESCRIBED ARE ADDRESSED APPROPRIATELY IN THE NEXT GENERATION OF PRODUCTS?
CP:
We’re certainly making progress as an industry. Right now you have solutions that were designed to provide this platform from day one. I consider LogRhythm one of those solutions—where the architecture lends itself to providing the level of analytics capabilities I’ve been talking about. And then there are other SIEM 1.0 products that have acquired integrated log management technology. They’ve bolted it on and tried to make it work with their SIEM 1.0 platform and some will be better at that than others. There are also log management vendors that have gone out and acquired security event management technology and tried to integrate those. The success of these endeavours will be dependent upon how well they’re able to integrate this functionality. I certainly think those organizations and products designed from the outset to do both and provide this SIEM 2.0 platform truly have an advantage.
A lot of compliance standards require organizations to do this. They don’t necessarily need to purchase intrusion detection, but they need to support intrusion detection as a function in the business. SIEM 2.0 products make supporting that security business function much more achievable and efficient. All events can be brought into a single central location with prioritization methods telling you the most critical event at any point in time, plus some automated analysis on top along with correlation to identify those needles in the haystack. Many compliance standards also require you to have an incident management and support process so that if something occurs you know how to respond appropriately. SIEM 2.0 has built-in incident management which reduces the cost of responding and managing incidents. For example, SOX requires privileged-user monitoring, which is needed to monitor users who have “the keys to the kingdom”. Being able to bring all the log data in across the entire IT infrastructure enables a level of privileged-user monitoring that is very difficult, if not impossible, to achieve without a centralized logging solution. Finally, there’s file integrity monitoring, which is often a specific requirement of PCI to detect when a file is accessed, modified, read or moved. This functionality is much more powerful when it’s an integrated part of a SIEM 2.0 platform. Those are the highlights, but there are numerous areas where SIEM 2.0 can help a company achieve compliance coverage. Personally, I think SIEM 2.0 is quite a bargain for the money, because you get a lot of compliance coverage and value with a single solution. And you can also implement mitigating controls for a lot of other areas, plus you have the ability to monitor and report certain types of activity. With SIEM 2.0 you have a very large platform for detection.
Combined with the other value that data can bring to the IT help desk, the internal audit process, application developers and so on, I think you get a lot of value for your dollar with the SIEM 2.0 platform—beyond immediate compliance needs. So I think it’s a very good thing to consider early on as part of your overall compliance strategy.
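At its core, the file integrity monitoring Petersen mentions reduces to comparing snapshots of file hashes. The standalone sketch below is illustrative only (the monitored file name is invented), but it shows the created/modified/removed events a PCI-driven FIM control is expected to raise:

```python
import hashlib
import os
import tempfile

def snapshot(paths):
    """Map each existing file to the SHA-256 digest of its contents."""
    digests = {}
    for p in paths:
        if os.path.isfile(p):
            with open(p, "rb") as f:
                digests[p] = hashlib.sha256(f.read()).hexdigest()
    return digests

def diff(old, new):
    """Compare two snapshots and emit (change, path) events."""
    events = [("created", p) for p in new if p not in old]
    events += [("modified", p) for p in new if p in old and old[p] != new[p]]
    events += [("removed", p) for p in old if p not in new]
    return events

# Tiny demonstration against a throwaway file.
path = os.path.join(tempfile.mkdtemp(), "payments.cfg")
with open(path, "w") as f:
    f.write("limit=100")
before = snapshot([path])
with open(path, "w") as f:
    f.write("limit=9999")
print(diff(before, snapshot([path])))  # one ('modified', path) event
```

A production FIM agent would also watch access and move operations via OS hooks rather than polling, and would feed each event into the SIEM for correlation.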
“SIEM 2.0 products not only meet scalability requirements in terms of collection, but they present that information effectively to users.”
AK: I THINK IT’S IMPORTANT TO LOOK AT COMPLIANCE BECAUSE NO DOUBT OUR AUDIENCE ARE FINDING THAT STANDARDS LIKE PCI-DSS, SOX AND SO ON ARE CAUSING THEM NIGHTMARES. SO HOW IS SIEM HELPING USERS COPE WITH THESE STANDARDS?
CP:
SIEM can help in a variety of ways. One is the log management component which collects data, archives it, safeguards it and allows you to retrieve it—months or even years later. Then there are the more traditional security event management capabilities in terms of real-time monitoring, incident management, remediation and support. A lot of standards require you to look at daily logs, centralize and safeguard them, and keep them for a certain period of time. SIEM 2.0 products can help automate that process, increase capability and significantly reduce the cost. Compliance is probably one of the biggest drivers in the market right now in terms of people purchasing SIEM 2.0 platforms. Next is automatic log analysis and reporting. SIEM 2.0 can help to automate a lot of the analysis requirements and provide scalable, consistent and consolidated reporting across all of that data. To do that in any reasonably-sized environment without a SIEM 2.0 solution in place capable of automating much of it would be very, very difficult. There’s also exception detection. Compliance requirements often call for specifically looking for things that should not be occurring on the network. SIEM 2.0 can help detect the movement of data between the PCI environment, for example, and external networks, automatically building rules to notify administrators appropriately. Next is central intrusion detection, monitoring, analysis and response.
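The exception-detection rule Petersen describes—flagging data moving between the PCI environment and external networks—can be sketched in a few lines. The subnets below are hypothetical placeholders, not a real deployment’s topology:

```python
import ipaddress

# Hypothetical ranges: the cardholder-data environment and approved destinations.
PCI_NET = ipaddress.ip_network("10.20.0.0/16")
ALLOWED_DESTS = [ipaddress.ip_network("10.0.0.0/8")]

def violates_boundary(src_ip, dst_ip):
    """True when a host inside the PCI segment talks to a non-approved network."""
    src = ipaddress.ip_address(src_ip)
    dst = ipaddress.ip_address(dst_ip)
    if src not in PCI_NET:
        return False
    return not any(dst in net for net in ALLOWED_DESTS)

print(violates_boundary("10.20.5.9", "10.3.1.1"),      # stays internal: False
      violates_boundary("10.20.5.9", "198.51.100.4"))  # leaves for the internet: True
```

In a SIEM this check would run against every collected flow record, with matches raised as prioritized alerts to administrators.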
Chris Petersen CTO AND CO-FOUNDER, LogRhythm
Chris Petersen combines extensive industry experience in information assurance and network security with an innovative approach to technology. This drives LogRhythm’s strategic vision of delivering the most comprehensive Log and Event Management solution on the market today. Chris has served on the faculty of the Institute for Applied Network Security, has numerous speaking engagements, and is frequently quoted in industry leading publications. He holds a degree in Accounting/ Information Systems from Colorado State University.
HEAD TO HEAD ■ DATA LOSS PREVENTION
Working hand-in-hand
Finding and implementing a DLP solution that works across your infrastructure is key to staying secure in the face of new technologies. ETM’s ALI KLAVER talks to KATIE CURTIN-MESTRE and ANDREW MOLONEY (RSA, THE SECURITY DIVISION OF EMC) about the role of content awareness in successful DLP implementation.
AK: THERE IS A LOT OF CONFUSION IN THE MARKET AROUND WHAT CONSTITUTES A DATA LOSS PREVENTION SOLUTION. CAN YOU SHARE HOW YOU DEFINE THIS TERM?
KCM:
The term we use is the same as the analyst community—specifically the Gartner definition. Gartner defines a data loss prevention solution as tools that are used to prevent inadvertent or accidental leaks or exposure of sensitive enterprise information, using content inspection. The key part is that these solutions use content inspection to protect the data. A solution that encrypts your hard drive, for example, does help to prevent the loss of data, but would not be a DLP solution as per the definition that the vendor and analyst community are using.
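To make “content inspection” concrete: at its simplest it is pattern matching plus validation. The sketch below is a toy detector—far cruder than a commercial DLP policy library—that finds candidate payment card numbers in text and validates them with the Luhn checksum to cut false positives:

```python
import re

# Runs of 13-16 digits, optionally separated by spaces or hyphens.
CARD_RE = re.compile(r"\b(?:\d[ -]?){13,16}\b")

def luhn_valid(digits: str) -> bool:
    """Standard Luhn checksum: doubles every second digit from the right."""
    total, parity = 0, len(digits) % 2
    for i, ch in enumerate(digits):
        d = int(ch)
        if i % 2 == parity:
            d *= 2
            if d > 9:
                d -= 9
        total += d
    return total % 10 == 0

def find_card_numbers(text: str):
    hits = []
    for m in CARD_RE.finditer(text):
        digits = re.sub(r"[ -]", "", m.group())
        if 13 <= len(digits) <= 16 and luhn_valid(digits):
            hits.append(digits)
    return hits

print(find_card_numbers("order ref 4111 1111 1111 1111 confirmed"))
```

Real DLP engines layer many such detectors (identifiers, keywords, document fingerprints) and weigh surrounding context, which is exactly why policy-library accuracy matters so much to total cost of ownership.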
AM:
I think the key is content awareness, as Katie says. This brings a whole new dimension to the ways we’re able to secure information. AK: WHAT REALITIES ARE DRIVING THE ADOPTION OF DLP SOLUTIONS?
AM:
I think we’re seeing an unprecedented shift in the way we operate businesses, both around the world and obviously here in Europe as well. Just think about, for example, the number of different identities that now operate in your organization. We don’t just have employees any more—we have contractors, temporary employees of various descriptions, suppliers accessing our data, and we have our customers increasingly self-servicing their requirements. If you think about the amount of information we’re creating now, it’s being generated at an unprecedented rate. Then we’re accessing that information across an increasingly complex infrastructure. It’s also about mobility and the way in which we’re virtualizing and looking at cloud computing. All of this creates a very complex challenge for securing information.
When we think about the risks that data is being exposed to as it moves between these various people and types of infrastructure, we need to take a completely different approach to how we secure information.
KCM:
The other thing I would add is that there are two main reasons now that customers typically deploy a DLP solution. The first is a need to comply with relevant regulations, for example around protecting credit card data, or perhaps relevant regulations either EMEA-wide or in particular countries around securing the personally identifiable information of clients. The other big driver is securing intellectual property, and that’s any information that might be sensitive to the business, from road map information to customer contact information. There was a case of this recently in the UK where a Telco suffered a large data breach from employees obtaining information about the mobile contracts of their clients to sell back to the competition.
AM:
That’s just one of many examples we’re seeing. I was reading a story that’s just come out on the wire where 100,000 German credit cards have been recalled following a suspected security breach at a Spanish payment processor. We’re operating across multiple boundaries and these breaches, while they potentially put individuals at risk, are also incredibly damaging for the organizations involved. When we think about the complexity of the infrastructure we’re now trying to protect, it’s clear that traditional security controls just aren’t able to meet the challenge. AK: I COMPLETELY AGREE, IT IS BECOMING MORE COMPLEX ACROSS THE BOARD, PARTICULARLY WHEN YOU’RE TALKING ABOUT THINGS LIKE PCI AND MIGRATING INTO VIRTUALIZATION. SO WHAT SHOULD CUSTOMERS TAKE INTO ACCOUNT WHEN SEARCHING FOR DATA LOSS PREVENTION SOLUTIONS TODAY?
KCM:
One key consideration is to evaluate DLP solutions that can protect information holistically across the infrastructure. It’s not only those solutions that can find sensitive data on the end point, but also solutions that can address the requirements to protect sensitive information travelling via the network, and also looking at finding and
protecting sensitive information stored in the data centre, whether that is your databases, SharePoint sites, your file servers and so on. This is important so you can protect your information across the infrastructure with one way to define a policy, and one way to protect the information. Another key consideration of any particular solution is to look at the investment that has been made around developing the policy library to define what information is considered sensitive. You also want to look at the accuracy of the policies offered by the vendor. The accuracy of a solution is very important because it has a lot to do with whether you can prevent sensitive data from leaking out of the organization, but more importantly it has a major TCO element to it. If you have a lot of false positives, that means you need more security analysts to sift through those incidents that are not legitimate. The other key thing to look at is how well the solution integrates with other security products in your environment. For example, how does the solution integrate with whatever tool you might be using for Security Information and Event Management?
AM:
I’d like to add another level here. As you move around Europe we’ve got the implementation of the various European directives, like the European Privacy Directive, for example, and they’re all slightly different across the markets in Europe. I think it’s important when you’re choosing a technology that you’re also choosing a long term partner who has the critical mass and the ability to continue to develop that content library at the speed with which regulation and best practices evolve in your particular market. I’d also re-emphasise the point that you made, Katie. The great thing about DLP is it adds content awareness to your overall information security strategy, and this system level view of your overall information security infrastructure and the degree to which the DLP technology enables the rest of your policy and strategy, becomes incredibly important. Selecting not just best in class technology but also best in class in terms of integrating into a broader system becomes really critical as well.
AK: YOU’RE BOTH ABSOLUTELY RIGHT, AND IT’S ALSO ABOUT BRINGING IT BACK TO A MORE BASIC LEVEL AND UNDERSTANDING HOW YOUR INFORMATION WORKS, WHERE IT GOES, HOW IT’S USED AND SO ON. THOSE CONSIDERATIONS ARE ALSO VITAL IN CONSIDERING A DLP SOLUTION. SO, DOES A DLP SOLUTION REALLY HELP A CEO TO SLEEP BETTER AT NIGHT? AREN’T THEY MORE CONCERNED ABOUT CONTROLLING THEIR SENSITIVE DATA THROUGH ENCRYPTION, ACCESS CONTROL AND DATA CONTROL STRATEGIES VERSUS ADOPTING DLP?
KCM:
I think that control strategies are very important in protecting sensitive data but in and of themselves are probably insufficient. When we work with clients who have deployed DLP we find that the reason sensitive information is often put at risk is not due to lack of controls, but rather issues around broken business processes within an organization. This is often related to issues around the education of the end user of the company’s security policies. A good case in point here is one of our customers who found that every day at around 5pm the employees were sending out sensitive information from work to their hotmail and Gmail accounts. After looking into it further, it turned out they were doing this because they wanted to continue to work from home, and were doing so because they
“... find the champions in your organization that really care about protecting sensitive data and who are going to be the advocates for deploying a data loss prevention solution.”
found their corporate VPN too cumbersome to use. So DLP will help you uncover broken business practices and educate the end user about the important role they need to play in terms of securing sensitive information. If you simply block people’s activities you might not get to the root cause of why people are behaving the way they are.
AM:
Equally, if you started to block some of these activities that are happening in a blanket ban type of way you’re also potentially impacting your business. What Katie’s example just described was people working from home and actually adding value to the business—do you really want to stop that, or do you first want to understand the nature of the information that is potentially being put at risk by exposing it on a public email system? Often we talk about moving from a scenario where it’s about bad people trying to do bad things, to one where the primary risk is good people doing dumb things. I think DLP is a fantastic enabling technology to address many of those kinds of risks. They’re either doing things because the systems and processes don’t allow them to work any other way, or they’re fundamentally badly educated in terms of security risk. So DLP, in its ability to provide security controls around unstructured data and data which is living outside of the normal databases and normal controls, is incredibly powerful in that context.
AK: THAT’S A GREAT EXAMPLE—IT’S NOT NECESSARILY THE MALICIOUS SHARING OF INFORMATION, IT’S PEOPLE WHO ACTUALLY WANT TO WORK FROM HOME AND ARE UNAWARE OF ANY OTHER WAY TO DO IT. I THINK IT’S VITAL TO UNDERSTAND EXACTLY HOW YOUR INFORMATION IS BEING USED AND WHAT THE MOTIVATION IS BEHIND PEOPLE DOING THESE SORTS OF THINGS. LET’S TAKE IT TO ANOTHER LEVEL. MANY CUSTOMERS ARE WORRIED THAT THEY MIGHT RUN INTO PROBLEMS DEPLOYING DLP DUE TO CONCERNS AROUND EMPLOYEE PRIVACY, SO HOW CAN THIS CHALLENGE BE ADDRESSED?
KCM:
We run into this issue quite a lot. It comes up globally, but I would say it especially comes up in EMEA, where the laws protecting the rights of employees are quite strong. We recently addressed this issue at the RSA conference, where we convened a panel with Stewart Room from Field Fisher Waterhouse, who’s an expert in IT and privacy, as well as one of our customers in Europe from Scandinavian Airlines, and we discussed this issue at length. What we found during the panel discussion is that it’s very possible to proceed with a DLP implementation, even in organizations where employee privacy is a big concern. The keys to success are threefold. First, your company has to have an established, enterprise-wide network or systems usage policy. If that policy is in place and it clearly states that emails may need to be monitored, you’re going to be able to move forward with your DLP project with confidence. Second, it’s important to ensure transparency so employees are aware of the activity and that a DLP solution is in place. The third point is an organizational one. The security team have to work with the relevant stakeholders from the human resources department, legal and the employee union to ensure there is project buy-in. If you follow these best practices it is possible to implement DLP successfully even in organizations where employee privacy is an issue.
“...evaluate DLP solutions that can protect information holistically across the infrastructure.”
AK: PERHAPS WE SHOULD GIVE SOME ADVICE TO OUR AUDIENCE ABOUT BEGINNING THEIR DLP INITIATIVE. WHERE IN THE INFRASTRUCTURE IS THE BEST PLACE TO START A DLP PROJECT? MANY OF OUR READERS ARE VERY CONCERNED ABOUT SECURITY ON THE END POINT, FOR EXAMPLE, SO IS THIS THE BEST PLACE TO START?
DLP Assessment Tool
For customers who want to understand their risk profile for securing sensitive data
Together with the research firm Aberdeen Group, RSA is offering a complimentary web-based tool to help enterprise companies assess their current capabilities to safeguard sensitive data. Based on over 22 years of customer research conducted by the Aberdeen Group, this tool benchmarks customers’ data protection practices relative to their peers and also quantifies the cost savings associated with investing in DLP, based on the number of security/audit incidents and the cost per incident. This online survey assessment takes approximately 10 minutes to complete, and delivers a customized report to help you understand your organization’s current capabilities for safeguarding critical data. Begin the complimentary data loss prevention assessment at http://rsadlpassessment.aberdeen.com
KCM:
We would recommend starting with an implementation either within the network or looking at data at rest, because you’re going to see a more effective return on your DLP investment. The network is a good place to start because it’s a lot more straightforward to implement. After deploying the project you’re going to have a good sense of what sensitive information is moving through your email or being leaked out over the network. The value of doing a data loss prevention project on your data at rest or data centre is that it really helps you get to a more proactive stance. We often find that customers don’t even know where their sensitive information is, especially in unstructured repositories like SharePoint sites and file servers. So if you do have sensitive information on your end point then it came from somewhere else, and you really want to find out where that information is and what steps you can take to protect it. After getting to a roll out of DLP in your network and looking at your data at rest, the next step is to tackle the end point. If you start out on the end point and there are challenges with the project, you end up blocking people from doing their work and can raise a lot of other issues. You might end up having problems with your DLP project before you can get it off the ground. The other thing that we think is really
important, along with looking at where within the infrastructure to start a DLP project, is what steps to take. We recommend starting first by monitoring what’s happening in your organization as it relates to sensitive information, moving from monitoring to then auditing what is occurring and looking to address issues around broken business processes or end user education. The final step is actually blocking people from sending data to thumb drives, blocking email communications and so on. Following that three-step process, versus jumping straight to blocking, will enable more successful projects.
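The monitor, then audit, then block progression Curtin-Mestre recommends maps naturally onto an enforcement-mode setting in a DLP policy. The sketch below is schematic, with all names invented for illustration:

```python
from enum import Enum

class Mode(Enum):
    MONITOR = 1   # record incidents only
    AUDIT = 2     # record and notify, to drive user education
    BLOCK = 3     # actively prevent the transfer

def handle_incident(incident: dict, mode: Mode) -> list:
    """Return the actions taken for one detected policy violation.

    `incident` would carry channel, user and content details in a real
    system; here it is only a placeholder.
    """
    actions = ["log"]
    if mode in (Mode.AUDIT, Mode.BLOCK):
        actions.append("notify")
    if mode is Mode.BLOCK:
        actions.append("block")
    return actions

print(handle_incident({"channel": "email"}, Mode.MONITOR))  # ['log']
print(handle_incident({"channel": "email"}, Mode.BLOCK))    # ['log', 'notify', 'block']
```

Starting every policy in MONITOR and promoting it only after the incident stream looks clean is one way to avoid blocking legitimate work before the broken business processes have been understood.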
AM:
Let me underline that with an example and reinforce Katie's point. In the last week or so, I've been reading about my local council here in the UK, which has had three laptops stolen, one of which contained 14,000 records of people registered for postal voting. You have to ask yourself: how did 14,000 records end up on a laptop? With the right DLP controls in the network, those records might never have ended up on a laptop at all; they might have stayed safe and secure back in the infrastructure, on a server and under tighter control. I think that's a good example of where an end point strategy would have been the secondary line of defense as opposed to the primary line of defense. The stolen laptop would then have been far less of an issue had that information not resided on it.

AK: THAT'S A FANTASTIC EXAMPLE. WHAT ADVICE CAN YOU GIVE CUSTOMERS WHO ARE LOOKING TO BUILD A BUSINESS CASE FOR DATA LOSS PREVENTION IN THEIR ORGANIZATION?
KCM:
One of the most important things is to find the champions in your organization who really care about protecting sensitive data and who are going to be the advocates for deploying a data loss prevention solution. For example, if your project is more compliance-oriented, you want to have the legal team and your compliance organization behind you and in favour of moving forward with the DLP project. If your concern is more around protecting intellectual property, such as road map information or data, then the relevant business unit owners or people within a functional organization are going to be advocates of
deploying DLP because they have a vested interest in protecting their intellectual property. The other thing to look at is quantifying the value of deploying DLP within an organization. That centers on the number of security incidents relating to sensitive data and the cost per incident. We have found that companies that deploy DLP not only reduce the number of security incidents, but also reduce the cost of addressing an incident over time, and that shows how you can quantify the benefits of deploying DLP in an organization.
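The quantification Katie describes boils down to a simple product: incidents per year times cost per incident, before and after deployment. The figures below are invented purely to show the shape of the calculation.

```python
# Illustrative back-of-envelope DLP business case. All figures are made up
# for the example; substitute your own incident counts and per-incident costs.

def annual_incident_cost(incidents_per_year, cost_per_incident):
    return incidents_per_year * cost_per_incident

# Hypothetical: DLP reduces both the number of incidents and the cost
# of handling each one.
before = annual_incident_cost(incidents_per_year=40, cost_per_incident=25_000)
after = annual_incident_cost(incidents_per_year=15, cost_per_incident=18_000)

print(f"before DLP: ${before:,}")             # before DLP: $1,000,000
print(f"after DLP: ${after:,}")               # after DLP: $270,000
print(f"annual saving: ${before - after:,}")  # annual saving: $730,000
```

Both levers matter: in this invented example, most of the saving comes from fewer incidents, but the falling cost per incident compounds it.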
“When we think about the complexity of the infrastructure we’re now trying to protect, it’s clear that traditional security controls just aren’t able to meet the challenge.”
AM:
I’d say number one is to not try and boil the ocean. In any successful information security strategy you’re going to take an information-centric and risk-based approach to defining where your priorities really lie. Figure out what the information is that you really care about and what the real risks are, and that will give you a really good starting point in figuring out who the key stakeholders are and how to move the project forward in a practical and meaningful way, as well as giving you the best return on investment in the shortest possible time.
Andrew Moloney DIRECTOR OF MARKETING, EMEA RSA Andrew is responsible for ensuring the communication of RSA’s strategy and approach for identifying, assessing and mitigating information risk. Previously responsible for RSA’s business in the EMEA Financial Services sector, he brings a wealth of experience and knowledge to the topic. He has wide experience in the industry, having worked for both established and start-up communications, software and mobile vendors.
The other piece that I’d add is something we talked about earlier. This is the integration with other parts of your information security infrastructure—specifically a Security, Information and Event Management platform. If you think about the additional power that might be gained from making that platform content-aware, so that you can now see a security event and can understand the kind of information that is being put at risk, then that gives you a very powerful platform to build out the business case. It also helps management and the decision makers in your organization understand why you’re prioritizing the various projects and stages of a project.
Katie Curtin-Mestre DIRECTOR OF PRODUCT MARKETING RSA
Katie Curtin-Mestre is the Director of Product Marketing at RSA, The Security Division of EMC. In this role, she is responsible for market strategy development, go-to-market planning and execution, and business analytics. Prior to RSA, Katie held a variety of product marketing and product management roles at EMC in the storage resource management, storage virtualization, replication and storage hardware market segments.
ITSM— payback time? Responding to change quickly and effectively while keeping an eye on the bottom line is proving to be problematic for a lot of organizations. CLIVE LONGBOTTOM (QUOCIRCA) tells us why ITSM is great for business.
ANALYST FEATURE ■ IT SERVICE MANAGEMENT
In many organizations, more money is spent keeping the IT infrastructure running than is invested in new IT capabilities to support the business. And yet the vast majority of servers in use are running at less than 10% utilization. Desktop machines tend to be configured at different patch levels, or are even running a range of operating system versions. Data is duplicated many times over, and storage utilization rates only hit around 30% before someone decides that more capacity should be brought in.

Strangely enough, all of this seems to be hidden from the business, because the perception among IT people is (more than likely correctly) that if the truth were known, a few heads would have to roll. Any business unit that allows inefficiencies of 70-90% to continue needs root-and-branch reform, particularly with the current pressure on spending.

The biggest issue is not that such inefficiencies exist; it's that they do not need to. IT systems have grown with little control; the mindset has been "one application per hardware server". As a new department or division asks for something, a new application is provisioned on a new server. Each decision in itself probably made sense at the time, but the sum total of all those decisions is that many organizations now have IT systems that are out of control, with little real knowledge of what is there and what it is doing.
MAKE ITSM WORK FOR YOU

IT Service Management (ITSM) is one way for IT departments to gain control over the IT infrastructure, and to do it in a manner that can make IT far more capable of doing its job: supporting and facilitating the business. Quocirca research (www.quocirca.com/pages/analysis/reports/view/store250/item20971) shows that few organizations know how many IT assets they have across their environment. Without such information very little control is possible, and yet asset inventory management tools have been around for some time. Once a believable and trustworthy knowledge of the IT inventory has been built up, including mobile and other occasionally connected devices such as home workers' PCs, ITSM can start to add real value.

On each computing device there is a "stack": the software that makes the whole thing do what it is meant to. This starts down at the firmware level of the basic input/output system (BIOS), then possibly a hypervisor (if virtualization is being used), an operating system, possibly an application server platform, and then the application or service required to carry out a given function or series of functions. The trouble is that each link in this chain can (and does) change. There will also be dependencies up and down the chain; a change at the application level may well depend on device drivers held at the operating system layer. Therefore, manual updates across a large IT inventory can lead to multiple failures, requiring the rolling back of the change to a known position and then manual updates to the dependent underpinnings. This is expensive and prone to error, and many machines will end up in a hybrid or unknown state, with the manual changes carried out making each machine different
“ITSM... should be seen as a core investment, as a necessity, as a competitive advantage.”
and therefore even more prone to problems and expensive to manage.

ITSM comes to the rescue: it drills into each machine and builds up a complete picture of what is on that machine, covering all the hardware, the firmware and the software, along with the revision and patch level of each item. Then, when a change needs to be made, ITSM can rapidly identify all the devices that can be automatically upgraded. For those that cannot be upgraded automatically due to an underlying problem, ITSM can advise on what needs to be done. If it is something as simple as a new device driver, it can automatically provision that and then do the upgrade. For something more fundamental (e.g. an old machine that fails to meet the hardware requirements for the upgrade), the machine can be left at its existing level while an exception is raised to IT service personnel to identify that this had to be done, and to give the reason why. Even in cases such as these, many ITSM systems can automate the ordering of a new device that meets the requirements for the upgrade. Once the new kit has been delivered and installed in the data centre or at the user's desk, ITSM can ensure that the new device is brought up to date by checking its configuration and automating any updates as required.

This then brings in other areas of ITSM, for example the help desk. Cutting out manual tasks means more cost savings, and in many cases efficiencies are introduced that can really start to provide payback. For example, providing web-based self-service to users for areas such as resetting passwords means that help desk staff are freed for other, more complex or challenging activities. Enabling web-based ordering of IT and office equipment can ensure that employees order from authorized suppliers, maximizing volumes and so optimizing discount levels.
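The upgrade triage described above (upgrade automatically, fix a prerequisite first, or raise an exception for a human) can be sketched in a few lines. The data model and thresholds here are entirely hypothetical; real ITSM tools track far richer inventories.

```python
# A minimal sketch of ITSM-style upgrade triage: for each device in the
# inventory, check the patch's prerequisites and decide what to do.
# Field names and rules are illustrative only.

def triage(device, patch):
    """Return the action an ITSM tool might take for one device."""
    if device["ram_gb"] < patch["min_ram_gb"]:
        # hardware cannot support the upgrade: leave as-is and flag it
        return ("raise_exception", "fails hardware requirements")
    if device["driver_version"] < patch["min_driver_version"]:
        # fixable prerequisite: provision the driver, then upgrade
        return ("provision_driver_then_upgrade", None)
    return ("upgrade", None)

patch = {"min_ram_gb": 4, "min_driver_version": 7}
fleet = [
    {"name": "desk-001", "ram_gb": 8, "driver_version": 9},
    {"name": "desk-002", "ram_gb": 8, "driver_version": 5},
    {"name": "desk-003", "ram_gb": 2, "driver_version": 9},
]
for d in fleet:
    print(d["name"], triage(d, patch)[0])
# desk-001 upgrade
# desk-002 provision_driver_then_upgrade
# desk-003 raise_exception
```

The value is in applying the same deterministic check to thousands of machines at once, rather than having an engineer reason about each one by hand.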
Self-service portals can enable users to request specific software that can then be provisioned to them automatically, and cross-charged accordingly. ITSM tools can track software licences as part of a machine’s inventory, ensuring that an organization is neither under- nor over-licensed. Many ITSM systems automate the movement of licences from one machine to another which is very cost-effective where concurrent licensing is allowed.
“IT systems have grown with little control...”
BUILDING COMPETITIVE ADVANTAGE

So, the main benefit of ITSM is automation: cutting out much of the expensive and error-prone human activity from IT management processes. However, the majority of ITSM systems acknowledge that many organizations will not want everything fully automated from the get-go. Therefore, the majority of processes will still include the kicking off of events that involve human interaction, even if it is only to validate and agree a course of action that ITSM would do automatically anyway. Quocirca has found that most organizations require a high degree of human intervention to
start with, but will increasingly need less as their faith in ITSM grows. If ITSM can be used to gain sufficient control over an IT environment to change the operational expense/investment spend ratio by just a few percentage points, it can make a major difference to most organizations. The worst organizations are running at around an 80/20 ratio; if this can be moved to just 70/30, an organization will be investing 50% more in supporting the business. If the utilization of server hardware can be increased from 10% to 20%, then half the amount of hardware will be required, with the concomitant savings in power that go with this. If 20% fewer calls come through to the help desk, you gain 20% more time from your
help desk staff that can be spent on business issues. None of the things previously mentioned are particularly stretch targets, yet just imagine how the business would view such savings. ITSM should no longer be seen by an organization as a cost or as a "nice to have": it should be seen as a core investment, a necessity, a competitive advantage. If not, the risk is that everyone else will adopt full ITSM, and the lack of it then becomes a definite competitive disadvantage. ITSM not only brings control, it also brings flexibility. Without ITSM an organization will always be fire fighting, unable to respond to change as rapidly as the business requires.
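The arithmetic behind the claims above is worth making explicit; the figures come straight from the article, and only the variable names are added for clarity.

```python
# Worked arithmetic for the ITSM savings cited in the article.

# Moving the run/invest spend ratio from 80/20 to 70/30:
invest_before, invest_after = 20, 30
uplift = (invest_after - invest_before) / invest_before
print(f"extra business investment: {uplift:.0%}")  # extra business investment: 50%

# Doubling server utilization from 10% to 20% halves the hardware needed
# for the same workload:
workload = 100  # arbitrary units of work
servers_before = workload / 0.10
servers_after = workload / 0.20
print(servers_after / servers_before)  # 0.5
```

The 50% figure can look surprising at first glance for a 10-point shift, but the uplift is measured against the original 20% investment share, not against total spend.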
Clive Longbottom
SERVICE DIRECTOR, BUSINESS PROCESS FACILITATION, Quocirca
In his position Clive covers the need for companies to understand the core processes in their value chains, and the technologies that should be utilized to facilitate these processes in the most flexible and effective manner. In his remit, Clive covers collaborative tools, workflow, business process discovery and management tools, service-based architectures and outsourcing, as well as other associated areas such as security, voice/data convergence, and IT asset optimization.
CHANGE MANAGEMENT ■ HEAD TO HEAD
The season for change

The management of IT change can make or break effective IT service management. SCOTT CRAWFORD (ENTERPRISE MANAGEMENT ASSOCIATES) moderates a discussion with GEORGE GERCHOW and JOHN MURNANE (EMC), who say that change management is central to assuring the quality of IT.
SC: CHANGE MANAGEMENT IS SOMETHING THAT WE AT EMA RECOGNIZE AS BEING CENTRAL TO A NUMBER OF VITAL IT INTERESTS. IT CAN CERTAINLY BE A KEY ENABLER OF PERFORMANCE, AVAILABILITY AND SERVICE LEVEL DELIVERY, WHILE PROBLEM- AND INCIDENT-CAUSING CHANGE CAN BE AT THE HEART OF SOME OF IT'S GREATEST PROBLEMS. EMC HAS IDENTIFIED FOUR CRITICAL SUCCESS FACTORS FOR CHANGE MANAGEMENT. CAN YOU DESCRIBE EMC'S TAKE ON THIS, GEORGE?
GG:
I think it starts off with visibility. A lot of organizations don't have clear visibility of what changes are taking place, especially if they go outside of the process. We've started to get to the point where people are using workflow or change management processes, but what's really important is having that visibility across the IT infrastructure. Another factor is the impact: what was the effect of the change that actually took place? That leads to accountability. If I have visibility into the changes, now I have accountability as well. Let's say you're supposed to patch 1,000 systems, or update the IOS on 100 different routers. Did you really do that or not? How many people go in and verify that those changes took place? So we see that visibility also leads to accountability. The other side of that is measurement. Now that I have visibility and accountability, I can start measuring my change management process.

SC: FROM EMA'S PERSPECTIVE THE THINGS THAT ARE MOST CRITICAL ARE DEFINING YOUR CHANGE MANAGEMENT PROCESSES AND IMPLEMENTING THEM. NEXT IS MONITORING THE ENVIRONMENT, OR CAPTURING THAT VISIBILITY, BUT MEASUREMENT IS ALSO IMPORTANT TO MONITOR THE SUCCESS AND EFFECTIVENESS OF YOUR PROGRAM. IT SOUNDS LIKE THAT'S BEEN THE EXPERIENCE OF YOUR CUSTOMERS AS WELL, JOHN.

JM:
I think you've both referred to a couple of critical things here. One is knowing what to measure. That sounds very self-evident, but in effect it defeats people before they even get out of the gate. It's very closely related to something you referred to, Scott, which is that people design processes but often fail to execute them. Because you're not executing them, you don't really have anything to measure, and when you design these processes you have to consider those two key factors: how am I going to execute these processes? Can I execute them? Are these practical, tangible things that I can drive into my organization? If they are, that will be very closely aligned to what you can measure. I know this sounds slightly vague, but the most common issue we see is people who have engineered metrics for their performance that can't be measured. They pull together lots of data from lots of different silos every month and just bring it together. This is a major issue in a number of businesses, both internally and in relations with your outsourcers and so on. I know it's slightly off topic, but we have people in the business world doing a lot of interaction with third parties, and in a lot of cases they've outsourced the management of critical bits of their infrastructure. What they've struggled to do is drive a change process that incorporates all those players. There are different drivers and different demands, but the trick is to work out exactly what it is you need to measure.

SC: THAT'S ONE OF THE THINGS WE'VE SEEN AS BEING A CRITICAL SUCCESS FACTOR, CERTAINLY IN AREAS SUCH AS IT GOVERNANCE AND RISK CONTROL. HAVING SUPPORT FROM SENIOR MANAGEMENT IS ABSOLUTELY ESSENTIAL BECAUSE WITHOUT IT YOU ARE HANDICAPPING YOUR ABILITY TO SEE THROUGH A MATURE APPROACH TO CHANGE MANAGEMENT.
GG:
From our standpoint change management is the absolute key, because so much comes back to change: my configuration would be the same if things didn't change, my security policies would be the same if things didn't change, and my services would not go down if things did not change. If organizations can get a good, solid change management process that has visibility and accountability, then they'll be ahead of the game. Let's also not lose sight of governance. As a security person, or as a line of business, if I can have visibility into the changes being made to the infrastructure that may impact our parts of the organization, then that's key.

SC: OBVIOUSLY ONE OF MY PRIMARY INTERESTS IN CHANGE CONTROL IS ITS IMPACT ON SECURITY. GEORGE, HOW DO CHANGE MANAGEMENT AND SECURITY WORK TOGETHER, AND WHAT DOES A MATURE APPROACH HAVE TO DO WITH AN EFFECTIVE RESPONSE TO A SECURITY ISSUE?

GG:
Well, there are a couple of different ways to look at it. The first is that not enough security people have a role within change control or on a change advisory board (CAB). Change directed by the business or by IT can affect the security and compliance posture, especially when it comes to PCI. The other side of it is that when a security person wants to push a change out, how is that going to impact the business? We have to be able to do business first and we have to be able to serve our customers. Missing those things really makes the change management process ineffective: you're shooting changes out to the environment without understanding what the cause and effect is going to be on the business.

SC: ONE OF THE THINGS I HEAR ABOUT FAIRLY OFTEN IS THE QUESTION OF HOW TO BALANCE A WELL-DEFINED APPROACH TO CHANGE CONTROL WITH THE NEED TO RESPOND TO AN EMERGING HIGH-PRIORITY VULNERABILITY OR THREAT, PARTICULARLY WHEN AN ACTIVE EXPLOIT IS CIRCULATING IN THE WILD. HOW DO HIGH-PERFORMING ORGANIZATIONS DEAL WITH THIS SITUATION WHEN AN EMERGENCY CHANGE CANNOT WAIT THROUGH THE SOMETIMES VERY DELIBERATE PROCESSES A CONFIGURATION CONTROL BOARD MAY REQUIRE FOR LESS URGENT CHANGES?

GG:
You need a separate process for an emergency change. For example, I was the worst possible person to work with in IT when it came to emergency changes, because I was fighting fires all the time. It's an emergency change, it's late in the change management process, and I had to get it through. The issue is that there's not enough time and visibility to understand the impact of an emergency change, what the cause and effect is going to be, and whether everything gets audited and logged. So there has to be a separate process that builds in accountability for emergency changes and for how often they're being made.

JM:
I agree, George. I think emergency changes often cause people the most trouble because you actually do your change management retrospectively. You don't have time to get the CAB together and drive it through all the forms, so you make the change, but what's critical is that you retrospectively understand why. What you do then, coming back to this cycle of visibility, is measure the emergency changes you're making more critically. For example, say I have a series of emergency security changes. Is it because I've got a flaw in my security? So we see that the essence of change management is about stabilizing your business.

SC: THAT'S A VERY GOOD POINT AND I THINK A LOT OF ORGANIZATIONS ARE SEEING THIS AS REFLECTING THE MORE MATURE ORGANIZATION. CAN YOU TALK ABOUT THE IMPACT CHANGE CONTROL HAS AND THE BENEFITS OF IMPROVING STABILITY AND RESOLVING SOME OF THESE ROOT CAUSE ISSUES THAT LEAVE BUSINESSES IN FIRE FIGHTING MODE SO MUCH?

JM:
It continues to fascinate me how many businesses still need to address that question. The stability of your IT infrastructure is now a critical metric of the performance of your business, directly connected to your ability to compete and to deliver revenue and your goods and services. In change management it's dangerous to elevate one process over another, but I believe the benefits of change management are so demonstrable in terms of gaining control of that infrastructure to drive the profitability of the business that it's rapidly reaching the point where it's an essential function.
SC: THIS SPEAKS TO ONE OF THE METRICS FROM OUR 2008 STUDY AT EMA THAT WAS REALLY INTERESTING TO US: WHEN WE SURVEYED ORGANIZATIONS IN TERMS OF THE GUIDANCE IT ORGANIZATIONS ADOPT MOST, ITIL CAME OUT ON TOP AS THE ONLY ONE ADOPTED BY A MAJORITY, AT 55%. NOR WAS THE NEXT MOST COMMONLY ADOPTED GUIDANCE ANY OF THE COMPLIANCE- OR GOVERNANCE-RELATED STANDARDS OR FRAMEWORKS SPECIFIC TO IT—IT WAS QUALITY MANAGEMENT. TALK TO US A LITTLE BIT ABOUT BRINGING BUSINESS QUALITY, IT AND CHANGE MANAGEMENT TOGETHER.
JM:
Quality practice standards are very much driven by constant examination of the business to understand what changes you can make to continually improve the quality of what you're doing. In many ways they predate ITIL and IT management frameworks, and I believe they're trying to achieve the same goal. ITIL is about how you manage your IT functions effectively and ensure quality initiatives; specifically, how you run your business to induce best practice. They set out from a very similar place and, as a result, organizations that adopt ITIL and already have Six Sigma or similar best-practice frameworks in place tend to be very successful, because they understand continual service improvement and examination. You design a process, and within that process you have an actual methodology to continually improve and guide it, and I believe they're very similar in intent.
GG:
I’m in complete agreement. I think that Six Sigma approaches and continual service improvement lie really well with ITIL. People pick up ITIL first for their organization to get IT running, but then they need to start putting measurement in place and then improvement which takes us to the fourth step. If I’m going to improve my overall infrastructure, I have to start finding deficiencies to create projects to get better when it comes to my quality of service delivery from IT to the business. As an example, one of the best questions I’ve ever heard in my life
about improvement came from a managed service provider who said they were moving to a virtual infrastructure, but wanted to know whether the virtual machines were more efficient at delivering IT services to the business. Do they have more incidents, more problems, and more staff members working on those virtual machines than on my physical machines? Until I can answer those questions I cannot fully say that the virtualization move was a good one for my organization. The only way to start doing that is to put analytics and metrics across the virtual and physical environments and ask which one carries the greatest cost.

SC: THAT'S A GREAT POINT, BECAUSE VIRTUALIZATION IS HAVING SUCH AN IMPACT ON THE BUSINESS, AND THIS IS AN AREA THAT IS CENTRAL TO ASSURING AN EFFECTIVE VIRTUALIZATION STRATEGY. HAVING VISIBILITY INTO THE CONFIGURATION OF THE INDIVIDUAL VIRTUAL MACHINES AND COMPLICATIONS WITH THINGS LIKE NETWORKING AND COMMUNICATIONS WILL INTRODUCE A NUMBER OF NEW PARAMETERS INTO MANAGING THE ENVIRONMENT. THAT COMES BACK TO OUR DISCUSSION OF METRICS. GEORGE, YOU'VE MENTIONED A FEW, BUT WHAT ARE SOME OF THE METRICS INVOLVED IN MEASURING AND MANAGING CHANGE IN THE VIRTUAL ENVIRONMENT THAT GO BEYOND THE EXAMPLES YOU'VE ALREADY SHARED?
GG:
One of the first ones is the rate of constant change within the virtual environment itself. We talk about virtual sprawl all the time—are my machines moving around the different types of environments and what’s the cause and effect of that to the business? Another one is how is the virtual environment affecting my PCI or compliance posture? Because of the complex virtual environment, we find that it’s harder for people to stay compliant because they don’t have that visibility. Here is where security is so important. If security starts getting involved in the change management process and sitting on that CAB, now they have visibility across not only the virtual and physical infrastructure, but across the different silos of IT. This gives them true visibility into what’s happening within IT operations and lines of business.
SC: PLUS THE OPPORTUNITY THAT MAKES THE CASE FOR INTEGRATING SECURITY MORE EFFECTIVELY INTO THE VIRTUALIZED ENVIRONMENT.
JM:
Absolutely, there’s no question in my mind that virtualization creates a whole raft of additional security considerations. The decision to move into that virtualized space should directly involve the security function and CAB. Virtualization is a great example for ITSM. You get what happens in IT because it’s such an immature function which is proven by the fact that the underlying technologies change so rapidly. Virtualization goes from having a one-to-one relationship between a physical server and a host to having a one-to-many. That creates, by default, massive security issues— never mind the management issues. In turn you’ve got compliance, governance and so on, and what happens in IT is that people rush in to stop new technologies because the reasons to do so are compelling, and virtualization is a great example of that. SC: YOU ALLUDE TO THE POTENTIAL OF THINGS LIKE PREVENTATIVE CHANGE CONTROL WHICH, FROM A SECURITY PERSPECTIVE, SOMETIMES GETS GROUPED UNDER THE GENERAL CATEGORY OF WHITELISTING. GEORGE, PERHAPS YOU CAN TELL US ABOUT THE ADVANTAGES AND CHALLENGES OF A PREVENTATIVE APPROACH TO CHANGE CONTROL?
GG:
Sure, I think a preventative approach to change control is great, especially when it comes to a virtual environment. In virtual environments people are making any changes they want, and most of them aren't even going through the normal processes, so I think it's a great place to start, by forcing those changes through the same way I would in the physical environment. If people are defining service catalogues, then I have to account for what's virtual and what's not, and I don't see that a lot. I also don't see regulatory compliance or governance. Are these mandated machines? Are these mandated routers and switches? How does that fit into my service catalogue so I can truly understand the impact of those changes from a security and compliance perspective?

SC: WE'VE HAD SOME LONG TALKS WITH YOUR TEAM, GEORGE, ABOUT THE POTENTIAL FOR CHANGE
MANAGEMENT AND VIRTUALIZATION TO WORK TOGETHER AS A KEY ENABLER FOR A TOP-DOWN APPROACH TO IT GOVERNANCE. WHAT ARE SOME OF THE ASPECTS OF CHANGE METRICS THAT ARE DIRECTLY RELEVANT TO SECURITY? YOU ALLUDED TO PRIORITIZING VULNERABILITY REMEDIATION AND EMERGENCY PROCESSES—HOW DO ORGANIZATIONS DETERMINE AND MEASURE WHAT IS MOST IMPORTANT AS PART OF A MATURE CHANGE MANAGEMENT PROCESS THAT RECOGNIZES THE IMPACT OF SECURITY?
GG:
Here’s where the line of business shouldn’t control the entire change process. For example, when I was working for a financial organization they decided to buy software that required all of the back end sequel databases to have blank SA passwords. Come the day of implementation they had spent well over six figures, and then here we are rolling it out with blank SA passwords. How many audits do you think we would fail because of that, and how many vulnerabilities are opened up? Without having that visibility from a security standpoint it’s very difficult. Let’s take software updates. I’m doing a software update within the environment— what does that software really look like to the
Scott Crawford Moderator MANAGING RESEARCH DIRECTOR, Enterprise Management Associates (EMA) Scott Crawford, CISSP, CISM, heads the Security and Risk Management practice at Enterprise Management Associates (EMA). The former head of information security for the Comprehensive Nuclear-Test-Ban Treaty Organization’s International Data Centre in Vienna, Austria, Scott has been an IT professional in both the private and public sectors. A frequent speaker at industry conferences and events, Scott is also a Commercial-certified pilot and flight instructor. He can be reached at scrawford@enterprisemanagement.com.
environment? Let’s say that I have a database that I know is regulatory mandated by PCI, however the front end interface does not. And now anybody can access that data. SC: JOHN, ONE OF THE THINGS WE HEAR FROM ORGANIZATIONS, PARTICULARLY THE LESS MATURE IN TERMS OF ITSM, IS THAT THEY MAY BE ITIL SCEPTICS BECAUSE THEY HAVE NOT EXPERIENCED EFFECTIVE IMPLEMENTATION. HOW DO THEY IDENTIFY THE THINGS THAT REALLY DO DELIVER TANGIBLE BENEFIT WITH AN ITIL APPROACH TO CHANGE MANAGEMENT?
JM:
That’s a good question, but my answer isn’t necessarily what you’d hear a lot. ITIL is a management methodology for managing IT. It’s not prescriptive. What it’s really saying is that you need to manage your IT function the same way you manage any other part of your business. Because IT is immature, highly volatile and frankly very complex, you need to adopt the wisdom and knowledge of a community of people who have evolved this platform to get you further. Change management is like a balance sheet of finance. It’s where you start. The challenge of managing that function, back to ITIL, is to sequence what you’re doing and in what order. I don’t see that discipline can be anything but beneficial to the business. SC: DRAWING THAT PARALLEL WITH FINANCE IS A GREAT EXAMPLE, BUT WHAT SPECIFIC GUIDANCE WOULD YOU RECOMMEND TO THE ORGANIZATION THAT NEEDS TO IMPROVE THEIR CHANGE MANAGEMENT PROCESSES AND SECURITY MANAGEMENT THROUGH IMPROVED CHANGE CONTROL?
GG:
A great way to start implementing and improving it is getting everybody on the same page and having the right people on the CAB across different parts of the infrastructure. Organizations have so many different silos that nobody really talks and plays together. I think change management is a perfect place to start. If you get the right people on it, you have one common central repository where you can start seeing changes, and people can start discussing what the impact of those changes may be. I also think that security people need to start paying more attention to ITIL, ITSM and Six Sigma in general, because this gives them a vehicle to tie right into the business. SC: I COULDN’T AGREE MORE WITH THAT. IN SOME ORGANIZATIONS WE SEE SECURITY OPERATIONS THAT HAVE A “RED TEAM” AND A “BLUE TEAM” COMPONENT. FOR THE NON-SECURITY PROS, A “RED TEAM” IS BASICALLY THE VULNERABILITY ASSESSMENT AND PENETRATION TESTING FUNCTION, WHILE “BLUE TEAM” REFERS TO VULNERABILITY REMEDIATION AND WORKING WITH IT OPS TO IDENTIFY AND RESOLVE VULNERABILITIES. ANY SPECIFIC GUIDANCE, PARTICULARLY ON THE “RED TEAM” SIDE, FOR INCREASING AWARENESS OF AND SENSITIVITY TO CHANGE MANAGEMENT PROCESSES, AS WELL AS FOR IT OPERATIONS PROS TO BE MORE AWARE OF HOW TO WORK BETTER WITH VULNERABILITY ASSESSMENT AND REMEDIATION?
GG:
It’s got to be a combination of both process and tools, and this is something that drives process people crazy when I say it at conferences. The first place to start is dealing with the process. If I can start getting processes defined then I’m part of the way there, but I have to have the tools, again, to have the visibility, accountability, measurement, and then improvement across the organization. So I must start combining those two. Most of these tools need to start having interfaces and integration back into service desk and change management workflows.
JM:
It’s no longer possible to plan just processes or just tools, nor should you. My view is that when you move into complex areas like virtualization, you will never successfully manage those environments without being able to execute your processes across your combined tool environment. Security blankets the entire IT function and so does change management, and that’s why they need to sit side by side: they’re very closely related when you’re talking about the actual nuts and bolts of running an IT function. Once you start automating and controlling processes and using very sophisticated toolsets, you create a scenario where it becomes, in principle, easier to introduce a lot more security problems into the environment, because things that didn’t previously talk to each other now do. Security needs to get close to this planning phase and to that architecture, and at the end of the day that’s what change management is. It all breaks down into processes, ITIL and terminology, but ultimately we’re just talking about trying to manage your IT function more effectively.
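George’s earlier point about getting the right people on the CAB and keeping one central repository of changes can be sketched in a few lines of code. Everything below is illustrative: the class and field names are our own, not any particular tool’s API.

```python
from dataclasses import dataclass, field

# Hypothetical sketch of the CAB gate the panellists describe: every change
# is recorded in one central repository and cannot proceed until reviewers
# from each affected silo, including security, have signed off.

@dataclass
class ChangeRequest:
    summary: str
    affected_ci: str  # configuration item, e.g. the PCI-scoped database
    required_reviewers: set = field(
        default_factory=lambda: {"operations", "security"})
    approvals: set = field(default_factory=set)

    def approve(self, team: str) -> None:
        if team in self.required_reviewers:
            self.approvals.add(team)

    def is_authorized(self) -> bool:
        # The change may be scheduled only once every silo has reviewed it.
        return self.required_reviewers <= self.approvals

cr = ChangeRequest("Open firewall port for new front end", "pci-db-01")
cr.approve("operations")
assert not cr.is_authorized()   # security has not yet signed off
cr.approve("security")
assert cr.is_authorized()
```

The point of the gate is exactly the cross-silo conversation George calls for: no single team can push a change through on its own.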
George Gerchow DIRECTOR OF BUSINESS DEVELOPMENT AND CORPORATE STRATEGY, EMC George Gerchow brings 15 years of IT and systems management expertise to the application of IT processes and disciplines that impact the security, compliance and operational status of complex, heterogeneous computing environments. George’s practical experience and insight from managing the infrastructures of some of the world’s largest corporate and government institutions makes him a highly regarded speaker and invited panellist on topics including ITIL, configuration management and operational and security compliance.
John Murnane IT SERVICE MANAGEMENT SPECIALIST, EMC John Murnane is an ITSM Specialist with EMC Ionix, a division of EMC devoted to best practice IT and Service Management. John has over 15 years experience in IT infrastructure and processes, gained on both the client and vendor side. John’s extensive experience includes providing ITSM implementation and consultancy services to customers of all sizes across all industry sectors. With a strong understanding of current and future market demands, John frequently presents on a wide range of ITSM and industry topics.
EXECUTIVE PANEL ■ THE SERVICE DESK
Future focus
Current market forces are changing the way we see the future, and the roles of IT service management and the service desk are only becoming more important for the business. LISA ERICKSON-HARRIS (ENTERPRISE MANAGEMENT ASSOCIATES) moderates a discussion with three industry experts: CHRIS WILLIAMS (BMC), TIM ROCHTE (CA) and MATT FRENCH (SERVICE-NOW.COM).
LEH: LET’S START WITH TIM AT CA. ONE OF THE MARKET FORCES SHAPING THE SERVICE DESK OF THE FUTURE IS THE SERVICE CATALOGUE AND SERVICE REQUEST FUNCTIONALITY. HOW DOES CA SEE THIS DISCIPLINE AS CONNECTED TO THE SERVICE DESK, OR NOT, AND WHERE IS THE TECHNOLOGY HEADED?
TR:
We’re certainly seeing that same trend in the market. More and more IT departments are looking at how best to align with the business, and the shift to service catalogues and request management in ITIL is critical for that. We’re seeing it tracking along with the ITIL v3 adoption curve, which has been dramatically faster than v2’s, largely because it’s building on that earlier adoption in the first place. It starts to tie together the classic service desk best practices, in terms of making it a single point of contact and automating the front end, as well as providing clear communication between IT and customers. When you start looking at a service catalogue you start getting a unified service model. It gets stored together, all the elements are linked so that everyone can agree on what really matters, and you can bring both service support and delivery into one cohesive path. What seems to be the driving force is that IT has to start looking at things from a business perspective. They have to describe what they’re doing in business language, provide standardized packages and options that work well with their customers and the rest of the business, and ultimately provide links out to some form of accountability. This doesn’t always have to go as far as charge-back, although service catalogues certainly do start providing that infrastructure; often it’s just used for show-back, making it possible to have a conversation about the relative merits of a service compared to the alternatives. When you tie it in with request management you also get the advantage of making self-service work well, so that it’s clear to the end user what they’re asking for: > What are the constraints? > What is the opportunity? > What does it cost? > Does it let you track how it’s being consumed? > Do you get to automate the delivery? > Are there standardized processes and ways to measure the fulfilment?
At the end of the day you get to measure that much better; you have better cost data, so you understand the portfolio and you can relate the costs to the value: this is the business service rather than just the technical service. Ultimately we think that in the future it leads to looking at service portfolio management across the board rather than at individual services. LEH: THAT IS CERTAINLY IN ALIGNMENT WITH EMA’S RECENT RESEARCH, AND WE EXPECT IT TO TAKE OFF OVER THE NEXT YEAR. THE NEXT QUESTION IS FOR CHRIS AT BMC. INTEGRATION, AUTOMATION AND WORKFLOW ARE KEY TECHNOLOGIES FOR ADVANCING A MATURE, PROCESS-DRIVEN SERVICE DESK OPERATION, AND THESE INTEGRATIONS CROSS BUSINESS AND SILO MANAGEMENT DOMAINS. HOW DOES BMC RECOMMEND INTEGRATING SOLUTIONS INVOLVING MANY THIRD PARTIES AND KEEPING UP WITH THE EVER-CHANGING RELEASE LEVELS OF THESE PRODUCTS? WOULD YOU SEE THE CMS AS THE ONE SINGLE ANSWER TO THIS DILEMMA, AND CAN IT SERVE WELL IN AN ENTERPRISE WITH MULTIPLE VENDOR SOLUTIONS?
CW:
That’s a pretty large topic, but it does have some unifying themes. Let’s look at what organizations are faced with today, and what they’ve faced historically, in getting their arms around the different solutions and versions they have and managing the nomenclatures associated with the different moving parts in their IT stack, whether from internally developed releases and products or those provided by vendors. The CMS definitely plays a very strong and integral part as a unifying reference architecture or structure that organizations can use to better manage all of those different offerings. What sits behind this is a need not only to embrace the technologies (a federation process, for example, means we don’t have to replicate all of the data from the different sources, while still keeping timely and accurate management of the references to the different versions being consumed) but also to embrace a couple of very basic cultural shifts, which is the missing piece for a lot of organizations: a shift to understanding how to properly record names, reference them and reconcile them. Essentially it’s about how to better utilize the disparate standards across the entire organization, even though some prerequisite requirements have already been identified as common processes. So does the CMS actually drive this? I think yes; it’s a very strong component. Going back to Tim’s response, when we get down to the portfolio level and understand the management practices, then we can use the CMS to go even further into the vendor consolidation activities or financial considerations that we want to standardize. But again, you still need those common practices, standards and nomenclatures in place in order to make the CMS a very effective tool.
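The recording, referencing and reconciling of names that Chris describes is, at its core, a normalization problem. A minimal sketch, with hypothetical source and server names, might look like this:

```python
import re

# Hypothetical sketch of the reconciliation step: different tools report the
# same server under different nomenclatures, and the CMS keeps one reconciled
# record that *references* each source rather than replicating its data.

def normalize(name: str) -> str:
    # Strip domain suffixes, punctuation and case so "SRV-DB01.corp.local",
    # "srv_db01" and "Srv Db01" all reconcile to one key.
    name = name.lower().split(".")[0]
    return re.sub(r"[^a-z0-9]", "", name)

def reconcile(sources: dict) -> dict:
    """Map each reconciled key to the (source, original name) pairs behind it."""
    cms = {}
    for source, ci_names in sources.items():
        for ci in ci_names:
            cms.setdefault(normalize(ci), []).append((source, ci))
    return cms

cms = reconcile({
    "discovery":   ["SRV-DB01.corp.local", "srv-web02"],
    "servicedesk": ["srv_db01"],
})
assert cms["srvdb01"] == [("discovery", "SRV-DB01.corp.local"),
                          ("servicedesk", "srv_db01")]
```

Without an agreed naming standard of this kind, the federated CMS ends up with duplicate records for the same item, which is exactly the cultural shift Chris flags.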
“It’s important to have that whole holistic model in your mind.”
LEH: THOSE ARE GOOD INSIGHTS CHRIS. THE CMS, FORMERLY THE CMDB, CERTAINLY HAS TAKEN OFF. WE’VE SEEN STEADY GROWTH AND THE CHALLENGES FOR INTEGRATION JUST GET STRONGER AS WE DIVERSIFY OUR TECHNOLOGY BASE IN IT.
CW:
That’s true. I think there is one other driving factor that’s a very positive proponent, which is that IT on the whole is becoming a bit more mature. The elements available to support these different practices have definitely grown over the past few years and the understanding of how to deploy and manage them is increasing. I think we’re seeing a more intelligent approach to the different tools and how to leverage those in ITIL-based environments. LEH: I WILL SHIFT GEARS HERE INTO A COMPLETELY DIFFERENT DOMAIN OF EXPERIENCE FOR MATT FROM
SERVICE-NOW.COM. IN MANY MANAGEMENT TECHNOLOGY DOMAINS, SOFTWARE-AS-A-SERVICE HAS EMERGED AS A MEANS FOR MOVING FORWARD WITH LIMITED BUDGET AND WITHOUT THE HEADACHES OF MAINTAINING EQUIPMENT AND SOFTWARE IN-HOUSE. CAN YOU DISCUSS THE FLAVOURS OF IMPLEMENTING SOFTWARE-AS-A-SERVICE THAT ARE IN USE IN THE MARKET? HOW HAS SERVICE-NOW.COM CHOSEN TO ARCHITECT ITS SAAS SOLUTION?
MF:
“There are so many different moving parts in IT service management and you need to develop a very strong long term plan...”
I think there is a tremendous amount of momentum toward software-as-a-service, but I think there’s also confusion surrounding it. From our perspective we see a different type of movement than just the cost savings behind software-as-a-service; it’s attributable to simplification, not so much the technology aspect. Our customers are not buying Service-now.com simply because it’s software-as-a-service (though there are a lot of advantages when you talk about total cost of ownership); it’s really more about using the flavours of technology you use when you go home at night: online banking, amazon.com and iGoogle. These web applications are what’s familiar to us. Our goal is to bring the same usability, simplicity and power of online business-to-consumer apps to IT (what we call the consumerization of IT), to make IT and the tools that you use much more familiar and easier to use. Software-as-a-service is a term that our competition tends to overuse. Our competitors are marketing their hosted client-server applications as SaaS, when in reality they are simply a regurgitation of a 1990s Application Service Provider (ASP) model. The big difference is that what software-as-a-service vendors such as salesforce.com and service-now.com have done is deliver more modern software: a technology that automatically upgrades. This happens three times a year, so a company receives continuous improvement and new functionality on a regular basis, as opposed to what we’ve dealt with in the past. You also have access to the software and can make as many changes as you like over time. Salesforce.com was the pioneer in software-as-a-service, where multi-tenancy was the preferred delivery model. Service-now.com has learned a great deal from Salesforce.com, but we have also done things very differently.
At the core, our application is delivered as a single-tenant system where each customer instance includes a dedicated database and application set. We believe this architecture allows us to be highly scalable while ensuring customer data is not mixed with that of other customers. We think that software-as-a-service has been very beneficial for our customers, not only from a usability standpoint but also in total cost of ownership. Our SaaS delivery model reduces consulting and management costs by up to 70%, removes upgrade costs and eliminates infrastructure costs. LEH: THANKS MATT, IT SOUNDS LIKE A GREAT DEAL OF FLEXIBILITY CAN BE BUILT IN FOR THOSE ORGANIZATIONS. LET’S TOUCH ON YET ANOTHER SERVICE DESK TOPIC WITH
CHRIS AT BMC ON KNOWLEDGE MANAGEMENT, WHICH IS ONE OF MY FAVOURITE TOPICS. KNOWLEDGE MANAGEMENT HAS RISEN TO THE TOP PRIORITY LIST FOR FORWARD-THINKING SERVICE DESK LEADERS IN THE ENTERPRISE. IT OFFERS SO MUCH IN TERMS OF IDENTIFYING ROOT CAUSE QUICKLY AND PUTTING TOOLS IN THE HANDS OF SUPPORT ANALYSTS TO SOLVE ISSUES, AND IT ALSO PUTS INFORMATION IN THE HANDS OF USERS TO SOLVE THEIR OWN PROBLEMS. IS THERE A WAY FOR SERVICE DESK LEADERS TO PUT AN EFFECTIVE KNOWLEDGE MANAGEMENT PROGRAMME IN PLACE WITHOUT SIGNIFICANTLY ADDING STAFFING RESOURCES? IF SO, HOW CAN A KM TOOLSET ASSIST WITH THAT EFFORT? WE UNDERSTAND THAT KM IS A PROCESS THAT HAS TO BE MANAGED TO KEEP KNOWLEDGE FRESH, SO IF YOU COULD RESPOND IN THE CONTEXT OF LIFECYCLE UNDERSTANDING FOR KM THAT WOULD BE GREAT.
CW:
I think you’re hitting it straight on when you talk about the importance of, or at least the awareness of, knowledge management today. It directly affects everything that we try to accomplish within incident and problem management: getting down to the roots, keeping problems from recurring and eliminating the critical impact to different services. It really comes down to a couple of base items that organizations have recognized. One is that the idea of hero-worship, the tribal knowledge that organizations have relied on for so long, isn’t scalable. If anything, it is more costly to hoard that knowledge than to share it across the other ITSM disciplines that rely upon the information to restore services, provide workaround processes, or relay it to the respective service customers and consumers as quickly as possible. There are also underlying business initiatives that go with that. It’s a more competitive environment than ever before, and of course everything comes down to reducing cost. Whatever vertical industry we have looked at in the past few years, that’s been the common mantra among IT and business organizations. So how do I do more with less, and better yet, how do I become more productive with less? This drives right to the portion of your question about getting programs into effect without having to add resources. When you look at knowledge management you also have to take one step up and look at what knowledge management serves beyond just problems and incidents. That goes back to users helping themselves. Every transaction that an end user can perform on their own saves time or cost, one way or another, across the service desk.
So getting that information into the hands of the end users, and then changing how they look at utilizing not only the services but the information that goes with those services, is what drives the effective implementation of knowledge management endeavours without adding head count. The other side to that is the people traditionally fighting the fires. If we simply shift some of their activities to root cause analysis, doing the diagnostics, recording, documenting and so on, we become more effective in the incident triage practice and then the problem management practice, which effectively contributes to eliminating those occurrences. Another thing would be to use survey mechanisms, which should be part and parcel of a knowledge management solution, to see how effective articles are, how they’re being absorbed, and what kind of feedback they’re getting, not just from IT but from the end user. So there are a lot of different elements at play here, but I think one of the things that
really drives us in this case is a sound, integrated technology set. It’s one thing to have knowledge available, to publish it and to spend a lot of time creating that information. But unless you can embed that into, let’s say, an incident process or a problem documentation process, where service desk representatives can actually do the pattern recognition against the knowledge articles, we haven’t really fostered that knowledge management culture in the other IT disciplines. We’ve learnt over the past twenty years or so that when you have to go back and re-create the knowledge of how we did the workaround, how we did the fix, how we eliminated a problem, after the fact, it’s probably not going to be as accurate or as effective as if it was part of the native process itself, part of performing our jobs. So to recap: it’s part of a larger self-service environment, it’s getting the end users on board, and it’s making sure the information provided is accurate, through surveys and by making it an in-stream part of the rest of the IT service management processes. LEH: IF YOU LOOKED AT YOUR KM TOOLSETS AND SEPARATED THEM OUT FROM THE PROCESS, WOULD YOU SAY THERE ARE TWO OR THREE FEATURES THAT ARE IMPORTANT FOR USERS LOOKING TO DO A LOT OF THE VERY PRACTICAL THINGS THAT YOU’RE SUGGESTING HERE?
TR:
I think you’re seeing the same dynamic in the market that we are: it’s not so much about one-stop shopping as about having one set of tools that work well together. I think customers have got tired of playing the piecemeal game of everybody pointing at everybody else over why these things don’t connect and don’t work well together. They’re becoming more and more inclined to look at a single vendor. But as you point out, Lisa, the flip side of that is the challenge: rip and replace everything you have, which is a completely non-viable option for most customers, or use point solutions for specific areas where individual organizations differ from the mass, or from the way a particular primary vendor looks at a problem. We think we’ve done a pretty good job of striking the balance between providing comprehensive solutions and making a point of ensuring they all work well in an open environment that connects up to other systems. Providing a single answer was a really big change in our IT service management approach a year ago, when we rebundled Service Desk Manager into a comprehensive product. We came to the conclusion that there was a certain core set of functionality that worked well together and was so significantly process-integrated that the technology should be tied together as well. Obviously, incident/problem management fit together and change management layers on top of that, and knowledge management and automation tools are needed to make that management work efficiently. Then we included some discovery tools to get at configuration data correctly. That was packaged into one base unit which we call Service Desk Manager. It’s bigger than the normal atomic level at which these things are sold, but it’s the reality of what we believe needs to tie together to do IT service management correctly.
Ideally we’re really trying to take a very customer-centered viewpoint on this, with a best-practices recommendation and what we consider a minimum set that we priced competitively. We then work with our customers to find out the best answer from there.
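The bundling Tim describes rests on the idea that incident, problem, change and knowledge records are process-integrated around shared configuration data. A hypothetical sketch of that linked record model (all class and field names are ours, not any vendor’s):

```python
from dataclasses import dataclass
from typing import Optional

# Illustrative only: a minimal record model showing why the disciplines
# are sold together -- each record type references the others and shares
# a configuration item (CI) from discovery.

@dataclass
class KnowledgeArticle:
    title: str

@dataclass
class Change:
    description: str
    ci: str

@dataclass
class Problem:
    root_cause: str
    ci: str
    fix: Optional[Change] = None
    workaround: Optional[KnowledgeArticle] = None

@dataclass
class Incident:
    summary: str
    ci: str
    problem: Optional[Problem] = None

kb = KnowledgeArticle("Restart the order service after memory exhaustion")
prob = Problem("Memory leak in order service", ci="app-ord-01",
               fix=Change("Deploy patched build", ci="app-ord-01"),
               workaround=kb)
inc = Incident("Order service unresponsive", ci="app-ord-01", problem=prob)

# The analyst working the incident sees the workaround in-flight,
# which is the "knowledge native to the incident record" idea.
assert inc.problem.workaround.title.startswith("Restart")
```

Split these records across disconnected tools and every link above becomes a manual lookup, which is the piecemeal game Tim says customers are tired of.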
“... software-as-a-service has really taken hold and at this point we don’t believe it’s a trend anymore, we believe it’s modern software.”
CW:
One of the things is making sure that you’ve got access to the knowledge articles that support the workaround, restoration and problem-solving activities. Having that data native to an incident record or a problem record drives effective process management. Having direct links and search tools for patterns and context is vital to the creation of incident tickets. Knowledge provided to you in-flight, rather than on request, will certainly drive the use of a problem or knowledge management tool for finding solutions and utilizing the data. The more interaction we have, breaking down the silos between service management disciplines, the more effective the use of knowledge will be, so I really do believe in interdisciplinary use of the knowledge tool itself. LEH: NOW WE’LL MOVE TO TIM AT CA, AND THIS QUESTION IS RELATED TO THE VERY BROAD DISCIPLINES OF ITIL, BUT ALSO TO HOW CA HAS CHOSEN TO BREAK THEM DOWN. IN EMA RESEARCH, USERS HAVE INDICATED A PREFERENCE FOR PURCHASING ITSM SOLUTIONS FROM A VENDOR THAT OFFERS A BROAD PRODUCT SWEEP. WITH A LARGE VENDOR SUCH AS YOURSELF, WITH MANY SERVICE MANAGEMENT PRODUCTS, HOW DOES A USER GET WHAT THEY NEED WITHOUT BREAKING THE BANK? I’M AWARE THAT CA HAS PACKAGED TOGETHER A CORE OFFERING MEANT TO MEET ALL THE BASIC SERVICE DESK NEEDS FOR SOME ORGANIZATIONS, AND I’D BE INTERESTED
IN HEARING YOUR THOUGHTS ON HOW THE FUNCTIONALITY INCLUDED IN THAT CORE PACKAGE WAS PUT TOGETHER, HOW THE DECISIONS WERE MADE ABOUT WHAT SHOULD BE INCLUDED, AND ALSO HOW IT HAS WORKED OUT FOR YOU?
LEH: TIM, WOULD YOU SAY THAT SPECIFIC PACKAGING WAS GEARED TOWARDS MID-MARKET CUSTOMERS, OR WOULD YOU SAY THE NEED THAT YOU’RE DISCUSSING CUT ACROSS COMPANY SIZE; SMALL, MEDIUM AND LARGE?
TR:
I really think it cuts across them with different dynamics, but
Lisa Erickson-Harris, RESEARCH DIRECTOR, Enterprise Management Associates (EMA)
Lisa has over 18 years of experience in the computer industry, having served in a variety of technical, marketing and managerial roles. Lisa focuses on service level management, business process management, small-to-medium business infrastructure management needs, and partnership strategies for channels and strategic relationships. Prior to joining EMA, Lisa was responsible for the SPECTRUM Partners program for Cabletron Systems (now Aprisma). She writes as a guest columnist frequently for Network World Fusion and contributes articles to slm-info.org. Lisa is also co-author of SLM Solutions: A Buyer’s Guide, now in its third edition.
Chris Williams, LEAD AND SENIOR MANAGER, BMC Service Support Solutions Chris Williams is responsible for the Product Marketing of BMC Software’s Service Support Discipline, which includes the BMC Remedy IT Service Management Suite, Service Desk Express, and the Service Resource Planning Suite. In this role, Chris manages a team that focuses on solution-level strategies that leverage the comprehensive capabilities of integrated IT processes and practices, whose efficiencies and effectiveness are advanced through the use of the BMC Service Support offerings. Chris has over 27 years of IT experience, including 15 years managing data centers, operations and technical support organizations for financial, government, retail and manufacturing organizations, and is an ITIL v3 certified instructor.
“... you should involve the security team very early on—simply because you have to get their buy-off and they will have good recommendations.”
essentially the needs are the same. There are core processes that are necessary for running ITSM, and they’re the same for the mid-sized guys as for the larger guys. The mid-sized ones are more likely to be in the mode of: I can just buy it, put it in and not have to worry about it. The big ones are looking at it from the sense that everything should fit together and be clean and tidy. I think it applies to both, even though the thought processes might be different. LEH: AND NO DOUBT FOR THE LARGE COMPANIES IT’S THE JUMPING-OFF POINT FOR OTHER THINGS. LET’S ROUND OUT THE QUESTIONS WITH ANOTHER ONE FOR MATT AT SERVICE-NOW.COM. THE SERVICE DESK, CHANGE MANAGEMENT AND THE CMS ARE OFTEN SEEN AS STRATEGIC LYNCHPINS IN AN ITSM STRATEGY. EMA HAS SEEN THE SERVICE DESK AS A STARTING POINT FOR ITSM INITIATIVES; CHANGE MANAGEMENT OFTEN GETS LINKED VERY TIGHTLY, AND THEN OF COURSE THE CMS SOMETIMES GETS BUILT INTO THAT. MANY CIOS WANT TO KEEP THESE DISCIPLINES CLOSE TO HOME. HOW WOULD YOU EASE THESE CONCERNS AND PERHAPS COUNTER THEM WITH A SAAS-BASED SOLUTION?
MF:
That’s a very good question. We’ve seen that the CMS has been the foundation for understanding and defining what’s in your service portfolio. In the last year and a half, security questions, or the “Where does my data reside?” question, have not come up as often as they used to. In our discussions with prospective clients there is a much higher understanding of SaaS, which I believe has alleviated the perceived risk. One thing we highly recommend, if you are interested in software-as-a-service, is that you should involve the security team very early on, simply because you have to get their buy-off and they will have good recommendations. We don’t see too much concern from the CIO as to where their CMS data resides. One of our largest customer verticals is the financial sector. These organizations have had many years of experience in communicating and sending financial data over the wire. We take a lot of those daily working principles and embed them into our technology to allow CIOs to become more comfortable and familiar with
what we’re doing from a security standpoint. For example, if you’re talking to a software-as-a-service vendor, make sure that their data center is SAS 70 Type II certified. I would also recommend an onsite visit so you can see what type of physical security they have in place and understand where your data will be living. One of the common themes we hear from customers is that the data center we provide is more secure than their own. As a SaaS vendor, we view security as a core competency of our service. We take it very seriously and have built the highest level of redundancy and data protection into our service. Outside of the data center and physical security, you’re also looking at communications security: is the application built using SSL and HTTPS secure connections? The other aspect of software-as-a-service that I think is interesting is that although the data resides in our data center, the customer still owns it, and if they choose to move off the service they have the ability to take their data with them. So, going back to the beginning of the discussion, I think software-as-a-service has really taken hold, and at this point we don’t believe it’s a trend anymore; we believe it’s modern software.
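Matt’s due-diligence points lend themselves to a simple checklist. The sketch below encodes them as a hypothetical vendor assessment; the keys and descriptions are illustrative only, not any real procurement standard:

```python
# Hypothetical due-diligence checklist for a SaaS vendor, following the
# points raised in the discussion: data center certification, an onsite
# visit, transport security, and the right to take your data on exit.

CHECKLIST = {
    "sas70_type2_datacenter": "Data center is SAS 70 Type II certified",
    "onsite_visit_done":      "Physical security inspected on site",
    "tls_only_transport":     "Application served over SSL/HTTPS only",
    "customer_data_export":   "Customer can take their data on exit",
}

def assess(vendor: dict) -> list:
    """Return the descriptions of the checklist items the vendor still fails."""
    return [desc for key, desc in CHECKLIST.items() if not vendor.get(key)]

candidate = {
    "sas70_type2_datacenter": True,
    "tls_only_transport": True,
    "customer_data_export": True,
    # no onsite visit yet
}
gaps = assess(candidate)
assert gaps == ["Physical security inspected on site"]
```

Involving the security team early, as Matt recommends, is largely a matter of agreeing on a list like this before the contract is signed rather than after.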
LEH: I LIKE THE POINT YOU MADE ABOUT HOW SOFTWARE-AS-A-SERVICE CAN BE A BETTER SOLUTION, AND YOU REALLY LAID DOWN SOME KEY CONSIDERATIONS FOR ANYONE WHO MAY BE THINKING OF GOING DOWN THAT PATH. I WONDER ABOUT THE TRIO OF DISCIPLINES: THE SERVICE DESK, PROBLEM/CHANGE AND THE CMS. ARE YOU SEEING AMONG YOUR PROSPECTS THAT TRIO AS SOMETHING THAT IS PURSUED REGULARLY IN AND OF ITSELF?
MF:
The Service-now.com subscription includes access to all applications. While a number of customers have chosen to deploy Service-now.com in a “big bang” ITIL implementation, most start with incident, problem, change and a light CMDB/CMS. But we’ve also seen companies lead with change management and the CMDB or CMS. Employee self-enablement supported by request management has also been an area of high interest. The IT organization only gets so many opportunities to set customer expectations and improve how it is perceived. Service request can also help to reduce the cost of service and ensure end users are given tools that allow them to receive service in a way that best meets their needs.
Tim Rochte, SENIOR PRINCIPAL PRODUCT MARKETING MANAGER, CA Inc. Tim Rochte is currently responsible for CA’s Service Desk Manager in the Service Management group. Prior to CA, he held product management, marketing and strategy positions with several technology companies in the United States and Europe, including BMC/Remedy, Questra and NCR. In addition to numerous user group events, he has spoken at conferences for META Group and Pink Elephant, as well as HDI Australia and US. He holds an ITIL Foundation-level certificate. He earned a Bachelor’s degree in economics from the University of California at Davis, and an MBA from the Darden School at the University of Virginia.
Matt French, DIRECTOR OF MARKETING, Service-now.com Matt leads a team that is responsible for strategically positioning Service-now.com’s brand, applications, employees, customers and partners as market and industry thought-leaders. Matt has 13 years of enterprise software marketing, sales and product strategy experience. Prior to Service-now.com, Matt drove product line strategy at Oracle, Peregrine Systems and most recently Symantec Altiris. As a member of the Altiris Business Unit within Symantec, Matt held the positions of Director of Product Marketing, Business Line Manager of Altiris Service and Asset Management applications, and Director of Global System Integrator Strategy.
LEH: THANKS, MATT, FOR THAT FINAL WORD. I HOPE THIS PODCAST SHEDS LIGHT ON JUST HOW BROAD THE SERVICE DESK OPERATION HAS BECOME. AT EMA WE THINK IT’S AT LEAST A 20-YEAR-OLD MARKET, MOVING FROM THE HELP DESK TO WHAT THE SERVICE DESK IS TODAY, WITH ALL THAT’S DEMANDED OF IT AND OF DELIVERY ON THAT SERVICE DESK MESSAGE. WE’VE TALKED ABOUT A NUMBER OF AREAS, NOT EVERY AREA BY ANY STRETCH, BUT I’D LIKE TO INVITE EACH OF THE PARTICIPANTS TO CONTRIBUTE A CLOSING REMARK SUMMARIZING THEIR MAIN THOUGHTS.
TR:
I think there has been some good conversation about how important knowledge management is, and we certainly agree with that in our offering. As for how many customers are looking at a SaaS offering instead of a conventional on-premise model: CA is very much going down that path as an option for our customers. It has raised a whole spectrum of issues that the field is working through right now.
CW:
We’ve been going through a vast set of topics, but I think we need to impart to the consumers of IT service management solutions the idea of
working with industry-recognized leaders. There are so many moving parts in IT service management that you need a very strong long-term plan covering prioritization and the return on your investment in the solutions, but also the implementation time and the results those activities yield. It’s important to keep that holistic model in mind. Every consumer should know where they’re going and whether it aligns with their business needs. One of the things BMC Software has done for years is take that approach to business service management: looking at the big picture and making sure that our customers are aligned with their long-term missions.
MF:
I think it’s been a discussion full of great information for the audience. My main parting thought is that if you feel like you’re stuck in a tool, or that your organization is not able to mature further, I would recommend taking a step back and defining what you’re trying to achieve as an organization. Forget about the tools, define your IT organizational goals first, and then look for a tool.
HEAD TO HEAD ■ BUSINESS INTELLIGENCE
It’s called business intelligence for a reason
How do you make the data you already have work for you? BILL DUNN (DUNN SOLUTIONS GROUP) says it’s easy if you have the right strategy and the right tools. Interview by ETM’s ALI KLAVER.
http://www.GlobalETM.com
FACT FILE
HISTORY
• William M. Dunn founded Dunn Solutions Group Inc. in September 1988.
• The firm grew throughout the 1990s, becoming a successful consulting firm serving a variety of Fortune 500 companies.
• In 1995, Dunn Solutions Group added quality assurance and business intelligence practice areas to its original custom application development services.
• User interface development and web design/services were added in 2001 with the acquisition of Streams, a Chicago web design firm.
• In January 2007, Dunn Solutions Group was acquired by Cranes Software International, a Bangalore, India-based software and solutions provider.
AK: BILL, WITH VIRTUALLY EVERY BIG SOFTWARE COMPANY OFFERING BUSINESS INTELLIGENCE TOOLS TODAY, AND WITH EACH TAKING A DIFFERENT MARKETING ANGLE, HOW IS DUNN SOLUTIONS GROUP POSITIONING BUSINESS INTELLIGENCE?
BD:
We have a new marketing initiative focusing on business intelligence—the business side of it. The underlying concept is that business intelligence is not just technology: it’s actually a key part of your business strategy. With business intelligence, organizations of all sizes can turn data into insight, which helps them make informed decisions, find hidden opportunities, and measure and manage performance. The big one is empowering people with the information they need to do their jobs better: knowing the truth about your business rather than relying on gut feel, and being able to make predictions, identify inefficiencies and share important knowledge that everyone believes.
AK: I HEARD YOU SAY “IDENTIFY INEFFICIENCIES” IN YOUR ANSWER. TODAY, BUSINESSES OF ALL SIZES ARE WORKING HARD TO CUT COSTS, BUT HOW CAN BI SAVE MONEY AND DRIVE EFFICIENCY IN AN ORGANIZATION?
BD:
There are about five ways BI drives efficiency. The first, looking deeper for hidden costs and waste, is about finding cost savings beyond the expenses you cut first. The problem comes when the obvious cuts have been made and those that remain are not obvious. How do you find those hidden costs? By leveraging business intelligence. You can extract from your core systems information about how you’re spending money, which vendors you’re using, how much you’re paying them and so on. For example, you can consolidate that information from multiple divisions of an organization that may each be spending money with the same vendor, discover that collectively you’re spending more with that vendor than you realized, and perhaps negotiate a better rate. A lot of information is tied up inside back-office accounting systems—they process the information—but it’s the front-line people who need to analyze it. This brings me to my second point: tying the front office to the back office. If you free up the information that’s in your accounting system and make it available to front-line managers, you can change the behaviour of the people making purchase decisions. Take a simple example like an office manager ordering office supplies. They do this without really understanding the impact down the road. But if that person gets a monthly report saying they already have four thousand Post-it notes, this may change their behaviour, because normally they just re-order. With this information, they can re-evaluate how they’re going to spend money. Of course, this could apply to manufacturing and even staffing, so getting information out of the back office and into the front office is very important. The next point is finding savings from your customers. We all want to say that the customer is king, but the truth is that not every customer should be treated like a king, and most organizations don’t know who their best customers are. So who should we treat like a king? BI can help us there too. If we can analyze our customers and understand which ones are contributing to our profitability, we’ll not only treat those customers as kings, but in fact turn customers that are costing us money over to our competitors. I know that sounds odd, but sometimes you have to let go of some of your customers in order to focus on the better ones, and BI will help us do that. From the standpoint of optimizing resources and operations, we need to apply computers to help do things quicker. That’s really what we’ve been saying in IT for the past thirty years. Up to now we’ve been taking manual processes and automating them, but without automating the decision-making. BI can help you automate some of the obvious decisions that may be bottlenecks in the organization.
For example, if you’re in the insurance business you could use predictive analytics and BI modelling for 70 per cent of your approve/disapprove transactions, leaving you to focus on the 30 per cent of transactions that really do need human intervention. The last one is avoiding ongoing leakage. People spend a lot of energy in a downturn identifying these things, and then over time they forget about them. BI automates this into a process that makes it harder to forget: it’s on a dashboard that you see every day. BI lets you take these ideas and then automate them in simple ways.
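The automated-decisioning pattern described here can be sketched as a simple routing rule. The scoring function, thresholds, and field names below are invented placeholders standing in for a real trained predictive model—they are not from any Dunn Solutions product; the point is only the triage structure: auto-decide the obvious cases, route the uncertain ones to a person.

```python
# Sketch of automated decisioning: handle the "obvious" transactions
# automatically and send only uncertain cases to a human reviewer.
# risk_score is a toy stand-in for a trained predictive model.

def risk_score(application):
    """Toy scoring rule; a real system would use a model trained on history."""
    score = 0.5
    if application["claims_last_3y"] == 0:
        score -= 0.3  # clean history lowers risk
    if application["coverage"] > 500_000:
        score += 0.2  # large coverage raises risk
    return score

def route(application, approve_below=0.3, decline_above=0.6):
    """Auto-decide clear cases; everything in between goes to a human."""
    score = risk_score(application)
    if score < approve_below:
        return "auto-approve"
    if score > decline_above:
        return "auto-decline"
    return "human-review"

applications = [
    {"claims_last_3y": 0, "coverage": 100_000},  # low risk
    {"claims_last_3y": 2, "coverage": 600_000},  # high risk
    {"claims_last_3y": 1, "coverage": 200_000},  # borderline
]
decisions = [route(a) for a in applications]
```

With thresholds like these, most traffic is decided instantly and only the middle band consumes reviewer time—the 70/30 split Bill mentions falls out of where the thresholds are set.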
AK: LET’S PAUSE HERE TO MAKE SURE OUR AUDIENCE IS WITH US. HERE AT ETM WE SEE THAT THE DEFINITION OF BUSINESS INTELLIGENCE HAS CHANGED AS THE TECHNOLOGIES BEHIND IT HAVE CHANGED—SO HOW DOES DUNN SOLUTIONS GROUP DEFINE IT NOW?
BD:
Our feeling about BI is very different today from when we started. We got into the BI world in the mid-1990s, and at that point BI meant writing and distributing reports. But BI today is much more than that. BI now, the way we define it, is proactive use of knowledge. So if we are able to get the information, assimilate it, score it, and then push it out to those who need it on a timely basis and change their behaviour in a positive way, that is business intelligence.
FACT FILE
PRODUCTS
• Dunn Solutions Group covers application development, business intelligence and predictive analytics.
• Products include ALM Analytics, Legal Dashboard, SAP BusinessObjects and One Point Law Enforcement Analytics.
• Partners include IBM, Microsoft, SAP BusinessObjects and Oracle.
• They boast a consulting and training arm focused on a “practitioner’s approach”, offering practical insight that increases the effectiveness of the training experience.
• They offer four levels of training:
> Instructor-Led Training (ILT): Stand-up training on-site and at various open-enrolment facilities, including training for virtually every core tool from major software companies including SAP BusinessObjects, Microsoft and HP/Mercury.
> Jumpstart™: Implementing a new technology? Jumpstart provides the roadmap for success, with an expert working closely with development teams.
> Mentor+™: One-on-one consulting/training on specific technical topics or high-level concepts.
> Computer-Based Training (CBT): Develops and delivers online training solutions and learning-management systems (LMS) for clients, used for on-demand pre- and post-training review.
AK: THAT’S A GOOD POINT—IT’S A PROACTIVE USE OF KNOWLEDGE. FROM A BUSINESS POINT OF VIEW, WHAT ARE THE CHARACTERISTICS OF A GOOD BUSINESS INTELLIGENCE SOLUTION?
BD:
I’m a big evangelist of information being distributed throughout the entire organization. A lot of the time we hear customers talk about the CEO or a VP getting an executive dashboard, and this is important if those people are making strategic decisions. But I think it misses the true promise of BI. Information needs to be made available at all levels of the organization, not just upper management, and BI needs to be integrated into core business processes on a daily basis. Credit decisions are a great example of integrated BI—a number of organizations make a credit decision in less than five seconds. That is a strategic decision made by a line manager with a whole lot of BI behind it. BI helps grease the wheels of commerce. If you know about your vendors and can make strategic decisions quickly without having to do a lot of research each time, you can move your company forward.
AK: LET’S NOW LOOK AT SOLUTIONS FOR OUR AUDIENCE—WHAT ARE THE KEY ENABLING TECHNOLOGIES DUNN SOLUTIONS GROUP UTILIZES IN SOLUTIONS?
BD:
To accomplish the BI vision we don’t need to build things from scratch. We are very lucky to be in an area where there’s a lot of great technology we can leverage. For example, Dunn Solutions Group is an SAP BusinessObjects Gold Partner, and we leverage many of SAP’s tools and products in our BI initiatives. Let me take you through what I consider the core tools and technologies. At the very heart is the database. You can use any solid database engine—Oracle, SQL Server, UDB, even MySQL—but having the database is not enough. To have an effective BI solution you need to work with people who know how to create the data model that will hold your information. Then you need to get the information out of those core transactional systems and into your data warehouse or data mart. In the past, people wrote code to do this, which is very hard to maintain and upgrade, and it doesn’t give you any information about what’s happening inside those transformations. So the second thing you want to do is get an ETL tool.
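A minimal sketch of what an ETL step automates, using only Python’s standard library. The table and column names here are invented for illustration and are not tied to any SAP or Dunn Solutions product; a dedicated ETL tool does the same extract-transform-load cycle at scale, with lineage and monitoring built in.

```python
import sqlite3

# Toy ETL run: extract order rows from a "transactional" source database,
# transform them into consolidated per-vendor spend totals, and load the
# result into a small data-mart table for reporting.
src = sqlite3.connect(":memory:")
src.execute("CREATE TABLE orders (vendor TEXT, division TEXT, amount REAL)")
src.executemany("INSERT INTO orders VALUES (?, ?, ?)", [
    ("Acme", "East", 1200.0),
    ("Acme", "West", 800.0),
    ("Globex", "East", 500.0),
])

# Extract: pull the raw rows out of the source system
rows = src.execute("SELECT vendor, amount FROM orders").fetchall()

# Transform: consolidate spend per vendor across all divisions
totals = {}
for vendor, amount in rows:
    totals[vendor] = totals.get(vendor, 0.0) + amount

# Load: write the reshaped data into the data-mart table
mart = sqlite3.connect(":memory:")
mart.execute("CREATE TABLE vendor_spend (vendor TEXT, total REAL)")
mart.executemany("INSERT INTO vendor_spend VALUES (?, ?)", sorted(totals.items()))
report = mart.execute(
    "SELECT vendor, total FROM vendor_spend ORDER BY total DESC"
).fetchall()
```

This also illustrates the earlier vendor-spend point: only after consolidation across divisions does the total spend with a vendor become visible.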
For instance, we use SAP BusinessObjects’ Data Integrator—a great product for getting data from the source systems to the destination. As a by-product of that, if your data is not the cleanest in the world, consider using the Data Quality tool. Once you’ve got your data warehouse built, you want to let people access information in a very easy and intuitive way. To do that you should leverage an ad hoc querying tool that is web-based, so that IT doesn’t stand between the users and the data. You also want to standardize on your reporting tool. Once people learn how to use it, there will be a higher adoption rate, and your BI initiative will be much more successful. Finally, for people who need to look at information at a glance, there are what we jokingly call “if the dial is green, you can go golfing today” dashboards. These are what you want to put in front of people who track the KPIs for how their business is doing. Then you want to start thinking about the next level: looking ahead rather than in the rear-view mirror.
Bill Dunn
PRESIDENT, Dunn Solutions Group
Bill Dunn founded Dunn Solutions Group in 1988. Over the next 21 years he grew the company from a small boutique firm to a 75-employee consulting firm serving Fortune 500 and aggressive, growing companies throughout North America. Today the company, owned by Cranes Software International, is headquartered in Chicago with offices in Minneapolis, Raleigh, Charlotte, Fort Lauderdale and Bangalore, India. Dunn received both his Bachelor’s and Master’s degrees in computer science from the University of Illinois at Urbana-Champaign. He gained early experience at AT&T, Bell Laboratories and U.S. Robotics.
AK: HOW ARE BI SOLUTIONS FOR MID-MARKET COMPANIES DIFFERENT FROM THOSE FOR HUGE ENTERPRISE-LEVEL ORGANIZATIONS?
BD:
Smaller organizations need exactly the same things as larger organizations. Until very recently that was difficult to do because the tools and the technologies were so expensive that only four to five hundred companies could afford them. I’d say there’s a lot of good news for midsized organizations and some fantastic tools— the same enterprise tools that are available for larger companies are now available, and priced appropriately, for smaller companies. We’ve been working with SAP and some other vendors on delivering these types of tools to the mid-market so they can benefit from BI, just like larger organizations. The one thing that smaller organizations need to know is that BI doesn’t need to be done as a big-bang project. We break many of our projects into bite-sized chunks. You can do a 90-
day project, deliver value, and then deliver more value maybe six months later on an engagement that adds to that. AK: LET’S EXPLORE THAT A LITTLE MORE. WHY IS IT BETTER TO BREAK A BI INITIATIVE INTO PIECES? AND WHAT IS THE BEST WAY TO GO ABOUT IT?
BD:
I think most mid-market organizations don’t have the option to do the whole thing at once. They generally don’t have the budget, and may not have the time, to do a complete BI initiative in one shot. What I suggest is starting with the low-hanging fruit. There’s a lot of value in a small BI solution that’s architected properly and can add value right away. The nice thing about BI, as opposed to other types of IT initiatives, is that it can grow. Say you start with a departmental solution. If it’s architected properly, you can add departments, and before you know it you’ve got an enterprise solution—but you didn’t have to start that way. A big part of being successful, especially in the mid-market, is partnering. Look at the top vendors in the industry, partner with software vendors that have the right solution and are BI-focused, and partner with consulting organizations who can help you get there without having to figure it out by yourself. Over time, most people find that the value they get out of partnering with the right consulting firm and buying the software far exceeds any savings from doing it themselves. And the long-term value is that they can build on this solution. We have a client that literally spends twenty-five thousand dollars every six months on their BI initiative. That’s not a lot of money for the value they’re getting while they continue to build on their initiative. Over time, they’re going to have a complete BI solution that they can deploy to their entire organization. But in the interim, they’re getting value out of the solution they’re deploying today.
IN THE HOTSEAT ■ PROFESSIONAL PROFILE
Security and business continuity
With an impressive list of partners and clients, NETASQ is fast becoming the security solution provider. FRANÇOIS LAVASTE and DOMINIQUE MEURISSE (NETASQ) join ETM’s ALI KLAVER to discuss exactly what they can do for you.
AK: LET’S JUMP STRAIGHT INTO THE FIRST QUESTION— CAN YOU GIVE OUR AUDIENCE AN EXPLANATION OF WHO NETASQ IS?
FL:
Absolutely. NETASQ is a Unified Threat Management technology pioneer and visionary, and we are a leader in security appliances. Our award-winning solutions are designed to protect our customers’ networks, data and applications against what has become an ever-increasing problem for them: cyber-criminal activity. Our customers particularly value our capacity to block intrusions, hackers, viruses, malware and spam, as well as to filter content. They highly rate our ability to reduce internal risks and to enable secure mobility with our SSL and IPsec VPNs. Since its inception in France 10 years ago, NETASQ has enjoyed double-digit growth year over year. We employ 90 people worldwide and are a profitable company. Our vision is to innovate permanently in order to protect businesses of all sizes with the highest level of security against both known and
unknown threats, also known as zero-day threats. Our customers range from SMBs with a few workstations to the largest multi-site infrastructures, in both the private and the public sector.
AK: NOW WE KNOW WHO YOU ARE, CAN YOU TELL US ABOUT NETASQ’S STRATEGY?
FL:
We invest 20% year after year in research and development to innovate in technologies that are true differentiators. In addition, our company has always implemented a proximity policy with our customers, who are reached and supported through a two-tier distribution network. Our sales model is entirely indirect. In each of the regions we focus on—Europe, the Middle East, Africa, Asia-Pacific—we have recruited highly qualified value-added distributors who in turn work closely with resellers, integrators, telco operators and MSSPs. Over eight hundred certified partners are actively working with us, distributing and installing NETASQ solutions in over thirty countries. We naturally also have channel sales and pre-sales teams in France, the UK, Benelux, Italy and Spain, plus a team covering the rest of the world.
AK: SO FRANÇOIS, YOU’RE ESSENTIALLY OFFERING A SECURITY SOLUTION THAT HAS GLOBAL REACH. MY NEXT QUESTION IS FOR DOMINIQUE—WHAT ARE SOME OF THE MAIN BENEFITS OF WORKING WITH NETASQ?
DM:
To date, NETASQ has sold fifty thousand appliances to over fifteen thousand customers worldwide, 30% of which are in environments that are highly sensitive to security issues (governmental organizations, military forces, the European Council, banks and the nuclear industry). None of them have reported suffering damage after being attacked. At a time when there are daily reports in the media and on the internet of the harm companies suffer after security attacks, NETASQ’s solutions clearly stand out from those developed by other security vendors.
AK: SO WHY DOES NETASQ THINK THEY ARE THE BEST SECURITY SOLUTIONS PROVIDER? WHY SHOULD A COMPANY COME TO YOU?
DM:
NETASQ has designed the most powerful technology to proactively protect against the newest forms of cyberthreat. The NETASQ Software Engine runs at the kernel driver level and starts at the highest layers of the OSI model with deep stateful inspection of content (protocol and heuristic analysis, contextual signatures); then, should the threat not already have been blocked, it activates traditional security features such as IPS and firewalling. Because the IP packet is processed in this way, our software engine allows:
> Control and protection against every kind of threat (DoS, SQL injection, TCP scans, etc).
> Exceptional network performance, because processes are run sequentially and top-down, whereas other solutions on the market have a “Lego-style” approach in which processes are run in parallel.
> High-level security features to inspect content, such as AV, anti-spam, anti-malware and URL filtering.
> The best UTM performance. This allows us to deliver outstanding UTM performance with the lowest TCO over a three-year time frame.
AK: CAN YOU GIVE OUR AUDIENCE SOME VALUABLE CUSTOMER FEEDBACK THAT DEMONSTRATES NETASQ’S VALUE?
DM:
Our customer feedback truly demonstrates NETASQ’s value. Let’s take a few examples. Amadeus in Spain delivers next-generation solutions to travel agencies, mostly for making airline reservations. They experienced a significant increase in the number of web transactions, driven by the success of low-cost carriers, and did not want this to affect their business SLA of three seconds to acknowledge a reservation. After testing different solutions, they chose NETASQ to secure their highly sophisticated and intense web traffic, blocking all malware and viruses that could affect that SLA. Persian Bank in Tehran, Iran, is another example. During the most recent presidential election, Iran experienced some of the most damaging cyber attacks known worldwide. The bank wanted to secure its business, and after in-depth testing and a thorough analysis of the logs provided by the different solutions under consideration, they chose NETASQ. Their reason was simply that NETASQ solutions provide the most comprehensive log collection as well as the highest level of protection. Another great illustration is the reason we were selected by a French bank. This bank was concerned about phishing attacks on online banking environments and the impact they might have on its business. They conducted an exhaustive risk analysis and discovered some very specific types of attacks that could potentially give unauthorized access to their online accounts. Since their current security provider could only recommend waiting for specific patches to be developed, they decided to evaluate other solutions, including NETASQ. They were most impressed that this threat was already blocked by NETASQ’s software engine, and they therefore implemented NETASQ solutions throughout their web data centre.
Another business case is the public integrator for Telecom Malaysia. This company has overall responsibility for secure internet access for the public and governmental organizations in Malaysia. Given the particular nature of their customers and the obligation to protect highly sensitive information, they chose NETASQ and have now been using our solutions for five years.
AK: THANK YOU, DOMINIQUE, THOSE ARE SOME FANTASTIC GLOBAL CASE STUDIES. FRANÇOIS, FOR THE FINAL QUESTION, CAN YOU TELL US ABOUT NETASQ’S INTERNATIONAL REPUTATION?
FL:
NETASQ was recently awarded EAL4+ certification based on Common Criteria 3.1, the latest and most stringent international standards. Moreover, the European Council granted NETASQ the RESTREINT UE certification, thereby acknowledging that NETASQ solutions have the capacity to protect sensitive information throughout the European Union. This is a very high-level recognition and a much-sought-after recommendation. In addition, each time our solutions have been evaluated by independent laboratories or appraised by the press, they have won “Best Buy” or four-star ratings, and have several times been granted “Security Product of the Year” awards. Our sole mission is to develop security solutions that proactively protect our customers’ information systems. We enable business continuity and assure our clients that threats will not jeopardize their activity, thereby contributing to their success because they can focus 100% on their own mission.
“… our sole and only mission is to develop security solutions which proactively protect the information system of our customers.”
Dominique Meurisse
EXECUTIVE VICE PRESIDENT, SALES AND MARKETING, MEMBER OF THE MANAGEMENT BOARD, NETASQ
Upon joining NETASQ in 2004, Dominique Meurisse first assumed the general management of NETASQ for France before taking charge of NETASQ’s sales and marketing in 2006. Dominique kicked off his career at 3Com Europe in 1987, first as Manager of Corporate Accounts, then as Manager of VARs. In 1991, as co-founder of network integrator ARCHE Communications, he was Director of Sales and Marketing. In 1999, he co-founded ClarITeam (a European QoS operator) and was its General Manager. From 2002 to 2004, he was Director of Infovista for France and the Benelux countries. Dominique Meurisse holds an engineering degree in telecommunications from ESME Sudria Paris and an Executive MBA from HEC Paris.
François Lavaste
CHAIRMAN OF THE MANAGEMENT BOARD, NETASQ
François Lavaste took over the presidency of NETASQ’s management board in early 2007. Prior to that role, he was General Manager for Europe, the Middle East and Africa of TRICIPHER, a specialist in strong authentication. He began his career in IT in 1992 as a partner in the creation of Eneide, a CRM software company and publisher of CONSO+, which was acquired by Coheris in 2000. In 1996, he moved to Silicon Valley, where he stayed for 10 years and joined Intuit, the worldwide leader in financial management software. From 2001, as Vice President of Marketing, he contributed to the success of BRIGHTMAIL, the pioneer and leader in anti-spam technologies acquired by Symantec in 2004; to the success of CYANEA SYSTEMS, which specializes in application performance measurement software and was acquired by IBM in 2004; and to that of Mindjet, publisher of the MINDMANAGER software program. François Lavaste graduated from ESCP-EAP, a European graduate business school, and holds an MBA from Harvard Business School.
ANALYST FEATURE ■ BI FOR THE SMB
“For most organizations the key to improving decision quality comes down to timeliness of information.”
Secrets to success
Dedicated action is needed to get the best out of small and medium-sized businesses. MICHAEL LOCK (ABERDEEN GROUP) tells us how to slash costs and empower the business user.
Benchmarking Best-in-Class SMBs
Aberdeen’s research demonstrates a marked increase in the use and applicability of business intelligence (BI) technology in small and medium-sized businesses (SMBs). By examining methods to improve business visibility, optimize delivery of BI, and manage the total cost of ownership (TCO), SMBs are seeing measurable performance improvements across many areas of their organizations. The research shows that Best-in-Class SMBs are leveraging a broad spectrum of organizational capabilities to deliver BI to more users and deploy the solution in a shortened time frame, all in a self-service, non-IT-assisted capacity. This benchmark report is based on feedback from 530 SMB organizations worldwide. For most organizations, the key to improving decision quality comes down to timeliness of information. Much has been said and written about the speed of business in today’s environment. Mass collaboration and improved information access have brought about a situation in which organizations need clean, relevant information, and they need it quickly. Faster and more informed decisions enable a company to react more quickly to threats and opportunities and remain competitive in a struggling
economy. The research supports this claim, showing that the top business pressure forcing SMBs to invest in BI solutions is the need to improve the speed of access to relevant business data (see Figure 1). Utilizing BI for better and faster data access has paved the way for significant performance improvements for some SMBs. Aberdeen used four key performance criteria to distinguish Best-in-Class SMBs from Industry Average and Laggard SMBs. Best-in-Class SMBs achieved the following mean class performance:
• 29% average year-over-year increase in operating profit
• 4.0% average year-over-year decrease in BI cost per user
• 99% deliver BI to end users in a self-service capacity
• 14 days average deployment time for BI applications.
Figure 1: Top pressures driving BI investment for SMBs
Source: Aberdeen Group, July 2009
Figure 2: Best-in-Class BI technologies in use
Source: Aberdeen Group, July 2009
Requirements for success
Based on the findings of the Competitive Framework and interviews with end users, Aberdeen’s analysis of the Best-in-Class demonstrates that successful deployment and use of analytical tools in the SMB market depends on a combination of specific capabilities and technology enablers. Aberdeen’s research has identified several capabilities that Best-in-Class companies leverage in order to achieve their analytical goals. Process—recent Aberdeen research shows that even smaller businesses manage an average of 15 unique data sources. As data is collected from these multiple sources (such as transactional systems, spreadsheets, data warehouses and external data sources), it is of paramount importance to define a formal process for this activity. Aberdeen’s research
shows that Best-in-Class SMBs are 62% more likely than all others to have this capability. Organization—one of the most commonly cited barriers to a successful BI implementation is support from the executive ranks, or the lack thereof. Executive-level support helps facilitate faster implementation and promotes greater adoption within the organization. The research shows that Best-in-Class SMBs are 42% more likely than Laggards to have garnered executive-level support for their BI initiative. Knowledge management—companies lacking expertise in managing the end-to-end process of BI—as it relates to ETL, data warehousing and other expertise-intensive processes—stand to benefit greatly through
the use of automation. Aberdeen’s research shows that Best-in-Class SMBs are more than twice as likely as Laggards to automate report generation and delivery to end users.

Performance measurement—one thing that many companies struggle with, particularly in the SMB space, is not only providing access to BI for more employees, but also ensuring that the solution is actually being used. A logical way to understand this conundrum of access versus usage is simply to measure usage levels of the BI system; top-performing SMBs were more than twice as likely as all other SMBs to monitor this.

Technology—when it comes to the types of BI solutions in place, the research shows that Best-in-Class companies are more likely than all other SMBs to use several different technologies. Traditional historical reporting and analysis is still the most commonly used technology at all companies, but the Best-in-Class are working to achieve a balance of both strategic and tactical tools for analysis (see Figure 2).

ANALYST FEATURE ■ BI FOR THE SMB

Figure 3: Best-in-Class SMBs move towards “pervasive BI”
Source: Aberdeen Group, July 2009
Key takeaways
One of the quickest ways to achieve a tangible return from a BI investment is to widen the adoption and usage of the tool to more employees within the organization. The economies of scale inherent in this type of BI expansion produce a situation in which only a nominal increase in resources is required to equip a much larger set of users, so the cost-per-user goes down. As BI solutions have matured and grown
more applicable and usable for non-technical business users, many companies are trying to move towards this goal of “pervasive BI” in the organization. In fact, the research shows that Best-in-Class SMBs are far more likely than all other companies to have a pervasive deployment of BI or a heavy usage of BI within their organizations (see Figure 3). The fact that Laggard SMBs are more likely to be using BI on a point solution or project basis doesn’t necessarily mean that these companies don’t have the potential to reach Best-in-Class status. In many cases the research shows that Industry Average and Laggard companies simply have a younger deployment of BI and are struggling to find the right balance of people, process and technology to fully realize the benefit that these solutions offer. Frequently, a Best-in-Class company will get to that level by utilizing a “land and expand” strategy for BI deployment. In other words, a
company will deploy BI to one strategic function such as sales, marketing or finance. Once the value of the solution has been proven in one area of the organization, the company will look to expand its use into other areas, eventually leading to a more pervasive deployment of BI and the associated Best-in-Class performance. In order to improve BI adoption and drive more sustainable business improvements, SMB organizations should consider taking the following actions:
> Define sets of business unit KPIs that roll up to the overall company strategy
> Start to migrate away from spreadsheets and toward dedicated BI
> Invest in developing a formalized training program for end users
> Examine the use of outward customer-facing dashboards and BI tools.
Michael Lock
RESEARCH ANALYST, BUSINESS INTELLIGENCE Aberdeen Group Michael Lock is a Research Analyst focusing on end-user adoption and usage of business intelligence (BI) technology. In addition to being widely published on the topic of BI, Michael has written on a broad range of topics including unified communications, enterprise mobility, IT service management, green storage, voice over Wi-Fi, and business process management (BPM). Michael’s past research focused on the use of BI within the SMB segment, collaborative techniques for BI usage, and the adoption of BI dashboard visualization tools in the enterprise.
“Best-in-Class SMBs are far more likely… to have a pervasive deployment of BI or a heavy usage of BI…”
HEAD TO HEAD ■ ACCESS SECURITY
Security at its best Astaro are so concerned about your security that they’re going to give you their product—free. GERT HANSEN (ASTARO) talks to ETM’s ALI KLAVER about why SMBs will triumph, the future of security, and why they’re giving away free solutions to everyone. http://www.GlobalETM.com
“... we actually want to help these businesses to avoid sacrificing security for efficiency and/or cost.”
AK: ASTARO HAS SAID THAT THEY THINK SMALL AND MEDIUM-SIZED BUSINESSES ARE A CRITICAL PART OF STRENGTHENING THE WORLD ECONOMY. WHY IS THIS?

GH: I think that’s not only our opinion—many politicians share it. We think the main reason is that many SMBs these days take more conservative and cautious roads, and are not subject to the high demands created by investors or the public market. This type of conservative business management keeps them strong even in tough times like the current one. Large enterprises need loans from banks or even the government. The smaller companies maintain a steady and stable business, and if you sum up all the smaller companies, they equal, or even exceed, the larger enterprises in size. It is this fact that is currently helping to strengthen the world economy, in our opinion.

AK: TELL US MORE ABOUT YOUR NEW FIREWALL EDITIONS, THE ASTARO ESSENTIAL FIREWALL FOR BUSINESS USE AND THE ASTARO ESSENTIAL FIREWALL FOR VMWARE?
GH:
Astaro has been building and selling security products for close to 10 years. During that time we have seen many different trends, and two trends we have seen lately have led to a decrease in company security. The first is a steady move to virtualization. More and more companies are investing in virtualized environments and migrating many, if not all, of their servers and network infrastructure into them. This certainly gives them many benefits, such as better resource utilization, increased flexibility, better disaster recovery and so on. But at the same time they are decreasing their security level, because there are not many security solutions available that run in the virtualized environment. Those that are available are either very expensive or extremely complicated to manage—some are even both. But for increased usability people are willing to sacrifice security and connect all machines, left unprotected, to each other. With this type of networking infrastructure we are finding ourselves back in the year 2000. As a solution, and in order to help these customers, we have created an Essential Firewall edition of our Astaro Security Gateway product that delivers the basic security functions to secure the virtual environment. This will also be maintained free of charge. With this, every company can keep their current security policy while migrating towards virtualization. This is just one trend where we want to help customers. The second trend we see is triggered mainly by the economic situation the world finds itself in at the moment. Companies with financial challenges are increasingly using consumer network equipment to protect their company, primarily because it is cheaper. Again, these companies are willing to sacrifice security to save money. In order to help these customers, we take the same Essential Firewall edition of our Astaro Security Gateway product and give it away free of charge to all businesses.
“The reduction in complexity and the increased ease of use of our integrated web user interface make it obviously the preferred choice...”

This means they get enterprise firewall security at no cost, and the business also gets better protection compared to consumer devices.
AK: THAT’S FANTASTIC NEWS GERT. AT A TIME WHEN COMPANIES ARE LOOKING TO STRENGTHEN THEIR BOTTOM LINE, IT’S INTERESTING THAT ASTARO IS GIVING AWAY THEIR PRODUCT. SO WHY IS THIS, AND WHAT IS INCLUDED UNDER THE TERM “FREE”?
GH:
That’s a good question—basically there are two motivators. The first is that we actually want to help these businesses avoid sacrificing security for efficiency and/or cost. The second is that we hope to increase Astaro’s reach, and that if people get used to our feature-rich and easy-to-use solutions they will buy the Professional edition once their security needs grow. What we mean by the term “free” is that the Essential Firewall edition includes:
> A complete internet router that is able to replace any existing equipment you have
> An enterprise grade firewall to control who is allowed to communicate with whom
> Remote access for Windows PCs, iPhones and other mobile devices to securely connect back to the company to use internal resources
> A detailed reporting solution that easily shows you what is going on in your network and what the solution did for you.

The Essential Firewall edition also includes free updates to the software and free support from our web portal, www.astaro.org. We think these are the essential security functions that every company connected to the internet must have in the current climate.
AK: WHY DO YOU FEEL THAT THE POINTS YOU MENTIONED ARE THE ESSENTIAL SECURITY FUNCTIONS AND NOT OTHERS?
GH:
Well, security is a wide and complex field. There are many different techniques and solutions available to protect a company and its assets. We looked at the minimum requirement every company should have, and the features we selected were the lowest common denominator—so we included them in the Essential edition. Astaro offers many more security functions in our Professional edition that customers can licence, but that really depends on what the customer needs and what their security policy indicates. Every company needs to evaluate and decide for themselves what is important to them.

AK: AT ETM, OUR MEMBER BASE INCLUDES QUITE A LOT OF SMALL AND MEDIUM-SIZED BUSINESSES. SO TELL ME, HOW WILL THIS OFFERING BENEFIT THEM?
GH: For those small companies who are currently looking for a new security solution, or whose existing security solution is up for renewal, they can now save money and use our solution for free. This helps them to invest the saved money in other business-generating activities like sales and marketing, saving cost on one side and investing it wisely in business-driving tasks on the other.

AK: LET’S QUICKLY TAKE A LOOK AT INTEGRATION. WILL USERS NEED TO DEACTIVATE OR COMPLETELY REPLACE THE SECURITY SOLUTIONS THEY ALREADY HAVE IN PLACE IN ORDER TO USE THE ASTARO ESSENTIAL FIREWALL?

GH: That option is completely up to the customer to decide. A company can augment their existing installation by using the Essential Firewall as an add-on to an existing solution, or they can replace it completely. The reduction in complexity and the increased ease of use of our integrated web user interface make it obviously the preferred choice, but again, a company needs to decide exactly what is right for them.

AK: HOW IS THIS OFFERING DIFFERENT TO THE FREE HOME USE EDITION ASTARO OFFERS?

GH: The Free Home Use edition, which has now been available for many years, is limited to home use only. We have many thousands of people using the Home Use edition today, and the community around it is steadily growing; you can see it at www.astaro.org/ourcommunityportal. The Home Use edition includes the complete feature set of the Astaro Security Gateway product, but it is not permitted to be used in companies and it is limited to a maximum of 50 users. Until now, administrators could only use this at home. With the new Essential Firewall edition, they can now use an unlimited version of the Astaro Security Gateway product with a reduced feature set in their company as well, completely free of charge.

“... every company can keep their current security policy while migrating towards virtualization.”

Gert Hansen CHIEF SOFTWARE ARCHITECT, Astaro

Gert Hansen co-founded Astaro in early 2000 and is responsible for product strategy and software design. He was previously with the internet service provider Plannet Systems, and also co-founded the web development company Timenet, gaining extensive experience in both technology and entrepreneurship. Gert studied computer science at the Technical University of Karlsruhe.
ASK THE EXPERT ■ E-DISCOVERY AND COMPLIANCE
Proactive Information Management ETM’s ALI KLAVER talks to SIMON TAYLOR (COMMVAULT) about changing the way organizations think about and manage their information into a top-down approach that aligns both IT and business perspectives. http://www.GlobalETM.com
“We’re aligning, for the first time and through a single piece of technology, better information and then data management of those information assets.”
AK: IT’S OBVIOUSLY NO SECRET THAT MISMANAGING DATA CAUSES HUGE DOWNSTREAM PROBLEMS IN BOTH INFORMATION DISCOVERY AND RECORDS COMPLIANCE. WHY IS THIS STILL SUCH A PROBLEM FOR MANY ORGANIZATIONS?
ST:
The best way to answer this is to separate out the requirements for discovery. A lot of the traditional problems are caused because organizations approach it from a top-down perspective, so it’s a business-driven need around some form of risk. Typically what happens—if we take records compliance, for example—is that an organization takes a very siloed implementation approach from an IT perspective to group that information together and keep it in a particular way. For instance, a finance organization complying with something like SEC legislation would take a very supervisory approach to the way it looks at content: keep all email communications, store them in a particular silo and interrogate that content. The problem in doing this is that you end up with multiple different silos and repositories, basically because no one has been thinking about it from a bottom-up perspective. Now, if we took an e-discovery stance, what causes most of the pain, particularly with data mismanagement, is having to search multiple different environments—online sources, applications, file systems, offline media, backup—and other difficult environments like laptops and desktops, which have to be imaged and interrogated fairly forensically to gather content. The answer is to think about how to consolidate some of this data. So with these two separate things—the keeping of records and organizing for compliance and supervision, versus e-discovery and finding things quickly when litigation or FOI requests hit—you get good alignment as to how information needs to be more proactively and strategically managed across the organization.

AK: TRADITIONALLY IT’S BEEN A TOP-DOWN PERSPECTIVE WHEN IN FACT IT’S ORGANIZING INFORMATION IN A CONSISTENT AND BOTTOM-UP WAY THAT WORKS. SO WHAT ARE THE CUSTOMER ADVANTAGES OF A PROACTIVE INFORMATION MANAGEMENT STRATEGY, PARTICULARLY AS IT RELATES TO LITIGATION AND CORPORATE GOVERNANCE?
ST:
First of all, if we focus on the risk aspect again: by organizing information more proactively you reduce the number of copies and align them to particular business strategies and classifications, thereby intuitively reducing risk. People keep records to reduce risk, so risk is a big driver. In proactively organizing information so that it’s classified correctly and organized for the right retention, businesses get better rationalization of records, and that’s the underlying principle of good records management. If we think about accessing content from an e-discovery perspective, one of the biggest problems with litigation is that too much information gets generated. For example, in the US, a simple set of criteria agreed as part of Federal Rules of Civil Procedure meet-and-confer sessions—typically an e-discovery-based discussion between attorneys—can result in massive amounts of information that need to be sifted through. However, if information were more intuitively identifiable, organized, readily accessible and aligned to access requirements, then you would almost implicitly reduce the volume of data being generated, with a direct flow-on effect of time and cost benefits. In summary, some of the key advantages of being more proactive are that you understand the records, reduce the amount you’re keeping and have better alignment with retention policies—by definition you’re achieving a better level of compliance. Good policies lead to good litigation management and reduced risk, so organizing information and having it readily accessible means that what you access from an e-discovery perspective is likely to be more responsive in the first instance, creating better downstream efficiencies which will ultimately reduce your legal costs.

AK: OKAY SIMON, LET’S GIVE THE AUDIENCE SOME SOLUTIONS. TELL US MORE ABOUT YOUR PRODUCT OFFERING—COMMVAULT SIMPANA 8?
ST:
Simpana 8 is a milestone in our technology—the largest release we’ve ever delivered in the product’s 12-year history. What is significant about 8 is that we’ve introduced a lot of additional information management capabilities—classification engines and more intuitive data mining—that ultimately allow people to organize, mine and access data quickly.
One of the things we’ve been trying to do, ever since we released the early versions of Simpana, is minimize the cost burden at an IT level of typical data management tasks that other vendors need different products to do. What Simpana 8 brings is a whole layer of information management capability built on top of earlier releases—Simpana 6 and 7 introduced very intuitive content indexing of every single piece of data that we touch—so that we’re exposing all of this data at a business level and making it more accessible. I like to think of Simpana as a technology that transforms data into information and makes it more valuable through better access, organization and classification. Today, Simpana is a technology with very defined, unified management of data, so that we reduce the cost burden. Finally, the classification and work-process technology that we’ve introduced, which we call content director, takes it one step further. Within the software you can create patterns of analysis of information so that if you identify particular records or classifications of information assets that meet particular requirements, not only can you group and orient them to particular business departments or individuals, but you can also workflow them and move them into different technologies. In effect, we are trying to create an information middleware of sorts—an information movement technology that not only understands the nature of information but allows you to realign how it is organized at a business level.

AK: YOU’VE TOUCHED ON COST ALREADY WHICH IS GREAT, BUT BECAUSE IT’S SUCH A HUGE ISSUE AT THE MOMENT, CAN YOU TELL US HOW SIMPANA 8 HELPS DRIVE DOWN COST?
ST:
On an IT level, the nature of the platform technology itself means that you don’t need to implement multiple separate products—one technology with different components does the same thing. So straight away, processes such as archiving, backup and replication are achieved with the same platform and one management interface. This significantly reduces the cost burden most organizations face in implementing data management software. In Simpana 8 we took our single-instance technology to the next level and created a block-level de-duplication technology built into everything we do at a software level, which means that if you’re archiving, backing up or replicating, we only store that data once. This creates a massive reduction in duplicated data, sometimes up to 40 times less storage, just by using our software. At an information level—let’s just focus on e-discovery because that’s a very good example—with the Electronic Discovery Reference Model, a nine-step process that a lot of organizations follow when finding electronically stored information for litigation, the main cost burdens are to do with what organizations do to find evidence. The second area of cost is legal, incurred through the processing and review of information. Using our type of technology you can classify information more explicitly when you’re trying to find it against a particular set of criteria, which drives down cost. By proactively content indexing and managing information, you don’t need to spend time sifting through information to provide the two or three thousand pieces of data that you think are the most responsive; you can find that information in a matter of minutes, significantly reducing both the IT and the business cost involved. When you’re faced with a litigation cost exceeding $800,000 for an average case, and you’re able to collapse the amount of effort required to pinpoint evidence by 50–60%, you can see that what you do proactively at an IT level has huge benefits in terms of passing responsive data to attorneys.
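Block-level de-duplication of the kind described above is, in general terms, a content-addressed store: data is split into blocks, each block is hashed, and a block is written only the first time its hash is seen. The Python sketch below is a generic illustration of that idea—the `DedupStore` class and the 4 KB block size are invented for the example, not CommVault’s actual implementation.

```python
import hashlib

class DedupStore:
    """Toy content-addressed block store: each unique block is kept once."""

    def __init__(self, block_size=4096):
        self.block_size = block_size
        self.blocks = {}   # sha256 digest -> block bytes (stored once)
        self.objects = {}  # object name -> ordered list of block digests

    def put(self, name, data):
        """Split data into blocks; store a block only if its hash is new."""
        digests = []
        for i in range(0, len(data), self.block_size):
            block = data[i:i + self.block_size]
            digest = hashlib.sha256(block).hexdigest()
            self.blocks.setdefault(digest, block)  # no-op for known blocks
            digests.append(digest)
        self.objects[name] = digests

    def get(self, name):
        """Reassemble an object from its block digests."""
        return b"".join(self.blocks[d] for d in self.objects[name])

    def stored_bytes(self):
        """Physical bytes held, counting each unique block once."""
        return sum(len(b) for b in self.blocks.values())
```

With a store like this, a backup copy, an archive copy and a replica of the same data all reference the same physical blocks, which is where the claimed storage reduction comes from.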
AK: THE COST AND TIME SAVINGS SOUND CONSIDERABLE, BUT HOW WILL COMMVAULT SIMPANA 8 BENEFIT STAKEHOLDERS CROSS-FUNCTIONALLY, SUCH AS IT, LEGAL, COMPLIANCE, END-USERS AND OTHER GROUPS ACROSS THE ORGANIZATION?

ST: Most people talking about information compliance and governance in the industry today refer to joined-up strategies between business and IT. The only way you can join up a strategy is to have cross-functional teams that understand the stakeholders involved in that process. These approaches benefit all of these people implicitly. For instance, better organization of information to do with email, whether you’re keeping it for e-discovery reasons or for better compliance, leads to better application management. You’ve also got CIOs who are trying to implement a holistic approach to information management. The reason why a lot of those projects fail is that they’re not joined up—it’s a top-down driven strategy that stops when it hits applications. Now we’re looking at how to streamline business and IT-based information management so that the right level of efficiency is achieved, which directly benefits the people exploiting and managing records. Finally, there are the high-level executives who are concerned about risk. The underlying concern is that a lot of these executives are accountable for the way information is organized and retained, and if it isn’t retained in the correct way they can be liable. This joins everyone up from the top right down to an IT level, and it makes them understand what role they play in delivering a more joined-up information strategy. It isn’t really about individual stakeholders; it’s about everyone being involved in the whole strategy so they understand the role they play in the better organization of information across the business.

AK: THIS SOUNDS LIKE A FLOW-ON EFFECT, SO IF A COMPANY CAN IMPLEMENT A CONSISTENT INFORMATION MANAGEMENT SOLUTION ACROSS THE BOARD, YOU’LL ONLY SEE THE BENEFITS. LET’S NOW LOOK AT A GLOBAL SCALE—HOW DOES COMMVAULT SIMPANA 8 APPEAL TO VARIOUS INDUSTRIES AND VERTICALS ACROSS THE GLOBE?

ST: We have a broad customer base with over 10,000 customers across nearly every vertical worldwide. When you look at information management it becomes very apparent that certain industries are more pertinent than others. For instance, the finance community is very highly regulated, with strict compliance requirements around retaining and supervising information that are aligned to institutions such as the SEC. Similarly, there are very tight regulations in the pharmaceuticals industry governing how drugs are created and managed, how patents are filed and how information is made accessible when litigation arises. On a worldwide basis, one of the biggest drivers getting people more focused in this area is litigation and e-discovery of electronically stored information. Lots of industries are now faced with challenges in court around electronic content and they need to find evidence immediately. Most organizations have some form of exposure in this area, so there’s really no downside to taking a more proactive approach to the way they organize information.

AK: HOW IS SIMPANA 8 DIFFERENT FROM OTHER OFFERINGS IN THE MARKET TODAY, AND WHAT IS THE MAIN POINT OF DIFFERENCE?
ST:
There are two main points of difference, and the first is unified technology. We are doing with one technology what others need several separate products to do. For instance, the Microsoft business information search technology within Simpana allows us to exploit assets and information in a way that’s aligned with how users typically find information, and it does so very quickly. It really doesn’t matter how many different impressions of information we gain from the business or applications—ultimately the technology is about ensuring that we’re minimizing the cost burden. Unification at the data level has direct benefits at an IT level, and the way we exploit information obviously gives direct productivity and risk-reduction benefits at a business level, because we’re exposing information in a way that is non-challenging. The second point, which is more important from a business point of view, is that we work from a joined-up thinking approach. We’re aligning, for the first time and through a single piece of technology, better information and then data management of those information assets. IT gets the benefits it wants, and more importantly, the business gets the benefits it wants in terms of reduced risk and better access to information.
Simon Taylor SENIOR DIRECTOR, INFORMATION ACCESS AND MANAGEMENT, CommVault
Simon is Senior Director for the CommVault Information Access and Management business worldwide, including information risk, e-discovery, compliance, information search, archiving and data analysis. His field of expertise covers a range of topics including information and data intelligence, data warehousing, application and information management.
When litigation arises, will you have everything you need at your fingertips?

Access & discover relevant ESI—including email messages, files, and backup data—at a moment’s notice with Simpana® software. When faced with legal challenges, every minute matters. After all, opposing counsel isn’t going to wait while you sift through years of documents, trying to find relevant data. CommVault® Simpana® 8 makes eDiscovery simple by providing you with a range of information management capabilities to deliver efficient and intuitive legal access to corporate email messages, files, and collaboration data. Simpana 8 enables search and classification across all ESI including laptop/desktops, backup copies, managed legal preservation, and archive retention—all from a single console and single infrastructure. To find out more about the most comprehensive, risk-averse, and cost-managed eDiscovery solution in existence, go to commvault.com.
BACKUP & RECOVERY | ARCHIVE | REPLICATION | RESOURCE MANAGEMENT | SEARCH
©1999-2009 CommVault Systems, Inc. All rights reserved. CommVault, the “CV” logo, Solving Forward, and Simpana are trademarks or registered trademarks of CommVault Systems, Inc. All other third party brands, products, service names, trademarks, or registered service marks are the property of and used to identify the products or services of their respective owners. All specifications are subject to change without notice.
CAREER PATH ■ MARTIN KUPPINGER
Career path
Meet: Martin Kuppinger Founder and Senior Partner Kuppinger Cole + Partner
Martin Kuppinger is the author of more than 50 IT-related books, as well as being a widely-read columnist and author of technical articles and reviews in some of the most prestigious IT magazines in Germany, Austria and Switzerland. He is also a well-known speaker and moderator at seminars and congresses. His interest in Identity Management dates back to the 80s, when he also gained considerable experience in software architecture development. He shares his career path so far with ETM...

WHAT IS THE BEST ADVICE YOU’VE RECEIVED?
MK:
A professor of mine once said when discussing options for studying and a future profession: “If you feel that you’ll ever regret not taking that step, do it.”
WHAT IS THE MOST REWARDING EXPERIENCE YOU’VE HAD?
MK:
Apart from my personal life, I have had many positive and rewarding experiences. Overall I feel the most rewarding thing is when business appears to be fair and trust is honoured. But I also like hearing feedback from competitors like: “This guy really knows a lot about Identity Management”. WHAT DO YOUR COLLEAGUES SAY ABOUT YOU? WHAT ARE YOUR STRENGTHS?
MK:
It’s probably best to ask them... but overall I think that my strengths are trustworthiness, knowledge and fairness. IF YOU COULD CHANGE ONE THING ABOUT YOUR JOB, WHAT WOULD IT BE?
MK:
I would like to spend less time travelling and more time at my home office.

FACT FILE
> Kuppinger Cole + Partner (KCP), founded in 2004, has become the leading Europe-based analyst company for all topics around Identity Management and Digital Identities.
> KCP stands for expertise, thought leadership and a vendor-neutral view on the broader “identity market”, including aspects like classical Identity and Access Management (IAM), Information Rights Management (IRM), Identity Risk Management, Digital Certificates, Cards and Tokens, Single Sign-On, Auditing, Federation, User Centric Identity Management and Identity 2.0, and many more.
WHAT DO YOU DO WHEN THE GOING GETS TOUGH?
MK:
Sports such as running and cycling help to really free up the mind. Besides this, personal task management reduces the risk of the going getting tough in the first place.
> Services include: Kuppinger Cole Strategic Advice, Kuppinger Cole Readiness Analysis, Moving from Cost Drivers to Business Support, Role Analysis and Role Management, and Structured Approach for Successful IAM Projects. > KCP provides results in their free newsletters and the comprehensive monthly KCP market view newsletter, through webinars, reports, studies, keynote presentations and events.
HOW DO YOU STAY UP-TO-DATE PROFESSIONALLY?
MK:
Briefings with vendors and end users are always interesting because they’re essentially the front line, and I keep up with recent research.
> KCP research covers over 100 companies in the broader IAM space without limiting itself to any key players. They observe the leading companies as well as the emerging ones.
If you would like more information about Martin Kuppinger go to www.kuppingercole.com, read his blog at http://blogs.kuppingercole.com, or visit www.id-conf.com for the European Identity Conference 2010, a Kuppinger Cole event.
EVENTS AND FEATURES ■ 2009/2010
Events and features 2009/2010
ETM is focusing on BI, GRC and Security

INTERCONNECTION WORLD FORUM 2010
DATES: 25 – 28 January 2010
LOCATION: London, UK
URL: www.iir-events.com

INTERNATIONAL CONFERENCE ON COMPUTER EDUCATION, MANAGEMENT TECHNOLOGY AND APPLICATION
DATES: 27 – 28 January 2010
LOCATION: Kathmandu, Nepal
URL: www.coemta.org

BLACK HAT BRIEFINGS AND TRAINING DC 2010
DATES: 31 January – 3 February 2010
LOCATION: Arlington, VA
URL: www.blackhat.com/html/dc2010/dc2010home.html

GARTNER BUSINESS INTELLIGENCE SUMMIT
DATES: 1 – 2 February 2010
LOCATION: London, UK
URL: www.gartner.com/it/page.jsp?id=927913

MARKETING WORLD 2010
DATE: 8 February 2010
LOCATION: San Francisco, CA
URL: www.frost.com/prod/servlet/summitsdetails.pag?eventid=169406039&as=attend

TDWI WORLD CONFERENCE 2010
DATES: 21 – 26 February 2010
LOCATION: Las Vegas, NV
URL: http://events.tdwi.org/events/las-vegasworld-conference-2010/home.aspx

VIRTUAL EDGE SUMMIT
DATES: 22 – 23 February 2010
LOCATION: Santa Clara, CA
URL: www.virtualedge.org
GARTNER BUSINESS PROCESS MANAGEMENT SUMMIT
DATES: 1 – 2 March 2010
LOCATION: London, UK
URL: www.gartner.com/it/page.jsp?id=928017

GARTNER BUSINESS INTELLIGENCE AND INFORMATION MANAGEMENT SUMMIT
DATES: 2 – 3 March 2010
LOCATION: Sydney, Australia
URL: www.gartner.com/it/page.jsp?id=1175212

GARTNER IDENTITY AND ACCESS MANAGEMENT SUMMIT
DATES: 3 – 4 March 2010
LOCATION: London, UK
URL: www.gartner.com/it/page.jsp?id=928020

CLOUD CONNECT
DATES: 15 – 18 March 2010
LOCATION: Silicon Valley, CA
URL: www.cloudconnectevent.com

GARTNER CUSTOMER RELATIONSHIP MANAGEMENT SUMMIT
DATES: 16 – 17 March 2010
LOCATION: London, UK
URL: www.gartner.com/it/page.jsp?id=934115

GARTNER CIO LEADERSHIP FORUM
DATES: 21 – 23 March 2010
LOCATION: Phoenix, AZ
URL: www.gartner.com/it/page.jsp?id=1189395

GARTNER BUSINESS PROCESS MANAGEMENT SUMMIT
DATES: 22 – 24 March 2010
LOCATION: Las Vegas, NV
URL: www.gartner.com/it/page.jsp?id=1216615

GARTNER INFRASTRUCTURE, OPERATIONS AND DATA CENTER SUMMIT
DATES: 24 – 25 March 2010
LOCATION: Sydney, Australia
URL: www.gartner.com/it/page.jsp?id=1177915

GARTNER BUSINESS INTELLIGENCE SUMMIT
DATES: 12 – 14 April 2010
LOCATION: Las Vegas, NV
URL: www.gartner.com/it/page.jsp?id=1118023

GARTNER ENTERPRISE INTEGRATION SUMMIT
DATES: 13 – 14 April 2010
LOCATION: Sao Paulo, SP
URL: www.gartner.com/it/page.jsp?id=1188513

GARTNER ENTERPRISE TECHNOLOGIES SUMMIT
DATES: 21 – 22 April 2010
LOCATION: Mexico City, Mexico
URL: www.gartner.com/it/page.jsp?id=1188514
Interested in contributing? If you’re an analyst, consultant or an independent and would like to contribute a vendor-neutral piece to future issues of ETM, please contact the managing editor, Ali Klaver: aklaver@enterpriseimi.com. 82
INTEROP LAS VEGAS DATES: 25 – 29 April 2010 LOCATON: : Las Vegas, NV URL:www.interop.com/lasvegas
My CEO doesn’t know my name. And that’s the way I plan to keep it. Effective data security is key to preventing breaches, simplifying the compliance process and reducing risk to your organization. Let us help you focus your time, money and resources on more strategic projects, reduce the workload of securing critical information, and streamline compliance reporting for mandates such as PCI DSS, HIPAA, NERC, and Sarbanes-Oxley.
Our solution provides a multi-level approach to data security and compliance:
• NetIQ® Security Manager™ – from log management to complete SIEM
• NetIQ® Secure Configuration Manager™ – from compliance assessment to security configuration auditing
• NetIQ® Change Guardian™ – privileged-user management and file integrity monitoring
• NetIQ® Aegis® – the first IT process automation platform for security and compliance
If you’d like to find out more about how NetIQ can help you with data security and critical compliance mandates, visit www.netiq.com/undertheradar or contact info@netiq.com.
© 2009 NetIQ Corporation. All rights reserved. NetIQ, the NetIQ logo, NetIQ Security Manager, NetIQ Secure Configuration Manager, NetIQ Change Guardian, and NetIQ Aegis are trademarks or registered trademarks of NetIQ Corporation or its subsidiaries in the United States and other jurisdictions. All other company and product names may be trademarks or registered trademarks of their respective companies.