The independent resource for IT executives | July 2009 | www.globaletm.com
Perimeter breach: Are your security measures tough enough?
Derek Brink (Aberdeen Group) | Roger Southgate (ISACA Past President) | Jason Creasey (Information Security Forum) | Stephane Fymat (Passlogix)
ETM | Contents page
Contents
7 Editor's page and contributors page
8 The three-in-one challenge
Enterprises today are doing their best to address three simultaneous objectives: enhance security, achieve and sustain regulatory compliance, and improve the efficiency and cost-effectiveness of their ongoing operations. DEREK BRINK (ABERDEEN GROUP) explains how it’s possible to face this challenge head-on.
12 Staying ahead of the hack
Budget cuts and compliance issues are affecting the role of IT professionals, and staying one step ahead of the breach is only getting more difficult. DEREK BRINK (ABERDEEN GROUP) seeks answers from MOREY HABER (EEYE DIGITAL SECURITY), JAN HICHERT (ASTARO AG), and JIM WAGGONER (SYMANTEC) on how to increase protection while staying competitive.
24 Don’t lose control…
When it comes to security and risk, how are your applications controlled, and are they proving to be lethal or lifesaving? ROGER SOUTHGATE (ISACA past president, London) has the solution for application controls, and shows how to make the best of a bad situation.
july 2009
28 No breaches—guaranteed
With the growing importance of IT security today, is it possible to be fully protected? MANU NAMBOODIRI (BITARMOR) takes us through an information-centric security solution that boasts better, more cost-effective protection—and the first ever No-Breach Guarantee™.
32 The threat horizon
As new technologies evolve and change, so too does online crime in all its forms, creating a very real challenge for business management and security. Jason Creasey (Information Security Forum) looks at the threats facing information security professionals over the next two years.
36 Security first
The increase in security breaches has left companies scrambling to protect themselves and their most precious asset. TODD TUCKER (NETIQ) tells us why so many organizations have failed to secure their data, and what they can do about it.
42 Closing the security gap
Access to privileged and shared accounts has become something of a free-for-all in organizations today. STEPHANE FYMAT (PASSLOGIX) explains why it’s crucial to find a solution for managing shared credentials.
46 A parallel universe
In order to seamlessly move from the physical to the virtual world, IT managers need unified, global control over their evolving physical and virtual data centers to meet security and compliance requirements, warns DAVID McNEELY (CENTRIFY).
50 Events and features listing
Growth, Innovation & Leadership: A Frost & Sullivan Global Congress on Corporate Growth
GIL 2009: India - Bangalore, India (October 2009)
GIL 2009: Asia Pacific - Kuala Lumpur, Malaysia
GIL 2009: China - Shanghai, China
GIL 2009: Europe - London, UK
GIL 2009: Middle East - Abu Dhabi, UAE
GIL 2009: Latin America - São Paulo, Brazil
GIL 2009: North America - Phoenix, Arizona
Frost & Sullivan’s premier client event, GIL – Growth, Innovation and Leadership – supports senior executives in their efforts to accelerate the growth rates of their companies. Each year, thousands of CEOs and their growth teams return to engage in this global community to explore actionable strategies, solutions, and growth processes that they can put to work in building a solid Growth Acceleration System. GIL 2009: India is a "must attend" for any organisation seeking fresh perspectives, new ideas and innovative, practical solutions to stay ahead of the curve. Please mention partner code GILETM09 to receive a Rs. 7,500 saving, courtesy of Enterprise Technology Management.
Register Today! www.gil-global.com Email: gilindia@frost.com Tel: + 91 22 4001 3422
Capturing innovative ideas for growth.
Editor’s Page | ETM
Protect your assets, or be prepared for a breach
What a dynamic month it’s been. From Twitter’s unexpected role in the Tehran revolution, to the hype around the new iPhone 3G S, the world is preoccupied with technology and its uses. This comes as no surprise to those on the front line, who are able to predict and develop these technologies to keep up with public demand, and indeed to anticipate it. But in the rush to implement and utilize these capabilities, many individuals and companies are not adequately protecting themselves from security attacks.

In this issue of ETM we’re looking at security, and the need to keep pace with advancements such as the increasing adoption of virtualization, which opens up a whole new set of security responsibilities. Enterprises today are doing their best to address three simultaneous objectives: enhance security, achieve and sustain regulatory compliance, and improve the efficiency and cost-effectiveness of their ongoing operations. Derek Brink explains how it’s possible to face this challenge head-on. He also moderates our exclusive panel discussion on managing threats and vulnerabilities with the expert advice of Morey Haber, Jan Hichert and Jim Waggoner.

BitArmor’s No-Breach Guarantee™ is certainly worth a read (and a listen), as Manu Namboodiri takes us through a cost-effective, information-centric security solution. David McNeely from Centrify tells us that a global, unified approach is needed to control evolving physical and virtual data centers to meet security and compliance requirements, and Todd Tucker explains why so many organizations have failed to secure their data, and what to do about it. We also take a quick look into the future, where Jason Creasey from the Information Security Forum surveys the threat horizon over the next two years.

Hopefully in this issue of ETM you’ll find the answers to at least some of the security challenges facing IT professionals today.
Thank you for reading, and if you would like to contribute to any future issues of ETM, please feel free to contact us at www.globaletm.com or via email at editor@enterpriseimi.com
Ali Klaver Managing Editor
Founder/Publisher: Amir Nikaein
Managing Editor: Ali Klaver
Podcast/Sound Editor: Mark Kendrick
Transcriber: Ann Read
Art Director: Vee Sun Ho
Head of Digital: Juliusz Kwiatkowski
Project Director: Yen Nguyen
Finance Director: Michael Nguyen
Account Executives: Fay Hampton, Joe Miranda

Contributors
Derek Brink, Vice President and Research Director, IT Security, Aberdeen Group
Roger Southgate, Immediate Past President, ISACA, London Chapter
Jason Creasey, Head of Research, Information Security Forum (ISF)
Stephane Fymat, Vice President of Strategy and Product Management, Passlogix

How to contact the editor
We welcome your letters, questions, comments, complaints, and compliments. Please send them to Informed Market Intelligence, marked to the editor, Level 4, 35-39 Moorgate, London, EC2R 6BT or email editor@enterpriseimi.com
PR submissions
All submissions for editorial consideration should be emailed to editor@enterpriseimi.com
Reprints
For reprints of articles published in ETM magazine, contact sales@enterpriseimi.com. All material copyright Informed Market Intelligence. This publication may not be reproduced or transmitted in any form, in whole or in part, without the express written consent of the publisher.
Enterprise Technology Management is published by Informed Market Intelligence
Headquarters: Informed Market Intelligence (IMI), 35-39 Moorgate, London, EC2R 6AR, United Kingdom, +44 207 148 4444
Tokyo: 1602 Itabashi View Tower, 1-53-12 Itabashi, Itabashi-ku 173-0004, Japan
Dubai (UAE): 4th Floor, Office No. 510, Building No. 2 (CNN Building), Dubai Media City, Dubai
Analyst feature | Security compliance and management
The three-in-one challenge
Enterprises today are doing their best to address three simultaneous objectives: enhance security, achieve and sustain regulatory compliance, and improve the efficiency and cost-effectiveness of their ongoing operations. DEREK BRINK (ABERDEEN GROUP) explains how it’s possible to face this challenge head-on.
Aberdeen Group’s ongoing IT security research has shown that the “Best-in-Class” approach to this three-in-one problem is to ensure that the IT infrastructure is secure, compliant, and well-managed… in that order. How are the companies with top performance (referred to by Aberdeen as “Best-in-Class”) attacking the three-part nature of these sometimes overlapping, sometimes conflicting objectives? In part, by making use of the incredible volume of data that is already being generated by their existing computing infrastructure: for example, the logs that record information about the events that take place throughout an organization’s network devices, servers, endpoints, operating systems, applications, and databases. The number, volume, and variety of security-related logs have increased significantly, which has created the need for log management solutions to address the process of generating, transmitting, aggregating, storing, and eventually disposing of log data. Log, information, event, flow, and session data is also generated by an organization’s existing security solutions, such as anti-virus software, intrusion detection/intrusion prevention systems, and identity and access management solutions—and a wide variety of other sources. In general, security information and event management (SIEM) solutions are complementary to log management, in that they are used to interpret and take action on security-related log, information, and event data.
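To make that division of labor concrete, here is a minimal sketch in which a log-management layer collects and normalizes raw lines into common records, and a SIEM-style layer aggregates them and takes action. The raw log lines, field layout, and alerting threshold are all invented for illustration; real products work on far richer data.

```python
# Minimal log-management/SIEM sketch: collect -> normalize ->
# aggregate -> alert. Log format and severity rule are invented.
RAW_LOGS = [
    "2009-07-01T10:00:01 fw01 DENY src=10.0.0.5",
    "2009-07-01T10:00:02 fw01 DENY src=10.0.0.5",
    "2009-07-01T10:00:03 ids02 ALERT src=10.0.0.5",
    "2009-07-01T10:00:04 fw01 ALLOW src=10.0.0.9",
]

def normalize(line):
    """Parse one raw line into a common event record."""
    timestamp, device, action, src = line.split()
    return {"time": timestamp, "device": device,
            "action": action, "src": src.split("=", 1)[1]}

def aggregate(events):
    """Count suspicious events (DENY/ALERT) per source address."""
    counts = {}
    for ev in events:
        if ev["action"] in ("DENY", "ALERT"):
            counts[ev["src"]] = counts.get(ev["src"], 0) + 1
    return counts

def alert(counts, threshold=3):
    """Take action: report sources at or above the threshold."""
    return [f"ALERT: {src} ({n} suspicious events)"
            for src, n in counts.items() if n >= threshold]

events = [normalize(line) for line in RAW_LOGS]
print(alert(aggregate(events)))
# ['ALERT: 10.0.0.5 (3 suspicious events)']
```

The point of the sketch is only the shape of the pipeline: log management owns the first two steps, while SIEM-style correlation and alerting sit on top of the normalized records.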
What to do with all that data
The sheer volume of log, information, and event data that is generated every day can be staggering. One large insurance company with 4,000 devices in four locations generates about 80,000 unique events per second, or about 6.9 billion events per day. A US-based financial institution has 20,000 devices in 18 locations, generating 15.5 billion events per day. A global services provider with 30,000 devices in 34 locations generates 20 billion events per day. On average, the number of unique data sources for companies participating in Aberdeen’s most recent benchmark study was about 5,000. Not all of this data is noteworthy, of course, from a security and compliance perspective—quite the contrary; the primary objects of interest are the exceptions, not the norms. The need to find the proverbial needles in this haystack of data illustrates why companies are turning to technical solutions that can help them leverage their security-related logs, information, and events. Like most topics in IT security, a series of “verbs”, such as those listed in Table 1 below, can be used to describe the process of leveraging logs, information, and events in a lifecycle model, i.e. from the initial discovery of data sources to the eventual deletion of collected data. Most important are the verbs that describe taking action, such as alert, report, prioritize, and remediate. As IT environments grow simultaneously more mission-critical and more complex, the ability to turn “all that data” into
Table 1: The “verbs” of log/information/event management
Generate data: Discover sources; Collect
Manage data: Store; Secure; Retain; Index; Search; Delete
Interpret data: Normalize; Aggregate; Correlate; Analyze; Detect incidents; Prioritize incidents
Take action: Alert; Report; Generate information required for remediation; Prioritize remediation; Automate remediation
Source: Aberdeen Group, Leveraging Logs, Information and Events (March 2009)
useful information begins to transform the importance of security-related log, information, and event data beyond mere reporting on security and compliance, to providing visibility and insight into the underlying services that drive the company’s operations. The ultimate reason to leverage logs, information, and events is to provide better visibility, understanding, and insight into the complex IT environments that are the foundations for running and growing the business. Efficiency at the “verbs” of the log/information/event lifecycle is necessary, but not sufficient, to achieve these ends. If the generation and management of all that data is merely to generate static reports to satisfy the next auditor, the opportunity to interpret the data and take action to drive more value from the IT infrastructure is lost. Looking ahead, the most valuable IT security professionals will be those who can successfully interpret the implications of security-related logs, information, and events, and more importantly, who can successfully drive actions that positively impact the business.

Taking it up a level: IT GRC
The rise in importance of IT governance, risk management, and compliance (IT GRC) initiatives reflects the increasing recognition that the strategic value of IT lies not in the mere technology itself (which is generally accessible to everyone), but in how it is applied and managed most effectively. Aberdeen’s research has found that for the IT function, the GRC acronym is not listed in the actual order of appearance of formal corporate initiatives. The de facto order for IT GRC initiatives, based on the findings from Aberdeen’s benchmark studies, has been first security, then compliance, then IT governance, then IT risk management. This pattern holds true independently of how successful the initiatives ultimately are. The research also shows that Best-in-Class organizations are 2.3 times more likely to have adopted a “continuous improvement” approach to their IT GRC initiatives, underscoring their commitment to managing IT as a strategic asset.

Note that “managing risk” actually manifests itself in two distinct ways. Companies are compelled to make investments for security and compliance (these are unrewarded risks), but most are not realizing meaningful returns from these expenditures. And in managing these unrewarded risks, they are painfully distracted from managing the type of rewarded risks that really matter: those that create value for their customers and ultimately grow the business. These financial, strategic, operational, and other types of rewarded risks are what most people mean by the “R” in IT GRC. Aberdeen’s research has found that Best-in-Class companies are significantly more likely than others to describe their current approach to IT GRC as “centralized” and “primarily automated.”

Figure 1: Market trends—absolute vs. relative adoption. [The chart plots absolute adoption (percentage of Best-in-Class, 25% to 75%) against relative adoption (ratio of Best-in-Class to Laggard, 1.0 to 7.0) for log management, SIEM, ERM platform/software, and IT-GRC platform/software.]
Source: Aberdeen Group, IT GRC: Managing Risk, Improving Visibility, and Reducing Operating Costs (May 2009)

Two-thirds of the Best-in-Class
further characterize their IT GRC initiatives using attributes such as “risk-based”, “event-driven” and featuring “automated workflows for incident response.” In contrast, the lagging companies are 3.5 times more likely to be using “manually intensive” controls and procedures. While this is consistent with the general “Crawl, Walk, Run” pattern commonly seen in Aberdeen’s information technology research, the point is not to be good at the process of security, or compliance, or governance, or risk management for its own sake—the point is to harness IT more effectively in support of achieving business objectives while managing financial, strategic, and operational risks.

Enabling technologies, revisited
In the context of IT GRC initiatives, log management and security information and event management are becoming baseline technologies, in the sense that they are being used not only by a majority of the Best-in-Class but also by a relatively high proportion of other respondents (see Figure 1). Other technologies to watch include IT GRC “platform” solutions, which are in the early adoption phase by the Best-in-Class (modest absolute adoption; high relative adoption), and Enterprise Risk Management “platform” solutions, which are in the emerging quadrant for the Best-in-Class companies in Aberdeen’s most recent studies.

The benefits of top performance
Through its time-tested benchmarking process, Aberdeen’s research found that Best-in-Class organizations are making greater strides in
their IT GRC initiatives, leading to numerous strategic and operational benefits, including:
- Significantly larger year-over-year improvements in their ability to identify, assess, and prioritize risks
- Improved access and visibility for business owners regarding current risk status
- Better communication of risks to key stakeholders
- Enhanced capabilities to translate risk assessment data into recommendations, enabling faster decision-making
- Significantly greater year-over-year improvements in compliance-related tracking and reporting
- Better flexibility to adjust to new or updated regulatory requirements
- Improved access and visibility for business owners regarding current compliance status
- Better communication of compliance status to key stakeholders.

Aberdeen’s research demonstrates that IT GRC initiatives are continuing to grow in relevance, as a direct result of their ability to apply and manage IT more effectively, and thereby to maximize its strategic value to the organization. A full copy of Aberdeen’s benchmark report, IT GRC: Managing Risk, Improving Visibility, and Reducing Operating Costs, is available at no charge from Aberdeen Group until 31 July. See the website for more information: www.aberdeen.com

Derek Brink
Vice President and Research Director, IT Security, Aberdeen Group
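The adoption quadrants described in the article (baseline, early adoption, emerging) can be paraphrased as a simple classifier over the two axes of Figure 1. The sketch below is purely illustrative: the cut-off values and data points are my own assumptions, not Aberdeen’s published definitions.

```python
# Sketch of the Figure 1 quadrant logic: place a technology by its
# absolute adoption (share of Best-in-Class using it) and relative
# adoption (Best-in-Class : Laggard ratio). Thresholds are
# illustrative assumptions, not Aberdeen's published cut-offs.
def classify(absolute_pct, ratio, abs_cutoff=50.0, ratio_cutoff=2.0):
    if absolute_pct >= abs_cutoff:
        return "baseline"        # broadly adopted, e.g. log management, SIEM
    if ratio >= ratio_cutoff:
        return "early adoption"  # modest absolute, high relative adoption
    return "emerging"

# Hypothetical data points in the spirit of the figure:
print(classify(70.0, 1.5))   # baseline
print(classify(35.0, 4.0))   # early adoption (e.g. IT GRC platforms)
print(classify(20.0, 1.2))   # emerging (e.g. ERM platforms)
```

The useful part of the framing is relative adoption: a technology with modest overall uptake but a high Best-in-Class-to-Laggard ratio is one the top performers are betting on early.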
Derek joined Aberdeen Group in 2007 as a senior high-tech executive experienced in strategy development and execution, corporate/business development, and product management/product marketing. He brings a unique blend of analytical and technical background, combined with excellent communication skills and extensive information security industry expertise. Prior to Aberdeen, Derek’s industry experience included positions with RSA Security (now part of EMC), Gradient Technologies (now Entegrity), Transarc (a subsidiary of IBM), Sun Microsystems (now Oracle), and Hewlett-Packard. He began his professional career as an analyst for the Central Intelligence Agency. Derek earned an MBA with honors from Harvard Business School and a BS in Applied Mathematics with highest honors from the Rochester Institute of Technology.
Executive panel | Threats and vulnerabilities
Staying ahead of the hack
Budget cuts and compliance issues are affecting the role of IT professionals, and staying one step ahead of the breach is only getting more difficult. DEREK BRINK (ABERDEEN GROUP) seeks answers from MOREY HABER (EEYE DIGITAL SECURITY), JAN HICHERT (ASTARO AG), and JIM WAGGONER (SYMANTEC) on how to increase protection while staying competitive.
DB: Aberdeen conducted benchmark research last fall on Unified Threat Management, also known as UTM, UTM+, UTM 2.0, Extended UTM, xUTM, All-in-One Security, Multi-Function Security, Integrated Security—there is no doubt the names are confusing! But we found that the companies with top performance were realizing tangible benefits from adopting the UTM approach, such as year-over-year reductions in actual threat/vulnerability-related incidents, audit deficiencies, unscheduled downtime, and total associated staff. Jan, are these still among the main value propositions seen by Astaro clients?
JH:
Well firstly, with regards to the terminology, when we started Astaro in 2001, the term Unified Threat Management wasn’t even born—we called it an all-in-one firewall back then.
It doesn’t really matter what you call it; what is relevant is the idea of consolidating your various security applications, or in our case, your network security related applications. That’s what Astaro does and what Unified Threat Management, or UTM, means. Our customers are deploying firewalling, virus protection, web filtering, and spam filtering, all in under an hour and for low cost on one device. This is very easy to manage, and our customers receive enterprise-class protection. The outcome is that our customers can now spend more time on specific IT security related threats or even certain regulatory requirements, and that works really well. Primarily we’re seeing small- and mid-sized enterprises with slashed budgets counting the number of network security applications that they’ve deployed through their businesses and realizing that UTM scales pretty well, and offers efficiencies that benefit any sized company. That’s really the advantage of all-in-one network security, or UTM.
taking data directly from the end-users, and some of these benefits are quantifiable for companies who are getting the top results. Morey, you also talked about an integrated approach to managing threats and vulnerabilities, which if I understand correctly, eEye achieves not through an “all-in-one” solution per se, but by a portfolio of solutions that are integrated by a common management console. Have I described this correctly? What do you believe are the advantages of this approach?
MH:
That approach is true for eEye solutions but not necessarily for the eEye product line itself. eEye believes that each product should be a best-of-breed, all-in-one solution, but it does rely on other components in order to make it that all-in-one solution. Our Blink Endpoint Protection product is a perfect example of that, and contains seven complete layers of security protection in a single endpoint agent. We achieve the best results through an integrated threat management solution using Blink or a Retina Network Security Scanner, and the Retina Security Management Console. These two tools can come together as a software appliance, and then provide that all-in-one approach, as we mentioned.

As for the advantages, attacks and threats can occur anywhere in the network, and protection and identification are best served near the source for containment or rapid identification. Monitoring vulnerability assessment data, endpoint attack data, and even web firewall data in an all-in-one console allows for quick correlation, analysis, and rapid response. One component in itself can’t be all-in-one; it can be best-of-breed, but when tied together as a solution, it becomes pretty much that all-in-one solution. If you consider each eEye tool best-of-breed, merged into an all-in-one integrated threat management solution, it creates the kind of heterogeneous, single-vendor environment that a lot of customers want. So we ask clients: “What are the risks? How are they being attacked? How can they mitigate the threats, including zero-days?” And then we figure out how to report that data in one solution, dealing with one vendor, especially when resources are tight, as they are today.

Listen to the podcast: http://www.globaletm.com/featured_podcasts/Article/staying_ahead_of_the_hack

DB: Thank you, Morey. The next question for Jim at Symantec will probably make the discussion a bit broader. Jim, we’re talking about managing threats and vulnerabilities on the network side, and since Symantec has such an enormous base of endpoints in the enterprise, I’d like to ask you to comment on the endpoint security
capabilities that your customers are adopting in addition to traditional threat and vulnerability management capabilities, and Symantec’s approach to that.
JW:
If we go back a couple of years, everyone was using antivirus, and they adopted antispyware when spyware and adware became very rampant, not only in the enterprise but among consumers as well. What we found at the time was that we had a couple of products: one was the basic antivirus/antispyware; another added in the firewall and IPS. But what we noticed was that not a lot of customers were adopting the firewall and IPS. At the time we saw about 10% of our customers adopting it, and the main reason for this is they felt antivirus and antispyware were good enough. Although that’s a great baseline, we know that you need more than just antivirus, so we combined different technologies, adding access control, application control, whitelisting, and device control, as well as antivirus, antispyware, firewall, and IPS, and delivered that to our customers. We saw very quick adoption, and I think in your paper, Derek, you saw that as well, where greater than 80% of the customers at the endpoint were adopting firewall and IPS. When we talked to our customers we found that they’d become advanced enough to use the firewall and the IPS, and now the new trend is that they’re asking for more—they want network access control to ensure their systems are protected. They’re also looking to protect the data on the endpoint
where there are a lot of mobile workers using mobile devices that companies don’t have total control over. Similarly, the proliferation of removable media such as USB drives is causing worry. Everyone has them; they’ve got stacks of them on their desks or in their drawers, and all of them have company data. So customers are asking us for solutions on how to protect that data, and to integrate them into single client software that focuses on performance.
JW:
Well, a secure endpoint is a well-managed endpoint. I think this comes back again to some of the statistics that you were showing, where you have the average user, you’ve got the Best-in-Class, and you’ve got the Laggards. I think your average was around 50%, Best-in-Class about 20%, and Laggards about 30%, and that’s the same thing that we’re seeing. The reason this is happening is that customers start out with the basics—you’ve got the security down at the endpoint, and your laggards are looking at their
simple reporting and they’re seeing threats hitting their environment. They’re trying to put protection technologies in place to detect them. The next step is to stop them, but then they begin to ask questions like: “How can I prevent it? Do I have solutions in place, from a management point of view, that will allow me to prevent the attacks from hitting the endpoint and getting on to my network?” So not only are they responding to these attacks and threats, which takes quite a bit of time, they’re also looking to operationalize it. I asked one of my customers how they dealt with a virus outbreak, and what they used to do is unplug the computer from the port. So they go to a closet, they identify the right network cable for that computer, and unplug it so they’ve isolated that system. Then they’ll grab that person’s computer, they’ll ghost the image down so they can restore it from scratch, and then get the user up and running and make sure they’re clean. If you think about the amount of time spent going through that process, of course you’ll want to make it more efficient. From a centralized management perspective, where you’re not just looking at threats but also your assets, inventory, backup, and distribution, you need to come up with a model to isolate a detected system, lock it down, and then remotely reimage it and get the user back up and running within minutes as opposed to hours. Then from a console you can review that information and identify whether the user was attacked, what sort of data they had on their system, what other businesses or systems might be impacted by that, and what sort of risk-based approach or information approach to take in order to isolate those vulnerabilities. So again, we see some of the users that take a very basic approach
fall behind the average, and then we see the ones going best-in-class starting to ask about becoming more efficient, and that’s where we see the integration of endpoint management and endpoint security. It’s a very powerful thing. DB: Before we get to the freeform discussion, I’d like to work back from the discussion we’ve been having about endpoints. Morey, you mentioned your endpoint security solution, and I think you also mentioned the consolidation around a single agent. Can you tell me about the trends in reducing the number of security agents that are installed at the endpoints, and why?
MH:
Well, as my colleague at Symantec mentioned, there definitely has been a trend in this area. Endpoint security pretty much started with just antivirus, and Gartner was bold enough recently to state that traditional antivirus is dead, even though it’s still being offered by many vendors as a standalone tool. Over time the threats have evolved into blended threats that use exploits, spyware, exploitation of port configurations, and, more prominently now, social engineering attempts, whether attachments that carry exploits or phishing attempts. Vendors have pretty much tried to tackle the problem by introducing targeted solutions. We’ve had antispyware add-ons, we’ve had add-ons for DLP, and some vendors only sell HIPS today—and they do fairly well out of it.
So having a multitude of these types of agents requires separate installation, perhaps another management console, and, more importantly, they tax the system’s resources. Vendors like eEye recognize this problem (I’m aware that my colleagues at Symantec do as well), and we’ve tried to consolidate a lot of the features and functions of the security agents, whether through suites, acquisitions, or integration. The hard part about all of this is that in many cases customers need the best-of-breed security product on their endpoint, but they can’t deal with the bloatware. They need the features to protect against data loss, phishing, and zero-day attacks, plus firewall and HIPS and so on, but they need something that’s going to work on their older hardware, because those machines are not being replaced. They need something that can come in with a small footprint, just like the older AV products they have. So it becomes very cost-prohibitive in many cases to use these larger tools, because they need a lightweight footprint. eEye is taking the approach that the single Blink solution is not an aggregate of integration suites or acquisitions, and instead is a fairly lightweight agent that is in many cases smaller than a lot of the free AV solutions on the market. We tie that to our all-in-one management console, again whether as software or an appliance, and reduce the footprint to one agent without relying on à la carte add-ons, loaded integrations, or multiple vendors in a suite. So we see the trend moving towards that all-in-one lightweight agent because of system resources, the limited ability of customers to replace hardware in these economic times, and the need for the latest security protection, especially when compliance initiatives such as PCI, HIPAA, and SOX require additional work.
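A practical first step toward the consolidation Morey describes is simply knowing how many separate security agents each endpoint runs. The sketch below is purely illustrative (the hostnames, agent lists, and threshold are invented, not drawn from any vendor's tooling): it flags machines running several separate agents as candidates for consolidation onto a single-agent solution.

```python
# Toy consolidation report: given an inventory of endpoints and the
# security agents installed on each (all data invented for
# illustration), flag machines running more than max_agents separate
# agents as candidates for a single-agent replacement.
INVENTORY = {
    "desktop-01": ["antivirus", "antispyware", "hips", "dlp"],
    "desktop-02": ["antivirus"],
    "laptop-07":  ["antivirus", "antispyware", "firewall"],
}

def consolidation_candidates(inventory, max_agents=2):
    """Return hostnames running more than max_agents agents, sorted."""
    return sorted(host for host, agents in inventory.items()
                  if len(agents) > max_agents)

print(consolidation_candidates(INVENTORY))
# ['desktop-01', 'laptop-07']
```

The same per-host count is also a rough proxy for the footprint problem raised above: each extra agent means another installer, another update channel, and more system resources consumed.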
Executive panel ■ Threats and vulnerabilities
In a recent eEye case study, Conficker was pretty much dead, but several otherwise well-run organizations did get impacted. In the case study, basically 100% of the machines in the environment that were not patched for the Microsoft vulnerabilities did get infected, because the company had not adopted a rigorous patch management program, and everything from desktops to servers was affected. They were using a traditional AV solution at the time, with no HIPS, and it could not even remove the infection. The saving grace was that one executive in the company chose to use Blink as a trial solution, and he was the only one that was not affected. The company made a very clean transition to a single-agent approach, versus à la carte add-ons. So in this case study it's one environment, one agent, and one person doing the management, and that makes it much easier on the resources of that organization than dealing with three or four different vendors plus management consoles and training.

DB: Thank you, Morey. You actually reminded me of one of the findings from the study that we did, a simple fact about the average replacement cycle for endpoints in the enterprise: we found it to be about 3.2 years and trending longer in this difficult economy. In other words, you made a point about older systems and the need for a lighter-weight footprint, and given the current economy companies are trying to stretch the replacement cycle. That's one of the factors we found that gives credence to the point you're making.
18
july 2009
Jan, thanks for being patient. In past discussions with Astaro it seems that the company takes pride in being somewhat “contrarian” in its approach to advancing the market—is this true? If so, what would be some recent examples?
JH:
Just for clarification, we're seeing very similar trends to those discussed by my peers on the panel, on the client side and on the gateway side, with respect to all these different applications that need to run. I don't think the situation is better or worse; I think the gateway simply complements a desktop solution very well. It's all about integrating those best-of-breed or point solutions into one easy-to-use solution, and I think much of our success is based on the fact that we aren't relying exclusively on our own resources. Developing all these various solutions in-house is unrealistic, even for a company with very large development teams. What we're seeing from the big companies is that they're acquiring smaller companies to get this common set of functionality. We have gone to the open source ecosystem, and we source a lot of our applications from the open source arena, where there is great technology available. I think by now it's accepted in the software industry that the open source development model itself has some inherent advantages in developing secure software; however, it's no panacea. It has a lot of advantages from its transparent model, and as integration vendors, which we all are, the challenge is to make these different client products work seamlessly together. We feel we can do that much better by using open source technology, where we can always choose the best component and ensure it works seamlessly. That's what a lot of our success is based on, because a lot of our
customers realize the benefit of open source in terms of quality and functionality, but also, at the end of the day, the price of our solutions is very competitive using that development model. At the same time, a lot of our firewall appliance-based competitors come from the router-firewall world, so upgrading their hardware to deliver the latest web protection is very expensive and only possible on long cycles, and they are being forced towards a more software-centric approach. Astaro can get to market more quickly than competitors, benefitting from the performance of Linux and other open source tools. That makes a real difference for us. Another benefit Astaro gets from that software-centric approach is that we also offer a software solution. Our customers can go to our website, download the image for our appliances, and install it either on hardware they've purchased and configured for their specific needs or on spare hardware they have in stock, turning that hardware into their dedicated firewall UTM appliance. Alternatively, they can choose our VMware virtual platform to save on hardware in the first place. We're really the only company approaching the UTM market that way, and that makes a lot of our partners and customers happy.

DB: Now I'd like to open it up to a more freeform discussion. Obviously, we're addressing IT professionals and we're trying to help the development of strategy and purchase decisions. The last two years of research that I've conducted at Aberdeen show that it's very much a point-product world. The term we use for the top 20% of any given
study is “best-in-class”, and among those best-in-class companies we definitely do see a trend with the things we’re discussing here like consolidating functionality, simplifying management, and reducing the number of vendor relationships. What we’ve found with the other 80% is a very tactical approach. A problem is identified and then a solution is deployed to address it. That situation gets repeated over a number of problems, and suddenly companies find themselves maintaining and supporting multiple products. So what do we say to the companies who are in this position, further back along the path but moving towards this best-in-class approach, and what advice do we give in terms of strategy and the approach that might work best?
JW:
Well, it goes back to a customer of mine at Symantec whom I would definitely put in the bottom 80%. What they had deployed down to the endpoint (workstations, laptops, servers) was very simple: an antivirus/antispyware-only solution. They showed us centralized reporting from the antivirus and antispyware, and we saw that of the roughly 350,000 attacks during that time window, the top two accounted for over 70% of all attacks. So I asked them if they would like to be able to prevent those and instead focus on the remaining 30% of attacks, which is minuscule by comparison. Of course, the answer was yes. So we advanced them a little bit further in the technology and let them know that if they had just added IPS at the endpoint and done network scanning, so that we could look for malware attacks coming through the network as you download, then we would have been able to prevent those from
getting on to the file system, getting into memory, or getting on the box, instead of detecting them after the fact. They saw the gain immediately and asked: "How do I do this? What tools do I need to enable this IPS?" So we talked about the integration of clients, the high performance, and then the management of trying to deploy a client, deploy a patch, and deploy the technology. We basically take the problems they have and show them that by expending a little bit of effort, there is potential for huge gain.
MH:
I think clients with this type of problem first have to acknowledge it. They need to acknowledge that they have a hodge-podge of vendors, and they need to rein in their assets and have better control over them. The switch is where the problem becomes a headache for management: how do they switch out three or four different vendors rapidly and not leave themselves exposed? We have taken the philosophy that when you pick a vendor to work with, or you pick a leader in the industry, you want to make that switch and eliminate everything else out there, but you need to do it rapidly. Here at eEye we've developed something called the SRT, or Software Removal Tool, which covers a dozen different vendors' solutions and can go in and rapidly switch them out. So when consolidating or bringing that type of security to a client, the end-user is able to turn off their old firewall, and you're able to rapidly deploy the new technology, switch out vendors, and bring them into an all-in-one approach. And as my colleague mentioned earlier, a secure endpoint is a well-managed endpoint. When you're dealing with laptops and other devices that are floating out there, they're very rarely on the network. Getting them to report their attack data, making them location-aware so that they have different policies when they're on or off the network, and then having them all standardized on the same versions really helps with getting an environment secure and making sure the assets are protected the right way.
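The location-aware policy idea Morey describes, where an endpoint enforces stricter rules once it leaves the corporate network, can be sketched roughly as follows. The subnet prefix, policy names, and settings here are invented for illustration and are not taken from any vendor's product:

```python
# Illustrative sketch (not any vendor's actual implementation): an agent
# picks its enforcement policy based on whether the device is currently
# on the corporate network.

CORPORATE_SUBNET = "10.0."  # assumption: corporate LAN addresses use this prefix

# Stricter rules apply off-network, where the gateway can no longer help.
POLICIES = {
    "on_network":  {"firewall": "standard", "usb_storage": "allowed", "vpn_required": False},
    "off_network": {"firewall": "strict",   "usb_storage": "blocked", "vpn_required": True},
}

def select_policy(ip_address: str) -> dict:
    """Return the policy an agent should enforce for its current location."""
    location = "on_network" if ip_address.startswith(CORPORATE_SUBNET) else "off_network"
    return POLICIES[location]

print(select_policy("10.0.3.17"))    # corporate LAN: standard policy
print(select_policy("192.168.1.5"))  # home network: strict policy
```

In a real agent the location test would be more robust than a prefix check, but the principle of one agent carrying two policies and switching between them is the same.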
JH:
What we’re seeing is that a lot of our customers start at the gateway instead of even touching the endpoint product, because it’s clearly a more management-intense effort and we don’t recommend that. We’re seeing a lot of people starting off with a gateway-based solution, doing web filtering against phishing and affected downloads, plus spam filtering, which is also a big attack vector, and thinking that’s enough to keep them safe. Obviously this has its own risks, but we’re seeing it happening a lot. DB: It’s interesting that a lot of the discussion has been not only about security threats and vulnerabilities, but also about being more efficient at being successful. For example, we find that IT budgets are being cut and yet security and compliance budgets are on the up. This squeezes out the ability of these companies to invest in other initiatives. So by being more efficient at these activities, we find it frees up companies to invest in more “rewarded” types of risks. We seem to be talking about improvements in capabilities
and processes as a discipline for our industry, which I personally think is a good thing.
MH:
I agree with that wholeheartedly. With security in mind, whether it’s at the perimeter or at the endpoint, clients that are security-minded really have to assess their risk, and that is at the forefront of any compliance initiative or any security program in a company. Having protection at the endpoint or at the gateway will pretty much stop any attack, but if companies are not aware of their attack vectors, or how they could be penetrated, it presents a whole different problem.
Within our toolset, the vulnerability assessment component is built into our endpoint agent as well as a network scanner, so when an organization is trying to assess its security it knows where its weaknesses are and can build a strategy to protect them. So assessing your security, knowing where your penetration points are, and then properly building your defense-in-depth layers around them is how I look at the security model for any organization. We make sure that the security team can say: "We're seeing an attack, this is what they're going after, but we're not worried about it, because we're patched." And it doesn't matter how many times
they beat on the door with the same threat, it’s not going to be penetrated because those patches or configurations are hardened to the right specifications.
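The assess-then-protect model Morey outlines, knowing where the penetration points are and directing patching effort accordingly, can be illustrated with a simple risk-ranking sketch. The scoring rule, field names, and figures are hypothetical, not drawn from eEye's tools:

```python
# A minimal sketch of the assess-then-prioritize idea: rank known
# weaknesses by severity and exposure so patching effort goes where the
# residual risk actually is. All data here is invented for illustration.

findings = [
    {"host": "web01",  "vuln": "MS08-067",      "severity": 10.0, "internet_facing": True,  "patched": False},
    {"host": "file02", "vuln": "weak SMB conf", "severity": 6.5,  "internet_facing": False, "patched": False},
    {"host": "db01",   "vuln": "MS08-067",      "severity": 10.0, "internet_facing": False, "patched": True},
]

def risk_score(f: dict) -> float:
    """Severity weighted by exposure; a patched finding carries no residual risk."""
    if f["patched"]:
        return 0.0
    return f["severity"] * (2.0 if f["internet_facing"] else 1.0)

# Highest residual risk first: this ordering is the patch queue.
queue = sorted(findings, key=risk_score, reverse=True)
for f in queue:
    print(f["host"], f["vuln"], risk_score(f))
```

The patched database server drops to the bottom of the queue even though its underlying flaw is the most severe, which is exactly the "we're not worried, because we're patched" posture described above.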
JW:
Symantec is in alignment with that. As you were saying, Morey, we're seeing that there is going to be more budget for security, but ultimately the security team's job is not just to put more security down at the endpoint or at the perimeter, whether on the network or off it. Ultimately, they need to keep the business running, and with a well-managed endpoint and well-managed security technologies, we see security really starting to blend in with operations. They're asking: "What are the day-to-day activities we can do to be secure, to make us more efficient, and to become more high-performance, but that won't impact the business?" Here we see the ultimate goal: keep the business running, and get security working in the background.
JH:
One thing to consider is outsourcing security to a professional. Every IT department is at its resource limit right now, and I think a professional partner often has an economies-of-scale advantage over any in-house IT department managing security. The end-user, or the business trying to protect its infrastructure, gains the advantage of having professionals working on its IT security that it potentially can't, or doesn't want to, afford in-house. At Astaro, we sell our products exclusively through certified partners, a worldwide network of 2,500 partners, and they're absolute specialists at what they do. Before launching another big project, I'd recommend that companies at least reach out to some of these security consultants and find out how they can be of help.
Derek Brink - Moderator
Vice President & Research Director, IT Security,
Aberdeen Group
Derek joined Aberdeen Group in 2007 as a senior high-tech executive experienced in strategy development and execution, corporate/business development, and product management/product marketing. He brings a unique blend of analytical/technical backgrounds combined with excellent communication skills, and extensive information security industry expertise. Prior to Aberdeen, Derek’s industry experience included positions with RSA Security (now part of EMC), Gradient Technologies (now Entegrity), Transarc (a subsidiary of IBM), Sun Microsystems (now Oracle), and Hewlett-Packard. He began his professional career as an analyst for the Central Intelligence Agency. Derek earned an MBA with honors from Harvard Business School and a BS in Applied Mathematics with highest honors from the Rochester Institute of Technology.
Jan Hichert
Chief Executive Officer, Astaro AG
Jan Hichert, Chief Executive Officer, has led Astaro since the company was founded in early 2000. Before Astaro, Jan consulted with software vendors and internet service providers on IT security and open source business models. Jan studied business at the University of St. Gallen and computer science at the Technical University of Karlsruhe, before becoming an entrepreneur.
Morey J. Haber
Vice President, Global Business Development,
eEye Digital Security
With more than 15 years of IT industry experience in network management and security, Morey joined eEye in 2004 and is today responsible for working with Fortune 500 customers and translating their real world security problems into reliable and easy-to-use solutions. Prior to eEye, he held senior positions for CA Inc., where he was responsible for CA’s SWAT team and management of new product beta cycles, including portions of their flagship product line, Unicenter. Morey began his career as a Reliability and Maintainability Engineer for a government contractor building flight and training simulators. He earned a Bachelor of Science in Electrical Engineering from the State University of New York at Stony Brook.
Jim Waggoner
Director of Product Management,
Symantec Corporation
As Director of Product Management, Jim spends his time meeting with customers from around the world to better understand the problems they are trying to solve as he drives the roadmap for Symantec Endpoint Protection. Jim has been with Symantec for the last 12 years in various technical and product management positions focused on products that protect against malware. Jim holds a Bachelor of Arts degree in Psychology from California State Polytechnic University, Pomona.
ANALYST FEATURE ■ APPLICATION SECURITY
Don’t lose control…
When it comes to security and risk, how are your applications controlled and are they proving to be lethal or lifesaving? ROGER SOUTHGATE (PAST PRESIDENT OF ISACA LONDON) has the solution for application controls, and how to make the best out of a bad situation.
THE MOBILE RANG, AND THEN CAME THE QUESTIONS ALL CIOs DREAD: "Have you seen the news today? Could this happen to us?"

And so began Sam's day. He'd only been there for two years and it had been full-on since he arrived. The door opened.

"Sam, are you ready for us to start?" said Steve.

"Yes, come in and sit down, Steve. I've been having some premonitions about security, and particularly application security. You've been here a lot longer than I have. What's your gut feeling?"

"Well, Sam, if I am really honest I think we have been very lucky so far. You may not be aware that five years ago we took over two of our competitors primarily for their systems, much like Lloyds Bank did a few years ago when they took over TSB..."

While this dialogue is completely fictitious, something about it nevertheless has a ring of truth. There are a number of situations that should trigger a review of application controls. Current examples such as virtualization and computing in the "cloud" may seem obvious candidates. SharePoint and similar applications may also benefit from closer scrutiny. Perhaps not so obvious are the application controls within applications acquired during a takeover or merger. Before we can begin any review of controls we should always revisit the risks organizations face, to remind ourselves, perform a reality check, and confirm completeness and continuing relevance. Consideration should also be given to the organization's risk appetite, stakeholder expectations, and emerging compliance and regulatory issues.

UNDERSTANDING THE RISKS
Enterprise Risk Management is something of an oxymoron in many organizations, so perhaps a quick reality check here is also in order. Starting with a blank sheet of paper can be quite intimidating, but don't worry: help is at hand. I am sure you will have heard of the World Economic Forum.
These are the people who are very much in the news towards the end of January each year, when they hold their annual meeting in Davos, Switzerland. Outside of the headlines, they are a rich source of reference material, including on risk management. Each year they publish a Global Risk Report (see box), and I recommend you take a look at the reports for 2008 and 2009. The reports are concise and concentrate on important risks we should all be considering. Walk through the scenarios and reflect upon how they could impact your organization, then try to repeat the process assuming there are no information services.
This will give you an insight into how dependent we have all become on volumes of real-time, up-to-date information. If this is not enough for you, they also publish an annual Global Information Technology Report (see box). This has been running for six years, and this year is the first that you can download the whole report. It may be particularly helpful to those of you with overseas operations and business partners. At the operational risk level, a good starting point is annex 7 of the Bank for International Settlements (BIS) Basel II document (see box), which contains a "loss event type" table: a good place to start a reality check for a specific application, or for a subset representing the most important group of information services.

ARE YOU IN CONTROL?
Once we understand the risks we can move on to consider the controls. The grave danger when we focus on a subject is that we lose sight of the context, and nowhere does this seem to happen more often than in IT, particularly in relation to controls. "Controls" can be, and often are, a very emotive subject. We all like to feel we are in control, and none of us likes to feel we are being controlled. When someone broaches the subject of additional controls, we all too often see it as a challenge to our trustworthiness. The most important thing to remember is that we cannot "control" anything. This may sound extreme, but it is the reality. The choices we make may influence or contain the immediate outcomes and long-term consequences. A simple rule of thumb to remember is: thought + action = immediate outcome + long-term consequences. Unfortunately, lack of thought plus inaction can also lead to immediate outcomes and longer-term consequences. I wonder how many times you have heard someone say, "Well, I thought...", when it is painfully obvious to everyone that this is the last thing they did.
This is a good time to remind ourselves that the only reason we have controls is to mitigate the potential impact from risks to an acceptable level. This means that for controls to be "effective", their cost of operation must be less than the potential loss, and, perhaps more importantly today, the potential impact of application controls on the "customer experience" should be given careful consideration. Application controls are one of the three broad classifications of controls available to organizations, the others being Business and
General IT controls. The term "application controls" is used to describe those controls built into the application to control the processing of transactions and data. The diagram (next page) illustrates how they should all work together to provide protection in layers to the organization and its stakeholders. Each organization needs to develop its own "cocktail" of controls, and the table format (next page) can be used to see and understand how the layers of control should work together to provide the protection that's needed, and avoid as far as possible any single point of failure. We are now in a position to put the application controls under the spotlight. This can be problematic, particularly with third-party applications.

COBIT 4.1 WORKS
COBIT 4.1 is on hand to help us identify our current requirements, which COBIT describes as "business requirements for information". Seven distinct but overlapping criteria have been identified and found to work in practice. These are:

Effectiveness: deals with information being relevant and pertinent to the business process as well as being delivered in a timely, correct, consistent, and usable manner
Efficiency: concerns the provision of information through the optimal (most productive and economical) use of resources
Confidentiality: concerns the protection of sensitive information from unauthorised disclosure
Integrity: relates to the accuracy and completeness of information as well as to its validity in accordance with business values and expectations
Availability: relates to information being available when required by the business process now and in the future. It also concerns the safeguarding of necessary resources and associated capabilities
Compliance: deals with complying with the laws, regulations, and contractual arrangements to which the business process is subject, i.e.
externally imposed business criteria as well as internal policies
Reliability: relates to the provision of appropriate information for management to operate the entity and exercise its fiduciary and governance responsibilities.

The requirements should be revisited with the owner of the application/service. (It should be remembered that the business is responsible for identifying application control requirements, with IT ensuring third-party vendors satisfy the
JULY 2009
25
Useful links

World Economic Forum Global Risk Report: www.weforum.org/en/initiatives/globalrisk/index.htm
World Economic Forum Global Information Technology Report: www.weforum.org/pdf/gitr/2009/gitr09fullreport.pdf
BIS Basel II annex 7 Detailed Loss Event Type Classification: www.bis.org/publ/bcbs107.htm
ISACA/ITGI COBIT 4.1 is free to download from www.isaca.org/cobit. More information can be found in both the COBIT Control Practices and the COBIT Assurance Guide.
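The rule that a control is only "effective" if its cost of operation is less than the potential loss can be made concrete with a rough annualized loss expectancy (ALE) comparison, a standard operational-risk calculation (ALE = single loss expectancy × annual rate of occurrence). All the figures below are invented for illustration:

```python
# A rough annualized loss expectancy (ALE) sketch of the rule that a
# control is only worth having if it mitigates more expected loss per
# year than it costs to operate. All figures are invented examples.

def ale(single_loss_expectancy: float, annual_rate_of_occurrence: float) -> float:
    """ALE = SLE x ARO: the expected loss per year from one risk."""
    return single_loss_expectancy * annual_rate_of_occurrence

ale_before = ale(250_000, 0.4)   # e.g. a breach costing $250k, expected every 2.5 years
ale_after  = ale(250_000, 0.05)  # residual likelihood once the control is in place
control_cost = 30_000            # annual cost of operating the control

risk_reduction = ale_before - ale_after
print(f"risk reduction: ${risk_reduction:,.0f}/yr vs control cost: ${control_cost:,.0f}/yr")
print("control is cost-effective" if control_cost < risk_reduction
      else "control costs more than it saves")
```

With these example numbers the control removes $87,500 of expected annual loss for $30,000 of annual cost, so it passes the test; a control that failed it would be a candidate for the "customer experience" trade-off discussed earlier.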
[Diagram: generic process controls applied across people, processes, data, technology, infrastructure, and facilities.]
requirements in the best way for the organization.) We can then turn to the application controls in COBIT 4.1 and consider whether the existing controls are sufficient to meet our current and projected needs. COBIT consolidates application controls into six broad groups:

AC1 Source Data Preparation and Authorisation: concerned with origination, error detection, irregularities, segregation, and approval
AC2 Source Data Collection and Entry: addresses timely input and correction by authorised and qualified individuals
AC3 Accuracy, Completeness and Authenticity Checks: can these be performed close to transaction/data origination?
AC4 Processing Integrity and Validity: error detection
AC5 Output Review, Reconciliation and Error Handling: is the output still relevant, sufficient, and appropriate to the needs of the organization?
AC6 Transaction Authentication and Integrity: maintaining authenticity and integrity during transmission or transport.
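The flavour of groups AC3 and AC4, checking accuracy, completeness, and validity as close to transaction origination as possible, can be sketched in a few lines of code. The required fields, limits, and currency list are hypothetical examples, not part of COBIT itself:

```python
# Illustrative sketch of COBIT-style application controls AC3/AC4:
# completeness, accuracy, and validity checks applied at the point of
# transaction entry. Fields and rules are invented for illustration.

REQUIRED_FIELDS = {"txn_id", "account", "amount", "currency"}
VALID_CURRENCIES = {"USD", "EUR", "GBP"}

def validate_transaction(txn: dict) -> list:
    """Return a list of control violations; an empty list means the
    transaction passes input validation."""
    errors = []
    missing = REQUIRED_FIELDS - txn.keys()
    if missing:                                               # completeness (AC3)
        errors.append(f"missing fields: {sorted(missing)}")
    amount = txn.get("amount")
    if amount is not None and not (0 < amount <= 1_000_000):  # accuracy/validity (AC4)
        errors.append("amount out of permitted range")
    if txn.get("currency") not in VALID_CURRENCIES:           # validity (AC4)
        errors.append("unknown currency")
    return errors

print(validate_transaction(
    {"txn_id": "T1", "account": "A9", "amount": 120.0, "currency": "USD"}))  # []
print(validate_transaction({"txn_id": "T2", "amount": -5.0, "currency": "XXX"}))
```

Rejecting the bad transaction at entry, rather than discovering it during output reconciliation (AC5), is exactly the "close to origination" principle the framework recommends.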
WHAT YOU NEED TO DO
Where we are concerned that there may be inadequate application controls, we want to consider
[Diagram: layers of protection. Preventative Business controls, detective Application controls, and contain/correct General IT controls (systems development, change management, security, computer operations) work together across physical, logical, and administrative layers to protect business processes and business applications.]
locating compensating controls as close as we can to the risk. We need to decide what preventative, detective, and corrective measures are most suitable, and whether these need to sit in the business. From the IT perspective, all 34 COBIT processes are mapped to the business requirements for information, thus providing focused guidance on which IT general controls can help address shortfalls in application controls. While performing this exercise you will find it useful to document your assumptions, to make them explicit and test their reasonableness. They can then be used to help identify those changes, either within the organization or in the wider environment, that would cause you to reconsider or revise the current controls.

Roger Southgate
Immediate Past President, ISACA London Chapter

During his five years on the ISACA board, Roger was responsible for IT Governance and Standards. He has been active in both the promotion and ongoing development of COBIT since 2002. Throughout his career, Roger has designed and implemented accounting and management information systems across a range of industries, including television, manufacturing, and investment banking. He discovered COBIT during his 18-year career with a Japanese investment bank in London, where for over 15 years he was CIO. Roger is now an independent consultant. Recent consulting assignments include IT Governance reviews, process maturity assessments, IT Governance training/mentoring, policy reviews, Sarbanes-Oxley preparation quality reviews, and risk and security discovery workshops. He is a regular speaker on IT Governance, IT Risk, COBIT, Security, and related compliance issues.
Ask the expert ■ Information-centric security
No breaches—guaranteed

AK: FIRSTLY MANU, LET'S TALK A BIT ABOUT THE INDUSTRY ITSELF. WE'RE SEEING A LOT OF CHANGE IN THE SECURITY MARKET: REGULATIONS, BREACHES, AND LOWER BUDGETS. SO, FROM WHERE YOU SIT, WHAT DO YOU SEE?

MN: You're right, there's a lot of change in the industry. State regulations are becoming more stringent. For example, recent laws in both Nevada and Massachusetts now require encryption of sensitive data about citizens, both at rest on mobile devices and when transmitted. This applies to small and large organizations alike and will require a lot of work to be effective. Breaches are also increasing: since 2005, over 360 million records have been breached. If you look at an average of approximately $200 per record as remediation for those breaches, that's around $72 billion in total, or roughly $18 billion annually since 2005. Organizations are also looking at their security investments and trying to figure out what has worked and what hasn't, so in this economy there's a trend towards looking at ROI and risk as a better way to decide how organizations should spend their dollars.

AK: YOU'RE ABSOLUTELY RIGHT MANU. IN THE CURRENT MARKET WE'RE SEEING A HUGE FOCUS ON ROI, AND IT'S INTERESTING THAT YOU MENTION RISK-BASED APPROACHES. CAN YOU TALK ABOUT THIS A BIT MORE?

MN: Absolutely, I think risk is becoming a key indicator of how people are spending money. Attrition.org tracks breaches all over the world, and they have a
downloadable database of breaches. We downloaded it and did some analysis. One interesting statistic was that 31% of all breaches happened because of a lost or stolen laptop, but this accounted for only 9% of all data lost. So even if you're doing full-disk encryption across your organization, you're only reducing your risk by 9%. Organizations should be aware of how much risk their investments are actually reducing. Looking at this, organizations are figuring out that data is the riskiest element, not devices, and that they can't effectively protect every device and network. They're now looking at reducing risk holistically, rather than only from a device-based perspective. AK: WELL MANU, THAT'S CERTAINLY VERY INTERESTING AND IT SOUNDS LIKE THERE ARE A LOT OF MISCONCEPTIONS THAT SECURITY PERSONNEL HAVE TO DEAL WITH OUT THERE. ARE THERE ANY OTHERS THAT YOU THINK ARE IMPORTANT OR RELEVANT? MN: Absolutely. One of the interesting misconceptions I see is that people believe a breach won't happen to them. According to Ponemon research, 80% of organizations had a breach in the last 12 months. It might be a minor breach, but the probability of having a breach over the course of a year is very high. Another myth is that perimeter security provides effective protection. This castle-and-moat approach doesn't work in the digital, collaborative environment we have today. Perhaps the most interesting myth of all is that security professionals believe their job is to prevent data from moving around. In fact, it should be about enabling the secure sharing of data. If security is restricting the sharing of data, it's working against the business workflow and you will not be as effective. AK: IT MUST BE HARD TO RECONCILE THE IMPORTANCE OF NOT RESTRICTING THE SHARING OF DATA, WHEN IN REALITY MOST IT SECURITY PROFESSIONALS MIGHT NOT BE THINKING THAT WAY. SO HOW DOES SECURITY ENABLE SUCH A WORLD OF SHARING AND SECURE COLLABORATION? MN: Well, security has traditionally been based on a castle-and-moat approach, and today's security products are not really optimized for sharing and collaboration. For example, you protect your laptop, or protect your network, but this does not enable secure collaboration with your partners. The only way you can share data safely is if you can secure the data itself. And that's the better approach we call information-centric security—protect the data itself wherever it goes. It doesn't matter if the network or device is unprotected, as long as the data itself always is. AK: WELL THAT CERTAINLY MAKES LOGICAL SENSE—YOU'RE TALKING ABOUT PROTECTING THE CRITICAL ASSET OF THE ORGANIZATION, WHICH IS THE INFORMATION. TELL
ME, WHY IS THIS A BETTER APPROACH? MN: There are multiple reasons why this is the better approach. Even if data leaks outside, nobody would be able to access it, because the data always remains encrypted and carries strong access control policies embedded in the data itself. Information-centric protection is location-independent. Rather than cobbling together separate products for encrypting each of your devices, you are protecting the data itself, so you don't have to worry about the complexities of managing and maintaining multiple separate data protection solutions. And that means much lower cost. But the most important part of this information-centric approach is that the CISO can get buy-in from the business unit. If you start talking about protecting data, as opposed to the infrastructure, the business will want to have ownership and be part of the discussion. AK: I'M INTRIGUED BY WHAT YOU'VE MENTIONED ABOUT GETTING BUY-IN FROM THE BUSINESS UNIT. MOST CISOS HAVE A TOUGH TIME GETTING ATTENTION FROM THE BUSINESS. CAN YOU EXPLAIN IN MORE DETAIL HOW THIS APPROACH WILL HELP THEM? MN: Well, interestingly enough, I came across some research recently which concluded that 73% of business users thought that security is something IT should deal with—therein lies the problem. The starting point for a better discussion with your business unit is asking who should have access to a particular piece of data, how
it should be shared, and how to make sure that workflow across the business units is effective and secure. You're trying to align with the business and the way they collaborate, not talk about IT infrastructure. I think you'll find that this information-centric approach allows you to have a more engaging conversation with the business unit. AK: IT SOUNDS LIKE THERE'S A STRUGGLE BETWEEN THE BUSINESS AND IT SIDES OF AN ORGANIZATION. LET'S NOW DIG INTO THE TECHNOLOGY AND IMPLEMENTATION DETAILS. MN: The absolute foundation for information-centric security is to ensure that protection policies always travel with the data. This requires having "smart data." Smart data knows who should access it, how long it is valid for, and what its sensitivity level is. Other things that are really important are user transparency, and device and application independence. It has to be easy to use and should work wherever the data rests and whatever networks it passes through. Another critical component is transparent key management. IT professionals should not be burdened with complex key management issues. AK: SO THERE ARE SIGNIFICANT TRANSPARENCY AND MANAGEMENT ISSUES HERE AS WELL. CAN YOU TELL ME MORE ABOUT BITARMOR'S INFORMATION-CENTRIC APPROACH? MN: I'd love to. Our CEO has a PhD from Princeton in cryptography, and our product is based on his research. The core of BitArmor's technology is the Smart Tag™. A Smart Tag™ contains encryption, access control, data expiration, and data tracking policies. Those policies are embedded in the data and travel with it. It doesn't matter if the data goes onto a USB device, to a backup tape, via email, or sits on a laptop—the Smart Tag™ always protects it by enforcing those policies.
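The "smart data" idea described here, policies embedded in and traveling with the data itself, can be illustrated with a minimal sketch. This is not BitArmor's actual Smart Tag format: the container layout, field names, and toy keystream cipher below are all hypothetical stand-ins, and a real implementation would use an authenticated cipher such as AES-GCM.

```python
# Illustrative "smart data" container: an access policy that travels with
# the encrypted payload. NOT real encryption -- the XOR keystream is a toy
# stand-in so the sketch stays dependency-free.
import hashlib
import time

def _keystream_xor(key: bytes, data: bytes) -> bytes:
    # Toy stand-in for a real cipher; symmetric, so it both encrypts and decrypts.
    out = bytearray()
    counter = 0
    while len(out) < len(data):
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(b ^ k for b, k in zip(data, out))

def protect(payload: bytes, key: bytes, readers: list[str], ttl_days: int) -> dict:
    """Wrap the payload with an embedded policy that travels with it."""
    return {
        "policy": {
            "readers": readers,                         # who may open it
            "expires": time.time() + ttl_days * 86400,  # data expiration
            "sensitivity": "confidential",
        },
        "ciphertext": _keystream_xor(key, payload).hex(),
    }

def open_protected(blob: dict, key: bytes, user: str) -> bytes:
    """Enforce the embedded policy before decrypting -- wherever the blob is."""
    policy = blob["policy"]
    if user not in policy["readers"]:
        raise PermissionError(f"{user} is not an authorized reader")
    if time.time() > policy["expires"]:
        raise PermissionError("data has expired")
    return _keystream_xor(key, bytes.fromhex(blob["ciphertext"]))

blob = protect(b"Q3 pricing sheet", key=b"secret", readers=["alice"], ttl_days=30)
assert open_protected(blob, b"secret", "alice") == b"Q3 pricing sheet"
```

Because the policy is part of the blob rather than part of a file share or device, it is enforced identically whether the blob sits on a laptop, a USB stick, or an email attachment.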
In today’s world, for example, if someone copies a document from a restricted HR file share and brings it down locally to their PC, the protection that the HR share provided no longer exists. I heard one story about a retail organization preparing for Black Friday (the day after Thanksgiving), which is the busiest day of the year. The pricing for that day is very sensitive IP, and apparently this data was accidentally emailed out. They had to shut down all Exchange Servers to recover the document, but they weren’t sure how successful it was. BitArmor’s Smart Tag™ approach would have ensured that this sensitive document was persistently protected, and only those with appropriate access could see it even if it was emailed to the entire world.
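The earlier point about weighting breach vectors by records exposed rather than incident count (costed at roughly $200 per record, the figure cited above) can be sketched as a quick back-of-the-envelope analysis. The categories and figures below are invented placeholders, not Attrition.org data:

```python
# Hypothetical breach dataset: (incidents, records_lost) per vector.
# The point: share of incidents is a poor proxy for share of risk.
breaches = {
    "lost_or_stolen_laptop": (310, 9_000_000),
    "hacked_server":         (220, 61_000_000),
    "insider_theft":         (150, 18_000_000),
    "lost_backup_media":     (120, 12_000_000),
}

COST_PER_RECORD = 200  # approximate remediation cost cited in the interview

total_incidents = sum(i for i, _ in breaches.values())
total_records = sum(r for _, r in breaches.values())

for vector, (incidents, records) in breaches.items():
    pct_incidents = 100 * incidents / total_incidents
    pct_records = 100 * records / total_records
    exposure = records * COST_PER_RECORD
    print(f"{vector:24s} {pct_incidents:5.1f}% of incidents  "
          f"{pct_records:5.1f}% of records  ${exposure:,} exposure")
```

With these placeholder numbers, laptops dominate the incident count but account for a small fraction of records and dollar exposure, which is exactly the argument for budgeting by risk reduced rather than by incident frequency.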
AK: BITARMOR'S SMART TAG™ CERTAINLY SOUNDS LIKE A POWERFUL APPROACH. PERHAPS YOU COULD PROVIDE SOME APPLICATIONS OF INFORMATION-CENTRIC SECURITY? MN: We see lots of applications, especially around data breach notification and compliance for SOX, HIPAA, etc. Organizations today are buying separate, expensive encryption products for each device and then cobbling them together. When data moves from device to device, it creates an encryption/decryption cycle that has a performance impact—and a situation that is easy to attack. With BitArmor, once you protect that document, it remains protected. So a lot of people look at this approach and realize that they can achieve broader protection, and it's a lot less costly and complex. We are so confident about our approach that we came out in January this year with the industry's only No-Breach Guarantee™. We are guaranteeing that data protected with a Smart Tag™ cannot be breached and therefore will remain protected across the board. Now, this is a very powerful statement to make, but we can back it up with our technology, and that's why we're able to make such a big claim.
AK: THE NO-BREACH GUARANTEE—THAT'S A VERY STRONG MESSAGE AND IT CERTAINLY SOUNDS LIKE A ONE-STOP SECURITY SOLUTION. DLP AND VIRTUALIZATION ARE HOT TOPICS IN THE IT SPACE AT THE MOMENT—CAN YOU ELABORATE A BIT MORE ON HOW THIS INFORMATION-CENTRIC APPROACH HELPS? MN: Absolutely. DLP solutions are incredibly effective at discovering sensitive data. However, after discovering this information, CISOs have to manually intervene to protect the document to comply with regulations. BitArmor's approach is complementary to DLP. It's easy to apply a Smart Tag™ to a document with appropriate security policies as soon as it is discovered. That policy remains with the data wherever it travels, and thus completes the cycle of protection that DLP solutions promised all along. The information-centric approach is perfect for that. Now, in a virtual environment, the only thing not virtual is the data, which is the true asset that needs to always be protected. Data has a longer lifetime and is the only element that can travel between virtual and non-virtual worlds.
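The DLP hand-off described here (discover sensitive data, then immediately attach a protection policy that travels with it) can be sketched as follows. The patterns and the policy structure are illustrative stand-ins, not a real DLP engine or BitArmor's API:

```python
# Sketch of the DLP-plus-tagging workflow: pattern-based discovery of
# sensitive content, followed by attaching a protection policy to the hit.
import re

SENSITIVE_PATTERNS = {
    "payment_card": re.compile(r"\b(?:\d[ -]?){15,16}\b"),  # 15-16 digit PANs
    "us_ssn":       re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
}

def classify(text: str) -> list[str]:
    """DLP-style discovery: return which sensitive categories appear."""
    return [name for name, pat in SENSITIVE_PATTERNS.items() if pat.search(text)]

def apply_policy(document: str) -> dict:
    """Complete the cycle: tag discovered data with a policy that travels with it."""
    categories = classify(document)
    return {
        "document": document,
        "policy": {"encrypt": bool(categories), "categories": categories},
    }

tagged = apply_policy("Customer SSN 123-45-6789 on file.")
assert tagged["policy"]["encrypt"] and "us_ssn" in tagged["policy"]["categories"]
```

The design point is that discovery alone leaves a window of exposure; pairing each discovery with an automatic, data-borne policy closes it.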
Manu Namboodiri
VICE PRESIDENT OF MARKETING, BitArmor
Manu Namboodiri is Vice President of Marketing and responsible for strategy and marketing at BitArmor. Most recently, he led product management for Windows Vista deployment technologies at Microsoft. With extensive experience in the technology industry, Manu has worked for Morgan Stanley and Sun Microsystems in corporate development and product management leadership positions. He has a master's degree in computer science from the University of Texas at Austin, and an MBA from the Kellogg School of Management at Northwestern University.

We think the information-centric approach is more effective—rather than spending money replicating all the security investments you've made in the non-virtual world and transferring them into the virtual world. AK: SO, AT THE END OF THE DAY IT'S ALL ABOUT PROTECTING THE INFORMATION. TO WRAP UP MANU, TELL ME: HOW DOES BITARMOR SEE THE FUTURE OF SECURITY, PARTICULARLY IN RELATION TO THE HANDLING OF INFORMATION? MN: In 1984, "futurist" Alvin Toffler made an interesting observation: it's the information flows that matter, not the stocks of information. What he was hinting at is that the value and lifeblood of any economy lies in information flowing. The future of security is therefore definitely going to be information-centric; there's no doubt about that. The world is moving towards increased collaboration, and you have to be able to protect data at its source. My closing thought is that information-centric security is what is going to enable you as a CISO to align with the business, and provide a more cost-effective way to protect data. This in turn prepares you for future trends and helps solve the problems of tomorrow.
There’s a better way to protect your data.
BitArmor software with Smart Tag™ technology protects data wherever it goes. Not just on laptops, USB drives, or file shares. Not only while it’s traveling over networks. Wherever it goes. All the time. Smart Tags travel with data as it moves from device to device for consistent protection that’s simple, cost-effective, and easy to use. • Persistent file encryption with Smart Tag technology for USBs, e-mail attachments, and file shares • Full disk encryption for laptops • An integrated product with a single installation
For a 14-day free trial, go to www.bitarmor.com.
The BitArmor No-Breach Guarantee™: We’ll protect your data or your money back.
Analyst feature n future security
Predicting the future is always an inexact science, and how threats to information security will change over time is no different. But by combining practical knowledge, analytical skills, and a good gut feeling, we can provide insight into the challenges information security professionals will face in 2011 and beyond. The ISF 2011 Threat Horizon report harnesses the combined knowledge of its member organizations—some 300 of the leading companies and public sector bodies from around the world, including many of the Fortune 100 corporations. The research was carried out within a 'PLEST' framework that takes into account political, legal, economic, socio-cultural, and technology factors. Some of the key findings are examined below, but it is expected that most of the threats in 2011 will still be familiar ones, evolving in significant ways as new and sophisticated attacks complement tried and tested techniques. It is also clear that the financial crisis has accelerated the change and sophistication of many of these new threats to information security. Right now, both internal and external threats are heightened by increased staff turnover and dissatisfaction, while emerging long-term threats already pose a real challenge and carry serious legal, financial, and reputational consequences.

ORGANIZED CRIME
One of the most unanimous predictions is the increased involvement of organized criminal groups that see online crime as a lucrative, low-risk alternative to other criminal activities such as drug-running and people smuggling. And given the difficulty of protecting the large volumes of sensitive information organizations hold electronically, businesses are also under increasing threat from targeted espionage, resulting in loss of competitive advantage or theft of intellectual property.
There will be a shift from indiscriminate events to highly targeted and planned attacks by organized crime groups that are developing more sophisticated “business” models for extorting the e-economy and laundering money. Plus, a combination of social engineering and technical attacks will be used increasingly to steal identities and information in order to commit fraud. This will be especially concerning for businesses that trade over the internet and rely on a loyal customer base that is confident in making online purchases. Should consumers lose confidence in online transactions, it will be detrimental to many retailers and organizations in non-financial sectors.
The threat horizon
As new technologies evolve and change, so too does online crime in all its forms, creating a very real challenge for business management and security. JASON CREASEY (INFORMATION SECURITY FORUM) looks at the threats facing information security professionals over the next two years.
ACCESS ALL AREAS
Mobile internet, which will become increasingly popular, will also come under attack as the abundance of malware aimed specifically at mobile devices grows. It will be another area of common concern due to the rapid proliferation of mobile devices used to access corporate networks and resources. Without the virus protection and other security controls found on traditional networks and PCs, the growing trend of mobile and remote working will inevitably attract new forms of malware designed to create fraudulent payments or commit identity theft. Companies will also face new challenges in adequately managing and securing their corporate mobile devices to prevent employees from leaking sensitive information, whether voluntarily or involuntarily. Concurrent with the rise of mobile internet is the increasing use of Web 2.0. Companies will face greater exposure due to the rise of social networking sites such as Facebook, Bebo, and Twitter, which have already become a popular part of not just home but also office culture. In addition to providing another channel for the accidental leakage of corporate information, cyber criminals will create new methods of attack that target the vulnerabilities of these social networking sites. Virtual worlds such as Second Life may also present new risks if brand damage in the virtual world translates into the real world. The mobile internet, Web 2.0, and virtual worlds are popular with a younger, more technologically aware workforce that is driving a new techno-generation corporate culture. While organizations will have to adapt to and embrace these changes if they want to attract the right skills, these new employees will need to be made aware of information risks and the need for tighter controls that may restrict their online freedom. Companies will need to give serious consideration to how they address this issue in terms of controls, employee awareness programs, and disciplinary procedures.
NEW TECHNOLOGIES… NEW CHALLENGES Cloud computing is another new technology that offers many benefits but at the same time presents a multitude of new threats that need to be addressed. For example, many of the technologies and services associated with cloud computing have been around for years and are now being implemented in new ways to provide dynamic, scalable, and virtualized computing infrastructures, platforms, and applications. However, organizations will need to be able
Jason Creasey
HEAD OF RESEARCH Information Security Forum (ISF)
Jason has worked with the ISF since 1994 and plays a key role in the development of ISF projects particularly in the production of best practice guides, helping to shape international security management standards. He is responsible for completing a wide range of the ISF’s research and development projects on topics including information classification, wireless LANs, insider threats, third party access and web servers. He has been the principal architect behind the ISF’s highly successful suite of risk management products, including tools for risk analysis and benchmarking, and is the primary author of the ISF’s Standard of Good Practice. Jason is a qualified computer auditor, and prior to joining the ISF was IT audit manager at a major manufacturing company. He also worked in the banking industry for over 12 years for several international banks.
to make informed business decisions that give strong consideration to security processes, to minimize exposing critical or sensitive information and systems to increased or new risks. Other threats highlighted in the ISF 2011 Threat Horizon include the weakening of infrastructures and the potential impact of prolonged interruptions to business services, from power cuts to internet failures.

Aside from technological threats, tougher legislation and compliance will also create more burdens and possible confusion across geographic regions, while the erosion of the traditional network boundary, and increased outsourcing or off-shoring of operations, poses greater risks to data protection and privacy. Information security professionals will also face challenges within their own organizations. Information security can only be solved through a combination of technology, business processes, education, and training—with greater involvement of information security professionals in business strategy and planning. This will involve adopting a more holistic approach that takes in other functions such as operational risk, physical security, business continuity, loss prevention or fraud units, and financial risk.

LOOKING TO THE FUTURE
One of the risks of making predictions is that something happens that nobody expects—such as the recent banking crisis and the consequential turbulence in global markets. Economic downturn will have a direct impact on threats to businesses. For example, as employees feel more vulnerable, their loyalty to an organization can be compromised, leading to a higher likelihood of malicious action originating from the inside. And more generally, budget issues may take precedence at a time when information security is more important than ever. In the future, security can no longer be an add-on but will need to be ingrained within the IT infrastructure, business processes, and strategic planning from the outset. Working more closely together and adopting new methods to analyze and tackle the changing dynamics of security risk will help us through the tough times of the downturn and emerge stronger in the upturn. Prediction is never an exact science, but by thinking through the possible challenges, information security professionals will be better prepared to face the future with confidence.
In the Hotseat n data security
Security first

The increase in security breaches has left companies scrambling to protect themselves and their most precious asset. TODD TUCKER (NETIQ) tells us why so many organizations have failed to secure their data, and what they can do about it.
AK: Todd, before we jump into specifics, let's talk about what NetIQ actually does. You have a range of solutions from systems management to IT process automation, and there is a lot of change in the security market at the moment, so what is NetIQ focusing on? TT: Ali, it's interesting that you mention two things we are actually focusing on, which are security management—for us that is security information and event
management, or SIEM—and security configuration management, or security auditing. Around two years ago we introduced a new product to the market for what is sometimes called run-book automation or IT process automation. We've been very focused on how we put automation into security management, to provide those automation benefits on top of the security benefits to our customers. It's a constantly evolving challenge for us, but it's been very exciting at the same time. AK: So it sounds like you've created some
great solutions, and you've mentioned security, which is a hot topic right now. Tell me, what are some of the challenges that your customers have been tackling? TT: Well, recently we interviewed dozens of security professionals as part of market research, and asked mainly non-product-related questions—things like: "How do you prioritize what you're doing today for security?" We thought we would hear that cost pressures were their top concern—we heard quite the opposite. Their number one priority is protecting data, and the job has only gotten harder. If you look at some of the breaches that have been widely publicized, like Hannaford Bros or Heartland Payment Systems, you can see a sophistication that makes breaches much more difficult to detect and prevent. We recently read a study which showed that around 75% of the time, breaches aren't discovered by the breached organization. They are discovered by a partner such as a service provider, or in some cases by customers, which shows it's sometimes difficult to detect and
prevent, and it also shows the level of complexity we're dealing with. One other thing to point out is that compliance mandates exacerbate the situation, so compliance also remains a top concern. AK: At ETM we hear time and again that it's not so much a cost issue as it is a data protection issue. There has been a lot of discussion about compliance and security, but why does there seem to be contention between two things that should be so closely related? TT: I agree, they certainly should be pursuing the same goals, which are the protection of data and the management of risk, but they seem to be somewhat contentious. In our experience, the problem lies in the fact that in many cases there's a tremendous amount of time and effort spent on compliance aspects, and on meeting specific goals that are externally mandated. These are sometimes in contention with security. We find that customers typically take one of two approaches. One of those approaches is what I call a compliance-first approach. Basically, they find out what they have to comply with and then check that all these elements are in order. It's very much a check-box mentality. The problem with this is that the check-box approach sometimes fails to create a foundation that provides for better long-term security. The other approach is what I would advocate: a security-first approach. It's generally one that leverages best practices such as ISO 17799 and 27001, and frameworks such as COBIT or the NIST Risk Management Framework. If companies take a best-practice approach and use it to demonstrate compliance—even if there are gaps between their security practices and the compliance mandate—they are in a more defensible position, because they've used best practices first to implement security, rather than the regulatory mandate or other factors. The approach I would advocate to deal with this contention is a security-first, best-practices approach, rather than a compliance-first approach.
AK: I know that the Payment Card Industry Data Security Standard, or PCI DSS, is an important standard. Can you tell me what effect that is having on the security world? TT: Well it’s having a tremendous impact on security globally. From our global standpoint, we are seeing the biggest impact in North America, the UK, and other parts of Western Europe. Perhaps those affected areas are where companies are more dependent on payment cards for their business, and where we see a more mature regulatory set of controls than in other parts of the world where they don’t exist yet. I would say without a doubt that it’s the biggest single driver in security spending that we see today. As a result of the Card Systems breach of several years ago, PCI
DSS has received attention from everyone, from the board of directors down, which is probably the best impact of PCI DSS. I've also seen a lot of negative effects from PCI DSS, one of them being the increased focus on compliance—what I call the T's and C's. I've seen the same two approaches I mentioned earlier taken for PCI DSS. Some companies say they have a good security program in place—it's based on best practices and is entirely defensible to any auditor, for PCI DSS or any other regulatory mandate. A lot of companies that take the check-box approach have problems with, for example, log management, which is mandated by PCI DSS. What they put in place is a simple log collection capability that fails to provide for things like real-time monitoring and management of security, which I think is vital to longer-term security. AK: I was discussing data breaches yesterday in an ETM interview and an alarming figure came up—apparently 80% of companies have suffered data breaches in the past twelve months. So Todd, do you think it's hard to reconcile the amount of time and money that's been spent on security, considering there are so many data breaches? TT: Clearly, there continue to be a lot of breaches despite the efforts put forward, and I think there are several reasons why current security investments have failed in so many cases to protect data. The first one is that businesses are running very fast and very lean today, and security often becomes a low priority. Second, security is still a silent discipline—a kind of "dark art," if you will. There are very few people in organizations that have
formal security training or what I would call a security mindset. But again, the level of awareness for security continues to increase. Third, few organizations know where their data resides. Data tends to be very mobile and transient; it gets transferred to USB drives, laptops, and other mobile devices. Often data is stolen from systems that weren't even known to hold sensitive data. Fourth, I would say that most security organizations still focus on protecting the network and neglect protection of the host that stores and processes the data. We see time and again that a breach is performed by exploiting wireless access into the network or a trusted connection, and often that's due to a failure of security controls over the host and the actual data itself. Lastly, as I mentioned earlier, there's still in many cases too much emphasis on compliance and not enough on managing risk and implementing true data security. AK: Let's get down to specifics now—what can organizations do to help prevent breaches from occurring? TT: That's the million-dollar question, and obviously there are a lot of things that must be done, but I'll focus on a few things here that I think are often overlooked. The first one is that you have to know where your data is and the applications that process it. One of the CISOs I spoke to a few weeks ago said he had two security architects on his team, and each one's job was to know the specific applications that process—in their case—customer data for financial services clients. I thought that was a great approach. I would also say that you have to work from the assumption that you've already been breached at the network level. I think good security people start the day with the assumption that someone bad is inside the perimeter.
Todd Tucker
SENIOR DIRECTOR OF STRATEGIC SOLUTIONS, NetIQ
As Senior Director of Strategic Solutions for NetIQ, Todd and his team drive NetIQ's product and solution strategy. Prior to this role, Todd held senior roles in strategy, product management, and marketing for NetIQ and PentaSafe. Todd began his information security career at Ernst & Young, where he co-founded the firm's National Enterprise Security Architecture Team and led security and audit engagements for some of the firm's largest clients. Todd has earned several certifications, including the Certified Information Systems Security Professional, Certified Information Systems Auditor, and Microsoft Certified Systems Engineer, and is Foundation Certified for ITIL.

You need to make sure you reduce the number of privileged users, or system administrators, that you have on your key systems. Then you have to find a way to control them, which can be as simple as monitoring them. You also have to secure your systems, and especially your host environment, using best practices. I would ask myself: "Do we have clear standards in place? Do we have a process to allow exceptions to our standards without sacrificing security?" I would also suggest monitoring your systems for changes. It's amazing how many publicized breaches would have been detected if the company was monitoring systems for changes. The last thing I would suggest is to operationalize security as much as possible. In other words, integrate good security practices into your IT operations. That extends to automating your security processes. Then also train the IT infrastructure and operations staff: make sure you change their mindset to security-first, and then make security part of their formal responsibilities.
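The suggestion to monitor systems for changes can be illustrated with a minimal file-integrity sketch: hash a baseline of files, then diff a later snapshot against it. Real deployments would rely on dedicated FIM or SIEM tooling; this only shows the principle, and the function names are my own:

```python
# Minimal file-integrity monitoring sketch: build a SHA-256 baseline of a
# directory tree, then report files added, removed, or modified since.
import hashlib
from pathlib import Path

def snapshot(root: str) -> dict[str, str]:
    """Map each file under root to the SHA-256 hash of its contents."""
    return {
        str(p): hashlib.sha256(p.read_bytes()).hexdigest()
        for p in Path(root).rglob("*") if p.is_file()
    }

def diff(baseline: dict[str, str], current: dict[str, str]) -> dict[str, list[str]]:
    """Compare two snapshots and classify every difference."""
    return {
        "added":    sorted(set(current) - set(baseline)),
        "removed":  sorted(set(baseline) - set(current)),
        "modified": sorted(p for p in baseline.keys() & current.keys()
                           if baseline[p] != current[p]),
    }
```

Running `snapshot()` before and after a change and diffing the results flags exactly which paths were added, removed, or modified; a production tool would additionally protect the baseline itself from tampering and feed the deltas into alerting.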
AK: So we're looking at more of a holistic view when you talk about implementing security practices. When NetIQ works with its customers, what are the main things they are trying to achieve?

TT: I think the two most common needs are helping customers, first of all, protect data, and secondly helping them reduce the cost and workload of compliance and data protection. Reducing cost is definitely a big challenge, and when we probe into what customers mean by that, they'll admit it's not so much about reducing cost as improving the scalability of their capabilities without increasing headcount. Companies are being asked to do more to secure the business without adding resources, so they have to either improve team skill sets or provide better automation. The problem with skill sets is that you have a lot of skilled people spending a lot of time on menial tasks. If you free these people up, they'll be able to focus on more important, product-driven work. The other thing to remember is that losing people and sensitive IP is always detrimental to the business. The benefit of automation is that when somebody new comes in, you don't have to worry as much about knowledge transfer, because you've automated an otherwise manual requirement.

AK: Let's wrap up and talk a bit about the future. Tell me, where do you see security management going over the next five years, and what is NetIQ working towards?
TT: There are three trends that I expect to continue. One I mentioned earlier is the operationalization of security. I think there's going to be increased accountability for infrastructure teams to implement and maintain a secure environment, with security teams focusing more on architecture and on defining and enforcing policies rather than on implementation. Second, I think increased automation in security management is something a lot of companies are pursuing, and there are two approaches to it. One is automated workflows built into core security products to support automation; the other is automation outside of the security product. We believe the latter provides better opportunities for integration and for process automation that spans security and IT and the tool sets they use. Third, I think we'll see more rationalization of security tools. There has been a kind of explosion in the number of security vendors and tools, and I think many of these are redundant. In fact, I think companies will use existing tools to accomplish tasks that maybe they didn't plan to use them for. So, I think we'll see more operationalization of security, increased automation in security management, and more rationalization of security tools in the next three to five years.
My CEO doesn’t know my name. And that’s the way I plan to keep it. Effective data security is key to preventing breaches, simplifying the compliance process and reducing risk to your organization. Let us help you focus your time, money and resources on more strategic projects, reduce the workload of securing critical information, and streamline compliance reporting for mandates such as PCI DSS, HIPAA, NERC, and Sarbanes-Oxley.
Our solution provides a multi-level approach to data security and compliance:
• NetIQ® Security Manager™ – from Log Management to Complete SIEM
• NetIQ® Secure Configuration Manager™ – Compliance Assessment to Security Configuration Auditing
• NetIQ® Change Guardian™ – Privileged-User Management and File Integrity Monitoring
• NetIQ® Aegis® – the First IT Process Automation Platform for Security and Compliance
If you’d like to find out more about how NetIQ can help you with data security and critical compliance mandates, visit www.netiq.com/undertheradar or contact info@netiq.com.
© 2009 NetIQ Corporation. All rights reserved. NetIQ, the NetIQ logo, NetIQ Security Manager, NetIQ Secure Configuration Manager, NetIQ Change Guardian, and NetIQ Aegis are trademarks or registered trademarks of NetIQ Corporation or its subsidiaries in the United States and other jurisdictions. All other company and product names may be trademarks or registered trademarks of their respective companies.
To survive and compete, IT departments must change the way applications are provided to users. visionapp makes it possible.
visionapp Server Management 2008 R2 – Reduce your operational costs by up to 25%. Build and operate more stable Citrix and Microsoft environments in a fraction of the time.
visionapp Workspace Management 2008 R2 – Increase your users' productivity by up to 40%. Deliver all your applications, regardless of technology, in one place.
visionapp Remote Desktop 2009 – Get your investment back in less than 30 days. Manage remote machines quickly and easily with the world's #1 remote admin tool.
visionapp helps your organization build dynamic delivery systems. You'll be able to add applications quickly. Users will have a single place to access new applications or templates — no matter what technology is used. Administrators will be able to build servers and workstations in a fraction of the time.
Find out more: www.visionapp.com
It's time to change your approach!
ANALYST ARTICLE ■ ACCESS SECURITY
Closing the security gap Access to privileged and shared accounts has become something of a free-for-all in organizations today. STEPHANE FYMAT (PASSLOGIX) explains why it’s crucial to find a solution for managing shared credentials.
42
JULY 2009
One only has to consider the case of Jérôme Kerviel, the rogue trader at French bank Société Générale who used multiple shared passwords and accounts to execute fraudulent trades, to appreciate the risks shared account logons pose to modern organizations. Kerviel's actions cost the bank €4.9bn, and serious ramifications were felt across global financial markets.

The City of San Francisco, California, found itself in a similar situation last year when disgruntled network administrator Terry Childs reset all administrative passwords to the routers for the city's area network. He refused to hand over the passwords, which effectively gave him complete control of the network, locking out all other employees and preventing anyone else from managing the system. His actions essentially held the city to ransom.

The Terry Childs incident illustrates a problem that has largely passed under the radar of most companies, who place an enormous amount of trust in their IT staff and system administrators. There was only one administrative account on many systems at the City of San Francisco. Because Terry Childs had open access to system passwords, he was able to change them without authorization and lock out his colleagues.

What these two stories demonstrate is that failing to manage privileged users' shared passwords adequately can expose organizations to serious vulnerabilities, particularly in the case of privileged accounts, where a disgruntled employee could potentially have the power to hold an entire network hostage.

TO SHARE OR NOT TO SHARE?

Keeping track of privileged user and shared access accounts is also important for accountability. Unfortunately, however, many organizations simply don't know for sure who has access to shared passwords. Far too often, the entire IT department knows the details of what is supposed to be a limited-access password.
According to a 2008 survey of its members by the Independent Oracle Users Group, nearly 40% of organizations had no way of monitoring the abuse of data by privileged account users. Concerns about ineffective password systems and lax security that enables unauthorised users to breach enterprise networks have caused corporate regulators to take a tougher stance on password security.
As a result of high-profile incidents like those at the City of San Francisco and Société Générale, legislation and industry regulations such as PCI DSS (a multifaceted security standard that includes requirements for security management, policies, and procedures) increasingly prohibit the sharing of accounts between users. But this causes big headaches for many IT managers in both the public and the private sector, as shared and privileged accounts have become a necessary component of today's enterprise IT infrastructure. All kinds of employees, from office administrators and temporary workers to nurses and civil servants, require access to shared account logons for enterprise applications and systems for all kinds of reasons. IT managers therefore need to strike a balance between providing the flexibility required to meet end users' needs, and ensuring security and compliance with corporate policy and the latest industry regulations and legislation.

AVOIDING A "KEYS TO THE KINGDOM" ATTACK

So how do organizations protect themselves from the risks posed by shared and privileged user accounts in a cost-effective manner? To ensure compliance, and to ensure that IT applications and systems are secure, organizations need to know who is using what shared account and when. They need absolute certainty so they can identify the culprit if data is stolen, changed, or deleted. They also need to be able to demonstrate this information in a clear audit trail.

The first step organizations need to take is to put in place a scalable and flexible method for regularly changing passwords, as well as a reliable way of ensuring that every password generated is unique on each system and suitably complex. The second step is to centralize shared account storage and control, so that a user must request a shared password from the organization's central directory. The request can then be approved or denied based on pre-established policies set by the organization.
If approved, the administrator is logged on to the network device, and when the session ends the password is automatically checked back into the directory. This step ensures that the organization has visibility and control each and every time a privileged credential is accessed or used.
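The checkout workflow just described can be sketched in a few lines: a central vault that generates complex passwords, releases a credential only when a policy callback approves the request, rotates the password on check-in, and records every action. This is a simplified illustration under invented names (`SharedAccountVault`, `check_out`), not Passlogix's implementation; a real vault would persist state securely and integrate with the corporate directory.

```python
import secrets
import string
from datetime import datetime, timezone

ALPHABET = string.ascii_letters + string.digits + "!@#$%^&*"

def generate_password(length=20):
    """Generate a unique, suitably complex password."""
    return "".join(secrets.choice(ALPHABET) for _ in range(length))

class SharedAccountVault:
    """Central store that gates every use of a shared credential."""

    def __init__(self):
        self._passwords = {}   # account -> current password
        self._audit = []       # (timestamp, user, account, action)

    def enroll(self, account):
        self._passwords[account] = generate_password()

    def check_out(self, user, account, approve):
        """Release the credential only if the policy approves the request."""
        if not approve(user, account):
            self._audit.append((datetime.now(timezone.utc), user, account, "denied"))
            return None
        self._audit.append((datetime.now(timezone.utc), user, account, "checked_out"))
        return self._passwords[account]

    def check_in(self, user, account):
        """Rotate the password on return so the released one is useless."""
        self._passwords[account] = generate_password()
        self._audit.append((datetime.now(timezone.utc), user, account, "checked_in"))
```

Because the password is regenerated on every check-in, knowledge of a previously released credential confers no lasting access, which is the essence of the control the article describes.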
The more people who know a password, the greater the threat. So the next step is to ensure that all passwords for shared accounts are concealed, so that a user never knows the password of an account that is checked out. This prevents the inadvertent sharing of passwords, as well as sabotage by rogue administrators. It also eliminates the problem of users forgetting passwords, which puts a huge strain on IT support staff, or writing them down, which in itself poses a huge security risk.

To facilitate regulatory compliance it is also important to tie shared account usage to the user within the organization's identity management system, so that the actual user of a shared password is known at all times. For some particularly sensitive accounts, organizations might also want to consider controlling the usage of privileged or shared passwords by policy: for example, by setting a limited time window for use, or prescribing a maximum number of logons. A further security measure could be to introduce two-factor authentication, such as security tokens, at the point of logon to ensure that the person using the account is the person authorised to check it out. It is also important to keep a history of passwords for each network device, so that if a device needs to be restored from backup for any reason, the password current at that time can be retrieved.

CLOSING THE GAP

The loss of revenue and damage to reputation suffered by the City of San Francisco
Stephane Fymat
VICE PRESIDENT OF STRATEGY AND PRODUCT MANAGEMENT Passlogix
As the Vice President of Strategy and Product Management for Passlogix, Stephane is responsible for setting the corporate, product, and technology strategy of the company and managing its entire product line. In his current role, he leads the organization responsible for the conceptualization, design, and overall direction of the Passlogix product line. He is also responsible for the company's strong authentication initiative, product marketing, corporate positioning, and evangelism activities. Stephane joined Passlogix in 1998 to manage sales and business development. He helped define the products while they were still at the prototype stage, sold them to Passlogix's first enterprise customer, and went on to build a sales organization that has since established many enterprise customers, resellers, and OEM partnerships. Stephane graduated with honors from UCLA with a Bachelor of Science in Engineering and Computer Science, and with honors from Columbia Graduate School of Business with an MBA in Finance and Marketing.
administration and Société Générale could so easily have been avoided if these organizations had put such relatively low cost security measures in place. Both incidents highlight the need for greater control over administrative passwords and access to enterprise networks. The bottom line is simple: passwords alone no longer provide adequate protection. Solutions for managing shared credentials can provide a simple, secure, and audit-ready approach to providing system and application
access for administrators, temporary workers, and others who must share account passwords. They dramatically reduce the risk that enterprise systems will be compromised. Not only does this close the security gap associated with privileged accounts and shared password management, but it also provides a cost-efficient way for organizations to comply with data protection and PCI DSS regulations that prohibit the sharing of accounts between users.
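The policy controls the article suggests, a limited time window for use and a maximum number of logons, might look like this in outline. The class name and parameters are hypothetical, chosen only to illustrate how such a pre-established policy could gate a checkout request.

```python
from datetime import time

class CheckoutPolicy:
    """Approve shared-account use only inside a time window
    and below a per-user logon quota."""

    def __init__(self, window_start, window_end, max_logons):
        self.window_start = window_start  # e.g. time(9, 0)
        self.window_end = window_end      # e.g. time(17, 0)
        self.max_logons = max_logons
        self.logon_counts = {}            # user -> approved logons so far

    def approve(self, user, now):
        """Return True and count the logon only if both rules pass."""
        within_window = self.window_start <= now <= self.window_end
        under_quota = self.logon_counts.get(user, 0) < self.max_logons
        if within_window and under_quota:
            self.logon_counts[user] = self.logon_counts.get(user, 0) + 1
            return True
        return False
```

A denied request would then be logged rather than fulfilled, giving auditors both the attempt and the policy that blocked it.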
Don't just store it. Do more with it.
• Are you struggling to manage increasing volumes of information efficiently?
• Do you need to optimise your storage capacity to save costs?
• Is your information stored securely?
• Are you compliant with the latest legislation?
It’s not just about how you store your information. Find out how to do more at Storage Expo –
the definitive event for data storage, information and content management.
14 – 15 October 2009, Olympia, London. Register for free at www.storage-expo.com
Ask the Expert n virtualization security
A parallel universe

In order to move seamlessly from the physical to the virtual world, IT managers need unified, global control over their evolving physical and virtual data centers to meet security and compliance requirements, warns DAVID McNEELY (CENTRIFY).

AK: At ETM we know that virtualization is a hot topic at the moment. Tell me David, why is security an even more important requirement for virtual data centers than for physical ones, and what's at stake?

DM: We are finding that a lot of companies are moving to virtual environments for many different reasons: to reduce costs or increase capacity within the data centers, and also to improve business agility. If the business needs to spin up a new application more rapidly, it's certainly a lot easier to spin up a new VM (virtual machine). So, let's think about the physical servers that exist today in the data center. As companies go through the migration to a virtual infrastructure, each physical server becomes nothing more than a file, a virtual hard disk. It lives on a server somewhere, and that means it's a lot easier to simply move or copy, which improves IT efficiencies when it comes to hardware maintenance. I can take a virtual server that's operating, move it to a different server, and it will continue to run even while I take that physical machine down for hardware maintenance, upgrade the hard disk, or upgrade the memory. That also means we need to do a better job of protecting that file as it lives on the server. Because it contains my
entire server, stealing a server is just as easy as copying that file onto a thumb drive or CD-ROM and taking it off the network.

AK: You're talking about some definite advantages for companies out there, and obviously some challenges in the security arena as well. What types of applications benefit most from the agility you can get via virtualization?

DM: Usually, in these environments, it's the most important systems that are being virtualized, because they are the ones that need to be able to scale rapidly in order to meet internal business growth and address external customer demands. Additionally, we are finding that new applications, as they come into the organization, are the ones being deployed for the first time in a virtualized environment. This is because it's a lot easier, as I said earlier, to spin up a VM and create an environment that hosts a new application. Instead of having to submit a request or a purchase order for a hardware machine and wait for it to be ordered, received, provisioned, and put into service, it is a lot faster to spin up a virtual machine than to set up a physical computer. Therefore, the types of applications
that are moving into the virtual world first tend to be the ones that need to be protected the most, because they are hosting critical internal services or revenue-generating customer applications.

AK: That certainly sounds like a quick process with immediate results. Can you tell me what is uniquely different about the security of a virtual data center, as opposed to a physical one?

DM: The physical server is traditionally in a data center room with a locked door, and typically IT has processes for controlling who has access to that room. If I wanted to access the server or steal something, I could try to poke in on the network side, but usually there are decent firewalls in place. So in order to steal the server I would need to get the hard drive itself. But in this new virtualized environment, that server is now a file that lives on another computer. In reality, this other computer, the virtual machine host, is the one that needs to be protected most stringently. The data center room itself, the door that was protecting that room, and the rack that the servers were located in are now all virtual and based on a VM host system. So protecting the VM host becomes the primary responsibility of security products.
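McNeely's point that a virtual server is ultimately just a file suggests a simple first check an administrator could run: audit the datastore for VM disk images readable by anyone other than their owner. This is a rough sketch assuming a POSIX file system and a hypothetical set of image extensions; products in this space go far beyond permission bits, but the example shows what "protecting the file" means at its most basic.

```python
import os
import stat

# Common virtual disk extensions (illustrative, not exhaustive).
VM_EXTENSIONS = {".vmdk", ".vhd", ".dsk"}

def find_exposed_images(root):
    """Walk a datastore directory and flag VM disk images
    that are readable by group or other, i.e. by users
    beyond the file's owner."""
    exposed = []
    for dirpath, _, filenames in os.walk(root):
        for name in filenames:
            if os.path.splitext(name)[1].lower() in VM_EXTENSIONS:
                path = os.path.join(dirpath, name)
                mode = os.stat(path).st_mode
                if mode & (stat.S_IRGRP | stat.S_IROTH):
                    exposed.append(path)
    return exposed
```

Any path this returns is a whole server that could, in principle, be copied to a thumb drive by anyone with read access.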
AK: We see vendors like VMware who are also providing management and security tools with their offerings. Aren't these sufficient?

DM: They certainly provide the management tools to manage their virtualized environment, whether it is the virtual host or the guests running on top of it. But we find there are many different ways to get to those hosts; in working with VMware we have a lot of customers using that environment as well as many other virtualization environments. Typically, there's more than just a management interface that can gain access to a host. Underneath the management interface there is an operating system hosting some of these virtual machines, usually a thinned-down version of Linux in the case of VMware, or, in the UNIX environment, the UNIX host itself. There is a command-line interface that can gain access, and as with any UNIX/Linux system there's a root account that needs to be protected, because it has permission to do anything.

So we want to ensure that, as administrators log on to these hosts to manage all the virtual systems on top, there's a way to grant each administrator the privileges, and only the privileges, that he needs for the specific tasks at hand. Let's say his job function is to make sure the system stays up and running. He needs to make sure there are no processes eating up the entire CPU of the system, and potentially he needs to be able to kill any runaway processes. He also needs to check the hard drive space. These kinds of commands must be run with a privileged account. Yet ideally, we want to ensure that the administrator logs on with his own account, not a root or other generic superuser account, and so we use a privilege elevation tool to grant the admin the right to execute only the commands relevant to his job role on the VM host. If the administrator logs in with a root or other generic account, we fail compliance audits because we can't prove accountability for actions. If we grant too many privileges, we fail security audits because the administrator could perform actions outside his job duties. This protects against malicious acts, like inserting backdoor accounts, viewing customer data, or, again, copying and stealing an entire VM. But it also protects administrators against inadvertent errors, such as mistakenly deleting a critical file.

AK: OK, let's talk now about the virtual client machines
running on the virtual host systems. If we've secured the virtual host, is there any need for additional controls to secure access to the guests?

DM: Yes, there's a common misconception that access to the guest is gated by the host system. If you think about the host system and the functions it provides, it essentially offers a virtualized network, and a virtualized KVM switch that gives access to the guests' consoles. But there are multiple ways of gaining access to a guest system. While we could go through the host to get to the real console of that guest system, we can also come over the network to that guest and get access through an SSH interface, if it's a UNIX/Linux system, or through RDP or VNC, depending on which graphic interface the guest provides. So we also need to protect the guest, just as you would protect any of the physical servers within the data center. For those UNIX/Linux systems, ideally you would want the same kind of security, where you require your system administrators to log on with their own accounts. Instead of giving everybody the root account, which grants permission to do any function, you would want to ensure that those administrators log on with the specific accounts assigned to them, and
then executing privileged commands based on their particular role or task at hand.

AK: So you're really in the business of protecting everyone when you do that. The change that's going on in the data center isn't just the transition from physical to virtual infrastructure. In some cases it's also a parallel migration from in-house systems to hosted systems. What are the special requirements that ISPs have in terms of securing access to the virtual hosts and guests for multiple clients?

DM: ISPs are challenged with an additional task, which is to isolate and protect the guest systems for each of their customers. That means if you have administrators from each of these different customers logging in to gain access to the guest systems, and I have my own administrative staff as the hosting provider logging in to access the host system, we all need to be able to manage the infrastructure together; it's a shared responsibility. There's a trust relationship between the hosting provider and
the customer in that environment. Yet there's not that same kind of trust relationship between the various customers that the hosting provider may be hosting within its data center. We end up with the very particular task of needing to segregate administrative duties between these various systems, and to ensure that there is a notion of isolation between the various guest computers for the clients. So we need to make sure that management of the VM host itself can be shared between the customer and the hosting provider, and yet kept completely separate from one customer to the next. Once we've done that, we have set up an environment that can be secured between the various customers.

Now, as we launch and run VMs on top of that hosting infrastructure, the hosting provider can hand control over to customers and say: "You're on your own to manage your own guests." As the hosting provider, my responsibility is to make sure that the guest continues to run correctly, and that it doesn't get damaged, stolen, or misplaced within the network. But I don't have the ability to do anything inside that guest operating system.

AK: Auditing is going to be a special
requirement for ISPs as well, isn't it?

DM: Yes, the only way we can make sure that service levels are maintained and met is by ensuring there's the ability to record administrative activity on these virtual hosts. We need to know exactly who is manipulating and managing access to those systems, and what kinds of actions take place. That way, if an administrator logged in and did something he didn't believe was damaging to the system, yet it did cause an outage, we'd be able to identify who was at fault based on the shared responsibility for managing that virtual system. We need to ensure the integrity of both the staff accessing the systems and the security of the overall infrastructure, so that there's the ability to go back, audit forensically, describe where the problem came from, and fix it rapidly in order to meet service level agreements.
David McNeely
Director, Product Management, Centrify Corporation
As Director of Product Management, David works with customers to drive the roadmap for Centrify's award-winning Centrify Suite solutions. Prior to joining Centrify, he served as Technical Marketing Manager at ActivCard, where he launched a new product that helped the company establish a presence in the single sign-on market. David also worked at Netscape, where he held various roles, from Senior Systems Engineer through to Director of Product Management for the Netscape Directory (iPlanet) and Security product line. Earlier, he served as Desktop Systems Architect at AT&T Wireless Services, where he was responsible for the messaging systems' backbone and internet connectivity. David holds a Bachelor of Science degree in Electronic Engineering Technology from Texas Tech University.
Your Application. His Thumb Drive. In a virtualized data center, starting up a new application server is as easy as copying a file.
So is stealing it. More than 1,000 organizations have embraced Centrify’s solutions for securing their cross-platform data centers. View our webinar on Addressing the Unique IT Security Risks Posed by the Virtual Data Center to learn how Centrify’s solutions provide a powerful, cost-effective way to enable security and compliance.
www.centrify.com/virtualization North America: +1 (408) 542-7500 Europe: +44 118 965 7755
EVENTS AND FEATURES ■ JULY 2009
Coming up in the August issue, ETM is focusing on tackling ITSM and GRC:
PLUS, WE’LL BE FEATURING INTERVIEWS AND MUCH MORE FROM OTHER INDUSTRY LEADERS.
Events
DATA CONNECTORS TAMPA TECH-SECURITY CONFERENCE 2009 DATE: 2 July 2009 LOCATION: Tampa, FL URL: www.dataconnectors.com/events/2009/07Tampa/agenda.asp
6TH INTERNATIONAL CONFERENCE ON IT IN ASIA 2009 DATES: 6 – 9 July 2009 LOCATION: Kuching, Malaysia URL: www.cita09.org
2009 INTERNATIONAL CONFERENCE ON INFORMATION AND NETWORK TECHNOLOGY (ICINT 2009) DATES: 10 – 12 July 2009 LOCATION: Perth, Australia URL: www.icint.org
LISA ERICKSON-HARRIS from EMA will introduce the Service Desk function from the ITIL perspective and describe its role and benefits for ITSM.
DENNIS DROGSETH, also from EMA, is looking into how to make your CMDB a success in 2009. He’ll discuss some of the challenges and provide recommendations for addressing them.
EUROPEAN AND MEDITERRANEAN CONFERENCE ON INFORMATION SYSTEMS (EMCIS) DATES: 13 – 14 July 2009 LOCATION: Izmir, Turkey URL: www.emcis.org
MULTICONF-09 DATES: 13 – 16 July 2009 LOCATION: Orlando, FL URL: www.promoteresearch.org
WIRELESS BROADBAND WORLD AFRICA 2009 DATES: 13 – 16 July 2009 LOCATION: Cape Town, South Africa URL: www.terrapinn.com/2009/wireza
GARTNER SOA SUMMIT DATES: 14 – 15 July 2009 LOCATION: Tokyo, Japan URL: www.gartner.com/it/page.jsp?id=857312
GARTNER 6TH ANNUAL OUTSOURCING SUMMIT DATES: 15 – 16 July 2009 LOCATION: Sao Paulo, Brazil URL: www.gartner.com/it/page.jsp?id=767113
THE TEAM from PLANNET will introduce us to “ITIL for ITIL’s sake”, focusing on how organizations can learn from forced efficiencies made in a recession, and the shift from the costly drive for best practice to a focus on ROI.
WORLD SUMMIT ON GLOBAL ECONOMIC CRISIS (WSGEC 2009) DATES: 16 – 18 July 2009 LOCATION: London, UK URL: http://mimm-ltd.com/default.aspx
2009 NEW ZEALAND CIO SUMMIT DATES: 21 – 22 July 2009 LOCATION: Auckland, New Zealand URL: www.brightstar.co.nz/nz/3rd-annual-ciosummit-2009.html
SOUTH EAST ASIA COM DATES: 22 – 23 July 2009 LOCATION: Kuala Lumpur, Malaysia URL: http://seasia.comworldseries.com
EXPO COMM-WIRELESS JAPAN 2009 DATES: 22 – 24 July 2009 LOCATION: Tokyo, Japan URL: http://expocomm.com/wirelessjapan/visprof.html
BURTON GROUP CATALYST CONFERENCE 2009 DATES: 27 – 31 July 2009 LOCATION: San Diego, CA URL: www.catalyst.burtongroup.com/Na09
INTERNATIONAL CONFERENCE ON COMPUTER ENGINEERING AND TECHNOLOGY (ICCET 2009) DATES: 29 – 31 July 2009 LOCATION: Oslo, Norway URL: www.waset.org/wcset09/oslo/iccet
My CEO doesn’t know my name. And that’s the way I plan to keep it. Effective data security is key to preventing breaches, simplifying the compliance process and reducing risk to your organization. Let us help you focus your time, money and resources on more strategic projects, reduce the workload of securing critical information, and streamline compliance reporting for mandates such as PCI DSS, HIPAA, NERC, and Sarbanes-Oxley.
Our solution provides a multi-level approach to data security and compliance: • NetIQ® Security ManagerTM – from Log Management to Complete SIEM
• NetIQ® Secure Configuration ManagerTM – Compliance Assessment to Security Configuration Auditing
• NetIQ® Change GuardianTM – Privileged-User Management and File Integrity Monitoring
• NetIQ® Aegis® – the First IT Process Automation Platform for Security and Compliance
If you’d like to find out more about how NetIQ can help you with data security and critical compliance mandates, visit www.netiq.com/undertheradar or contact info@netiq.com.
© 2009 NetIQ Corporation. All rights reserved. NetIQ, the NetIQ logo, NetIQ Security Manager, NetIQ Secure Configuration Manager, NetIQ Change Guardian, and NetIQ Aegis are trademarks or registered trademarks of NetIQ Corporation or its subsidiaries in the United States and other jurisdictions. All other company and product names may be trademarks or registered trademarks of their respective companies.
SYMANTEC IS COMPREHENSIVE ENDPOINT PROTECTION. Worms. Spyware. Phishing. Today's breed of online threats poses serious risks when it comes to securing your network and your company's privacy. Endpoint Protection goes beyond just antivirus, integrating essential security technologies to protect your laptops, desktops, and servers. Try Endpoint Protection today at symantec.com/endpointprotection
© 2009 Symantec Corporation. All rights reserved. Symantec and the Symantec Logo are registered trademarks of Symantec Corporation or its affiliates in the U.S. and other countries. Other names may be trademarks of their respective owners.