Cyber Security of Critical Infrastructure Summit - Proceedings


CYBERSECURITY OF CRITICAL INFRASTRUCTURE SUMMIT 2017 PROCEEDINGS

JANUARY 11 – 13, 2017 ANNENBERG PRESIDENTIAL CONFERENCE CENTER Texas A&M University



THE LYNDE AND HARRY BRADLEY FOUNDATION

The Lynde and Harry Bradley Foundation was the sole sponsor of the Cybersecurity of Critical Infrastructure Summit 2017, and the Foundation has graciously provided funding to help students undertaking undergraduate cybersecurity studies.

THE FOUNDATION’S MISSION

The Bradley brothers were committed to preserving and defending the tradition of free representative government and private enterprise that has enabled the American nation and, in a larger sense, the entire Western world to flourish intellectually and economically. The Bradleys believed that the good society is a free society. The Lynde and Harry Bradley Foundation is likewise devoted to strengthening American democratic capitalism and the institutions, principles, and values that sustain and nurture it. Its programs support limited, competent government; a dynamic marketplace for economic, intellectual, and cultural activity; and a vigorous defense, at home and abroad, of American ideas and institutions. In addition, recognizing that responsible self-government depends on enlightened citizens and informed public opinion, the Foundation supports scholarly studies and academic achievement.

THE LYNDE AND HARRY BRADLEY FOUNDATION THE LION HOUSE 1241 NORTH FRANKLIN PLACE MILWAUKEE, WI 53202-2901 414.291.9915



CYBERSECURITY OF CRITICAL INFRASTRUCTURE SUMMIT 2017

WELCOME

In January 2017, we were honored to welcome distinguished guests to the inaugural Texas A&M Cybersecurity Summit. We believed the focus of this event, the cybersecurity of critical infrastructure, was not only critically important but also unquestionably timely. We believe that strategic leaders in all organizations must gain a deeper appreciation for both the evolving cyber threats to our crucial infrastructure and the promising technological and policy innovations that could mitigate those threats. It was our hope that, after this event and follow-up engagements, summit participants and others would be better prepared to make well-informed and sound decisions relating to policies, technologies, and resources allocated to address cybersecurity threats to our infrastructures.

We assembled a diverse set of cybersecurity experts and thought leaders to serve as keynote speakers and panelists, allocated time in every session for questions and answers, and encouraged all participants to actively engage in a spirited and fruitful dialog. We are proud that Texas A&M University, particularly the College of Engineering and the Bush School of Government and Public Service, has a long history of engaging in research and educational undertakings that contribute to national and economic security. We believe that the summit, a wholly collaborative effort, made further contributions to this end. We are thankful for those who joined us at this important inaugural event. We hope you enjoyed your time at Texas A&M, and we look forward to continuing the discussion!

M. Katherine Banks, Ph.D., P.E. Vice Chancellor and Dean of Engineering; Director, Texas A&M Engineering Experiment Station; Harold J. Haynes Dean’s Chair Professor

Mark A. Welsh III Dean and Executive Professor, Bush School of Government & Public Service Edward & Howard Kruse Endowed Chair




CYBERSECURITY OF CRITICAL INFRASTRUCTURE SUMMIT 2017 PROCEEDINGS

EDITORS IN CHIEF: Dr. Andrew Ross, Dr. Daniel Ragsdale
MANAGING EDITOR: Dr. John Junkins
ASSISTANT EDITORS: Emily Otto, Kelsie Suter, Carter Ross
CREATIVE DIRECTOR: Michelle Grierson
PROJECT MANAGER: Gina Daschbach
TRANSCRIPTIONS EDITOR: Heather Saeed
EVENT PHOTOGRAPHER: Michael Kellett

ADVISORY BOARD: Dr. Clifford Fry, Col. Carlos Vega, Lynn Schlemeyer, Ray Rothrock, Bob Butler, Rick Howard, Julia Pierko
BRADLEY FELLOWS: Ryan Vrecenar, Colton Riedel, Tyler Holmes
CONTACT: Texas A&M Cybersecurity Center, Donald L. Houston Center, 200 Discovery Drive, 4254 TAMU, College Station, TX 77843

The views expressed in these proceedings are those of the authors and not of the Texas A&M Cybersecurity Center, the Texas A&M Engineering Experiment Station, or any other Texas A&M University System entity. The authors of specific content published in the Texas A&M Cybersecurity Summit proceedings own copyright to their works, as stated in the Texas A&M University System Policy for Intellectual Property Management and Commercialization. Copyright owners retain their copyright while granting a non-exclusive distribution license to Texas A&M University. This publication of the Cybersecurity of Critical Infrastructure Summit was designed and produced by Gina Daschbach Marketing, LLC under the direction of Fedwriters.



TABLE OF CONTENTS

01  Welcome
04  Introduction
06  A DARPA Perspective on Cybersecurity (Keynote)
14  The Case for Resilience in Critical Infrastructure (Keynote)
20  The Evolving Threat to Critical Infrastructure (Panel)
26  The Current Strategic Perspective of Cybersecurity (Keynote)
34  Making Cyber Resilience Strategic to the Business (Keynote)
38  Emerging Technologies to Address Current and Future Threats (Panel)
48  Policy to Address Current and Future Threats (Panel)


DAY 1 JANUARY 11, 2017 INTRODUCTION

DANIEL RAGSDALE

One of the key messages from this conference is: while cyber security threats are very real and significant, enterprises that make informed decisions about the allocation of resources, policies, practices, procedures, and technology acquisition will, over time, be more secure, rendering our critical infrastructures less vulnerable to disruptive or even destructive cyber attacks. These well-informed enterprises will gain an advantage over their less well-informed competitors.

On behalf of our generous sponsor, the Lynde and Harry Bradley Foundation, and our gracious hosts, The Bush School of Government and Public Service and the Texas A&M College of Engineering, we want to welcome you to the inaugural event in the Texas A&M Cyber Security Summit Series, focusing on the cyber security of critical infrastructure. The evolving threat to our critical infrastructure is real and almost certainly increasing. We're going to address that topic and focus on emerging technologies, innovations, and policies.

I'd like to introduce Dr. John Junkins, the director of the Texas A&M Institute for Advanced Studies, whose vision resulted in this event. He led the development of a broad-ranging proposal to the Bradley Foundation to provide funds for this important event and a follow-on event in April of 2018. John is a distinguished professor of aerospace engineering and an internationally respected authority on spacecraft navigation, guidance, dynamics, and control. He serves as the founding director of the Texas A&M University Institute for Advanced Studies.

JOHN JUNKINS

I want to welcome you on behalf of the Institute for Advanced Study and Texas A&M University. This is a starting point for a major thrust into cybersecurity. We aim to make a difference, and I'm very excited about having this exchange of ideas.

ANDREW ROSS

I'd like to welcome you on behalf of The Bush School of Government and Public Service to Texas A&M. I've had the honor and privilege of being able to work with Dan Ragsdale and other folks at this Center for Cyber Security, at The Bush School and Texas A&M, especially with folks in engineering, to help put this program together. I'd like to introduce my dean, Mark Welsh, known to many of you as General Welsh. He became our dean in the middle of August 2016, and was the twentieth Chief of Staff of the Air Force, starting in August 2012. He had previously been Commander of U.S. Air Forces in Europe and Commander of NATO's Air Command at Ramstein. He also served as Associate Director of Military Affairs at the Central Intelligence Agency and Commandant of the United States Air Force Academy. He is a graduate of the U.S. Air Force Academy, the Army Command and General Staff College, the Air War College in Montgomery, and the National War College in D.C. He was a fellow at the Massachusetts Institute of Technology, a fellow in the National Security Studies program at Syracuse and Johns Hopkins, and a fellow in Ukrainian Security Studies at the Kennedy School at Harvard. He has brought new energy and direction to The Bush School, and we're all thrilled to have Mark as our dean.

MARK WELSH

On behalf of Kathy Banks and myself, I'm excited to welcome you. You have all been involved in many ways, shapes, or forms for a while now, and tonight is a great way to step into the discussion. Our intent is to move the conversation forward rapidly because our job is to make a difference. I've been engaged with the NSC, intelligence agencies, the FBI, Homeland Security, and State in this particular mission area. I've seen it all along as a three-legged stool: it's technology, it's operations and practice, and policy is the third leg, where we have a lot of blank space. There have been lots of attempts to move forward in that arena, but we haven't really done a great job of getting to the point where we can operate across organizational lines in a clear, coordinated, and precise way. It's a problem for us in arenas from national security to national infrastructure. We can't even agree on what infrastructure means.

The conversation is starting to drift. I've sat in round tables in Washington, D.C. I've sat in crisis management exercises with the NSC and leaders of departments and government agencies to respond to national cyber crises, whether a natural cyber disaster or a comprehensive and consequential attack. We get to a point where we don't understand what the policy is that allows us to move forward together quickly. The decision timelines can be measured with a calendar in an environment that operates at the speed of light. Bringing experts in technology together with experts in operations, practice, and policy, both scholars and practitioners, allows us to have a comprehensive discussion that can move these areas forward.

DAY 1 JANUARY 11, 2017 MAIN KEYNOTE

A DARPA PERSPECTIVE ON CYBER SECURITY


JOHN LAUNCHBURY

Dr. John Launchbury is the Director of the Information Innovation Office (I2O) at DARPA. In this role, he develops strategy and works with I2O program managers to develop new programs and transition program products. Before joining DARPA, Dr. Launchbury was chief scientist of Galois, Inc., which he founded in 1999 to address challenges in information assurance through the application of functional programming and formal methods. Under his leadership, the company experienced strong growth and was recognized for thought leadership in high assurance technology development. Prior to founding Galois, Dr. Launchbury was a full professor at the OGI School of Science and Engineering at OHSU (Oregon). He earned awards for outstanding teaching and gained international recognition for his work on the analysis and semantics of programming languages, the Haskell programming language in particular. Dr. Launchbury received first-class honors in mathematics from Oxford University, holds a Ph.D. in computing science from the University of Glasgow, and won the British Computer Society's distinguished dissertation prize. In 2010, Dr. Launchbury was inducted as a Fellow of the Association for Computing Machinery (ACM).

This community brings together people with different perspectives, and cyber security needs that. DARPA has been working in cyber security for about 20 years. Some of the technology that is making a big difference in the world now was being pioneered in the 90s. There often are new technologies that can make a significant difference once we figure out how to deploy them. In my office, we spend about $300 million a year on cyber security research. That's about half of what the Department of Defense spends on cyber security science and technology research.

Hackers steal our data. We have the problem of the internet of things. People have very cool devices that they put in their houses, and they have no idea that those devices have hardwired passwords, which allow an attacker to just log into those devices and use them as launching points for something else. The challenge is how do we think about cyber security from a fundamental perspective, and what do we want to do about it? A technical understanding can then inform the discussions about both operations and policy.

A cyber attack is when somebody gets into my machine and does something with it that I haven't authorized. They may want to get information that is in a machine. It might be personal information like email, my texts, or photographs, or it may be corporate email, or it may be something to do with designs of particular devices. Stealing information is a high priority for cyber attackers, but they may also want to get into a system to have that system do something on their behalf by performing the commands that they give it.

It may be that that's a powerful system, and an attacker wants to shut it down or degrade it, to have it not behave as well as it might. One of the complexities when thinking about cyber is that these end results are often the end of a chain of many steps. What attackers spend a lot of their time doing is trying to get into a place where they can accomplish one of these primary goals. First, they may want to break into a system, hide on that system, and have a presence on that system that persists. Then, they communicate back with the outside world and use that vantage point to break into another system. From that second vantage point, they continue to pivot and break into another system, until they get to what they really care about.

Cyber attacks will often have stages and secondary goals. We don't have a good way of measuring cyber risk at the moment. I'd love to be able to say how vulnerable any given system is, but we only have very rudimentary methods for talking about that. At the highest level, we can use the standard equation of risk, which combines how easy it is to break into a system with how valuable it would be for somebody to break into it. We need to think not only about how easy it is for somebody to get into systems, but also about what benefit the attacker gets by being able to get into them. That should inform our thinking about risk.
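Written out, the "standard equation of risk" referred to here is usually given in a multiplicative form. The formulation below is the common convention rather than something specified in the talk:

\[
\text{Risk} \;=\; \underbrace{P(\text{successful break-in})}_{\text{how easy it is to get in}} \;\times\; \underbrace{V(\text{attacker benefit})}_{\text{how valuable getting in is}}
\]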



Let's think about what a cyber attack looks like. Somebody might want to put a keylogger on my machine because they want to steal my credit card and spend my money. The cyber criminal has the challenge of how to put a keylogger on the machine. They might decide to use Flash because it has some vulnerability issues; in fact, Flash is the number one source of cyber vulnerabilities at the moment. The cyber criminal might get hold of a Flash application, exploit a flaw in its implementation, and then install a keylogger on the machine as an application. The cyber criminal may use an ad on a webpage, and when that ad runs, it may have malicious code that triggers the vulnerability in Flash, which then installs the keylogger. The cyber criminal may learn a bit about you, then send you a link in an email that you find believable, and when you click on it you go to this webpage. Many think the problem is the user clicking on links all the time. The real problem is that we've built systems for which it's not okay to click on links.

It is helpful to take a diagram like this and consider the kind of systems we are talking about. Are we talking about user actions? Are we talking about code or an application that's loaded on the machine? Are we talking about the way that the code is implemented and potential flaws in that implementation, or are we talking about something deeper than that code, which has to do with the platform itself? Indeed, this notion of quadrants of different styles of attack is quite useful, because you can start to think about places where you might have flaws in systems that lead to vulnerabilities. I'm going to run through examples of single-quadrant, one-dimensional cyber attacks where there are problems either in hidden functionality, in firmware, or in the underlying hardware. Let's start in the top right-hand corner, with the applications you might load on your PC or your phone.


One of them might be a game, Sudoku. Another might be a map. Each of these applications may ask you for permissions to access the internet or your location. Is that okay? Here's Sudoku. Is it okay that it uses the internet? Maybe it connects with other people and sends my score, so maybe that's okay. What about location? I don't know why that game needs my location. What about the map? Of course, it needs to know my location and to access the internet, so that's completely natural. These two examples came from a DARPA program where we were challenging the teams to discover implicit and hidden vulnerabilities in the code of the application itself, by the actions of the application. In this case, the problem was the map. It was acting just as a normal map, but if you happened to be at a certain place in the world, instead of going to the regular map server, it would go to an alternative map server. The owner of that alternative server then knows exactly where all the people are who are using that map application.

The challenges in these applications can be subtle. Think about how the code is actually implemented and the kinds of things that can go wrong. For example, memory on a computer is sort of vertically stacked. You might have a program (perhaps it's called Main) which reads a file, then does something with that file, and then prints whatever it has done with that file. What the computer will need to do to run that code is to allocate some memory. First, it will allocate memory to say, "This is the procedure that's running, and when I'm finished, this is where I need to jump back to because that's the place where I came from." Then it will allocate some more space because it will do some work while it's executing. Then it's going to read a file, so it needs to allocate a chunk of space to be able to read the file. That's sort of how things lay out, and when you normally read the file, you read it into that memory, and then you work on it.
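As a minimal sketch of the fragile construction being described, assuming a C-like environment (the file name and function names here are hypothetical, and real exploitation involves more detail than shown), a fixed-size buffer on the stack is filled with input data without any length check:

    #include <stdio.h>
    #include <string.h>

    /* Hypothetical example: a fixed 64-byte buffer sits on the stack
     * next to bookkeeping data such as the saved return address. */
    static void process_record(FILE *f)
    {
        char buf[64];              /* the "chunk of space" to read the file into */
        char line[1024];

        if (fgets(line, sizeof line, f) == NULL)
            return;

        /* BUG: strcpy copies however many bytes the input contains.
         * If the line is longer than 64 bytes, the copy overruns buf and
         * can overwrite the saved return address, redirecting execution
         * into attacker-supplied data when the function returns. */
        strcpy(buf, line);

        printf("processed: %s\n", buf);
    }

    int main(void)
    {
        FILE *f = fopen("input.txt", "r");   /* hypothetical input file */
        if (f != NULL) {
            process_record(f);
            fclose(f);
        }
        return 0;
    }

The next paragraph describes exactly what goes wrong with this construction when the input is longer than the 64-byte buffer.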

This is vulnerable: if I have exactly that same structure but I send it more data than it expected, that data will fill up the buffer and overrun it. It may also overrun the data that says what to do when you're done, so instead of going back to location 100 when you're done, now it says, "Go to location 950 when you're done." Then that little piece of code thinks it's done. It says, "I've got a note here that says I need to go to location 950," which is in the middle of the data stream that just got loaded and could be malware that the computer now proceeds to execute. Just from that sort of fragile construction, it's possible for people to get their own code loaded onto your computer via a data stream. There's work being done on how to shut down this kind of vulnerability, particularly with legacy systems.

That was the "how is the thing implemented?" quadrant. We looked a little at "what is the application?" and at "how is it implemented?" Now let's think about the user. Users sometimes don't take security very seriously and may have very simple passwords, which may even get posted beside the machine. The cleaning staff, then, can easily see your passwords, and they may choose to take note of those passwords and sell them on the open market.

The other quadrant says, "What happens when you go deeper down in the machine and you think about things that could go wrong?" There are whole classes of things that could




go wrong at that level, for example, a plug-in device. Suppose you've just left your laptop in your hotel room, and it's locked. You've put it to sleep, and it needs a password to wake up. You're pretty confident. Again, the cleaning staff could potentially come by and just plug a device into that computer. The computer is so willing to talk to any device that it's possible for that device to reprogram some of the decisions the computer makes, like where it should go to look for web pages. So later, when you turn it on and want to connect to home base, it may go to another place first.

Here's another example about implementation and storing information in memory. Memory should be readable and writable only in certain places. One of the challenges, particularly in our phones, is that in order to get vast amounts of memory, it has had to be squeezed down to such a level that there are problems in the memory itself. A row hammer attack happens when an attacker repeatedly writes to just one location in memory. Memory is electromagnetic, and doing the same thing again and again to the same location can flip memory in neighboring locations that the attacker has no right to access. Indeed, people have been able to get significant attacks going by this sort of mechanism.
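A rough sketch of the access pattern behind this kind of attack, for illustration only: the published technique uses repeated, cache-flushed reads of two addresses rather than writes to one location; the sketch is x86-specific because of the _mm_clflush intrinsic; the buffer and iteration count are invented; and on modern hardware with mitigations it will not normally flip any bits.

    #include <emmintrin.h>   /* _mm_clflush (SSE2) */
    #include <stdint.h>
    #include <stdlib.h>

    /* Repeatedly activate two DRAM rows by reading two addresses and
     * flushing them from the cache so that every access reaches DRAM.
     * After enough activations on susceptible parts, bits in neighboring
     * rows (memory the attacker never touched) can flip. */
    static void hammer(volatile uint8_t *a, volatile uint8_t *b, long iters)
    {
        for (long i = 0; i < iters; i++) {
            (void)*a;                      /* activate the row holding a */
            (void)*b;                      /* activate the row holding b */
            _mm_clflush((const void *)a);  /* evict so the next read hits DRAM */
            _mm_clflush((const void *)b);
        }
    }

    int main(void)
    {
        enum { SZ = 1 << 20 };
        uint8_t *buf = malloc(SZ);
        if (buf == NULL)
            return 1;
        /* Two addresses far apart in the buffer; a real attack would pick
         * physical addresses mapping to neighboring rows of one DRAM bank. */
        hammer(buf, buf + SZ / 2, 1000000L);
        free(buf);
        return 0;
    }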

In terms of systems that we build (for example, Boeing's Unmanned Little Bird), just a single platform has an incredibly complicated cyber story. There's the software that goes into it, the hardware, the manufacture of that hardware, and the software that's running on that hardware. That by itself is complicated, but it is not the only thing that is engaged. There may be a command base station that has a trusted channel into that machine. How much care is being taken about what's on the computers in the base stations, or even up in the satellites, as the platform talks to GPS? How do you know that something else isn't transmitting something that may cause a problem?

The software or the hardware that's in the machine itself is probably going to get updated at some point in time. Maybe the hardware or software gets updated, and now you need to consider the machine that's being used to update it. How sure are you that some attacker hasn't penetrated that machine in order to modify the code before it gets loaded onto your platform? Even when you're talking about a single platform, you're going to have a complicated cyber story. It's no surprise that when you think about some big infrastructure, like the power grid, the cyber security story becomes more complex. One of the challenges that we have with the grid is that there are many different interconnected participants that have to trust one another and that have an impact on one another.

There can be concerns other than the connections for the actual power transmission. Fluctuations in the overlay and control layers sharing power, even overseas, can cause problems. As automation increases, and as buildings start to participate in the power grid, new attack surfaces open.

There are many things that we have already done in cyber security that have made a difference. Four or five years ago, spam was miserable; eighty percent of all emails had to be deleted. Now, with Gmail or another modern email system, it may be that spam is a thing of the past. Also, platforms like the Apple iPhone have a very high cyber security profile. There is a lot going on in practice which shows us that we can make a difference. The challenge is winning at cyber, not just fighting against it. One defensive technique that we have in play at the moment is two-factor authentication, which is much stronger protection than passwords alone. A lot of systems have insider threat monitors that see whether somebody has been behaving one way, then shifts and begins to behave differently. For the iPhone, Apple made a choice to only give users rights to load things like music, not their own code, and this turned out to be helpful from a cyber security point of view. This approach, called jailing applications, prevents users from loading their own code. There are many methods used that give us certain levels of guarantee about how the code is built.



Using HTTP is a problem; we should be using encrypted HTTPS to prevent an attack at an intermediate point. We should treat the network as if it's untrustworthy. The range of ways technology is applied differs, which adds to our negative experience of it.

We did a small study comparing browsers. We wanted to see to what extent the code inside the browser is implemented in a cyber defensive way. The code base of Chrome has a vast amount of protections applied to it; Google has taken building a robust browser seriously. Attacks are still discovered, but they're challenging to find. The same analysis with Firefox results in a different profile of protections and implementation of the code. Microsoft has done a lot to become more secure. Sadly, they still package some things that are fundamentally insecure. Microsoft AutoUpdate for Office is a 32-bit application with little cyber protection, which allows for potential vulnerabilities. Establishing a "cyber crash" rating for any piece of code is worth thinking about.

We are engaging in three different technological thrusts in order to win at cyber. The first is to harden systems so that you can use your computer and not worry about cyber security. The second is to be able to get our job done even in the presence of insecurity. The third is being effective when the primary focus is using cyber as a business or warfighting domain.

In hardening systems, one of the things that we've been doing with helicopters and other vehicles is exploring how to do a cyber retrofit on those systems. We've been pioneering methods for embedded systems, like vehicles, where we jack the system up and introduce very strong separation between different parts of it.


In this particular case, we got the red team right on the platform. They were on the mission computer of this helicopter, and we challenged them to break out of the protections that we had built. We had done mathematical proofs to show that they couldn't, and we were correct.

We also want to bring cyber security into the automated domain. Last summer in Las Vegas, we had a cyber capture-the-flag competition between seven supercomputers that were working against one another. Cyber capture-the-flag means that interconnected computers try to break into each other and steal information. They try to "capture a flag" by maintaining network-facing services, like email, web servers, or time servers. They have to keep those up and running by patching their own vulnerabilities while exploiting others' vulnerabilities. It was a success. For example, the binaries for the web services or network services that we wrote and ran sometimes had flaws in them that we hadn't known were there when we wrote them. One of the machines discovered a flaw, figured out how to exploit it, and wrote the attack code. It launched an attack against another machine and succeeded; it stole the data from the other machine. A third machine saw that attack taking place and reverse engineered it. It discovered what the flaw was and figured out how to patch itself, installing a patch on its own code so that it was not vulnerable. That whole thing took 20 minutes. We talk about zero days, that is, how many days it has been since a flaw was discovered before you get around to patching it; now we have to start talking about zero minutes.

After we've deployed software, the attacker may get hold of it and study it for a long time. They might discover a flaw; this is the manual timeline. They might figure out how to attack one of our systems. We discover that the attack is underway, maybe a week or even months later, and we reverse engineer it and fix the flaw. We build a defense for the next time. When you can shorten that time frame, it changes the equation. Even before we deploy software, we could have machines analyze those implementations to determine where the vulnerabilities are in our code and patch them, so that the attacker doesn't even have the opportunity to launch the attack. Those are a couple of ways that we might think about hardening systems.

I talked about operating through insecurity; we might do that by looking at the binaries that we've deployed. If an attacker gets hold of the code in a system that is repeated again and again, he can study its binary implementation and figure out how to break through it. But if every time you deploy one of those systems you use a variant of that binary, by laying things out differently, then the attack may only succeed on one of them and fail on the others. A checking mechanism detects that something has gone wrong and is outside of the normal operating envelope, stops the execution, and returns to a safe situation.

I also talked about networks and the importance of being able to operate even in the presence of adversaries in the networks. One of the techniques that we're developing is to be able to communicate through the network even if the adversary is somehow in the middle trying to interfere with that communication, and the network itself doesn't even know how to respond.



I may instead be able to send a message to my colleague via a completely different path than the original path. What we want to do is build devices that, without going inside the network, will discover within milliseconds or even microseconds that something is askew and create an alternative route.

We're also working at the level of the national grid. It's not primarily our task, as there are other people who are responsible for the national grid. It would be great to be able to simply upgrade the systems on the electric grid, but the reality is that that doesn't work. We're still running equipment from the sixties and the seventies in various places. What we can do at the network level, to get early indicators of attack, is to isolate sub-networks even further so that the attacker has no ability to communicate back to home base. The current grid is vulnerable: if something goes down, an attacker may be able to maintain a presence and keep it down for months at a time. We need to figure out how to fix this vulnerability so that it only goes down temporarily.

We are working with CYBERCOM and Army Cyber to win at cyber. We are helping people gain a better understanding of what's going on in their networks so that they can deploy their cyber tools. There are big challenges for us in understanding networks, and we're using machine learning to be able to understand things at scale. We want to harden systems, to operate through insecurity, and to be able to win in the cyber domain. My last question is, what will it take to overcome the barriers to building a secure system?

QUESTION: To an extent, this is asymmetric warfare, as individuals without many resources can do a lot of damage. How do we combat that?

JL: Our view is that you'll never be able to fundamentally secure a system against peer adversaries who are able to throw around money. What we would like to do is make it so complex and expensive to be successful in a cyber attack that you have to be at the level of a peer adversary to be able to accomplish it. We can begin to win at cyber if we can take off the table the things that the script kiddies can do, and even up to the cyber criminals and organized crime. We can make cyber resilient enough so that those resources aren't enough to mount an attack. For the last 30 or 40 years, we've been laying down sedimentary layers of insecure software, and now we have the problem of dealing with that.

QUESTION: How can a corporation hope to appear resilient to a nation state? When you're talking about securing a general purpose computer, and hardening it down to the purpose you're looking at, when we're buying off-the-shelf components and control system vendors, is that a realistic task?

JL: The challenge is to be able to buy off-the-shelf components that are secure. We need ways to assess the cyber quality of various components. Figuring out how to build more secure systems is definitely not just the responsibility of the end business. It has to go all the way down the supply chain.

QUESTION: One of your last comments was about deterrence, which was ultimately an acknowledgment that we may not be able to prevent the attack, but we could deter the attacker by making it really expensive and annoying for him. What you just said about building materials is interesting; traditional Japanese buildings are made out of rubbish building materials. They built the walls out of paper, not a very good way to secure a building from attack, so they developed other systems. There might be police around that are going to make your life difficult. It may be that a way to look at this is partly through changing the physical structure, but recognizing that the weak parts, or the really expensive parts to fix, are places where deterrence policy can play a bigger role.

JL: Attribution is important in the cyber world. In a classified setting, we have techniques to figure out who's on the other end of the keyboard by following patterns of behavior. That kind of classified information is hard to give to the FBI, or for the FBI to use in public. Instead, we look at the public data stream and do a parallel construction to be able to understand the attack. We'll be able to find sufficient evidence to demonstrate it with unclassified information. Effective methods of attribution are going to be key.




QUESTION: Do you see the hardware supply and supply chain as being something we can solve? If the hardware that it's running on is already compromised, how do you deal with that?

JL: The level of protection you need at the hardware level depends on what you're using your system for. Apple is an example of doing a lot of things right in the way they're building systems; they're maintaining a lot of control over those systems. Once you're at this level, we ask whether it is a peer state that is trying to do something here, and that's when we start to think about it as part of the international, warfighting domain, as opposed to just day-to-day cyber vulnerabilities.

QUESTION: Can you comment on the effect of open source on cyber security?

JL: Open source has been effective. The challenge is that nobody is actually responsible for being fully diligent about its cyber security. Even something like WebKit, which comes out of Apple, is open source, and it has vulnerabilities. They don't necessarily get fixed. Somebody else who now wants web services in their system gets hold of WebKit because it's open source and just decides to use it, or they get a current version of Linux, and they decide to use that.



These components may come with all sorts of inherent vulnerabilities. I think it would be interesting at a national level to say, "What would it take to find the key elements of open source software that are out there, that get used again and again, and do a national-scale reworking of those to reduce the inherent cyber vulnerabilities in a lot of those systems?" That could have a huge impact. A lot of the challenges that are coming up in some of those systems are not that they're using open source software that has inherent vulnerabilities; it's that they have "password" as the password for administrative actions.

QUESTION: Are we anywhere near or willing to make the necessary changes? If so, where should we start?

JL: There's a vast amount of code and other systems out there. I don't know that it's just the will. I would like to see what the dollar figure associated with that is. I do believe that we know how to build secure systems now, and that if we had the will and the money, we could do it in a couple of years, but the price tag would be prohibitive to try to do it very rapidly.

QUESTION: As a nation, who is responsible for cyber security? As a country, do we have a general overall strategy for cyber security, or is it fragmented, where everyone is on their own, managing their own domain of protecting their businesses?

JL: We don't have any universal, unified approach to cyber security, but we don't really even have physical security, either.

In the physical world, we have security at different levels. We have national armed forces looking at things, particularly overseas. We have the FBI looking at interstate things. We have local law enforcement looking at things at the state, county, and city level. People are working at different levels together. We don't yet have the experience in cyber security to figure out how to bring things together at the different levels.


I don’t think that the right solution is to have one command that’s responsible for everything, but we do need to think about which things are the responsibilities of the individuals, of the enterprises, of the infrastructure (i.e., the IM internet service providers), and groups like CYBERCOM at a national level. CYBERSECURITY OF CRITICAL INFRASTRUCTURE SUMMIT 2017 PROCEEDINGS



DAY 2 JANUARY 12, 2017 WELCOME

DANIEL RAGSDALE

Dr. Dimitris Lagoudas is a distinguished professor and an aeronautical engineer. His research is focused on the design, characterization, and constitutive modeling of multifunctional material systems. He is the Senior Associate Dean for Research, the Associate Vice Chancellor for Engineering and Research, and the Deputy Director of the Texas A&M Engineering Experiment Station.

DR. DIMITRIS LAGOUDAS

As an NSA National Center of Academic Excellence in Information Assurance, Texas A&M has conducted notable research and provided education for more than a decade on cyber security technology, innovation, and policy. About two years ago, we established the cyber security center, and Dr. Ragsdale is the inaugural director. When we formed the cyber security center, we coupled our faculty expertise and our excellent facilities, and we uniquely positioned Texas A&M to help advance national infrastructure security and also develop the future leaders in cyber security. From our smart grid and cyber-physical systems test beds, to our secure communications and computer systems laboratory, to our FirstNet and next-generation 911 test beds, we're actively building a program where groundbreaking cyber security advances will occur. First, we will share information and initiate key conversations. Next, we hope to forge partnerships, as this is a collaborative topic. Finally, we will develop high-impact strategies that will help transform the country's cybersecurity capabilities. Ultimately, this will strengthen national and economic security, which will benefit society at large.

RAY ROTHROCK

Mr. Rothrock, CEO of RedSeal and Partner Emeritus at Venrock, joined RedSeal, a cybersecurity company, as CEO in February 2014. Focusing on Internet infrastructure and security, he has an extraordinary track record in cybersecurity investments. He often consults on trends, strategies, and technologies in cybersecurity markets and recently attended the White House Cybersecurity Summit held at Stanford University. Prior to RedSeal, Mr. Rothrock was a managing general partner at Venrock for 25 years. At Venrock, he invested in 53 companies, more than a dozen of them in cybersecurity (including Check Point Software, Vontu, PGP, P-Cube, Imperva, Cloudflare, CTERA, and Shape Security), and he led the energy investment program and the Internet investment program in the firm. He remains on the board of Check Point Software and several other Venrock investments. Ray was the 2012-2013 chairman of the National Venture Capital Association.



DAY 2 JANUARY 12, 2017 KEYNOTE

THE CASE FOR RESILIENCE IN CRITICAL INFRASTRUCTURE

I am going to talk about one of the core elements, resilience. I was a venture capitalist for 25 years with Venrock, the Rockefeller family venture firm. I did investments in cyber security and got savvy on cyber right when the internet started to happen. Then I joined a company called RedSeal.

A good venture capitalist has the ability to see patterns. We're spending $86 billion worldwide, two-thirds of which is on services and a third on product. Security budgets are going up fast compared to IT budgets: IT budgets are increasing two to three percent a year, and security budgets are increasing four or five times faster. There are 1,400 technology start-ups; $3.2 billion was invested last year.

In San Francisco, every year there's the RSA conference. I went to one in 1993 where there were about 35 companies. Now it's a massive event in Moscone Center. It's clearly a growing and big industry. As I said, three billion dollars from venture capital went into it, out of a $56 billion investment from the VC community. One of the things I invested in across all these companies was detection and prevention. But that's not sufficient, and that's why I jumped to RedSeal. There are 295 reported critical infrastructure incidents. We have to get ahead of that or we will be in trouble.

PricewaterhouseCoopers (PwC) puts out a report which estimates the losses. Just a few years ago, the losses were at half a trillion dollars. This is reported financial loss from corporate enterprise. Two years ago, it was $500 billion. This year it's estimated to be one trillion dollars. We have to get out of that curve and be smarter than they are.

Gartner, a very big enterprise research company, has a fellow named Neil MacDonald. He made a statement in 2013: "Enterprise systems will be in a state of continuous compromise. We're always under attack. There's always something going on. But we'll be unable to prevent advanced targeted attacks." The rate of loss is growing very fast. To illustrate the losses: in 2014, it was half a trillion dollars, and PwC projects two trillion. That's 100 percent more than what it is now, while the spend only goes from $75 billion to $125 billion. That ratio, our loss ratio, is going up from about 6 to 16 (roughly $500 billion of losses against $75 billion of spend, versus $2 trillion against $125 billion).

We all remember PCs back when viruses were passed around by a floppy disk. This threat emerged, and we created the anti-virus industry, which is about a $6 billion industry now. Then, when we started hooking things together and HTML came along, a new capability arose. Symantec and McAfee are the big anti-virus companies and leaders in firewalls and intrusion detection. These companies are leaders, but why are we getting hit? Why are we being attacked? Ninety-nine percent of the damage comes from unpatched systems. President Obama ordered a review of all critical infrastructure, which found that 57 percent of government systems were not able to be patched. If they can't be patched, damage starts, and that equation does not work.

The other thing is the APT: social engineering, phishing. Ninety-five percent of successful breaches start with phishing. I was at an insurance company in Chicago giving a talk about cyber. This particular insurance company required employees to take an online test every quarter. The company also gives pop quizzes and sends fake phishing emails to its employees. The same people fail every quarter, and so the company was modifying its policies so that it would be able to fire people who fail the phishing test. That is the seriousness about phishing. Even Mr. Podesta fell for that trap in the DNC and got his email hacked.

At RedSeal, we surveyed 200 CEOs: 83 percent said they're very confident in their strategies, yet only 52 percent felt that an incident was inevitable, and 72 percent felt that metrics on cyber were not very effective.



They lack context and information. Our cyber industry cannot communicate to the executives that have to make the decisions, spend the money, and allocate the resources, because we don't have a language.

In the early days, losses from viruses totaled $3 million in a $6 billion industry. Then along came crime, and people got smart and responded with firewalls and other kinds of prevention capabilities. The APT and zero-day threat losses equal $59 million. These attacks are not remedied with these two technologies. Palo Alto came along with next-generation firewalls, which sort of work, and FireEye with sandboxing. What do we do? How do we think about it? How do we solve this problem? Technology is not keeping up with the threats.

It's called digital resilience. The McKinsey and Company cyber group out of New York City published a book in 2015 called Beyond Cyber Security: Digital Resilience. It's a quick read, directed at executives, on how to frame and think about policy. Digital resilience is a comprehensive strategy; it is not a product or a technology. Hackers know our networks better than we do. They get in there, root around, find what they're looking for, and leave. Point products are not working. They don't provide a systemic view. They weren't designed to be resilient.

You can't possibly do anything if you don't know the numbers and understand the device relationships and all the ports and protocols. Prioritization is probably the biggest thing that management misses, and this is highlighted in the McKinsey book. Without prioritization, it is impossible to know what to fix first in a network with a million endpoints, a hundred thousand routers and switches, and numerous people. Doing risk prioritization is a hard thing to accomplish because you don't have infinite time, money, and engineers. If you have an understanding of your network, you can be compliant all the time. You have to understand your network by measuring it and establishing policies. As Peter Drucker says, "You cannot manage what you cannot measure."

This is just a simple cartoon with switches, routers, a bunch of endpoints, enclaves, the company, and the labs. You need to understand what all of this looks like.

From the outside world, you have trusted capability talking to untrusted capability. You know exactly what's allowed. You have high-priority servers with intellectual property, money, or some other capability on them. Then you find out you have discontinued protocols. One of our very large customers, a large security company, found lots of connections to the outside world from discontinued businesses in their supply chain. The money they saved by just turning off those ports paid for the product. The point is that there are things that are no longer used, and people left them on because nobody knew about them; they weren't documented.

Then you have all these routes where the bad guys drop in through social engineering and phishing. They'll drop something on a host and it'll start walking around, hunting for those red, high-priority servers. They'll eventually find them, and then you have to respond.

We need data and understanding. We need measurements, but at the end of the day we have to respond fast. There's a lot of great response technology out there. ForeScout is a leader in the government, as well as commercially. They have the capability that, as an alert comes out, they get an IP address and can detail that device and tell the user what the issues are, but they can't tell the user where it is. I find that amazing. With our software, you input the IP and you get that information.
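To make the idea of a systemic view concrete, here is a small, self-contained sketch; it is not RedSeal's algorithm, and the zones and rules are invented. It treats permitted connections between network zones as a graph and asks whether anything internet-facing can ultimately reach a high-value server:

    #include <stdio.h>

    /* Toy network model: zones 0..N-1, edge[i][j] = 1 means traffic is
     * permitted from zone i to zone j (firewall/router rules, simplified). */
    #define N 5
    enum { INTERNET, DMZ, CORP_LAN, LAB, CROWN_JEWELS };

    static const char *names[N] = {
        "internet", "dmz", "corp-lan", "lab", "crown-jewels"
    };

    /* Hypothetical rule set: the forgotten lab-to-server rule is the issue. */
    static const int edge[N][N] = {
        /* internet */ {0, 1, 0, 0, 0},
        /* dmz      */ {0, 0, 1, 0, 0},
        /* corp-lan */ {0, 0, 0, 1, 0},
        /* lab      */ {0, 0, 0, 0, 1},   /* stale rule nobody documented */
        /* jewels   */ {0, 0, 0, 0, 0},
    };

    /* Breadth-first search: can traffic originating at src reach dst? */
    static int reachable(int src, int dst)
    {
        int queue[N], head = 0, tail = 0, seen[N] = {0};
        queue[tail++] = src;
        seen[src] = 1;
        while (head < tail) {
            int cur = queue[head++];
            if (cur == dst)
                return 1;
            for (int next = 0; next < N; next++)
                if (edge[cur][next] && !seen[next]) {
                    seen[next] = 1;
                    queue[tail++] = next;
                }
        }
        return 0;
    }

    int main(void)
    {
        printf("%s -> %s reachable: %s\n", names[INTERNET], names[CROWN_JEWELS],
               reachable(INTERNET, CROWN_JEWELS) ? "yes" : "no");
        return 0;
    }

Here the forgotten lab-to-server rule makes the crown-jewels zone reachable from the internet in four hops, which is exactly the kind of path a systemic analysis is meant to surface before an attacker finds it. Real tools work from parsed device configurations and track ports and protocols, but the underlying question is the same.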




The DBIR from Verizon, and others, talk about the 250 days between when something starts to happen and when you detect it. The detection and response time is on the order of years. It's interesting how a lot of these companies (Yahoo, Target, Sony, JPMorgan Chase) are just now revealing that they had a problem, and that the problem occurred in 2013. That was when the bad guys figured out zero days and the APT.

If you don't respond, it's devastating. If you can respond, you can prevent anything from happening. These are the series of things talked about in books about resiliency and incident response: you have to detect, find, contain, analyze, remediate, repair, restore, and get back in business. At the end of the day it's about confidence. Your customers have to have confidence and trust in you. These are hard things to do with a limited budget and limited people.

These networks are old in critical infrastructure. There's a lot of technology that's been in place for 50 to 60 years. They're being hooked up for economic and monitoring reasons. There's vulnerability with these systems. They're often not patchable. An architecture with a network around them needs to be built so that you can respond, protect, and deflect.

In 2016, the G7 came out with principles for resilient energy systems. Secretary Ash Carter talked about resiliency last summer in a speech he gave about the department's electronic systems. The CEO of NASDAQ talked about it. In July of 2015, the New York Stock Exchange, the Wall Street Journal, and United Airlines all went out at the same time. They were blacked out for two hours. People thought we were under a massive cyber attack. It was nothing more than a router upgrade that went afoul. That can happen, and it can bring down systems. You have to be able to respond.

Fifty-three percent say that a breach is inevitable. Yet 75 percent believe they have that resiliency, but really 24 percent do not. In the critical infrastructure world, the numbers are even worse than in the enterprise world. It's a lot worse because of the age of the systems and because the engineers that run those systems weren't trained in modern cyber security infrastructure ideas. DHS defined 16 infrastructures, and 15 of the 16 depend on electricity.

Peter Drucker said, "You cannot manage what you cannot measure." That is my new thesis for investment and infrastructure in cyber security. I see these 1,400 great projects; 1,350 of those are all about detection and prevention, and there are a few about measurement. We don't often measure output in cyber, but we should. We mostly measure input. When chief information security officers (CISOs) report out, they report the number of attacks or virus findings. But these reports don't measure effectiveness and deterrence, because you can't measure what you don't understand.

Many different people have built these systems over many decades, and documentation is poor. At RedSeal, we've taken an incredible analysis and boiled it down to one number (375, for example); it's like FICO, a measurement concept, on a scale from 300 to 850. As you make changes to your network and perform compliance analysis, that number gets better. A board wants to know that cyber security is improving. That's what we and other companies provide.

Resilience is a new imperative. At RSA, a lot of companies will be talking about this. There were a lot of good things in the President's Commission report (released 12/1/2016). One was that resilience must be a core component of any cyber strategy. You need the protection and the detection, but resilience is not a product; it's a way to operate. It is management's imperative to think this way. Another thing that came from the President's Commission was that there should be a cabinet-level position for this. We will defeat this threat, as we have before. We have a lot of work to do, but we can do it. I really don't want to lose 2 trillion dollars in 2018. It would be headline news in the New York Times if we can stop it.
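Purely as an illustration of boiling an analysis down to one bounded, FICO-style number as described above (this is not RedSeal's scoring method; the categories, weights, and decay constant are invented), one way to collapse a pile of per-issue findings into a 300-850 score is:

    #include <math.h>
    #include <stdio.h>

    /* Invented categories of findings with invented weights; a real tool
     * would derive these from reachability, exposure, and asset value. */
    struct findings {
        int critical;   /* e.g., internet-reachable, unpatched, high-value */
        int high;
        int medium;
        int low;
    };

    /* Collapse findings into a single score in [300, 850]. A weighted
     * penalty is squashed with an exponential so that the score degrades
     * quickly at first and then flattens out. */
    static int resilience_score(const struct findings *f)
    {
        const double penalty = 25.0 * f->critical + 10.0 * f->high
                             +  3.0 * f->medium  +  1.0 * f->low;
        const double scale = 1000.0;                /* invented decay constant */
        const double frac  = exp(-penalty / scale); /* 1.0 means a clean network */
        return (int)(300.0 + 550.0 * frac);
    }

    int main(void)
    {
        struct findings before = { .critical = 8, .high = 40, .medium = 200, .low = 500 };
        struct findings after  = { .critical = 1, .high = 15, .medium = 120, .low = 300 };

        /* Week-over-week comparison is the point: the absolute number matters
         * less than whether changes to the network moved it up or down. */
        printf("score before remediation: %d\n", resilience_score(&before));
        printf("score after remediation:  %d\n", resilience_score(&after));
        return 0;
    }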



QUESTION: A lot of companies don't seem to have a good set of policies. Could you speak a little bit more about what we're missing?

RR: It is resilience. A policy should dictate that people in a first-responder or response-capability position are able to have situational awareness. They know what the network looks like and where the issues are. If an incident occurs, they can respond. If a hospital, for example, is going to stand up a new diagnostic facility, someone needs to think about who needs to have access to it and what information flows back and forth. One of our early customers was Stanford Hospital, a sophisticated teaching hospital. It had a well-maintained, state-of-the-art cyber security infrastructure until it connected to the university. That should not happen. They didn't have situational awareness. The policy needs to be tested and modified, and that's driven by the business. We're going to the cloud because it's cheap, but what is the policy around cloud? We need to be forward thinking rather than reactive. Expect that anything can happen, because that's being resilient and prepared.

QUESTION: This is a conference on energy and manufacturing, but everything you talked about was essentially IT networks. Energy and manufacturing are different because of the control systems that are involved in operations. Our end devices are not the same end devices as IT, and we don't have situational awareness. We don't have the ability to patch, obviously, sometimes for months, if not years. On top of that, our policies are different policies. How do we marry the expertise that comes from the IT world to the domain expertise of the systems that are actually operating in these facilities? You cannot either know or secure anything if you don't understand that.

RR: There are a lot of devices that you will never patch to be secure. But, you can use your network as a defense capability, like the wall. You can’t protect all of the devices because these devices are old. We have lots of customers in the utility industry who’ve used technology like ours to segment their network properly and to control those interactions between enclave A and enclave B. That’s where you have to think about in order to really get after the energy and manufacturing sector. In network infrastructure security, you have to be able to talk to the devices. Most of them are IT, CISCO, or Juniper kind of routers, but there are all kinds of other people that have routers in the infrastructural world of energy and manufacturing. We understand how those things work, and we give that information into the model. QUESTION: My issues are the sensors, the controllers, the actuators, the things that actually do things. We’ve got switches, routers, and everything else that can blow things up and which aren’t being addressed. RR: You’re the original Internet of Things (IOT) and IOT is a big deal, a big threat. There are ways to build your network so that you can control that access and the threats. You need tools and to think about it differently. It’s not just securing a device. There isn’t a secure end point in here, I guarantee it. QUESTION: I’m one of nine commissioners with the Texas Commission on Law Enforcement Leadership in the board room is critically needed in government. I’m from the government, and I’m here to help my constituents. If I can’t motivate the leaders to ad-

QUESTION: I'm one of nine commissioners with the Texas Commission on Law Enforcement. Leadership in the board room is critically needed in government. I'm from the government, and I'm here to help my constituents. If I can't motivate the leaders to address these issues, then we have failed in our mission.

RR: The board room is interesting: we don't need someone who's an expert on electricity, water, or gas, but cyber is different. The attack surface is unique. It's very distributed. It goes to the core function of the board, which is confidence and trust among the customers and the relationships that you have with your stakeholders, not just your shareholders, but all of the stakeholders in the business. Cyber can destroy that, and you have to be aware of that. The McKinsey people tell me that there are regulations from FINRA that'll drive this into the board room. I had the privilege of meeting the CEO of JPMorgan, Jamie Dimon. One day a month, his executive team reviews the cyber strategy of the bank. There are people who are way out there in thinking about this, but I'm with you. I bet 99 percent of people are not anywhere close to that.

QUESTION: I'm with the science and tech part of Homeland Security. The quality work that you're focused on is something that's needed. It ultimately traces back to whether the board and the company recognize that things are changing technically. It's a question of whether, in a corporate sense, they are good at responding to threats and challenges. Do you see it as a quality initiative at that level, and if so, what characterizes a company as high quality?

RR: Make people aware and measure things. Train, and train often. Technology is moving fast. There's nobody who knows it all, so you have to train people to think, not about the specific technical things, but critically about what questions to ask.



I think executives should have that. You have to be skeptical. The good ones have these policies. They have rigor, they train, and they have measurements. I think that's what separates the winners from the risky ones. If you have 1,000 defects in an engine, where do you start? You need people to prioritize that for you.

QUESTION: First, what's the source of the $1 trillion loss figure, and second, from a C-level, boardroom position, what are the metrics that they would be most interested in seeing?

RR: The source is PwC's global cyber report, which comes out every year. That figure is from the 2015 report. The data showing $500 billion in losses was documented; the $1 trillion was estimated. The $2 trillion is their estimate going forward, if things don't change. The board measures results, revenues, expenses, and head count. They measure physical things, and they couldn't care less about a zero-day. They don't even know what a firewall is. That stuff is irrelevant. We have 10,000 Cisco firewalls and 2,000 Check Point firewalls; that's the input side of it, and that's what's missing. When I got to RedSeal three years ago, we had this calculation about priorities and vulnerabilities across the entire network. We don't just assess the devices; we assess the whole architecture. We boiled it down to this FICO-like score. We have customers with an army of cyber people, the Postmaster General of the U.S. Postal Service, for example, who look at their resilience score every day. The board is engaged in the trade-off when the CISO wants to spend more on cyber security. You have to be able to give the executives who make those decisions some measure, because that's the way boards operate. They operate on numbers.

QUESTION: Can you comment on what your company does to make things resilient? Knowing that the score is 350 or 650 doesn't by itself improve anything. How do you know that a 650 really means something until your next software patch brings it down?

RR: There's no such thing as resilience technology. It's a way of thinking, operating, and prioritizing what you're going to do next. The score is relative to itself, so you can compare it to the last time you checked. Most people run our software weekly. If I do work on the network and re-run RedSeal against it, the score either went up or went down. Because we produce this score, we actually render a judgment, and if we render a judgment, then we must be able to tell which issue is the worst. Most networks have a phone book of vulnerabilities and issues that they have to deal with. Does anybody know the number one policy that is violated? Everybody has this policy, but it is completely violated on the network side of things. It's passwords. With 10,000 routers, are you going to have 10,000 unique passwords? No. You're going to keep the same password. I bet I could log in here if I could get to the router. There are only eight passwords that Cisco uses. It is amazing how these policies are violated. We have to have tools to keep up with those passwords.

Every time you run the software, you get a new report, and you can see what's changing. There are three things we measure: the configuration of devices, vulnerability host-scan information, and what we call the incomplete network. We always find parts of your network that you don't know about. People put new equipment in and take equipment out; it changes multiple times a day. We render a judgment about your network and trend it over time. The CEO, the COO, the CFO, or the general counsel might be monitoring this, and they call down and say, "Hey, the network just got bad." Then I can go into this pane: these are the top ten risks in that network of maybe ten million. You can then push a button, dispatch a work ticket, get it fixed, come back, re-run the software, and see that it's fixed.

Resilience isn't a technology. It's a measurement, a state of mind, a way to operate and think about your network.
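As a rough, hedged illustration of the kind of single roll-up number described here, a FICO-like score on a 300-to-850 scale built from device-configuration findings, vulnerability scan data, and the "incomplete network," and then trended between runs, consider the toy sketch below. The weights, scaling, and input counts are invented for illustration and are not RedSeal's actual scoring model.

```python
# Toy roll-up of network findings into a single FICO-style score (300-850).
# The three inputs mirror the categories mentioned in the talk; the weights
# and penalty scaling are purely illustrative assumptions.

def resilience_score(config_findings, vuln_findings, unknown_segments):
    """Map raw finding counts to a 300-850 score; fewer findings -> higher score."""
    weighted = 1.0 * config_findings + 2.0 * vuln_findings + 5.0 * unknown_segments
    penalty = min(weighted / 1000.0, 1.0)       # saturate at the assumed worst case
    return round(850 - penalty * (850 - 300))   # 850 = clean, 300 = floor

last_week = resilience_score(config_findings=420, vuln_findings=180, unknown_segments=12)
this_week = resilience_score(config_findings=350, vuln_findings=150, unknown_segments=9)
print(last_week, this_week, "improved" if this_week > last_week else "degraded")
```

The absolute number means little on its own; as the answer above notes, the value is in comparing the score between runs after changes to the network.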



DANIEL ENNIS

Mr. Ennis is a Senior Fellow with the Center for International and Strategic Studies at the University of Maryland and the Executive Director of the University of Maryland Cyber Initiative. He is also CEO of DRE Consulting. In these roles, he leverages over 36 years of U.S. Government service, principally with the National Security Agency/Central Security Service (NSA/CSS). Prior to November 2015, Mr. Ennis was the Director of the NSA Threat Operations Center (NTOC). NTOC is responsible for conducting 24/7 cryptologic activities to discover, characterize, and proactively counter cyber threats to U.S. national security systems and other networks of interest, while protecting the legal rights of U.S. persons. His distinguished career with the U.S. Government began in 1979 with the DEA. Ennis joined the NSA in 1982 and has held numerous senior-level positions within the Signals Intelligence Directorate. Before becoming the NTOC Director, he served as the Assistant Deputy Director of Analysis and Production, providing overall management of the U.S. Signals Intelligence analysis and production mission. Other senior positions held include Deputy Chief, NSA Tailored Access Operations; Chief, Transnational Targets; and Chief, Office of Russia.

THE CURRENT STRATEGIC PERSPECTIVE OF CYBERSECURITY

I’ll try to explain who’s in charge of cyber security in the United States and give you a sense of where we stand from the government side in the cyber security domain. It’s not a government problem alone. Most people look to the government to be the defense against counter-terrorism, but if you’re looking for your government to solve and be the total defense for the cyber problem, then you’re misguided. We want to solve that problem, but that can’t be done in isolation. It’s part of a larger issue of social media and our reliance to get our news from the internet. We only have to look at the fake news around the elections, the propaganda, and the Russians, and the adversarial plans that existed to affect and impact the elections. This is a holistic problem. To look at it just as a cyber issue is problematic. The changing threat vectors must be looked at as well. The fact that nobody predicted that somebody might take a cyber approach, at least in part, to affect or impact a U.S. election is surprising. We cannot depend on traditional threat vectors. We must look broadly at the adversaries, from nation-states to hacktivists, to insiders. We have to look at the fact that they may take an approach and attack our systems in a way that we didn’t consider. This nation-state game is

There may be optimism in terms of our ability to defend ourselves against certain adversaries, but not against the nation-state adversary. Then we get to the threat actors. I defended against the nation-state actors at NSA. I wasn't worried about the criminal hacktivists unless they had ties to nation-states, or concerned about the insider unless they were insiders against national security systems. Russia, for example, has a more aggressive posture under Putin and is more willing to do things that would be considered out of the box, even willing to get caught. That speaks to his motivation, or the motivation of their intelligence services, to prosecute and to engage in this kind of activity. That's a much different thought process than we used to face. In cyber defense at the national level, we used to think it was about subterfuge and being covert and quiet. Folks are not necessarily concerned these days about being quiet. They're willing to get caught. We look at the Chinese, who are aggressively pursuing a holistic plan against the United States to steal our intellectual property. It may be through a cyber attack, or it may be through penetration by other means (e.g., by buying companies). You have to look at these aggressive actors in the context not just of cyber, but of this larger picture.




Not only do we have to take a holistic view, but we also have to change the U.S. government perspective on how they address cyber threats. For cyber, it has to be a partnership among the private sector, government, and academia. As we begin to look at who the cyber warriors are, it's not just the government. They have to be in the private sector as well. How do we engage in the type of information exchange and sharing that facilitates strength in this sector? If the government doesn't develop processes and procedures that allow and facilitate the kind of interchange and exchange that we need to effect against these targets, then private sector individuals are going to do it on their own.

We already know that they've conducted acts to understand and characterize the threats in a way that actually is illegal, because if you penetrate somebody's system and you're not the government, you're committing an illegal act. How do we respond to this overwhelming problem? The fact is that the government is not well positioned at this point in time against the Russians, the Chinese, the hacktivists, and the insider to do the kind of holistic planning that we see the adversaries doing. The insider is a big threat that cannot be overlooked, whether it's somebody taking a gun into the workplace, deciding to break your machines, or stealing your intellectual property. The NSA had its own insider, Snowden. At the U.S. government level, we're not structured properly to conduct the kind of counter-activities and deterrent-setting activities that we need in order to be effective. As we have this triad of private sector, government, and academia that needs to fight the cyber war, the government's role is huge, especially against the nation-states, in setting the kind of planning, structure, and priorities that are necessary to create effectiveness.


I was Chief of the Office of Russia at NSA during 9/11. After 9/11, I had to move 1,000 people from the Russia effort to the counter-terrorism effort, yet today, Russia is still a problem. It is a nation-state that is carrying out acts as part of a holistic plan to attack the vulnerabilities of the United States. An entity like NSA has to change its cyber perspective because the traditional threat from Russia (strategic bombers and other things, which are still there) has morphed to include a cyber threat. It's not going to be effective if NSA does it alone. Law enforcement, DHS, and ultimately the private sector all have to come together in the effort. That's a change in the mindset of the U.S. government.



It used to be that when the Russians had "crossed the line," there would be a response from the United States. Now, they don't quite know what that level of line crossing is, nor do the North Koreans, the Iranians, or the Chinese. That's a dangerous place to be, and so we need to change the government structure as to how it prosecutes the overarching cyber issue. There needs to be a new level of intimacy in the exchange between the public and private sectors. DHS has a tough mission. They're gathering large-scale data and storing it, and the kind of analytics needed to look at it is being developed. But the private sector questions whether the information put into or pulled from that database has the context and intimacy needed to understand the problem and deal with it. I want a government that is postured to make sure that those dealing with industrial control systems and internal mechanisms, by sector, can communicate. We're talking about finding a way to engage in the kind of intimate discussion, in real time, with the agility that allows them to act. That's what we need. That's part of this larger structural problem that the government has. We have to talk about the right to privacy. We are fighting this war as if it were a World War II kind of construct. We had oceans to protect us then. Because of the ubiquitous nature of communications, a communication could have traversed the United States 14 times. But NSA has to stop at the border.

This presents a dilemma that we need to address. Back when the government workforce was furloughed, we saw a 400 percent increase in exfiltration from the United States to Russia. I oversaw NTOC, and I went to the FBI and informed them of two nodes in the United States they should focus on. It took the FBI three weeks in one instance and months in another to establish probable cause and, ultimately, get a warrant to go against those nodes. By that time, they were useless. The reason was that the judge didn't understand the cyber dynamic. A U.S. government that processes and engages in the type of exchange that alerts, educates, and changes the model of how we defend in this space is critical. We're not there. There are countries that do have a model. The British, for example, have a greater trust of government than we do. They've put a lot of that responsibility and authority into their Government Communications Headquarters (GCHQ). GCHQ has both offensive and defensive responsibility and can project its foreign intelligence back into the system with some degree of ability to defend. It's not a perfect solution, but people trust that government. Nobody's going to allow NSA to be that entity. It's just not in our heritage here in the United States. We've got to figure out the privacy piece because it is huge. I recently read a study reporting that people trust the government intelligence community more than they trust Google and other entities to protect their information.

Nevertheless, we're not going to have the kind of authority embedded in NSA like the British do, but we have to figure out what the model is for the future if we're going to be effective. Going back to the private sector, there are two entities I want to focus on regarding the strategic capability against cyber. There are the people who need to protect themselves, and we need to have a structure that allows for those kinds of intimate discussions. We also need that kind of engagement with private sector security companies. I know of a company that learns about threat information by sitting on a node and watching the traffic. We don't have a ready mechanism for that company to pass that information on to the FBI or DHS with the agility or the intimacy that's needed in order to react in cyber time to counter that threat or develop systemic measures to address it. This idea that we create a government that is responsive and focused, but sharing and broadening its approach, is a total change. We've looked to the government to protect us for years. We still must do that, but we must remodel what we're doing to focus on these strategic cyber threats. We are structured as a government to handle a cold war and a traditional, conventional force threat. We need to change that so we're also structured to handle a cyber threat. If we don't, private sector entities are going to take it upon themselves to do so, in the absence of real government action.




We have to change the U.S. government perspective on how they address cyber threats.

The ability to innovate across all sectors is what's going to keep the U.S. and our economy strong. If we don't protect the information we learn through innovation, the Chinese or other entities are going to be right behind us stealing it. This idea of the industrial internet of things, the ability of innovation to get us to where self-learning systems can actually make our robotics better or tell us where there's a problem on the assembly line before it happens, is the kind of innovation that'll keep our economy strong, but only if we can protect it. I'm concerned that we're not setting priorities. We need to figure out what to focus on, both tactically and strategically. If I'm a company and I'm trying to protect everything, then I'm protecting nothing. If I'm a government and I'm trying to do everything in cyber, then I'm doing nothing. I need to focus the level of effort and set priorities. The first priority should be to have a holistic plan that shows the adversaries where, if they take their next step, they could have a problem with the United States. A plan needs to be set up by sector that people can rally around, allowing for intimacy and information exchange. Ultimately, a new structure is called for, and it should be a cabinet-level position. The technical expertise in cyber in the government resides in large part at NSA. Other entities and sectors in the government are growing, but NSA has a great capability.

It is, at times, handcuffed from doing some things. We need to change our priorities at NSA and focus on cyber, but that has to come from a cabinet-level Secretary. With the right level of structure, the U.S. government can change and be a more effective player in cyber. I have a great deal of confidence in the private sector investing in cyber security and in our ability to innovate our way into solution sets, both tactically and, when married up with a government plan, against the nation-states.

We are structured as a government to handle a cold war and a traditional, conventional force threat. We need to change that so we're also structured to handle a cyber threat.

QUESTION: This may not be a popular view, especially considering that this is a large group of academics and government workers with little representation from the private sector, but what I heard was "Be afraid. Government is the answer. The Fourth Amendment is an anachronism from the past." That is a troubling view to me, coming from the private sector.

DE: We have to have a national discourse on what privacy is and what we allow government and private sector entities to do in that space. Part of the solution is that government has to restructure to be more effective, and then challenge the private sector, by segment and by critical infrastructure, to help them mature their processes. We're not postured in a way that could effect great change. How do we change? How do we create synergy that allows for the kind of intimacy and the kind of interchange that I talked about? We need to have that dialog about what the government should do and what we should allow the private sector to do. I believe, ultimately, we're going to allow the private sector to do things that have traditionally been government roles. I'm going to applaud that, with the proper controls, because that brings in cyber warriors to the problem set that we don't currently have.



QUESTION: Do we, in national security (especially where cyber security is involved), know how to counter-punch?

DE: We do have U.S. Cyber Command. We also have the CIA, which has authorities for covert action. But it's not agile enough to respond. We have capability, and we're developing military capability to do just that, to counter-punch. You don't do that in isolation. You must do it, and should do it, as part of an overarching plan. I want a holistic plan and the kind of maturity against a nation-state adversary needed to counter-punch, maybe in cyber, maybe through financial sanctions. Right now, decision makers don't have enough capability or a ready mechanism to act with agility. Planning and maturity are needed, along with a capability that's both offensive and defensive. I want our understanding of the actor inside somebody's network to quickly transpose into something that allows me to take action on offense to deter it from happening again, and vice versa. A holistic plan needs to be developed that allows a counter-punch immediately, but only if that's the right thing to do as I look across the full spectrum of that plan. We used to have a large capability to do this. It has atrophied to a degree.

If I'm a company and I'm trying to protect everything, then I'm protecting nothing. If I'm a government, and I'm trying to do everything in cyber, then I'm doing nothing. I need to focus the level of effort and set priorities.

QUESTION: Did Edward Snowden damage the efficacy of the NSA, and is there any room for whistleblowing in the intelligence community?

DE: He had an opportunity to be a whistleblower. He did not take that opportunity. When I led all analysis at NSA, 25 percent of NSA employees' time was spent on making sure they didn't violate the privacy of U.S. persons. I think we had robust procedures in place. The thing that he focused on was a counter-terrorism program that was implemented after 9/11. The program was founded around two individuals whom NSA was looking at overseas, but when they came into the United States through San Diego, NSA had to drop coverage, because we don't do collection inside the United States. Ninety percent of what he revealed had nothing to do with that program. Yet he revealed capability that I believe is hurting NSA and the intelligence community today. The concern at NSA is to not violate any law, process, or procedure. If I come across a U.S. person overseas, I drop that collection immediately. I believe that the rules and regulations were working. The insider is a critical threat, and I believe that he proved to be just one of those persons.

QUESTION: What role does the U.S. have to protect [global company] assets outside of the U.S.?

DE: If the U.S. had a Secretary of Cyber, we could set priorities and develop mechanisms that allow us to prioritize our actions and set a focus. It's a global problem. If Deutsche Bank has a problem, the U.S. has a problem and Wall Street has a problem. There's a global perspective here that has to be accounted for. We have tremendous allies overseas that we can rely on and engage in partnerships with in cyber that can bring maturity. We have concerns about systems overseas where there's a U.S. person who needs to be protected. That becomes very problematic for a foreign intelligence agency, because if there's a U.S. person outside the United States, the collection against him or her has to be dropped. We can work with our allies to develop secure measures and follow priorities. We have great allies, the Norwegians, the Scandinavians, the Israelis, the NATO countries, with whom we can work to help facilitate needed protection. It's a global economy, and we have to be conscious of that. NSA can't just be worried about the United States. It now needs to worry about the kind of critical infrastructure that might be overseas and could impact us.





PANELISTS

RICK HOWARD

Mr. Howard is Chief Security Officer for Palo Alto Networks. Previously, he was the TASC Chief Information Security Officer managing the security of classified and unclassified TASC networks. He led the Verisign iDefense Cyber Security Intelligence business as the GM and Intelligence Director in charge of a multinational network of security experts, delivering cybersecurity intelligence products to Fortune 500 companies. Mr. Howard ran the global network of Security Operations Centers for Counterpane Internet Security, overseeing intelligence-gathering activities. During his 23 years serving in the U.S. Army, he held various command and staff positions involving information technology and computer security, and he served as the chief of the U.S. Army's Computer Emergency Response Team (ACERT). He has published numerous papers on technology and security and served as executive editor for Cyber Fraud: Tactics, Techniques and Procedures and Cyber Security Essentials.

RICHARD NAYLOR

Mr. Naylor is the Senior Cyber Advisor to the Director, Defense Security Service (DSS) and is Deputy Director for CounterIntelligence, including Cyber Operations. Mr. Naylor is responsible for executing DSS' cyber operations in the National Industrial Security Program. He joined the DSS in September 2011 as the Chief, CyberSecurity Division. Mr. Naylor is the recipient of the 2015 Symantec Government Cyber Award (Federal, Defense/Intelligence), which recognizes individuals who exemplify excellence in government cyber security through contributions to programs that protect data and systems. Mr. Naylor also led the DSS cyber team to the DoD's 2013 CounterIntelligence Functional Services Team of the Year. Prior to his DSS appointment, he was the Deputy Director, Communications, Computers, Architectures and Chief Information Officer, U.S. Cyber Command.

CASEY FLEMING

T. Casey Fleming serves as Chairman and Chief Executive Officer of BLACKOPS Partners Corporation, the leading intelligence, think tank, strategy, and cybersecurity advisors to senior leadership of the world's largest organizations in the private sector, government, military, and academia. Mr. Fleming is widely recognized as a top thought-leader, leading expert, and speaker on intelligence, strategy, national security, asymmetrical hybrid warfare, and cybersecurity. The Cybersecurity Excellence Awards recently named him Cybersecurity Professional of the Year. Mr. Fleming led organizations for IBM Corporation, Deloitte Consulting, and Good Technology. He served as the founding managing director of IBM's highly successful Cyber division. Mr. Fleming earned his Bachelor of Science degree from Texas A&M University and has participated in executive programs with Harvard Business School and The Wharton School.



DAY 2 JANUARY 12, 2017 PANEL DISCUSSION

THE EVOLVING THREAT TO CRITICAL INFRASTRUCTURE

INTRODUCTION

MODERATOR: DANIEL RAGSDALE, DIRECTOR, TEXAS A&M CYBERSECURITY CENTER AND PROFESSOR OF PRACTICE

Dr. Daniel Ragsdale is the founding director of the Texas A&M Cybersecurity Center and a Professor of Practice in the Department of Computer Science and Engineering. Dr. Ragsdale previously served as a DARPA Program Manager and successfully led and managed a $175M research and development portfolio of classified and unclassified cybersecurity and educational programs. Before joining DARPA, Dr. Ragsdale served 30 years in the U.S. Army in a wide array of operational, educational, and research and development settings. During his Army career, Colonel (retired) Ragsdale participated in Operations Urgent Fury (Grenada), Enduring Freedom (Afghanistan), and Iraqi Freedom (Iraq). Dr. Ragsdale also served nearly 15 years at the United States Military Academy, West Point, where he was a leader in a variety of teaching and research roles. His academy service culminated as Vice Dean for Education, the Principal Deputy to West Point's Chief Academic Officer.

One of the themes for the summit is that there is cause for optimism. Nevertheless, there is a viable, real, and increasing threat. I want to make it very clear that no one should take our optimism to mean that the problem is solved. All three of the panelists come from different perspectives. They look at the problem in different ways, but all of them should be able to bring keen insight to this important topic of the evolving cyber threat, particularly with respect to critical infrastructure.

RICK HOWARD

I am the Chief Security Officer for a security vendor. Here are the three challenges I see in our industry, across government and across the commercial sector. First, it's too complex to manage a security perimeter for your organization. Second, our infosec staff spends too much time not concentrating on the most important things. Third, we have no way to automatically take real-time threat intelligence and convert it into prevention and detection controls for the systems you already have in place in your own environment that are designed to do that.

We are consumed with the idea that you have to have vendor-in-depth in your organization, meaning that you might have 15 to 20 security tools protecting your environments. The rationale is that because you don't trust any one vendor, you need to have 15 or 20 vendors doing all that work. The secret in the security vendor community is that we make you manage all of that yourself; we don't integrate. One security tool that you buy is a point product. You buy that four separate times: you have to buy the box, you have to buy someone who can maintain the box, you have to buy someone who understands the intelligence coming off the box, and you need a fourth person in the security operations center to stitch all the tools together to make a coherent threat picture.

If you have 15 to 20 tools, that's four times 15 to 20, and that's really expensive. Small organizations have 15 to 20 tools, but medium-sized organizations have 60 tools. A large-scale organization, like a big financial firm or a government organization, has over 150 tools. I talked to the CISO of a large financial firm just last month. He claims to have 333 different security tools in his environment. He doesn't manage them; that is too hard to do. Think about how we got here.


When I started this back in the 1990s, we subscribed to an idea of "defense in depth." The idea was that you would have multiple layers of security tools, and you would hope that one of them would stop the bad guy. We all had three tools in our defense-in-depth posture: a firewall, an intrusion detection system, and an anti-virus system. But they weren't connected to each other. We hoped that the firewall would stop the bad guy. If that didn't work, we hoped that the intrusion detection system would. If that didn't work, we hoped that the anti-virus would. Bad guys always found ways around the system, but that was all we had back then. It was the primary philosophy out there.

The whole industry changed in 2010 when Lockheed Martin wrote the fantastic white paper on the kill chain. That revolutionized the industry. Tools could be attached and integrated? Prevention and detection could sit at each phase the attackers had to go through? We could actually find, fix, target, and block them, and then assess how well we did? After that, the security vendor community began selling more tools. Instead of just having two or three in your organization, you now had 15 to 20 tools, 60 tools, or over 100 tools. Your infosec staff did not increase. You had the same team, or it had gotten smaller. You had no way to manage the devices on your own network. We have clung to the idea that you need to have a different vendor for each of those tools, all the way down the kill chain.

Another best practice that we need to shatter is best of breed. Your infosec teams (which don't have enough time to do what they have to do in the first place) spend months in the lab doing bake-offs against the greatest intrusion detection systems and then decide which vendor to pick.

If they choose a vendor that they don't already have, they're going to spend the next year fork-lifting all that technology out and fork-lifting the new technology in, only to get to the exact same spot they were in a year ago. Teams need to be convinced that best of breed is the wrong way to think about the problem. What your team should be looking for in vendors is how well integrated they are down the kill chain. The tools you buy should be automatically talking to each other so they are informing themselves about what needs to be automatically configured and updated for the latest and greatest threat. Pick one vendor that you trust, spend a lot of time with them, and develop a relationship so that your pros and cons match theirs.

RICHARD NAYLOR

I work for the Defense Security Service (DSS). We oversee the National Industrial Security Program (NISP) for 10,000 Cleared Contractors (CCs) at 13,000 locations in the United States; essentially, anybody who does classified business with the U.S. government. We do it for 31 federal agencies, not just the Department of Defense. We partner with the cleared contractors so that they essentially become part of a shared sensor array that is part of the law enforcement/intelligence community sharing environment. We attempt to put them in a position to be more successful and to recognize a foreign intelligence entity or an adversary. A side benefit is that they'll pick up on criminal activity, but the main purpose is to thwart foreign intelligence entities. From those 10,000 cleared companies, I receive over 45,000 reports a year. That is about a third of what I should be seeing, even though there are contractual obligations to report those kinds of things. Of those reports, 70 percent have a cyber component.

Seventeen percent of the reports from cleared contractors are purely cyber-based; 53 percent are blended operations. What we normally find is that the cyber component is being used either to confirm a HUMINT activity or as a precursor to one. I've seen activity where they've attempted to get into a network unsuccessfully. They attempt to buy the company first, or buy the technology, or just ask for it. I've seen them try to penetrate networks and, when that didn't work, go to the person's church to make contact physically and establish a personal relationship. Eventually they can establish an automated relationship and get into the networks. If your overall security apparatus for your program is not interwoven with what's going on in the cyber domain (i.e., if your CISO is completely and totally under the CIO's thumb), that can be problematic. You cannot know the overall risk to your entity if you don't know what's going on in the cyber domain. We have a tendency to describe this problem set as offensive and defensive, but it's more free-flowing than that. It's more like a defensive component in a completely indefensible position. The best defenses in the world won't keep out the adversary if the attack surface is expansive and porous. You have to have a strong offensive component. We are trying to evolve a sharing program where not only is the government taking information and reacting to it, but we're also feeding information back and putting industry in a better position. Industry is the huge component here. That portion of the sensor array is huge, and they're the first ones who see cyber activity; it's often the initial point of entry. Of those reports, about 1,000 per year turn into full felony-level investigations or operations. DSS has a strong relationship with other government agencies, like the FBI, the NSA, and CYBERCOM.



We interact and work with each other, and with industry, to pursue our adversaries and make things more costly for them. We also seek to partner with industry to ensure the right focus on the problem set. Our conversation is about how to raise governance and how a CISO pushes information to the C-suite. The end objective is to make it a C-suite responsibility. If the U.S. government is looking to industry to deliver a product, the taxpayer should demand delivery of an uncompromised product as well as consideration of cost, schedule, and performance. Unfortunately, from a C-suite's perspective, security is often a cost to be minimized in order to focus on production. The intent is to switch that, pushing security up as an issue the board has to deal with, because it's something they're completely reliant on to deliver the product.

RICK HOWARD

I've been in different commercial companies, and every board has a heat matrix of risks to the company, the kinds of things they know would impact the company, such as force majeure and executives dying. Cyber is always the last one listed as a risk. Cyber is not the risk itself; it's a vector, both of positive change in your industry and of where bad things can happen. Boards make decisions about risk to the company, and our industry does a poor job of conveying risk to management. People like me come from the technical ranks, and we don't really know how to talk to the board about business. We need to get better at that.

CASEY FLEMING

When you hear cyber security speakers speak, they're all correct in what they're telling you, but it's a complex industry with many working parts. I was fortunate enough to have founded the IBM cyber division.

Since leaving IBM, I've continued to stay in cyber security. At IBM, they trained us to be focused on the customer and to figure out why customers were having problems, like data breaches, with such frequency, depth, and severity. Our company commissioned a very strong intelligence group to be able to understand what was going on. Cyber security is fundamentally broken, for a number of different reasons. Number one is that we built the internet to be a communication vehicle. We didn't think about security in the early days. Hardware and software are created, configured, and managed by human beings. The network, where all the information is stored, is human-based, and our adversaries are going after it. When we talk about cyber security, it is all about products and defense, but we never talk about the adversary. How can you possibly have a cyber security strategy if you don't know who your adversary is? How can you fight a battle if you don't know who your adversary is? You have to become extremely intimate with who your adversary is, what they are after, what their strategy is, and how they are changing. There are three types of adversaries. There are nation-states: there are 30 countries developing offensive cyber capability to weaponize cyber. Number one is China; number two is Russia. Then you've got Iran, North Korea, and India, and there are another 25 up-and-comers. Our offensive capability is not where it should be. U.S. Cyber Command was only created a few years ago because of significant failures in the Air Force, in the .mil network. We need to start looking at our adversaries and become intimately familiar with who they are and what they want. For decades, they have had a thirst for stealing innovation.

The United States and its economy were founded and built on innovation. The industrial revolution and World War II propelled us to world-superpower status. We're in the process of losing that. Our adversaries are good at what they do. We are behind when it comes to cyber security. A recent study of 2,000 top senior security individuals across the globe found that one third of all cyber attacks are successful. Three quarters of those executives are unaccountably confident in their cyber security strategy. We lose about $5 trillion a year in total value out of the U.S. economy, roughly one third of GDP. That number starts with the $500 billion a year of raw innovation that's stolen out of the economy. After you multiply that by the 10 years it was meant to power your company, your revenue, your profit, your jobs, your military, and your government agencies, the total economic effect coming out of the economy is $5 trillion. We're working out of a 1990s duct-taped playbook, and our adversaries have been working for three decades out of a 2050 playbook. When you lose a third of your GDP, you're at war. It's an economic war, but it's bigger than that. It's called asymmetrical hybrid war. When Congress reconvened, the entire week was spent talking about the Russian hacking. We're still not 100 percent sure it was the Russians, because attribution in cyber is extremely difficult with the different shadowing, proxy locations, and geolocations. Congress, some of the speakers, and CNN started mentioning hybrid warfare. Our adversaries are performing asymmetrical hybrid warfare on us. Cyber security is one of 38 different methods of asymmetrical hybrid warfare.



THE EVOLVING THREAT TO CRITICAL INFRASTRUCTURE RH: I want to be precise how we use words like “warfare.” No one has died and no nation has declared war on each other. What we’re seeing with the Russians is influence espionage operations. There’s been no war declared, no rounds have been fired. CF: It’s evolved warfare. People can die if life support (in hospitals or in homes) or grids are shut off. You have to understand who your adversary is and that they are playing by a different playbook. The Russians are using about seven or eight different methods of asymmetric hybrid warfare, and the Chinese are using every one of the 38 different methods of asymmetrical hybrid warfare. Only until we understand that, in the macroeconomic and geopolitical spectrum, are we ever going to be able to start to understand cyber security and defend ourselves. The other thing about cyber security is that our industry is extremely product based. We have an overdependence on products and marketing of those products and an overconfidence of cyber security products. The average enterprise has 85 cyber security products. One malicious, careless or “turned” insider can render all that useless by getting passwords. Those are the accurate numbers of what’s going on. It’s a third of the GDP. RH: What I mean by “playbooks” is not the number of attacks, but the number of steps and what each step entails. Government cyber intel groups think the number of playbooks that we need to deal with is small, less than 100. Commercial people think it’s bigger, like 10,000 to 20,000. It’s not a million; this is a doable problem. We’ve been using 5,000. You can put 5,000 things in a spreadsheet. QUESTION: What is a playbook? RH: Down the kill chain, when the adversary has to deliver, to compromise, 30

RH: Down the kill chain, when the adversary has to deliver, to compromise, to establish a command-and-control channel, or when they move laterally or exfiltrate data, those are all clues that they leave behind as indicators of compromise. What network defenders do is convert those indicators of compromise into prevention and detection controls down the kill chain. That set, for a specific adversary, is what I call a "playbook." We are not good at defending against all the playbooks out there. This is accomplishable.
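Howard's "playbook" idea, indicators of compromise observed at each kill-chain phase and converted into prevention and detection controls, can be sketched as a simple data structure. The phase names follow the Lockheed Martin kill-chain terminology referenced earlier in the panel; the indicator values and control actions below are hypothetical.

```python
# Sketch of one adversary "playbook": indicators of compromise (IOCs) grouped
# by kill-chain phase, and a helper that turns them into candidate controls.
# Phase names follow the Lockheed Martin kill chain; IOC values are made up.

PLAYBOOK = {
    "delivery":            ["phish-lure.example.net", "invoice_2017.doc.js"],
    "command_and_control": ["203.0.113.47:443", "beacon.example.org"],
    "lateral_movement":    ["psexec over SMB from workstation subnet"],
    "exfiltration":        ["large HTTPS POSTs to pastebin-like hosts"],
}

def controls_from_playbook(playbook):
    """Convert each IOC into a blunt prevention/detection control description."""
    actions = {
        "delivery":            "block/quarantine",
        "command_and_control": "sinkhole/deny egress to",
        "lateral_movement":    "alert on",
        "exfiltration":        "alert and rate-limit",
    }
    return [f"{actions[phase]} {ioc}" for phase, iocs in playbook.items() for ioc in iocs]

for control in controls_from_playbook(PLAYBOOK):
    print(control)
```

The value of the structure, as described in the panel, is that one playbook can be shared and pushed automatically into the prevention and detection tools already deployed at each phase.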

DR: There was discussion of attribution, and that there’s uncertainty in cyber. What I’m hearing is that there are evolving methods to do attribution. I’d like to know where we are going with attribution because without good attribution, it is careless to even contemplate anything offensive.

CF: Insider threat is extremely important because one rogue or turned insider, somebody in the supply chain or even at a law firm, can render everything useless all the way up to headquarters with the information they have. Our main adversary still follows Sun Tzu, from 500 BC: an actual kinetic war is inefficient; you win the war before the battle. That's what is occurring right now.

DR: Focus on the second category, as it could be actionable. Do you think we’re getting to a space where, in real time, we could have actionable intelligence?

Other reasons why we’re broken is that senior leadership does not take this seriously. We present to a lot of board of directors and they say cyber security is number one on their list of top threats, yet in the top 10, only three are even related to cyber security in some way. Cyber security is the top national security threat. When you look at the whole picture of cyber security, next to it, you have insider threat. Above that you’d have a box which would be intelligence, and above that would be your strategy. That’s the ecosystem that we all need to be looking at. Otherwise, when you see a data breach, most of us just think it’s unfortunate, but that there’s no price or damage associated with it. Understand that in a data breach, a chunk of the American dream is leaving forever. Russia has very strong offensive capability, but of their total investment, it’s about 15 percent. The other 85 percent is espionage.

RN: What’s your standard for proof? If your standard is an American court of law proof, then you have a problem. If not, then you’ve got more flexibility.

RH: Our idea about attribution has evolved over the last 10 years. When we started this, attribution meant we had to know who was behind the keyboard launching the attack. Commercial cyber security network defenders don't care who did it, whether it was the Russians or the Chinese. I need to know how they did it so I can block them from doing it to me. I can absolutely attribute 99 percent of the known attacks that are out there right now, all the way down the playbooks. It is useful to have that information for other decisions, but I don't need it to make defensive decisions.

RN: We have a tendency to think of the consumers of intelligence as a monolithic entity, but they are not. There are some that care only about how to turn it into a defensive position. There are some that have an intelligence component that can consume the context. There are some that are highly financed that absolutely don't want to know the context.

RH: There are two different goals for those two different organizations. Commercial network defenders are trying to stop material impact to their business. Government cyber intel individuals are trying to influence geopolitical behavior. Those two goals are not the same thing.



CF: Attribution is still extremely difficult. As we get better at trying to understand and locate the adversary, they get better at diversion techniques. We run a group of top-level operatives in the darknet, which is also part of this ecosystem. The global banking heist last February involving the Bank of Bangladesh was $81 million; it was supposed to be $810 million, but they were a decimal place off, and that transaction went through the New York Fed. We worked with the New York Fed and with the Treasury. The vulnerability was found in an Apache server, and the attack was perpetrated by a Chinese APT. They gained access to the global financial network through the Mexican financial system because the Mexicans didn't apply a 2006 patch on an Apache server. That's where the Chinese got access into the Mexican financial system. Then the Chinese kicked that vulnerability out into the dark web, where someone out of the UK was exploiting other banks, in Ecuador and even an American bank. I called the CEO of that bank and told him that the bank's information was being exfiltrated. He didn't want to be sued, and he didn't want his stock price to go down. It was all deniable.

So he put me over to IT, who said that there was no problem because it had been patched a long time ago. I told them otherwise, and hours later, I was sent a cease-and-desist letter from the attorney. China and Russia want to stay stealthy when they're penetrating a network, because later they may need access to innovation or information, or may want to perpetrate a cyber war. The less sophisticated attackers plan and gather malware through the darknet, and then they like to brag about it. They try to sell it through the darknet. Attribution is still difficult, but sometimes we're able to catch these individuals in the darknet because they're trafficking on it.

RH: We network defenders have to be better as a community about being precise in how we attribute. We tend to say flatly that it was or was not the Russians. In reality, we should say that there's a 60 percent or an 80 percent chance it was the Russians, or that to get us to 85 percent, we would have to know these three other things. We do not have those conversations with our board members or our executives. We need to get better at that.

RN: In terms of leveraging the kill chain, we're attempting to move it more into the recon phase, into things like the dark web. That's just one of the possible sensor arrays. We want to try to get to their intent (why they are going there and what they are interested in) and then try to thwart their attempt.

That includes non-cyber entities. I can give you an example of an adversary who was only interested in 14 individuals who knew how to do a particular thing. He had already stolen the technology but didn't know how to implement it, and he knew that there were 14 people who knew how. He knew one of them and wanted that person's contact list in order to get at the other 13. In the recon phase, we were able to pick up on that and shut it down.

QUESTION: With such uncertainty about attribution, how can we begin to engage in the offensive piece without knowing whom to pursue?

RN: If the standard of proof is a court of law, then that is a problem. If it's enough to build an operation to try to pin them into revealing themselves, that would be a legitimate step, but that takes time.

RH: I worry about this because the hackers might retaliate against your offensive attack. Your team had better be able to absorb that.

DAN ENNIS: A sophisticated plan is needed that anticipates what the adversary might do. Adversaries have fingerprints that we study, which allows us to do attribution to a point where we're confident enough to take action, maybe not a cyber action but a different action.



That can't be done in isolation. Taking action will result in counter-fire. This is obtainable if you have synergy across the government and the private sector. The problem right now is that we are not allowing our leadership to make the decisions they need to make, because we don't have the needed maturity.

QUESTION: Let's go to where the attacks occur, at the technical level. We don't have access to military CPUs or FPGAs; we run simple attacks that are in 25 years of Dr. Dobb's Journal and Circuit Cellar. They fly against almost everyone in the room. We just subvert the operating system in a few seconds, get down into the chip, and we may reflash it with our own code. We own everything that processes on there from that point forward. It doesn't matter what side of the boundary we're on. When is this going to be addressed? Even running NSA FLASK for four years, we've had some magnificent attacks just zoom right through. None of the boundary defenses catch it. It goes straight into the processor with the reserved op-codes, which you're not supposed to be able to find, but we found 300 of them online, and they work. You can give me an LOA, and I'll unlock your systems for you. When are we going to go after this hardware, whether it's Lanier or Hayden or Super Micro or any of the root builders for the name brands we sell in the States, all of which is beyond porous to begin with?

RH: There are numerous things like that that we need to worry about. Think about the attack as a system down the kill chain. They may be able to accomplish what you just stated on that piece of hardware, but they still have to do a lot of other things to be successful in their mission. They still have to steal data, exfiltrate it, and do command and control. It needs to be approached as a system, not individually.

CF: We need some type of oversight group in this country to screen hardware when it comes in. That would go a long way toward cleaning up what is going on right now. My personal opinion is that if you have any Lenovo, Huawei, or ZTE equipment, you should get rid of it. These have all been subjects of malware and spyware reports and have been covered in the media. The federal government and the Department of State have said that they will not do business with Huawei and ZTE. The last one is a cloud company called Baicells. Again, in my opinion, if you have any of that in your company or your organization, red flags should be going up.

JL: The deconstruction of manufacturing in the U.S. means that we are importing a large amount of our infrastructure from, in many cases, the third world (which itself is a vulnerability of our critical infrastructure). For example, we used to manufacture transformers for the power grid in the U.S. We don't anymore. If there were a cyber event that blew half the transformers in Texas, we'd have to wait about six months for replacements to be delivered. Resilience is a big issue for critical components of the infrastructure that have long lead times and that we don't manufacture in the U.S. In a system architecture, it is not purely about computers. It's about pieces of hardware that have a computer component that can be damaged by cyber events, and about the supply chain for things that are hard to come by and might be damaged.

CF: We need to start air-gapping our technology and put a strategy together for our infrastructure to disconnect it from the internet. We need to protect our infrastructure, the mechanisms keeping our grid up, and all the other critical infrastructure that we need. We need to move as quickly as we can on those critical systems, keep them tight and closed, and also manage the insider-threat piece of it. As soon as you start to squeeze out access via the internet, the adversaries will find other methods, like the human method.

RH: I agree that those transformers are vulnerable. What needs to happen in order for us to act now to fix the problem? What does catastrophic loss to the nation mean? Using death as the metric, suppose, for example, that people died in the Northeast because they didn't have electricity. What are the odds that the Russians might do that? These are questions none of us are good at answering. We need to be better at that before we spend a lot of resources, even change our way of life, and before we can rationalize that decision.

QUESTION: How is the cyber security community communicating with each other so that attackers don't leverage each other's findings? We had a similar problem in the energy industry, a cyber attack. One of the operators sent out a bulletin to every service and every operator in the industry, identifying what was going on. We would never have known until somebody communicated it to us. Is there a process within an organization?

RH: That goes to the importance of information sharing.



vendor community, we don't share threat intelligence with each other. All the other verticals do (the financial sector, healthcare, IT, telecom). Everybody else has figured out that they should work together on this one thing, even though they're competitors. An organization called the Cyber Threat Alliance started about three years ago; it is a group of security vendors who have decided that we're not going to compete on intelligence. We make sure that we all have the same intelligence, and compete on product instead. All security vendors have similar capability. There are eight of us in the alliance right now: Palo Alto Networks, Symantec, Intel Security, Fortinet, Barracuda, Zscaler, Telefónica, and ReversingLabs. We've been sharing that threat intelligence for the last couple of years, and will begin to share attributed playbook information for every adversary that we know. If the alliance gets enough members, we will blanket the world so that everybody in the world that connects to the internet will have at least one of us in their organizations protecting their environments, receiving real-time updates to every known playbook in existence. It's free to join. QUESTION: Several of you have been talking about offense as a potential defense. How do you project offensive strength in cyberspace? RN: It's tied to our conversation about attribution. The route that gets pursued most often is felony-level investigations. NR: Russia may be afraid of attacking us because we have 2,000 nuclear weapons pointing at them. How do you tell them the same story about cyber without revealing our offensive weapons in cyberspace? The moment the Russians know we have offensive tools in cyberspace, they will start patching it up.
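As an illustration of the kind of vendor-to-vendor indicator pooling described above, the following is a minimal Python sketch that merges and de-duplicates indicators contributed by different members. The field names, vendor names, and record layout are invented for this sketch; real alliance exchanges would typically use standardized formats such as STIX shared over TAXII rather than ad hoc dictionaries.

# Hypothetical sketch of pooled threat-intelligence sharing. Field names
# and data are invented; real exchanges use standardized formats.
from datetime import datetime, timezone

def merge_indicator_feeds(feeds):
    """Merge per-vendor indicator feeds into one de-duplicated pool.

    Each feed is a list of dicts with an indicator value ('ioc'), a
    'type', the contributing 'source' vendor, and a 'seen' timestamp.
    """
    pooled = {}
    for feed in feeds:
        for ind in feed:
            key = (ind["type"], ind["ioc"])
            entry = pooled.setdefault(key, {
                "type": ind["type"],
                "ioc": ind["ioc"],
                "sources": set(),
                "first_seen": ind["seen"],
            })
            entry["sources"].add(ind["source"])
            entry["first_seen"] = min(entry["first_seen"], ind["seen"])
    return list(pooled.values())

if __name__ == "__main__":
    now = datetime.now(timezone.utc)
    vendor_a = [{"type": "sha256", "ioc": "ab12...", "source": "VendorA", "seen": now}]
    vendor_b = [{"type": "sha256", "ioc": "ab12...", "source": "VendorB", "seen": now},
                {"type": "domain", "ioc": "bad.example", "source": "VendorB", "seen": now}]
    for ind in merge_indicator_feeds([vendor_a, vendor_b]):
        print(ind["type"], ind["ioc"], "reported by", sorted(ind["sources"]))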

RH: It doesn’t have to be a cyber response. I think President Obama’s response to the Russians was dramatic, draconian, even, when he kicked out 32 from the embassy. That’s a step in the right direction. RN: You can take the word “cyber” out of your question. How do you respond to an adversary who is stealing? It’s not a cyber question. CF: It’s all about attribution. You better be sure that you know who is doing it and it’s difficult to know that. We still don’t know it’s the Russians 100 percent. We have strict federal laws about fighting back, because once you do that, you go into this next realm of cyber war. It’s one level below nuclear that we could start this game of going after each other and there’s a lot of collateral damage because of attribution. RH: Five years ago, we would have never have attributed cyber attack to a country. In the last couple years, we’ve done North Korea, and now we’ve done Russia. That our government, our president would stand up there and say, “We think it’s those guys. We’re going to do something about that.” That’s a huge change that we’ve never seen before. RN: The other thing that’ll become interesting over time is the liability and who indemnifies those who put other ones in a stress position. If you identify something throughout your 35 enemies and then you spread out through all your 35,000 customers, and essentially you shut out a U.S. company, unwittingly, and that’s a bad day. Who indemnifies you from that? QUESTION: This discipline is still evolving. Semantics really do matter. What is cyber warfare? I believe the way it was described here was very loose, not accurate by U.S. government

standards, on what warfare is, but colloquially, we all understand what you're talking about. Which definition we are going to use in the dialogue does matter, and that is still evolving. Our court systems have not caught up yet. There are other challenges that we have that we don't agree on yet. If we can't agree on some of those basic things as the experts here, how can we expect the rest of the community to know where to go when we don't even have agreement on some of the fundamentals of the discipline? RH: We have warfare in everything. It seems like there's activity on both sides that appears warlike, but to Colonel Vega's point, there are things that are implied when we say we're at war with somebody, especially in cyber security. The Chinese have said that their whole doctrine is asymmetric warfare. It's hard to have that conversation when our adversaries are framing the discussion that way. QUESTION: I wonder if there ought to be a think tank that's not in Harvard or Washington to look at the question "Are we or are we not at war?" SANDRA BRAMAN (Texas A&M Department of Communication professor studying information technologies and their policy implications): International law also does provide definitions for what war is; having had an international legal expert on the panel would have been useful.



DAY 2 JANUARY 12, 2017 KEYNOTE

MAKING CYBER RESILIENCE STRATEGIC TO THE BUSINESS

RHONDA MACLEAN
With more than 30 years of information technology industry expertise, Ms. MacLean founded the private risk management consulting company MacLean Risk Partners, LLC. She serves on the Board of Directors of several cyber security and fraud prevention companies, including RedSeal Systems and ThreatMetrix, Inc. She provides industry leadership on AXELOS' Cyber Resiliency Executive Action Team. Ms. MacLean formerly served on the Board of Directors of Vontu and PGP Corporation, leading technology companies dedicated to data protection and data loss solutions, both acquired by Symantec. Prior to establishing MacLean Risk Partners, she was the Chief Information Security Officer for Boeing, Bank of America and Barclays PLC, Global Retail and Commercial Bank, in London. Major accomplishments as the global executive in these roles included: industry-leading security enhancements that increased adoption of both online consumer and business-to-business ecommerce; leading ISO 27001 certification; receiving numerous awards for cyber risk management projects that promoted customer confidence and increased revenue opportunities; and leading crisis management teams for cyber security and privacy related events.



I’m going to talk about making cyber resilience strategic. I was the computing security executive at Boeing (we didn’t have the word “cyber” then), and I ended up taking over all of the information technology security. I’m going to dovetail into the experiences that I had moving from aerospace to banking. I was at Bank of America, as the Chief Global Executive for Security, for over a decade. After 9/11, I was also responsible for our business resiliency. We not only recovered from a resilience perspective for ourselves, but we also recovered for some of the other banks. As a result of that, in 2002, after Presidential Directive 63 was authorized, I was the First Sector Coordinator for the financial services industry. We formed the Financial Services Coordinating Council (FSCC). We did a lot of education and outreach in critical infrastructure. At the time, there were 8,000 regional banks that didn’t have the kind of resources or the exposure we had. After leaving Bank of America, I moved to Barclays Bank in London. I had the opportunity to see how executives think about cyber resilience in the international space. This is not a technology-only issue. The business environment is just as important, as the legal and regulatory environment. My philosophy around understanding the regulatory environment and the cross-border

issues associated with doing business with global networks is transparency. There are many different jurisdictions that you need to be concerned about. Vulnerabilities and threats are global. Many people don’t even know what they own. They don’t know where their supply chain ends and begins, and where their data resides. Really understanding those vulnerabilities and threats in your environment is difficult. No matter what industry you’re in, the common factor is continuous globalization and complex technology. There are strategic things with our physical structures and our critical infrastructure where we should have a more resilient backup. We perhaps should not be as dependent on the resiliency from a globalization perspective and should consider having our own “spare parts” for certain components of our critical infrastructure.

I have a different feeling about the “cyber thugs” than the previous panel. I made it a practice, whether I was in aerospace or banking, to not be a soft target. We do need to find attribution when possible. Just because you don’t go into a bank with a drawn gun and rob it, but rather take $85 million out in two seconds, is no excuse for not finding the bank robber. If we don’t feel like it’s easy to do attribution, then we need to be thinking about how to get better at it because people need to be afraid




to do this. They need to know that we can track them down eventually.

The other difficult thing is talking to business leaders. Cyber is not necessarily the first thing on their mind, and cyber is not business as usual in most corporations today. The business process should include a resiliency quality control. Thinking about the business solution you're providing is essential. Cyber professionals are service providers. The service we're providing to the business is to enable the business to take risk. It enables them to broaden their business processes, take advantage of opportunities, and have agility. The human factor is probably the biggest problem of all. People just make mistakes, and we have not delivered tools that make it easier to avoid them. An example of such a tool is the picture that appears after you log onto Bank of America's website. That was an innovative thing that my team put together as a result of phishing. We wanted a way for customers to know that when they come to our online banking services they really are at Bank of America's online banking site. From a human factor perspective, we wanted something that anyone could understand. We had very little uptick in calls to our call centers when we rolled that out nationwide, which is very unusual. When you roll out a change on your web presence, oftentimes you will have a huge influx of your customers calling and complaining that they don't understand how the system works. That drives up costs and reduces customer satisfaction. The more we can simplify innovation, the more resiliency we'll get because more people will use it. We need to make it easy for customers to use.
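The picture-after-login idea amounts to a shared secret between the bank and the customer: the customer picks an image at enrollment, and the real site shows it back after the username step but before asking for the password, so a phishing page that cannot look the image up gives itself away. The sketch below is a simplified, hypothetical model of that flow, assuming in-memory storage and invented image names; it is not Bank of America's actual implementation.

# Simplified, hypothetical model of a "recognized image" anti-phishing
# check. Storage, names, and flow are illustrative only.

enrolled_images = {}  # username -> image chosen by the customer at enrollment

def enroll(username, chosen_image):
    enrolled_images[username] = chosen_image

def image_for_login(username):
    """Image the genuine site displays after the username step.

    A real deployment should not reveal whether a username exists, so
    unknown users get a generic placeholder instead of an error.
    """
    return enrolled_images.get(username, "generic-placeholder.png")

def customer_should_proceed(expected_image, displayed_image):
    """The customer's side of the check: only type the password if the
    displayed image matches the one chosen at enrollment."""
    return displayed_image == expected_image

if __name__ == "__main__":
    enroll("alice", "sailboat.png")
    shown = image_for_login("alice")
    print("Proceed on genuine site:", customer_should_proceed("sailboat.png", shown))
    # A phishing page cannot look up Alice's image and shows the wrong one.
    print("Proceed on phishing page:", customer_should_proceed("sailboat.png", "random.png"))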

The supply chain is critical. I was astounded at how many suppliers we had at Boeing, Bank of America, and Barclays. As Barclays is global, we had banks in Russia and other foreign countries. Many people don't have an inventory of their supply chain, or even know if a contract is in place and if the contractor knows his responsibilities. If the contractor is breached, the company that hired the contractor suffers the loss. It was Target's CEO and CIO who lost their jobs. The supply chain must not be underestimated. If we don't understand our supply chain

in any given sector, then we’re one supplier away from creating a vulnerability for the whole sector. It is bid out to share responsibility. I believe that even customers are responsible. A customer has to accept his share of responsibility if he is going to set “password” as his password, refuses to take a phone call if all his money gets transferred to another bank, or he doesn’t have a firewall, antivirus or any malware detection on his device. The whole end-to-end process of services and capabilities is a shared responsibility. Continuous internal education of the people who deliver the service—from the employees to your suppliers—is one of the harder things to correct. Before you can translate the value cyber brings to the business, first you must understand what the business is. If you do not understand the business strategies, it’s going to be hard to convince the customer to listen to you and make them understand why he or she should take advantage of your services. Understanding the business strategies and objectives can help you communicate the value that you’re delivering to the organization. The executives are listening to their customers, their employees and their shareholders. There’s more happening as far as partners and supply chain because that’s going both ways now.


When I was at Boeing, hackers broke in and were using super computers to quickly crack passwords. We discovered them, and ultimately got a wiretap warrant. We found that the hackers were trying to crack passwords, but their real target was the Clerk of the District Courts’ computers which contained witness protection information. We never would have found that out had we not gone after them. While you may not be the end target, it’s still worth tracking down the adversary to figure out what’s happening and sharing that information.
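A rough back-of-the-envelope calculation shows why fast hardware makes this kind of password cracking practical. The guess rate below is an assumed, illustrative figure, not a measurement; real cracking speed depends heavily on whether the stolen hashes use a fast algorithm or a deliberately slow one such as bcrypt.

# Illustrative estimate of exhaustive password-search time. The guess
# rate is an assumption for this sketch, not a benchmark.

GUESSES_PER_SECOND = 1e9          # assumed attacker throughput
ALPHABET = 26 + 26 + 10           # lowercase + uppercase + digits

def worst_case_days(length, alphabet=ALPHABET, rate=GUESSES_PER_SECOND):
    """Days to try every possible password of the given length."""
    return alphabet ** length / rate / 86400

if __name__ == "__main__":
    for length in (6, 8, 10, 12):
        print(f"{length:2d} characters: about {worst_case_days(length):,.2f} days worst case")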




I used to have a war room everywhere I worked in order to create a dialog with the executives. The conversation started with the business strategies, not with my security strategies. Next, I would explain how our strategies and priorities fit that business. I'm a big believer in numbers because statistics matter. I have not found an executive yet that didn't like facts and data. I would tell the executives about our program on authentication and identity management, and what we were doing for the customer and employees. Last, I'd discuss finance and I would take our total cost of providing the services that I had described and divide that by the number of customers we served. The only way to get a budget is to be able to articulate

how your spending adds to the business value. I never had trouble getting money because I was selective in setting priorities. Being focused on the right things and being able to articulate that back will pay dividends. There's a lot of emphasis on making sure the customers are happy. The key right now is promoting innovation and competitive advantage. We have a lot of disruptive technologies that are changing the way Americans do business. We need to think about how to secure and transform business. Executives very rarely want to know about technical issues and they tend to care little about risk. They really want to know what the impact is going to be to the company. Are we going to lose customers? Brand is key for companies. Nevertheless, Target's stock managed to rise that year, even though they had that breach. Many think a breach would crater their stock, and it could, but the evidence so far is that it doesn't. But they do care about their brand. They want to make sure that people trust them.

There is a trend to bring more risk people directly onto public and nonpublic boards, which I think is wonderful because that's a different view and mindset and a different set of questions. The audit committee has typically been the group that is focused on the financials and, primarily, operational risk, but many of them have little or no technical background. I pulled some reports from the National Association of Corporate Directors, which had done some recent studies on what boards want. The report's results show the biggest things that are on corporate boards' minds (CEOs from the New York Stock Exchange). It was interesting how many of them were less than confident that they had good controls. That means that there are a lot of opportunities in working with our executives to build that confidence and have that dialog. The report further shows that, if there is an adverse event, executives are worried about customers leaving and also about the cost. The companies that I was working for had a number of breaches. (Anybody who says they haven't had a breach just doesn't know it or isn't being truthful.) It is hard to fully quantify the cost associated with dealing with a breach. It's more than re-setting passwords on servers. It's the cost associated with the brand damage. If it's a privacy event, you're going to have to give

access to credit reports and identity theft insurance and cover all the other associated costs. It is estimated that it can take almost $200 per customer to recover from a data breach. With 40 million online banking customers, this can become a huge figure. Plus, there is the damage to how customers feel about it. Executives are concerned about these associated costs. Executives are also concerned about the competitive advantage associated with espionage. Most of the car companies have had their car plans stolen by foreign competitors, resulting in lost R&D investment, lost profits and lost opportunity. It matters that we protect this information. I was surprised the report showed that regulatory and compliance concerns were a low priority, because vendors talk about compliance frequently. Yet that was low. (By the way, 1 percent of the directors said they wanted anecdotes.) When you talk to the boards, they want to hear your strategy. They want to know how it fits the business strategy and they want metrics, facts and data. If you think talking about the risk of potential cyber events will cause them to spend money or to change their thinking about the importance of it, that's not how to do it. You need to understand how you add value to the products and services of the business that you're serving. Aligning any program to the business objectives and understanding what the customers' perception of value is, from their viewpoint, is a necessary step. It is important to ensure that you have this regular ongoing awareness and adaptive learning. There are huge opportunities for online innovative training and games. Although there's a lot of work being done, training is a real challenge for many companies to get right. That's something that should be commoditized. If you




don’t know how to do the metrics, get someone that does, like a statistician. Know what is—and what is not—important, be able to back it up with facts and data, and to tie it to the money because security can impact revenues. We had a real problem with people adopting online banking in Europe. The customers were slower to adopt it because they didn’t feel secure. We did a lot of focus groups. We found that, unlike Americans, Europeans don’t mind security, like using chips and pins. We were able to increase online bank adoption by 40 percent per quarter with the roll out of the chip and pin program, and we upped our revenues. When I was there, I worked with marketing and sales to determine how much value was tied to each customer. Then, when I went to the board, I showed them the added revenue that we believed we impacted by doing something that enhanced customer trust and made us more effective with our lines of business. They saw us as a partner, not the police. QUESTION: As a critical infrastructure defender, you’ve based things on how it fits the business model. When you’re a critical infrastructure defender, you’re defending things that tie right back into civilization, specifically the power grid. There’s a classical

struggle in companies that do energy delivery between IT focused folks and corporate, who see security as a cost center, and operational folks, who view things in security as a reliability issue. How does that translate into something where there’s not necessarily a product that’s being sold but it’s civilization that needs to stay stitched together? RM: In 2003, when the power grid went down on the East Coast, all the banks were on the phone with the North American Electric Reliability Council (NERC). After that, we learned that the electric grid has to do the same thing that the banks are doing. In some cases, it’s going to be a whole infrastructure, and the impact and the cost to the country can be great. What makes it extremely hard for the power grid sector is that there are many small power resellers as part of the overall grid, so the impact isn’t felt like at a big bank. It is a national security issue. It’s a challenge from the business management and structure to really figure it out and get everybody on board.

QUESTION: One of the value propositions of data in manufacturing is this global view and being able to have efficiencies, both in energy and performance, by being able to make good decisions. One of the problems that we have is manufacturers not wanting to connect their operational technology into the Cloud. We understand that we have to have cyber security, but what's the value proposition and how would you argue that openness into the Cloud through your operational technology? RM: The solution begins by talking to the manufacturer and the manufacturing groups. GE started an effort to have a manufacturing protocol. That is an area where there are innovation opportunities. If you're not building that security in and lack the confidence to put your traditional legacy systems in the Cloud, you have to do what every business does. In order to design a solution, the whole picture needs to be looked at and the components that need to be included in that business—technical and operational architecture—need to be identified.
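One pattern that is often proposed for this question is to keep the operational network one-way: publish read-only telemetry outward to the cloud over an encrypted channel, and never accept inbound commands on that path. The sketch below illustrates that pattern with Python's standard library; the host name, path, and field names are hypothetical, and a production design would add authentication, buffering, and often a hardware data diode.

# Illustrative one-way telemetry publisher: the OT side pushes read-only
# sensor readings to a cloud ingest endpoint over TLS and never listens
# for inbound commands. Host, path, and field names are hypothetical.
import json
import ssl
import http.client
from datetime import datetime, timezone

INGEST_HOST = "telemetry.example.com"   # hypothetical cloud endpoint
INGEST_PATH = "/v1/ot-telemetry"        # hypothetical ingest route

def publish_reading(sensor_id, value, unit):
    payload = json.dumps({
        "sensor_id": sensor_id,
        "value": value,
        "unit": unit,
        "timestamp": datetime.now(timezone.utc).isoformat(),
    })
    context = ssl.create_default_context()   # verify the server's certificate
    conn = http.client.HTTPSConnection(INGEST_HOST, context=context, timeout=10)
    try:
        conn.request("POST", INGEST_PATH, body=payload,
                     headers={"Content-Type": "application/json"})
        return conn.getresponse().status
    finally:
        conn.close()

if __name__ == "__main__":
    # Publish one temperature reading from a hypothetical plant-floor sensor.
    print(publish_reading("line3-temp-01", 71.4, "degC"))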



PANELISTS

DAFINA TONCHEVA
Dafina Toncheva is a Partner of U.S. Venture Partners. Ms. Toncheva focuses on enterprise, SaaS and security software investments. Ms. Toncheva served as a Partner at Tugboat Ventures and a Principal Investor since 2010. She joined Tugboat Ventures in 2008. Previously, Toncheva served as a Vice President at Venrock, where she was employed for two years and focused on investments in security and technology and helped expand its software investments in the areas of SaaS, virtualization, security, infrastructure, and enterprise applications with investments such as Cloudflare and Aria Systems. Prior to Venrock, Ms. Toncheva held roles in development and product management at Microsoft Corporation and focused on authentication systems, digital signatures, and business workflow. She served as a Board Observer on the Boards of Cloudflare, PGP Corporation, Imperva Inc. and RedSeal Networks, Inc. Ms. Toncheva holds an M.B.A. from Stanford University Graduate School of Business and a B.S. in Computer Science, magna cum laude, with special focus on Cryptography, Efficient Algorithms, and Database Systems from Harvard University.

JOSEPH WEISS
Joseph Weiss is an industry expert on control systems and electronic security of control systems, with more than 40 years of experience in the energy industry. Mr. Weiss spent more than 14 years at the Electric Power Research Institute (EPRI) where he led a variety of programs, including the Y2K Embedded Systems Program and the cyber security for digital control systems. As Technical Manager, Enterprise Infrastructure Security (EIS) Program, he provided technical and outreach leadership for the energy industry's critical infrastructure protection (CIP) program. He was responsible for developing many utility industry security primers and implementation guidelines. He was also the EPRI Exploratory Research lead on instrumentation, controls, and communications. Mr. Weiss serves as a member of numerous organizations related to control system security. He served as the Task Force Lead for review of information security impacts on IEEE standards. He is also a Director on ISA's Standards and Practices Board. He has also responded to numerous Government Accountability Office (GAO) information requests on cyber security and Smart Grid issues.


MODERATOR MLADEN KEZUNOVIC DIRECTOR, SMART GRID CENTER AND PROFESSOR AT TEXAS A&M UNIVERSITY

Dr. Mladen Kezunovic has been the Eugene E. Webb Endowed Regents Professor at Texas A&M University since 1986, serving in leading roles as Director, Smart Grid Center; Site Director, NSF Power Systems Engineering Research Center (PSERC); and Director, Power Systems Control and Protection Lab. He has consulted for utilities and vendors worldwide, and served on the Board of Directors of the Smart Grid Interoperability Panel (SGIP) representing research organizations and universities (2009-2013). He has also served as CEO and Principal Consultant of Xpert Power Associates and has been a Principal Investigator on hundreds of R&D projects. Dr. Kezunovic received the inaugural 2011 IEEE Educational Activities Board Standards Education Award "for educating students and engineers about the importance and benefits of interoperability standards" and the CIGRE Technical Committee Award in 2013.



DAY 2 JANUARY 12, 2017 PANEL DISCUSSION

EMERGING TECHNOLOGIES TO ADDRESS CURRENT AND FUTURE THREATS INTRODUCTION MLADEN KEZUNOVIC

SCOTT TOUSLEY
Mr. Tousley is the Deputy Director of the Cyber Security Division (CSD) within the Department of Homeland Security's (DHS) Advanced Research Projects Agency. His responsibilities include organizational liaison, Integrated Product Team (IPT) management and support, and project leadership for educational/operational efforts, such as the Computer Security Incident Response (CSIRT) project. He also supports several initiatives in critical infrastructure protection and cyber-physical systems. Mr. Tousley served 20 years as an Army officer in the Corps of Engineers, many in interagency technology programs. He led the Watch/Warning program in the Federal Bureau of Investigation as part of the National Infrastructure Protection Center (part of the Clinton Administration's early engagement with national cyber security challenges). He previously managed the operations security team for a large internet service provider, was the principal with a technology start-up company in the private sector, and was program manager at the DHS National Cybersecurity Division. He has served nine years with DHS, principally with S&T, but also with the Domestic Nuclear Detection Office and supporting Customs and Border Protection.

DR. TOM OVERBYE
Dr. Tom Overbye is an expert in power system computational algorithms, operations, control, and visualization. Dr. Overbye is a member of the Trustworthy Cyber Infrastructure for the Power Grid NSF Cyber Trust Center, which is working to create an intelligent, adaptive power grid that can survive malicious adversaries, provide continuous delivery of power, and support dynamically varying trust requirements. Dr. Overbye is also concerned with improving power system trustworthiness by better using information from the growing stream of data generated by the operation of the power grid. His goal is to better present this data in order to help human operators address and contain blackouts before they cascade out of control.


When we go forward and we invent, how do we in these complex systems, prove the things that we have developed? Going into a control system that is extremely complex in a production mode to test is very, very costly. Do we have alternatives to doing that, like large scale test beds that are production type test beds? DAFINA TONCHEVA Subsequently, over the next 20 years, USVP invested in companies in the DoD space, in networking, mobile Cloud and securities technologies. My security background starts with college, where I studied computer science and applied math. I had a particular affinity for cryptography, and I spent a lot of my time studying that fairly new field. After college, I worked as a software engineer on two-factor authentication, digital rights management, and a number of other security projects.



In my current role, I study the security market, and stay as close as possible to emerging technologies. I study the needs of the customers of the big businesses and organizations that purchase security software. I also study the research of universities and other product organizations in that area. It is our goal to find the most



cutting edge technologies that are developed by experienced teams which address the big needs in the security market. Security has been popular for the last 20 years. There are 1,400+ security companies today. If there has been so much money poured into security, and solving the problem has attracted some of the smartest people, why hasn't it been solved yet? If we look at the numbers, they are somewhat terrifying. Every year, attacks increase by 40 percent relative to the year before. It takes, on average, 200 days for an organization to find out that it has been compromised. There are 200,000 open security positions, and there are estimates that there will be about a million open security positions in the next four years. What does that mean? Even if we provide security tools, no matter how good they are, we won't have enough people to deploy those tools, automate them, and manage them on a daily basis. We talked a lot about the economic losses. Another scary number is that, in 2016, there were 2.5 billion identities stolen online. There are 3.5 billion people online. That means that, on average, for each one of us, there is a 60 percent plus chance that our online identity was compromised or stolen last year. That is pretty terrifying given that there are 1,400 security companies in the market trying to solve that very same problem. Last year, businesses in the United States and abroad spent $100 billion on security solutions. We've been throwing a lot of money at the problem and a lot of smart people have been working on it, but it's only getting worse. There are several reasons why that's the case. It's an arms race. We put up defenses and then very sophisticated, smart hackers take those defenses down or penetrate through them. So, we start the


cycle again. Hacking is popular because it’s profitable. Hackers hack for economic reasons, and, in the last five years, we have started hearing about hackers hacking for political reasons. The market incentive is there. Five or seven years ago, when we were looking at security companies to invest in, the assets we thought about were financial assets. We didn’t, at the time, worry much about the political or national security implications of cyber attacks, and now, that is a topic in the news media daily. The incentives are there, and the stakes are very high. Reason number two, is that it’s not a level playing field. It’s a game in which the good guys are always at a disadvantage because we have a huge surface to protect 100 percent of the time. Hackers only need one vulnerability to exploit in order to succeed. That unfairness is intrinsic to the problem, and it will always be there. The third reason is the technologies, as good as they are (we as investors are very excited about advancements in technologies), are the same technologies used by the hackers. While we get excited about emerging technologies that protect us better, we also have to remember that those same technologies are available to the adversaries to create more sophisticated attacks to go after our businesses and our assets. Fifteen years ago, and maybe as recently as five years ago, some people still didn’t believe that security should be a separate field because there was this philosophical debate that, if the product is built securely from the very beginning, then we don’t need to buy security products. Security will be redundant and obsolete because products, by design, will be secure. Nobody believes that today. The issue is, as technologies advance and become more sophisticated and complicated, security considerations become

exponentially more complicated, as well. We can’t expect developers to have the security background and expertise necessary to build something 100 percent securely. In fact, the more sophisticated security problems become, the technologies that are used to build software are becoming increasingly simpler. There are so many libraries and online repositories available to download entire pieces of code to put together an application in no time. While this proliferation of software is making businesses more effective and productive, it is also making security an even bigger concern. The final reason why security is a growing problem is the human factor. We continue to make mistakes, as many of us are not properly trained. Security isn’t on the top of our minds as employees, consumers and most individuals out there. In fact, if you look at your kids’ Facebook and Instagram accounts, you will find out that it’s exactly the opposite of how we think about security. There is a tendency to put everything out there. People are more open and more transparent, as opposed to more closed and more secure. JOSEPH WEISS I’m going to give you a very different view of what you’ve been hearing. I’m a control system engineer. Until 2000, my job was to build insecure, vulnerable control systems because they were reliable, efficient, and safe. Now these systems are a security risk. First, I’ll talk about definitions. People equate “cyber” to “cyber security” or “cyber attacks.” In the IT world, cyber means a connection to the internet, a Windows interface and somebody stealing your data. In the control system world, it is different. Cyber simply means electronic communication between systems that affects either confidentiality



DAY 2 JANUARY 12, 2017 PANEL DISCUSSION (which we don’t care much about), integrity (which we care a little about), or availability (which we care a lot about). The most important letter that’s not even there is the letter S, which is safety. IT can’t kill people, but control systems can. Yet, there is nothing in the CIA triad there to address the letter S. Prior to the advent of cyber, most people didn’t know what a control system was, unless you were a control system engineer. What changed that was the micro-processor. We went from being only zeros or ones, to being data. That’s what everybody started wanting because the most important information in any industrial company is the data. That has created a major problem because we’re now interconnected with all kinds of things and people. When we were studying control system theory, it was docking satellites or concern with optimizing feed water flow. The concept of security was never there. In computer science, not many people knew or cared about serial communications. We’re not training on the control systems side effectively enough to understand security, and we’re not training the computer scientist to understand how control systems operate. It is a major cultural problem, and it propagates all the way up the management chain in almost all organizations. One of the other points are organizational charts. In the electric industry, there is transmission, distribution and generation, and we draw lines between them. Hackers have no organizational charts. We better get used to the fact that this is propagating and we’re not doing that. I have a database that I have amassed over time of just control system cyber incidents. I’ve identified over 950 actual control system cyber incidents; impact to date, shows over $40 billion in direct damage. With more

than 1,000 deaths to date. Cyber incidents do not need to be malicious to cause a problem. Moreover, there is a lack of control system cyber forensics and logging when you get below the internet protocol layer, which is why you haven't heard about many of these incidents. People have heard about Stuxnet, which was a manipulation of physics. The attackers basically changed the control system logic. The controllers then changed the process, and that's what damaged the centrifuges. The attackers then changed the control system logic back to where it belonged, so that the control system operators were not aware of the changes. Now, I don't think anyone will tell me that was not cyber. Another example is when the EPA changed the emission requirements, and Volkswagen diesels could not, nor could anybody else, meet the new emission requirements and still meet performance. So what did they do? They came up with control system logic (there are no networks here) smart enough to change the emission and fuel controls when the cars were under test for emissions. After the emissions test was finished, the logic switched back. That was over 100,000 vehicles. Even though that was more than 100,000 incidents, I count that in my database as one case. Think about this with our definitions, about insiders. This was a rogue company, not an insider, a rogue company. What was malicious? They obviously did it. We need to rethink what malicious means. In a sense, there are two pieces to energy and manufacturing. One piece is the energy side, the electric transmission and distribution, or the electric substations. We've been able to demonstrate that you can hack and take complete control of the protective relays in substations. A hacker would be able to take the grid out for 9 to 18 months because DHS

declassified the Aurora information. The second thing is the process controls. There's a "small" company that started an initiative to develop a standards-based secure-by-design control system because none exist. The name of that small company that's starting this initiative is Exxon-Mobil. What does that tell you when a company the size of Exxon-Mobil cannot find a standards-based, secure control system? I'm actually under contract helping them, and in this case trying to get utilities to participate. On the electric side, our relays are not secure. IT is concerned about vulnerabilities. Control systems are concerned about impacts and mission assurance, not information assurance. In the Ukraine, in 2015, the breakers were remotely opened. The Russians chose not to re-close the breakers. If they had done that, the outage would not have been six hours, it would have been months. The malware that was used in the Ukraine was similar to the malware that was installed in our electric grids in 2014. The malware was (and is) in our grids first; it was then modified for use in the Ukraine. The final point is that recently the Russians cyber attacked the electric grid in the Ukraine again and turned off the lights in part of Kiev using similar systems to those used in US grids. Part of the reason the Ukrainian outage was only 6 hours was that the Ukrainians were used to running in manual operation and their grids are still in manual operation. I challenge any of the utilities here to be able to run their grids in manual operation for a week, much less 6 months to a year. It's a really different view. We need to realize that our vendors are worldwide, and what happens in one place can happen elsewhere.
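A toy model makes the Stuxnet point above easier to picture: the physical process can be driven out of its safe band while the values shown to operators stay normal. The simulation below is purely illustrative, with invented setpoints and numbers; it is not based on any real control system.

# Toy illustration (not real ICS code) of why manipulated control logic
# can be invisible to operators: the process drifts away from its safe
# band while the reporting layer replays healthy-looking values.

SETPOINT = 1000.0        # nominal value the operator expects to see
SAFE_BAND = 50.0         # tolerated deviation before an alarm

def run_process(steps, attack_bias):
    actual = SETPOINT
    for step in range(steps):
        # Compromised logic nudges the real process away from the setpoint.
        actual += attack_bias
        # Spoofed telemetry: the display is fed the expected value instead
        # of the measured one, so the alarm check never trips.
        displayed = SETPOINT
        alarm = abs(displayed - SETPOINT) > SAFE_BAND
        print(f"step {step}: actual={actual:7.1f} displayed={displayed:7.1f} "
              f"alarm={'YES' if alarm else 'no'}")

if __name__ == "__main__":
    run_process(steps=5, attack_bias=40.0)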



SCOTT TOUSLEY I'm with the science and tech part of DHS. Our office has a budget of about $90 or $100 million a year, so we practice judiciously in trying to get creative in combining funding areas. Our $90 million is probably combined with another $200 million from other sources indirectly. It's very easy for an adversary, with a relatively small investment and motivation to attack, extort or engage in espionage, to have a disproportionately negative impact. How do you defend when almost anything can be attacked? It'll be interesting to see if that attention on the non-tech side, over time, combined with the explosion of everything cyber (physical, internet of things, smart cities, etc.) can connect those two initiatives and mindsets and end up with a greater sense of cultural awareness (in a technical sense) about how to protect an area in your business. So, the business culture and the technical background culture together is a big part of this. Most of this problem is not technical, but organizational—people, social, cultural, even anthropology based. This is a problem of tribes, cultures and a lack of communication. I challenge any university, including this one, to blend proposals and projects with people from other parts of the school that are not the technical parts. I worked with NSF over the years, and I watched them attempt to spark things that combine these fields. The moment the funding ends, these strong, academic cultures sag right back into their old habits and traditions. It takes a long time to change culture. I'm competing for money every day for ways to better detect explosives at airports or cross check records so that somebody passing through the border can get to his job on the other side. DHS has a tremendously large operations support activity because, on any given day, more than a million people cross borders, and a lot of trade comes through. We can't get in the way of that, so things that add efficiencies tend to get funded more easily because the return on investment is demonstrable, where the return on investment in cyber security is not there yet.

First, we are spending money to better benefit a CIO or CISO in our own department, in a dot gov or in the critical infrastructure. We're focused on this in the relatively near term, more near term than DARPA and NSF. We're also an industrial R&D activity that's changing ops today with what we fund.

We have to leverage many different investment areas. A third of what we spend management time on in the cyber security research division at DHS is tech transition and strategy. The effort needs to have an impact as soon as the contract dollar ends. For every proposal, we look very hard at who is at the table to support the transition after the funding ends in order to make the open source part of this process work. The transition strategy is a big part of how we do that so that the investment of every dollar now turns into something that's commercially available years later. We're like the venture capital world, but we operate within the government. We try to find places where to spend money that will have a greater payoff down the road. There are two really interesting challenges you see in most R&D organizations. One is the question of time, deciding whether to invest for just the next year or two in product improvement service or, like DARPA, doing something to demonstrate something now because in 10 or 15 years there will be new businesses around that demonstration. We operate in between those two extremes. So the question of time scale, no matter where you are operating in an RDT&E activity, is always there.

The other problem is the question of lateral scale. We fund efforts in many different technical areas, including software, privacy, identity and cyber physical systems, and we work with specific critical infrastructure sectors, like forensics. DHS is the biggest law enforcement activity in the country and diagnosing the crime data is a large part of the investigation. We are always deciding whether to fund 20 different projects this year at $4 million each or four different projects at $20 million each, depending on the difference in impact between them.

Everything we do is collaborative. We have a lot of influence in one of the major OSTP groups that guides RDT&E investment across the country in the government. One of those OSTP groups is in the health sciences area. Then there's also this pandemic biological threat area, where efforts between HHS, CDC, DHS and military expertise need to be funded. In our case, the national information technology R&D area, we spend a lot of time



working that group so that we know how much energy we are spending on cyber security. Health and Human Services is trying to get into this space also because they realize there are things that they should have been doing and were not. The last thing I'll mention ties back to innovation and venture capital. We've had a lot of success as the executive agent for the Silicon Valley Innovation Program out of DHS, where we were able to invest SBIR-sized funding amounts, $200K at a time, over as many as four cycles. We are trying to fund the demonstration of a new tech area that has governmental applications in parallel with that company's investment and venture money for what that company is already successfully doing commercially, rather than waiting and catching up with a particular government need or requirement in the area. That has been a successful model and it's an example of the fact that, with a lot of energy and intellectual leadership, we've been able to extend the limits of what is usually a risk-averse government in RDT&E. DR. TOM OVERBYE Cyber security is a great area, but if you really want a great career, I would encourage you to go into energy. We have the challenge of powering the 21st century, and beyond, with renewable and sustainable energy. Of the energy infrastructure, electricity is about 40 percent of the pie. We operate the world's most complex machine. Electricity is part of a coupled infrastructure. Certainly, we're coupled with natural gas, but we're very coupled with the cyber infrastructure, too. One of the challenges in doing this work together is that it takes a collaborative effort. As an electrical engineer, I know the grid. I'm not a cyber person, I'm a grid person. We have to work together. I

hope that we can accomplish that as a community. Building these effective interdisciplinary communities takes time. It takes time to get to know each other and the lingo we use. Once I was giving a talk about clouds, and afterwards, I realized that the cyber people thought I was talking about the Cloud. I was talking about those fluffy white things in the sky impacting our solar PV systems. One of my cottage industries is explaining the grid to non-technical professionals. There are a lot of misconceptions about how the electric grid operates. Some people think that if you take out one light bulb in the grid, it's going to collapse, like a string of Christmas tree lights. It's problematic when people who really should know how the grid operates, but don't, are making policy decisions about the grid. It helps to work collaboratively. Another misconception is the notion that our society can go off grid, that with solar on your rooftop you'll be sustainable. That'll work for some, but the vast majority of our society depends on the grid. We need a reliable, trustworthy grid from a cyber point of view and from an operational point of view. The cyber challenge is a big challenge. We have the IT challenges and the challenges at the control system level. I'd advise students to learn about cyber security and about electric power grids, too. We really have very few of those people. We need good test beds to study this because we're dealing with the world's most complex machine. We need to have places where we can play around with test beds that represent the behavior of the grid, but also the coupled cyber infrastructure, including the control system. QUESTION: A question I'm getting as a security person now is about voting. DHS has said

that voting will be a critical infrastructure. How do we adapt to having these new infrastructures added into the mix and get results quickly? The consensus of opinion in the field is that we were lucky that the voting systems held up and worked. ST: Voting is a good example of one of the areas in our country that was built on a federal system. We're fractious, and we are designed that way. Voting is a good case study of how to, without regulating it, facilitate a stronger outcome and work through what needs to be fixed. Security for the census is a problem we should think about. Voting is a challenge, but there have been a lot of good people technically working on it. The challenge is getting over the mountain of implementation, as well as the social, organizational and governmental trust challenges. JW: This affects the control systems world a lot. Each time there is another big hack, control system cyber falls further off the table. When Russia cyber attacked Ukraine again, not many people heard about it because the focus was on voting. Cyber is becoming more important to the Boards, but it's IT cyber, not control system cyber. In 1999, the Olympic Pipeline Company (a joint venture of Shell and Texaco) had a cyber event which resulted in a pipeline rupture. People were killed, people went to jail, and the Olympic Pipeline Company went bankrupt. It was a control system cyber event. That's not the only case where control system cyber has led to bankruptcy. The San Bruno natural gas pipeline rupture was a control system cyber event which was caused by, of all things, wanting to replace an uninterruptible power supply. That was the original cause of the overpressure that broke that pipe.



QUESTION: My question is about the discontinuity between control systems engineering or health informatics and security professionals. How do we get inroads into those communities so we can secure the things that keep people alive and the lights on? JW: My suggestion is that universities should require a course for every engineering discipline on cyber security. Every computer science person should have an intro to control systems from a cyber perspective, because we're growing further apart. The advanced technologies for IT and the advanced technologies for control systems are going in different directions. We need to be in a position to prevent that from happening. TO: As an educator, we're always dealing with the challenges of what to teach the students. It is difficult to come up with classes that address cyber at the level that's useful to the computer scientists and the electrical power engineers, and about the grid at the level that the computer scientist can understand without boring the electrical engineering students. We did have some pretty good success at Illinois in having our power engineers take the intro to cyber security class. ST: This is a good example of something that is substantially cultural in character. Industrial control systems are evolving differently depending upon the industry area. It's a hybrid of known principles, which are more important now than ever, in different industry areas, but also an understanding of what resonates in a social, conversational sense with different industries. There is a need to be adaptive to the communications process. QUESTION: An updated version of the Framework for Improving Critical Infrastructure Cyber

Security was recently released and I would encourage you to read it. If you take this framework and consider the NERC Critical Infrastructure Protection (CIPS), in theory, it should move us to a more secure place. What is the value of the framework and CIPS? JW: In the industrial controls world, cyber is an issue because it was the only risk to reliability and safety that wasn’t examined. Cyber is an issue to be looked at to determine if it can impact or cause a problem. I look at life through a control system engineer’s eyes. Too much, in my opinion, in the world of cyber is being thrown at us from the perspective of IT. People will only take notice if they understand what the impact could be. IT is worried about vulnerabilities. Control systems people are worried about impacts. For example, could a certain event result in a valve opening and releasing a toxic chemical or result in not being able to open a relay, which could then cause equipment damage? That’s a big problem. If it isn’t going to impact reliability or safety, it may not mean much. What’s important is to get the IT and operational world together and understand what each of them means by risk because each of them is defining risk in their own context. We need to understand what it is, especially when we go to the Boards. We need to be together and we haven’t been. QUESTION: In terms of investments and innovation, how do you look at the types of industries out there while at the same time being mindful of industries, like the power industry, that has a lot of legacy solutions that may or may not proceed as fast with innovation and returns?

DT: All industries have legacy software because, by definition, IT does not want to touch all software that's considered infrastructure. They just build on top of it. That creates issues, and that's something that security vendors keep in mind as they build security solutions. In general, when we look at potential investments in security companies, we look at the addressable market. If the security vendor under consideration can sell to FinTech, which we consider to be leading in cyber security, or if they can sell to pharma, or if they can sell to oil and gas, then it is an attractive investment opportunity because the addressable market is really large. We don't treat power any differently than any of the other industries. Most companies invest across industries and sectors. I think that there's room for consulting services and custom solutions that are specifically tailored to the problem of one sector or one customer. That's why there will always be consulting companies building custom code around what are otherwise more generic products. Going back to IT security versus control systems security, it's not realistic for us to expect that every cyber security expert will have the background to understand the complexities and the detailed implications of protecting a power plant. As long as there are processes and organizational structures in place to demand collaboration across these different disciplines and areas of expertise, that will go a long way toward solving the problem. It's keeping these different experts in isolation from each other that ends up creating areas of slack coverage, and that creates problems. TO: One point on the electric industry that a lot of people may not know is that we have very large utilities, but there are also a lot of very small utilities. A



municipal may be serving a village of a thousand people, and that's very different than serving millions of people. That's somewhat reflected in the regulations, but not fully. QUESTION: As an investor in cyber technology for operating systems, where do you want to see the money being invested by the venture capital or the private equity community that could make the most difference to your community? Are the solutions there and they're just not being communicated? Or are these solutions that you believe need to be developed? JW: Most of the security technology has been IT focused (intrusion detection, firewalls, some way of looking at IP packets), but it does not address the control systems themselves, in particular the devices that are called level zero and level one devices. For those that don't know what that means, in the Purdue Reference Model, level zero and level one are the sensors, controllers, and analyzers. These are the things that are monitoring and controlling in real time. They're generally not Windows based. They use proprietary real-time operating systems, or they may be fully embedded. This is across every industry, including DOD. My view of things is that the security industry is too IT focused, so they don't look at those systems that aren't on an IP network. Part of what DARPA is trying to understand is situational awareness at the actual operating layer. Another big issue is monitoring networks, but there are very few that are monitoring the control system network. Intrusion detection is a really tough thing.
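For readers unfamiliar with the Purdue terminology used above, the sketch below lays the levels out as a simple data structure and flags the level 0/1 layer that, as the speaker notes, most IT-oriented monitoring does not see. The example devices and the "monitored" set are illustrative assumptions, not a definitive taxonomy.

# Simplified view of the Purdue Reference Model levels mentioned above.
# Example devices are illustrative; real architectures vary by industry.

PURDUE_LEVELS = {
    0: {"name": "Process / field instrumentation",
        "examples": ["sensors", "actuators", "analyzers"]},
    1: {"name": "Basic control",
        "examples": ["PLCs", "RTUs", "protective relays"]},
    2: {"name": "Area supervisory control",
        "examples": ["HMIs", "SCADA servers"]},
    3: {"name": "Site operations",
        "examples": ["historians", "engineering workstations"]},
    4: {"name": "Enterprise IT",
        "examples": ["ERP", "email", "business applications"]},
}

# Per the discussion above, typical IT security tools watch IP traffic at
# levels 2 and up; levels 0 and 1 often run serial or proprietary
# real-time protocols and go largely unmonitored.
MONITORED_BY_TYPICAL_IT_TOOLS = {2, 3, 4}

if __name__ == "__main__":
    for level, info in sorted(PURDUE_LEVELS.items()):
        gap = "" if level in MONITORED_BY_TYPICAL_IT_TOOLS else "  <-- visibility gap"
        print(f"Level {level}: {info['name']} ({', '.join(info['examples'])}){gap}")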

DT: I happen to be an efficient-market believer: if there is demand, supply follows shortly after. If there really is a critical need to protect industrial control systems in a way different from how IT systems are being protected, then either there are people already doing it or there really isn't that much of a need. If there aren't solutions, then the need probably isn't big enough at this point.

TO: My recommendation is that this is a really great area that students should pursue, and my other recommendation is that test beds need to be built. We have a project going on with DARPA that's going to look at some of these issues.

ST: It's all about collaboration across different organizations and tech transition. A field you want to make sure you're keeping up with is information and Big Data, data flows, and privacy. One of the things we're looking at is the blockchain technology area and what it can and can't do for the Department of Homeland Security operating in a democracy. Looking at all of these tech areas from an information flow and contamination and assurance perspective is something that will keep you busy for decades.

JW: There needs to be collaboration between the IT and the engineering organizations. Two of the big things are going to be Big Data and predictive maintenance (knowing how the systems are working, how much more life remains, and how to run them more efficiently). It's a very complicated and exciting area.

Good guys are at a disadvantage, with a huge surface to protect 100% of the time. Hackers only need one vulnerability to exploit in order to succeed.

DT: Since this was a panel on emerging technologies, I will draw your attention to three technologies that I find exciting as I look at dealing with cyber security challenges in the near future. One is cryptography. Even though the technology has existed for 20 years, it hasn't really been used widely in the market because it is very difficult to implement. Especially now, with data and computing being so distributed and perimeter-less, cryptography becomes even more important. There are advances being made in processing and manipulating data-at-rest that is still encrypted. There are a number of companies doing that ground-breaking technology, which will allow businesses to put their data in multiple applications. The second one is machine learning. We have seen some very exciting innovations when it comes to pairing machine learning with security. Companies, like CrowdStrike, have reinvented antivirus technologies. You may know that Amazon basically doesn't purchase security; they build everything in-house. Amazon is a buyer of technologies like that because they believe they can have 80 to 90 percent coverage on malware using those technologies. The last type of technology that I find exciting is coupling Big Data, and that means collection, analysis, and correlation of data across various domains in order to find security abnormalities.



This space is called security analytics. It is emerging, and it is very interesting because it addresses very complicated problems, like insider threat. An example: an IT person in an organization downloads the design of a power plant. This might not appear to be a problem unless you know that that engineer is disgruntled because, three days earlier, he had a bad performance review. Security data alone cannot flag that as a problem; nothing malicious has been done. However, there is malicious intent. If you pair the security data (you know that this person downloaded the design) with the HR database (you see that the person had a bad performance review), a red flag is raised and you can discover the potential for malicious intent.
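As an editorial illustration of the cross-domain correlation DT describes (not a description of any vendor's product), the sketch below joins a made-up file-access log with a made-up HR events table and raises a flag only when both signals coincide. The field names, data, and 30-day window are assumptions.

```python
"""Toy security-analytics correlation: pair sensitive-file downloads with
recent negative HR events to surface possible insider-threat risk.
All data, field names, and the 30-day window are illustrative assumptions."""
from datetime import date, timedelta

file_events = [
    {"user": "jdoe", "file": "plant_design.dwg", "sensitivity": "high", "when": date(2017, 1, 10)},
    {"user": "asmith", "file": "lunch_menu.pdf", "sensitivity": "low", "when": date(2017, 1, 10)},
]

hr_events = [
    {"user": "jdoe", "event": "poor_performance_review", "when": date(2017, 1, 7)},
]

WINDOW = timedelta(days=30)

def correlate(file_events, hr_events):
    """Yield alerts where a high-sensitivity download follows a negative
    HR event for the same user within the correlation window."""
    for f in file_events:
        if f["sensitivity"] != "high":
            continue
        for h in hr_events:
            if h["user"] == f["user"] and timedelta(0) <= f["when"] - h["when"] <= WINDOW:
                yield {"user": f["user"], "file": f["file"], "reason": h["event"]}

if __name__ == "__main__":
    for alert in correlate(file_events, hr_events):
        print("RED FLAG:", alert)
```

Neither data source alone triggers an alert; the value, as in the panel's example, comes from correlating the two domains.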



PANELISTS

ROBERT (BOB) BUTLER

With over 34 years of experience in intelligence, information technology, and national security in both public and private sectors, Robert Butler is currently Managing Director at Cyber Strategies, LLC. Previously, he served as the Chief Security Officer for IO, a global data center service and product firm. He has consulted as a Special Government Expert to the Office of the Secretary of Defense, the Air Force Scientific Advisory Board and other organizations on cyber security and enterprise risk management. He also serves as a fellow at the Center for a New American Security (CNAS) and is a member of the Texas State Cyber Security, Education and Economic Development Council. Prior to assuming his current roles, Mr. Butler served as the first Deputy Assistant Secretary of Defense for Cyber Policy (August 2009 to August 2011). In this role, Mr. Butler acted as the principal advisor to the Secretary of Defense and other Defense leaders on the development of cyber strategy and policy.

JACQUELYN SCHNEIDER

Jacquelyn G. Schneider is an Instructor in the Center for Naval Warfare Studies and an affiliated faculty member of the Center for Cyber Conflict Studies. Her research focuses on the intersection of technology, national security, and political psychology with a special interest in cyber, unmanned technologies, and Northeast Asia. Her work has appeared in print in the Journal of Conflict Resolution and Strategic Studies Quarterly and online at War on the Rocks, The Washington Post, The National Interest, the Bulletin of the Atomic Scientists, and the Center for a New American Security. Ms. Schneider is an active member of the defense policy community with previous adjunct positions at RAND and the Center for a New American Security. She previously served as an active duty Air Force officer in South Korea and Japan and is currently a reservist assigned to U.S. Cyber Command.

MODERATOR: ANDREW ROSS, DIRECTOR AND PROFESSOR, CERTIFICATE IN NATIONAL SECURITY AFFAIRS

At the Bush School of Government and Public Service, Dr. Andrew L. Ross is director of the National Security Affairs program and a senior fellow in the Institute for Science, Technology and Public Policy. Professor Ross' previous appointment was at the University of New Mexico (UNM), where he served as director of the Center for Science, Technology, and Policy. He led UNM's University Strategic Partnership with the Defense Threat Reduction Agency and served as the program manager for UNM's Educational Partnership Agreement with the Air Force Research Laboratory. Prior to his appointment at UNM in 2005, Professor Ross spent sixteen years at the U.S. Naval War College, where he served as a research professor in the Strategic Research Department (SRD) of the College's Center for Naval Warfare Studies; director of studies, SRD; and director of the College's project Military Transformation and the Defense Industry After Next.


DAY 2 JANUARY 12, 2017 PANEL DISCUSSION

POLICY TO ADDRESS CURRENT AND FUTURE THREATS

INTRODUCTION

ANDREW ROSS

I'd like to thank the four members of our panel, Bob Butler, Jackie Schneider, Scott Tousley, and Erik Gartzke, for joining us. We shared a set of questions with each of the panelists and, based on our discussions thus far, I'd like to add to those questions. First, how do we win? That first means deciding what "win" means. There's been a lot of discussion around the term "asymmetric warfare." Some have objected to the term "warfare," so we can use "asymmetric conflict." Obviously, cyber capabilities are used against us asymmetrically. Second, who's in charge and who should do what across the public and private sectors, including research and academia? Third, if cyber security is fundamentally broken, then cyber security policy must be broken as well. How do we fix it? Fourth, if the return on investment in cyber is not there yet, what will it take to get there and how will we know when we're there? Finally, it has been suggested that in the cyber realm we too often provide 20th century analog responses to 21st century digital challenges. Is that really the case? How do we do better?

SCOTT TOUSLEY

Mr. Tousley is the Deputy Director of the Cyber Security Division (CSD) within the Department of Homeland Security's (DHS) Advanced Research Projects Agency. His responsibilities include organizational liaison, Integrated Product Team (IPT) management and support, and project leadership for educational/operational efforts, such as the Computer Security Incident Response (CSIRT) project. He also supports several initiatives in critical infrastructure protection and cyber-physical systems. Mr. Tousley served 20 years as an Army officer in the Corps of Engineers, many in interagency technology programs. He led the Watch/Warning program in the Federal Bureau of Investigation as part of the National Infrastructure Protection Center (part of the Clinton Administration's early engagement with national cyber security challenges). He previously managed the operations security team for a large internet service provider, was the principal with a technology start-up company in the private sector, and was program manager at the DHS National Cybersecurity Division. He has served nine years with DHS, principally with S&T, but also with the Domestic Nuclear Detection Office and supporting Customs and Border Protection.

ERIK GARTZKE

Erik Gartzke is Professor of Political Science and Director of the Center for Pacific Security at the University of California, San Diego, where he has been a member of the research faculty since 2007. Professor Gartzke's research focuses on war, peace and international institutions. His ongoing project on cyberwar and the automation of military conflict focuses on understanding cybersecurity as an ever-evolving battlefield. This is a natural extension of his interests in nuclear security, the liberal peace, alliances, uncertainty and war, deterrence theory, and the evolving technological nature of interstate conflict. He has written on the effects of commerce, economic development, system structure and climate change on war. Professor Gartzke's research appears in numerous academic journals, including the American Political Science Review, the American Journal of Political Science, the British Journal of Political Science, International Organization, International Security, International Studies Quarterly, the Journal of Conflict Resolution, the Journal of Politics, and World Politics.

PANEL DISCUSSION



ROBERT (BOB) BUTLER

I've worked in the government, 26 years in the Air Force, the last serving as a Deputy Assistant Secretary in space policy and cyber policy. My undergraduate background is in quantitative business methods, and I did a lot of programming and worked in networking early on as an Air Force officer. In business, I've had the opportunity to run a P&L with Computer Sciences Corporation, working on their strategy side and integrating global red teams. Also, I've had the opportunity to work with small entrepreneurs and grow with big anchor partners like Goldman Sachs. That's given me an opportunity to see cyber issues from different angles. The threat is a big problem. When I look at the threat, I break it down into different classes of actors. When we're talking about hackers, we're talking about thousands of dollars being invested to acquire hacking tools or to reuse existing hacking tools. When I step it up to criminals, mercenaries, and some of the smaller states, we're at least an order of magnitude beyond that in terms of dollar investment. Then, with the nation state (what we call in the Defense Science Board a tier 5 or tier 6 threat) there's a lot of concern because we're talking about billions of dollars of investment. It's not just cyber security; it's the integration of all the tradecraft. I worked in the intelligence community, negotiating with the Russians and seeing what the Chinese are doing, and for those nation states there are no limits to what they might do. The Russians and Chinese look at ways to use clandestine means, business means, as well as online network behavior to accomplish their goal. We talk about convergence in this world from a technology perspective, but really, it's

more than that. It’s the convergence of a trade craft. Cover companies do indeed inform online network behavior, and they’re growing from an adversary’s perspective. We have challenges in deterring the cyber threat, and we must keep in mind that the threat is as an element of a risk equation. Risk is a function of dealing with threats, vulnerabilities, and the impact of consequences in the area of cyber deterrence the first challenge is that these high-end nation states are starting to develop significant offensive capabilities. If we try to deter a high-end nation state, we will face the risk of retaliation. There is a lot of asymmetry in what an adversary’s national interests are compared to what our national interests are, and how to move forward between those two different interests. The second challenge with which we must work is terrorists and small nation states use of information technology, threatening us and affecting our critical infrastructure. The third challenge in cyber deterrence revolves around malicious cyber activity or attacks at the lowest level against small public and private entities and organizations, which are being attritted by a thousand hacks. Those are at least three deterrence challenges that we face. In the first area, which is right at the heart of national security, the critical infrastructure and the military and supporting critical infrastructure need to be made more resilient. Today, our most effective way of deterring a high end nation state is through the threat of nuclear weapons. The fact that we have them and can use them deters Russia. Beyond the threat of nuclear cost imposition, deterrence is enabled by ensuring the resilience of national security command and control to include nuclear C2, integrated missile

In the first area, which is right at the heart of national security, the critical infrastructure and the military and supporting critical infrastructure need to be made more resilient. Today, our most effective way of deterring a high-end nation state is through the threat of nuclear weapons. The fact that we have them and can use them deters Russia. Beyond the threat of nuclear cost imposition, deterrence is enabled by ensuring the resilience of national security command and control, to include nuclear C2, integrated missile defense, and thin-line force projection, as well as the dependencies that those missions have. We're working hard to close risk exposure in this space. In the second area, critical infrastructure protection and resilience need to be improved. We have many critical infrastructures, so a determination needs to be made about which have the highest priority. The electric grid and communications are certainly two very important critical infrastructures. There is a bigger superset, but no matter the number of critical infrastructures, they do not all have the same level of importance. We need to focus on the most important and accelerate our efforts to make these infrastructures more resilient. In the third area, we need to find a way to knock down crime. In 1998, I was the DoD representative in the first negotiation with the Russians on cyber arms control, both at the bilateral level and multilaterally, and we weren't ready. As we worked through that issue, we realized the Russians were buying time and trying to find out more about what we were doing. We just did not have a good strategy. Within a year, we were able to transition the international discussion from one about cyber arms control to one about cyber crime, which eventually led to the Budapest Convention (back in 2002 to 2003). Today, we need to not only create norms, but also the mechanisms to enforce norms for improving cyberspace. A few years back we started with the FBI on botnet takedown campaigns. Working with Interpol, we have scaled these activities into global botnet takedown campaigns, but we need to do more. In conclusion, I wanted to offer some guiding principles about deterrence in cyberspace. Cyber deterrence includes deterrence by denial, resilience, and defenses,



as well as cost imposition. In the case of cost imposition, it comes down to not just looking at Russia, China, or Iran, but looking at the adversary's leadership. It is about perception and calculus on the part of the decision maker, especially for countries that are not representational democracies but are autocratic. We need to understand what they value, figure out what to do about it, and then communicate a threshold which, if exceeded, will result in a response of imposing cost. In the area of response to a critical attack, it's not a question of whether we're going to respond; it is how we're going to respond. It definitely is more than cyber. It is about influence and shaping. We have a strong military establishment to protect American values and national interests. We don't want to go to war because it's expensive, and it doesn't reflect the image that we want to portray as Americans. So, we are using flexible deterrent options as we prepare flexible response options. Thinking about the risk of escalation from the adversary's perspective, a failure to respond with some type of deterrent activity actually creates a worse situation for us down the road, as you can see in some of the activities that have been going on. There hasn't been a lot to deter Russia, and they continue to push.

JACQUELYN SCHNEIDER

I'm coming from the Naval War College, but these views are my own, and they do not represent the views of the Naval War College or the U.S. Navy. In terms of deterrence theory, the core precept is that in order to deter, we must both increase the cost of attack and decrease the benefit of attack. It's a simple mathematical equation, and we can do that through both denying the adversary access and also creating

punishment options. While it’s a simple mathematical concept, the problem is influencing the perceptions and cognition of the adversary. How do we signal the capability to punish or deny, and how does the adversary perceive that signal? What creates costs for the adversary? These become complex questions to what should be a logical problem and nuances about vulnerabilities, escalations, and tailored versus general deterrence. The important thing for U.S. policy is what, who, and how am I deterring? The Obama administration has tried to tackle these questions and, in December of 2015, put out a solution set that summarizes the different concepts they have chosen to implement to solve this problem. The U.S. State Department views these policies as a whole government approach that includes diplomatic, informational, military, economic, intelligence, and law enforcement. That’s important because the response may not actually be within the cyberspace domain. We’ve had a mix of deterrence by denial (the State Department calls it deterrence through cost imposition) and deterrence by punishment. By denial, I mean increasing the cost that the adversary has to attack our critical infrastructure. There are public-private partnerships through DHS, the sharing of threat information through DHS’ N-Kick, and there have been a series of executive orders about improving critical infrastructure cyber security, and a series of different initiatives by Cyber Command, DISA, NSA, and other DoD to secure the DoDIN (the DoD infrastructure). Most significantly, the Obama administration listed the critical infrastructures. According to the PPD-21, which is the presidential directive that delineates critical infrastructure, there are

chemical, commercial facilities, communications, critical manufacturing, dams, defense industrial base, emergency services, energy, financial services, food and agriculture, government facilities, healthcare and public health, information technology, nuclear reactors, materials and waste, transportation systems, water and wastewater systems, entertainment, and the electoral system. What is not critical infrastructure at this point? This makes the problem of deterrence much larger because we're trying to deter attacks on every vector of what is American society. The Obama Administration has created a series of punishment alternatives for deterrence by punishment. The first and most prolific, and the only one they've used, is economic sanctions. The economic sanctions came out after the Sony attack and gave Treasury the authority to implement sanctions against nations that are harboring individuals who conduct attacks against critical infrastructure, as well as against individuals within those nations. Interestingly enough, the critical infrastructure list that the Obama administration created originally did not include the electoral system. In order to sanction the Russians, they had to put out an addendum to the executive order. The Obama administration has also been working with the DOJ to put out warrants, which may have had a significant effect on discussions with Xi Jinping and the Chinese about the exploitation of intellectual property. Finally, there is the threat of offensive cyber operations. The Department of Defense cyberspace policy states that "the United States will continue to respond to cyber attacks against U.S. interests at a time and manner, in a place of our choosing, using appropriate



instruments of U.S. power, and in accordance with applicable law," which means that the entire DoD arsenal is at the disposal of the president in order to punish those who choose to attack our critical infrastructure. Over the last few years, Cyber Command has stood up a series of teams within the cyber mission force that have offensive capabilities against our adversaries. The Obama Administration decided that it is more important to have an ambiguous policy as opposed to a declaratory policy. The Administration says that they would like a nuanced and graduated declaratory policy about defense, but they would like to remain ambiguous in terms of punishment, because declaring thresholds for response and consequences to discourage preemption or malicious cyber activities would allow adversaries to take action just below the threshold for response. They have focused instead on norm creation. The primary norm that matters to critical infrastructure is that states should not conduct or knowingly support online activity that intentionally damages critical infrastructure. Although much has been done, the general consensus is that we don't have policy that works, and these attacks could happen at any minute. The question becomes: what are the challenges to this policy, and is it working? We have enumerated a series of different critical infrastructure sectors, and then we have added to it as we've realized that we didn't enumerate enough that mattered to us. This is a huge problem for deterrence policy because the adversary doesn't know where we stand. There's a significant question about whether network exploitation is important to deter versus attack. Should the U.S. be in the business of deterring the OPM hack, or should we think about that more as spy versus spy?

There’s a complexity of the dual use nature of this infrastructure because, while we have identified it within our norms of civilian infrastructure, this infrastructure also helps the Department of Defense. It’s difficult to completely bifurcate civilian infrastructure from defense infrastructure. If our norms state that we shouldn’t attack civilian infrastructure, does that mean the DoD should not be attacking adversary infrastructure? These voluntary norms may be effective when there is no crisis, but today the difference between peace time and conflict is unclear. So, when a crisis occurs, what really is valid in terms of deterrence? There’s also a significant limit of deterrence by denial which is the actual technical capability to deny these types of attacks. Huge complexity exists in public-private partnerships and in the U.S. laws about who can do what in terms of defense. The 1880’s law of Posse Comitatus has limited the amount of effort that the Department of Defense can take in order to protect much of the critical infrastructure in the United States. There are also limits to deterrence by punishment. In addition to attribution concerns, there are concerns about proportionality, escalation, as well as the timeliness of these responses. All of this is driven by a perception of asymmetric vulnerability. We in the United States have a perception that we are more vulnerable to these attacks than our adversaries which creates hurdles for norm creation and to deterrence by both denial and by punishment. These are all perceptions. An accurate metric of the United States’ cyber vulnerabilities for its critical infrastructure does not exist. The good news is that the Chinese and Russians don’t have

it either, and they both also recognize that they are vulnerable to attacks. By influencing our adversaries to believe that there is mutual vulnerability within this domain, we might be able to gain an advantage. We don't have to change the truth about the vulnerabilities; we just need to change the perceptions about them. Thinking more about influence operations and how they affect deterrence will be useful in the future. We need to be specific about what we want to deter. We have too many critical infrastructure sectors that we say we care about, and the line where we're willing to commit punishment and force needs to be more clearly declared. I am not sure deterrence is the right word that we should be looking at. We need to reexamine words like containment, and think about what containment could mean for safeguarding critical infrastructure. The best solution to all of this is resilience. If we could create more resilient systems, many of the problems with theory and policy would be less of a problem.

SCOTT TOUSLEY

I'm Scott Tousley with DHS. In recent years, DHS has wrestled with the topic of public-private partnerships. With one small exception, we are the only non-regulatory government entity working with infrastructure sectors. Working the partnership in an industry where tech, organizational competition, profits and globalization are constantly changing is difficult when trying to collaborate in productive ways while being mindful of potential regulation. From the perspective of DHS, public-private partnership in critical infrastructure and cyber security is an organizational leadership challenge. That's one policy element that will be there for years to come.



There are many cases where what we're trying to secure (because it's connected and digitized) is inherently similar, in a technical sense, to the same system's safety condition. Complex system safety and security are somewhat similar, but they're not quite the same. There's a long history in the United States of fairly productive no-fault technical investigation. Monday morning at most major hospitals there are surgery discussions amongst peers, which improve the quality of surgery and surgical operations because people who made mistakes learn how to fix them. The biggest improvement the military has made in the last 30 years has nothing to do with the equipment; it is the training system and the constant critique amongst peers. The National Transportation Safety Board's job is not to support lawsuits, but to fix systemic issues. How can we improve cyber security through that same mentality and collaboration? If I were in charge of a major business, I would be looking at one of my peers to critique me and help me improve, even if that meant I would risk some competitiveness. I'd do the same for them. Most of what we're talking about in cyber security is win-win, and there really isn't an angle of competitiveness. The no-fault lessons-learned idea, from a technical policy implementation perspective, is worth pursuing. A piece of legislation called the Safety Act was passed when DHS was formed. The Safety Act allowed companies to strengthen their terrorism protection condition by bringing over systems software capabilities (from the military side, in most cases) and deploying them in commercial or civil environments. The companies were able to still factor that into their insurance and risk condition. That has worked very successfully out of DHS (actually the Science and Tech

organization), to find ways to support tech and its movement into the homeland space from mostly the military space. Then, the cyber problems emerged and grew substantially, but the legislation never spoke of cyber. Cyber is a chronic problem, but the Safety Act was designed for an acute problem (partial governmental indemnification). There's a lot of interest in industry in getting government help providing underwriting so that private entities can go to the insurance market and get better insurance coverage. Some of the better companies in our economy figured out that there is more economic value in the information that flows through their systems than there is in the systems themselves. It is unclear where we're going to end up as a country with the monetization of information that is processed in my company, taking into consideration privacy restrictions and information flows. I'm not talking about data-at-rest and cracking a database. I'm talking about suddenly one morning everything Goldman Sachs is reporting to the market being suspect. Those attacks will happen. Unfortunately, the criminal enterprises aren't going away. From a policy and practice perspective, handling the protection of information in motion is a challenge, not so much technically, but in figuring out how to design the right technical practice and policy in a democracy at massive scale and speed. Europeans generally trust their governments with all sorts of information, but they do not trust their companies. In the United States, it's exactly the opposite. Moving forward, the tension between the two approaches is going to cause more friction as information flow and the monetization of information flows become a larger proportion of the economy.
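To make the information-in-motion concern concrete, here is a minimal editorial sketch (not a DHS or panel recommendation) of authenticating a reported message with an HMAC so that tampering in transit is detectable. The shared key, message format, and transport are assumptions; real market-data feeds layer this kind of integrity protection into broader key management and signing infrastructure.

```python
"""Sketch: detect tampering with information in motion by attaching an HMAC.
The shared key, message format, and transport are illustrative assumptions."""
import hashlib
import hmac

SHARED_KEY = b"rotate-me-out-of-band"  # in practice, managed by a KMS or HSM

def sign(message: bytes) -> str:
    """Sender attaches this tag to each outbound report."""
    return hmac.new(SHARED_KEY, message, hashlib.sha256).hexdigest()

def verify(message: bytes, tag: str) -> bool:
    """Receiver recomputes the tag; constant-time compare avoids timing leaks."""
    return hmac.compare_digest(sign(message), tag)

if __name__ == "__main__":
    report = b"Q4 revenue: 8.17B USD"
    tag = sign(report)
    print("clean feed verifies:", verify(report, tag))                        # True
    print("tampered feed verifies:", verify(b"Q4 revenue: 9.99B USD", tag))   # False
```

The design point matches Tousley's framing: the hard part is not the cryptographic primitive but deciding, at scale and across organizations, who holds keys and who is trusted to verify.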

From a DHS and a technical perspective, there are many difficult and interesting policy implementation challenges.

ERIK GARTZKE

I'm a political scientist, and I know about security and the logic of politics in the domain of conflict. Clausewitz said that war is a continuation of politics by other means. Cyber war is probably the continuation of politics by other means, too. While technology may have transformed the way in which we interact and, by extension, the way we interact in a conflictual manner, it probably hasn't transformed the reasons why we interact in a conflictual manner. From my perspective, in an area where the data is hard to come by and somewhat suspect and the topic itself is very complex and multifaceted, one area where we can build structure and understanding is in the human and social nature of the process. Most of us are living in more densely populated places than where our ancestors lived. Wherever that is, chances are there are more people there today than previously, as the Earth's population is now 7 billion people. What security apparatus do we have against the inherent capability of others to do us harm? There are 6.999999 billion other people in the world who can do us harm, and there is nothing to stop them. You might have a positive view of human nature and think that this won't happen, but it is a capability. The pure presence of a capability is not enough to tell us what's going to happen in the world. It's necessary to understand the technology, but it's helpful to also understand the motivations and the nature of the processes that might use that technology, in different settings, to understand what's going to or not going



to happen. There are a lot of people in the world who can do us harm. Most won't, for any number of reasons, including that they've recognized the futility of doing so. We don't see nearly as much violence between individuals as we could potentially, given the capability to do so. An interesting question to ask is why, if we have so much potential, is so little actualized? What does the difference between these two things do to allow us to determine who's going to do what to whom, when, and why (in this case, through the internet, cyber systems or our networks of utilities)? There are many people explaining how our increasing interconnectedness creates the potential for somebody else to do harm to the things that we depend upon and care about, and that's important. What I want to add to that is to ask when it makes sense for them to do that, and when it would be futile. Imagine that an asset that a foreign entity, like a terrorist group, wants to target in the United States is a radar control system at a commercial airport (maybe for air defense systems, like antimissile defense systems). Or, the Israelis might want to turn off the Syrian radars before they potentially attack the Syrian nuclear facility to make it easier for them to be successful. The important point in the Israeli example, if in fact it happened, is that it wasn't a cyber attack by itself. Most often, cyber attacks become useful as complements to other kinds of political or military violence, not as substitutes for them. For example, a cyber attack would be useful to turn off an opponent's radars before the planes targeting the opponent's reactors arrive. But, if their reactors can't be targeted because there is no ability to do so, then turning off their radars

doesn’t serve any purpose other than altering them to a potential attack. In the study of international security, we distinguish between two basic kinds of violence. One is taking things from people, called denial or conquest. It is physically imposing a state of the world that’s not the one that was there before and preventing somebody else from changing things. In cyber, this is tricky. A lot of what can be done in cyber is creating conditions in which some other capability can be used to impose that state of the world (e.g., turning off radar systems), making it that much easier for the kinetic things to occur successfully. If those kinetic things cannot occur, then it doesn’t make sense to do the first part. If a foreign power turned off some aspect of U.S. infrastructure, are they going to invade next? If we think that’s not very likely, then their ability to influence the first part of this through the internet is not nearly as useful, in practical terms, as it is a capability. It exists, but it’s not necessarily going to be actualized. We all live under the threat of this kind of denial. Any time in the last 60 years, we could have experienced a nuclear attack. The capacity to do harm is much more ubiquitous than the practice of harm. Understanding when the practice of harm makes sense is a very useful. The other thing you can do with violence is you can punish somebody until they choose to change their behavior. I’m not going to change your behavior, but I’m going to make you miserable until you change it for me. A bombing campaign, for example, does this. There are a lot of ways in which we use punishment as a strategy either in prospect to deter or compel, or in retrospect (ending the punishment when the unwanted action stops or the wanted action starts).

Again, cyber is tricky. In prospect, how do you convince an adversary that you're able to attack them through the internet or cyber systems in a way that's going to both be sufficient to compel them and be credible? I could tell you that I am going to attack you through the internet, but you know I'm a political scientist, so it's not a credible threat. Suppose somebody's talking to you who might actually be able to do this. The problem that you are both faced with is that there are reasons why people, organizations or countries might lie about being able and willing to do this. They have to prove to you that they can do this, and, in so doing, they reduce their ability to do it as effectively. If they show you the exploit of your vulnerability, then you have an opportunity to patch that exploit, or react to it, and at the very least you'll be better prepared if in fact they carry this out. This implies that the way that you use these things is more often in retrospect. You can't threaten others as often, but you can use it as part of a more elaborate attack. The use of cyber for conflict is, therefore, disproportionately useful for actors who are disproportionately capable. QUESTION: There's a great analogy with rational nation states, but when dealing with an adversary who has cyber capability and is not rational or constrained by what we would consider rational behavior (ISIS is in that category), the method of deterrence is no longer effective. There's no threat that you have over them. Also, what do you think about Stuxnet as an application of cyber with no follow-on kinetic activity, as compared to Georgia, where there was a very connected cyber and kinetic attack?



DAY 2 JANUARY 12, 2017 PANEL DISCUSSION EG: I’m going to assume that individuals in a high stakes environment have objectives, and they seek to realize them. They may do it poorly, but they’re trying to get somewhere. I perceive that ISIS has a doctrine and agenda, and they’re seeking to realize it. There’s this classical reductive logic that states if a terrorist can’t be deterred, the terrorist must be destroyed. But, if you can destroy them (or can convince them that you can destroy them), then you can deter them. Presumably the terrorist thinks that he is going to suffer a cost, but he is also going to achieve an objective. If you could absolutely convince him that he won’t achieve his goal, then all his motivation to take high risks is gone, and potentially he will quit. Right now, with ISIS, people are quitting the organization because they’re being less successful. Stuxnet is informative on two levels. One, it’s this attempt to be very subtle because the problem in Stuxnet would be that if it worked too well the Iranians would know immediately they had a problem. That’s part of the paradox: we want to do things that fundamentally affect our adversaries, but in cyber we have to be careful about how we do that. The Iranian response was to be so angry that they retaliated against the Saudis, even though they did not know if it was the Iranians, the Israelis or the Americans to blame. The Iranians understood that, because the U.S. had escalation dominance, picking a bigger kinetic fight with the U.S. would have turned out badly, and they ultimately accepted Stuxnet and moved on. That’s where this capability becomes functional and useful. It’s having a gun at a knife fight that makes you important. QUESTION: A lot of discussion has been about external threats, Russia and China. But an insider threat can also result in huge

losses. A Snowden-type incident can result in intellectual property loss at estimates anywhere from $500 million to a billion dollars. Is it more of an external or internal threat game? My second question is about deterrence. Is it the panel’s position that if there’s a hacking incident where Russia or China targets us and the potential deterrence is the imposition of embargoes, then we might be dragged along and suffer even worse because of the embargo? BB: The issue is about another aspect of resilience. How do we build counterintelligence, and how does that become part of an adaptive resilience strategy going forward? It’s hard because of the human factor. There are tools that can help us, but one thing that we typically don’t think about (especially when we’re in comfortable areas where everyone is cleared and has the same accesses) is that not everybody thinks the same way. Building counterintelligence strategies is important as another aspect of deterrence by denial and adaptive resilience. From a U.S. based multinational global perspective, we need to find ways to get likeminded partners together, not only in terms of heads of state, but also in industry. I advocate for an international cyber stability board. In the United States, successful partnerships have been created, and we should try to scale those models internationally. The financial services ISAC has scaled globally. The Enduring Security Framework goes well beyond information sharing into solutioning, with limited liability protections on both sides and leveraging the partner strengths. If we could do that on an international scale, we could begin to get the perspectives you’re talking about. It’s getting the “coalition of the

willing” together and taking advantage of the competencies within the group. EG: Cyber is fundamentally an informational domain and the kinds of conflict that will occur in that domain will be fundamentally informational, as well. The real play at the international level is espionage. There’s much to be gained by getting inside your adversary’s systems and learning about them. Strategic espionage during the 20th century is striking. The Second World War is a good example. Everybody was listening in on everybody else’s conversations at one point or another. In the First World War, the British left the Atlantic cable open to the Germans so the Germans could communicate with their agents in North America, in large part so that they could listen in on the conversation. It’s more valuable to know what the enemy’s doing than to stop them from doing it. The stability-instability paradox, which became important during the cold war, works in reverse in cyber. It’s maybe an explanation for why we’re worried about these big peak cyber attacks, a “cyber Pearl Harbor.” We have this flood of really low level stuff, and relatively little that looks big. We don’t know what the distribution looks like, but a lot of low intensity stuff would be consistent with the phenomenon that we think is true about the internet, which is it’s informational and fundamentally offense dominant. JS: The insider threat is a challenge more for the government because businesses are already dealing with insiders (people who are hired then leave and take direct or indirect IP with them). Businesses are used to the judgment of how open to be, how to protect what they’re doing, and with whom to sign an alliance. The government still tends to fight each other over budget investment



lines, so we aren't used to making those judgments. The other thing to remember is that this is so inherently flat, and it's so easy to get access and start impersonating somebody. It's only going to get easier. JS: I want to make one remark about Snowden and deterrence. There is one externality of what Snowden did that is potentially beneficial to the United States' cyber deterrence policy. It has been very difficult in the past for the United States to signal its capabilities in cyberspace, because it's classified, but now the whole world knows we are good at it. That bolsters the United States' credibility in terms of cyber deterrence. QUESTION: What about nontechnical policies like insurance? Would that be likely to help with the cyber security threat? ST: We're seeing it grow and emerge. Most of the insurance world is looking at it in a venture sense within those industries. But, what does it look like from a profit potential perspective? Now more companies are trying to look at the insurance they need beyond the straightforward breach questions. It's being actively explored, and early coverages are being written. It's a pretty fertile area from a business creation and development sense. It will be interesting to see where the newer information control tech goes, like blockchain. BB: When I was with the IO data center company and served as their Chief Security Officer, we had to buy cyber risk insurance. The challenge was we were always looking back at the last breach to determine coverage amounts. The actuarial data model holds up to a point, but it is not the kind of valuation that's needed going forward. An interesting area people are exploring, while working with telcos and financial service


companies, is “big data” analytics to help better predict (rather than just “look back”) at what coverages should be. QUESTION: During the interwar period, the period between World War I and World War II, there was an explosion of innovation. How people perceived their day-today life changed entirely because of the environment during that time period. Then, fast forward to today. With the policies that are in place now and the encouragement that we saw with the current administration on innovations, have we gone far enough to encourage, as a country, the innovation in cyber security? Are the policies for the publicprivate partnerships that enabled us during World War II in place today so that we can have free flow of information exchange of capabilities and capacities that we can leverage between government and the private sector, considering all of the regulatory requirements? What I’m alluding to is that we’re making it really hard to do that. What’s the policy that will enable us to partner with the public and private sector? JS: Innovation is a huge theme right now within the U.S. Government. When you look at the technologies that are coming out of Secretary of Defense Carter led innovation initiatives (e.g., AI, drones, network warfare), there’s a focus on offensive capabilities and utilization of digital technologies. Where we’ve seen less is innovation in terms of defending the capabilities that we’re creating with the other innovations. DIUx is working on this at some level working with partners in Silicon Valley to create solutions for network defense for the DoD. An asset that we haven’t seen used much within the Department of Defense

is the individuals who are not information or cyber warriors but the fighter pilots, tank drivers, and infantry personnel. Giving them more agency over their own digital capabilities means they understand their cyber terrain and can be integrated at a much lower level in terms of network defense. As we look at where innovation goes after Carter, we need to focus more, in terms of cyber, on both defense and warfighter-led innovation within the Department of Defense in cyberspace. ST: The issue isn't that the policy hasn't gone far enough; it's more a cultural issue. During the interwar years and World War II, this country was not globalized in the sense that it is now. It was a military culture of hardened survivors who had been bonded together in a tough time, and the entire country had come through the Depression. Now, the world is chaotic and nobody is certain about the idea of networking the world together. Partnerships work when partners want them to work. It takes energy to push against the tendency for the systems to relax back into the way that things have been. Changing the culture inherently takes time. EG: Another alternative to deterrence, which has always existed in military affairs and strategy, is deception. Cyber is a particularly useful domain in which to practice deception. We're worried about it when hackers do it, but conceptually, defenders can do it too. Imagine a Trojan horse that, instead of requiring somebody to click on it, sits in somebody's data archives, and, if an attacker penetrates, he has to choose between potentially taking malicious code back to his own servers or not taking that data. It weakens the attacker's ability.
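As a purely editorial sketch of the defensive-deception idea Gartzke describes, the script below plants a decoy ("canary") file and watches its access time; any read of the decoy is, by construction, suspicious. The file name, marker format, and polling approach are assumptions, and access-time tracking depends on filesystem mount options (e.g., relatime), so real deployments typically rely on beaconing documents or audit logging instead.

```python
"""Sketch of a defensive decoy file: nobody has a legitimate reason to open
it, so any access is worth an alert. Paths, marker format, and polling
interval are illustrative assumptions; production systems usually rely on
audit logs or beaconing tokens rather than atime polling."""
import os
import time
import uuid

DECOY_PATH = "finance_passwords_backup.txt"  # hypothetical tempting name

def plant_decoy() -> str:
    """Write the decoy with a unique marker that ties any later alert
    (or any copy found elsewhere) back to this specific file."""
    marker = str(uuid.uuid4())
    with open(DECOY_PATH, "w") as f:
        f.write(f"decoy-marker:{marker}\n")
    return marker

def watch(poll_seconds: float = 5.0, cycles: int = 3) -> None:
    """Poll the decoy's access time and alert when it changes."""
    baseline = os.stat(DECOY_PATH).st_atime
    for _ in range(cycles):  # a real watcher would run indefinitely
        time.sleep(poll_seconds)
        atime = os.stat(DECOY_PATH).st_atime
        if atime > baseline:
            print(f"ALERT: decoy {DECOY_PATH} was accessed")
            baseline = atime

if __name__ == "__main__":
    plant_decoy()
    watch()
```

The attraction Gartzke notes carries over directly: because the decoy has no legitimate users, an alert on it largely sidesteps the attribution-of-intent problem that plagues alerts on real assets.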



There are practical and policy problems with implementing this, like hack-backs and attempts by corporations to punish hackers. But the positive thing about this is that the attribution problem is solved, because the only way you can get that malicious code is by breaking into a server where you shouldn't be. BB: There's a piece out now on disruptive innovation which I recommend to all of you. Craig Wiener at George Mason University put it together. Craig's work highlights forward-leaning work in information warfare and cyberspace activities that was taking place in the Defense Department during the 1980s and 1990s, especially as the NSA realized they were going dark in an analog world. These early IW and cyber "pioneers" were actually quite successful in aligning vision, strategy and resources in a sustained campaign of innovation. Looking forward, there is a need to focus on unity of effort by bringing all the players together and doing national security campaign planning. It's not just cyber; it's about cyber-enabled campaign planning, and we need to strive for much better synchronization of efforts across the government agencies and in the private sector, along with allies. We need to work hard on creating cyber resilience in a prioritized way. I focus on where the thin line is in the military, but we need to also build foundational capabilities inside of critical infrastructure resilience. We need to grow anticipatory intelligence and attribution capabilities. We need to work on extended deterrence with our allies.

These challenges present a great opportunity for A&M as a university. The engineering school should think about growing ways to build inherently secure processors at one level, and how to create secure and safe infrastructure at another level. Additionally, the school can think about how to retrofit legacy infrastructure and build new infrastructure. The Mays Business School should think about concepts that we need in enterprise risk assessment, security economics, and captive and self-insurance, as well as how to incentivize public-private partnerships. The Bush School can examine the norms around the international cyber stability construct. From a political-legal perspective, there are great opportunities to reinterpret existing statutes, such as the Defense Production Act, the FAST Act, and the Critical Infrastructure Partnership Advisory Council (CIPAC), to help strengthen our cyber posture. Finally, from a national security perspective, we need to significantly grow our cyber workforce and manage this talent pool. For example, we have opportunities with the National Guard Bureau to develop a highly skilled workforce across public and private sectors. Innovation in cyber talent management is another area where I think A&M could make a significant positive difference for the country.

ST: There's a point that hasn't come up that is useful to keep in mind. There are, by my count, about 2 billion people in this world who live in societies that are controlled, and the technology is agnostic. It'll go anywhere, and it's being used everywhere. A lot of us have some history in direct military conflict or nuclear deterrence or something else symmetric in some degree, fashion, or form, but in this case, we have a few very high-end criminal gangs (state connected and sponsored) successfully fighting thousands of small operations across the United States and other places in the world. We're a family of small operations, and we're all doing it on our own. The problem that worries me is that the sanction is social and cultural. Throughout the world, those 2 billion people are governed by states that aim to protect the people in their own societies in their own way. That is a fundamentally different characteristic, and it could turn out to be pretty important, compared to what we're used to dealing with in a symmetric way.


CYBERSECURITY OF CRITICAL INFRASTRUCTURE 2017 SUMMIT PROCEEDINGS JANUARY 11 – 13, 2017 ANNENBERG PRESIDENTIAL CONFERENCE CENTER TEXAS A&M UNIVERSITY

Founded in 1997, the Bush School of Government and Public Service has become one of the leading public and international affairs graduate schools in the nation, offering master’s level education for students aspiring to careers in public service. The School’s philosophy is based on the belief of its founder, George H.W. Bush, that public service is a noble calling—a belief that continues to shape all aspects of the curriculum, research, and student experience.

HAGLER INSTITUTE FOR ADVANCED STUDY

at Texas A&M University

The Hagler Institute for Advanced Study at Texas A&M University (formerly the Texas A&M University Institute for Advanced Study) provides a catalyst to enrich the intellectual climate and educational experiences at Texas A&M.

TEXAS A&M ENGINEERING / TEXAS A&M ENGINEERING EXPERIMENT STATION

The A&M System Engineering Program comprises the Texas A&M University College of Engineering and three state engineering agencies: the Texas A&M Engineering Extension Service (TEEX), the Texas A&M Transportation Institute (TTI), and the Texas A&M Engineering Experiment Station (TEES). TEES conducts research to provide practical answers to critical state and national needs which addresses major issues in all transportation modes (including surface, air, pipeline, water and rail), as well as policy, economic, finance, environmental, safety and security concerns.

Texas A&M University has a long history of providing information assurance and cybersecurity educational opportunities, dating back to the mid-1990s. The combination of great faculty, great facilities and great students ensures that A&M will soon move to the very forefront of cybersecurity research and education. The Texas A&M Cybersecurity Center seeks to advance the collective cybersecurity knowledge, capabilities and practices through ground-breaking research, novel and innovative cybersecurity education, and mutually beneficial academic, governmental and commercial partnerships.



OUR MISSION

THE TEXAS A&M CYBERSECURITY CENTER IS COMMITTED TO THE FACILITATION OF GROUND-BREAKING RESEARCH, THE DEVELOPMENT OF INNOVATIVE EDUCATIONAL METHODS, AND THE ESTABLISHMENT OF MUTUALLY BENEFICIAL RELATIONSHIPS WITH A WIDE VARIETY OF COMMERCIAL, GOVERNMENTAL, AND ACADEMIC PARTNERS.

TEXAS A&M CYBERSECURITY CENTER
Dr. Daniel Ragsdale, Director and Professor of Practice
Dr. Robert "Trez" Jones, Assistant Director
Jennifer Cutler, Program Coordinator
Kelsey Henderson, Student Worker
Nate Graf, Student Worker
Jonathan Grimes, Student Worker

HAGLER INSTITUTE FOR ADVANCED STUDY
Dr. John Junkins, Director and Distinguished Professor
Dr. Clifford Fry, Associate Director

TEXAS A&M ENGINEERING / TEXAS A&M ENGINEERING EXTENSION SERVICES
Dr. Katherine Banks, Vice Chancellor, Dean of Engineering
Dr. Dimitris Lagoudas, Senior Associate Dean for Research, Deputy Director of TEES
Dr. N.K. Anand, Executive Associate Dean of Engineering, Associate Director of TEES
Dr. Narashimha Reddy, Associate Agency Director for Strategic Initiatives and Centers for TEES
Dr. Cindy Lawley, Assistant Agency Director for Workforce Development, TEES
Charles Hill, TEES Director for National Security Initiative
Dr. Mladen Kezunovic, Director of the Smart Grid Center
Dr. James Wall, Executive Director of Texas Center for Applied Technology
Dr. Eleftherios Lakovou, Director of Manufacturing and Logistics Innovation Initiatives
Dr. Tom Overbye, Professor, Department of Electrical and Computer Engineering
Katherine Davis, Professor, Department of Electrical and Computer Engineering


Marilyn Martell, Senior Assistant Vice Chancellor for Marketing and Communications
Brian Blake, Director of Communications
Amy Halbert, Assistant Director of Communications
Aubrey Bloom, Media Relations Coordinator
Dedra Nevill, Director of Donor Relations and Events
Rachael Rigelsky, Event Coordinator
Dr. Brian Conroy, TEES Director of Industry Relations
Norman Garza Jr., Assistant Vice Chancellor for Government Relations
Julia Pierko, Director of Government Initiatives
Kristle Comley, TEES Program Specialist II
Colton Riedel, Bradley Fellow
Ryan Vrecenar, Bradley Fellow
Tyler Holmes, Student
Andrew Meserole, Student

BUSH SCHOOL
Mark Welsh, Dean
Jennifer Parks, Program Coordinator
Dr. Andrew Ross, Professor, Department of International Affairs
Dr. Danny Davis, Senior Lecturer, Department of Public Service and Administration
Kelsey Sutter, Graduate Student
Emily Otto, Graduate Student

ANNENBERG PRESIDENTIAL CONFERENCE CENTER
Sarah Chrastecky, Interim Director
Jamie Burns, Assistant Manager
Michael Johnson, Audiovisual Technician
Ashton Wood, Event Assistant

TEXAS A&M FOUNDATION
Tyson Voelkle, President
Lynn Schlemeyer, Vice President of Development Support
Rori Brownlow, Administrative Assistant to the Vice President for Development Support

Palo Alto Networks: Rick Howard, Chief Security Officer
Redseal Inc.: Ray Rothrock, Chief Executive Officer
Cyber Strategies LLC: Bob Butler, Co-Founder and Managing Director
United States Military Academy at West Point: Colonel J. Carlos Vega, Chief of Outreach for the Army Cyber Institute
Information Trust Institute: Edmond Rogers, Smart Grid Security Engineer
BLACKOPS Partners Corporation: Casey Fleming, Chairman and CEO


