Crowdsourcing Intelligence
Thomas Baracos K1183412
M.A. Intelligence and International Security, Department of War Studies, King’s College London
Submitted: 31 August 2012
Word Count: 14,723
- The Information Technology Revolution
- Information Overload
- Pareto vs. Long Tail
- Decentralised Production and Action
- Theory of Crowdsourcing
- Diversity and Aggregation
- Contribution of Amateurs
- Participation and Incentives
- Mechanisms for Establishing Credibility
- The Benefits of Crowdsourcing the Intelligence Process
  1. Requirement Setting
  2. Collection
    2.1 HUMINT
    2.2 IMINT
    2.3 SIGINT
  3. Analysis
  4. Dissemination
- Caveats and Limitations
- Crowdsourced Situational Awareness
- Model and Applications
- Going Forward: From Analysts to Amateurs
- Bibliography
Crowdsourcing Intelligence
The purpose of this paper is to demonstrate how crowdsourcing can be used to alleviate the burden of information overload and thereby potentially enhance situational awareness. The paper will first address how the information technology (IT) revolution has affected the intelligence tradecraft. It will highlight the resulting issue of information overload and its consequences for both the organisation of intelligence activities and the production of finished intelligence. The paper will then examine the theory of crowdsourcing: the practice of obtaining needed services, ideas, or content by soliciting contributions in the form of an open call from a large group of people, especially from the online community, rather than from traditional sources.1 It will outline how crowdsourcing is defined, who takes part in it, what allows for their participation and what motivates them to do so. Some examples used to explain the concept of crowdsourcing will be drawn from outside the intelligence domain but are nevertheless of direct relevance to the intelligence tradecraft. Subsequently, it will be demonstrated how crowdsourcing can be used to deal with the repercussions of the information technology revolution. This means applying crowdsourcing methods to the way intelligence is conducted. In effect, each stage of the intelligence cycle can potentially benefit from the use of crowdsourcing methods, and this paper will consider how. Most examples in this section will be directly related to intelligence matters. In addition to enumerating the potential benefits of crowdsourcing, this paper will also examine the potential drawbacks and limitations. Following that, it will propose that crowdsourcing the intelligence process offers a tool for creating enhanced situational awareness. Ultimately, the paper will conclude that crowdsourcing could redefine who conducts intelligence and how it is done by supplementing the intelligence professional with intelligent amateurs.

1 Merriam-Webster Dictionary, Crowdsourcing.
- The Information Technology Revolution

To begin, it is necessary to determine the context in which crowdsourcing is applicable. The information technology revolution, which is still under way today, is the starting point. It has had profound implications for society as a whole, notably inducing changes in the fields of technology, information management and intelligence. It has provided mobile telephones, cable and satellite television, computers ranging in size from laptops to powerful supercomputers, the Internet, cloud computing, the explosion of wide-area connectivity and the massive data storage and manipulation necessary for much of modern information management and intelligence. These new information technologies have led to an explosion of online open sources. Now vast quantities of information about target groups and countries, their economies, culture, physical geography, climate and so on are available not just centrally but at any point of access to the Internet, which is almost anywhere on the planet.2 Furthermore, anybody and everybody has access to this data. As a consequence, information technology is changing the way intelligence is used and perceived. It is no longer seen as a shadowy discipline operated only by spies and secret agents. We have effectively entered a new ‘age of information’ and left behind the old ‘age of secrets.’3 Intelligence has moved away from the Cold War mentality.4 When the task of the intelligence agencies was dealing with closed societies that did not release information, covert methods were necessary to ascertain motives and strategy.

Today’s new environment is one in which an analyst is faced with an abundance of information that can be obtained quickly and cheaply.5 To be sure, some of this information has always been available (and, of course, some remains secret), but the last two decades have given open information a recognition and usage commensurate with many changing aspects of contemporary society, as both a product and an analytic tool to deal with it.6 The information revolution is thus radically transforming the environment in which the

2 Omand (2012), OSINT’s Place in All Source Intelligence.
3 CRS Report for Congress (2007), Open Source Intelligence, 2.
4 Burke (2007), Freeing Knowledge.
5 Ibid.
6 Gibson (2004), Open Source, 19.
Intelligence Community (IC) operates. New threats, new technologies, new sources and new collection and analysis methods are now integral parts of the intelligence tradecraft. To remain effective in this new landscape, the Intelligence Community needs to adapt to the accelerated reality of the information technology revolution.

- Information Overload
The first and most significant impact of the IT revolution is information overload. Thomas Fingar, former Deputy Director of U.S. National Intelligence, notes that, on the one hand, the information technology revolution has made enormous quantities of information available to intelligence professionals. On the other hand, there is also a vast amount of unanalysed information that is just data.7 The explosion of online open sources has left intelligence agencies overloaded with data and unable to answer key questions. It has become a long-standing academic orthodoxy that in most cases of surprise attack, intelligence existed that could have warned of the attack, but it was not pieced together or utilised in the right way. Such examples can be found in abundance in the history of intelligence. In her analysis of the United States’ failure to predict the Japanese attack on Pearl Harbour, Roberta Wohlstetter argues that such intelligence failures are to be expected because of the difficulty of identifying ‘signals’ against the background ‘noise’ of raw facts, especially given the quantity of the latter. Having said that, the amount of noise that drowned out the warning of an imminent Japanese attack is almost insignificant compared to the terabytes of information that intelligence agencies have to deal with today. The risk is now that intelligence collectors become, as Thomas Fingar puts it, ‘vacuum cleaners on steroids.’8 There is simply too much information for intelligence agencies and analysts to cope with. For instance, a simple Google search for ‘Iran Nuclear’ yields about 8 million hits.9 Moreover, by some estimates this kind of stored information is doubling every two years.10 Therefore, to successfully process all this information, time and resources

7 Finley (2006), Intelligence fixes.
8 Ibid.
9 Google Search, Iran Nuclear.
10 Treverton (2003), Reshaping National Intelligence, 9.
need to be aligned with only a few reliable sources on a select number of issues, or else there would be simply too much to report on.

- Pareto vs. Long Tail
The dire consequences of the Intelligence Community’s inability to cope with the new information environment are best explained by comparing the Pareto principle with the Long Tail theory. In recent decades, the Pareto principle (also known as the 80-20 rule) has become very popular in business and institutional decision-making. For a business, if 80 percent of profits are known to come from 20 percent of clients, it makes sense to target advertising at only those 20 percent of clients. For a software company like Microsoft, if 80 percent of software crashes are due to 20 percent of bugs, it makes sense to concentrate on fixing only those 20 percent of bugs.11 From these examples, one might imagine that the Pareto principle is a universal law. The Intelligence Community has effectively adopted such an approach. By focusing on a select number of critical threats, it believes it can account for most risks. Indeed, former and present Directors of Central Intelligence are on record saying that their agencies can only cover the top-tier targets.12 A lack of resources does not permit coverage of the lower-tier issues and countries. The Director of National Intelligence stated in the August 2009 National Intelligence Strategy that by concentrating efforts on Iran, Russia and North Korea, it would be possible to reduce the risks of proliferation of weapons of mass destruction.13 Focusing on Iran is also of prime concern due to its support of terrorism and provision of lethal aid to US and Coalition adversaries.14 Therefore, by addressing what are seen as core issues, the related minor threats will be accounted for. Seemingly, this would be logical: what is most critical warrants most attention. However, this kind of thinking allowed security officials to miss the warning signals that could have prevented the Fort Hood shooting that left 13 dead, as well as the Christmas Day bomb attempt, which, by contrast, was thwarted not by

11 Hafner (2001), Pareto’s Principle.
12 Steele (2002), The New Craft of Intelligence, 131.
13 Office of the Director of National Intelligence (2009), The National Intelligence Strategy.
14 Ibid.
the thousands of analysts employed to find lone terrorists, but by an alert airline passenger who saw smoke coming from his seat-mate.15
Indeed, there are cases where the Pareto principle does not apply. Such cases have been labelled ‘Long-Tailed’ distributions by Chris Anderson, the editor-in-chief of Wired Magazine and curator of TED (Technology, Entertainment, Design). This is how he describes it: The theory of the Long Tail is that our culture and economy is increasingly shifting away from a focus on a relatively small number of ‘hits’ (mainstream products and markets) at the head of the demand curve and toward a huge number of niches in the tail. As the costs of production and distribution fall, especially online, there is now less need to lump products and consumers into one-size-fits-all containers. In an era without the constraints of physical shelf space and other bottlenecks of distribution, narrowly-targeted goods and services can be as economically attractive as mainstream fare.16 One such example is book retail. Amazon has found that more than half of its book sales come from outside the top-selling 20 percent of titles (in this case, the top 130,000 titles).17 If Amazon were to offer only the top-selling titles, it would lose a substantial portion of sales. ‘What is really amazing about the Long Tail’, says Anderson, ‘is the sheer size of it.’ By combining enough non-hits on the Long Tail, a market bigger than the actual hits is created. Consider the implication: if the Amazon statistics are any guide, the market for books that are not even sold in the average bookstore is larger than the market for those that are. In other words, the potential book market may be twice as big as it appears to be.

15 Washington Post (2012), Top Secret America.
16 Anderson, The Long Tail, In a Nutshell.
17 Anderson (2004), The Long Tail.
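The difference between the two regimes can be made concrete with a toy calculation. Assume, purely for illustration, that ranked items (threats, bugs or book titles) carry weight according to a Zipf-like power law; the share held by the top 20 percent then depends entirely on how steep that law is:

```python
def top_share(exponent: float, n: int = 1000, top_frac: float = 0.2) -> float:
    """Fraction of total weight held by the top `top_frac` of n ranked items,
    where item i carries weight 1 / i**exponent (a Zipf-like distribution)."""
    weights = [1 / i ** exponent for i in range(1, n + 1)]
    k = int(n * top_frac)
    return sum(weights[:k]) / sum(weights)

# Steep distribution: the head dominates, so Pareto-style triage is defensible.
print(f"steep (s = 1.0): top 20% hold {top_share(1.0):.0%} of the weight")
# Flatter, long-tailed distribution: most of the weight sits in the tail.
print(f"flat  (s = 0.5): top 20% hold {top_share(0.5):.0%} of the weight")
```

Under the steep law the top fifth accounts for roughly four-fifths of the total, echoing the 80-20 rule; under the flatter law it accounts for well under half, so a service that ignores the tail misses most of the weight. Which regime threats actually follow is, as argued here, an empirical question.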
The question is, should intelligence services concentrate on only the highest-priority 20 percent of threats, or should they also evaluate the remaining 80 percent in the ‘tail’? In other words, are threats distributed more like software bugs, or more like books? Even if it can be shown that 20 percent of threats account for 80 percent of risk (certainly not a foregone conclusion), in deciding whether or not to apply the Pareto principle, i.e. to examine only high-priority threats, one must look at the possible consequences. If one makes the wrong choice regarding book sales, the potential consequence is foregone profit. If one makes the wrong choice in deciding which threats to analyse, the potential consequence is death and destruction at the hands of terrorists or military forces. In the words of Condoleezza Rice: Let us remember, those charged with protecting us from attack have to succeed 100 percent of the time. To inflict devastation on a massive scale, the terrorists only have to succeed once, and we know they are trying every day.18

18 CNN Politics (2004), Transcript of Condoleezza Rice’s 9/11 commission statement.
- Decentralised Production and Action
As a consequence of information overload, not all potential threats are covered
because attention and efforts are focused on only a small number of possible targets. Some will likely slip through the net and these gaps will lead to warning failures. How, then, is it possible to cover all potential threats? One solution is to widen the range of people who can contribute to the task. Non-governmental analysis of security issues, whether by members of civil society, academia or the private sector, can help to identify emerging issues and set priorities.19 Whilst some aspects of intelligence may always remain the preserve of secret government agencies, the intelligence cycle can nevertheless benefit from external contributions and open source intelligence.20 Prior to the advent of the Internet, it was too expensive to decentralise social production and action, but today, organisations have a way to instantaneously coordinate autonomous efforts on a massive scale for almost nothing. Yochai Benkler, of Harvard Law School, notes that this reality is best exemplified by the use of cell phones.21 Once cellular technology became mainstream, people stopped making plans. When they spoke, they would say things like: ‘I will call you when I get there’ or ‘call me when you get off work.’ That is a point-to-point replacement of planning with coordination.22 Now, with more advanced information technology, it is possible to do the same with large groups. Rather than having to make any kind of long-term projection, a group can be coordinated to deal with issues instantly as they arise. Because its members are now well enough coordinated, it is no longer necessary to decide in advance what to do. Each can do his own part and, when it becomes necessary to join forces to address a particular issue or deadline, they can do so.
The problem of information overload is rooted in the fact that the knowledge of the circumstances of which intelligence professionals must make use never

19 Martin, Wilson (2008), The Value of Non-Governmental Intelligence.
20 Ibid.
21 Shirky (2005), Institutions vs. Collaborations.
22 Ibid.
exists in concentrated or integrated form, but solely as dispersed bits of incomplete and frequently contradictory knowledge possessed by all the separate individual members of the ‘crowd.’23 Now, thanks to new information technologies, instead of having to take on the institutional and structural costs of drawing workers together in a prearranged structure with explicit goals, such as the Intelligence Community, the intelligence process can be re-engineered so that anybody can contribute in any amount, at any time and from any place. That being the case, the IC’s work could be expanded to cover all potential threats, not only those deemed critical. Thanks to the information technology revolution, coordinating such a group effort is now not only possible, but cheap, fast and effective. Doing so is what crowdsourcing is really all about: taking a function once performed by employees and outsourcing it to an undefined and generally large network of people in the form of an open call. In effect, the Intelligence Community’s best option for solving the challenges posed by the information technology revolution is to crowdsource intelligence. Thus, the IC can potentially both widen the intelligence picture and sharpen it.

- Theory of Crowdsourcing
Before explaining how crowdsourcing can be applied to intelligence, the parameters of crowdsourcing must be defined. What exactly is crowdsourcing? The term ‘crowdsourced’ describes projects whose design, construction and execution are implemented by a community of people rather than a single corporation or organisation. To date, crowdsourcing has mostly been used as a Web-based business model to harness the contributions of a distributed network of individuals. This has taken the form of peer-production (when the job is performed collaboratively), but has also often been undertaken by sole individuals. The crucial prerequisite is the use of the open call format and the large network of potential collaborators. In other words, a problem is posted online, a vast number of individuals offer solutions to the problem and the winning ideas are awarded some form of a bounty.24

23 Martin, Wilson (2008), The Value of Non-Governmental Intelligence.
24 Brabham (2008), Crowdsourcing as a Model for Problem Solving.
Jeff Howe, who coined the term ‘crowdsourcing’, gives an excellent illustration of the practical application of this concept in his book Crowdsourcing: Why the Power of the Crowd Is Driving the Future of Business. Goldcorp, a Canadian gold mining company, launched the Goldcorp Challenge in March 2000. Its Red Lake Mine in Ontario, Canada, was severely under-performing because the mine’s deposits were buried deep underground and the company’s geologists were unable to determine the precious metal’s exact location. Rob McEwen, the company’s founder and former chairman, thought that if his employees could not find the gold, then someone else would have to. Mining is an intensely secretive industry, and apart from the minerals themselves, geological data is the most precious and carefully guarded resource. McEwen was determined to find those precious minerals, so he decided to do what no one had ever done: he shared all his company’s geological data (which went back as far as 1948) with the whole world. Four hundred megabytes of data about the 55,000-acre site were placed on the company’s Website. Participants from around the world were encouraged to examine the geological data from Goldcorp’s Red Lake Mine and submit proposals identifying potential targets where gold could be found, in exchange for a $575,000 bounty. Word spread fast around the Internet and within a few weeks submissions came in from all over the world, as more than 1,000 virtual prospectors from 50 countries reviewed and analysed the data. Some were geologists, but many were individuals from unrelated sectors: mathematicians, military officers, students and consultants. ‘They had applied math, advanced physics, intelligent systems, computer graphics, and organic solutions to inorganic problems. There were capabilities I had never seen before in the industry,’ says McEwen. In all, more than 110 sites were identified, and 50 percent of these were previously unknown to the company. Of these new targets, more than 80 percent yielded significant reserves: $6 billion worth of gold, to be exact. By crowdsourcing the company’s problem, McEwen turned his $100 million mining business into a $9 billion juggernaut.
What this example of crowdsourcing provides is a view into a problem-solving model that can be generalised and applied to a variety of industries, to solve both mundane and highly complex tasks. Crowdsourcing is not merely a buzzword, but a strategic model to attract an interested, motivated crowd of individuals capable of providing solutions superior in quality and quantity to those that even traditional forms of business can produce.25

Here are some examples. First, the crowd outperformed in-house geophysicists during the Goldcorp Challenge. BAE Systems, Eli Lilly and Hewlett Packard, amongst others, publish technical research and development challenges on the InnoCentive Website and rely on the crowd to solve the problems that stump their corporate scientific researchers. A business called Threadless operates a Website that allows the crowd to design original t-shirts that always sell out of stock. With the iStockphoto Website, the crowd produces commercials and fresh stock photography on par with professional firms. And above all, the crowd outperforms industry faster and cheaper than even the top minds in their respective fields. It took only a few weeks for Goldcorp to start receiving submissions, and the $575,000 prize is a tiny fraction of the ensuing $6 billion revenue. Threadless produces new t-shirt prints every week and the artists whose designs are chosen receive $2,000 in cash and a $500 gift card (or an extra $200 in cash).26 Such compensation is not a great expense when one considers that Forbes found that Threadless made around $30 million in revenue in 2009 alone.27 The fact is, crowdsourcing is changing the value of intellectual labour in a transnational world.28 It may come to be the single fastest way of achieving a shared understanding of any and all challenges, from economic and social threats to state and non-state violence, to terrorism and transnational crime.29

- Diversity and Aggregation
The first key concept to understand is that crowdsourcing is not a search for unanimity. Instead, as James Surowiecki, author of The Wisdom of Crowds, explains, part of crowdsourcing’s success derives from averaging solutions.

25 Brabham (2008), Crowdsourcing as a Model for Problem Solving.
26 Wei (2009), T-Shirt Startup Threadless.
27 Ibid.
28 Brabham (2008), Crowdsourcing as a Model for Problem Solving.
29 Steele (2006), Peacekeeping Intelligence.
The wisdom of crowds is not about consensus. It really emerges from disagreement and even conflict. It is what you might call the average opinion of the group, but it is not an opinion that everyone in the group can agree on. That means you cannot find collective wisdom via compromise.30 The collective decisions taken by the crowd will only be smart if each individual tries to be as independent and original as possible. Instead of just taking the advice of an expert, each member of the group has to make his own choices, based on his own background and experiences. Doing so makes the group smarter. Independence of judgement is fundamental so as not to lose the one idea, different from all the others, that might be the best solution to a problem. It is thus important to understand that conventional and expert opinions are not the only ones that matter. In the case of crowdsourcing, the more, the stranger, the better. This plurality of experiences, combined into a single viewpoint, is what the wisdom of the crowd consists of.
The quintessential examples of this type of distributed intelligence are those of guessing the temperature of a room or how many jellybeans are in a jar. They demonstrate that averaging the guesses of many people, whatever their backgrounds, can be more accurate than the guess of a single expert. Such wisdom of crowds also manifests itself in everything from sports betting to traditional opinion polls.31 This implies that the aggregate accuracy of a crowd’s estimate will consistently be superior to the accuracy of any single individual, regardless of their capabilities.32 As a matter of fact, Robert David Steele writes that ‘on any given topic, most experts do not agree with one another, and are correct roughly 65 percent of the time. The crowd by contrast, with bias and ignorance, when normalised, is right on target’ roughly 90 percent of the time.33 Crowdsourcing can thus truly be an effective problem-solving model.

30 Random House (2004), Q&A with James Surowiecki.
31 CNET News (2008), Q&A: Jeff Howe.
32 Mathur, Crowdsourcing.
33 Steele (2006), Peacekeeping Intelligence.
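The jellybean result is easy to reproduce in a few lines. The sketch below assumes, for illustration only, that each guesser is individually biased and noisy but that the biases are independent and centred on the truth; under those assumptions the averaged guess lands far closer to the true count than the typical individual does:

```python
import random

random.seed(42)
TRUE_COUNT = 850  # jellybeans actually in the jar (an invented figure)

def guess() -> float:
    """One person's estimate: a personal bias plus moment-to-moment noise."""
    bias = random.uniform(-0.3, 0.3)   # independent over/under-estimation
    noise = random.gauss(0, 0.15)      # random error on top of the bias
    return TRUE_COUNT * (1 + bias + noise)

guesses = [guess() for _ in range(500)]
crowd_estimate = sum(guesses) / len(guesses)

crowd_error = abs(crowd_estimate - TRUE_COUNT) / TRUE_COUNT
typical_error = sum(abs(g - TRUE_COUNT) for g in guesses) / len(guesses) / TRUE_COUNT

print(f"typical individual error: {typical_error:.1%}")
print(f"crowd-average error:      {crowd_error:.1%}")
```

The independence assumption is doing the real work here, which is precisely Surowiecki’s point: if the guessers all copied one another, the errors would no longer cancel.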
However, greater individual participation will not, on its own, add up to much unless it is matched by a capacity to share and then combine ideas.34 To do so, the Internet is key. The Web provides a platform capable of aggregating millions of disparate, independent ideas in the way markets and intelligent voting systems do.35 Dedicated global information networks and the establishment of open-intelligence information architectures empower collective groups of individuals. The development of information technology is thus making many new forms of information-sharing possible.36 It allows amateurs to gather, produce, share and own information together. The beauty of it is that crowdsourcing, like the Internet through which it operates, recognises no boundaries. It is accessible to anyone, anywhere. The network does not care if contributors are down the block, downstate, or down under: if they can perform the service, design the product, or solve the problem, then they can participate.37

- Contribution of Amateurs
It is now apparent that not everyone need be an expert to contribute to crowdsourced projects. In fact, the exact contrary holds true. The wise crowd insists on the presence of non-experts, on the presence of amateurs.38 An expert is ‘a person who has a comprehensive and authoritative knowledge of, or skill in, a particular area.’39 In comparison, an amateur is defined as ‘a person who engages in a pursuit on an unpaid basis.’40 In essence, the two definitions are not mutually exclusive. Consequently, an amateur may very well have developed the same capabilities as an expert, and adhere to the same standards, although without necessarily receiving the same level or kind of compensation. Amateurs are those who love what they do and thus devote their time and energy to getting good at it. They are those who participate in crowdsourcing projects such as

34 Leadbeater (2008), We-Think, 2.
35 Brabham (2008), Crowdsourcing as a Model for Problem Solving.
36 Steele (2006), Peacekeeping Intelligence.
37 Howe (2008), Crowdsourcing, 16.
38 Brabham (2008), Crowdsourcing as a Model for Problem Solving.
39 New Oxford American Dictionary, Expert.
40 New Oxford American Dictionary, Amateur.
InnoCentive and iStockphoto. Jeff Howe argues that ‘the best person to do a job is the one who most wants to do that job: and the best people to evaluate their performance are their friends and peers who will enthusiastically pitch in to improve the final product, simply for the sheer pleasure of helping one another and creating something better from which they will all benefit.’41
The online hacker community is an example of amateur crowdsourcing that is particularly relevant to intelligence. A dedicated group of individuals with the necessary skill set has managed to develop the tools to breach supposedly the most secure of networks, stealing megabytes upon megabytes of valuable information and leaving governments and corporations scrambling to catch up. This group is called Anonymous. Its members are unpaid and their identities are unknown, but all of them are specialists. Evidently, motivated non-professionals can become just as good as, and in some cases even better than, the leading authorities. That is another reason not to blindly follow the advice of commonly established experts; they may not be the best. Some of the most technical domains are the realm of the dilettante and not just the professional. The same could be said of the intelligence tradecraft. The intelligence establishment does not encompass all potential collaborators. Only a small fraction of potential contributors are selected to become analysts. Therefore, there are many non-expert collaborators within the crowd who could be very useful for producing intelligence. Those amateurs are the ones that need to be mobilised via crowdsourcing so as to benefit the work of the Intelligence Community.
There are several other explanations for the ascendancy of amateurs over experts. Experts, no matter how smart, have only limited amounts of information. It is very rare that one person knows more than a large group of people, and almost never does that same person know more about a whole series of questions.42 An individual’s capacity to accumulate knowledge is naturally limited. By contrast, a group is capable of storing a much larger volume of information. As a result, the pool of knowledge from which an

41 Crowe (2012), Disasters 2.0, 208.
42 Random House (2004), Q&A with James Surowiecki.
expert can draw to answer a question is quite small when compared to that of an intelligent crowd; hence the advantage of using the crowd for problem-solving.
Finding an expert can also be problematic: it can be very hard to identify true experts. Indeed, it is growing increasingly difficult for organisations to find or retain talent.43 The knowledge industry is no longer market-based or government-owned as it previously was; the information technology revolution has changed that. Today, competition for top talent is fierce and true experts are a rare commodity. The same applies to intelligence agencies. They cannot attract all of the top analytical thinkers. What is not in scarce supply, however, are skilled amateurs. Those amateurs, while perhaps not recognised experts, do have information that can be useful; hence their inherent value. In the world of mass collaboration, James Surowiecki notes, if a group is smart enough to find a real expert, it is more than smart enough not to need one.44 The information retained within the crowd can compensate for, complete or corroborate that of the paid professionals.

- Participation and Incentives
Another important concept to understand about crowdsourcing is why people participate. According to Ryan and Deci, ‘to be motivated means to be moved to do something.’45 By the same token, Kanfer suggests that motivation is a stable psychological force within individuals that activates goal-directed behaviour.46 Battistella and Nonino divide behavioural motives into intrinsic and extrinsic categories. Intrinsic motivation is associated with the motivating potential of the task itself: the activity is valued for its own sake, and is itself the best reward for doing something. The motivation comes from within the person, from the pleasure of working on the task or the satisfaction of completing it. This is probably the very first reason to participate in crowdsourcing. Most collaborators have an interest in the task at hand and therefore actually want to participate, just like

43 Tapscott, Williams (2008), Wikinomics, 25.
44 Random House (2004), Q&A with James Surowiecki.
45 Ryan, Deci (2000), Intrinsic and Extrinsic Motivations.
46 Kanfer (1990), Motivation Theory.
those working together on the Wikipedia free-content encyclopaedia project, the best-known crowdsourcing enterprise to date. Extrinsic motivation comes from outside the person. In its purest form, it is based on the prospect of receiving compensation for performing a task, i.e. the reward is separable from the task itself. Financial compensation, for example, is an extrinsic motivation that can spur people into action. It is an important type of motivation for crowdsourcing projects such as Threadless, InnoCentive and iStockphoto.
A good example of intrinsic motivation is the drive behind the Intellipedia project. Intellipedia is an online system for collaborative data sharing used by the United States Intelligence Community. It is built on the same software used by Wikipedia.47 The Defense Intelligence Agency (DIA) conducted an ethnographic study of Intellipedia, which concluded that the users interviewed were typically very excited about the way Intellipedia affords them the opportunity to publicise their work and interests across the larger IC.48 Analysts were posting to Intellipedia voluntarily, without clear direction from their managers. These individuals seemed motivated by a desire to make their division’s information available to others in the Intelligence Community, and regarded this as a task worthwhile in itself.49 Seemingly, this shows how individual members of the community were intrinsically motivated to participate in the project.
The crisis relief efforts undertaken after the 2010 earthquake in Haiti illustrate how extrinsic motivations can incite participation in crowdsourcing projects. The catastrophic magnitude 7.0 earthquake that affected several million people also destroyed most of the country’s infrastructure. As a consequence, the flow of humanitarian aid was disrupted: not only was it difficult to reach those in need, it was also nearly impossible even to know who and where those in need were. The solution put forward was to rely on Haitians themselves. Those individuals in need of help, but out of reach, could use their mobile phones to let authorities know their geographic position and their physical 47 Magnusson (2006), Feds Lagging. 48 Dixon, McNamara (2008), Our Experience with Intellipedia. 49 Ibid.
condition by submitting reports via the Internet, social media platforms and text messages. Relief workers were then able to capture these real-time reports, identify immediate needs for medical assistance, food or water, and plot them on an interactive Web-based map available to anyone with an Internet connection. It was thus possible to coordinate and prioritise the relief efforts. This is how a cholera outbreak was tracked and eventually traced back to its origins. This example demonstrates how the prospect of receiving humanitarian and financial assistance motivated the Haitian population to participate in a crowdsourcing project. Thus, an extrinsic motivation pushed them to take part in a new, unconventional way.

- Mechanisms for Establishing Credibility
With all that said, there remains the important issue of credibility. There needs to be some kind of mechanism to address the problem of participants spreading misinformation, whether through intent or ignorance. Some contributors may participate only for financial compensation, regardless of standards. Individuals engaging in such behaviour, however, marginalise themselves and are eventually weeded out. In fact, a number of mechanisms exist to ensure quality and credibility, as the examples below show. First, Threadless has two ways of dealing with such contributors: formal and informal peer review. When individuals submit their t-shirt prints, each design is reviewed by other contributors who formally vote on its value. Only the designs that receive a large number of votes are manufactured. Therefore, if individuals submit low-quality t-shirt designs, they will not be chosen and will be sidelined. The marketplace provides the informal peer review: consumers effectively ‘vote’ with their credit cards. If a t-shirt design is no good, it will never sell, thus eliminating that contribution.
On Wikipedia, similar review systems exist to eliminate false information. As a matter of fact, Wikipedia has even crowdsourced that process. Regular contributors, recognised for the quality of their work, are invited to join a group (essentially another, smaller crowd) that evaluates all new entries. This arrangement has proven quite successful. As a case in point, the Encyclopaedia Britannica was found to have nearly as many errors as Wikipedia. The key difference is that Wikipedia’s fluid content creation mechanisms and large volunteer community ensure that its errors get fixed quickly.50
Intellipedia users have also developed a mechanism to gather contextual
information on their peers, allowing them to assess the credibility of contributors.51 One of the ways analysts determine the validity of an Intellipedia page or change is to click on the author’s link and look at their background. Since there is no anonymity in Intellipedia, all the information about an author’s previous contributions, where and for whom he works within the IC, as well as his contact details, is available. Clicking on the author’s link also frequently brings the reader to the contributor’s blog: as several of the users interviewed by the DIA study team pointed out, it is not unusual for people who contribute to Intellipedia to maintain a blog, and to provide Intellipedia links to their blogs.52 These provide a place for people to establish their identity. As one heavy Intellipedia user commented, people check his blog to see if he has the right credentials for the work he is doing.53 This and the other integrated methods for evaluating trustworthiness demonstrate that it is possible for crowdsourcing to be a reliable source of information.
To sum up, crowdsourcing is based on the principle that the more people participate in a group and contribute to its goals, the more valuable the group becomes. By combining all the individual contributions of the crowd through the Internet, the group can come up with more creative and effective ideas. Indeed, creativity has always been a highly collaborative, cumulative and social activity in which people with different skills, points of view and insights share and develop ideas together.54 At root, most creativity is collaborative; it is not usually the product of a lone individual’s flash of insight. The more ideas are shared, the more they breed, mutate and multiply, and that process is ultimately the source of creativity and innovation.55 Crowdsourcing harnesses these ideas and channels them into a 50 Tapscott, Williams (2008), Wikinomics, xi. 51 Dixon, McNamara (2008), Our Experience with Intellipedia. 52 Ibid. 53 Ibid. 54 Leadbeater (2008), We-Think, 6. 55 Ibid.
single productive effort. Hence, the crowd can now achieve goals that its individual members could not. The Intelligence Community could benefit from this type of creative endeavour. Not only could it use some outside help; as demonstrated earlier, it needs all the help it can get in order to deal with the effects of the IT revolution.

- The Benefits of Crowdsourcing the Intelligence Process
To understand how intelligence can be crowdsourced, it is necessary to begin with a
brief summary of how intelligence is conducted. Intelligence can be defined as a process by which basic information is transformed into actionable knowledge.56 Michael Turner’s definition of intelligence illustrates this process: he describes intelligence as policy-relevant information that is collected by government employees, subjected to evaluation and analysis, and then disseminated to government decision-makers.57 These divisions into stages are by no means clear-cut, but they do provide the system’s basic framework.58 The intelligence process is generally described as a cycle starting with a function of direction that sets the requirements and priorities given to the intelligence agencies, indicating the areas on which intelligence customers would like to obtain information.59 Then comes the collection phase. Traditionally, collection relied on the recruitment of human sources (HUMINT) and the interception (and, when necessary, deciphering) of communications (COMINT), whose patterns and modes, together with radar and electronic intelligence (ELINT), make up the broader category called signals intelligence (SIGINT).60 In addition, imagery from satellites and photo-reconnaissance, called IMINT, is also essential. The cycle then moves on to the processing and analysis of information. Bringing different lines of reporting together, key judgments are derived about the meaning of the intelligence.61 Finally, the cycle ends with the dissemination of the finished 56 Gill, Phythian (2010), Intelligence, 3. 57 Turner (1991), Issues. 58 Herman (1996), Intelligence, 39. 59 Omand (2010), Securing the State, 117. 60 Ibid., 29. 61 Ibid., 118.
intelligence to end-users for it to be employed in decision- and policy-making.62 Now that it has been established how intelligence operations are structured, it is possible to apply the theory of crowdsourcing to each stage of the cycle.

1. Requirement Setting
The complexity of the new information environment has made it substantially more
difficult to establish what it is that needs to be known and done by intelligence services. Alex Martin and Peter Wilson believe that the whole methodological basis of intelligence production is inadequate for this new environment. According to them, it relies on gathering intelligence that aims to confirm an existing view, rather than on the generation of competing hypotheses which the intelligence agencies then systematically aim to refute.63 This approach is said to be at the heart of intelligence failures such as the 1982 Falklands and the 1991 Gulf crises.64 In those cases, it would appear that policy-makers ‘pulled’ intelligence to confirm their beliefs and substantiate their policies. Consequently, as Michael Handel wrote: ‘the only hope for improving the quality and reliability of the Intelligence Community lies in distancing it from the political arena.’65 This approach is based on professional objectivity. Intelligence should be ‘pushed’ towards decision-makers so as to ensure that all potential threats are addressed. This point of view is consistent with the idea that by opening up the requirement-setting process to elicit participation from the crowd, most risks, if not all, would be covered. Indeed, the crowd is the real information leader in today’s world, and its ‘eyes on target’, whatever the deficiencies of some of its individual members, are the gold standard for global coverage.66 If intelligence concerning any and all potential threats is to be provided to decision-makers, more contributors than just the usual intelligence services need to be involved. Individuals on the ground can push intelligence to analysts in the community. The aim would be to import a degree of 62 Ibid., 126. 63 Martin, Wilson (2008), The Value of Non-Governmental Intelligence. 64 Omand (2010), Securing the State, 179. 65 Handel (1989), War, Strategy and Intelligence. 66 Steele (2006), Peacekeeping Intelligence.
competition and a range of different perspectives to the current complacent and conservative process.
In accordance with this view, what Martin and Wilson suggest is perfectly suited to crowdsourcing. The generation of alternative hypotheses is a creative process requiring a willingness to challenge accepted wisdom. The most radical hypotheses, based on the least existing evidence, are likely to be the most powerful. If they are false they can be rapidly refuted. But if they cannot be refuted and therefore have to be taken seriously, then they are likely to be more valuable and insightful than a hypothesis which builds incrementally on the existing accepted wisdom.67 Thus, in an open call, external collaborators could be solicited to generate hypotheses, in competition with government analysts, which are radical but still rooted in some reality and insight. Historians, psychologists, journalists and other experts could then bring to bear their own professional knowledge to sift the hypotheses and attempt to refute them, meeting the need to include historical, psychological and geopolitical factors in analysis rather than relying solely on secret intelligence. The hypotheses which could not be refuted by the crowd using open information would then generate a list of intelligence ‘requirements’, which could guide the intelligence agencies to focus their efforts on those hypotheses which were grounded in open information, contained most insight and could not be safely disregarded.68 Therefore, crowdsourcing the requirement-setting process could increase the reach of intelligence agencies by spotlighting threats requiring action. Such a method would be cheaper and faster, and would allow the IC to gain new insight into critical situations. It could thus build a more complete intelligence picture.
67 Martin, Wilson (2008), The Value of Non-Governmental Intelligence. 68 Ibid.
The US Navy has turned to crowdsourcing via online multiplayer games in order to
hunt for better ideas in its fight against piracy.69 To that end, it hosted one of the unlikeliest online games ever: MMOWGLI (Massive Multiplayer Online War Game Leveraging the Internet), pronounced like the Jungle Book protagonist. Larry Schuette, the director for innovation at the Office of Naval Research, describes the game like this: A pirate scenario first appears on your screen: three pirate ships are holding the world hostage. Chinese-US relations are strained to the limit and both countries have naval ships in the area. Humanitarian aid for rig workers is blocked. The world is blaming the US for plundering African resources. What do you do? Two text boxes pop onto the screen. The first reads ‘Innovate’ and asks: ‘What new resources could turn the tide in the Somali pirate situation?’ The second reads ‘Defend’ and asks: ‘What new risks could arise that would transform the Somali pirate situation?’ Beneath each is a box in which to record your brief answer: 140 characters (just like Twitter). Then comes the crowdvoting. During the first week of the game, your fellow players vote on your suggestion. If they think it is noteworthy, they can tweak it. New cards allow players to Expand (‘Build this idea to expand its impact’), Counter (‘Challenge this idea’), Adapt (‘Take this idea in a different direction’) or Explore (‘Something missing? Ask a question’). Players are awarded points based on the number of affirmations their ideas get from their peers. Based on that, players are invited to the next round. There are three rounds, each lasting a week, so the ideas can marinate. People with good ideas will win. At the end of the third week, the game displays a logical treeing of those ideas.
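The round-based crowdvoting mechanic described above can be sketched in a few lines of code. The function names, vote tallies and survival threshold below are hypothetical simplifications for illustration, not MMOWGLI’s actual scoring rules.

```python
# A minimal, hypothetical model of MMOWGLI-style crowdvoting:
# ideas earn points from peer affirmations, and only the
# top-scoring ideas survive into the next weekly round.

def run_rounds(ideas, votes_per_round, n_rounds=3, keep_fraction=0.5):
    """ideas: list of idea strings; votes_per_round: callable that
    returns a dict mapping each surviving idea to its vote count."""
    surviving = list(ideas)
    for round_no in range(1, n_rounds + 1):
        votes = votes_per_round(round_no, surviving)
        # Rank ideas by peer affirmations, best first.
        ranked = sorted(surviving, key=lambda i: votes.get(i, 0), reverse=True)
        # Invite only the top fraction to the next round.
        cutoff = max(1, int(len(ranked) * keep_fraction))
        surviving = ranked[:cutoff]
    return surviving

# Example with fixed (made-up) vote tallies.
def fake_votes(round_no, ideas):
    tallies = {"convoy escorts": 9, "drone patrols": 7,
               "ransom insurance": 2, "port blockade": 4}
    return {i: tallies.get(i, 0) for i in ideas}

winners = run_rounds(["convoy escorts", "drone patrols",
                      "ransom insurance", "port blockade"], fake_votes)
print(winners)  # → ['convoy escorts']
```

In the real game, the vote function would be the aggregated behaviour of a thousand players rather than a fixed table, but the selection logic is the same: peer affirmation, then promotion of the strongest ideas.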
It is more like systems analysis, as opposed to wargaming.70 A crowd of about 1,000 registrants, split approximately evenly between dot-mil, dot-com and dot-edu email addresses, pitched in to come up with innovative solutions to a fictitious crisis not far removed from reality. The winning contributions could then be 69 Ackerman (2011), Navy Crowdsources Pirate Fight. 70 Ibid.
taken up to further the US counterpiracy strategy, hence setting requirements for the Intelligence Community. In effect, this relatively simple and open process of crowdsourcing hypothesis generation and testing against open information could lay the foundation for more effective intelligence gathering and analysis.

2. Collection
The advent of the information technology revolution has made it both possible and
necessary to crowdsource the intelligence collection phase. It has made it possible because the widespread use of information technology is changing who can actually perform the task. It has made it necessary because of the problem of information overload. Intelligence collectors are overwhelmed by the sheer amount of data they have to gather and appraise. This is true for all three of the most commonly referenced intelligence collection methods: HUMINT, IMINT and SIGINT. That being so, intelligence collectors could use the crowd’s help to complement and facilitate their tasks.

2.1 HUMINT
First of all, crowdsourcing would add a whole new dimension to human intelligence
collection. In the past, it was neither feasible nor efficient to collect human intelligence reports from everyone in the community. Today, the IT revolution is bringing together a combination of technology and social forces that makes it possible to effectively integrate information from civilian sources into an all-source intelligence picture.71 Therefore, it is possible to quickly and inexpensively create a large, inclusive network to take advantage of data collected from sources that were once inaccessible, including those who will only ever make one significant contribution.
By crowdsourcing human intelligence gathering, the security situation in areas of operation becomes a less limiting factor. The recent events of the Arab Spring, 71 Marshall (2012), Fusing Crowdsourcing and Operations.
notably in Egypt, Libya and Syria, are good examples. The crackdowns in those countries by governmental forces have compelled many to flee their homes, and the countries have been largely sealed off from the external world. Access to insider information on these crises is consequently very difficult. Before the widespread presence of information technology, it would have been almost impossible to obtain further information on such situations, whereas now, information about the government’s violence can be made available online by anyone on the ground. For instance, Websites called Libya Crisis Map and Syria Tracker have made it possible to communicate and retrieve information about ongoing developments through wireless technologies such as cell phones and computers.72 Witnesses submit reports of events via phone or social networks, much like the Haitians did, which are subsequently translated, geo-referenced, coded and verified by a group of volunteers who triangulate the information with other sources.73 They also filter the reports and remove duplicates. Reports with a low confidence level with regard to veracity are also removed. Volunteers also use a voting feature to ‘score’ the veracity of eye-witness reports. Using this approach, the Syria Tracker team and their volunteers have been able to verify almost 90 percent of the documented killings mapped on their platform thanks to video and photographic evidence.74 They have also been able to associate specific names with about 88 percent of those reported killed by Syrian forces since the uprising began.75 It is known that the International Criminal Court and Amnesty International closely followed the Libya Crisis Map to obtain information.76 In effect, treating each member of the population as a stakeholder could increase their willingness to participate in crowdsourcing projects.
As collaborators begin to see the success and positive outcomes of their reporting, it is safe to assume that the probability of further contributions will increase: success begets success.77 72 Syria Tracker, Crowdmap. 73 Meier (2012), Crisis Mapping Syria. 74 Ibid. 75 Ibid. 76 Meier (2012), Crisis Mapping Syria. 77 Mumm (2012), Crowdsourcing: A New Perspective.
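The volunteer vetting workflow just described (deduplication, a corroboration requirement, and veracity voting) can be sketched as follows. The field names, thresholds and sample reports are invented for illustration and do not reflect Syria Tracker’s actual data model.

```python
# A minimal sketch of the crowdsourced vetting pipeline described above:
# duplicate reports are removed, under-corroborated reports are held back,
# and volunteers' veracity votes score the rest. All field names and
# thresholds are illustrative assumptions.

def vet_reports(reports, min_votes=3, min_score=0.6):
    seen, vetted = set(), []
    for r in reports:
        key = (r["location"], r["event"])      # crude duplicate check
        if key in seen:
            continue
        seen.add(key)
        votes = r.get("veracity_votes", [])    # True = credible, False = not
        if len(votes) < min_votes:
            continue                           # not enough corroboration yet
        score = sum(votes) / len(votes)
        if score >= min_score:
            vetted.append({**r, "score": score})
    return vetted

reports = [
    {"location": "Homs", "event": "shelling",
     "veracity_votes": [True, True, False, True]},
    {"location": "Homs", "event": "shelling",
     "veracity_votes": [True]},                # duplicate, dropped
    {"location": "Hama", "event": "arrests",
     "veracity_votes": [False, False, True]},  # low score, dropped
]
print([r["location"] for r in vet_reports(reports)])  # → ['Homs']
```

The design choice worth noting is that low-scoring reports are filtered out rather than corrected: the crowd acts as a gatekeeper, and only corroborated material reaches the map.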
2.2 IMINT
In addition to HUMINT, imagery intelligence could also benefit from using
crowdsourcing methods. IMINT collectors could employ the crowd much as the National Aeronautics and Space Administration (NASA) already has. NASA decided to crowdsource imagery analysis of Mars to create a map of the planet. In a very telling display of the potential of crowdsourcing, 85,000 people contributed, and their work was said to be practically indistinguishable from the markings of a fully trained expert once the average was computed.78 In the realm of intelligence, with the proliferation of digital IMINT sensors - whether on satellites, manned aircraft or drones - thoughts are turning to the storage and management of the data they generate.79 While experience with satellites is now decades old, it has never encompassed today's resolutions, bit rates and sheer number of airborne sensors. Lieutenant General David A. Deptula of the US Air Force warns that the IC will soon be ‘swimming in sensors and drowning in data.’80 With commercial satellites such as the American GeoEye-1 able to produce, every day, millions of square kilometres’ worth of images with a resolution of 0.41 metres, how is it possible for intelligence analysts to assess all the information?81 Crowdsourcing provides an answer: with 85,000 contributors helping with the job, it would be easy. Because commercial satellites and software like Google Maps have made images of anywhere in the world, including supposedly restricted areas, available to all, anyone can participate. Consequently, secrecy is not an issue with IMINT. Robert Rowley, a 48-year-old supervisor of a Dairy Queen in Arizona, gave substance to this fact. While combing through satellite images publicly available on the Internet, he was among the first to notice fuel tankers slipping past North Atlantic Treaty Organisation (NATO) warships and docking at ports controlled by Colonel Gaddafi, which led to international interdictions.
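The averaging effect observed in NASA’s Mars-mapping project can be illustrated with a short sketch: many individually noisy amateur markings, once aggregated, converge on an expert-quality placement. The coordinates below are invented for illustration.

```python
# A sketch of the aggregation idea behind NASA's crowdsourced Mars
# mapping: amateur markings are individually off by a few pixels,
# but their average lands almost exactly where an expert would mark.
# All coordinates here are invented.

def average_marking(markings):
    """markings: list of (x, y) positions where contributors clicked."""
    n = len(markings)
    return (sum(x for x, _ in markings) / n,
            sum(y for _, y in markings) / n)

# Ten amateurs mark a crater an expert placed at (100.0, 50.0).
amateur_clicks = [(97, 52), (103, 49), (99, 48), (101, 51), (98, 50),
                  (102, 50), (100, 47), (96, 53), (104, 49), (100, 51)]
cx, cy = average_marking(amateur_clicks)
print(cx, cy)  # → 100.0 50.0
```

Individual errors of several pixels cancel out in the mean, which is why the aggregated crowd output was reported to be nearly indistinguishable from expert work.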
On top of that, Rowley had noticed that a property listed as a commercial warehouse had a yard containing what appeared to be military vehicles. He published his observations on 78 Benkler (2005), On the New Open-Source Economics. 79 Defence Management Journal, Managing the Flood of IMINT. 80 Magnusson (2010), Military Swimming in Sensors. 81 GeoEye, Earth Imagery.
Twitter; 10 hours later, the spot was hit by a NATO air strike.82 This clearly demonstrates that, for all their sophisticated eyes overhead, intelligence services were overtaken by the crowd. Accordingly, it is not only expertly trained intelligence professionals who can do the job; amateurs can also make significant contributions. If a large number of such contributions were combined, they could potentially obviate the need for the large numbers of IMINT analysts employed by the IC.

2.3 SIGINT
Signals intelligence equally stands to gain from crowdsourcing. One of the aspects of SIGINT most relevant to crowdsourcing is that of massive computing.83 In the words of Robert David Steele, agencies such as the National Security Agency (NSA) and the Government Communications Headquarters (GCHQ) are dealing with the effects of the IT revolution by trying to establish the ‘mother of all databases’, able to fully integrate all available foreign intelligence, law enforcement intelligence and related private sector data (for instance, usage of credit cards by individuals approved for inclusion on terrorist watch lists).84 However, these massive amounts of data are becoming nearly impossible to manage with the existing capabilities and infrastructure of intelligence services. As a consequence, the Intelligence Community has had to spend its time, money and resources dealing with the consequences of information overload. For example, the Utah Data Center is being built for the NSA to become a primary storage resource capable of storing data on the scale of yottabytes.85 The planned structure is projected to cost $2 billion by its completion in 2013 and about $40 million per year to maintain.86 Evidently, the problem of information overload is already significant and will continue to grow because of the ongoing IT revolution. Crowdsourcing could help by leveraging the public’s spare computer time to complement the NSA’s existing computing power. In May 1999, the University of 82 Smith (2011), How Social Media Users Are Helping NATO. 83 Steele (2002), The New Craft of Intelligence, 19. 84 Ibid. 85 Bamford (2012), The NSA is Building the Country’s Biggest Spy Center. 86 Ibid.
California at Berkeley launched the SETI@home project.87 Berkeley’s Search for Extraterrestrial Intelligence (SETI) project scans data gathered by large radio telescopes to listen for narrow-bandwidth radio signals from space. By recording and analysing them, scientists hope to identify anomalies - signals amid the noise - that would betray the presence of intelligent life on other planets.88 Berkeley had originally been using supercomputers to analyse all the data, but they were unable to process all of the information. A handful of astronomers and computer scientists proposed to recruit the public. Volunteers would download a simple screen saver, which would kick into gear whenever the user stopped using their machine. Once a computer finished scanning a chunk of data, it would automatically send it back to a central server, which would then give the computer a new chunk to work on. By 2005, over 5.2 million users had downloaded the SETI@home screensaver, logging nearly three million years of computing time. The Guinness Book of Records recognised it as ‘the largest computation in history.’89 While the project failed to find any proof of extraterrestrial life, it conclusively proved that the many can work together to cheaply and quickly outperform the few. Technical intelligence gathering agencies could use this crowd-computing model to help sift through the vast amounts of information they collect and have to analyse.
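The work-unit protocol described above (download a chunk, process it while the machine is idle, return the result, request the next chunk) can be sketched as follows. The class and the toy ‘analysis’ step are illustrative stand-ins, not SETI@home’s actual client-server code.

```python
# A minimal sketch of a SETI@home-style work-unit protocol: a central
# server hands out data chunks, idle volunteer machines process them,
# submit results and request the next chunk. The 'analyse' callable
# stands in for the real signal-processing step.

from collections import deque

class WorkServer:
    def __init__(self, chunks):
        self.pending = deque(chunks)   # chunks not yet handed out
        self.results = {}              # chunk id -> submitted result

    def get_chunk(self):
        return self.pending.popleft() if self.pending else None

    def submit(self, chunk_id, result):
        self.results[chunk_id] = result

def volunteer_loop(server, analyse):
    """Runs whenever the volunteer's machine is idle."""
    while (chunk := server.get_chunk()) is not None:
        chunk_id, data = chunk
        server.submit(chunk_id, analyse(data))  # send result, fetch next

# Toy example: each chunk is a list of readings; the 'analysis'
# just reports the strongest value in the chunk.
server = WorkServer([(i, list(range(i, i + 4))) for i in range(5)])
volunteer_loop(server, analyse=max)
print(server.results)  # → {0: 3, 1: 4, 2: 5, 3: 6, 4: 7}
```

The scheme scales because the server only tracks chunk ownership and results; all the expensive computation happens on the volunteers’ machines.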
SIGINT agencies could apply this type of crowd-‐computing to cryptanalysis as well.
They can use this kind of raw computational power to decipher encrypted codes. To demonstrate the strength of their encryption system, in 1977, RSA Laboratories, a leading computer and network security company specialising in factor-based keys, challenged readers of Scientific American and the academic community at large to factor a 129-digit number, which they dubbed RSA-129.90 Then, using the number to determine the key to the cipher, the participants were to find a secret message. At the time, even the most powerful computer would have required thousands of years to work through all the various possible combinations of numbers, so, for all practical purposes, the key was 87 U.C. Berkeley, About SETI@home. 88 Howe (2008), Crowdsourcing, 11. 89 Ibid. 90 RSA Laboratories, The RSA Factoring Challenge.
uncrackable. In 1994, with the advent of the information technology revolution, RSA re-posted their proposed project on Internet bulletin boards and asked for volunteers. An international community of encryption devotees, consisting largely of mathematicians, computer scientists and engineers, took up the call. Michael Graff, a university undergraduate, wrote a program that would allocate work to participants as more and more of the problem was completed. Derek Atkins, a Massachusetts Institute of Technology graduate student, provided the computer for saving and verifying the partial solutions as they came in. Before the project was done, hundreds of people had taken part, using whatever computing resources they could find. In ten months, the team had their answer. The unbreakable code had been broken. The most significant lesson of the incident was not that the team broke the code, but how they did it.91 Even complex, esoteric, intelligence-style problems can be tackled with an impromptu network of experts mustered with a minimum of formal organisation.92 This is the style of intelligence production that will often be needed by SIGINT agencies to tackle the challenges of the Information Age.93

3. Analysis
The complexity of the world and of the new information environment might appear
as largely a collector’s problem.94 However, on reflection, there is little denying that the information technology revolution has fuelled the need for more processing and for more information intermediaries.95 Since the data flows being processed are so huge, but their secret content is low, why should intelligence services waste their time and resources doing so much processing? Indeed, as established earlier, collectors are overwhelmed by the amount of information they have to gather. As a result, they cannot filter it all. The problem has therefore shifted onto analysts, who now have to finish the collectors’ job of vetting raw data, in addition to performing their main task of 91 Berkowitz, Goodman (2002), Best Truth, 92. 92 Ibid. 93 Ibid. 94 George, Bruce (2008), Analyzing Intelligence, 242. 95 Treverton (2003), Reshaping National Intelligence, 119.
dissecting and analysing it to produce intelligence. The validating function that is causing this duplication of effort could be performed by connecting the validators inside intelligence services to the surfers of the Net, who are outside intelligence, in the crowd. A large group of independent, external collaborators could review the vast amounts of data collected, synthesise them and report back to intelligence services what is of significance. In turn, analysts could assess that which has already been vetted, thus reducing the amount of information they need to process. Essentially, the crowd would serve as a filter through which raw information is vetted and directed to analysts.
Crowdsourcing has already been employed to this effect in an experiment led by the
US government. In March 2006, 55,000 Arabic-language documents captured in Iraq were posted on a Website run by the US Army Foreign Military Studies Office. The Bush administration did so in hopes of ‘leveraging the Internet’ to find new evidence of the prewar dangers posed by Saddam Hussein.96 In early November 2006, though, the entire set of documents was removed. Media reports stated that the Website was taken offline because of security concerns over the posting of sophisticated diagrams and other information regarding nuclear weapon designs. Indeed, roughly a dozen documents contained charts, diagrams, equations and lengthy narratives about bomb building that nuclear experts who had viewed them said went beyond what is available elsewhere on the Internet and in other public forums. For instance, the papers gave detailed information on how to build nuclear firing circuits and triggering explosives, as well as the radioactive cores of atom bombs.97 The whole project was thereby deemed a failure because it exposed dangerous knowledge to the public. However, it did prove that the crowd could perform the kind of analysis that intelligence professionals conduct. It quickly and effectively processed an enormous amount of information, all in Arabic and some of it very technical, and located that which was most important: nuclear blueprints. The Intelligence Community itself could not have done better in so short a time and at so low a cost. 96 Broad (2006), U.S. Web Resource Archive. 97 Ibid.
The project’s results also proved that the crowd could complement the work done
by intelligence services. Once the information regarding nuclear weapons was found, intelligence services took over where the crowd left off, and this new information was integrated into the all-source intelligence picture. Only after comparing it with the secret information they already held could they determine whether or not it was proof that Saddam had an active nuclear and WMD program. As it turned out, the information found in the documents did not disprove US pre-war intelligence assessments, but establishing this was only possible once the data had actually been processed, which intelligence services could not do themselves. In that way, this analytical experiment can really be seen as a crowdsourcing success. Indeed, insight came from the synthesis, not only the dissection, of knowledge.98 Intelligence services reviewed the results and added their ‘secret bits’ as appropriate for relevant consumers.99 Using the crowd to support intelligence analysts’ work thus allowed them to determine what was available openly in order to know what their secrets added.100 Hence, the crowd should be viewed as an analyst’s source of first resort, in that it could help determine the value of that which is secret.101 Consequently, this allows intelligence services to marshal their limited resources for the most intractable problems and relieves analysts of mundane tasks.102 Indeed, intelligence analysts need all their wits about them when doing their jobs. Having too much to do can lead to analytical mistakes. In addition to reducing their workload, crowdsourcing could help reduce analysts’ tendencies to succumb to biases.
In Robert Jervis’ view, analysts tend to look for, and find, what they expect to see.103 Analysts develop plausible inferences about what is happening that guide their interpretation of the relatively few specific bits of information that are available.104 Today, there is even more 98 George, Bruce (2008), Analyzing Intelligence, 247. 99 Treverton (2003), Reshaping National Intelligence, 124. 100 Ibid. 101 CRS Report for Congress (2007), Open Source Intelligence, 3. 102 Mercado (2004), Sailing the Sea of OSINT. 103 Jervis (2010), Why Intelligence Fails. 104 Ibid.
31
data to deal with, thus the tendency for analysts to incorrectly appraise intelligence is aggravated. Furthermore, their expertise can actually lead them to develop false assumptions. They do not make an effort to consciously articulate the beliefs that guide their thinking and consider what evidence should be available if they were true, or what it would take to disprove them.105 In contrast, amateurs within the crowd may have fewer vested interests in the subject of concern since the task at hand may not be their primary one, thus limiting the influence of any presumptions. By opening up the analytical process to elicit contributions from the crowd, the greater amount of feedback could help reduce the risks of cognitive biases. Crowdsourcing could help target cognitive weak spots and vulnerable theories, while tapping into the wisdom of the crowd to find innovative ideas to replace them.
Biases arising from cultural, organisational or political prejudices would be mitigated by the very nature of crowdsourcing. Crowdsourcing is designed to include diversity of opinion, independent judgement and decentralised beliefs. Diversity of opinion entails diversity of identity, diversity of skills, and diversity of political investment.106 From this diversity of opinions emerges a multitude of independent judgements. Crowdsourcing requires such diverse and unconventional viewpoints to be effective. If a central body spreads select ideas which are in turn taken up by the individuals of a group, it will never be possible to generate the diversity of opinions necessary for independence of thought. In this way, crowdsourcing helps avoid what Michael Herman describes in the intelligence tradecraft as ‘groupthink’, whereby members of an analytical community may consciously or subconsciously accept dominant analyses or arguments and, for a variety of reasons, refrain from rocking the boat by registering their analytical dissent.107 Of course, this does not mean that individuals within the crowd cannot be subject to exterior influences. As Robert Jervis points out: ‘facts do not speak for themselves, but inevitably are seen in a framework of understanding and belief - whether that framework is recognised or not.’108
105 Ibid.
106 Brabham (2008), Crowdsourcing as a Model for Problem Solving.
107 Herman (1996), Intelligence, 106.
108 Jervis (2010), Why Intelligence Fails.
Nevertheless, it suggests that when all of the different viewpoints of each individual are aggregated, the net result will be free of biases. Even though individual estimates can be quite poor, for reasons ranging from bias to ignorance, the errors tend to cancel out and produce a surprisingly accurate average.109 Thereby, crowdsourcing can reduce the chance of intelligence analysis being affected by biases.

4. Dissemination
The main issue with the dissemination of intelligence is who gets access to what information. Indeed, sharing information within the Intelligence Community is above all a question of secrecy. With crowdsourcing, that question translates into whether or not to share classified intelligence with the crowd. How intelligence services operate has always been kept secret, and so has the product of their endeavours. There are understandable and defensible reasons for that, safety concerns for human sources being a case in point.110 Nevertheless, is the huge amount of secrecy shrouding the IC necessary? The truth, according to Robert David Steele, is that our opponents already know most of our sources and methods. The Indian government, for example, was easily able to hide its preparations for a nuclear test because it knew precisely when secret imagery satellites would be passing overhead, and it knew precisely what secret signals satellites could and could not intercept.111 Steele adds that the same is true of terrorists. Therefore, the burden lies with intelligence agencies to justify so much secrecy.
Stephen C. Mercado writes that some insiders err in believing intelligence to be identical with covert sources and methods.112 Too many people within the IC still mistake secrets for intelligence and still confuse the method with the product. There is an assumption in the IC that information becomes more valuable as it rises in classification. Such a mentality is thought to result in a closed system that values keeping secrets more than accumulating general knowledge that is publicly available.113 This phenomenon was described by Lieutenant General Michael Hayden, speaking as director of the NSA in 2002, as the ‘stovepipe mentality’ that only allows information to move vertically, but not horizontally to the areas where it is actually needed.114 Security actually keeps more information out than it allows in, notes Robert David Steele.115 In fact, much criticism in the United States since the 9/11 attacks has concentrated on the failure of intelligence agencies to share information with others.116 As the information technology revolution continues and more individuals gain access to ever more information, it becomes clear that attempting to restrict and control information flows is an exercise in futility.117 Intelligence today is in the information, not the secrets. As a matter of fact, it has been estimated by some, including George Kennan, the doyen of American cold warriors, that open sources may account for ‘as much as eighty percent’ of the intelligence database in general.118
109 Robson (2012), $1 market offers more insight.
110 Gill, Phythian (2010), Intelligence in an Insecure World, 91.
111 Steele (2002), The New Craft of Intelligence, 10.
112 Mercado (2004), Sailing the Sea of OSINT.
113 Burke (2007), Freeing Knowledge.
114 Ibid.
115 Steele (2002), The New Craft of Intelligence, 8.
116 Gill, Phythian (2010), Intelligence in an Insecure World, 91.
117 Burke (2007), Freeing Knowledge.
118 Hulnick (1999), Fixing the Spy Machine.
The 9/11 Commission recommended that the Cold War assumption that intelligence can be shared only with those who ‘need to know’ be replaced by a ‘need-to-share’ culture of integration.119 Dr. D. Calvin Andrus, from the CIA’s Directorate of Support, explains that just as military units in the field must know where other units are located in geographic space, intelligence analysts must know where their colleagues across the IC are located in intellectual space. It is crucial to know who knows what so as to better coordinate efforts, and this knowledge results from sharing information. Information-sharing among individuals allows market niches to be filled, ants to fend off predator attacks, and plants to distribute themselves in the ecosystem.120 Increased information-sharing among intelligence officers will allow them to self-organise in order to respond in near-real-time to national security concerns.121 In this vein, John Perry Barlow, writing in Forbes, calls for the Intelligence Community to take its cues from the scientific community, where the scientific method is utilised to arrive at ‘truth’ based on the widest possible consensus through peer review and the distribution of conclusions and analysis, rather than concealment and classification.122 Cooperative techniques such as crowdsourcing represent a tremendous opportunity to rapidly gain information about the operating environment. Crowdsourcing provides a dynamic platform for multiple users, from a wide spectrum of people, to input data into a common situational picture. Often, once users see the power of mutually shared information, crowdsourcing programs go viral and grow rapidly as users add more information and new users see the value of the program.123 Accordingly, the main reason to share information through crowdsourcing is that it provides all contributors with a growing picture of the environment as well as tools to help them navigate it, allowing them to easily find the information they need to act. Thus, crowdsourcing can facilitate the dissemination of intelligence, enabling faster and more effective action to be taken.
119 Gill, Phythian (2010), Intelligence in an Insecure World, 91.
120 Andrus (2007), Toward a Complex Adaptive Intelligence Community.
In effect, the question of crowdsourcing intelligence dissemination is not about revealing sources and methods. It is about data processing and coordination; this, not secrecy, is the heart of the matter. In fact, as Robert David Steele argues: It is about whether the IC has language qualified clandestine case officers and analysts, and enough of them (it does not); whether it has a means of collecting, digitising and translating all open sources of information in a comprehensive and timely fashion (it does not); and whether it has an all-source processing capability that avoids the loss of key leads of forthcoming threats, like terrorist actions (it does not).124
121 Ibid.
122 Burke (2007), Freeing Knowledge.
123 Marshall (2012), Fusing Crowdsourcing and Operations.
124 Steele (2002), The New Craft of Intelligence, 10.
Crowdsourcing is an effective answer to those questions. As has been demonstrated, it makes possible an alternative model for intelligence that relies on distributed hypothesis testing, distributed collection, distributed processing, distributed analysis and shared intelligence. Crowdsourcing ultimately allows for real-time, Web-based global intelligence coverage, in all languages, of minor and major targets.

- Caveats and Limitations
As some experts declared to the Aspin-Brown Commission, more than 80 percent of the data used by the Intelligence Community now comes from open sources.125 While crowdsourced intelligence would rely on that 80 percent, there remains 20 percent that is secret and out of the crowd’s reach. That is the first limitation of crowdsourcing: crowdsourcing methods are not applicable when secrecy must be preserved. Covert action is the most obvious intelligence function that cannot be crowdsourced. The CIA officially defines covert action as ‘operations designed to influence governments, events, organisations, or persons in support of foreign policy in a manner that is not necessarily attributable to the sponsoring power.’126 Covert action operations may include political, economic, propaganda, or paramilitary activities. Covert operations are therefore not about producing intelligence; they are positive and direct actions. In contrast, crowdsourcing is only a method that can be employed to produce intelligence, not to act on it. That mandate will always remain in the hands of the appropriate authorities, and therefore crowdsourcing cannot be applied to covert action.
This fact leads to an important caveat: crowdsourcing techniques can be employed by anybody, even one’s adversaries. For example, during the 2009-2012 Iranian post-election protests, the Iranian government turned to the crowd to help identify regime dissidents and protestors. A website called Gerdab (meaning ‘vortex’), belonging to the Information Centre of the Islamic Revolutionary Guard Corps for Investigating Organised Crime, shows images of 20 people with red circles drawn around their faces, claiming that they have been involved in creating chaos in Tehran.127 Citizens are invited to call or email if they can identify the people in the photos. Meanwhile, some supporters of the protest movement have themselves published several photos of Iranian security forces, and in particular of suspected undercover agents, asking citizens to help identify them.128 Such a reality may in fact act as a disincentive to be part of the crowd. Although sources can find security in anonymity, decreasing the likelihood of reprisal for their actions, there is nevertheless a deadly cat-and-mouse game between non-democratic regimes, such as China and Iran, which use commercially available surveillance technology to monitor online activity and identify possible dissidents, and the public users who employ the latest in open-source encryption and data-hiding techniques to communicate their information. Risks therefore still exist, although advances in digital steganography, the art and science of writing hidden messages, are helping to ensure that both the contents and the communicating parties are protected. In cases such as the Iranian post-election protests, it is apparent that crowdsourcing can become a powerful tool for friends and foes alike. Nevertheless, the same can be said of the intelligence tradecraft as a whole. The fact that insurgents, criminals and foreign powers all conduct intelligence and covert operations does not stop these operations from being useful. Crowdsourcing is no worse than other methods used by intelligence services.
125 Berkowitz, Goodman (2002), Best Truth, 78.
126 Ibid., 124.
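The steganographic protection mentioned above can be illustrated with a minimal least-significant-bit (LSB) sketch: message bits are written into the lowest bit of each byte of some innocuous cover data (in practice, the bytes of an image or audio file). The function names and toy cover data below are illustrative assumptions, not drawn from any real tool.

```python
# Minimal LSB steganography sketch: hide message bits in the
# least-significant bit of each byte of some cover data.
# All names and data here are illustrative.

def hide(cover: bytes, message: bytes) -> bytes:
    # Expand the message into individual bits, low bit first.
    bits = [(byte >> i) & 1 for byte in message for i in range(8)]
    if len(bits) > len(cover):
        raise ValueError("cover too small for message")
    out = bytearray(cover)
    for i, bit in enumerate(bits):
        out[i] = (out[i] & 0xFE) | bit  # overwrite the lowest bit only
    return bytes(out)

def reveal(stego: bytes, length: int) -> bytes:
    # Collect the low bits back and reassemble them into bytes.
    bits = [b & 1 for b in stego[:length * 8]]
    return bytes(
        sum(bits[i * 8 + j] << j for j in range(8)) for i in range(length)
    )

cover = bytes(range(256))          # stand-in for image or audio bytes
stego = hide(cover, b"meet at 9")
assert reveal(stego, 9) == b"meet at 9"
```

Because only the lowest bit of each cover byte changes, the carrier looks essentially unmodified to a casual observer, which is the property the protesters described above rely on.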
Some may think that using the open-call format may awaken suspicions as to what intelligence services are focusing on, which in turn could lead to damaging counter-intelligence operations. However, the effects of using the open-call format are not as ruinous as they would appear. It is no secret that intelligence services focus on terrorism, WMD proliferation, the drug trade and foreign intent. Consequently, asking the crowd for information on these subjects is not so far-fetched. The information used by intelligence agencies is mostly public anyway. What needs to be secret is the result. Finished intelligence must be kept secret, as well as what will be done with it, but not necessarily the raw information that produces it.
127 Petrossian (2009), Iranian Officials ‘Crowd-Source’ Protesters Identities.
128 Ibid.
Another problem is the possibility that opponents deliberately attempt to subvert and defeat the crowd. One’s adversaries could infiltrate the crowd and try to spread disinformation. However, as was demonstrated earlier, the very nature of crowdsourcing serves as a guarantee that misinformation will be weeded out: when all contributions are combined, whatever misinformation they include will be cancelled out by good information. Of course, safety and regulatory measures are of crucial importance for crowdsourcing intelligence, so as not to give away valuable intelligence. Even though there are mechanisms to ensure that the system will not be tampered with, there is a possibility that these mechanisms could fail. Verification of crowdsourced information is difficult and can thus unfavourably affect the process. In order for crowdsourced reports to be permanently integrated as legitimate and actionable sources of information, a system can be created to rapidly identify inaccurate information, whether intentional, exaggerated, or accidental.129 Limiting reporting to only verified participants or using multiple similar incident reports as a form of corroboration are two techniques that could be used to thwart the efforts of those attempting to game the system.130
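The two safeguards just mentioned, limiting reporting to verified participants and corroborating incidents across multiple independent reports, can be sketched as a simple filter. The report format, field names and threshold below are hypothetical assumptions for illustration, not a description of any deployed system.

```python
# Hypothetical sketch: accept a crowdsourced incident only if it is
# backed by a minimum number of independent, verified reporters.
from collections import Counter

def corroborated(reports, verified_ids, min_corroboration=3):
    """Return incident keys (area, type) supported by enough
    independent, verified reporters to be treated as credible."""
    trusted = [r for r in reports if r["reporter"] in verified_ids]
    # Count distinct reporters per (area, incident type) pair, so one
    # person repeating a claim does not corroborate it.
    witnesses = Counter()
    seen = set()
    for r in trusted:
        key = (r["area"], r["type"])
        if (key, r["reporter"]) not in seen:
            seen.add((key, r["reporter"]))
            witnesses[key] += 1
    return {k for k, n in witnesses.items() if n >= min_corroboration}

reports = [
    {"reporter": "a1", "area": "district-4", "type": "mob"},
    {"reporter": "a2", "area": "district-4", "type": "mob"},
    {"reporter": "a3", "area": "district-4", "type": "mob"},
    {"reporter": "a1", "area": "district-4", "type": "mob"},  # repeat
    {"reporter": "zz", "area": "district-9", "type": "mob"},  # unverified
]
verified = {"a1", "a2", "a3"}
assert corroborated(reports, verified) == {("district-4", "mob")}
```

An adversary seeding disinformation would need to control several verified identities reporting the same fabricated incident before it crossed the threshold, which is exactly the gaming this technique is meant to raise the cost of.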
Last but not least, it may not be easy to actually draw a crowd to a project. In Haiti, surveys of the population indicated that respondents were initially reluctant to report security-related events for fear of reprisal. Nor may it be possible to motivate sufficient numbers of participants: between February and August of 2010, only 100 of the 12,286 reports received by disaster relief workers were security-related.131 However, with the right incentives, public engagement and education, the quality of reporting could well improve, along with the establishment of trust between participants and analysts.
129 Heinzelman, Waters (2010), Crowdsourcing Crisis Information.
130 Mumm (2012), Crowdsourcing: A New Perspective.
131 Heinzelman, Waters (2010), Crowdsourcing Crisis Information.
- Crowdsourced Situational Awareness
Crowdsourcing has been shown to be an effective problem-solving model that can be employed at all stages of the intelligence production cycle. It is the IC’s best bet for managing the effects of the information technology revolution. Crowdsourcing capitalises on the fact that the crowd is becoming an increasingly networked force highly capable of sharing information. This sharing of information also greatly enhances its quality. Better information in turn strengthens the understanding of events, which allows for a shared understanding of the environment. This shared knowledge then enables collaboration and self-coordination amongst the individual members of the crowd. Subsequently, their operations gain speed and become more sustainable as well as more adaptable, because they can rapidly evolve to meet new requirements. The final outcome is dramatically increased effectiveness.132 The Intelligence Community cannot afford to pass up the benefits of applying crowdsourcing to its work. The information technology revolution is putting the Intelligence Community at risk of failure. With its current organisational structures and operational procedures, the IC cannot process enough information to cover all possible threats. The sheer amount of information that needs to be processed is costing the IC its role as information leader. From now on, what will make intelligence services successful is whether or not they can process all the available information; their success will be measured in terabytes. Hence, if they can cope with the massive increase in information, they can achieve a better understanding of reality in the immediate present. This allows for better situational awareness, because more information can be processed and shared with more people.
Indeed, shared situational awareness is the key concept for thinking about how intelligence services now need to operate.133 Situational awareness (SA) refers to the analyst’s cognitive understanding of the current situation and its implications.134 It involves being aware of what is happening in the operational environment in order to understand how information, events and operations will impact goals and objectives, both immediately and in the near future. If intelligence services can leverage the power of crowdsourcing, they will be able to rapidly build up situational awareness. It will help provide information crucial to understanding what is currently happening, as well as what is about to happen and how to prepare for it.
132 Office of Force Transformation, The Implementation of Network-Centric Warfare.
133 Ibid.
134 Vidulich (1995), The Role of Scope.
Through the robust networking of well-informed, geographically dispersed forces, crowdsourcing is useful to intelligence services in that it can translate the IC’s problem of information overload into a competitive advantage.135 Intelligence services could integrate contributions from an enormous number of collaborators into a single all-source intelligence picture. This extraordinary competence consists of the ability to aggregate, in near-real-time, a wide variety of intelligence collected via HUMINT, IMINT and SIGINT, to plot these on a geospatial map or chart, and to make reasonably reliable predictions as to where each of the pieces being plotted is going to go.136 In turn, by increasing the close coupling of intelligence and information technology via crowdsourcing, it is possible to achieve precise effects and gain temporal advantage with dispersed forces.

- Model and Applications
Writing for the Small Wars Journal, Nicholas Mumm gives an example of what crowdsourced situational awareness looks like in practical, operational terms.137 As a situation unfolds, intelligence services can broadcast a call to action to the crowd via conventional means: media advertisement, leaflet distribution, cell-phone distribution, or word of mouth, in order to solicit text message responses from the populace within the area of operations. As members of the community respond via text message (SMS) or email, intelligence analysts capture their information (phone number and address), establish a social network, and initiate an ongoing, two-way conversation with the public through their mobile devices. For instance, a human source could report the location of an insurgent safe haven via SMS. Upon receipt of this report, the analyst could further query the particular source, the crowd or other intelligence services for additional information, including pictures of the building(s) and/or the person(s). As the information begins to arrive from the community, an analyst can use a relatively inexpensive software package to aggregate, map, and analyse the reporting. Using the large amounts of data collected, the commander should be able to analyse trends and develop further priority information requirements to push back to the community. As the network becomes more robust, the complexity of both the intelligence requirements and the intelligence reporting could evolve to the point where both lead to real-time responses to crises. All of this information, once aggregated, could quickly be shared with a ground force, providing enough information for immediate action. Such was the case when this technology-enabled dialogue between the populace and analysts allowed for the early detection and prevention of violence in Haiti.138 On two occasions, the US Marine Corps responded to crowdsourced reports of the formation of angry mobs near food distribution sites; in both cases, security forces dispersed the crowds before the situations became violent. Finally, after the ground force responds to such reports, analysts can broadcast the results of the operation to the entire community, providing immediate, positive feedback for their collaboration and asking for further contributions, thereby reinitiating the intelligence cycle.139
135 Office of Force Transformation, The Implementation of Network-Centric Warfare.
136 Steele (2002), The New Craft of Intelligence, 20.
137 Mumm (2012), Crowdsourcing: A New Perspective.
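The kind of aggregation such a software package might perform can be sketched as a simple binning of geotagged reports into grid cells to surface hotspots. The field names, coordinates and thresholds below are illustrative assumptions, not the interface of any specific product.

```python
# Hypothetical sketch of aggregating geotagged crowd reports into a
# coarse grid to surface hotspots for the common situational picture.
from collections import Counter

def hotspots(reports, cell_size=0.01, min_reports=2):
    """Bin (lat, lon) reports into cells roughly 1 km square and return
    the cells with enough reports to warrant an analyst's attention."""
    cells = Counter(
        (round(r["lat"] / cell_size), round(r["lon"] / cell_size))
        for r in reports
    )
    return {cell: n for cell, n in cells.items() if n >= min_reports}

incoming = [
    {"lat": 18.5392, "lon": -72.3364, "text": "crowd forming"},
    {"lat": 18.5390, "lon": -72.3361, "text": "angry mob at depot"},
    {"lat": 18.5101, "lon": -72.2904, "text": "road clear"},
]
flagged = hotspots(incoming)
# The two nearby reports fall into the same cell and are flagged together.
assert len(flagged) == 1
```

In practice the flagged cells would be plotted on the geospatial picture described above, and the count per cell doubles as a crude corroboration signal before a ground force is tasked.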
Using the same principles established by Nicholas Mumm, Christopher Ford describes how crowdsourcing applications were successfully tested in another case by the Defense Advanced Research Projects Agency (DARPA). In 2009, DARPA held a competition designed to, in its words, ‘explore the role the Internet and social networking plays in the timely communication, wide area team-building and urgent mobilization required to solve broad scope, time-critical problems.’ The so-called ‘Red Balloon competition’ required participating teams or individuals to find ‘10 8-foot balloons moored at 10 fixed locations in the continental United States.’140 Just before the competition opened, the balloons were surreptitiously floated at random locations across nine states: California, Tennessee, Florida, Delaware, Texas, Virginia, Arizona, Oregon, and Georgia. The winning team, composed of five students from the Massachusetts Institute of Technology, found all ten balloons in less than nine hours. Ford notes that: Incredibly, the team learned of the competition only four days before it started, and in less than two days they had a plan, a website, and more than 5,000 signed up to help them. They then applied that network to an extraordinarily complex problem spanning the United States. The results were shockingly accurate and swift. The significance and potential application of their system is remarkable. In a period of less than one week, five students constructed a productive, precise, layered, networked enterprise involving thousands of citizens.141 The challenge was to find out where the balloons were in the immediate present, and the crowd fulfilled the task. The potential of crowdsourcing techniques to produce current intelligence and gain situational awareness is perfectly illustrated here. The crowd rapidly organised itself and coordinated its efforts to set requirements, collect information, process it and then share its findings through the network.
138 Ibid.
139 Ibid.
140 Ford (2011), Twitter, Facebook and Ten Red Balloons.
This crowdsourcing model could also be used to locate a wanted person, says Ford.142 The Vanish Competition was presented in the August 2009 edition of Wired Magazine.143 The competition accompanied an article by Evan Ratliff, which examined instances in which people had attempted to make themselves disappear, and it had Ratliff himself go into hiding for thirty days. During that time, he travelled around the United States in disguise, not making contact with family, friends or editors. He ditched his cell phone, credit cards, and online accounts. He used physical disguises and masked his movements and communications online using various technical tools. A $5,000 prize was awarded to the first person to identify Ratliff, take his picture, and say the word ‘fluke.’ Ford describes the competition as going like this: Almost instantly, thousands of people became actively involved in the hunt. The participants self-organised into dozens of teams, pooling resources to find Ratliff. The teams and individual participants extensively used social networking tools such as Facebook and Twitter to connect and share information. It took twenty-five days for a team to track Ratliff down on a street in New Orleans – more than 2,000 miles from where he started. Throughout his time on the run, Ratliff continuously checked up on the social networking sites to track the trackers. He was eventually caught by team members who were able to identify him online and hack through the measures he had set up to protect his identity. Other team members, physically located in New Orleans, approached him and ended the contest.144 Again, not only did teams form naturally, a seemingly efficient division of labour developed as well. Ford further notes that in both challenges, existing social networks were used extensively to share information towards the completion of the challenge. Lastly, the winning team in the Vanish challenge was able to find Ratliff despite abundant misinformation provided by other teams and by Ratliff himself. The winning team devised a way to vet information and team members, thereby guaranteeing the accuracy of the information received.145 These two examples demonstrate how the crowd rapidly went through the whole intelligence process to obtain quality results. Crowdsourcing captures the experience base of users and systems and allows problem partitioning, quantitative confidence assessment, and validation in environments that may be partially compromised by adversaries.146 That being the case, it becomes apparent that crowdsourcing is very well suited to the production of current intelligence.
141 Ibid.
142 Ibid.
143 Ratliff (2009), Author Evan Ratliff is on the Lam.

- Going Forward: From Analysts to Amateurs
In conclusion, this paper has demonstrated that in a globalised environment, specialised intelligence agencies are unable to dominate all topics. In the case of truly global topics, those that are pervasive, spanning many countries, industries, and cultures, no one nation, much less any central intelligence agency, is fully able to get a grip on reality.147 Intelligence services need an information campaign commensurate with the dynamic and fluid environment of the information technology revolution. The IT revolution has produced so much data and so much change that intelligence analysts are overwhelmed by the massive increase in information and the complexity of the environment they need to understand and operate in. They are at risk of being left flat-footed, overtaken by individual, real-time reporting of what is happening and being done around the world, just as the printing-press industry has been overtaken by user-created content and citizen journalists.148 Such is also the case with peer-to-peer sharing, which is causing the downfall of the recording industry, and with free and open-source software, such as Linux, which is challenging established giants like Microsoft. Therefore, the Intelligence Community needs to find ways to institutionalise the transformational changes brought about by the information technology revolution, or it will be at extreme risk of going the way of the ‘buggy whip’ manufacturers in the wake of Henry Ford.149 The IC needs to build change management into the very structure of its organisations, instead of merely training more analysts or handing data over to computers.150 The IC has to capitalise on the fact that coordination costs have fallen through the floor thanks to information technology.151 Novel frameworks, like crowdsourcing, allow for the better management and use of information.
144 Ford (2011), Twitter, Facebook and Ten Red Balloons.
145 Ibid.
146 Drummond (2010), DARPA’s New Plans.
Crowdsourcing can improve how the Intelligence Community produces knowledge by turning it into an open call for contribution.152 In that way, the IC can better harness the collective intelligence of Internet users, using information and knowledge embedded in the Web in the form of data, metadata, user participation and the links created between these.153
147 Steele (2006), Peacekeeping Intelligence.
148 Woods, Serle (2012), How Twitter Mapped a ‘Covert’ US Drone.
149 Barger (2005), Toward a Revolution in Intelligence Affairs.
150 Drummond (2010), DARPA’s New Plans.
151 Benkler (2005), On the New Open-Source Economics.
152 Drummond (2010), DARPA’s New Plans.
153 Organisation for Economic Co-operation and Development (2007), Participative Web Resource.
Information technology should reshape organisational behaviour, not simply enhance existing processes.154 Crowdsourcing serves this purpose. It complements and facilitates the job of intelligence professionals, in addition to empowering non-professionals to contribute to security issues. It allows the crowd of amateurs to participate directly in the production of intelligence. Accordingly, citizens could have a greater say in the decision-making processes that affect their own lives. Robert David Steele believes that eventually, over many political objections, this will lead to reality-based budgeting and international public diplomacy that secure informed public support.155 In other words, crowdsourcing has the potential to democratise intelligence. Of course, professionals will always be necessary, because the crowd can primarily help deal with information rather than act on it. Nevertheless, the end result is that amateurs from the crowd could very well become intelligence analysts.
In the business world, crowdsourcing has been proven to be a very effective and
profitable tool. In the intelligence domain, experiments like the Red Balloon and Vanish competitions demonstrate the sound theoretical principles of applying crowdsourcing methods to intelligence. However, these experiments barely scratch the proverbial surface of the myriad of security applications, both physical and cyber, for which crowdsourcing could be used.156 Now the Intelligence Community needs to conduct further research into other applications for crowdsourcing in the context of intelligence. It has been established that crowdsourcing can produce up-‐to-‐date intelligence which in turn allows for early warning. Can crowdsourcing also be used to forecast events and build strategic intelligence? A few initial steps have been taken to explore this question, for instance, by DARPA’s cousin at the CIA, the Intelligence Advanced Research Projects Activity (IARPA). IARPA’s new ACES program has tested crowdsourced predictive intelligence by testing two scenarios: Would the military repeal ‘Don’t Ask, Don’t Tell’ -‐ its policy barring homosexuals to serve openly -‐ before the end of September 2011? And Will Ali Abdullah Saleh step down as Yemen’s president before that same deadline? Gays can now serve openly in the military 154 Barger (2005), T oward a R evolution i n Intelligence A ffairs. 155 Steele (2006), P eacekeeping Intelligence. 156 Ford (2011), T witter, Facebook and T en Red Balloons.
while Saleh is still clinging to power. ACES correctly predicted both outcomes, assigning the first a probability of 66 percent and the second 80 percent.157 The outlook therefore appears promising, but much more research is needed to develop the idea that intelligence can be produced through crowdsourcing before all doubts and questions are answered. The final verdict may or may not be positive, but one thing is certain: the Intelligence Community needs to adapt to the information technology revolution if it is to remain effective in the Information Age.
157 Parsons (2011), U.S. Government Turns to Crowdsourcing.
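The mechanics behind such crowd forecasts can be illustrated in a few lines of code. In the minimal sketch below, individual probability estimates are aggregated by a simple mean (one of many possible aggregation rules) and the resulting forecast is scored against the eventual outcome with a Brier score, a standard accuracy measure for probabilistic predictions. The numbers and the averaging rule are illustrative assumptions, not a description of IARPA's actual, more sophisticated methodology.

```python
def aggregate(forecasts):
    """Combine individual probability estimates by taking their mean."""
    return sum(forecasts) / len(forecasts)

def brier(p, outcome):
    """Brier score: squared error between forecast p and the outcome (1 or 0).
    Lower is better; a perfect forecast scores 0."""
    return (p - outcome) ** 2

# Hypothetical crowd estimates for "the event occurs before the deadline"
crowd = [0.6, 0.7, 0.55, 0.8, 0.65]
forecast = aggregate(crowd)      # mean of the five estimates: 0.66
score = brier(forecast, 1)       # the event did occur, so outcome = 1
```

A crowd forecast of 0.66 for an event that then happens yields a Brier score of roughly 0.12; a confident wrong forecast would score far worse, which is how such programs compare forecasters over many questions.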
- Bibliography

Ackerman, Spencer, "Navy Crowdsources Pirate Fight To Online Gamers", Wired Magazine, 2011. http://www.wired.com/dangerroom/2011/05/navy-crowdsources-pirate-fight-to-online-gamers/

Anderson, Chris, "The Long Tail", Wired Magazine, Issue 12.10, 2004. http://www.wired.com/wired/archive/12.10/tail.html

Anderson, Chris, "The Long Tail, In a Nutshell", Chris Anderson's Long Tail blog. Web resource, accessed 10 June 2012. http://www.longtail.com/about.html

Andrus, Calvin D., "Toward a Complex Adaptive Intelligence Community: The Wiki and the Blog", Studies in Intelligence, Vol. 49, No. 3, 2007.

Bamford, James, "The NSA Is Building the Country's Biggest Spy Center", Wired Magazine, 2012. http://www.wired.com/threatlevel/2012/03/ff_nsadatacenter/all/1

Barger, Deborah G., "Toward a Revolution in Intelligence Affairs", RAND Technical Report, 2005. http://www.rand.org/content/dam/rand/pubs/technical_reports/2005/RAND_TR242.pdf

Benkler, Yochai, On the New Open-Source Economics, TED Talk, 2005. http://www.ted.com/talks/yochai_benkler_on_the_new_open_source_economics.html

Berkowitz, Bruce D. and Allan E. Goodman, Best Truth: Intelligence in the Information Age, Yale University Press, 2002.

Best, Richard A., Jr., and Alfred Cumming, Open Source Intelligence (OSINT): Issues for Congress, Washington, DC: Congressional Research Service, Library of Congress, 2007. http://www.fas.org/sgp/crs/intel/RL34270.pdf

Brabham, Daren C., "Crowdsourcing as a Model for Problem Solving", The International Journal of Research into New Media Technologies, University of Utah, 2008. http://www.clickadvisor.com/downloads/Brabham_Crowdsourcing_Problem_Solving.pdf

Broad, William J., "U.S. Web Archive Is Said to Reveal a Nuclear Primer", The New York Times, 2006. http://www.nytimes.com/2006/11/03/world/middleeast/03documents.html?pagewanted=all
Burke, Cody, "Freeing Knowledge, Telling Secrets: Open Source Intelligence and Development", CEWCES Research Papers, Bond University, 2007. http://epublications.bond.edu.au/cgi/viewcontent.cgi?article=1010&context=cewces_papers

CNET News, Q&A: Jeff Howe on 'crowdsourcing', 2008. http://www.cbpp.uaa.alaska.edu/afef/howe.htm

CNN Politics, Transcript of Condoleezza Rice's 9/11 commission statement, 2004. http://articles.cnn.com/2004-04-08/politics/rice.transcript_1_terrorist-threat-freedom-hating-terrorists-response-across-several-administrations/9?_s=PM:ALLPOLITICS

Crowe, Adam, Disasters 2.0: The Application of Social Media Systems for Modern Emergency Management, CRC Press, 2012.

Defence Management Journal, Managing the flood of IMINT, Issue 55. Web resource, accessed 25 June 2012. http://www.defencemanagement.com/article.asp?id=541&content_name=Modernising%20Defence&article=18128

Dixon, Nancy M. and Laura A. McNamara, Our Experience with Intellipedia: An Ethnographic Study at the Defense Intelligence Agency, DIA Knowledge Laboratory Pilot Project, Sandia National Laboratories, 2008. https://cfwebprod.sandia.gov/cfdocs/CCIM/docs/DixonMcNamara.pdf

Drummond, Katie, "Darpa's New Plans: Crowdsource Intel, Edit DNA", Wired Magazine, 2010. http://www.wired.com/dangerroom/2010/02/darpas-new-plans-crowdsource-intel-immunize-nets-edit-dna/

Finley, Bruce, "Intelligence Fixes Floated at Conference", Denver Post, 22 August 2006.

Ford, Christopher M., "Twitter, Facebook and Ten Red Balloons: Social Network Problem Solving and Homeland Security", Homeland Security Affairs, Vol. 7, No. 3, pp. 1-8, 2011. http://www.hsaj.org/?fullarticle=7.1.3#fn1

GeoEye, Earth Imagery. Web resource, accessed 25 June 2012. http://www.geoeye.com/CorpSite/products-and-services/imagery-sources/

George, Roger Z., and James B. Bruce, Analyzing Intelligence: Origins, Obstacles, and Innovations, Georgetown University Press, 2008.
Gibson, Stevyn, "Open Source Intelligence: An Intelligence Lifeline", The RUSI Journal, Vol. 149, No. 1, pp. 16-22, 2004.

Gill, Peter and Mark Phythian, Intelligence in an Insecure World, Polity Press, Cambridge, 2010.

Google Search Results, Iran Nuclear. Web resource, accessed 8 May 2012. https://www.google.com/search?q=iran+nuclear&ie=utf-8&oe=utf-8&aq=t&qscrl=1

Hafner, Arthur W., Pareto's Principle: The 80-20 Rule, Ball State University, 2001.

Handel, Michael, War, Strategy and Intelligence, Routledge, London, New York, 1989.

Heinzelman, Jessica and Carol Waters, "Crowdsourcing Crisis Information in Disaster-Affected Haiti", United States Institute of Peace Special Report, 2010.

Herman, Michael, Intelligence Power in Peace and War, Cambridge University Press, Cambridge, 1996.

Howe, Jeff, Crowdsourcing: Why the Power of the Crowd Is Driving the Future of Business, Crown Business, 2008.

Hulnick, Arthur S., Fixing the Spy Machine: Preparing American Intelligence for the Twenty-First Century, Westport, CT: Praeger Publishers, 1999.

Jervis, Robert, Why Intelligence Fails: Lessons from the Iranian Revolution and the Iraq War, Cornell University Press, Ithaca, NY, 2010. https://www.cia.gov/library/center-for-the-study-of-intelligence/csi-publications/csi-studies/studies/vol.-54-no.-3/why-intelligence-fails-lessons-from-the-iranian.html

Kanfer, R., "Motivation Theory and Industrial and Organizational Psychology", Handbook of Industrial and Organizational Psychology, Chichester: Wiley, 1990.

Leadbeater, Charles, We-Think: Mass Innovation, Not Mass Production: The Power of Mass Creativity, Profile Books, 2008.

Lessig, Lawrence, The Future of Ideas: The Fate of the Commons in a Connected World, Vintage, 2002.
Magnusson, Stew, "Feds Lagging in Most Disaster Scenarios, McHale Says", National Defense, 2006. http://www.nationaldefensemagazine.org/archive/2006/November/Pages/SecurityBeat2819.aspx?PF=1

Magnusson, Stew, "Military Swimming In Sensors and Drowning in Data", National Defense, 2010.

Marshall, Jeffery E., "Fusing Crowd Sourcing and Operations to Strengthen Stability and Security Operations", Thermopylae Sciences and Technology. Web resource, accessed 8 May 2012. http://t-sciences.com/pdf/white_paper/All-Source-Intelligence-and-Operational-Fusion.pdf

Martin, Alex and Peter Wilson, "The Value of Non-Governmental Intelligence: Widening the Field", Intelligence and National Security, Vol. 23, Issue 6, pp. 767-776, 2008.

Mathur, Pankaj, Crowdsourcing, InfoGroup. Web resource, accessed 11 June 2012. http://www.infogroup.com/LinkClick.aspx?fileticket=D9rDF56Q23Q=&tabid=70

Meier, Patrick, "Crisis Mapping Syria: Automated Data Mining and Crowdsourced Human Intelligence", iRevolution, 2012. http://irevolution.net/2012/03/25/crisis-mapping-syria/

Mercado, Stephen C., "Sailing the Sea of OSINT in the Information Age", Studies in Intelligence, Vol. 48, No. 3, 2004.

Merriam-Webster Dictionary, Definition of Crowdsourcing. Web resource, accessed 5 May 2012. http://www.merriam-webster.com/dictionary/crowdsourcing

Mumm, Nicholas, "Crowdsourcing: A New Perspective on Human Intelligence Collection in a Counterinsurgency", Small Wars Journal, 2012. http://smallwarsjournal.com/node/12036

New Oxford American Dictionary, Definition of Amateur. Web resource, accessed 16 June 2012.

New Oxford American Dictionary, Definition of Expert. Web resource, accessed 16 June 2012.
Office of Force Transformation, The Implementation of Network-Centric Warfare, United States. Web resource, accessed 6 August 2012. http://www.au.af.mil/au/awc/awcgate/transformation/oft_implementation_ncw.pdf

Office of the Director of National Intelligence, The National Intelligence Strategy, United States, August 2009. http://www.dni.gov/reports/2009_NIS.pdf

Omand, David, "OSINT's Place in All Source Intelligence", Open Source Intelligence module, 20 October 2011, Department of War Studies, King's College London.

Omand, David, Securing the State, London: Hurst & Company, 2010.

Organisation for Economic Co-operation and Development (OECD), Participative Web: User-Created Content, 2007. http://www.oecd.org/internet/interneteconomy/38393115.pdf

Parsons, Dan, "U.S. Government Turns to Crowdsourcing for Intelligence", National Defense, 2011. http://www.nationaldefensemagazine.org/archive/2011/December/Pages/USGovernmentTurnstoCrowdsourcingforIntelligence.aspx

Petrossian, Fred, "Iranian officials 'crowd-source' protester identities", Global Voices, 2009. http://globalvoicesonline.org/2009/06/27/iranian-officials-crowd-source-protester-identities-online/

Random House, Q&A with James Surowiecki, 2004. http://www.randomhouse.com/features/wisdomofcrowds/Q&A.html

Ratliff, Evan, "Author Evan Ratliff Is on the Lam. Locate Him and Win $5,000", Wired Magazine, August 2009. http://www.wired.com/vanish/2009/08/author-evan-ratliff-is-on-the-lam-locate-him-and-win-5000/

Robson, John, "$1 market offers more insight into U.S. presidential race than opinion polls", 24H Vancouver, 2012. http://tippie.uiowa.edu/iem/media/story.cfm?id=2793

RSA Laboratories, The RSA Factoring Challenge. Web resource, accessed 2 August 2012. http://www.rsa.com/rsalabs/node.asp?id=2094

Ryan, R. and Deci, E., "Intrinsic and Extrinsic Motivations: Classic Definitions and New Directions", Contemporary Educational Psychology, Vol. 25, pp. 54-67, 2000.
Shirky, Clay, Here Comes Everybody: The Power of Organizing Without Organizations, Penguin Press, 2008.

Shirky, Clay, Institutions vs. Collaborations, TED Talk, 2005. http://www.ted.com/talks/clay_shirky_on_institutions_versus_collaboration.html

Sims, Jennifer E. and Burton Gerber, Transforming U.S. Intelligence, Georgetown University Press, 2005.

Smith, Graeme, "How social media users are helping NATO fight Gadhafi in Libya", The Globe and Mail, 2011. http://www.theglobeandmail.com/news/world/africa-mideast/how-social-media-users-are-helping-nato-fight-gadhafi-in-libya/article2060965/

Steele, Robert David, "Peacekeeping Intelligence and Information Peacekeeping", International Journal of Intelligence and CounterIntelligence, Vol. 19, Issue 3, pp. 519-537, 2006.

Steele, Robert David, The New Craft of Intelligence: Personal, Public, & Political: Citizen's Action Handbook for Fighting Terrorism, Genocide, Disease, Toxic Bombs, & Corruption, OSS Press, 2002.

Surowiecki, James, The Wisdom of Crowds, Anchor, 2005.

Syria Tracker, Crowdmap. Web resource, accessed 20 June 2012. https://syriatracker.crowdmap.com/

Tapscott, Don and Anthony D. Williams, Macrowikinomics, Atlantic Books, London, 2010.

Tapscott, Don and Anthony D. Williams, Wikinomics: How Mass Collaboration Changes Everything, Portfolio, 2008.

The Washington Post, Top Secret America: A Washington Post Investigation, 19 July 2012. http://www.pulitzer.org/files/entryforms/WashPost_TSA_Item1.pdf

Treverton, Gregory F., Reshaping National Intelligence for an Age of Information, RAND Studies in Policy Analysis, 2003.

Turner, Michael A., "Issues in Evaluating U.S. Intelligence", International Journal of Intelligence and Counterintelligence, Vol. 5, No. 3, 1991.
University of California, Berkeley, About SETI@home. Web resource, accessed 2 August 2012. http://setiathome.berkeley.edu/sah_about.php

Vidulich, M. A., "The role of scope as a feature of situation awareness metrics", International Conference on Experimental Analysis and Measurement of Situation Awareness, Embry-Riddle Aeronautical University Press, 1995. http://www.raes-hfg.com/crm/reports/sa-defns.pdf

Wei, William, "T-Shirt Startup Threadless's Offices: Almost As Cool As Its Profitable, Multi-Million Dollar Business", Business Insider, 11 February 2009. http://www.businessinsider.com/threadless-office-tour-2011-2?op=1

Woods, Chris and Jack Serle, "How Twitter mapped a 'covert' US drone operation in Yemen", The Bureau Investigates, 2012. http://www.thebureauinvestigates.com/2012/05/18/how-twitter-mapped-a-covert-us-drone-operation-in-yemen/