Playing to the Crowd
J.D. King
Crowdsourcing has shown great success, great promise and a number of red flags. In very short order, it has begun to affect just about everything.

It has been five years since Jeff Howe’s article in Wired magazine coined the term to describe the practice of outsourcing traditionally in-house functions to large groups of outsiders in the form of an open call. What was then a burgeoning phenomenon exemplified by the open-source software movement, Wikipedia and eBay has proliferated into a “diverse landscape offering over a dozen distinct models for creating value with crowds,” according to Ross Dawson, a futurist and crowdsourcing expert. Today, for all kinds of start-ups, established businesses and government agencies, some form of crowdsourcing is the model of choice for performing rote operational tasks at minimal cost, as well as for analyzing data, messaging and marketing, funding, managing risk, driving innovation, and designing — and manufacturing — products. “The reality of global distributed work is going to change how we work, the work we do and the shape of organizations,” said Dawson.
One of the more established crowdsourcing models is the service marketplace, an online meeting place where companies post projects and freelancers bid for the work. Many of these are “microtask” platforms, where large-scale projects are broken into many bite-size, web-based tasks which, though simple, require human intelligence. Perhaps the best-known microtask platform is Amazon’s Mechanical Turk, launched publicly in 2005, which at any given time offers hundreds of thousands of human intelligence tasks, or “HITs,” requiring skills such as transcription, categorization or translation. Registered users perform the tasks for, in most cases, pennies per HIT, making their money through volume and providing the businesses or individuals that use the service with an inexpensive, on-demand, scalable workforce.
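To make those mechanics concrete, requesters interact with Mechanical Turk programmatically as well as through its website. What follows is a minimal, illustrative sketch of posting a single HIT with Amazon’s boto3 Python SDK (a library that postdates this article); the sandbox endpoint is Amazon’s real test environment, but the task URL, reward and timing values are assumptions for illustration, not details from the article.

```python
import boto3  # Amazon's AWS SDK for Python

# Connect to the Mechanical Turk requester API. The sandbox endpoint
# shown here lets requesters test HITs without paying real workers;
# AWS credentials are assumed to be configured in the environment.
mturk = boto3.client(
    "mturk",
    endpoint_url="https://mturk-requester-sandbox.us-east-1.amazonaws.com",
    region_name="us-east-1",
)

# An ExternalQuestion points workers at a task page the requester hosts;
# the URL below is a placeholder, not a real task.
question_xml = """<ExternalQuestion xmlns="http://mechanicalturk.amazonaws.com/AWSMechanicalTurkDataSchemas/2006-07-14/ExternalQuestion.xsd">
  <ExternalURL>https://example.com/transcribe</ExternalURL>
  <FrameHeight>400</FrameHeight>
</ExternalQuestion>"""

# Post one transcription HIT paying a few cents, the per-task pricing
# model described above. All values here are illustrative.
hit = mturk.create_hit(
    Title="Transcribe a short audio clip",
    Description="Listen to a 10-second clip and type what you hear.",
    Keywords="transcription, audio, quick",
    Reward="0.05",                    # USD, passed as a string
    MaxAssignments=3,                 # three workers per clip, for cross-checking
    LifetimeInSeconds=86400,          # listed on the marketplace for one day
    AssignmentDurationInSeconds=600,  # a worker gets ten minutes to finish
    Question=question_xml,
)
print("Posted HIT:", hit["HIT"]["HITId"])
```

Completed work would then be retrieved and approved through the same client, which is how the pennies-per-HIT economics described above get settled at scale.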
A number of companies founded in the last several years are based on variations of the service model. oDesk, Elance, Freelancer.com, Clickworker, Crowdflower and Serv.io have expanded the concept to offer their user-clients task coordination and oversight, workflow management, guaranteed turnaround times and quality assurance. Clients are able to use them as on-demand agencies for special projects, or to virtually staff departments or handle ongoing functions. Fees paid to workers generally correspond to the nature and complexity of the job. oDesk, for example, reports an average job fee of $5,000.

Another popular crowdsourcing model, known as “crowdfunding,” is evolving rapidly as social media and micropayment technology have made it easy to secure donations from supporters at very low cost. At first it was primarily the province of political campaigns, charities and the arts, but more recently sites such as crowdcube.com are focusing on funding businesses rather than creative ventures and finding ways to offer investors equity.

Increasingly, companies and organizations are using an open-competition model to tap into the crowd’s — or their customers’ — ability to make contributions to messaging, R&D and product development. In 2010, General Electric’s Ecomagination Challenge yielded promising ideas for efficient energy production, in which GE invested $200 million.
When the Defense Advanced Research Projects Agency wanted to design a new Humvee in 2011, it held an open contest that drew 160 entrants in just two weeks. Fourteen weeks later, the winning design — which looked like a futuristic, jacked-up SUV — was built and hailed as the first crowdsourced military vehicle. When Goldcorp, a financially struggling Canadian mining company, was unable to find gold on its land in northern Ontario, it put all its geological data online and offered $500,000 in prize money for accurate suggestions. The company received submissions from all over the world, including some using 3D computer-modeling techniques, and found $3 billion worth of gold, making Goldcorp one of Canada’s biggest mining companies.

The granddaddy of crowd-competition platforms is InnoCentive, launched in 2001 by Eli Lilly as a way to connect with innovators outside the company. It soon became clear that InnoCentive was attracting a wide variety of experts who were adept at solving problems outside their disciplines, so Lilly decided to open up its brain trust, for a fee, to other companies. Those companies posted their problems and challenges and offered thousands of dollars for solutions — sometimes tens or hundreds of thousands, depending upon the complexity of the problem. By the end of 2011, more than 250,000 InnoCentive solvers worldwide had worked on more than 1,300 challenges and solved half of them, including identifying a biomarker to measure the progression of amyotrophic lateral sclerosis in patients, designing new methods of oil spill recovery, and inventing a solar-powered light for areas without electricity.

There has been a boom in similarly ambitious crowd-competition sites in recent years. Kaggle, founded in 2010, is a platform that allows organizations to post their data and have it scrutinized by scientists around the world. Competitors are offered prizes to develop algorithms that predict outcomes based on the data, such as predicting bankruptcy from credit scores or hospitalization rates from medical claims. A newer site, Forecasting Ace, asks registered volunteers to predict future events and outcomes based simply on their informed, but not necessarily expert, judgments. The project is sponsored by a U.S. government agency, the Intelligence Advanced Research Projects Activity, to test the premise that aggregating a large volume of such judgments can yield accurate forecasts about politics, business, technology, medicine and social trends.
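The premise behind that experiment, that a large pool of independent judgments can beat almost any individual one, is easy to demonstrate. The following is a small, self-contained Python sketch (not from the article) that simulates volunteers estimating an unknown quantity and compares a single judgment against the crowd’s median; every number is invented for illustration.

```python
import random
import statistics

# Toy demonstration of the aggregation premise: many noisy, independent,
# non-expert judgments about an unknown quantity, pooled into one estimate.
random.seed(42)
TRUE_VALUE = 100.0

# Simulate 1,000 volunteers whose guesses are unbiased but individually
# noisy (standard deviation of 25 around the true value).
judgments = [random.gauss(TRUE_VALUE, 25) for _ in range(1000)]

# Any single judge may be far off the mark...
print(f"one judgment: {judgments[0]:.1f}")

# ...but the median of the crowd lands close to the truth and is robust
# to a handful of wild outliers.
print(f"crowd median: {statistics.median(judgments):.1f}")
```

Real forecasting platforms weight and calibrate judgments rather than taking a raw median, but the statistical intuition is the same.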
Despite the breadth and seemingly irresistible momentum crowdsourcing has gained, or perhaps because of it, many have voiced legitimate concerns — about the quality, controllability and accuracy of the work, the logistical obstacles presented by integrating the crowd into a company’s workflow, and the wisdom of having outsiders work with information that may be sensitive or proprietary. Increasingly vocal professionals, especially in creative or highly specialized fields, contend that crowdsourcing devalues their professions by increasing the amount of unpaid “on-spec” work, and that it puts such severe pressure on prices that it will destroy companies and spawn “digital sweatshops.”

Most crowdsourcing advocates do not dismiss such concerns, but they tend to believe equitable standards and best practices will emerge. For John Winsor, founder and CEO of Victors & Spoils, an ad agency built on crowdsourcing principles, it will require “a delicate balance between encouraging participation and maintaining clarity of overall business objectives. Every company will develop its own way of handling that debate. For now, the most important thing is to jump in and try.”