International, Public & Corporate Communication
Quarterly Digest of Public Affairs News
Issue # 4 - 2016

FOREWORD

This newsletter aims to provide Public Affairs practitioners with a short selection of recently posted stories, papers and other items that may help them stay abreast of new trends or stimulate debate. Sources are linked and any copyright remains with the authors.

In this issue:

The 10 commandments of fake news clickbait
Everyone Wants to Stop ‘Fake News,’ but No One Seems to Know What Exactly It Is
‘Fake news’ – why people believe it and what can be done to counter it
Blame the Echo Chamber on Facebook. But Blame Yourself, Too
Make a Better Facebook, for You and the Nation
How does misinformation spread online?
Quotable quotes
Infographic: Advertising vs Public Relations

Fake news! A single topic clearly dominated the interest of our PR community during this quarter: the spread of fake news and its impact – due in part to social media – on world events like the US elections. We reprint here a story we published in November on fake news and the associated clickbait economy, together with posts that have addressed this topic well. Particular reference is made to Facebook, as the medium most accused of spreading false information for the sake of business. Also included are quotable quotes relevant to the matter.

Quite popular during this quarter was also an infographic illustrating the main differences between PR and advertising; it really captured the essence of the distinction. The editor wishes you all a Happy Holiday Season!



The 10 commandments of fake news clickbait By Franco Veltri

I have recently been inundated by a growing number of fake news stories. Half of them are generated by sites supporting political ideas or political figures. The rest simply play on my curiosity or my empathy. The first category may look like a legitimate attempt to promote what the authors consider the best choice. Unfortunately, that is rarely true.

During the American election it was discovered that a good share of the websites promoting ideas that Trump’s supporters would agree with were in fact small commercial enterprises based around the world. BuzzFeed News identified more than 100 pro-Trump websites being run from a single town in Macedonia. The aim was simply to hook viewers and generate ad revenue. This is the clickbait economy at work: the content does not matter, only the number of clicks. Editors of other sites – and of ad-hoc Facebook accounts – simply search the net for current or past stories whose content can be plagiarized, perhaps changing location and date, to entice the viewer to read the story. In most cases the link leads to a page containing a few lines of text and lots of ads.



To read an entire article one may have to click several times to reach the following page. Unfortunately, even legitimate online political magazines have access to the same revenue sources.

But how much can be earned playing this game? According to Mani Gandham, the content itself, especially at one article a day, is not going to get anywhere near enough traffic to matter: such sites will probably make cents, or maybe dollars on the high end, per month. Many clickbait sites, however, aren’t about the content but about playing the ad industry through arbitrage and fraud.

Google has a leading role in this business. This year, Google will generate $57.80 billion in total digital ad revenue worldwide, an increase of 9.0% over last year. Every time someone does a search on Google, an AdWords auction is run. Advertisers bid on keywords in order to serve an ad which, when clicked, leads the searcher to a website landing page where a conversion goal, such as a lead or a purchase, can be completed. The AdWords auction is based on two fundamental elements: keywords and cost-per-click (CPC) bidding. Advertisers pay Google for each click, and site owners who host ads through Google’s network are in turn paid by Google.
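To make the mechanics concrete, here is a minimal sketch of a generalized second-price keyword auction of the kind commonly used to describe AdWords ranking. The bids, quality scores and the exact pricing formula are illustrative assumptions, not Google's actual implementation.

```python
# Illustrative sketch of a generalized second-price keyword auction.
# All figures and the ranking formula are assumptions for the example,
# not Google's actual implementation.

def run_auction(bids):
    """bids: list of (advertiser, max_cpc_bid, quality_score) tuples."""
    # Rank ads by bid * quality score (a common description of "ad rank").
    ranked = sorted(bids, key=lambda b: b[1] * b[2], reverse=True)
    results = []
    for i, (name, bid, quality) in enumerate(ranked):
        if i + 1 < len(ranked):
            # Pay just enough to outrank the ad in the slot below.
            next_rank = ranked[i + 1][1] * ranked[i + 1][2]
            cpc = min(round(next_rank / quality + 0.01, 2), bid)
        else:
            cpc = 0.01  # nominal minimum for the last slot
        results.append((name, cpc))
    return results

# Advertiser A wins the top slot despite bidding less than B, thanks to quality.
print(run_auction([("A", 2.00, 8), ("B", 3.00, 4), ("C", 1.00, 6)]))
```

The point of the sketch is simply that payment happens per click, which is why a page full of ads and almost no content can still earn money as long as people keep clicking.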

A key role in this fishing for clicks is played by headlines, both because they attract interest and because the keywords they contain increase the probability that the story will appear high among search results. Is this a problem if the author does not have a great imagination or deep knowledge of how search engines work? Of course not. There are many sites where you simply insert a candidate headline and get back one optimized for the purpose. For instance, Poynter mentions the “Clickbait Headline Generator“, which quickly gives you content like “Is Netflix CEO Reed Hastings getting high with Vladimir Putin?” and “Is John Kerry teasing Ben Affleck at your parents’ place?” Throw some pictures under them, fire up Chartbeat and watch your Christmas bonus grow! Actually, the headline of the post you are reading now is a hook, generated with one of these online services. The original headline was “Fake news and click bait are a plague.” If you are still looking for ten commandments, just stop reading. There are none listed here.



Can anything be done, aside from personally debunking every story that may be part of a perception campaign or that spreads fake claims? There are indeed online tools that can help. If you use Chrome as your browser, you can install the extension Stop Clickbait. The extension works across multiple websites to identify clickbait content and notify the user, who can then block similar clickbait content (much as an ad blocker does). It also lets the user flag links that went undetected as clickbait, and report links misclassified as clickbait, which helps improve the extension. It currently works only on links in Facebook and Twitter.

Something indeed is moving. Google announced on 15 November that it will stop allowing fake news sites to use its ad software. Facebook followed with a similar policy. “In accordance with the Audience Network Policy, we do not integrate or display ads in apps or sites containing content that is illegal, misleading or deceptive, which includes fake news,” Facebook said in a statement. They certainly do it for commercial, not ethical, reasons: losing credibility means losing clicks, and hence ad revenue. But maybe there is some light at the end of the tunnel.

See also:
https://toinformistoinfluence.com/2016/11/18/i-think-donald-trump-is-in-the-white-house-because-of-me-facebook-fake-news-writer/
https://toinformistoinfluence.com/2016/11/15/false-misleading-clickbait-y-and-satirical-news-sources/



Everyone Wants to Stop ‘Fake News,’ but No One Seems to Know What Exactly It Is
Like “terrorism,” “fake news” is a politically loaded, abstract concept that we are rushing to stop without bothering to define.
By Adam H. Johnson

Two headlines from USA Daily News 24, a fake-news site registered in Veles, Macedonia. (AP Photo / Raphael Satter)

Like “terrorism” and “WMD” before it, “fake news,” through sheer repetition of the concept, is now officially a thing. Something to be combated. Something we must all get behind and destroy. Since the election—and to an extent right before it—the scourge of fake news as a unique, distinct, and morally urgent threat to our democracy has been thrust upon the public. In this discussion the primary focus has been how, not if, we stop it. “Officials” are “worried” and “governments” are “looking into” curbing the “epidemic.” Hillary Clinton emerged Thursday for the first time since her concession speech to warn that “lives are at risk” over fake news and that the “danger” must be “addressed quickly.” Everyone’s on board: Fake news must be stopped.

But what exactly is “fake news”? The reality is, no one seems to know. Or, at least, those who have publicly attempted to define the “phenomenon” have failed spectacularly at it. Obvious examples of fake stories are easy to find: the “Pizzagate” theory that resulted in a North Carolina man opening fire at a DC-area restaurant after online rumors spread that the owners were running a child-sex slavery ring connected to the Clintons; the pope endorsing Donald Trump; Hillary Clinton snuffing out FBI agents. These are all manifestly false stories being cited as examples of the scourge of fake news.

But from here, many in the media and nominally independent research groups have pivoted to a much broader, far more pernicious definition of what is and isn’t “fake,” with little notice. Over the past month, three separate lists of “fake news” websites—boosted and shared by major media outlets, journalists, and pundits—have gone viral, despite the fact that all three lists included legitimate outlets well within the mainstream. These “fake news” outlets include RawStory, Consortiumnews, Crooks and Liars, Antiwar.com, naked capitalism, Truthout, and Truthdig.



Several of these outlets are presently threatening a lawsuit against the most egregious peddler of fake news lists, PropOrNot, an anonymous group of researchers whose blacklist The Washington Post credulously vouched for with little care or consideration. Ironically, in this panic, editorial standards have fallen in inverse proportion to the fear of fake news.

The most heavily criticized “fake news” list, by PropOrNot, was promoted by The Washington Post in a story about how the Russian government purposefully “echoed and amplified” right-wing fake news sites to influence the presidential election. The piece, which was shared by dozens of high-profile names—such as Robert Reich, Joy-Ann Reid, and Jonathan Weisman—stated that PropOrNot had identified “more than 200 websites as routine peddlers of Russian propaganda during the election season.” As The Intercept was quick to point out, PropOrNot’s blacklist included “WikiLeaks and the Drudge Report, as well as Clinton-critical left-wing websites…[and] libertarian venues.” In a subsequent interview, a researcher from PropOrNot advanced the bizarre theory that Russian President Vladimir Putin had Labour MP Jo Cox killed in some type of “Manchurian candidate” plot. When asked by The New Yorker’s Adrian Chen why they chose to remain anonymous, the PropOrNot spokesperson evoked Cox, insisting that “Russia uses crazy people to kill its enemies.”

While many who legitimately think fake news is a problem have sought to distance themselves from PropOrNot as a kooky one-off, the speed and scope with which their absurd list was spread—and the complete lack of skepticism on display when this was done—show how much of this story is a problem in search of evidence. Standards were lowered when vetting PropOrNot because the reporter, the editors, and those who spread the story needed it to be true. The plague of fake news had to be addressed, and the market for newsy, empirical collateral that reinforced this narrative was in high demand.

The scale of the problem is also often misunderstood. Probably the most cited article on the scope of the fake news issue is a BuzzFeed piece from November 16 that showed fake news “outperformed” real news on Facebook in the run-up to the election. BuzzFeed showed that the top 20 fake news stories garnered more social media engagements on the site than the top 20 real news stories. The problem with the ambiguous framing is that many took it to mean that people, in the aggregate, engaged with more fake news than real news—which isn’t at all the case. Executive Director of Human Rights Watch Kenneth Roth, for example, breathlessly tweeted out, “As US election progressed, fake news engaged Facebook users more than real news.” But this isn’t remotely true; it can’t be, since real news outlets produce far more content than fake ones do. To interpret this report to mean fake news engaged Facebook users more overall is like saying the Dallas Cowboys scored more points than the Chicago Cubs did runs in 2016 by taking their top three games and adding up the totals. While it’s true in this limited scope, overall the Cubs played 163 more games and have thus scored far more on the whole. Indeed, the BuzzFeed article itself notes in the very last line that “large news sites overall see more engagement on Facebook than fake news sites.” But this is not how journalists and outlets framed BuzzFeed’s findings.
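A toy calculation makes the distinction concrete; the engagement figures below are invented for illustration and are not BuzzFeed's data.

```python
# Invented engagement figures illustrating the "top-20 vs. total" distinction.
fake_story_engagements = [900_000] * 20 + [5_000] * 200        # a few very large hits
real_story_engagements = [800_000] * 20 + [50_000] * 20_000    # far more stories overall

def top20(engagements):
    return sum(sorted(engagements, reverse=True)[:20])

print(top20(fake_story_engagements) > top20(real_story_engagements))  # True: fake "wins" the top-20 cut
print(sum(fake_story_engagements) > sum(real_story_engagements))      # False: real news dominates in aggregate
```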
The types of fake news people usually cite, broadly speaking, fall into three categories: (1) outright fabrications (the pope endorsing Trump), (2) deliberate political propaganda (how many view Russia Today), and (3) political clickbait that isn’t entirely true, nor necessarily part of a foreign scheme.



It’s the casual conflation of (1) with (2) and (3) that should raise alarm for those who value the free flow of ideas and information. Take, for example, a recent BuzzFeed piece on “US intelligence officials” expressing concern that Russia steered fake news in the buildup to the election. The headline of the piece and the intro vaguely address the problem of “fake news” (without, it should be mentioned, citing one example of a fake news story Russia was behind); then it drops the “fake news” hook about one-third in and moves on entirely to “Russian propaganda”—neither a new nor a unique threat. CNN’s breakdown of the problem of “Russian fake news” likewise goes back and forth between “fake news” and routine propaganda and disinformation. A Guardian rundown of the “insidious trend” of fake news around the globe casually switches from deliberate lies to vague notions of misinformation when it suits the writer’s thesis:

Samuel Laurent, head of Le Monde’s fact-checking section, Les décodeurs, said: “In France, there isn’t a wide presence of totally invented fake news that makes money through advertising, as seen in the US.” But he said France was seeing increasing cases of manipulation and distortion, particularly during election periods.

But wait. The story is about the “insidious trend” of fake news. Now it’s about something else entirely, something that, again, is neither new nor on topic.

This water-muddying isn’t happening in a vacuum either. Those wanting to proceed with plans to curate and monitor information online—a long-held impulse of all governments—are using the specter of “fake news” as a PR bludgeon to justify these broader efforts. On November 29, The Washington Post’s David Ignatius relayed that the US State Department was working on plans to protect “the truth,” including floating the idea of a “global ombudsman for information.” BuzzFeed reported that Congress, in the context of combating Russian fake news, will soon bring back the Cold War–era Active Measures Working Group, originally set up in concert with the CIA and the Defense Department to combat Soviet disinformation.

The troubling effects of such efforts, as anyone who’s operated outside the mainstream of acceptable political opinion will tell you, cannot be overstated. One reason so many blue-checkmark pundits reflexively share fake-news blacklists—despite their numerous false positives—is that they, themselves, have never held an opinion that veers too far off the editorial page of The New York Times. But it’s those opinions, those that push back against Official Truths, that are likely to get caught up in otherwise good-faith attempts to counter deliberate lies. With such a broad charge, “the cure for fake news,” Politico’s Jack Shafer notes, may be “worse than the disease.” In theory, building mechanisms to weed out outright lies is good, but the speed and broad scope of the fake-news fervor reveal that these efforts are easily hijacked, co-opted, and fudged not to protect truth but to curate it. Not to stop the spread of outright lies but to label websites deemed outside the mainstream with a scarlet letter. Given that some of the most public and viral attempts to build such systems have failed at making this crucial distinction, perhaps before attempting to solve the problem of fake news we should, at least, try to figure out what exactly it is.



‘Fake news’ – why people believe it and what can be done to counter it
By Simeon Yates

Barack Obama believes “fake news” is a threat to democracy. The outgoing US president said he was worried about the way that “so much active misinformation” can be “packaged very well” and presented as fact on people’s social media feeds. He told a recent conference in Germany: “If we are not serious about facts and what’s true and what’s not, if we can’t discriminate between serious arguments and propaganda, then we have problems.”

But how do we distinguish between facts, legitimate debate and propaganda? Since the Brexit vote and the Donald Trump victory a huge amount of journalists’ ink has been used up discussing the impact of social media and the spread of “fake news” on political discourse, the functioning of democracy and on journalism. Detailed social science research is yet to emerge, though a lot can be learnt from existing studies of online and offline behaviour.

Matter of trust

Let’s start with a broad definition of “fake news” as information distributed via a medium – often for the benefit of specific social actors – that then proves unverifiable or materially incorrect. As has been noted, “fake news” used to be called propaganda. And there is an extensive social science literature on propaganda, its history, function and links to the state – both democratic and dictatorial. In fact, as the investigations in the US and Italy show, one of the major sources of fake news is Russia. Full Fact, a site in the UK, is dedicated to rooting out media stories that play fast and loose with the truth – and there is no shortage.

Image: British poster from World War I.

An argument could be made that as the “mainstream” media have come to be seen as less trustworthy (rightly or wrongly) in the eyes of their audiences, it becomes harder to distinguish between those who supposedly have a vested interest in telling the truth and those that do not necessarily share the same ethical foundation. How does mainstream journalism that is also clearly politically biased – on all sides – claim the moral high ground? This problem certainly predates digital technology.

Bubbles and echo chambers

This leaves us with the question of whether social media makes things worse. Almost as much ink has been used up talking about social media “bubbles” – how we all tend to talk with people who share our outlook – something, again, which is not necessarily unique to the digital age.



This operates in two distinct ways. Bubbles are a product of class and cultural position. A recent UK study on social class pointed this out. An important subtlety here is that though those with higher “social status” may congregate, they are also likely to have more socially diverse acquaintance networks than those in lower income and status groups. They are also likely to have a greater diversity of media, especially internet usage patterns. Not all bubbles are the same size or equally monochromatic, and our social media bubbles reflect our everyday “offline” bubbles. In fact social media bubbles may be very pertinent to journalist-politician interactions, as one of the best-defined Twitter bubbles is the one that surrounds politicians and journalists.

This brings back into focus older models of media effects, such as the two-step flow model in which key “opinion leaders” – influential nodes in our social networks – have an impact on our consumption of media. Analyses of a “fake news story” appear to point – not to social media per se – but to how stories moving through social media can be picked up by leading sites and actors with many followers and become amplified.

The false assumption in a tweet from an individual becomes a “fake news” story on an ideologically driven news site, or becomes a tweet from the president-elect, and so becomes a “fact” for many. And we panic more about this because social media make both the message and how it moves very visible.

Outing fake news

What fuels this and can we address it? First, the economics of social media favour gossip, novelty, speed and “shareability”. They mistake sociability for social value. There is evidence that “fake news” that plays to existing prejudice is more likely to be “liked” and so generate more revenue for the creators. This is no different from “celebrity” magazines. Well-researched and documented news is far less likely to be widely shared.

The other key point here is that – as Obama noted – it becomes hard to distinguish fake from fact, and there is evidence that many struggle to do this. As my colleagues and I argued nearly 20 years ago, digital media make it harder to judge the veracity of content simply by the physical format it comes in (broadsheet newspaper, high-quality news broadcast, textbook or tabloid story). Online news is harder to distinguish.

The next problem is that retracting “fake news” on social media is currently poorly supported by the technology. Though posts can be deleted, this is a passive act, less impactful than even the single-paragraph retractions in newspapers. In order to have an impact, it would be necessary not simply to delete posts, but to highlight and require users to see and acknowledge items removed as “fake news”.



So whether or not fake news is a manifestation of the digital and social media age, it seems likely that social media is able to amplify the spread of misinformation. Their economics favour shareability over veracity and distribution over retraction. These are not technology “requirements” but choices – by the systems’ designers and their regulators (where there are any). And mainstream media may have tarnished their own reputation through “fake” and visibly ideological news coverage, opening the door to other news sources.

Understanding this complex mix of factors is the job of the social sciences. But maybe the real message here is that we as societies and individuals have questions to answer – about educating people to read the news, about our choice not to regulate social media (as we do TV and print), and about our own behaviour. Ask yourself: how often do you fact-check a story before reposting it?



Blame the Echo Chamber on Facebook. But Blame Yourself, Too

[Editor’s note] Many blame Facebook’s algorithms for pushing each of us towards an echo chamber, where we mostly receive news we would agree with. If this is true, the blame should rather be on us, because we deliberately choose actions that push us into that echo chamber. Facebook has a business to run: if we appear to be satisfied seeing news from sources we like, why should it show us stories we do not engage with? While Facebook can still do a lot to weed out false news, we should take the time to also click on stories we may disagree with. That by itself would burst the bubble we are pushed into, and would also give us a more balanced perception of reality.

By KARTIK HOSANAGAR

Ever since the presidential election, everyone I know seems to be worrying about their social media echo chamber. And no one seems to see the irony of discussing it on Facebook.


Many of us seem to feel trapped in a filter bubble created by the personalization algorithms owned by Facebook, Twitter, and Google. Echo chambers are obviously problematic; social discourse suffers when people have a narrow information base with little in common with one another.

The Internet was supposed to fix that by democratizing information and creating a global village. Indeed, the democratization has occurred. Social newsfeeds on Facebook and Twitter and algorithms at Google perform the curation function that was previously performed by news editors. By some estimates, over 60 percent of millennials use Facebook as their primary source for news. But, in recent weeks, many commentators have voiced concerns that these trends, instead of creating a global village, have further fragmented us and are destroying democracy.

Mark Zuckerberg, Facebook’s CEO, believes we are overestimating the impact of the social network on our democracy. My collaborators and I have studied echo chambers for some time now. And I know Mark Zuckerberg is wrong. Facebook can do more to help us break free from the filter bubble. But we are not helpless within it. We can easily break free but we choose not to.

The Evidence

In studies in 2010 and 2014, my research group at Wharton evaluated media consumption patterns of more than 1,700 iTunes users who were shown personalized content recommendations. The analysis measured the overlap in media consumed by users—in other words, the extent to which two randomly selected users consumed media in common. If users were fragmenting due to the personalized recommendations, the overlap in consumption across users would decrease after they started receiving recommendations.
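The overlap measure can be pictured with a small sketch. The study's exact metric is not spelled out here, so average pairwise Jaccard similarity is used as an illustrative stand-in, with invented consumption data.

```python
# Sketch of an overlap measure: average pairwise Jaccard similarity between
# the sets of media items users consumed. Data and metric are illustrative only.
from itertools import combinations

def avg_pairwise_overlap(consumption):
    """consumption: dict mapping user -> set of media items consumed."""
    pairs = list(combinations(consumption.values(), 2))
    return sum(len(a & b) / len(a | b) for a, b in pairs) / len(pairs)

before = {"u1": {"a", "b"}, "u2": {"c", "d"}, "u3": {"a", "e"}}
after  = {"u1": {"a", "b", "c", "f"}, "u2": {"c", "d", "a", "f"}, "u3": {"a", "e", "c", "b"}}

print(avg_pairwise_overlap(before))  # lower overlap before recommendations
print(avg_pairwise_overlap(after))   # higher overlap once users consume more, and more in common
```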



After recommendations were turned on, the overlap in media consumption increased for all users. This increase occurred for two reasons. First, users simply consumed more media when an algorithm found relevant media for them. If two users consumed twice as much media, then the chance of them consuming common content also increased. Second, algorithmic recommendations helped users explore and branch into new interests, thereby increasing overlap with others. In short, we didn’t find evidence for an echo chamber.

But political content is different from other forms of media. For example, people are less likely to hold extreme views about, or polarize over, say, music than political ideologies. Further, social newsfeeds are different from the personalized recommendations one might see on iTunes. So the question is whether our results generalize to social media as well.

The answer emerges from a 2015 study by researchers at Facebook who analyzed how the social network influences our exposure to diverse perspectives. The researchers evaluated the newsfeeds of 10.1 million active Facebook users in the US who self-reported their political ideology (“conservative,” “moderate,” and “liberal”). The researchers then calculated what proportion of the news stories in these users’ newsfeeds was cross-cutting, defined as sharing a perspective other than their own (for example, a liberal reading a news story with a primarily conservative perspective).

On a social network like Facebook, three factors influence the extent to which we see cross-cutting news: first, who our friends are and what news stories they share; second, among all the news stories shared by friends, which ones are displayed by the newsfeed algorithm; and third, which of the displayed news stories we actually click on. If the second factor is the primary driver of the echo chamber, then Facebook deserves all the blame. If, on the contrary, the first or third factor is responsible, then we have created our own echo chambers.

If we acquired our news from a randomly selected group of Facebook users, nearly 45 percent of news seen by liberals and 40 percent seen by conservatives on Facebook would be cross-cutting. But we acquire these news stories from our friends. As a result, the researchers found that only 24 percent of news stories shared by liberals’ friends were cross-cutting, and about 35 percent of stories shared by conservatives’ friends were cross-cutting. Clearly, the like-mindedness of our Facebook friends traps us in an echo chamber.

The newsfeed algorithm further selects which of the friends’ news stories to show you, based on your prior interactions with friends. Because we tend to engage more with like-minded friends and ideologically similar websites, the newsfeed algorithm further reduces the proportion of cross-cutting news stories to 22 percent for liberals and 34 percent for conservatives (see figure below). Facebook’s algorithm worsens the echo chamber, but not by much.

Finally, the question is which of these news stories we click on. The researchers find that the final proportion of cross-cutting news stories we click on is 21 percent for liberals and 30 percent for conservatives. We clearly prefer news stories that are likely to reinforce our existing views rather than challenge them.



Figure source: “Exposure to ideologically diverse news and opinion on Facebook,” Science, 5 June 2015 (via WIRED).
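To see which stage matters most, the percentages quoted above can be lined up as a simple funnel. The figures come from the study as quoted in the article; the percentage-point comparison below is only a back-of-the-envelope illustration, not the study's own analysis.

```python
# Back-of-the-envelope funnel using the percentages quoted above (cross-cutting
# share of stories at each stage: random users -> friends -> algorithm -> clicks).
stages = ["friends' shares", "newsfeed algorithm", "user clicks"]
liberals      = [45, 24, 22, 21]
conservatives = [40, 35, 34, 30]

for label, series in [("liberals", liberals), ("conservatives", conservatives)]:
    drops = {stage: prev - cur for stage, prev, cur in zip(stages, series, series[1:])}
    print(label, drops)
# For both groups the largest single drop comes from who our friends are,
# not from the ranking algorithm, which is the point the author draws out below.
```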

The authors conclude that the primary driver of the digital echo chamber is the actions of users—who we connect with online and which stories we click on—rather than the choices the newsfeed algorithm makes on our behalf.

Should we believe a research study conducted by Facebook researchers that absolves the company’s algorithms and places the blame squarely on us? I think the study is well designed. Further, even though the studies I describe were conducted in 2015 or earlier, the underlying mechanisms investigated in these papers—the design of personalization algorithms and our preference for like-mindedness—have not fundamentally changed. So I am confident that all three studies would yield qualitatively similar findings in November 2016.

That said, I disagree with a key conclusion of the Facebook study. It is true that our friendship circles are often not diverse enough, but Facebook could easily recommend cross-cutting articles from elsewhere in its network (e.g. “what else are Facebook users reading?”). That the news shown in our feeds comes from our friends is ultimately a constraint that Facebook enforces.

That doesn’t mean you and I are acquitted. The primary issue is that we deliberately choose actions that push us into an echo chamber. First, we only connect with like-minded people and “unfriend” anyone whose viewpoints we don’t agree with, creating insular worlds. Second, even when the newsfeed algorithm shows cross-cutting content, we do not click on it. Why should Facebook be obligated to show content with which we don’t engage? Finally, Facebook has a business to run. If I appear to be satisfied on Facebook seeing news from sources I like, how is the echo chamber Facebook’s responsibility? Why should Facebook not optimize for its business if we are unwilling to vote with our clicks?

In The Big Sort, Bill Bishop shows how, over the last 30 years, Americans have sorted themselves into like-minded neighbourhoods. The same appears to be happening on the web. All the empirical research to date suggests that the reason is not the use of personalization algorithms per se. Algorithms can easily expose us to diverse perspectives. It is the data being fed to the personalization algorithms and the actions we take as end users. Technology companies like Facebook and Google should do more. But so should you and I.



Make a Better Facebook, for You and the Nation By Jake Swearingen

Lately, Facebook has been, well, depressing. I find myself avoiding it more, ignoring its little pings to get my attention. (I really, really don’t care how many event invites or notifications I’ve missed, Facebook.) Some of this can be blamed on the recent election. But I also think there are some deeper reasons, not strictly related to the election.

Over the last decade — and in the last few years in particular — Facebook has taken on an increasingly important role in our public and private lives. For many people, it’s the first place to make important family and job announcements, or to send out invitations; at the same time, it’s a central location for sharing and discussing news and politics. And if you’re spending that much time and attention on a single company, it’s important to take control of it as best you can.

Why You Might Want to Change Your News Feed

Facebook’s News Feed algorithm changed in midsummer, rejiggering the dials to lower the number of posts from news organizations showing up in my News Feed and raise the number of posts from my friends. In some ways this is great; I see more stuff from actual people, fewer posts from the The Wire fan page I “liked” in 2007. But this also means that many of the news stories I see on Facebook aren’t from any of the publications I’ve chosen to follow, but instead from the publications my friends chose to share — which in practice mainly means emotionally charged and highly partisan information. And after an election where 95 percent of my Facebook feed was stunned to discover that there are people who would actually vote for Donald Trump, life in that kind of filter bubble no longer seems so comfortable. (This Chrome extension can easily show you just how much of a bubble your Facebook feed really is.)

It’s also troubling to think that I’m actively supporting a company that has quickly eaten the media and seems unwilling to take on the responsibilities that entails. It’s not just the fake news that bubbles up all over the site — it’s that Facebook, which is very firm about not being a media company but just a neutral platform, is where 44 percent of Americans say they get news.



Filter bubbles (under a different name) were recognized as a problem long before Mark Zuckerberg cobbled together code that allowed Ivy League kids to share pics of binge drinking, but Facebook has taken those bubbles and turned them into walls. Since moving over to an algorithmically controlled News Feed, Facebook has always rewarded stories that engage on an emotional level (and caused the rise, and sometimes fall, of several media organizations). But in 2016, most stories that I come across now read like New York Post front pages without the puns: everything is awful or wonderful, sickening or heartwarming, diseased or destined to save us all — and none of it is ever challenging to my basic worldview.

It’s not just unhealthy for me, personally, to see that constantly. It’s damaging for the 170 million Americans on Facebook, all of whom live in a country where the citizens are deeply divided and actively digging deeper into their ideological foxholes. It’s true that this phenomenon isn’t something that Facebook created, but it’s also true that Facebook is only making the problem worse.

But it’s still the place where I see baby pictures and engagement announcements and job changes and all the other assorted small but meaningful glimpses of other people’s lives. It’s also been a place where I’ve seen and participated in some successful organizing, whether that’s making calls to my representatives or being moved to donate to some key causes. It’s not something I’m ready to give up on quite yet (though I don’t blame anyone who does). So, then, how to make it better? Here’s what I hit on.

See Your Actual News Feed

In 2006, Facebook just showed you the most recent things people and pages in your network had done, chronologically. By 2011, it had started to sort posts into “Top News,” algorithmically picking what Facebook thought you’d be most interested in. Facebook’s sorting formulas are opaque, and the company believes that it’s giving you what you most want to see (judged by some combination of clicks, shares, likes, comments, and even how much time you spend lingering on a given post). But it can be clarifying to see the stuff you’re missing out on — stuff you might find valuable even if you’re not clicking or commenting on it.

In 2016, it’s easy to forget that you even can see your Facebook News Feed chronologically. But it’s an important first step. To get started, head over to the left-hand column and hit the drop-down button for News Feed.



If you haven’t done this in a while, you’re probably in for a surprise. There will be people you’d forgotten you were friends with on Facebook, publishers and pages for which you don’t remember hitting “Like.” But it’s also the best way to start to get a handle on what’s actually coming down your News Feed. Spend a few days with this turned on and you’ll start to get a sense of what’s actually being posted.

You’ll probably need to reset it to Most Recent a few times; Facebook automatically reverts to Top Stories after a set amount of time unless you have a browser extension like FB Purity installed. Alternatively, you can set https://www.facebook.com/?sk=h_chr as a bookmark, which will always load up your News Feed with Most Recent selected. If you have a decent number of friends or have liked a lot of pages, you’ll likely find Most Recent kind of exhausting — Facebook quickly turns into a content stream with just slightly less churn than Twitter. But switch back and forth between Top Stories and Most Recent for a bit. See what Facebook is choosing to show — and what it’s choosing to hide.

Prune Your News Feed

The first step of trying to get my News Feed to be slightly less depressing was cutting out some of what comes up. Because Facebook wants your experience to be as low-effort as possible, you generally have only vague control over what you see in the News Feed — but even that is better than nothing. This is where the small drop-down arrow on the top right of posts comes in handy. What I was aiming for was sending signals to Facebook’s algorithm about what I actually wanted to see. You have three basic options:

You can hide a post from friends, which signals to Facebook you want to see less of someone’s posts when looking at Top Stories in your News Feed. It also (theoretically) signals to Facebook that you want to see less of these types of posts.



You can unfollow a friend who is only posting WikiLeaks conspiracy theories or rants about how Bernie Bros destroyed the Democratic Party, which means you won’t see them in your News Feed again (but will still remain friends with them, just in case).

Or, if you want to keep seeing pictures of someone’s dog but not their links to a particular site you think is misleading, toxic, or just plain bad, you can select “Hide All from American Freedom Eagle” (or whatever the site may be). This also has the knock-on benefit of blocking anything else that your other friends share from that publisher — with a few clicks, you can make sure you’ll never see stories from your least favorite publishers in your Facebook feed ever again.

For me, this mainly meant going through my Most Recent and Top Stories News Feeds and trying to tone down the rage and despair from some, while leaving alone the people and pages posting more informative stuff. I also took the time to click through to stories and hit “Like” on things I did want, which signals to Facebook that you want to see more of this. To speed up the process, I also hit the upper-right drop-down menu on my profile page and clicked Manage News Feed, which let me batch unfollow (or, in some cases, refollow) various people.

You can also go through all the pages you’ve liked and just ax the ones you no longer want to see stuff from. To do this, go to Pages in the left-hand sidebar, and then hit Like Pages. Once you’ve clicked through, hit the right-most tab for Liked Pages. Then hit Review Liked Pages in the top right-hand pane.



You’ll get a long list of everything you’ve liked on Facebook. You can sort this a few different ways (by when you liked a page, the last time you interacted with a page) and then start clicking away on the pages you don’t want to follow anymore.

Break Out of Your Bubble

It’s easy to zap what you can see, but harder to find what you aren’t seeing. For me, this meant two things: finding pages that run counter to my own political viewpoint and liking them, and boosting how much I see from my conservative friends.

Finding and liking sites outside my own little piece of the political landscape was the easy part. I went for breadth here. To pull in more right-leaning stuff, I liked the Weekly Standard and the National Review, and then pulled in more from Facebook’s suggestions for similar pages (while avoiding the places I find truly noxious). Pretty quickly I was seeing stuff that I fundamentally disagreed with — a refreshing change of pace on Facebook.

The second part, boosting how much I see updates from friends on the other end of the political spectrum, was harder, and is probably something I’ll have to signal to Facebook for a while to get its algorithm to take notice. (Apologies to people who discovered I had liked one of their posts from, like, late October — I’m just trying to train an algorithm to show me more of your point of view!)



But I’ve already started seeing more of their posts show up in my News Feed’s Top Stories, where before I never saw them at all. The goal here, for me, isn’t to find people to tangle with online about politics. It’s just about widening what I see, and pushing back against Facebook’s natural inclination to keep anything that I might find disagreeable away from me.

… Or, Just Delete It All

All that said, this is an awful lot of work to make Facebook contort itself into something it clearly doesn’t want to be: a place to encounter ideas and things you don’t agree with or like. If, for whatever reason, you want to scrap the whole thing, that’s also doable. You have two basic options.

If you just want to put your Facebook profile on ice, you can deactivate it (and reactivate it at some later point if you change your mind). Get there by clicking the upper-right drop-down menu on your profile page and scrolling down to Settings. Once in Settings, hit Security and then Deactivate Your Account. Facebook will throw up a bunch of warnings, but click through them and your account is now in suspended animation.

But if you’re a believer in no half-measures, you can just delete the whole thing. In that case, head to https://www.facebook.com/help/delete_account and get started. It may take some time for Facebook to delete everything you’ve uploaded over the years, but eventually you’ll be gone. (If you have some stuff you want to save before you burn it all down, you can download your Facebook data by hitting the upper-right drop-down menu on any Facebook page, going to Settings, and then in General Account Settings clicking “Download a copy of your Facebook data.”)

Facebook’s Not Going to Change, So if You Want to Keep Using It, You’ll Have To

I don’t fault anyone who decides they’d rather just back out of Facebook entirely rather than deal with it anymore. That said, I do think there’s some small value in taking steps to indicate to Facebook that you want to see more stuff from different points of view. In some gauzy movie version of this, millions of Americans would start actively seeking out diverse viewpoints, rewarding articles they might not agree with but can respect, slowly shifting Facebook’s algorithm toward displaying a News Feed that’s overall less blinkered and more open. But I don’t think that’s going to happen, at least not anytime soon. Facebook’s ultra-efficient sorting and managing of what you see is a technological marvel, something very smart people have dedicated a good portion of their lives to. It’s a business driven by the bottom line and by needing to show stockholders that it can continue to grow its advertising revenue — and the News Feed being a welcoming warm bath of bland and agreeable stories is a big part of that. If you want something different out of Facebook, you’ll need to do it yourself.



How does misinformation spread online? By Walter Quattrociocchi

Image: A man poses with his iPad tablet as he sits in a bar, in this photo illustration taken in Rome September 20, 2012. REUTERS/Tony Gentile

In the run-up to the 2013 Italian elections, a social media post exposing the corruption of parliament went viral. Italian politicians, it claimed, were quietly certain that, win or lose, they would be financially secure by taking money from the taxpayer. Parliament had quietly passed a special welfare bill specially designed to protect policy-makers by ensuring them an incredible unemployment package should they lose their seat in the upcoming election. The bill, proposed by Senator Cirenga, allocated an amazing €134 billion to political unemployment. The Italian Senate had voted 257 in favour and 165 in abstention.

The post caused considerable and understandable uproar. It was covered in several leading newspapers and cited by mainstream political organizations. But there was one problem: the entire story was fake. Not even a good fake at that. For those interested in Italian politics, there were a number of obvious problems with the story. First of all, there is no Senator Cirenga. The number of votes doesn’t work either, because Italy doesn’t even have that many senators. And the incredible sum would have accounted for roughly 10% of Italy’s GDP.

So what happened? How did such an obvious fake fool so many people? Walter Quattrociocchi, the head of the Laboratory of Computational Social Science at IMT Lucca in Italy, has been studying the phenomenon of misinformation online. His work helped to inform the World Economic Forum’s latest Global Risks Report. We spoke with him to find out more.
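The arithmetic behind those red flags is easy to check. In the sketch below, the Senate size and the GDP figure are rough, assumed values used only for illustration.

```python
# Quick sanity check of the viral post's numbers. Senate size and GDP are
# rough assumptions used only for illustration.
votes_in_favour, abstentions = 257, 165
senate_seats = 320            # roughly 315 elected senators plus a handful of life senators
claimed_allocation = 134e9    # the post's claimed 134 billion euros
italy_gdp = 1.6e12            # Italian GDP, order of magnitude, around 2013

print(votes_in_favour + abstentions, "votes vs roughly", senate_seats, "seats")  # 422 votes: impossible
print(f"{claimed_allocation / italy_gdp:.1%} of GDP")                            # close to the ~10% noted above
```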



Why is this happening?

Before the web, you got your information from magazines, television and the newspapers. Now anyone can create a blog, have a Tumblr page or post their opinions online. From there, you can spread that information rapidly through Twitter, Facebook and a whole smorgasbord of other social media platforms.



The problem is that while traditional media had editors, producers and other filters before information went public, individual publishing has no filter. You simply say what you want and put it out there. The result is that everyone can produce or find information consistent with their own belief system. An environment full of unchecked information maximizes the tendency to select content by confirmation bias. Recent studies of misinformation online have pointed out that selective exposure to specific content leads to “echo chambers” in which users tend to shape and reinforce their beliefs.

What is an echo chamber?

An echo chamber is an isolated space on the web, where the ideas being exchanged essentially just confirm one another. It can be a space of like-minded people sharing similar political views, or a page about a specific conspiracy theory. Once inside one of these spaces, users are sharing information that is all very similar, basically “echoing” each other.

We have studied the dynamics inside a single echo chamber. What we found is that the most discussed content refers to four specific topics: environment, diet, health and geopolitics. Content belonging to the different topics is consumed in a very similar way by users. Likes and shares remain more or less the same across topics. If we focus on the comments section, however, we notice a remarkable difference between topics. Users polarized on geopolitics are the most persistent in commenting, whereas those focused on diet are less persistent.

We also found that users “jump” from one topic to another. Once they begin to “like” something, they do this more and more, like a snowball effect. Once engaged in a conspiracy corpus, a user tends to join the overall conversation, and begins to “jump” from one topic to another. The probability increases with user engagement (number of likes on a single specific topic). Each new like on the same conspiracy topic increases the probability of moving to a new one by 12%.

What kind of rumours are spreading?

Pages about global conspiracies, chem-trails, UFOs, reptilians. One of the more publicized conspiracies is the link between vaccines and autism. These alternative narratives, often in contrast to the mainstream one, proliferate on Facebook. The peculiarity of conspiracy theories is that they tend to reduce the complexity of reality. Conspiracy theories create (or reflect) a climate of disengagement from mainstream society and from officially recommended practices – e.g. vaccinations, diet, etc.

Among the most fascinating social dynamics observed is trolling. Before, trolls were mainly people who just wanted to stir up a crowd, but the practice has evolved. Trolls today act to mock the “believe anything” culture of these echo chambers. They basically attack contradictions through parody.



Trolls’ activities range from controversial and satirical content to the fabrication of purely fictitious statements, heavily unrealistic and sarcastic. For instance, conspiracist trolls aggregate in groups and build Facebook pages as a caricature of conspiracy news. A recent example was a fake publication of “findings” showing that chemtrails had traces of viagra in them. What makes their activity so interesting is that, quite often, these jokes go viral and end up being used as evidence in online debates by political activists.

How have you been studying this phenomenon?

On Facebook, likes, shares and comments allow us to understand social dynamics from a totally new perspective. Using this data, we can study the driving forces behind the diffusion and consumption of information and rumours. In our study of 2.3 million individuals, we looked at how Facebook users consumed different information at the edge of political discussion and news during the 2013 Italian elections. Pages were categorized according to the kind of content reported on:

1. Mainstream media
2. Online political activism
3. Alternative information sources (topics that are neglected by science and mainstream media)

We followed 50 public pages and their users’ interactions (likes, comments and shares) for six months. Each action has a particular meaning. A like gives positive feedback; a share expresses the will to increase the visibility; and comments expand the debate. What we found was that neither a post’s topic nor its quality of information had any effect on the outcome. Posts containing unsubstantiated claims, or about political activism, as well as regular news, all had very similar engagement patterns.

So people are reacting to posts based on their beliefs, regardless of where those posts originated from?

Exactly. It’s not that people are reacting the same way to all content, but that everyone is agreeing within their specific community. People are looking for information which will confirm their existing beliefs. If today an article comes out from the WHO supporting your claims, you like it and share it. If tomorrow a new one comes out contradicting your claims, you criticise it and question it. The results show that we are back to “echo chambers”: there is selective exposure followed by confirmation bias.

To verify this, we performed another study, this time with a sample of 1.2 million users. We wanted to see how information related to very distinct narratives – i.e. mainstream scientific and conspiracy news – is consumed and shaped by communities on Facebook.



What we found is that polarized communities emerge around distinct types of content, and consumers of conspiracy news tend to be extremely focused on specific content. Users who like posts do so on the pages of one specific category 95% of the time. We also looked at commenting, and found that polarized users of conspiracy news are more focused on posts from their community. They are more prone to like and share posts from conspiracy pages. On the other hand, people who consume scientific news share and like less, but comment more on conspiracy pages. Our findings indicate that there is a relationship between beliefs and the need for cognitive closure. This is the driving factor behind digital wildfires.

Does that mean we know what will go viral next?

Viral phenomena are generally difficult to predict. This insight does allow us at least to understand which users are more prone to interact with false claims. We wanted to understand whether such polarization in the consumption of content affects the structure of the user’s friendship networks. In another study, “Viral Misinformation: The Role of Homophily and Polarization”, we found that a user’s engagement in a specific narrative goes hand in hand with the number of friends having a similar profile. That provides an important insight about the diffusion of unverified rumours: through polarization, we can detect where misleading rumours are more likely to spread.

But couldn’t we combat that by spreading better information?

No. In fact, there is evidence that this only makes things worse. In another study, we found that people interested in a conspiracy theory are likely to become more involved in the conversation when exposed to “debunking”. In other words, the more exposure to contrasting information a person is given, the more it reinforces their consumption pattern. Debunking within an echo chamber can backfire, and reinforce people’s bias. In fact, distrust in institutions is so high and confirmation bias so strong that the Washington Post has recently discontinued its weekly debunking column.

What can be done to fight misinformation?

Misinformation online is very difficult to correct. Confirmation bias is extremely powerful. Once people have found “evidence” of their views, external and contradicting versions are simply ignored. One proposal to counteract this trend is algorithm-driven solutions. For example, Google is developing a trustworthiness score to rank the results of queries. Similarly, Facebook has proposed a community-driven approach in which users can flag false content to correct the newsfeed algorithm.



This issue is controversial, however, because it raises fears that the free circulation of content may be threatened and the proposed algorithms might not be accurate or effective. Often users denounce attempts to debunk false information, such as the link between vaccination and autism, as acts of misinformation. The Global Risks Report 2016 is available here.

QUOTABLE QUOTES:

“There were people, too, who added to the real perils by inventing fictitious dangers: some reported that part of Misenum had collapsed or another part was on fire, and though their tales were false they found others to believe them.” Pliny the Younger, on the destruction of Pompeii, 79 AD

“Each person can share what he or she knows with the others, making the whole at least equal to the sum of the parts. Unfortunately, this is often not what happens... As polarization gets underway, the group members become more reluctant to bring up items of information they have about the subject that might contradict the emerging group consensus. The result is a biased discussion in which the group has no opportunity to consider all the facts, because the members are not bringing them up... Each item they contributed would thus reinforce the march toward group consensus rather than add complications and fuel debate.” Patricia Wallace, The Psychology of the Internet, 1999


“The human understanding when it has once adopted an opinion (either as being the received opinion or as being agreeable to itself) draws all things else to support and agree with it.” Francis Bacon, 1620

“Social media gives legions of idiots the right to speak when they once only spoke at a coffee-bar after a glass of wine, without harming the community. Then they were quickly silenced, but now they have the same right to speak as a Nobel Prize winner. It’s the invasion of the idiots…. The drama with Internet is that it promoted the village’s fool to bearer of the truth.” Umberto Eco, 2015

“From the Facebook and Twitter feeds we monitor, to the algorithms that determine the results of our Web searches based on our previous browsing history and location, our major sources of information are increasingly engineered to reflect back to us the world as we already see it. They give us the comfort of our opinions without the discomfort of thought. So you have to find a way to break out of your echo chambers.” Samantha Power, United States Ambassador to the United Nations, 2016



Infographic: Advertising vs Public Relations


ComIPI is a non-profit study center aimed at developing and implementing advanced techniques for communicating with the public while respecting ethical principles.

Edited by Franco Veltri
info@comipi.it
www.comipi.it
Blog: http://comipi.wordpress.com/


