
Book Review: An Ugly Truth – Inside Facebook’s Battle for Domination

An Ugly Truth: Inside Facebook’s Battle for Domination
By Sheera Frenkel and Cecilia Kang


Reviewed by Michael Attard

Mark Zuckerberg is one of the best-known people on the planet and one of the most despised. How could someone who is “a staunch believer in free expression,” an individual trying to connect people, create so much controversy? In their book An Ugly Truth, Cecilia Kang and fellow investigative journalist Sheera Frenkel delve into the machinations of corporate Facebook not only to elucidate the prolonged, heated debate whirling around the company but also to conclude with condemnation. In the Authors’ Note, they make clear that “This book is the product of more than a thousand hours of interviews with more than four hundred people.”

The book is replete with names, dates, and situations. I suggest that readers not let this distract from the argument. For the sake of simplicity, I have focussed on two broad areas of controversy: the algorithms used and Facebook’s responses to certain events. You can judge for yourself whether Facebook has done anything wrong or responded in an inappropriate manner.

In September 2006, Facebook introduced “News Feed,” which “would draw from posts, photos, and status updates.” Users had already entered these on their profiles, but previously one would have to click on a friend’s profile to see an update. With News Feed, Facebook was now broadcasting. Many were furious. The claim is that Zuckerberg wanted to keep users logged onto the platform as much as possible. Zuckerberg responded, as he would continue to do in the future, by saying that he was sorry. He also claimed that “nothing about users’ privacy settings had changed.”

In November 2007, Zuckerberg announced the use of the program “Beacon,” which “took information about a Facebook user’s purchases from other sites – movie tickets…a hotel booking, etc. and published it on the News Feeds of… friends.” Users were now being used as “sales agents” for companies that paid Facebook for the data. Zuckerberg said, “Nothing influences a person more than a recommendation from a trusted friend.”

There was an immediate outcry. “When Facebook responded that the feature could be turned off, users produced contrary evidence.” Facebook was tracking and monitoring users. “Zuckerberg apologized… and announced that he would change the settings.” It appears that users could choose whom they shared information with on the platform, but “behind the platform,” the sharing of information with advertisers persisted. About two years later, Beacon was discontinued.

And how about the Like button? This actually met with little resistance. It is a quick and simple way to send a “positive affirmation” to friends. Facebook found that people used the site more when the Like button was available. The user is subsequently presented with other similar content on an endless scroll, keeping the user engaged on the site and thus, of course, exposed to more ads. Facebook then made the feature available outside of the site. “Companies got information about which users were visiting their sites, and Facebook received information about what its users did once they left its site.” It all may seem harmless enough, but do users have any awareness of the personal data being harvested?

At a congressional appearance, Zuckerberg said, “We do not sell data to advertisers. What we allow is for advertisers to tell us who they want to reach. And then we do the placement.” So, if I understand Zuckerberg correctly, Facebook has information that advertisers would love to use for direct advertising. But Facebook does not give the advertisers this information; rather, Facebook is happy to get its hands dirty through the hijacking of the user.

Turning to controversy over events, most of us will recall that during the US presidential campaign of 2016, Donald Trump said, “Donald J. Trump is calling for a total and complete shutdown of Muslims entering the United States…” How did Facebook respond? Facebook technically barred hate speech but felt the need to appear unbiased, as it was already not trusted by many Republicans. Facebook decided not to defend Trump’s statement but claimed “that political speech could be protected under a newsworthiness standard.” Zuckerberg said it was a hard issue, “but he was a staunch believer in free expression.” Trump’s post was not removed. Was Facebook putting politics above principle?

Four years later, Facebook would confirm that political speech included paid campaign ads. As political speech, these ads would not be fact-checked. Politicians could now “pay to place lies on the site.”

With the outbreak of COVID-19 in early 2020, Facebook had the opportunity to play a key role in the dissemination of information. A center for information from the CDC and the World Health Organization was created. Facebook “went public with its plan to remove harmful misinformation.” Dr. Anthony Fauci was hosted on Facebook. Then in April, Trump “suggested that disinfectants… were possible treatments for the novel coronavirus.” Posts sprang up, viewed by millions of people. Trump’s account remained untouched. Facebook claimed that Trump had been musing rather than issuing a directive. Was Facebook putting politics and company above principle and truth?

An Ugly Truth is 300 pages detailing Facebook controversies, and I have touched on only a few cases here. But the reader cannot miss the common elements in each incident. Facebook does what it wants to optimize its business model of keeping users on its site for as long as possible while presenting them with paid advertisements targeted through the use of users’ private data. When outcry follows, Facebook apologizes, says that it will do more, and reminds you that you can control your privacy settings. When changing an algorithm, instituting a policy, or simply adhering to existing policy would affect the company’s bottom line, it is always company over everything else, supported by the claim of free speech.

The next time you scroll through Facebook, ask yourself two questions. Is there something better that I could be doing with my time? And is Facebook really my friend?

The Reviewer

Michael Attard is a Canadian who has lived in Gwangju since 2004. Though officially retired, he still teaches a few private English classes. He enjoys reading all kinds of books and writes for fun. When the weather is nice, you may find him on a hiking trail.
