
THOUGHT LEADERS

Dr. Robert Epstein, senior research psychologist at the American Institute for Behavioral Research and Technology.

Big Tech’s Hidden Manipulation

Robert Epstein on the battle for privacy and freedom online

“If we monitor, capture, archive, and expose these companies,” Dr. Robert Epstein said, “then they’ll stay out of our lives.”

In a recent episode of “American Thought Leaders,” host Jan Jekielek sits down with Dr. Robert Epstein to discuss how tech giants influence human behavior and politics. Epstein is a senior research psychologist at the American Institute for Behavioral Research and Technology, best known today for his research on the ways in which companies such as Google secretly manipulate everything from what we read and watch online to the outcomes of our elections.

JAN JEKIELEK: For almost a decade now, you’ve been studying how Big Tech manipulates people in ways they aren’t aware of. Where are things right now?

DR. ROBERT EPSTEIN: We’ve made more progress in the past year and a half than in the previous eight years combined. For the 2020 presidential election, we recruited field agents, mainly in swing counties in four swing states, and with their permission, we installed special software on their computers that allowed us to look over their shoulders as they did anything election-related online. We preserved more than 1.5 million ephemeral experiences on Google, Bing, Yahoo, YouTube, Facebook, and more.

What started as a tiny project in the 2016 presidential election has grown into something much more sophisticated. We’ve made a lot more discoveries; we’ve got lots more numbers, and they’re all terrible. They’re telling us over and over again that we’re pawns. We’re being manipulated in ways we can’t see and in ways we can’t counteract, that don’t leave a paper trail for authorities to trace.

MR. JEKIELEK: You mentioned ephemeral experiences. Please explain what that means.

DR. EPSTEIN: They’re brief experiences we have online, such as a newsfeed flashed before our eyes, search suggestions, or a sequence of YouTube videos with a recommendation of what to watch next. They affect us, and then they disappear. They’re gone. It’s the ideal form of manipulation: No. 1, people have no idea they’re being manipulated, and No. 2, authorities can’t go back in time to see what people were shown.

We’re trying to figure out the power that ephemeral experiences have to change thinking, behavior, and votes.

MR. JEKIELEK: You’ve worked in the field of psychology for decades. How did you end up focusing on this?

DR. EPSTEIN: I noticed marketers were finding out that if you can just get up one more notch in Google Search results, that could make the difference between the success and failure of your company. If you’re on the second page of search results, you’re done. You’ve got to be on that first page. So I wondered, “Could search results be used to change people’s opinions? Could they even be used to change their votes, their voting preferences?” And so I began a series of experiments that adhere to the highest research standards.

I thought, “Let’s see if I can change people’s opinions. I’ll randomly assign people to two groups. In one group, the search results will favor one candidate, and in the other, the search results will favor the opponent.” That means that if someone clicks on a high-ranking search result, they’re going to get to a webpage that makes their candidate look really good and might make the other candidate look really bad. And in these experiments, we used real search results and webpages we got online from Google. People were randomly assigned to these different groups, and I expected we could shift people’s voting preferences by 2 percent or 3 percent. The very first experiment we ran, we shifted voting preferences by more than 40 percent. So I thought, “That’s impossible.” We repeated the experiment with another group and got a shift of more than 60 percent. So I realized, “Wait a minute, maybe I’ve stumbled onto something here.”
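What Epstein describes is a classic randomized controlled design. As a minimal sketch, assuming a simple two-group setup with purely illustrative names, the random assignment step might look like this:

```python
# A minimal sketch, NOT Epstein's actual code: random assignment of
# participants to two groups, one shown search results favoring
# candidate A, the other shown results favoring candidate B.
# All names here are illustrative assumptions.
import random

participants = [f"participant_{i}" for i in range(100)]
random.shuffle(participants)

groups = {
    "sees_results_favoring_A": participants[:50],
    "sees_results_favoring_B": participants[50:],
}

# In the real experiments, each group then browsed genuine search
# results and webpages ordered to favor its assigned candidate, and
# voting preferences were measured before and after exposure.
print({name: len(members) for name, members in groups.items()})
```

Random assignment is what allows any difference in outcomes between the two groups to be attributed to the biased rankings rather than to participants’ pre-existing preferences.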

MR. JEKIELEK: What does it mean to shift people’s opinion by, say, 40 percent?

DR. EPSTEIN: Let’s say we start with 100 people, and we always use people who are undecided about their vote, because those are the people who can be influenced. We divide them into a pro-candidate-A group and a pro-candidate-B group, 50/50. A 40 percent shift means 20 people move to the other group. So now I’ve only got 30 left here, and over here I’ve got 70. I’ve taken a 50/50 split and turned it into a 30/70 split. I now have a win margin of 40 percent.
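The arithmetic is easy to check directly. A small sketch (illustrative only) reproducing the 50/50-to-30/70 example:

```python
# A worked check of the arithmetic above: 100 undecided voters start
# at a 50/50 split; a 40 percent shift moves 20 of them, producing a
# 30/70 split and a 40-point win margin.

def win_margin(total: int, shift_pct: float) -> float:
    """Win margin (in percent) after shift_pct of the voters move
    from an even split toward the favored candidate."""
    movers = total * shift_pct / 100 / 2    # 40% of 100 -> 20 movers
    favored = total / 2 + movers            # 50 + 20 = 70
    other = total / 2 - movers              # 50 - 20 = 30
    return (favored - other) / total * 100  # (70 - 30) / 100 -> 40.0

print(win_margin(100, 40))  # prints 40.0
```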

In other words, tech companies can put one candidate ahead of another.


MR. JEKIELEK: Won’t people just say, “Well, that’s just the natural consequence of the algorithm”?

DR. EPSTEIN: Most people don’t know how algorithms work or even what algorithms are. But one thing should disturb a lot of people: algorithms incorporate the biases of the programmers. Right now, for example, 96 percent of donations from Google and other tech companies go to the Democratic Party. There’s a lot of political bias in these companies, and that bias can get programmed into the algorithms.

One main concern here is the blacklists. One of the simplest ways to make an adjustment in what an algorithm is doing is to have your algorithm check a blacklist before it displays any results. And when certain points of view aren’t acceptable, they’re banned.
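To make that concrete, a pre-display blacklist check of the kind he describes could be as simple as the following sketch; the names and data are hypothetical, not Google’s actual code:

```python
# A hypothetical illustration of the mechanism described above: a
# ranking pipeline that consults a blacklist before displaying
# results. The domains and function are invented for this sketch;
# this is not Google's actual code.

BLACKLIST = {"banned-example.com"}

def filter_results(ranked_results: list[dict]) -> list[dict]:
    """Silently drop any result whose domain is on the blacklist."""
    return [r for r in ranked_results if r["domain"] not in BLACKLIST]

results = [
    {"title": "Story A", "domain": "news-example.com"},
    {"title": "Story B", "domain": "banned-example.com"},
]
print(filter_results(results))  # Story B never appears
```

Because the check happens before anything is rendered, a removed result simply never appears, which is consistent with the earlier point that these manipulations leave no paper trail for anyone to trace.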

In 2019, a Google vice president appeared before a Senate committee, where he was asked under oath, “Does Google have blacklists?” And this man replied, “No, Senator, we do not.”

A few weeks later, Zach Vorhies, a senior software engineer, walked out of Google with more than 950 pages of documents and a two-minute video. Three documents were labeled “Blacklists.” This Google vice president had lied under oath to Congress.


MR. JEKIELEK: Let’s jump to monitoring now. Broadly speaking, from that data set from 2020, what is it that you found?

DR. EPSTEIN: We found a strong liberal bias. On YouTube, 93 percent of the videos recommended by Google’s Up Next algorithm came from strongly liberal news sources. At that point, we had about half a million ephemeral experiences preserved, and we decided to go public. A New York Post journalist took all our content and started writing a fabulous piece about how the tech companies are rigging our elections.

But when her editor tried to get a comment from Google on some of the factual content, which is normal, two things happened. No. 1, the New York Post killed the piece. I couldn’t believe it, but then I found that some 40 percent of their traffic comes from Google. You can’t attack Google on that scale without risking your business. So okay, they killed the piece. No. 2, in those couple of days before the presidential election, Google turned off its manipulations, and we thought, “That’s interesting.”

Along the way, I had contacted someone I knew in Sen. Ted Cruz’s (R-Texas) office. On Nov. 5, two days after the election, three U.S. senators sent a strong, threatening letter to the CEO of Google summarizing our preliminary presidential election findings. And then a miracle occurred. The field agents monitoring content from the tech companies prior to the January 2021 Senate runoff elections in Georgia found that all the bias was gone from Google. I mean, gone. For example, there was not a single “Go Vote” reminder.

This is the solution to the way in which these companies interfere in our democracy, in our lives, and with our children. If we monitor, capture, archive, and expose these companies, then they’ll stay out of our lives.

MR. JEKIELEK: So how do we avoid being influenced this way?

DR. EPSTEIN: What you can do fairly easily is just go to my website, myprivacytips.com, and read my article on how to avoid targeted ads. Over the years, I’ve learned how to protect my privacy, and I share some information there. And the research institute where I work has set up a new organization, Internet Watchdogs, which is designed to help people protect the privacy of their children and families.

Witnesses: Amazon CEO Jeff Bezos, Facebook CEO Mark Zuckerberg, Google CEO Sundar Pichai, and Apple CEO Tim Cook at a hearing on Capitol Hill in Washington on July 29, 2020.

MR. JEKIELEK: Count me in on signing up for Internet Watchdogs. But a question: How will Internet Watchdogs and these monitoring systems stay pure to their mission?

DR. EPSTEIN: We’re like Google was at the beginning. But Google got twisted into a greedy version of itself because of money; it’s a for-profit, and they found ways to get obscenely rich. Internet Watchdogs and any project I’ve ever touched are nonprofits. There’s no ownership.

So we’re doing creative, wild stuff and having a ball while we’re doing it. My team loves coming in and working long hours. And I think in the future, we’re still going to be excited about what we’re doing, because we’re in this cat-and-mouse game where we’re using good tech to fight bad tech, to protect kids, to protect humanity, to protect democracy. I think the future will be even better than what we’re seeing now.
