THE BEACON
SOUTHERN MAINE COMMUNITY COLLEGE
VOLUME 14 NO. 9 • ISSUE DATE 2•13•18
BY THE STUDENTS FOR THE STUDENTS
Art Club Hosts Bob Ross Craft Day
By The Beacon Staff

This past Sunday the Art Club hosted a craft day with SMCC’s very own Bob Ross look-alike, Wylie Holt. SMCC students and friends came down to the Art Studio to make Valentine’s Day cards with Wylie and other members of the Art Club while watching “The Joy of Painting.” Bob Ross has inspired many, and Sunday was no exception. Students created all types of cards, from hearts to bees to trees to three-dimensional dogs. With glitter and cupcakes for all in attendance, the art studio was filled with love, creativity and Bob Ross.
Photos courtesy of the Art Club.
Welcome to Deepfake Reality
By Troy Hudson
On Jan. 8, 2018, a Reddit user known as deepfakes publicly shared an app he had developed called FakeApp, which allows anyone to take existing footage of a person and convincingly swap that person’s face with another face. Deepfakes has been sharing examples of videos (also known collectively as “deepfakes”) produced with the algorithm for about a year, but now anyone can download the app and, by taking advantage of neural-network cloud computing, create realistic simulations without expensive hardware or coding skills.

Unsurprisingly, the first and most widespread application of this technology has been porn. The faces of stars like Daisy Ridley, Emma Watson and Jessica Alba have been superimposed on adult film actors with results so convincing that many people wouldn’t know they were looking at a fake. Reddit’s r/deepfakes thread is now full of such user-generated videos.

The app may be new, but the idea isn’t. Anyone who’s seen 2016’s “Rogue One: A Star Wars Story” knows that Hollywood can digitally recreate a 19-year-old Carrie Fisher at will, albeit with mixed results. Many fans of the series bemoaned the new technology as gimmicky, while others were perfectly happy with the cinematic trickery. What is different this time is that it no longer takes a Disney-movie budget to produce these fakes; anyone with a decent computer can create their own in about eight to 12 hours.
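For technically curious readers, the trick behind face-swap apps like FakeApp is widely reported to be an autoencoder with one shared encoder and a separate decoder for each person: both decoders learn to rebuild their own person’s face from the same compressed code, and swapping simply pairs person A’s code with person B’s decoder. The sketch below is a minimal illustration of that general idea in PyTorch, not FakeApp’s actual code; the 64-by-64 face crops, layer sizes and training details are illustrative assumptions, not details from this article.

# Minimal sketch of the shared-encoder / dual-decoder face-swap idea.
# Assumes aligned 64x64 RGB face crops; all sizes are illustrative.
import torch
import torch.nn as nn

class Encoder(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 32, 4, stride=2, padding=1),   # 64 -> 32
            nn.ReLU(),
            nn.Conv2d(32, 64, 4, stride=2, padding=1),  # 32 -> 16
            nn.ReLU(),
            nn.Flatten(),
            nn.Linear(64 * 16 * 16, 256),               # shared latent code
        )

    def forward(self, x):
        return self.net(x)

class Decoder(nn.Module):
    def __init__(self):
        super().__init__()
        self.fc = nn.Linear(256, 64 * 16 * 16)
        self.net = nn.Sequential(
            nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1),  # 16 -> 32
            nn.ReLU(),
            nn.ConvTranspose2d(32, 3, 4, stride=2, padding=1),   # 32 -> 64
            nn.Sigmoid(),
        )

    def forward(self, z):
        h = self.fc(z).view(-1, 64, 16, 16)
        return self.net(h)

# One shared encoder, one decoder per identity.
encoder = Encoder()
decoder_a, decoder_b = Decoder(), Decoder()
params = list(encoder.parameters()) + list(decoder_a.parameters()) + list(decoder_b.parameters())
opt = torch.optim.Adam(params, lr=1e-4)
loss_fn = nn.L1Loss()

def train_step(faces_a, faces_b):
    """Each decoder learns to reconstruct its own person's faces from the shared code."""
    opt.zero_grad()
    loss = loss_fn(decoder_a(encoder(faces_a)), faces_a) + \
           loss_fn(decoder_b(encoder(faces_b)), faces_b)
    loss.backward()
    opt.step()
    return loss.item()

def swap_face(face_a):
    """The 'deepfake' step: encode person A, then decode with person B's decoder."""
    with torch.no_grad():
        return decoder_b(encoder(face_a))

Training a pair of decoders like this on a few thousand face crops is, roughly, where the “eight to 12 hours” on a decent computer goes; once trained, producing the swapped frames is fast.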
The ethical implications of creating realistic pornographic fakes of unwilling participants are certainly troubling, all the more so as our culture comes to grips with accounts of sexual harassment and abuse in the entertainment industry and beyond. But as is often the case, porn is just the beginning of what a technology like FakeApp can be used to create. Political deepfakes cropped up almost as soon as the app was released, and include President Donald Trump’s face on German Chancellor Angela Merkel and Adolf Hitler’s face on Argentinian President Mauricio Macri.

These examples are obviously meant as satire, but it is not difficult to foresee a day very soon when these fakes become so convincing that they are all but impossible to recognize. In an era that has already been described as “post-truth,” this could bring us to peak skepticism, when we doubt even what we see with our own eyes.

Of course, visual fakery itself is nothing new. Heavily Photoshopped images appear on virtually every movie poster, magazine and billboard in existence, so much so that we have come to expect such manipulation and tacitly accept it. A recent social-media meme got laughs at the expense of a Vanity Fair photo shoot where a bad edit seems to show Oprah with a third hand. There was no outrage that the photo had been manipulated, just ridicule that the editor had done such a poor job concealing the trick.

And if deepfakes do further erode our crumbling faith in the media, they will only be the final nail in the coffin of credibility. Escalating rumors and leaks from anonymous informants are peddled daily by news networks in an endless battle for ratings, and the current administration has consistently worked to undermine any negative reporting by dismissing it as “fake news.” On Feb. 5, President Trump took time out from a speech in Blue Ash, Ohio, to wave hello to the “fake news media,” saying he was glad the speech was being broadcast live so his remarks couldn’t be edited and misconstrued.
Gallup reported in fall 2016 that only 32 percent of Americans believe the news media presents the truth “fully, accurately and fairly,” and there are no signs that faith is being restored. It is worth pointing out that experts believe this may not entirely be the fault of the media, as people are known to distrust stories that conflict with previously held beliefs, regardless of accuracy. But if the power to create fakes so convincing that no one can tell them apart from reality becomes fully democratized, then the media will be just as helpless as the rest of us in distinguishing fact from fiction. Who can you trust to tell the truth when you can’t even believe your own eyes?

Now that FakeApp is nearly universally available (except, of course, in countries with heavily restricted internet access like North Korea), it is a virtual certainty that the phenomenon is already, or soon will be, unstoppable. Perhaps technology to detect the fakes will keep pace with the simulations, or perhaps not.

There may, however, be an upside to this new deepfake reality: In order to ascertain the truth when nothing is certain, it will become more necessary than ever to weigh new information against what we already know (or think we know), and to take nothing for granted. It is possible that our deepening skepticism could inspire a more discerning public, no longer so quick to react to the latest scandalous revelations, which may, after all, turn out to be nothing more than cinematic sleight of hand. But if, as we now know to be true, foreign powers like Russia have already succeeded in infiltrating American politics using nothing more than Facebook ads and blog posts, the future of deepfakes could hold some very troubling realities indeed.