The Consequences of Surveillance
by Catherine Pham
art by Joy Chen
In this contemporary digital age, we are constantly sharing personal information: our interests, photos, consumer history, location, desires, and countless other facets of our digital footprints. Technology is developing faster than it can be regulated, and there is little oversight of how our private information is sold. But what happens when technology enforces the law, weaponizing our personal data to uphold a white, capitalist surveillance state?

In January 2020, the New York Times revealed that more than 600 law enforcement agencies have been using Clearview, a facial recognition app still in its covert start-up stage. Clearview's algorithm compares an uploaded photo to similar photos in its comprehensive database and returns links to each match's source. That database is a collection of faces scraped from across the Internet: news sites, employment sites, and social networks such as Facebook, Twitter, and even Venmo. Many companies ban this scraping of users' data from their websites, and Twitter has explicitly prohibited the use of its data for facial recognition. But Clearview has deliberately disregarded these companies' data privacy policies in order to capitalize on the pervasive police state.

Clearview's massive photo network, which retains photos even after accounts are made private, gives its clients unprecedented efficiency in identifying subjects, even when their faces are partially covered in the photos. When testing Clearview's technology, the Indiana State Police identified a suspect from bystander footage of a fight within 20 minutes of using the app. Many fear the release of this invasive technology to civilians, but placing this power
in the hands of the police already renders it ripe for abuse. Police officers may use this technology for personal reasons without consequence, as they historically have with other weapons granted by the state. Multiple studies show that 40 percent of police officers' families experience domestic violence, compared to 10 percent of families in the general population.

Furthermore, this technology facilitates the racism that has long pervaded law enforcement. Police can now identify activists from a single picture, or locate any citizen they racially profile. Immigration and Customs Enforcement (ICE) uses Amazon's Rekognition software, which matches photos against real-time footage from police body cameras and city camera networks in order to identify and arrest undocumented immigrants. Yet studies have found that Amazon's software is more likely to misidentify darker-skinned faces, and similar gender and racial biases have been found in facial analysis software from Microsoft and IBM. The solution to these inherent biases is not to improve the accuracy of surveillance technology, but to eliminate its use as an instrument of oppression and injustice entirely.

Personal surveillance technology like Ring, Amazon's home surveillance camera, pushes people to police their own communities. Law enforcement agencies are allowed to access people's Ring surveillance footage without a warrant, and as of July 2019, more than 200 police departments had partnered with Ring. These police departments have special access to Ring's accompanying app, Neighbors, where people can share their surveillance footage with anyone else