Ubiquitous


NO. 4 NOV '25

ADVERSARIAL FASHION VS SURVEILLANCE


U·BIQ·UI·TOUS

Pronunciation: /yo͞oˈbikwədəs/ /juˈbɪkwədəs/

Adj. Present, appearing, or found everywhere. "New ubiquitous information and communication technologies, in particular recording-enabled smart devices and social media programs, are giving rise to a profound new power for ordinary people to monitor and track each other on a global scale." (Weissman, 2019)

YOU'RE ALWAYS BEING WATCHED

BUT YOU'RE NEVER SEEN


Automated facial recognition is a type of facial analysis technology used to automatically recognize specific faces in public places such as a park, a concert or a mall.

"Among the most concerning uses of facial analysis technology involve the bolstering of mass surveillance, the weaponization of AI, and harmful discrimination in law enforcement contexts." - Joy Buolamwini

Automated facial recognition is made possible by software trained on datasets of photographs and videos of faces. The AI learns to distinguish faces by comparing thousands of images that are tagged and classified by demographic attributes such as race, age, ethnicity and gender. Many of these training images are sourced without the knowledge of the people in the photos; they come from databases held by universities, social media sites and public cameras. The software is sold mainly to law enforcement bodies such as police, military and migration authorities around the world. Some of these software companies even claim to be able to detect attributes like sexuality and criminality. Can you imagine what would happen if an oppressive surveillance regime had these kinds of facial recognition tools? It is already happening.
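The recognition step described above can be sketched as a nearest-neighbour search over numeric face "embeddings". This is a minimal illustration, not any vendor's actual system: the names, vectors and threshold below are invented, and a real system would produce embeddings with a trained neural network rather than by hand.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def identify(probe, gallery, threshold=0.9):
    """Return the gallery identity most similar to the probe
    embedding, or None if nothing passes the threshold."""
    best_name, best_score = None, threshold
    for name, embedding in gallery.items():
        score = cosine_similarity(probe, embedding)
        if score > best_score:
            best_name, best_score = name, score
    return best_name

# Hypothetical 3-dimensional embeddings; real systems use
# hundreds of dimensions learned from tagged photo datasets.
gallery = {
    "person_a": [0.9, 0.1, 0.2],
    "person_b": [0.1, 0.8, 0.3],
}

print(identify([0.88, 0.12, 0.25], gallery))  # matches person_a
print(identify([0.0, 0.0, 1.0], gallery))     # no confident match -> None
```

The point of the sketch is that whoever controls the gallery of embeddings controls who can be "found": every face added to the dataset, with or without consent, becomes searchable.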


Researchers such as Joy Buolamwini have shown that popular facial recognition software, including Amazon's "Rekognition", isn't accurate when trying to recognize the faces of people of colour, especially women of colour. Buolamwini explains that there is bias in the way these algorithms are trained, since tests for accuracy often focus on the system's capacity to identify white men's faces (Buolamwini, 2019). In her master's thesis she found that the leading algorithms performed better at identifying male faces than female faces, and light-skinned faces than dark-skinned faces. These systems also have problems recognizing the gender of the person portrayed; women of colour are often classified by these systems as "male". What happens when an algorithm mistakenly recognizes you as a criminal? If you are a marginalized person it could cost you your life.
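The method behind findings like these can be illustrated in a few lines: instead of reporting one overall accuracy number, accuracy is computed separately per demographic group, which is what exposes the gaps. The records below are made-up toy data for illustration, not results from Buolamwini's research.

```python
from collections import defaultdict

def accuracy_by_group(records):
    """records: list of (group, predicted_label, true_label).
    Returns accuracy per group, revealing disparities that a
    single aggregate accuracy number would hide."""
    correct = defaultdict(int)
    total = defaultdict(int)
    for group, predicted, true in records:
        total[group] += 1
        if predicted == true:
            correct[group] += 1
    return {g: correct[g] / total[g] for g in total}

# Invented toy predictions, NOT real benchmark data.
records = [
    ("lighter_male", "male", "male"),
    ("lighter_male", "male", "male"),
    ("darker_female", "male", "female"),   # misgendered
    ("darker_female", "female", "female"),
]

print(accuracy_by_group(records))
# one group scores perfectly here while the other does not,
# even though overall accuracy looks reasonable
```

A vendor quoting only the overall number (3 of 4 correct, 75%) would hide that all of the errors fall on one group; disaggregating is what makes the bias visible.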

Joy Buolamwini is a researcher who founded the Algorithmic Justice League; you can follow her work at ajl.org


HAS YOUR FACE BEEN USED TO TRAIN AUTOMATIC FACIAL RECOGNITION ALGORITHMS?

A visualization of 2,000 of the identities included in the MS Celeb database from Microsoft. Credit: Open Data Commons Public Domain Dedication and License, via Megapixels


Now that you know that automatic facial recognition algorithms are trained on datasets of photographs and videos, you might wonder where these datasets come from. Journalist Cade Metz explains that they are created by universities and software companies using images extracted from social media sites such as Facebook, Twitter and dating sites. Institutions such as Stanford University have created datasets from images captured in public places. In one notorious example, researchers from Stanford placed a camera in a public café, capturing at least 10,000 images of people who weren't aware they were being filmed for these purposes. Later this dataset was used by a Chinese company that sold face recognition software to the Chinese government, which used it to identify and monitor Uighurs (Metz, 2019).

Even places close to you are capturing your face and selling that data without your knowledge or consent. Cadillac Fairview illegally collected the information of at least 5 million people with cameras placed in various malls. The Office of the Privacy Commissioner of Canada expressed concern that Cadillac Fairview refused to ask for the consent of shoppers and wasn't willing to ask for it if it re-deployed the technology in the future (Warburton, 2020). Cadillac Fairview didn't have to pay any fines for this.

Companies, universities and researchers might be using your face to train algorithms that are used by law enforcement and state surveillance in other parts of the world.

You haven’t consented to this. Would you consent to this? Can you withdraw your consent?


HOW MANY SURVEILLANCE CAMERAS ARE THERE IN TORONTO?

In a CTV Toronto interview with University of Toronto professor Andrew Clement, it was reported that in 2015 there were 15 police cameras, 500 in or near the Eaton Centre and 13,000 cameras monitoring us in the TTC (D'Mello, 2015); this accounts for at least one camera for every 400 or so metropolitan residents (Rickwood, 2019). These numbers don't include the cameras in private establishments such as shopping malls, cafés, office buildings and convenience stores. Professor Clement explains that cameras are required by law to have signs informing people they are being watched, but he says most people don't complain about this because they don't know the law around surveillance cameras in public spaces (D'Mello, 2015).

In 2018 Toronto's Mayor John Tory asked the city council to increase the number of police cameras to close to 80, in order to heavily police places where there was an increase in gang activity (Goffin, 2018). These places are in neighbourhoods where minorities are already heavily surveilled, and adding more cameras doesn't mean it's going to be safer for them, especially where the relationship with police is tense. In a blog post by Comparitech, a research group out of the U.K., a graph maps the relationship between the number of CCTV cameras and the crime index; the group found that "a higher number of cameras just barely correlates with a lower crime index. Broadly speaking, more cameras doesn't necessarily reduce crime rates." (Bischoff, 2020)
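The "one camera for every 400 or so residents" figure can be checked against the counts from the interview as a back-of-the-envelope calculation. The Greater Toronto population of roughly 5.9 million used below is an assumption for this check, not a figure from the cited articles.

```python
# Camera counts reported for 2015 (D'Mello, 2015)
police = 15
eaton_centre = 500
ttc = 13_000

cameras = police + eaton_centre + ttc   # 13,515 known public cameras

# Assumed Greater Toronto population, for illustration only
metro_population = 5_900_000

residents_per_camera = metro_population / cameras
print(round(residents_per_camera))  # in line with "one per 400 or so"
```

Note that this counts only the publicly reported cameras; folding in private establishments would drive the ratio down considerably.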


SurveillanceRights is a research project from the Faculty of Information at UofT that looks to inform Canadians about their rights in relation to surveillance. They have developed an app to map surveillance cameras through crowdsourcing. Learn more about it at http://surveillancerights.ca/

Surveillance Watch app, web screenshot by author


HOW TO PROTECT YOURSELF WITH FASHION

© Adam Harvey 2010. For DIS Magazine (2010). Creative direction by Lauren Boyle and Marco Roso. Model: Jude. Hair: Pia Vivas. Found at https://cvdazzle.com/


ADAM HARVEY

Designers, technologists and hackers have been working on ways to protect themselves from pervasive surveillance and automated facial recognition software by creating design products that confuse, block or distract facial identification algorithms.

© Adam Harvey 2010 Look N° 5 (a) For New York Times Op-Art

Makeup has also been used for these purposes, as exemplified by Adam Harvey's project "CV Dazzle", a way of using makeup and hairstyles as facial camouflage that disturbs the way facial analysis algorithms find patterns in faces (Meyer, 2015). Harvey explains that these looks were made specifically for the Viola-Jones Haar Cascade algorithm and shouldn't be expected to work against more advanced algorithms such as those that use neural networks to identify faces; he emphasizes that "CV Dazzle designs are relative to the algorithm they are being used against" (Harvey, 2010). He also cautions against uploading photos of the makeup to social media, because they can be used to train algorithms that detect faces with makeup. Even if these looks aren't enough to trick a modern facial recognition algorithm, they make a powerful statement about the creative ways people protest the prevalence of surveillance technologies.

© Adam Harvey 2010 Look N° 3 For DIS Magazine (2010)

© Adam Harvey 2010 Look N° 3 For DIS Magazine (2010)

© Adam Harvey 2010 Look N° 2 For DIS Magazine (2010)


KATE ROSE

"Adversarial fashion" is a term coined by Kate Rose, a digital security professional and designer who designed t-shirts to confuse automatic plate readers. She explains that her invention confuses the automatic detection software by feeding false plate numbers into the system. These plate readers are ubiquitous and can track people around cities, which becomes dangerous when people visit sensitive places, for example rehab centres and immigration clinics (Cole, 2019). She explains on her website that the patterns she created trigger automatic plate readers, but the plates on her designs are not real; they feed erroneous information into the system, overloading it with additional false data. Rose presented her project at the DEFCON 27 Crypto & Privacy conference, where she explained why textiles are a good tool to confront surveillance:

• Everyone wears clothes
• Not everyone can learn how to use the technology, but everyone can buy clothes
• Engaging in a tactile way motivates people to want to learn about the technology
• If we want people to care about surveillance we have to put examples in their hands (Rose, 2019)

On her website Rose goes beyond selling her designs; she also explains how to make your own DIY plate pattern using open source software, libraries and tutorials. https://adversarialfashion.com/
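The "feeding false plates into the system" idea can be simulated in a few lines: once reads triggered by adversarial garments enter a tracking log, a query over that log returns junk alongside the genuine sightings. Everything here, including plate numbers, locations and the log format, is invented for illustration and is not how any real ALPR system stores data.

```python
# Hypothetical ALPR log: each entry is (plate_text, location).
real_reads = [("ABCD123", "clinic"), ("EFGH456", "mall")]

# Reads triggered by a shirt printed with fake plate-like patterns:
# the reader OCRs the fabric and logs strings that match no real car.
adversarial_reads = [
    ("FAKE001", "clinic"),
    ("FAKE002", "clinic"),
    ("FAKE003", "clinic"),
]

log = real_reads + adversarial_reads

# An analyst asking "which plates were seen at the clinic?" now gets
# one genuine sighting buried among fabricated entries.
seen_at_clinic = [plate for plate, loc in log if loc == "clinic"]
print(seen_at_clinic)
```

Scaled up, this is the overload Rose describes: the tracking database still works mechanically, but its answers can no longer be trusted without expensive manual verification.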


© Kate Rose 2020, recovered from https://adversarialfashion.com/


IS FASHION ENOUGH?


No. But it's a powerful beginning. Torin Monahan argues that some of these fashion initiatives are an aestheticization of resistance that fails to challenge the violence and discrimination of surveillance society, and fails to take race, sex, gender and class into account in its analysis. He explains that some of these initiatives are based on an individualistic approach to surveillance technologies and overlook that "it must be recognized that a host of surveillance functions are reserved for those who threaten the status quo, principally those classified as poor or marked as Other" (Monahan, 2015, 160). To make these artistic interventions more impactful, Monahan argues that "countervisuality would instead challenge forms of violence and oppression, acknowledging differential exposures and effects" (Monahan, 2015, 160): for example, they could be used to look back at the establishments doing the surveillance and to uncover what those establishments want to keep hidden.

In this zine we have included projects that go beyond products or art. The SurveillanceRights project looks back at the institutions and places watching us by mapping the locations of their cameras. Kate Rose not only designs products for sale but also gives interested people the tools to create their own anti-visual-recognition patterns. Adam Harvey doesn't only create futuristic looks that confuse facial recognition software; he also shares ways to protect your face from this kind of software. We hope this zine makes you reflect on the ways automated facial recognition is ubiquitous in your life and what you can do to protect yourself, but especially what you can do to protect your community and others.


References:

Buolamwini, J. (2019, April 24). Response: Racial and Gender Bias in Amazon Rekognition - Commercial AI System for Analyzing Faces. Retrieved November 11, 2020, from https://medium.com/@Joy.Buolamwini/response-racial-and-gender-bias-in-amazon-rekognitioncommercial-ai-system-for-analyzing-faces-a289222eeced

Metz, C. (2019, July 13). Facial Recognition Tech Is Growing Stronger, Thanks to Your Face. Retrieved November 13, 2020, from https://www.nytimes.com/2019/07/13/technology/databases-facesfacial-recognition-technology.html

Cole, S. (2019, August 15). This Hacker Made Clothes That Can Confuse Automatic License Plate Readers. Retrieved November 12, 2020, from https://www.vice.com/en/article/qvgpvv/adversarialfashion-clothes-that-confuse-automatic-license-plate-readers

Monahan, T. (2015). The Right to Hide? Anti-Surveillance Camouflage and the Aestheticization of Resistance. Communication and Critical/Cultural Studies, 12(2), 159–178. https://doi.org/10.1080/14791420.2015.1006646

Meyer, S. (2015, August 20). How I Hid From Facial Recognition Surveillance Systems. Retrieved November 30, 2020, from https://www.theatlantic.com/technology/archive/2014/07/makeup/374929/

Radway, J. (2016). Girl Zine Networks, Underground Itineraries, and Riot Grrrl History: Making Sense of the Struggle for New Social Forms in the 1990s and Beyond. Journal of American Studies, 50(1), 1–31. https://doi.org/10.1017/S0021875815002625

Warburton, M. (2020, October 29). Canada privacy agency raps mall owner for illegally collecting shoppers' personal data. Retrieved November 02, 2020, from https://www.reuters.com/article/us-canadacadillac-fairview-privacy-idUSKBN27E33U

Harvey, A. (2010). Computer Vision Dazzle Camouflage. Retrieved November 27, 2020, from https://cvdazzle.com/

D'Mello, C. (2015, February 27). How many cameras are watching you? Toronto professor concerned about privacy. Retrieved November 27, 2020, from https://toronto.ctvnews.ca/how-many-cameras-arewatching-you-toronto-professor-concerned-about-privacy-1.2255985

Goffin, P. (2018, July 19). Mayor Tory wants to 'more than double' security cameras in bid to stem gun violence. Retrieved November 28, 2020, from https://toronto.citynews.ca/2018/07/19/mayor-torywants-double-security-cameras-bid-stem-gun-violence/

Bischoff, P. (2020, November 26). Surveillance Camera Statistics: Which City has the Most CCTV Cameras? Retrieved November 28, 2020, from https://www.comparitech.com/vpn-privacy/the-worlds-mostsurveilled-cities/

Rickwood, L. (2019, October 16). Toronto Among the Most Surveilled Cities, But More CCTV Cameras Coming. Retrieved November 29, 2020, from https://whatsyourtech.ca/2019/10/16/toronto-among-the-mostsurveilled-cities-but-more-cctv-cameras-coming/

Rose, K. (2019). DEFCON 27 Crypto & Privacy Presentation. Retrieved November 29, 2020, from https://adversarialfashion.com/pages/defcon-27-crypto-privacy-presentation

