Business Anthropology ● ISSUE 11 ● JUNE 2021
Anthropology and Artificial Intelligence
Illustration: Ana María González
Issue N°11
Staff
Founders: Giovanna Manrique, Natalia Usme
Content Director: Natalia Usme
Editor in Chief: Carolina Serrano
Art Director: Cristi De Matos
Illustrator: Ana María González
Columnists: Jesús Contreras, Carolina Serrano, Natalia Usme, Nelson Polanía
Translator: Natalia Usme
Proofreader: Carolina Serrano
*
Follow us on Social Media:
Facebook: Flipa Consultora
Twitter: @FlipaConsultora
Instagram: @FlipaConsultora
YouTube: Flipa Antropología de Negocios
Web: Flipa Consultora
Flípate © Magazine, June 2021. Issue No. 11. All rights reserved. Flípate Magazine is not responsible for the publication or distribution of international editions unless the edition has been authorized by Flipa's administrative staff.
Do you want to receive the magazine or send us comments? Please email us at contacto@flipaconsultora.com
Anthropology and Artificial Intelligence
— Issue N°11 —
*
It is sunny and we are walking through a long desert when, suddenly, we see a shining object. We walk towards it and, when we arrive, we realize that it is a large bottle of water. We open it and refresh ourselves by drinking it. Then the bottle disappears from our hands. It was a mirage! In this edition we delve into algorithm-based products that often become mirages in a business desert, feeding human beliefs, practices, and biases in digital environments and shaping the future of the end customer. We invite you to burst these mirages and think about the challenges that arise from working with algorithms.
*
Founder
p06 The Digital Biases of Love, by Natalia Usme
p13 Lil Miquela and the Digital Influencers of the Future, by Carolina Serrano
p20 TikTok: Making the World Dance, by Nelson Polanía and Natalia Usme
p26 In the magnifying glass, by Jesús Contreras
Our writers

Natalia Usme
Business Anthropology Manager and Co-owner at Flipa Consultora. She is the pioneer of Business Anthropology in Colombia and has more than 8 years of experience, focusing on designing present and future strategies for companies. She holds a Master of Arts in Applied Cultural Analysis from Lund University in Sweden. At Flipa, she leads national and international projects. Natalia is the creator of the first online summit on Business Anthropology in Latam: the Flipa Summit.
Jesús Contreras
Founder of the GOST Project, an initiative that uses photography as an instrument for change. He holds a B.A. in Communication and in Social and Cultural Anthropology. Jesús has more than 10 years of experience in media and specializes in print journalism and photography. In 2008 he won the National Journalism Award in Venezuela with a mention in Photography. He focuses on visual arts, culture, and inclusive education.
Carolina Serrano
Anthropologist from Externado de Colombia University. Social researcher with experience in private and not-for-profit organizations, and more than 5 years of experience in international tourism. She is passionate about intercultural communication and about cultural and community-based tourism initiatives. She co-hosted the first Online Business Anthropology Summit, held by Flipa in August 2020.
Nelson Polanía
Anthropology student at the Pontificia Universidad Javeriana (Bogotá), with an emphasis in the anthropology of health. Digital influencer on platforms such as Instagram and TikTok. He has experience in social media management and in consolidating a linguistics database at Universo Ciencias Sociales (UNICISO), a Colombian educational platform. Interested in social media, marketing, advertising, and digital communities.
*
Photo by Caique Silva from Pexels
The Digital Biases of Love
By Natalia Usme
It was 2016 when I first signed up on Tinder. Unlike most, I was there to analyze this digital bar from an anthropological perspective.
To set up my profile, Tinder asked whether I wanted to meet men or women. Then it prompted me to adjust the age and geographic ranges to my preference. At first this seemed merely functional, but these settings would most likely bias my matches from the start. Finally, I uploaded a photo and wrote a short biography. And so, men ranging from 26 to 35 years old began popping up on my phone's screen.
Over time, the app begins to "learn" –as if it had a brain– about its users, and it finds suitable options for each profile based on usage patterns and the frequency of matches with certain types of people. We could argue that, at some point, the machine begins deciding who we should like. Although the app creates the illusion of giving us the power of choice, in reality it is already filtering our possible matches before we are even aware of it. Human emotions such as love or desire intersect with the machine.
Photo by Cottonbro from Pexels
For those of us curious about how Tinder works, here is my anthropological vision. Algorithms and deep learning are at the core of this app. From the first moment, the machine begins analyzing the location, age, and gender preferences of the person behind the screen. It also "reads" visual information –such as people's clothing and their non-verbal language– and crosses it with the biographical description the person shares. With this input, the app starts creating possible matches.
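To make that mechanism concrete, here is a minimal sketch of how such a two-stage matcher could work: explicit settings act as hard filters, and a crude frequency count over previously liked profiles plays the role of the "learning". This is an illustrative assumption, not Tinder's actual algorithm; every field name, tag, and weight below is invented for the example.

```python
from dataclasses import dataclass

@dataclass
class Profile:
    user_id: str
    age: int
    gender: str
    distance_km: float
    tags: frozenset  # e.g. traits inferred from photos and bio text (assumed)

def candidate_matches(seeker_prefs, liked_history, candidates):
    """Toy two-stage matcher: hard filters first, then a learned-preference score.

    seeker_prefs: dict with 'genders', 'age_range', 'max_km' (explicit settings)
    liked_history: Profiles the seeker previously liked
    candidates: Profiles to rank
    """
    lo, hi = seeker_prefs["age_range"]
    eligible = [
        c for c in candidates
        if c.gender in seeker_prefs["genders"]
        and lo <= c.age <= hi
        and c.distance_km <= seeker_prefs["max_km"]
    ]

    # The "learning" here is nothing more than counting how often each tag
    # appears among profiles the seeker already liked.
    tag_counts = {}
    for p in liked_history:
        for t in p.tags:
            tag_counts[t] = tag_counts.get(t, 0) + 1

    def score(c):
        return sum(tag_counts.get(t, 0) for t in c.tags)

    # Whatever resembles past likes is shown first: the filtering happens
    # before the person ever sees the deck.
    return sorted(eligible, key=score, reverse=True)
```

The point of the sketch is the order of operations: by the time a face appears on screen, both the explicit settings and the accumulated swipe history have already narrowed the field.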
In my study, not only did I open a profile on Tinder, but I also interviewed fourteen people to understand how they used the app. Let's delve into their stories, analyze what underlies them and, above all, draw lessons we can apply to transform the way we design our products.

I first interviewed Daniel, a 25-year-old man living in the Colombian city of Manizales. Shortly after starting our session, he revealed his strategy for success on Tinder: "It is like a mask: my perfect alter ego. I decide when and, especially, who I talk to. I love being able to filter by geographic location –you don't want to date a random girl from god knows where." That last bit –"a random girl from god knows where"– had me thinking. Daniel was not the only one who thought like this; the other thirteen interviewees shared his opinion. All of them used geographic filters to "request" that Tinder show people living in what they considered the exclusive areas of their places of residence. Beyond a functional feature, the geographic filter tapped into the interactions and hidden meanings between social classes. These are the contextual and social biases that digital products reinforce in humans. What we are seeing here is a location bias linked to Colombian cultural understandings. I wonder whether this happens in other countries and, if so, what the impact is on social interactions between "users". It seems to me that it is time for technology and design teams to analyze what positive or negative notions they validate in their digital products.

Later, I interviewed Cristopher, a young man with a very particular approach to interacting with women on Tinder. He said that on the first day of a match he spoke to the woman for only five minutes. He would then wait for her to initiate conversations, but he would not reply at all. After two weeks or so, he would begin talking to her again, apologizing for not responding. He would then send photos of a pair of shoes he said he had recently bought and ask for advice on how to wear them. From an anthropological and even psychological standpoint, we could argue that Cristopher used Tinder to replicate the concept (or complex) of what it means to be a macho in some Latin American contexts and countries. Judith Butler's concept of the heterosexual matrix helps explain this idea. Butler argues that in binary-minded societies (where things in the world are arranged in pairs such as light versus dark, up versus down, inside versus outside), people who identify as male are expected –and to a degree forced by their cultural contexts– to appear as conquerors, as opposed to women, who are to be ruled and dominated by them. In Cristopher's case, we could argue that Tinder replicates the macho complex by endorsing the associated behaviors. If we turn this into a design question, we could ask ourselves what would have happened if Tinder had carried out concept tests around "gender personality" archetypes. A gender personality archetype is an artifact that allows us to identify whether the person using our product has gender biases that can impact their consumption decisions, as in Cristopher's case.
Photo by RF._.studio from Pexels
There are contextual and social biases that digital products reinforce about the world. It is time for technology and design teams to think very hard about this and analyze what positive or negative notions they validate in their products.
By conducting this kind of test, we can better understand the dynamics of use that will unfold around the product, and how the vision each person has of their gender identity could affect it. The results could open room to create functional mechanisms in our products or services that do not endorse identity biases.

I interviewed Verónica next, and her case caught my attention. She had always liked people of both genders, though she had been more inclined to have boyfriends than girlfriends. She used Tinder, however, to fuel her fantasies with women. Thanks to the application, she had been in a platonic relationship with a woman for a year. They spoke on the phone as if they were a couple, but they had never met in person. Verónica's digital girlfriend knew that she had a boyfriend in real life. In this case, Tinder acquired a new meaning for Verónica: it became a portal for satisfying her hidden needs. And although the machine thinks that Verónica is a lesbian, in reality she is not. Verónica's gender fluidity exceeds the binary understanding of the machine. There are already applications like Grindr that specialize in LGBTQ communities, but is Verónica interested in downloading and creating a profile on this type of app? Perhaps not, because she would not want to leave digital traces that confirm what she is not ready to tell the world. From a business perspective, we could say that there is a gap in the market for dating apps: there are currently no gender-fluid apps. Tinder is binary and Grindr sits at the other extreme. This means that the algorithms and deep learning with which these applications have been designed follow social constructions that reduce the scope of the product and do not adapt to people's realities.

❢ Before releasing a product, let's think about the dynamics of use it will have, even those that seem unlikely. That mindset will help us develop new perspectives and even make better decisions.

If a company today made an in-depth analysis of elements like the ones I have just mentioned, in six months or a year we would surely find a new dating app that does not restrict people's sexual identity and that therefore adapts to the plurality of inclinations people have. When we stop to analyze the contexts, uses, preferences, and meanings of a product or service, we open the doors to detecting business opportunities, innovating, and generating value propositions for unexplored markets. ❢
Lil Miquela and the Digital Influencers of the Future
By Carolina Serrano
Can machines think? With this provocative question, posed in a famous essay titled Computing Machinery and Intelligence published in 1950, British mathematician Alan Turing glimpsed the field known today as artificial intelligence, or AI. Although it has been over seventy years since then, this issue remains unresolved and, indeed, seems more relevant than ever. I like to think that we currently face the most spectacular display of possible answers to this question. It seems that the futures imagined by science fiction, where the boundaries between humans and machines merge and intertwine, have become new realities. Today, computers assist us in all kinds of daily operations and routines. For some
–including myself– life seems unmanageable without them. For the first time in history, humans are capable of complex interactions with them: think of mobile phones, marketing bots, or Alexa, Amazon's popular (but controversial) virtual assistant. In movies, for instance, we can find a good source of creative, futuristic input on the complexity of human-machine bonds. A few examples that come to mind are Blade Runner, directed by Ridley Scott, Her by Spike Jonze, and Bicentennial Man by Chris Columbus. These films explore issues around the "humanization" of cyborgs –such as developing self-awareness, an appreciation for life, or even love relationships with humans.
Photos from Instagram: Lil Miquela
Those visions of possible futures were the source of inspiration for this article. As I was reading an interview with Genevieve Bell, Vice President of Corporate Strategy at Intel and Ph.D. in Anthropology, I came across a reflection that resonated with a recent experience I had on social media: according to Bell, we have reached an intersection where the concern about the nature of our bond with machines has shifted from how we interact with them to how we relate to them (Lalwani, 2016).

The Future of Influencers
The experience I have just referred to is my first "encounter" with one of the most relevant influencers of the moment: Miquela Sousa, better known as Lil Miquela. At nineteen years old, with three million followers on Instagram and a sponsorship list that includes top brands such as Samsung, Givenchy, Calvin Klein, and Prada, Lil Miquela seems to live a dream life filled with fame, power, and an unrivaled sense of fashion. There is something special about Miquela, though. She is not a person of flesh and blood: she is a computer-generated image (or CGI) created by the Californian start-up Brud. Miquela defines herself as a robot artist, and her life unfolds in the same way as that of any young woman: she hangs out with
her best friends, is obsessed with K-pop, gets emotional and sometimes depressed, has fears, but also has dreams and goals. She even has a Snapchat channel called Get Real, Miquela, in which she shares her views on the world and answers her followers' questions about her experience as a robot. When I met Miquela, my head spun. How is it possible that an influencer is not human? Why do her photos look so real? Why do I feel like I can relate to a computer-generated image? These questions continue to haunt me but, as I go through more of her online content, I get more used to her "virtual humanity."
How is it possible that an influencer is not human? Artificial intelligence is capable of creating digital entities that move masses, as is the case of the influencer Lil Miquela, a computer-generated image that has an Instagram account and millions of followers
Judging by the number of followers and the type of sponsors she has, it is clear that there are millions of people who, like me, identify with her too. This new kind of internet star is being warmly welcomed by netizens, which suggests to me that characters like her are here to stay. Could we be facing a new archetype of influencer? I dare say: yes.

Designing Tomorrow's Digital Influencers
At this point, I suggest we examine her Instagram profile (@lilmiquela) for a moment using the ethnographic gaze –that is, analyzing and interpreting what we observe and hear in an attempt to understand others. In our case, that other is a virtual being, not a human in the strict sense of the word. What makes her a success? Are there any areas of improvement for her as an influencer?

The first thing that struck me about her Instagram profile is the audience she speaks to. Her nineteen years of age put her at the core of Generation Z, or those born between 1997 and 2012. Miquela truly embodies certain values that we can associate with young Americans of her age: she uses her iPhone to document her daily life and share anecdotes on social networks, her best friends have a special place in her life and she shows it, she cares about her looks and sense of fashion (for example, she never wears an outfit twice in her Instagram photos), and she also has a sense of social awareness and stands up for causes such as the BLM (Black Lives Matter) movement. Regarding her personality, Miquela is bold, outgoing, and outspoken.

Her looks are what I find most intriguing. She is almost perfect, like a model. She is slim, has thick and well-defined eyebrows, slightly slanted eyes, straight hazelnut hair, and a freckled face. Her teeth, although white and symmetrical, show a gap between the central incisors.

In my opinion, the freckles and the gap in her teeth are her most humanizing features, as they remind us of the unique and imperfect beauty of humankind. Perhaps her creators drew inspiration from supermodels such as Lauren Hutton or Gisele Bündchen when they thought of giving her these traits.

It is interesting to imagine a future where these characters dominate social networks. This idea is not at all far-fetched from the point of view of Danika Laszuk, an executive at Betaworks –the start-up that has given life to platforms such as Giphy and Bitly– who takes this idea one step further and considers that the future of influencers lies in digital beings generated with artificial intelligence (Alexander, 2019).

Could Miquela be some sort of time traveler announcing the future? Although she herself does not integrate artificial intelligence or robotics technologies, I think we can see her as the forerunner of virtual beings that could do so in the future –or a signal, as futurologists would say.

However, we must not forget that Miquela is handled by humans. Laszuk, from Betaworks, imagines that, soon, we will develop machine learning technologies that will allow digital influencers (or virtual creators, as she prefers to call them) to generate healthy and autonomous relationships with people on Instagram or Twitter, for example, without requiring programmers to intervene at all.

In this scenario, a deep understanding of what consumers want and need from
influencers would be a key element in designing these virtual beings. We could develop this knowledge by forging an alliance between the ethnographic perspective of anthropology –which focuses on the details, nuances, and particularities of social relationships– and artificial intelligence –which can capture and process massive amounts of data associated, for example, with interactions between users on social networks. Imagine, for instance, that we could teach Lil Miquela to constantly learn about her audience's value system and moral precepts so that, in turn, she produces content that is increasingly resonant with them, without generating interactions that could harm them (such as inciting hatred or violence).
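As a thought experiment, here is a minimal sketch of what such a value-aware screening step might look like. Everything in it is an assumption made for illustration: the audience value profile, the detected themes, and the thresholds are invented, and a real system would need far more careful classifiers and human oversight.

```python
# Hypothetical value profile, assumed to be learned from audience interactions.
audience_values = {"inclusion": 0.9, "sustainability": 0.7, "luxury": 0.2}
harmful_themes = {"hate", "violence", "self-harm"}

def screen_caption(themes, min_resonance=0.5):
    """Approve or reject a draft post. `themes` is the set of themes that a
    separate (assumed) classifier detected in the draft caption."""
    if themes & harmful_themes:
        return False, "contains a potentially harmful theme"
    resonance = sum(audience_values.get(t, 0.0) for t in themes)
    if resonance < min_resonance:
        return False, "unlikely to resonate with this audience"
    return True, "ok"

# A generator (human or machine) proposes drafts; only screened ones are posted.
drafts = [
    ("Thrifted this whole look with my besties", {"sustainability", "inclusion"}),
    ("Hot take that dunks on another creator", {"hate"}),
]
for caption, themes in drafts:
    approved, reason = screen_caption(themes)
    print(f"{approved!s:>5}  {reason:<40}  {caption}")
```

The point is not the code itself but the design choice it encodes: resonance and safety are evaluated before publication, not after the audience has already reacted.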
Considering the commercial potential that virtual creators may represent in the future, I think it is safe to assume that more and more brands will be looking to design and implement them in creative ways. The question that the anthropological approach can help us solve is: What do people need from the influencers of tomorrow, and what do we want our relationship with machines to look like in the future? ❢

REFERENCES
Alexander, J. (January 30, 2019). Virtual creators aren't AI — but AI is coming for them. The Verge: https://www.theverge.com/2019/1/30/18200509/ai-virtual-creators-lil-miquela-instagram-artificial-intelligence
Lalwani, M. (August 16, 2016). The next wave of AI is rooted in human culture and history. Engadget: https://www.engadget.com/2016-08-16-the-next-waveof-ai-is-rooted-in-human-culture-and-history.html
Photo by Wesley Carvalho from Pexels
TikTok: Making the World Dance
By Nelson Polanía and Natalia Usme
Every morning, we check all the notifications on our mobile phones. As we scroll through WhatsApp chats, emails, and Facebook tags, we come across the videos that our friends share on TikTok. Some are comic, others are dance challenges, and there are even tips for studying and optimizing time. Although these videos do not exceed a minute, they manage to entertain thousands or even millions of people every day.
Today, TikTok is one of the most popular social networks in the world. With more than 1,190 million active consumers (Santos, 2021) and a projected revenue of nearly $6 billion by the end of 2021 (Albornoz, 2020), it has become ByteDance's star product, surpassing the success of Facebook, Instagram, and Twitter.

Understanding TikTok

What lies behind this platform for recording, editing, and sharing videos? We hypothesize that, in its deepest layers, this social platform has changed the way we view and interact with the world.

Have you ever noticed how TikTok has revolutionized social media posts since its release? When Instagram made its debut, for instance, it was all about posting photos, filters, and hashtags. Nowadays, things are different: social media posts mainly consist of videos of people dancing or doing voiceovers around funny situations.

People went from posting "flat" photos and videos in the past to trying to attract attention with dance moves and other theatricalities in the present. To us, this suggests that TikTok acts as a tool that indoctrinates bodies. In other words, it trains people to use their bodies for consumption within digital environments.
People are not only asked to dance, complete challenges, or dramatize scenes on TikTok. Subtly, this platform also asks its users to be friendly. We will not find any written rules about this but, when we analyze it closely, we see that the funnier or cuter you act in a video, the more engagement you will achieve. This is where algorithmic power and human fragility intersect. Humans, in our striving for acceptance, enter a posting loop. Simultaneously, we become a version of what we think others want to see in us. The algorithm works as a stimulus to this behavior, rewarding the most popular posts with exposure to larger audiences.

In some ways, TikTok's business model relies on commodifying people's emotions and bodies to a degree that other social networks have failed to reach so far. What happens when we stop consuming products and services and begin consuming ourselves? For years, TikTok and other social networks have been sending us signals about the future of consumer practices: humans increasingly want to buy reflections of themselves. Considering this level of internalization and hyper-personalization of consumerism, how should companies react? Does the great organizational move consist of building upon connections between products, services, and the human sense of "being"? This question has been posed in the past, and the answer has always been, "Yes!" Companies must aim at the core of human identities. A detail to keep in mind is that people's sense of being is not entirely under the control of individuals; it also relies on what technology makes individuals believe about themselves.
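The "posting loop" described above can be read as a simple feedback system: engagement earns exposure, and exposure earns more engagement. The toy simulation below is only a sketch of that dynamic under invented numbers (the probabilities, boosts, and post counts are all assumptions), not a description of TikTok's real ranking system.

```python
import random

random.seed(7)

# Ten hypothetical posts, all starting with the same small audience.
posts = {f"post_{i}": {"exposure": 100, "likes": 0} for i in range(10)}

for day in range(20):
    for p in posts.values():
        # Engagement arrives roughly in proportion to current exposure.
        p["likes"] += sum(random.random() < 0.05 for _ in range(p["exposure"]))
    # The platform then hands the largest exposure boost to what already did best.
    ranked = sorted(posts.values(), key=lambda p: p["likes"], reverse=True)
    for rank, p in enumerate(ranked):
        p["exposure"] += max(0, 500 - 150 * rank)

# Early, largely random differences compound into a steep popularity curve.
print(sorted((v["likes"] for v in posts.values()), reverse=True))
```

Run it a few times with different seeds and the same structural result appears: whichever posts get lucky early end up dominating reach, which is the loop the text describes.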
Photo by Cottonbro from Pexels
Photo by Ivan Samkov from Pexels
Taking this point to an extreme, some would say that people's identities now depend on what the algorithm thinks of them. If the algorithm defines individuals, and therefore their consumption practices, what should companies focus on? Our answer is split in two. First, they must pay attention to human behavior on social networks: what is the technological layer that designs or modifies these behaviors? Second, they must consider that algorithms have personalities –yes, we said it: personalities. When engineers program algorithms, they train them to process data according to certain parameters. Algorithms depend on programming baselines, or starting points, that are largely based on a human perspective. These baselines create very defined "characters" or "personality traits" that determine how an algorithm acts, thinks, and even feels. Companies that are currently developing algorithms should analyze the digital entities they are creating and their effects on the market and on human behavior.
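A small sketch can make that "personality" point tangible. Below, the same ranking rule is given two different sets of baseline weights; the weights, posts, and features are all invented for illustration, yet the two configurations already order identical content in opposite ways, which is all we mean by an algorithm having a "character".

```python
# Two assumed baseline weight configurations for the same ranking rule.
PLAYFUL = {"humor": 0.8, "dance": 0.6, "news": 0.1}
EARNEST = {"humor": 0.1, "dance": 0.2, "news": 0.9}

# Hypothetical posts described by the same invented features.
posts = [
    {"id": "a", "humor": 0.9, "dance": 0.7, "news": 0.0},
    {"id": "b", "humor": 0.1, "dance": 0.0, "news": 0.8},
    {"id": "c", "humor": 0.5, "dance": 0.9, "news": 0.2},
]

def rank(posts, weights):
    """Order posts by a weighted sum of their features."""
    score = lambda p: sum(w * p[k] for k, w in weights.items())
    return [p["id"] for p in sorted(posts, key=score, reverse=True)]

print("playful baseline:", rank(posts, PLAYFUL))  # ['a', 'c', 'b']
print("earnest baseline:", rank(posts, EARNEST))  # ['b', 'c', 'a']
```

The baseline chosen by the engineers, not the content itself, decides which version of the world gets amplified.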
TikTok is based on commodifying people's emotions and bodies to a degree that other social networks have failed to reach so far. What happens when what we consume is not a product or a service, but ourselves?
Possibly, there will come a moment of collective consciousness in which humans realize that we are not just consuming external products or services; we are also consuming a part of our essence and our own selves. This small (although huge) revelation could provoke new consumption practices, away from algorithms or towards mindful algorithms, so to speak. The more conscious and ethically designed algorithms will "win" the market battle. Who will take the first step in this dance? That is the question we hope some company will soon answer. ❢

REFERENCES
Albornoz, M. (August 20, 2020). El dueño de TikTok, el enemigo público de Trump. Ámbito: https://www.ambito.com/informacion-general/tiktok/el-dueno-el-enemigo-publico-trump-n5126348
Santos, D. (February 22, 2021). Qué es TikTok, cómo usarlo y por qué unirte en 2021. HubSpot | Marketing: https://blog.hubspot.es/marketing/tiktok
In the magnifying glass ///
By Jesús Contreras

Her (2013)
Candid, melancholic, and poignant: that is Her, an unusual love story between a human being and an artificial intelligence operating system. It takes place in a futuristic world where people lack the ability to express their feelings, so they use technology to alleviate their loneliness. My hunch tells me that you will enjoy this movie because you will be able to analyze the virtual phenomenon from a social anthropology perspective. Once you watch it, try to answer: What practices and social discourses do we create around technology? What relationships do the new digital humans shape? And in what type of society do we live? This love story presents ways of expressing feelings and emotions that go beyond interfaces.
Rating

Ex Machina (2015)
This sci-fi thriller tackles issues such as sexuality, humanity, and morality in a haunting neo-futuristic context. Is it possible for a machine to develop consciousness? This question marks the beginning of the journey of Caleb, a young programmer who wins a contest at his company and participates in the experimentation process with Ava, a female robot that leads us to reflect on the limits of the body and human consciousness. The plot also delves into the power and domination that our society exercises over women and criticizes the male gaze over the future. You can enjoy this film from a gender studies perspective; by doing so you will be able to identify learned patterns on topics such as the feminine and its place in technological development, and the objectification of the female body in society.
Rating

Minority Report (2002)
Based on a short story by the American writer Philip K. Dick, this film introduces us to a crime-free society. Artificial intelligence, combined with facial recognition systems and big data analysis, is used by law enforcement agencies to prevent crimes before they occur. Does it sound familiar? What once seemed a bit unrealistic is now a reality: many countries currently use this type of technology to detect behavioral patterns and assess whether someone could become a criminal. The idea of crossing social anthropology and criminology could go a long way towards improving existing biases in facial recognition algorithms.
Rating

What did you think of the recommendations? Do you have a suggestion? Let's talk at contacto@flipaconsultora.com