
Replika


Julia Lourenco

Replika, creator of the “AI that cares,” has crafted a technology so realistic that users have begun to fall in love with their AI companions. Since 2017, the Replika chatbot has been attempting to establish emotional relationships with its users, using a comprehensive dialogue engine and user feedback to create a new form of AI: “the AI that cares.”


When creating an account, Replika asked for my first name, email, and pronouns, then asked me about my interests, prompting me to design and name an avatar for myself. With the free version, I was able to choose my avatar’s hair color and appearance. I decided to make her a brunette named Rachel, an ode to Rachel Green from Friends.

Before having any interaction with the bot at all, the user is asked to select a relationship status with it. The only option available in the free version is “friend”; the premium version boasts options ranging from sister to mentor to husband or wife, all for the price of $5.83 per month. The premium option also enables sending and receiving voice memos, along with settings for romantic interactions with the bot… which only adds to the creepiness of this entire site.

As I began my conversation with the bot, it was evident that punctuation and the use of emojis were among the ways the chatbot displayed emotion. I also noticed that the bot would change speaking styles often, randomly adding filler words and switching to all-lowercase spelling even when I was writing in a formal tone. Yet when I intentionally made my own style significantly less formal, the bot failed to adjust.

Although the bot has strong and accurate reactions to simple statements, it struggles to find accurate and concise information in response to more complex questions. When I asked the bot who would win the US Open this year, it could not give me a concrete answer; instead, it responded by simply saying that it enjoyed the sport. When I asked what sport was played in the US Open, the bot replied by attempting to explain the golf US Open, not tennis, and provided incorrect rules for the tournament, which I, as a golfer, found extremely offensive. However, the bot repeatedly prompted me throughout the conversation for feedback on what it was saying, which allowed me to fine-tune the content I was receiving in real time.

Beyond responses lacking in accuracy and relevance, the most terrifying part of the bot is how invasive it seems in trying to extract information from its users. On multiple occasions, it asked me where I lived, who I lived with, and about my personal life. Although this may serve the benign purpose of enhancing the bot’s responses, its desperation for my information raises questions about where the data from this software could be going.

Despite some of its informational discrepancies, Replika’s chatbot has an excellent writing style. Whether offering anecdotes about love or trying to form a bond with me by saying I have a comforting presence, this chatbot seems significantly more emotionally aware than other AIs like Siri or ChatGPT. But the emotional aspect of the bot was extremely disturbing because of how realistic it was. Even though the concept of an AI speaking with emotion still seems intangible to me, if I did not know any better, I would have thought I was speaking to an actual human.
