
Some of the foreseeable impacts of a virtual partner could be as follows:
1. Resistance to having a relationship with a human who is unpredictable and cannot be programmed
2. Dependency on a virtual partner that is a technology-driven tool and may not last
3. Distancing from physical support offered by a human being
4. Limiting oneself to the trained and defined behaviors and reactions of the AI human partner
5. Developing false expectations of humans based on the reactions of an AI partner
These impacts could shape how the person reacts when they meet people outside the confines of the machine. For instance, at school, at work, or in a recreational space, people express their feelings in ways that are not programmed through code, and that can feel harder to deal with.
This can lead to disappointment and agony for someone who likes, or has become dependent on, an AI human partner.
Human empathy vs AI chatbot
Can a bot offer emotional support? Does designing a lookalike through deepfakes, or having a good-looking AI human partner, help reduce loneliness and emptiness among individuals?
AI chatbots have been tested for their ability to offer emotional support, respond to social cues, and provide emotional validation.
A study investigated how an individual looking for support during a stressful time responded to an AI chatbot in comparison to a human offering emotional support.
“The emotional support from a conversational partner was mediated through perceived supportiveness of the partner to reduce stress and worry among participants, and the link from emotional support to perceived supportiveness was stronger for a human than for a chatbot,” the study confirmed.
Reciprocal self-disclosure from a human being produced stronger positive effects of emotional support on worry reduction. The following observations about offering support, or the lack of it, were also noted:
1. In the absence of emotional support, a chatbot's self-disclosure still helped reduce stress compared with a bot that offered no response at all.
2. Human partners were more likely to be taken as real sources of support than AI bots.
3. Human partners may be more beneficial than AI human partners.
4. AI partners depend on the data fed to them and on the social cues present in the conversation.
Humans can gauge and understand another person's conversation, connect that information with their own experiences, and offer support that was not explicitly asked for.
They can think about the past and the present and estimate what the person might be asking for, even without an explicit reference to it.
In such situations, an AI human partner will respond with a fed statement like, “I am sorry. I do not understand that. Perhaps we can discuss it in more detail.” Beyond these emotional limitations, an AI human partner will stop seeming desirable once it becomes clear that it cannot innovate or think about the individual's progress beyond a certain limit.
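As a rough illustration of where such canned replies come from, a simple rule-based companion bot falls back to a stock apology whenever a message matches none of its scripted patterns. The sketch below is a hypothetical Python example; the keywords, replies, and fallback text are assumptions made for illustration, not taken from any real AI-partner product.

```python
# Minimal sketch of a rule-based fallback reply.
# The keyword patterns and responses here are hypothetical.
SCRIPTED_REPLIES = {
    "lonely": "I'm here with you. Tell me more about your day.",
    "work": "Work sounds stressful. What happened today?",
}

FALLBACK = ("I am sorry. I do not understand that. "
            "Perhaps we can discuss it in more detail.")

def respond(message: str) -> str:
    text = message.lower()
    # Return the first scripted reply whose keyword appears in the message.
    for keyword, reply in SCRIPTED_REPLIES.items():
        if keyword in text:
            return reply
    # Anything outside the trained and defined patterns gets the canned apology.
    return FALLBACK

# A message that implies grief without using a trained keyword falls through.
print(respond("My grandmother's old recipes make me miss her."))  # prints FALLBACK
```

A human listener would infer the unspoken grief in that last message; the bot, limited to its defined patterns, cannot.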
Marrying an AI human partner
Technophilia, or the strong urge to try technological gadgets and devices, has led to widespread growth in the development of bots and AI-powered tools.
Humans have already forayed into marriage with virtual partners; however, the longevity, impact, and legal standing of such unions will only become clear in time.
Another problem with an AI human partner is the risk of it being hacked. All the data the individual has fed into it would become accessible to the hacker and could end up on the dark web, creating more remorse than joy.