

Artificial Empathy: The last step of humanizing machines

By Angela Stojkovic


The growth of artificial intelligence has been in the technological spotlight throughout the past decade, and emotional AI may be the pinnacle of that effort: humanity's aspiration to understand and reconstruct nature and, finally, itself. This inclination is most visible in the posthuman rise of humanoid robots. The enthusiasm invested in the study and development of AI has produced rapid progress that can be felt in everyday life worldwide, from smartphones and navigation to online banking. The technology in our hands changes and evolves every year. Still, even though artificial intelligence keeps growing and improving (mainly through machine learning), one gem remains beyond its reach: emotions and emotional intelligence.

Can Machines Develop Emotions?

Many argue that machines can't develop humanlike emotions, even though there are activities in which a human can no longer beat even a basic computer, such as calculation.

The fact is that machines are master calculators because calculation follows a precise system. Emotions, on the other hand, can't be translated into or fitted to any such system. Even though emotional expressions and gestures can be observed and classified, these scientific methods can't capture something as spiritual as emotions.

Another important factor in the odds of AI evolving into real "EI" is that grasping emotions and how they work is so complex that it remains a challenge even for our own language.

At the end of the day, robots are still complex machines that ultimately operate through programming languages built on binary code.

Ethical Problems

AI is a controversial subject on its own, but if it becomes empowered with emotional intelligence, the problem deepens: the technology becomes more powerful and possibly harder to control.

Some of the potential concerns include privacy issues, job losses in certain areas, even more widespread surveillance, security issues, an "AI awakening" (the intelligence becoming so autonomous that we can no longer control it), and so on. Many studies address possible troubles with virtual personal assistants (VPAs) such as Alexa and Google Assistant, especially regarding privacy.

However, the emotional development of AI could also bring benefits, for example in medicine, education, and mental health. For instance, emotional AI referred to as "empathic AI mirrors", in the form of robotic playmates, has shown positive results with children with autism.

Uses of Emotional AI

If empathy were to flourish in AI, many industries and areas of life would reap the benefits, from gaming and the car industry to healthcare and retail; it would even improve cooking. Forecasts of advanced emotional AI even promise empathic fridges able to recognize your mood and suggest appropriate food.

One of the major benefits it would bring to healthcare is monitoring patients at all times and thereby avoiding oversights. Games could detect a player's reactions and adjust the mode and difficulty accordingly. Facial recognition and emotion detection are already being applied by apps such as Instagram.
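
To make the game idea concrete, here is a minimal, purely hypothetical Python sketch of emotion-adaptive difficulty; the detect_frustration function stands in for a real emotion-detection model and is not an actual API from any game or library.

```python
# Toy illustration of emotion-adaptive difficulty, purely hypothetical.
# detect_frustration() stands in for any real emotion-detection model;
# it is not an actual API from any game engine or library.
import random

def detect_frustration() -> float:
    """Placeholder for a real emotion detector; returns a score in [0, 1]."""
    return random.random()

def adjust_difficulty(current: float, frustration: float) -> float:
    """Ease off when the player seems frustrated, ramp up when they seem bored."""
    if frustration > 0.7:
        current -= 0.1   # player is struggling: make it easier
    elif frustration < 0.3:
        current += 0.1   # player is coasting: make it harder
    return min(1.0, max(0.0, current))

difficulty = 0.5
for _ in range(5):  # a few frames of a simplified game loop
    difficulty = adjust_difficulty(difficulty, detect_frustration())
print(f"difficulty after adaptation: {difficulty:.2f}")
```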

Pro et contra

Machines learn from us, but this time maybe we can learn something from machines. If experiencing love, anger or jealousy is the biggest challenge for them, could we presume that those are the very things that significantly define us as human beings? If the answer is yes, maybe we should take our emotional lives far more seriously and nurture them more, instead of attempting to create artificially emotional beings when we don’t understand our own emotionality.

On the other hand, striving to create artificial emotions may teach us something new about the nature of our own. Opinions on AI are divided, not only among laypeople but within scientific circles as well, and the discussion intensifies further when emotional intelligence enters the picture, especially because of the ethical problems.

Lovotics

A CBC article titled "Can a robot love you back" introduces a robot called Lovotics (a coinage of "love" and "robotics"), a robot whose aim is not to complete trivial assignments but to practice the very crown of emotions: love.

Dr. Samani, the director of AI technology in Taiwan, even studies reciprocal love between humans and robots. These love-oriented robots look more like toys than advanced machines, but in love, what's on the inside always matters more. In this case, it is a sophisticated artificial endocrine system that regulates digital hormones based on interactions with humans.
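
The article does not describe how those digital hormones actually work inside Lovotics, but purely as an illustration, a toy digital "endocrine" loop might look something like the following Python sketch; the hormone names, decay rule, and stimulus effects are hypothetical and not taken from Samani's system.

```python
# Toy sketch of a digital "endocrine" loop, purely illustrative.
# Hormone names, decay rates, and stimulus effects are hypothetical;
# they are not taken from the actual Lovotics implementation.
from dataclasses import dataclass, field

@dataclass
class DigitalEndocrineSystem:
    # Hormone levels kept in the range [0, 1]
    levels: dict = field(default_factory=lambda: {"oxytocin": 0.5, "dopamine": 0.5, "cortisol": 0.2})
    decay: float = 0.05  # how quickly levels drift back toward a neutral baseline

    def stimulate(self, interaction: str) -> None:
        """Adjust hormone levels in response to a human interaction."""
        effects = {
            "hug":    {"oxytocin": +0.2, "cortisol": -0.1},
            "praise": {"dopamine": +0.15},
            "ignore": {"cortisol": +0.1, "dopamine": -0.1},
        }
        for hormone, delta in effects.get(interaction, {}).items():
            self.levels[hormone] = min(1.0, max(0.0, self.levels[hormone] + delta))

    def tick(self) -> None:
        """Let every hormone decay toward its neutral baseline of 0.5."""
        for hormone, value in self.levels.items():
            self.levels[hormone] = value + self.decay * (0.5 - value)

    def mood(self) -> str:
        """Map hormone levels to a coarse 'affective' state."""
        if self.levels["cortisol"] > 0.6:
            return "stressed"
        if self.levels["oxytocin"] > 0.7:
            return "affectionate"
        return "neutral"

robot = DigitalEndocrineSystem()
robot.stimulate("hug")
robot.tick()
print(robot.levels, robot.mood())
```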

Samani himself says that "a robot is a piece of machine" and that it is not actually capable of "experiencing" love; what it does is more of a simulation of love or affection. The robot therefore remains within the limits of performing a function. It is more than questionable whether we can use the word love for a machine programmed to give off, at best, an illusion of affection.

Conclusion

It seems that we have eagerly embraced technologies that not long ago looked like the future as they entered our everyday lives in the form of phones, needlessly diverse home devices, and so on. But the truth is that the world may have slipped far too easily into creating new forms of consciousness and interacting with them. Bear in mind, too, that all those mass-produced trivial devices are made of plastic, one of the most urgent environmental threats today.

As we just might be witnessing the very pinnacle of this technological evolution, now may be the perfect time to consider all the pros and cons, as well as the potential repercussions of this groundbreaking but possibly vain and dangerous invention, regardless of whether robots are our friends or our enemies.

Erich Fromm, an expert on human nature and the nature of emotions, said: "The danger of the past was that men became slaves. The danger of the future is that men may become robots. True enough, robots do not rebel."
