

Temporal Dynamics of Emotion and Action Perception in Anthropomorphy
Nimet Alara Nur Çalışkan, July 13, 2021


Abstract

Emotion perception based on cues is considered crucial for survival over the course of evolutionary development, because it triggers the action tendencies needed to respond to the variety of stimuli obtained from the environment. In this sense, perceiving the actions and distinguishing the motions of other organisms and objects facilitates human life, not only at the survival level of evolution but also in the second wave of the technological revolution. As time passes and new technologies are developed to ease humans' daily lives, our perception of machines and the human-like features we observe in them is yet to be clarified.

1 Introduction

1.1 Perception of Emotions: An Evolutionary Perspective and the Attribution of Human Characteristics

Humans, and the species Homo sapiens in particular, are well known for their high ability to perform group actions, their heightened sensitivity to emotional responses, and the powerful way these two abilities combine. For humans, recognizing and responding adequately to emotional cues is a matter of survival, as both evolutionary accounts and prior research suggest, with humans showing an attentional bias toward emotions (Hortensius et al., 2018; Kret et al., 2018). In addition, emotions are perceived with continuity across cultures: despite cultural differences, it is not possible to cry differently in another language or to label negative emotions as positive ones. From that perspective, universal emotions do exist, as Darwin (1872) suggested, in the service of survival, although the rate at which they are recognized can be influenced by cultural differences (Matsumoto, 1992). Overall, emotions and cue-based emotion perception are regarded as supporters of action from the evolutionary point of view, on the assumption that emotions evolved to benefit the organism by enabling a more effective response to threats through the action tendencies triggered by facial expressions of emotion (Kret et al., 2018).

During our daily lives we navigate complex social environments such as school, work, and many other settings and events that require social interaction. In short, our social interactions depend on other people, and emotions are the cues that allow us to carry out those interactions successfully (Hortensius et al., 2018). In this pattern, the importance of recognizing emotions and perceiving the facial expressions that reflect them must be underlined for establishing good interaction within society and adapting to changing social situations. We can understand other people's thoughts, emotions, and intentions through their facial expressions; in that respect, recognizing facial expressions is the most important cue in emotion perception for regulating our behavior.


However, emotion perception is not limited to Homo sapiens alone; it is widely known that mammals, other vertebrates, and even some invertebrate species have emotional processes very similar to those of humans, which facilitate survival by allowing the organism to adapt quickly to various situations. Notably, as one of the hot topics in behavioral biology, domestic dogs are extremely talented at understanding and discriminating human behavioral cues to perceive emotions (Müller et al., 2015). While interacting with domestic animals in daily life, humans tend to interpret the animals' facial muscle movements as the facial expressions we know well, labeling them as smiling, crying, or even depressed. This proneness to attribute human characteristics to other organisms, and even to objects, on the basis of their physical resemblance to humans is known as anthropomorphism. We do not only perceive other living organisms as showing emotions; with the development of technology, humans also tend to build robots with human-like features and qualifications, intending them to serve primarily in education and healthcare (Wang et al., 2020). That tendency can be observed in many fictional characters appearing in various films and animations, such as C-3PO in Star Wars, WALL-E in the animation of the same name, and Baymax in Big Hero 6 (Mara & Appel, 2015).

1.2 Creating Human-Like Machines & Uncanny Valley Theory

With the development of new technologies, as Microsoft founder Bill Gates has stated, robotics and related technologies are the next major field after the PC revolution. Beyond the professional and industrial fields, ageing populations have driven robotics past industry-centered applications toward more efficient and integrated systems that improve quality of life as well, so that affordable and effective social and service robots are now a growing field all around the world. At the beginning, with regard to their visual features and overall appearance, many contemporary robot designs were introduced as personal assistants, autonomous housekeepers, and care bots designed to support hospitalized patients. As time passed and the industry evolved, robots began to be designed with more human-like features, mimicking human beings. According to data obtained from the Institute of Electrical and Electronics Engineers (IEEE), nearly 70 of the 158 robots shown so far are constructed to look and/or behave like humans. Depending on how easily they can be distinguished from real individuals, human-like robots are generally referred to as humanoids or androids: humanoids have a more mechanical look while still having extremities such as arms and legs, whereas androids are designed to mimic human behavior as closely as possible by dressing their mechanical components with silicone skin (Hortensius et al., 2018).


The reason for creating more human-like robots is to make it possible to support their functioning in an environment that is built for human beings. Since the overall appearance of a robot largely determines the responses of the users it is built to serve, human likeness and an anthropomorphic look go beyond merely looking like a human to also acting like one, in collaboration with artificial intelligence. Yet when it comes to user perceptions of humanoids and androids, things have turned out differently than expected. To express the relationship between robots and humans, the phenomenon called the uncanny valley was brought to the table: it states that if a robot displays high similarity to humans in physical appearance but acts in an inhuman way, this is associated with uncanniness, the negative feeling that human users assign to the robot. According to this theory, as a robot is made more human-like, the response to it becomes more empathetic and positive until an upper point is reached; after that point, the agent becomes strikingly repulsive (Urgen et al., 2012). The reason users feel odd about such agents is assumed to arise from the uncertainty of interacting with a blend of human-like and machine-like features (Hortensius et al., 2018). This uncertainty, or the difference between an agent's physical qualities and its motion, generally referred to as a mismatch, gives the opportunity to study the distinct neural responses given to an agent's form and motion. In support of simulation theory, which suggests that others' actions are understood through an internal sensorimotor simulation of the seen action in the observer's own body representations, neural activity during action perception is modulated by the degree of similarity between the observed actor's body and the observer's body. From that perspective, developing more human-like humanoids or androids should engage these simulation mechanisms more efficiently.
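As a purely illustrative aside, the short Python sketch below plots a hypothetical affinity curve of the kind the uncanny valley hypothesis describes: affinity rises with human-likeness, drops sharply near (but not at) full human-likeness, and then recovers. The functional form and its parameters are invented for illustration and are not drawn from any of the studies cited here.

```python
import numpy as np
import matplotlib.pyplot as plt

# Hypothetical illustration of the uncanny valley: affinity grows with
# human-likeness, dips sharply near full likeness, then recovers.
# The curve below is an arbitrary illustrative function, not empirical data.
likeness = np.linspace(0.0, 1.0, 500)
affinity = likeness - 1.6 * np.exp(-((likeness - 0.85) ** 2) / 0.004)

plt.plot(likeness, affinity)
plt.axvline(0.85, color="gray", linestyle="--", label='hypothetical "valley"')
plt.xlabel("Human-likeness of the agent")
plt.ylabel("Affinity (arbitrary units)")
plt.title("Sketch of the uncanny valley relationship")
plt.legend()
plt.show()
```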

In a study by Urgen et al. (2012), which aimed to examine the temporal dynamics of action perception and their modulation by the form and motion of the observed agent, 12 right-handed adults with no history of neurological disorders and with normal or corrected-to-normal vision participated, and neurophysiological processes were observed with electroencephalography (EEG) while the form and motion of the agent were manipulated. The stimuli shown to participants were videos of three agents: a human, an android, and a humanoid (referred to as "robot" in that study). Whereas the human had biological appearance and biological motion, the android had biological appearance and machine-like motion, and the humanoid had neither biological appearance nor biological motion. Notably, the android and the humanoid were identical to one another with one difference: the android was covered with silicone skin over the machinery look of the humanoid, so the kinematics and motions of these two agents were the same throughout the experiment.
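The logic of this design can be summarized as a crossing of two factors, form and motion, with the robot and android sharing identical motion and differing only in form. The minimal Python sketch below encodes that mapping; the agent labels follow the study, while the attribute names are merely illustrative.

```python
# Minimal sketch of the form x motion structure behind the three agents
# (Urgen et al., 2012): human = biological form and motion, android =
# biological form with mechanical motion, robot = mechanical form and motion.
agents = {
    "human":   {"form": "biological", "motion": "biological"},
    "android": {"form": "biological", "motion": "mechanical"},
    "robot":   {"form": "mechanical", "motion": "mechanical"},
}

for name, attrs in agents.items():
    # The form-motion mismatch singles out the android condition.
    mismatch = attrs["form"] != attrs["motion"]
    print(f"{name:8s} form={attrs['form']:10s} "
          f"motion={attrs['motion']:10s} mismatch={mismatch}")
```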



Figure 1: The anthropomorphy of the agents used in the study of the temporal dynamics of biological appearance and motion (Urgen et al., 2012)

The three agents described above performed five different upper-body actions: drinking, picking up an object, hand waving, talking, and nudging. The video stimuli were displayed on a 22-inch monitor at 60 Hz, and gray screens were shown between consecutive clips so that enhanced visual evoked responses at video onset would not mask the subtle effects between conditions. To minimize eye-movement artifacts, participants were asked to fixate on a point at the center of the screen, and their attention was monitored during the experiment with randomly displayed comprehension questions about the clips that had been shown. The reported results showed that the EEG scalp topographies for the three agent conditions differed both temporally and spatially: the humanoid (robot) was distinguished by increased activity over occipital regions and strong negativity over central, frontal, and centro-parietal areas, whereas the human was distinguished by stronger positivity (increased activity) over frontal regions.
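As a rough illustration of how such condition-wise EEG comparisons are typically computed, the sketch below uses the MNE-Python library to epoch continuous EEG around video onsets, average within each agent condition, and plot scalp topographies. The file name, trigger channel, event codes, epoch window, and rejection threshold are all assumptions made for the example; they are not the parameters used by Urgen et al. (2012).

```python
import mne

# Assumed file name and trigger scheme -- placeholders, not the original dataset.
raw = mne.io.read_raw_fif("agent_perception_raw.fif", preload=True)
raw.filter(0.1, 40.0)  # typical ERP band-pass filter

events = mne.find_events(raw, stim_channel="STI 014")
event_id = {"human": 1, "android": 2, "robot": 3}  # assumed trigger codes

# Epoch around video onset with a pre-stimulus baseline and simple artifact rejection
epochs = mne.Epochs(raw, events, event_id=event_id,
                    tmin=-0.2, tmax=0.8, baseline=(None, 0),
                    reject=dict(eeg=100e-6), preload=True)

# Average within each agent condition and inspect scalp topographies over time
for cond in event_id:
    evoked = epochs[cond].average()
    evoked.plot_topomap(times=[0.10, 0.17, 0.30], show=False)
```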




Figure 2: Spatial and temporal activity obtained from the EEG measurements of the responses given to the agents

As an overall evaluation, the researchers stated that neural activity during action perception was modulated differentially by the appearance and motion of the observed agent, and that the activated neural patterns and brain regions differed across agents. During early visual processing, which is known for its sensitivity to the physical properties of visual stimuli, the humanoid (robot) was distinguished from the android and the human; for the N170, a right centro-parietal response associated with face and body perception, anthropomorphism became determinative (Urgen et al., 2012). According to the results, the face-sensitive N170 was modulated by the face (appearance) but also showed context-dependent activity driven by the match between appearance and motion. It is assumed that revealing the neural activity patterns related to both simulation theory and uncanny valley theory would facilitate and enhance the design of androids and more human-like social robots, serve the understanding of the neural basis of action perception, and show how mimicking human emotional responses can illuminate the neural dynamics of those processes.
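For readers who want to see how such an N170 comparison might be quantified, the sketch below computes the mean amplitude in a 150-200 ms window at a few right posterior electrodes for each agent condition. The epochs file, channel names, and time window are illustrative assumptions rather than details taken from the original study.

```python
import mne

# Assumed epochs file and right posterior channels -- placeholders only.
epochs = mne.read_epochs("agent_perception-epo.fif")
picks = ["P8", "PO8", "P6"]  # assumed right occipito-temporal electrodes

for cond in ("human", "android", "robot"):
    evoked = epochs[cond].average()
    # Restrict to the assumed N170 window and channels, then average
    windowed = evoked.copy().pick(picks).crop(tmin=0.15, tmax=0.20)
    mean_amp_uv = windowed.data.mean() * 1e6  # convert volts to microvolts
    print(f"{cond}: mean amplitude {mean_amp_uv:.2f} µV (150-200 ms)")
```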

2 References

Darwin, C. (1872). The expression of the emotions in man and animals. John Murray. https://doi.org/10.1037/10001-000



Hortensius, R., Hekele, F., & Cross, E. S. (2018). The perception of emotion in artificial agents. IEEE Transactions on Cognitive and Developmental Systems, 10(4), 852–864. https://doi.org/10.1109/TCDS.2018.2826921

Kret, M. E., Muramatsu, A., & Matsuzawa, T. (2018). Emotion processing across and within species: A comparison between humans (Homo sapiens) and chimpanzees (Pan troglodytes). Journal of Comparative Psychology, 132(4), 395–409. https://doi.org/10.1037/com0000108

Mara, M., & Appel, M. (2015). Effects of lateral head tilt on user perceptions of humanoid and android robots. Computers in Human Behavior, 44, 326–334. https://doi.org/10.1016/j.chb.2014.09.025

Matsumoto, D. (1992). American-Japanese cultural differences in the recognition of universal facial expressions. Journal of Cross-Cultural Psychology, 23(1), 72–84. https://doi.org/10.1177/0022022192231005

Müller, C. A., Schmitt, K., Barber, A. L. A., & Huber, L. (2015). Dogs can discriminate emotional expressions of human faces. Current Biology, 25(5), 601–605. https://doi.org/10.1016/j.cub.2014.12.055

Urgen, B. A., Plank, M., Ishiguro, H., Poizner, H., & Saygin, A. P. (2012). Temporal dynamics of action perception: The role of biological appearance and motion kinematics. Proceedings of the Annual Meeting of the Cognitive Science Society, 34. https://escholarship.org/uc/item/2bm7s8tv

Wang, S., Cheong, Y. F., Dilks, D. D., & Rochat, P. (2020). The uncanny valley phenomenon and the temporal dynamics of face animacy perception. Perception, 49(10), 1069–1089. https://doi.org/10.1177/0301006620952611


