
International Research Journal of Engineering and Technology (IRJET) e-ISSN:2395-0056
Volume: 11 Issue: 09 | Sep 2024 www.irjet.net p-ISSN:2395-0072
A SYSTEMATIC REVIEW OF THE DESIGN AND IMPLEMENTATION OF EMOTIONALLY INTELLIGENT COMPANION ROBOTS
Sushanth Sujeer Kumar1, Mohammed Apsal1, Abdul Ashad Raishan1, Rayhan Mohammed Jessy1 & Vuppu Sai Krishna Prasad2
1 B.Tech Student, Department of Mechanical Engineering, Institute of Engineering & Technology, Srinivas University, Mangalore, India - 575001
2 Assistant Professor, Department of Mechanical Engineering, Institute of Engineering & Technology, Srinivas University, Mangalore, India – 575001
ABSTRACT
Purpose: This paper explores the design and implementation of emotionally intelligent companion robots, aiming to enhance human-robot interaction (HRI) by integrating advanced emotion recognition and adaptive response systems.
Design/Methodology/Approach: The research employs a multidisciplinary approach, combining machine learning, affective computing, and human-computer interaction principles. A robust emotion detection system is developed using multimodal data, including facial expressions, voice intonation, and physiological signals. Adaptive algorithms are then designed to generate contextually appropriate responses based on the detected emotions. The implementation is tested in various real-world scenarios to assess the robot's performance in terms of accuracy, user satisfaction, and interaction quality.
Findings: The results demonstrate that the proposed emotionally intelligent companion robot significantly improves the quality of human-robot interactions. The emotion detection system shows high accuracy across diverse emotional states, and the adaptive response mechanism enhances user engagement and satisfaction. The study also identifies key challenges and opportunities in the practical deployment of such robots, providing insights for future developments.
Originality/Value: This research contributes to the evolving field of emotionally intelligent robotics by presenting a comprehensive framework for the design and implementation of companion robots capable of understanding and responding to human emotions. The study offers novel insights into the technical and ethical considerations of creating empathetic robots, paving the way for their broader application in various industries.
Paper Type: Literature review
Keywords:
Emotionally Intelligent Robots, Human-Robot Interaction, Affective Computing, Machine Learning, Adaptive Response, Emotion Detection, Companion Robots, Robotics Design, Natural Language Processing (NLP), Behavioral Modeling.
INTRODUCTION
The integration of emotionally intelligent companion robots into various aspects of human life marks a transformative step in the evolution of robotics and artificial intelligence. As these technologies become increasingly pervasive, the ability of robots to engage in empathetic, emotionally resonant interactions with humans is no longer a luxury but a necessity. This research paper explores the intricate process of designing and implementing companion robots endowed with emotional intelligence, capable of not just performing tasks but also establishing meaningful connections with their human counterparts.
Emotionally intelligent companion robots represent a sophisticated blend of machine learning, affective computing, and advanced sensor technologies. These robots are designed to perceive, interpret, and respond to human emotions by analyzing verbal and non-verbal cues, such as facial expressions, tone of voice, and body language. This real-time emotional feedback loop enables the robot to adapt its behavior, providing personalized and contextually appropriate interactions that enhance the user experience.

The development of such robots poses significant technical and ethical challenges. This research delves into the theoretical underpinnings of emotional intelligence in robotics, exploring how algorithms can be designed to mimic human emotional understanding and response. It also addresses the practical aspects of implementation, from the integration of sensors and actuators to the creation of adaptive learning systems that allow robots to refine their interactions over time. Moreover, the paper examines the ethical considerations of deploying emotionally intelligent robots, particularly in sensitive environments such as healthcare, eldercare, and personal assistance.
The broader implications of this research are profound. Emotionally intelligent companion robots have the potential to revolutionize industries by improving the quality of life for individuals, particularly those who require constant care or companionship. By fostering a deeper connection between humans and machines, these robots can build trust and empathy, making them invaluable partners in daily life. This paper contributes to the ongoing discourse in human-robot interaction, offering novel insights and practical guidelines for the future development of emotionally intelligent robotic systems.
REVIEW OF LITERATURE
1. The Emotionally Intelligent Robot: Improving Social Navigation in Crowded Environments
Contribution: This research developed a real-time algorithm for robots that navigate around people while remaining aware of their emotions. Using Bayesian inference, CNN-based learning, and the PAD (Pleasure-Arousal-Dominance) model, the robots estimate and predict how pedestrians feel, sorting these feelings into four groups: happy, sad, angry, and neutral, with an accuracy of 85.33%. By adding emotion-based proxemic rules, the authors make robot navigation more socially aware. They tested their method in simulations and in real life with the Pepper robot, demonstrating how well such robots can move through crowded pedestrian environments.
Reference: Aniket Bera, Tanmay Randhavane, Rohan Prinja, Kyra Kapsaskis, Austin Wang, Kurt Gray, Dinesh Manocha (2019).

2. Companion Robots Be Useful in Rural Schools
Contribution: The study evaluates how companion robots, Paro and iRobiQ, could benefit rural schools. It reveals positive student and teacher responses, highlighting robots' potential roles in comforting students, assisting those with autism, and repeating exercises. It also identifies gender and age differences in engagement and feedback, with girls and younger students responding more favorably. The findings suggest that companion robots can provide both educational and emotional support, guiding developers on features to enhance and indicating the robots' potential in addressing specific needs in rural educational settings.
Reference: Danielle Alexis Feerst, Seung Ho Lee, Hayley Robinson, Jordi Albo-Canals, Ho Seok Ahn & Bruce A. MacDonald (2018).

3. Socially intelligent robots that understand and respond to human touch
Contribution: The author contributes a touch gesture dataset for research and benchmarking, along with associated benchmark results. She organized a machine learning challenge to promote advancements in social touch recognition and identified challenges in recognizing touch in naturalistic settings. The research informs the design of behavioral models for robot pets and explores healthcare applications. Overall, it establishes a foundation for developing socially intelligent robots capable of understanding and appropriately responding to human touch.
Reference: Merel Madeleine Jung (2017).

4. Affective computing
Contribution: Picard emphasizes the necessity of imbuing computers with the ability to recognize, understand, and express emotions for genuine human-computer interaction. Her work establishes an intellectual framework in two parts: Part 1 covers foundational aspects such as human emotions, requirements for emotionally intelligent computers, applications, and societal implications. Part 2 delves into technical aspects like signal-based emotion representation, affect recognition as a pattern recognition problem, and efforts in synthesizing emotions for computers, including wearable applications. Picard's approach bridges scientific rigor with accessibility, fostering advancements in emotional AI that facilitate more intuitive and empathetic human-machine interactions.
Reference: Rosalind W. Picard (2000).

5. Affective interaction: Using emotions as a user interface in games
Contribution: Christine L. Lisetti and Fatma Nasoz developed an adaptive system for recognizing and responding to users' emotions using multimodal inputs: visual, kinesthetic, and auditory. This system aims to enhance human-computer interaction by integrating emotional recognition into user interfaces, allowing for real-time adaptive responses based on the user's emotional state.
Reference: Yoones A. Sekhavat, Milad Jafari Sisi, Samad Roohi (2021).

6. Human-centric design
Contribution: Mark Morrissey reviews the evolution of manikin software, highlighting its role in human-centric design. He discusses advancements from the Sammie model to virtual reality tools that improve design visualization, ease of assembly, and maintenance, and enhance communication among project teams, thus advancing human-factors engineering in product development.
Reference: Mark Morrissey (1998).

7. The impact of robot types on emotional engagement with a robot
Contribution: Jung Ju Choi, Yunkyung Kim, and Sonya S. Kwak's study shows that tele-operated robots evoke a stronger sense of social presence and more embarrassment in users compared to autonomous robots. Their work highlights the impact of robot types on emotional engagement, offering valuable insights for human-robot interaction design.
Reference: Jung Ju Choi, Yunkyung Kim, Sonya S. Kwak (2014).

8. Human–robot interaction: status and challenges
Contribution: The review outlines the evolution and current challenges of human-robot interaction (HRI), emphasizing its diverse applications from hazardous environments to healthcare. Sheridan identifies research gaps in humanoid robots, self-driving cars, and social interaction with robots, providing a comprehensive overview of HRI's expanding role across industries.
Reference: Thomas B. Sheridan (2016).

9. Social robotics
Contribution: The authors survey key research trends and challenges in creating socially interactive robots. They emphasize the need for robots to communicate naturally, engage emotionally, and possess social-cognitive skills to support humans effectively. Their chapter underscores the interdisciplinary nature of developing social robots, integrating robotics, AI, psychology, neuroscience, and other fields to advance human-robot interaction for diverse applications.
Reference: Cynthia Breazeal, Kerstin Dautenhahn, Takayuki Kanda (2016).

10. A facial expression emotion recognition based human-robot interaction system
Contribution: The FEER-HRI system integrates facial emotion recognition and expression generation for robots using 2D-Gabor, LBP, and ELM methods, facilitating effective human-robot interaction in scenarios like home service and entertainment.
Reference: Zhentao Liu, Min Wu, Weihua Cao, Luefeng Chen, Jianping Xu, Ri Zhang, Mengtian Zhou, Junwei Mao (2017).

11. Empathic robot for group learning: A field study
Contribution: The study explores the impact of an autonomous empathic robot on group learning, assessing its ability to foster collaborative educational experiences and sustain positive outcomes in sustainability education. Results suggest the robot facilitates meaningful discussions but indicate mixed long-term educational gains, highlighting the need for continued research in educational robotics for group settings.
Reference: Pascale Fung, Dario Bertero, Yan Wan, Anik Dey, Ricky Ho Yin Chan, Farhad Bin Siddique, Yang Yang, Chien-Sheng Wu, Ruixi Lin (2018).

12. Emotion recognition for human-robot interaction: Recent advances and future perspectives
Contribution: Matteo Spezialetti, Giuseppe Placidi, and Silvia Rossi review recent advances in emotion recognition for human-robot interaction, emphasizing the importance of endowing robots with emotional intelligence. They discuss current emotional models, interaction modalities, and classification strategies, while highlighting future challenges and opportunities in improving robots' ability to intuitively interpret human emotions.
Reference: Matteo Spezialetti, Giuseppe Placidi, Silvia Rossi (2020).

13. Emotion-driven analysis and control of human-robot interactions in collaborative applications
Contribution: Aitor Toichoa Eyam, Wael M. Mohammed, and Jose L. Martinez Lastra present an approach for adapting collaborative robot (cobot) parameters to human emotional states using EEG technology. Their work enhances human-robot collaboration by increasing safety, trust, and emotional awareness in industrial environments.
Reference: Aitor Toichoa Eyam, Wael M. Mohammed, Jose L. Martinez Lastra (2021).

14. Design for a robotic companion
Contribution: The authors contributed by exploring the design and control of robotic companions, specifically focusing on the FLASH robot developed at Wroclaw University of Technology. They conducted human-robot interaction experiments to assess human attitudes towards FLASH and tested its emotional system, advancing the field of social robotics.
Reference: Jan Kedzierski, Pawel Kaczmarek, Michal Dziergwa, Krzysztof Tchon (2015).

15. Affective touch in social robots
Contribution: The author examines the role of affective touch in social robots, analyzing how tactile interaction influences intimacy and user experience. By comparing anthropomorphic and zoomorphic robots, the paper highlights the importance of both functional and imaginative aspects of touch, and how low-tech materials contribute to the sensory dynamics of robotic touch.
Reference: Erika Kerruish (2017).

16. Systems-centered emotional intelligence: beyond individual systems to organizational systems
Contribution: The authors introduce a systems-centered model for emotional intelligence, extending the concept beyond individuals to encompass work groups and entire organizations. They argue that organizational emotional intelligence results from system dynamics and structure, not just individual traits, thereby shifting the focus from personal development to enhancing overall organizational functioning.
Reference: Susan P. Gantt, Yvonne M. Agazarian (2004).

17. Robots in elderly care
Contribution: The authors provide a comprehensive review of the impact of assistive social robots on elderly care, focusing on their potential to enhance health and psychological well-being. They highlight positive effects but note the lack of robust evidence due to research design limitations, calling for more rigorous studies to better assess these robots' effectiveness.
Reference: Alessandro Vercelli, Innocenzo Rainero, Ludovico Ciferri, Marina Boido, Fabrizio Pirri (2018).

18. Personal assistant robots
Contribution: The paper introduces a personal assistant robot designed to integrate into smart homes, particularly benefiting elderly and disabled individuals. It aims to provide essential assistance, enhancing independence and quality of life through technology. The focus is on creating a supportive environment where advanced robotics contribute to daily living, emphasizing usability and practical application in improving the lives of vulnerable populations.
Reference: A. I. Alexan, A. R. Osan, Stefan Oniga (2012).

19. The ethical risk of attachment: how to identify, investigate and predict potential ethical risks in the development of social companion robots
Contribution: The paper introduces the Triple-A Model, which categorizes human-robot interactions into Assistance, Adaptation, and Attachment levels to identify and mitigate ethical risks in the development of social companion robots. It focuses on the implications of long-term attachment between humans and robots, providing a framework grounded in sociology and cognitive science to guide ethical robot design and implementation.
Reference: Andreas Huber, Astrid Weiss, Marjo Rauhala (2016).

20. A survey of socially interactive robots
Contribution: The authors conduct a comprehensive survey of socially interactive robots, providing a taxonomy of design methods and system components used in building these robots, highlighting their impact on human interaction, and discussing key open issues in the field. Their work contextualizes socially interactive robots within broader research domains and outlines various forms and applications, offering a foundational resource for understanding and advancing the field of human-robot interaction.
Reference: Terrence Fong, Illah Nourbakhsh, Kerstin Dautenhahn (2003).

21. Artificial Intelligence Model of a Smartphone-Based Virtual Companion
Contribution: The authors contribute a smartphone-based virtual companion AI model featuring Probabilistic Mood Estimation (PME) using SVMs and DBNs. This enables adaptive behavior through a Behavior Network, responding to user mood and external factors. An internal mood state structure enhances believability, evaluated via user studies for effectiveness.
Reference: Elham Saadatian, Thoriq Salafi, Hooman Samani, Yu De Lim, Ryohei Nakatsu (2014).

22. Empathy in human–robot interaction: Designing for social robots
Contribution: Ana Paiva and colleagues' survey in "Empathy in Virtual Agents and Robots" reviews how artificial agents simulate and evoke empathy, comparing these systems to human empathy mechanisms. Sung Park and Mincheol Whang's paper outlines a framework for designing empathic robots, emphasizing the recognition of emotional states and effective response strategies.
Reference: Sung Park, Mincheol Whang (2022).

23. Human–Robot Companionship: Current Trends and Future Agenda
Contribution: Oren Zuckerman and colleagues investigated the impact of a non-humanoid robot with a primary cognitive function on older adults' feelings of being "seen." Their study found that non-verbal gestures from such robots enhanced emotional engagement and acceptance compared to traditional companion functions, suggesting new design directions for companion robots.
Reference: Eshtiak Ahmed, Oğuz 'Oz' Buruk, Juho Hamari (2024).

24. Robots as intelligent assistants to face COVID-19 pandemic
Contribution: The authors of the paper "How can I help you? An intelligent virtual assistant for industrial robots" focus on enhancing human-robot interaction (HRI) through a novel virtual assistant, Max. They introduce a RESTful Client-Server architecture, implement conversation strategies for more natural dialogues, and test the system in real-world scenarios involving industrial robots.
Reference: Valeria Seidita, Francesco Lanza, Arianna Pipitone, Antonio Chella (2020).

25. A voice-controlled personal assistant robot
Contribution: The authors developed a voice-controlled personal assistant robot capable of executing commands like movements and object relocation based on real-time speech processing. The robot uses a micro-controller and Bluetooth network, with performance evaluated through experiments. Potential applications in home, hospital, and industrial settings are also discussed.
Reference: Anurag Mishra, Pooja Makula, Akshay Kumar, Krit Karan, V. K. Mittal (2015).

26. LXio: The mood detection Robopsych
Contribution: The paper develops a system using predictive linguistics to analyze mood states through language. It models cognitive states with axiological values, processes patient discourse, and improves mood classification accuracy. This method has implications for neurological assessments, human-computer interaction, and cognitive research.
Reference: Newton Howard, Mathieu Guidere (2012).

27. Automated facial recognition technology: Recent developments and approaches to oversight
Contribution: Monique Mann and Marcus Smith analyze the expansion and application of Automated Facial Recognition Technology (AFRT) in security contexts, highlighting the tension between privacy and security. They critique current privacy protections, identify governance gaps, and propose strengthening oversight mechanisms, such as enhancing the Office of the Australian Information Commissioner or establishing a Biometrics Commissioner.
Reference: Monique Mann, Marcus Smith (2017).

28. A systematic review of research into how robotic technology can help older people
Contribution: The authors conduct a systematic review that highlights how robotic technology can benefit older adults. They synthesize existing research, emphasizing robotics' potential in enhancing independence and addressing the growing costs of elderly care. Their work underscores the role of technology in supporting aging populations globally, offering insights into the effectiveness and future directions of robotic solutions.
Reference: Majid Shishehgar, Donald Kerr, Jacqueline Blake (2018).

29. Intelligent virtual assistant knows your life
Contribution: The paper investigates privacy implications of intelligent virtual assistants (IVAs) like Amazon Alexa, categorizing and analyzing data collected from users. It reveals how behavioral traces stored in IVA cloud systems can expose sensitive user details such as interests and daily routines, emphasizing potential privacy threats to both IVA vendors and users.
Reference: Hyunji Chung, Sangjin Lee (2018).

30. A smart virtual assistant for students
Contribution: The authors introduce a smart virtual assistant tailored for students, aiming to provide continuous, instant support for academic tasks such as graduation planning, major exploration, course information retrieval, and scheduling. They emphasize the underexplored potential of chatbots in educational settings, contrasting their widespread use in business. A pilot project involving three academic institutions in Minnesota validates their model, highlighting its relevance in enhancing student, staff, and faculty support within higher education contexts.
Reference: Mehdi Mekni, Zakaria Baani, Dalia Sulieman (2020).

31. A Review of Textual and Voice Processing Algorithms in the Field of Natural Language Processing
Contribution: The review discusses advancements in natural language processing (NLP), focusing on textual and voice processing algorithms. It categorizes these algorithms based on type and evaluates their applications in areas such as text processing, speech recognition, and machine translation. The study highlights progress made over the past decade, identifies future research directions, and assesses the capabilities and limitations of current NLP technologies in understanding human language.
Reference: Matt Bowden (2023).

32. Face recognition systems: A survey
Contribution: The authors review face recognition techniques, categorizing them into local, holistic, and hybrid approaches. They compare their effectiveness and discuss future directions in the field.
Reference: Yassin Kortli, Maher Jridi, Ayman Al-Falou, Mohamed Atri (2020).

33. Biometric hash: high-confidence face recognition
Contribution: Relly Victoria Petrescu's paper discusses the development and applications of facial recognition technology, emphasizing its use in security systems, access control, and commercial identification. It contrasts facial recognition with other biometrics like iris and fingerprint recognition, noting its advantages in being contactless and noninvasive. The paper also highlights ethical concerns and the need for regulation, while advocating for responsible use to enhance security and efficiency in various applications, including airport operations and crime prevention.
Reference: David C. L. Ngo, Andrew B. J. Teoh, Alwyn Goh (2006).

34. Automatic speech recognition
Contribution: The authors contribute a comprehensive overview of Automatic Speech Recognition (ASR), emphasizing its historical evolution from Gaussian mixture models (GMMs) to current deep learning techniques. They highlight the resurgence of interest in ASR driven by mobile device demands and advancements in speech applications. Their work focuses on summarizing recent innovations, particularly in large vocabulary continuous speech recognition (LVCSR), showcasing how deep learning has significantly enhanced ASR accuracy and usability in real-world settings like voice search and virtual assistants.
Reference: Dong Yu, Lin Deng (2016).

35. Voice recognition system: speech-to-text
Contribution: The author introduces a voice recognition system focusing on speech-to-text conversion. It utilizes Hidden Markov Models (HMMs) and Mel Frequency Cepstral Coefficients (MFCCs) for acoustic signal processing and feature extraction. The system enables voice control of computer functions and text dictation, with applications in home automation.
Reference: Vijay Prasad (2015).

Research Objectives
1. To investigate various design and implementation strategies for emotionally intelligent companion robots.
2. To assess the effectiveness of emotional recognition and response features in companion robots.
3. To perform a SWOC (Strengths, Weaknesses, Opportunities, and Challenges) analysis.
4. To suggest the various stakeholders involved in the design, implementation, and deployment of emotionally intelligent companion robots.
METHODOLOGY
The study primarily relies on secondary data collected from a wide array of sources to explore the design and utilization of emotionally intelligent companion robots. The data was sourced from Google Scholar, along with other academic databases.
Discussion
Investigating Various Designs and Implementation Strategies for Emotionally Intelligent Companion Robots.
a. The survey of literature undertaken has led to several key insights into the design and implementation strategies for emotionally intelligent companion robots. These robots are engineered to interact with humans in a manner that is sensitive to their emotional states, thereby enhancing user experience and fostering more engaging and supportive interactions. Various strategies have been explored to achieve this objective, with a focus on multiple aspects of emotional intelligence. Emotionally intelligent robots often rely on multimodal interaction systems that combine facial recognition, voice analysis, and body language interpretation to assess the emotional state of the user. These systems integrate sensors and AI algorithms to interpret data from different modalities, providing a more accurate emotional assessment. For instance, integrating cameras for facial recognition and microphones for voice tone analysis enables the robot to gauge the user's mood comprehensively. Additionally, context-awareness plays a crucial role, with robots utilizing data from their environment, such as time, location, and prior interactions, to better understand the context of the user's emotions. This allows the robot to tailor its responses appropriately, whether it's a calming conversation after detecting stress or a cheerful interaction during a moment of joy.
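The multimodal fusion idea described above can be illustrated with a minimal late-fusion sketch: each modality (e.g., face, voice) produces its own probability distribution over emotions, and a confidence-weighted average combines them. The modality names, weights, and probability values below are hypothetical illustrations, not taken from any of the reviewed systems.

```python
# Minimal late-fusion sketch for multimodal emotion assessment.
# All scores and weights below are hypothetical illustrations.

EMOTIONS = ["happy", "sad", "angry", "neutral"]

def fuse_modalities(modality_scores, weights):
    """Combine per-modality emotion probabilities into one estimate.

    modality_scores: dict of modality name -> {emotion: probability}
                     (each per-modality distribution sums to 1).
    weights: dict of modality name -> relative confidence weight.
    Returns a fused probability distribution over EMOTIONS.
    """
    total_weight = sum(weights[m] for m in modality_scores)
    fused = {}
    for emotion in EMOTIONS:
        fused[emotion] = sum(
            weights[m] * scores.get(emotion, 0.0)
            for m, scores in modality_scores.items()
        ) / total_weight
    return fused

# Example: face analysis strongly suggests happiness, voice is ambiguous.
scores = {
    "face":  {"happy": 0.7, "sad": 0.1, "angry": 0.1, "neutral": 0.1},
    "voice": {"happy": 0.3, "sad": 0.2, "angry": 0.1, "neutral": 0.4},
}
weights = {"face": 0.6, "voice": 0.4}  # trust the camera slightly more
fused = fuse_modalities(scores, weights)
top_emotion = max(fused, key=fused.get)
print(top_emotion)  # the modality-weighted most likely emotion
```

Real systems typically learn the fusion weights (or use a joint model over raw features), but this weighted average captures the basic principle of letting a more reliable modality dominate the final assessment.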
b. Moreover, machine learning algorithms are essential for enabling robots to adapt to the emotional patterns of individual users. Over time, these robots can learn specific user preferences, emotional triggers, and typical reactions, allowing them to personalize their interactions. This adaptive behavior enhances the effectiveness of the emotional support provided by the robot, making users feel understood and cared for. A human-centric design approach is also critical, prioritizing ease of use, safety, and comfort. Robots need to be designed with a non-threatening appearance, intuitive interfaces, and empathetic behavior to foster trust and emotional connection with users. Accessibility is another important consideration, ensuring that people with disabilities or limited technological literacy can effectively interact with the robot. Ethical considerations, such as privacy, data security, and the potential for emotional manipulation, must be addressed in the design and implementation of emotionally intelligent robots. Strategies should include robust data encryption, clear consent protocols, and transparency in how emotional data is used, ensuring that users' rights are protected while benefiting from the robot's emotional intelligence.
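One simple way the per-user adaptation described above can work is to maintain a smoothed baseline of each user's typical emotion intensities, so the robot reacts to deviations from that user's norm rather than a global default. This is a hypothetical baseline-calibration sketch, not a method taken from the reviewed papers; the class name, learning rate, and readings are illustrative.

```python
# Illustrative per-user adaptation sketch (hypothetical):
# an exponential moving average of observed emotion intensities
# gives each user a personal baseline for the robot to react against.

class UserEmotionProfile:
    def __init__(self, alpha=0.2):
        self.alpha = alpha   # weight given to each new observation
        self.baseline = {}   # emotion -> smoothed typical intensity

    def update(self, emotion, intensity):
        """Fold a new observation into this user's baseline."""
        prev = self.baseline.get(emotion, intensity)
        self.baseline[emotion] = (1 - self.alpha) * prev + self.alpha * intensity

    def is_unusual(self, emotion, intensity, margin=0.3):
        """Flag readings well above this user's learned baseline."""
        return intensity > self.baseline.get(emotion, 0.0) + margin

profile = UserEmotionProfile()
for reading in [0.4, 0.5, 0.45, 0.5]:   # this user's typical stress levels
    profile.update("stress", reading)

print(profile.is_unusual("stress", 0.9))  # far above baseline -> flag it
print(profile.is_unusual("stress", 0.5))  # within normal range for this user
```

The design choice here is that "high stress" is relative: a reading that is alarming for one user may be routine for another, which is exactly the personalization gap the paragraph identifies.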
c. Assessing the effectiveness of emotional recognition and response features in companion robots is a critical aspect of their ability to provide meaningful emotional support to users. This effectiveness can be evaluated through various methods, including user feedback and interaction studies. Observing how users interact with the robot in real-life scenarios allows researchers to gather qualitative data on the robot's ability to correctly identify emotions and respond appropriately. Surveys, interviews, and focus groups provide insights into the user's perception of the robot's emotional intelligence. Additionally, the technical performance of emotion detection algorithms can be measured through quantitative metrics such as accuracy, precision, recall, and F1 scores, which help determine how well the robot's algorithms identify different emotional states compared to a ground truth established by human experts or psychological tests. Assessing the appropriateness and timing of the robot's responses is equally important, as an emotionally intelligent robot should not only recognize emotions but also respond in a manner that is contextually appropriate and timely. For example, a robot that identifies sadness should offer comfort rather than continuing with a neutral or unrelated task. The timing of the response is crucial, as delayed reactions can diminish the perceived empathy and responsiveness of the robot. The impact of the robot's emotional interactions on user behavior is another important measure of effectiveness. If the robot's emotional support leads to positive behavioral changes, such as reduced stress, increased user engagement, or improved mood, it can be considered effective. Longitudinal studies can track these behavioral outcomes over time to assess the sustained impact of the robot's emotional features. Comparative analysis with human interaction is also beneficial, as it helps to benchmark the robot's performance against the complexity of human empathy, providing a reference point for its effectiveness.
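The quantitative metrics named above (accuracy, precision, recall, F1) are straightforward to compute from a list of predicted emotion labels and a human-annotated ground truth. The sketch below uses only the standard definitions; the label lists are illustrative, not real study data.

```python
# Computing accuracy, and per-class precision/recall/F1, for an
# emotion classifier evaluated against a human-annotated ground truth.
# The label lists below are illustrative, not real study data.

def class_metrics(y_true, y_pred, positive):
    """Precision, recall, and F1 for one emotion class ('positive')."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == positive and p == positive)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t != positive and p == positive)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == positive and p != positive)
    precision = tp / (tp + fp) if (tp + fp) else 0.0
    recall = tp / (tp + fn) if (tp + fn) else 0.0
    f1 = 2 * precision * recall / (precision + recall) if (precision + recall) else 0.0
    return precision, recall, f1

y_true = ["happy", "sad", "happy", "neutral", "sad", "happy"]   # annotators
y_pred = ["happy", "sad", "neutral", "neutral", "happy", "happy"]  # robot

accuracy = sum(t == p for t, p in zip(y_true, y_pred)) / len(y_true)
precision, recall, f1 = class_metrics(y_true, y_pred, "happy")
print(round(accuracy, 2), round(precision, 2), round(recall, 2), round(f1, 2))
```

For a multi-class emotion problem these per-class figures are usually averaged (macro or weighted) to give a single score; libraries such as scikit-learn provide the same calculations off the shelf.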
SWOC Analysis
Performing a SWOC (Strengths, Weaknesses, Opportunities, and Challenges) analysis provides a strategic framework to evaluate the current state and future potential of emotionally intelligent companion robots. Strengths include the enhanced user experience offered by these robots, as their ability to recognize and respond to emotions makes them more engaging and supportive, leading to increased satisfaction and trust. Their versatility in application across various domains, including healthcare, eldercare, education, and customer service, further underscores their strengths, as they provide emotional support and companionship where human resources are limited. Continuous improvement through machine learning allows these robots to adapt and enhance their emotional intelligence and interaction quality over time. However, there are weaknesses to consider, such as the limited understanding of complex human emotions, which can lead to difficulties in interpreting subtleties or cultural differences in emotional expression. High development costs associated with designing and implementing emotionally intelligent features also pose a barrier to widespread adoption. There is also a risk of dependency, where users may become overly reliant on robots for emotional support, potentially leading to reduced human interaction and social isolation. Opportunities exist in the expanding market demand for emotionally intelligent robots as society increasingly embraces technology in daily life. Continued advancements in AI, machine learning, and sensor technology present opportunities to enhance the emotional capabilities of robots, making them more effective and human-like. The ability to tailor interactions based on individual user profiles offers further opportunities for developing highly personalized emotional support systems, increasing user engagement and satisfaction. Nonetheless, challenges remain, particularly in ensuring ethical use of emotional data, avoiding emotional manipulation, and addressing privacy concerns.
Regulatory hurdles may also arise, especially in sensitive areas like healthcare, where safety and efficacy are paramount. Gaining widespread user acceptance is another challenge, as people may be skeptical of the robot's ability to genuinely understand and respond to emotions, or may have concerns about privacy and data security.
Research Gap
While there have been advancements in the development and implementation of emotionally intelligent companion robots, significant gaps remain in understanding their full potential and addressing existing challenges. Identifying and addressing these gaps is essential to creating more effective, reliable, and ethically sound companion robots.
Most current research on emotion recognition in robots is based on datasets from limited cultural backgrounds, leading to potential biases in interpreting emotions. There is a need for more extensive research into how these robots can accurately recognize and respond to emotions across diverse cultures, ensuring that they are truly universal in their application.
Existing studies often focus on short-term interactions between users and companion robots. However, there is a significant gap in understanding the long-term dynamics of these interactions, particularly how emotional bonds evolve over time. Research is needed to explore how robots can maintain and adapt emotional connections over extended periods.
While emotionally intelligent robots are designed to provide support, there is limited research on the ethical implications of their ability to influence user emotions. This gap calls for studies that address the potential for emotional manipulation and establish ethical guidelines for the development and deployment of these robots.
Current emotionally intelligent robots often rely on generalized models of emotion recognition, which may not be accurate for every individual. There is a research gap in developing more personalized systems that can adapt to individual emotional patterns and provide more tailored interactions.
While some progress has been made in integrating facial recognition, voice analysis, and gesture interpretation in emotion recognition, there is still a gap in fully integrating these modalities to create a cohesive and accurate emotional profile.

Research is needed to enhance the seamless integration of these modalities for more accurate and reliable emotion recognition.
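The integration gap described above can be illustrated with a minimal late-fusion sketch: each modality (face, voice, gesture) produces its own probability distribution over emotion labels, and a weighted average combines them into one profile. The emotion labels, modality names, and weights below are illustrative assumptions, not values drawn from any system reviewed here.

```python
# Illustrative late-fusion sketch for multimodal emotion recognition.
# All labels and weights are hypothetical placeholders.

EMOTIONS = ["happy", "sad", "angry", "neutral"]

def fuse_modalities(scores_by_modality, weights):
    """Weighted average of per-modality probability distributions.

    Weights are renormalized over the modalities actually present,
    so the fusion still works if a sensor stream is missing.
    """
    fused = {emotion: 0.0 for emotion in EMOTIONS}
    total_weight = sum(weights[m] for m in scores_by_modality)
    for modality, scores in scores_by_modality.items():
        w = weights[modality] / total_weight
        for emotion in EMOTIONS:
            fused[emotion] += w * scores.get(emotion, 0.0)
    return fused

if __name__ == "__main__":
    scores = {
        "face":    {"happy": 0.70, "sad": 0.10, "angry": 0.05, "neutral": 0.15},
        "voice":   {"happy": 0.40, "sad": 0.30, "angry": 0.10, "neutral": 0.20},
        "gesture": {"happy": 0.50, "sad": 0.20, "angry": 0.10, "neutral": 0.20},
    }
    weights = {"face": 0.5, "voice": 0.3, "gesture": 0.2}
    fused = fuse_modalities(scores, weights)
    print(max(fused, key=fused.get))  # prints: happy
```

Because the weights are renormalized over available modalities, the same code degrades gracefully when, say, the gesture stream drops out, which is one practical aspect of the "seamless integration" the research gap calls for.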
Research Agenda
1. Enhance Data Collection and Analysis on Emotional Interactions
Objective: Improve the understanding of emotional interactions with companion robots by gathering and analyzing comprehensive, context-specific data.
Action Steps:
a. Develop Data Collection Framework: Create a structured approach to gather data on user interactions, emotional responses, and robot performance across various settings and demographics.
b. Establish Centralized Database: Develop a centralized system to store and manage collected data, facilitating easy access and analysis.
c. Conduct Comparative Analysis: Compare the collected data with existing global studies to identify unique challenges and opportunities for improvement.
d. Disseminate Insights: Share findings with researchers, developers, and policymakers to guide the development of more effective emotionally intelligent robots.
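As one possible shape for step (a)'s data collection framework, the sketch below defines a single structured interaction record with basic validation before it would enter the centralized database of step (b). Every field name is a hypothetical assumption chosen for illustration, not a schema proposed by the reviewed literature.

```python
# Illustrative schema for one logged human-robot interaction.
# Field names and the emotion vocabulary are assumptions.
from dataclasses import dataclass, asdict

@dataclass
class InteractionRecord:
    user_id: str           # pseudonymous user identifier
    setting: str           # deployment context, e.g. "eldercare"
    detected_emotion: str  # emotion label the robot inferred
    robot_response: str    # response the robot selected
    user_rating: int       # post-interaction satisfaction, 1-5

VALID_EMOTIONS = {"happy", "sad", "angry", "neutral"}

def validate(record: InteractionRecord) -> bool:
    """Basic checks before the record is written to the central store."""
    return (record.detected_emotion in VALID_EMOTIONS
            and 1 <= record.user_rating <= 5)

if __name__ == "__main__":
    rec = InteractionRecord("u-001", "eldercare", "sad", "offer_conversation", 4)
    assert validate(rec)
    print(asdict(rec)["setting"])  # prints: eldercare
```

Validating at ingestion time keeps the comparative analysis of step (c) from being skewed by malformed entries.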
2. Perform Comprehensive Emotional Response Analysis
Objective: Identify underlying factors affecting the emotional responses of companion robots to enhance their accuracy and reliability.
Action Steps:
a. Analyze Emotional Response Algorithms: Evaluate the algorithms used for detecting and responding to emotions, focusing on their effectiveness and limitations.
b. Collaborate with Experts: Work with emotional intelligence researchers and robotics experts to refine response models and address identified gaps.
c. Develop Guidelines: Create practical guidelines for improving emotional response systems based on the analysis and expert feedback.
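Step (a)'s algorithm evaluation could start from a per-class view of detection accuracy, since an aggregate accuracy figure hides which emotions a system systematically misses. The sketch below computes per-emotion recall from matched label lists; the labels are illustrative, and this is only one of several metrics such an evaluation would use.

```python
# Per-emotion recall: for each true emotion, the fraction the
# detector labeled correctly. Labels are illustrative.
from collections import defaultdict

def per_class_recall(true_labels, predicted_labels):
    correct, total = defaultdict(int), defaultdict(int)
    for truth, pred in zip(true_labels, predicted_labels):
        total[truth] += 1
        if truth == pred:
            correct[truth] += 1
    return {label: correct[label] / total[label] for label in total}

if __name__ == "__main__":
    truth = ["happy", "happy", "sad", "sad", "angry"]
    preds = ["happy", "sad",   "sad", "sad", "happy"]
    # happy recognized 1/2, sad 2/2, angry 0/1
    print(per_class_recall(truth, preds))
```

A detector with high overall accuracy but zero recall on "angry", as in the toy example, is exactly the kind of limitation step (a) is meant to surface.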
Objective: Advance the integration of various modalities (e.g., facial recognition, voice analysis, and gesture interpretation) to enhance emotion detection.
Action Steps:
d. Integrate Multimodal Data: Develop methods to effectively combine data from multiple sources to improve the accuracy of emotion recognition.
e. Standardize Protocols: Establish standardized protocols for integrating and processing multimodal data.
f. Conduct Pilot Studies: Implement and test multimodal systems in real-world scenarios to assess their effectiveness and make necessary adjustments.
3. Address Ethical and Privacy Concerns in Emotionally Intelligent Robots
Objective: Ensure that the design and implementation of companion robots address ethical considerations and protect user privacy.
Action Steps:
a. Develop Ethical Guidelines: Create guidelines for the ethical use of emotional data and interaction techniques in companion robots.
b. Enhance Privacy Measures: Implement robust data protection measures to safeguard user privacy and ensure transparency in data handling.
c. Conduct Ethical Reviews: Regularly review and update ethical standards and privacy policies in response to new challenges and technological advancements.
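One concrete privacy measure called for above is pseudonymization: replacing raw user identities with a keyed hash before emotional data is stored, so records cannot be linked back to a person without the secret key. The sketch below uses Python's standard `hmac` module; it is a minimal illustration of the idea, not a complete data-protection design, and the key handling shown is an assumption.

```python
# Pseudonymize user identities with a keyed SHA-256 hash so the
# emotion-data store never holds raw identifiers. Illustrative only;
# real deployments also need key management, retention limits, etc.
import hashlib
import hmac

def pseudonymize(user_id: str, secret_key: bytes) -> str:
    """Stable per-user token; unlinkable to the identity without the key."""
    return hmac.new(secret_key, user_id.encode("utf-8"),
                    hashlib.sha256).hexdigest()

if __name__ == "__main__":
    key = b"deployment-secret"  # assumption: kept outside the data store
    token = pseudonymize("alice@example.com", key)
    # Same user always maps to the same token, so longitudinal
    # emotional-interaction analysis still works on anonymized data.
    assert token == pseudonymize("alice@example.com", key)
    assert token != pseudonymize("bob@example.com", key)
    print(len(token))  # prints: 64 (SHA-256 hex digest length)
```

Because the mapping is stable, researchers can still study long-term emotional bonds per user, addressing the longitudinal research gap without retaining identities.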
Research Proposal
After a thorough examination and evaluation of the research literature, the paper recommends that a study be conducted to better understand the human factors involved in designing and deploying emotionally intelligent companion robots.
● Proposed title: Literature Insights into the Design and Implementation of Emotionally Intelligent Companion Robots
Objectives:
a. To explore and enhance the human-robot interaction (HRI) paradigm by integrating emotional awareness into companion robots.

b. To design and implement companion robots that can perceive, interpret, and respond to human emotions effectively.
c. To evaluate the impact of emotional awareness on the usability, acceptance, and overall experience of companion robots in various settings.
FINDINGS
The findings of this study reveal that various design strategies have been employed in the development of emotionally intelligent companion robots, each aiming to enhance the robots' ability to interact with users on an emotional level. These strategies often involve the integration of advanced technologies such as facial recognition systems, voice modulation, and adaptive learning algorithms. These features allow the robots to detect and respond to human emotions more effectively, creating a more personalized and empathetic user experience. However, the effectiveness of these emotional recognition and response features can vary. Robots equipped with more sophisticated machine learning algorithms tend to perform better in recognizing a wide range of human emotions. Yet, the physical limitations of the robots or the complexity of human emotional expressions can sometimes hinder their ability to respond appropriately, indicating a need for further refinement in these systems.

The study also conducted a SWOC (Strengths, Weaknesses, Opportunities, Challenges) analysis of emotionally intelligent companion robots. The analysis identified significant strengths, such as the potential to greatly enhance mental well-being by providing emotional support and companionship, particularly for individuals who are elderly or socially isolated. However, the robots' current limitations in understanding and expressing emotions, along with the high costs associated with their development and deployment, were noted as significant weaknesses. Despite these challenges, there are substantial opportunities for growth in the use of these robots, particularly in healthcare and eldercare sectors where there is an increasing demand for personalized care solutions. Nevertheless, the study also highlighted challenges, including ethical concerns related to privacy and emotional manipulation, as well as the technical difficulties in creating robots with truly human-like emotional intelligence.

Additionally, the findings emphasize the crucial role of human-robot interaction in the overall effectiveness of these emotionally intelligent robots. Users generally prefer robots that can mimic human emotional expressions naturally and intuitively, with trust and comfort levels being directly linked to how well the robot can understand and respond to their emotions. Furthermore, case studies from healthcare environments demonstrate that emotionally intelligent companion robots have been particularly successful in reducing feelings of loneliness and improving emotional well-being among elderly patients. These robots have not only provided companionship but have also engaged users in meaningful interactions, offering both mental stimulation and emotional support. This underscores the need for ongoing development to refine the emotional capabilities of companion robots, ensuring they can better meet the diverse needs of their users.
SUGGESTIONS
Implement comprehensive training programs for designers and developers of companion robots. These programs should focus on improving the accuracy of emotional response algorithms, enhancing user interaction skills, and addressing emotional intelligence in design. This will ensure that companion robots can better understand and respond to user emotions, providing a more effective and engaging user experience.

Conduct frequent and thorough audits of companion robot systems to identify and address technical malfunctions and system reliability issues. Regular audits will help ensure that the robots function optimally, minimizing the risk of technical failures and enhancing overall user satisfaction. This proactive approach will help maintain the robot's performance and longevity.

Develop and provide clear, detailed, and user-friendly instructions for operating companion robots. These instructions should cover all functionalities and provide guidance on how to effectively use the robot's capabilities. Improved instructions will help users better understand how to interact with the robot, reducing misunderstandings and enhancing the overall experience.

Invest in the advancement of emotional recognition technologies. Enhance the robot's ability to accurately detect and respond to user emotions through improved sensors, machine learning algorithms, and multimodal communication methods. Strengthening these systems will allow robots to provide more personalized and appropriate emotional responses, improving user satisfaction.

Establish a robust mechanism for users to provide feedback on their experiences with companion robots. Encourage a culture of open communication where users can easily report issues, share their experiences, and suggest improvements. Utilizing user feedback will help in continuously refining the robots' functionalities and addressing any problems effectively, leading to better user engagement and satisfaction.
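A feedback mechanism like the one suggested above is only useful if the collected reports are aggregated into something actionable. The sketch below reduces a batch of hypothetical feedback entries to a mean rating and the most frequently reported issue; the entry fields and issue names are assumptions chosen for illustration.

```python
# Aggregate hypothetical user feedback entries into a simple summary.
from collections import Counter

def summarize_feedback(entries):
    """Mean satisfaction rating and the most common reported issue."""
    ratings = [entry["rating"] for entry in entries]
    issue_counts = Counter(
        issue for entry in entries for issue in entry.get("issues", [])
    )
    top_issue = issue_counts.most_common(1)[0][0] if issue_counts else None
    return {"mean_rating": sum(ratings) / len(ratings),
            "top_issue": top_issue}

if __name__ == "__main__":
    batch = [
        {"rating": 4, "issues": []},
        {"rating": 2, "issues": ["misread_emotion", "slow_response"]},
        {"rating": 3, "issues": ["misread_emotion"]},
    ]
    print(summarize_feedback(batch)["top_issue"])  # prints: misread_emotion
```

Surfacing the most common issue, rather than individual complaints, gives developers a prioritized target for the continuous refinement the suggestion calls for.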
CONCLUSION
This research provides an in-depth analysis of various aspects related to the design and implementation of emotionally intelligent companion robots. Through the examination of current practices and emerging trends, the study highlights the critical factors contributing to the effectiveness and user satisfaction of these robots. Key findings emphasize the importance of accurate emotional recognition, robust interaction protocols, and the continuous refinement of emotional intelligence systems.

The recurring themes of emotional response accuracy, user interaction quality, and system reliability underscore the need for ongoing advancements in technology and design methodologies. The study stresses the importance of integrating sophisticated emotional recognition technologies, improving user instructions, and fostering transparent user feedback mechanisms. By addressing these areas, designers and developers can enhance the overall effectiveness of companion robots in providing meaningful and supportive interactions.

The research identifies specific areas for improvement, such as advancing emotional intelligence algorithms, strengthening system reliability, and optimizing user engagement strategies. This paper emphasizes the need for a comprehensive approach to companion robot development, including continuous training, regular system audits, and user-centric design enhancements.

The diverse factors explored in this study, ranging from technical advancements to user experience improvements, highlight the complexity of developing emotionally intelligent companion robots. By focusing on these key areas and learning from ongoing developments, stakeholders can work towards creating more effective and emotionally responsive robots, ultimately improving user satisfaction and engagement. Embracing these strategies will contribute to the advancement of companion robots and their role in enhancing human-computer interactions.
Acknowledgement
We would like to express our sincere gratitude to Dr. Ramakrishna Hegde, Dean, Srinivas University Institute of Engineering and Technology, and Prof. Vishwas Shetty, Head of the Department of Mechanical Engineering, for their unwavering support and guidance throughout this research. Special thanks to our guide, Prof. Vuppu Sai Krishna Prasad, for his invaluable insights, continuous encouragement, and mentorship, which were crucial in shaping the direction of this work. We are also deeply grateful to the faculty members of the Department of Mechanical Engineering, SUIET, for their constructive feedback and technical assistance, all of which contributed to the successful completion of this research.
2024, IRJET | Impact Factor value: 8.315 | ISO 9001:2008 Certified Journal | Page13
