Provost & President's Retrospective Review 2011-21

05 Research case studies

05.14 Future technologies for communication – Rachel McDonnell

Over the past year, we have been forced to spend prolonged periods working remotely and experiencing much of our lives online, distancing ourselves from ‘real’ human contact. Video conferencing and virtual reality simulations allow us to maintain interactions with friends, family or business associates. The key challenge in remote connection is to bridge the gap of physical distance and allow natural interaction, which can be difficult with current video-based technologies such as Zoom and Teams. Recently, there have been huge technological advances in real-time computer graphics, enabling sophisticated interactions between people through highly realistic avatars and augmented and virtual reality (AR/VR). AR/VR devices allow content to be truly 3D and immersive: people can cohabit a virtual room with a digital representation of a friend, partner or relative and interact, talk and even touch, while being in different parts of the world in real life. These systems are no longer science fiction; however, the current state of the art is still lacking in avatar realism and the ability to reproduce subtle human motions and emotions. This means that even though we can coexist in a virtual space, our avatars might not have the necessary capabilities for conversation with other virtual humans. Perhaps they don’t look enough like us, or their appearance is too cartoon-like for a serious conversation, or they can’t reproduce our facial and body expressions of emotion sufficiently, resulting in information loss during communication.

Mimicking our unique expressions – My research at the School of Computer Science and Statistics focuses on computer graphics and aims to make the appearance and motion of these virtual avatars more realistic, through perceptual experiments and new algorithms based on real human movement. As an example, my group recently ran experiments to test the use of personalised avatars in video conferencing for effective communication. We found that using virtual representations in video conferencing was a positive experience for most users and reduced ‘Zoom fatigue’, allowing people to be less focused on their own appearance while on a call. We also developed new algorithms for creating virtual characters that mimic our unique expressions (e.g., the avatar’s smile looks like our own smile, not a generic smiling expression), thus improving communication. This work was funded by Science Foundation Ireland’s Frontiers for the Future Programme and published in top conferences and journals for computer graphics and virtual reality.

Speech to gesture – My work as a Principal Investigator in the SFI ADAPT Research Centre has led me to investigate photorealistic embodied conversational agents: virtual humans that look real and can converse with you using artificial intelligence. These agents can be used in a range of scenarios – for example, to improve training in healthcare, education and sales. However, humans are very effective communicators and notice small irregularities in the motion or behaviour of virtual humans, which makes automating their behaviours difficult. My group has developed new algorithms for automating gesture behaviours for these characters so that they can produce appropriate hand gestures during conversation – a surprisingly difficult task to automate. Our work in this area has been published at the top venues for intelligent virtual agents, and we have begun investigating the commercial feasibility of such a system. My group also recently published the Trinity Speech-Gesture database, a high-quality collection of motion capture and speech from spontaneous natural conversation. It has become the de facto standard for evaluating speech-to-gesture systems in the community, as it is the largest dataset of its kind freely available to researchers worldwide.

Rachel McDonnell received her BA(Mod) and PhD from Trinity and joined the School of Computer Science and Statistics as a lecturer in 2011. She is now Associate Professor of Creative Technologies and a principal investigator with ADAPT, Trinity’s Centre for AI-driven Digital Content Technology, and was elected a Fellow of Trinity College Dublin in 2020. The recipient of a prestigious SFI Career Development Award and a recent Frontiers for the Future grant, she has published almost 100 articles in peer-reviewed conferences and journals. Her research focuses on building plausible and appealing virtual humans. Contact: ramcdonn@tcd.ie


