Abstract
Natural co-speech gestures are essential for improving the experience of human-robot interaction (HRI). However, current gesture generation approaches suffer from several limitations: generated gestures can be unnatural, misaligned with the speech and its content, or lacking in diverse speaker styles. This work therefore aims to reproduce the work of [5], which generates natural gestures in simulation from tri-modal inputs, and to apply it to a robot. For objective evaluation, motion variance and Fréchet Gesture Distance (FGD) are employed; human participants were then recruited to evaluate the gestures subjectively. Results show that the movements from that paper were successfully transferred to the robot, and that the generated gestures exhibit diverse styles and are correlated with the speech. Moreover, there is a significant difference in likeability and style between different gestures.
| Field | Value |
|---|---|
| Original language | English |
| Title of host publication | HAI '24: Proceedings of the 12th International Conference on Human-Agent Interaction |
| Publisher | Association for Computing Machinery |
| Pages | 426-428 |
| Number of pages | 3 |
| ISBN (Print) | 9798400711787 |
| Publication status | Published - 24 Nov 2024 |
| Event | 12th International Conference on Human-Agent Interaction 2024, Swansea University, Swansea, United Kingdom. Duration: 24 Nov 2024 → 27 Nov 2024. https://hai-conference.net/hai2024/ |
Conference

| Field | Value |
|---|---|
| Conference | 12th International Conference on Human-Agent Interaction 2024 |
| Country/Territory | United Kingdom |
| City | Swansea |
| Period | 24/11/24 → 27/11/24 |
| Internet address | https://hai-conference.net/hai2024/ |