In this paper, we explore human movement using a tutoring spotter system that controls an iCub robot. We present an evaluation based on human movement captured in an experimental study in which participants demonstrated a salt-shaker task and a cup-stacking task to the iCub robot. We use an action recognition method that helps the robot differentiate between these actions by focusing its attention on the vital information presented by the human. Our findings imply that the robot's behaviour affects the participants, influencing both the presentation time and the ratio between action and sub-action during the task presentation. Furthermore, this modification of the human presentation affects the stability of the action recognition system.
|25th IEEE International Symposium on Robot and Human Interactive Communication 2016
|26/08/16 → 31/08/16