Systematic analysis of video data from different human-robot interaction studies: a categorization of social signals during error situations.

Giuliani M, Mirnig N, Stollnberger G, Stadler S, Buchner R, Tscheligi M - Front Psychol (2015)

Bottom Line: We also found that the participants talked more in the case of social norm violations and less during technical failures. The results suggest that participants do not see the robot as a social interaction partner with comparable communication skills. The evaluators need to make sure that the presence of an experimenter does not skew the results of their user studies.

View Article: PubMed Central - PubMed

Affiliation: Department of Computer Sciences, Center for Human-Computer Interaction, University of Salzburg, Salzburg, Austria.

ABSTRACT
Human-robot interactions are often affected by error situations that are caused by either the robot or the human. Therefore, robots would profit from the ability to recognize when error situations occur. We investigated the verbal and non-verbal social signals that humans show when error situations occur in human-robot interaction experiments. For that, we analyzed 201 videos of five human-robot interaction user studies with varying tasks from four independent projects. The analysis shows that there are two types of error situations: social norm violations and technical failures. Social norm violations are situations in which the robot does not adhere to the underlying social script of the interaction. Technical failures are caused by technical shortcomings of the robot. The results of the video analysis show that the study participants use many head movements and very few gestures, but often smile when in an error situation with the robot. Another result is that the participants sometimes stop moving at the beginning of error situations. We also found that the participants talked more in the case of social norm violations and less during technical failures. Finally, the participants use fewer non-verbal social signals (for example smiling, nodding, and head shaking) when they are interacting with the robot alone and no experimenter or other human is present. The results suggest that participants do not see the robot as a social interaction partner with comparable communication skills. Our findings have implications for builders and evaluators of human-robot interaction systems. The builders need to consider adding modules for the recognition and classification of head movements to the robot's input channels. The evaluators need to make sure that the presence of an experimenter does not skew the results of their user studies.

No MeSH data available.


Figure 3: Screenshot of an error situation in the ELAN annotation tool.

Mentions: For data analysis, we annotated our video corpus using the video coding tool ELAN (Wittenburg et al., 2006). Figure 3 shows an example of an ELAN annotation of a video of one of the JAMES user studies using our annotation format.
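ELAN stores annotations in its XML-based .eaf format (ELAN Annotation Format), in which labels live on named tiers and are time-aligned via shared time slots. As a minimal sketch of how such an export can be read back for analysis, the fragment below parses a tiny hand-written .eaf-style document with Python's standard library; the tier name `head_movement` and the label values are illustrative assumptions, not taken from the study's corpus:

```python
import xml.etree.ElementTree as ET

# Minimal EAF-style fragment. The tier ID and annotation value are
# invented for illustration; a real ELAN export contains many tiers.
eaf = """<ANNOTATION_DOCUMENT>
  <TIME_ORDER>
    <TIME_SLOT TIME_SLOT_ID="ts1" TIME_VALUE="1200"/>
    <TIME_SLOT TIME_SLOT_ID="ts2" TIME_VALUE="3400"/>
  </TIME_ORDER>
  <TIER TIER_ID="head_movement">
    <ANNOTATION>
      <ALIGNABLE_ANNOTATION ANNOTATION_ID="a1"
          TIME_SLOT_REF1="ts1" TIME_SLOT_REF2="ts2">
        <ANNOTATION_VALUE>nod</ANNOTATION_VALUE>
      </ALIGNABLE_ANNOTATION>
    </ANNOTATION>
  </TIER>
</ANNOTATION_DOCUMENT>"""

root = ET.fromstring(eaf)

# Map time-slot IDs to their millisecond values.
slots = {ts.get("TIME_SLOT_ID"): int(ts.get("TIME_VALUE"))
         for ts in root.iter("TIME_SLOT")}

# Collect (tier, label, start_ms, end_ms) for every aligned annotation.
events = []
for tier in root.iter("TIER"):
    for ann in tier.iter("ALIGNABLE_ANNOTATION"):
        events.append((
            tier.get("TIER_ID"),
            ann.findtext("ANNOTATION_VALUE"),
            slots[ann.get("TIME_SLOT_REF1")],
            slots[ann.get("TIME_SLOT_REF2")],
        ))

print(events)  # [('head_movement', 'nod', 1200, 3400)]
```

Tables like this (tier, label, start, end) are the usual starting point for counting how often each social signal occurs per condition.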

