Computationally Modeling Joint Action

In human social interaction, neurotypical people not only draw on rich knowledge of social context and convention, but also "mindread" their conversation partners: they infer and predict meaning, attribute intentionality, share attention, and empathize with one another (Baron-Cohen 1997).

To realize the vision of socially intelligent machines, we computationally model human-human and human-robot dyadic and group interaction along dimensions including rapport, synchrony, and mimicry, in both naturalistic and laboratory contexts. Several of our current projects aim to automatically identify synchronous actions in stationary, mobile, and real-time systems, as sketched below.
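To make the synchrony-detection idea concrete, the following is a minimal illustrative sketch, not our published method: it scores pairwise synchrony between two motion time series by sliding a window and taking the peak lagged correlation within each window. The function names, parameters, and toy signals here are all hypothetical.

```python
import numpy as np

def max_lagged_correlation(x, y, max_lag):
    """Return the largest-magnitude Pearson correlation between x and y
    over integer lags in [-max_lag, max_lag], plus the lag achieving it."""
    best_r, best_lag = 0.0, 0
    for lag in range(-max_lag, max_lag + 1):
        if lag < 0:
            a, b = x[:lag], y[-lag:]   # shift y earlier relative to x
        elif lag > 0:
            a, b = x[lag:], y[:-lag]   # shift y later relative to x
        else:
            a, b = x, y
        if len(a) < 2:
            continue
        r = np.corrcoef(a, b)[0, 1]
        if abs(r) > abs(best_r):
            best_r, best_lag = r, lag
    return best_r, best_lag

def windowed_synchrony(x, y, window, step, max_lag):
    """Slide a window over two equal-length motion signals and score each
    window by its peak lagged correlation -- a crude synchrony estimate."""
    scores = []
    for start in range(0, len(x) - window + 1, step):
        r, lag = max_lagged_correlation(x[start:start + window],
                                        y[start:start + window], max_lag)
        scores.append((start, r, lag))
    return scores

# Toy example: two noisy sinusoids, the second delayed by 5 samples.
t = np.arange(400)
rng = np.random.default_rng(0)
a = np.sin(2 * np.pi * t / 50) + 0.1 * rng.standard_normal(t.size)
b = np.roll(a, 5) + 0.1 * rng.standard_normal(t.size)
for start, r, lag in windowed_synchrony(a, b, window=100, step=100, max_lag=10):
    print(f"t={start:3d}: peak r={r:+.2f} at lag {lag:+d}")
```

In practice, lag-tolerant windowed scoring like this is one simple way to flag near-synchronous movement without requiring exact simultaneity; our published work uses event-based measures of group synchrony rather than this raw-signal correlation.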

Selected Publications:

Iqbal, T., Gonzales, M., and Riek, L.D. (2014) "A Model for Time-Synchronized Sensing and Motion to Support Human-Robot Fluency". In Proc. of the 9th ACM/IEEE International Conference on Human-Robot Interaction (HRI) Workshop on Timing in HRI.

Iqbal, T. and Riek, L.D. (2014) "Assessing Group Synchrony During a Rhythmic Social Activity: A Systemic Approach". In Proc. of the 6th Conference of the International Society for Gesture Studies (ISGS).