Computationally Modeling Joint Action

As robots become more commonplace in human social environments, it is important that they are able to understand the activities around them and learn how to work well with human collaborators. We are working on ways to model joint action, which is coordinated behavior between two or more people.

We computationally model both human-human and human-robot group interaction across multiple naturalistic contexts, including psychomotor synchrony, clinical teamwork, and face-to-face interaction. Several of our current projects aim to automatically identify synchronous actions, and to use that information to inform robot behavior, in stationary, mobile, and real-time systems; a rough sketch of the underlying idea appears below.
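To give a concrete, if simplified, sense of what "identifying synchronous actions" can involve, the sketch below scores how closely two motion signals (e.g., per-frame body movement for two people) track each other using windowed, lag-tolerant cross-correlation. This is purely illustrative and is not the measure used in the publications below; the function name and parameters are ours.

```python
# Toy synchrony score for two 1-D motion signals sampled at the same rate.
# Illustrative only -- not the measure from the publications listed below.
import numpy as np

def windowed_synchrony(x, y, window=60, max_lag=10):
    """Return a score in [0, 1]: higher means the signals move together.

    window  : window length in samples (e.g., ~2 s of 30 Hz data)
    max_lag : largest lead/lag (in samples) still counted as "together"
    """
    x, y = np.asarray(x, float), np.asarray(y, float)
    scores = []
    for start in range(0, len(x) - window + 1, window):
        xw = x[start:start + window]
        yw = y[start:start + window]
        xw = (xw - xw.mean()) / (xw.std() + 1e-9)
        yw = (yw - yw.mean()) / (yw.std() + 1e-9)
        # Best normalized correlation over small leads/lags of y relative to x.
        best = max(
            np.dot(xw[max(0, -lag):window - max(0, lag)],
                   yw[max(0, lag):window - max(0, -lag)])
            / (window - abs(lag))
            for lag in range(-max_lag, max_lag + 1)
        )
        scores.append(best)
    # Map the mean correlation from [-1, 1] to [0, 1] for readability.
    return float((np.mean(scores) + 1.0) / 2.0)

if __name__ == "__main__":
    t = np.linspace(0, 10, 300)
    a = np.sin(2 * np.pi * t)                           # person A, 1 Hz swing
    b = np.sin(2 * np.pi * t + 0.3)                     # person B, slight lag
    c = np.random.default_rng(0).standard_normal(300)   # unrelated motion
    print(windowed_synchrony(a, b))  # close to 1: moving together
    print(windowed_synchrony(a, c))  # markedly lower: no consistent coupling
```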

Selected Publications:

Iqbal, T., Gonzales, M.J., and Riek, L.D. (2014). "Mobile Robots and Marching Humans: Measuring Synchronous Joint Action While in Motion". In Proceedings of the AAAI Fall Symposium on Artificial Intelligence in Human-Robot Interaction (AI-HRI).

Iqbal, T., Gonzales, M.J., and Riek, L.D. (2014). "A Model for Time-Synchronized Sensing and Motion to Support Human-Robot Fluency". In Proceedings of the 9th ACM/IEEE International Conference on Human-Robot Interaction (HRI) Workshop on Timing in HRI. [pdf]

Iqbal, T. and Riek, L.D. (2014). "Role Distribution in Synchronous Human-Robot Joint Action”.23rd IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN), Towards a Framework for Joint Action Workshop. [pdf]

Iqbal, T. and Riek, L.D. (2014) "Assessing Group Synchrony During a Rhythmic Social Activity: A Systemic Approach". In Proc. of the 6th Conference of the International Society for Gesture Studies (ISGS).