Workshop on Empathic Computing @ IEEE VR’22

The goal of this workshop is to present important research topics in Empathic Computing, collaborative AR/VR, and related areas, and to identify key directions for future research. We hope it will help grow the community of researchers interested in Empathic Computing.

Overview and Description
Empathic Computing explores how technology can create deeper shared understanding or empathy between people [1]. Empathic Computing systems combine natural collaboration, capture of the user's experience and surroundings, and implicit understanding of user emotion and context.

Augmented Reality (AR) and Virtual Reality (VR) technologies support collaboration in three-dimensional environments [8] and offer exciting prospects for Empathic Computing. They enable us to create rich systems that make it easier to share one’s context, environment, and emotional state with another person. For example, we can recreate a user’s environment [3] and capture communication cues like eye gaze [6], gestures, body movement, and physiological data [7].

This workshop explores key aspects of Empathic Computing, including: (1) novel collaborative AR/VR experiences, (2) communication models, (3) use of physiological cues, and (4) methods for sharing the physical world.

(1) Novel collaborative AR/VR experiences
Recent advances in AR/VR support new types of collaborative experiences beyond those possible through traditional media. For example, hybrid experiences allow an AR user to interact with remote collaborators in VR who are immersed in a reconstruction of the AR user's space [2]. Researchers are invited to submit examples of new collaborative AR/VR experiences that include novel research on sharing communication cues or supporting interaction that is impossible in the real world (such as sharing bodies or simultaneous collaboration among several people).

(2) Communication models
Despite the potential for enhancing collaboration, there is relatively little research on communication models for collaborative AR/VR systems [4] and how these models can be used to predict the impact of technology on communication [5]. Researchers are invited to submit user studies that evaluate systems based on communication models, or to outline communication models specifically for AR/VR.

(3) Use of physiological cues
A third important area is the use of sensors to provide physiological cues that enhance collaborative experiences. Recent head-mounted displays include sensors such as gaze and gesture tracking, heart rate, GSR, and EEG sensors. These can help enhance shared understanding by capturing the user's emotional and cognitive state. For example, [2] enhances a sense of connection by sharing heart rate information, and [6] shows how sharing gaze cues in AR/VR improves collaboration. Researchers are invited to submit examples of using physiological sensors to share emotional and cognitive cues that enhance collaboration.

(4) Methods for sharing the physical world
Finally, advances in computer vision technology, depth, and 360-degree cameras make it possible to capture and recreate a user’s physical environment for remote collaborators. Researchers are invited to submit examples of how scene capture and 3D reconstruction can enhance collaboration and a sense of presence.

This workshop will provide an opportunity for academic and industry researchers to present their latest work or research in progress as short papers; authors of the best papers will also be invited to submit expanded versions for a special issue of the MTI Journal. In addition, part of the workshop will be devoted to a discussion of important issues in Empathic Computing, with the goal of producing a substantial review paper summarizing the grand challenges of the field.

The workshop will cover a range of different topics such as:
• Communication models for collaborative AR/VR experiences
• Use of physiological sensors in collaborative AR/VR experiences
• Recognizing and sharing emotion in collaborative AR/VR experiences
• Techniques for capturing and sharing real spaces
• Evaluation methods for collaborative systems
• New types of collaborative AR/VR experiences

The workshop format will include a variety of activities, including several keynote speakers, presentations of selected position papers, and time for a moderated discussion. A call for papers will be issued to attract position papers for the workshop. These papers will be 4-8 pages in length (TVCG format) and will provide an opportunity to present work in progress or results from ongoing research.

Important Dates
o Submission Deadline: January 20, 2022 (extended)
o Notification Deadline: January 22, 2022
o Camera-ready Deadline: January 28, 2022
o Workshop Date: TBA

Submission Link

Workshop organizers:
Mark Billinghurst (University of Auckland)
Arindam Dey (University of Queensland)
Janet Johnson (University of California San Diego)

[1] Piumsomboon, T., Lee, Y., Lee, G. A., Dey, A., & Billinghurst, M. (2017, June). Empathic mixed reality: Sharing what you feel and interacting with what you see. In 2017 International Symposium on Ubiquitous Virtual Reality (ISUVR) (pp. 38-41). IEEE.
[2] Dey, A., Chen, H., Zhuang, C., Billinghurst, M., & Lindeman, R. W. (2018, October). Effects of sharing real-time multi-sensory heart rate feedback in different immersive collaborative virtual environments. In 2018 IEEE International Symposium on Mixed and Augmented Reality (ISMAR) (pp. 165-173). IEEE.
[3] Zillner, J., Mendez, E., & Wagner, D. (2018, October). Augmented reality remote collaboration with dense reconstruction. In 2018 IEEE International Symposium on Mixed and Augmented Reality Adjunct (ISMAR-Adjunct) (pp. 38-39). IEEE.
[4] Dey, A., Billinghurst, M., Lindeman, R. W., & Swan, J. (2018). A systematic review of 10 years of augmented reality usability studies: 2005 to 2014. Frontiers in Robotics and AI, 5, 37.
[5] Drago, E. (2015). The effect of technology on face-to-face communication. Elon Journal of Undergraduate Research in Communications, 6(1).
[6] Piumsomboon, T., Dey, A., Ens, B., Lee, G., & Billinghurst, M. (2019). The effects of sharing awareness cues in collaborative mixed reality. Frontiers in Robotics and AI, 6, 5.
[7] Dey, A., Piumsomboon, T., Lee, Y., & Billinghurst, M. (2017, May). Effects of sharing physiological states of players in a collaborative virtual reality gameplay. In Proceedings of the 2017 CHI conference on human factors in computing systems (pp. 4045-4056).
[8] Piumsomboon, T., Lee, G. A., Hart, J. D., Ens, B., Lindeman, R. W., Thomas, B. H., & Billinghurst, M. (2018, April). Mini-me: An adaptive avatar for mixed reality remote collaboration. In Proceedings of the 2018 CHI conference on human factors in computing systems (pp. 1-13).