Workshop on Empathic Computing @ IEEE VR’22

The goal of this workshop is to present important research topics in Empathic Computing, collaborative AR/VR, and related areas, and to identify key directions for future research. We hope it will help grow the community of researchers interested in Empathic Computing.

Program Overview (all times are in NZDT)

12 March 2022 (Saturday)

11:00 AM – Opening Remarks

11:10 AM – Keynote Talk: Exploring cerebral and bodily synergies in shared virtual environments by Anatole Lécuyer

In this talk, we will discuss how to promote cerebral and bodily synergies in shared virtual environments. We will first consider the technological prerequisites, showing how related software and hardware technologies, including physiological computing and brain-computer interfaces, have progressively matured, and how important it is to share such research tools as open-source platforms within the VR community. Then we will present several representative and thought-provoking collaborative setups illustrating novel kinds of cerebral and/or bodily collaboration and shedding light on the potential future of shared virtual environments.

12:00 PM – Paper Presentations

1. Exploring empathy with Digital Humans
Kate Loveys; Mark Sagar; Mark Billinghurst; Nastaran Saffaryazdi; Elizabeth Broadbent
2. Conceptual Design of Emotional and Pain Expressions of a Virtual Patient in a Virtual Reality Training for Paramedics
Guillermo Carbonell; Jonas Schild
3. Training Empathic Skills in Virtual Reality: A Scoping Review
Lynda Joy Gerry; Mark Billinghurst; Elizabeth Broadbent

01:00 PM – Break

01:10 PM – Paper Presentations

4. Empathy building ‘in the wild’ – a Reflection on Avoidance of the Emotional Engagement
Magdalena Igras-Cybulska; Artur Cybulski; Damian Gałuszka; Jan Smolarczyk
5. Designing and Implementing Individualized VR for Supporting Depression
Ilona Halim; Nilufar Baghaei; Lehan Stemmet; Mark Billinghurst; Richard Porter
6. Effects of Heart Rate Feedback on an Asymmetric Platform using Augmented Reality and Laptop
Arindam Dey; Yufei Cao; Chelsea Dobbins

02:10 PM – Break

02:20 PM – Group activities

03:50 PM – Closing Remarks

Keynote Speaker: Dr Anatole Lécuyer

Anatole Lécuyer is a Senior Researcher and Head of the Hybrid research team at Inria, the French National Institute for Research in Computer Science and Control, in Rennes, France. His research interests include virtual reality, haptic interaction, 3D user interfaces, and brain-computer interfaces. He regularly serves as an expert in Virtual Reality and BCI for public bodies such as the European Commission (EC), the European Research Council (ERC), and the French National Research Agency (ANR). He is currently an Associate Editor of the journals "IEEE Transactions on Visualization and Computer Graphics", "Frontiers in Virtual Reality", and "Presence". He was Program Chair of the IEEE Virtual Reality Conference (2015-2016) and General Chair of the IEEE Symposium on Mixed and Augmented Reality (2017) and the IEEE Symposium on 3D User Interfaces (2012-2013). He is the author or co-author of more than 200 scientific publications. Anatole Lécuyer received the Inria-French Academy of Sciences Young Researcher Prize in 2013 and the IEEE VGTC Technical Achievement Award in Virtual/Augmented Reality in 2019.

Overview and Description
Empathic Computing explores how technology can create deeper shared understanding or empathy between people [1]. Empathic Computing systems include a combination of natural collaboration, capturing user experience and their surroundings, and implicit understanding of user emotion and context.

Augmented Reality (AR) and Virtual Reality (VR) technologies support collaboration in three-dimensional environments [8] and offer exciting prospects for Empathic Computing. They enable us to create rich systems that make it easier to share one’s context, environment, and emotional state with another person. For example, we can recreate a user’s environment [3] and capture communication cues like eye gaze [6], gestures, body movement, and physiological data [7].

This workshop explores key aspects of Empathic Computing, including: (1) novel collaborative AR/VR experiences, (2) communication models, (3) use of physiological cues, and (4) methods for sharing the physical world.

(1) Novel collaborative AR/VR experiences
Recent advances in AR/VR support new types of collaborative experiences compared to those experienced through traditional media. For example, hybrid experiences allow an AR user to interact with remote collaborators in VR who are immersed in a reconstruction of their space [2]. Researchers are invited to submit examples of new collaborative AR/VR experiences that include novel research on sharing communication cues or supporting interaction that is impossible in the real world (like sharing bodies or simultaneous collaboration between several people).

(2) Communication models
Despite the potential for enhancing collaboration, there is relatively little research on communication models for collaborative AR/VR systems [4], or on how such models can be used to predict the impact of technology on communication [5]. Researchers are invited to submit user studies that evaluate systems based on communication models, or papers that outline communication models specifically for AR/VR.

(3) Use of physiological cues
A third important area is the use of sensors to provide physiological cues that enhance collaborative experiences. Recent head-mounted displays include sensors such as gaze and gesture tracking, heart rate, GSR, and EEG sensors. These can help enhance shared understanding by capturing the user's emotional and cognitive state. For example, [2] enhances connection by sharing heart rate information, and [6] shows how sharing gaze cues in AR/VR improves collaboration. Researchers are invited to submit examples of using physiological sensors to share emotional and cognitive cues that enhance collaboration.

(4) Methods for sharing the physical world
Finally, advances in computer vision, depth cameras, and 360-degree cameras make it possible to capture and recreate a user's physical environment for remote collaborators. Researchers are invited to submit examples of how scene capture and 3D reconstruction can enhance collaboration and a sense of presence.

This workshop will provide an opportunity for academic and industry researchers to present their latest work or research in progress. Contributions will be presented as short papers, and the best of these will be invited for submission in expanded form to a special issue of the MTI Journal. In addition, part of the workshop will be devoted to a discussion of the important issues in Empathic Computing, with the goal of producing a substantial review paper summarizing the grand challenges of the field.

The workshop will cover a range of different topics such as:
• Communication models for collaborative AR/VR experiences
• Use of physiological sensors in collaborative AR/VR experiences
• Recognizing and sharing emotion in collaborative AR/VR experiences
• Techniques for capturing and sharing real spaces
• Evaluation methods for collaborative systems
• New types of collaborative AR/VR experiences

The workshop format will include a variety of activities, including keynote talks, presentations of selected position papers, and time for a moderated discussion. A call for papers will be used to attract position papers for the workshop. These papers will be 4-8 pages in length (TVCG format) and will provide an opportunity to present work in progress or results from ongoing research.

Important Dates
o Submission Deadline: January 20, 2022 (extended)
o Notification Deadline: January 22, 2022
o Camera-ready Deadline: January 28, 2022
o Workshop Date: March 12, 2022 (Saturday) 11:00 AM – 04:00 PM (NZDT, UTC+13)

Workshop organizers:
Mark Billinghurst (University of Auckland)
Arindam Dey (University of Queensland)
Janet Johnson (University of California San Diego)

[1] Piumsomboon, T., Lee, Y., Lee, G. A., Dey, A., & Billinghurst, M. (2017, June). Empathic mixed reality: Sharing what you feel and interacting with what you see. In 2017 International Symposium on Ubiquitous Virtual Reality (ISUVR) (pp. 38-41). IEEE.
[2] Dey, A., Chen, H., Zhuang, C., Billinghurst, M., & Lindeman, R. W. (2018, October). Effects of sharing real-time multi-sensory heart rate feedback in different immersive collaborative virtual environments. In 2018 IEEE International Symposium on Mixed and Augmented Reality (ISMAR) (pp. 165-173). IEEE.
[3] Zillner, J., Mendez, E., & Wagner, D. (2018, October). Augmented reality remote collaboration with dense reconstruction. In 2018 IEEE International Symposium on Mixed and Augmented Reality Adjunct (ISMAR-Adjunct) (pp. 38-39). IEEE.
[4] Dey, A., Billinghurst, M., Lindeman, R. W., & Swan, J. (2018). A systematic review of 10 years of augmented reality usability studies: 2005 to 2014. Frontiers in Robotics and AI, 5, 37.
[5] Drago, E. (2015). The effect of technology on face-to-face communication. Elon Journal of Undergraduate Research in Communications, 6(1).
[6] Piumsomboon, T., Dey, A., Ens, B., Lee, G., & Billinghurst, M. (2019). The effects of sharing awareness cues in collaborative mixed reality. Frontiers in Robotics and AI, 6, 5.
[7] Dey, A., Piumsomboon, T., Lee, Y., & Billinghurst, M. (2017, May). Effects of sharing physiological states of players in a collaborative virtual reality gameplay. In Proceedings of the 2017 CHI conference on human factors in computing systems (pp. 4045-4056).
[8] Piumsomboon, T., Lee, G. A., Hart, J. D., Ens, B., Lindeman, R. W., Thomas, B. H., & Billinghurst, M. (2018, April). Mini-me: An adaptive avatar for mixed reality remote collaboration. In Proceedings of the 2018 CHI conference on human factors in computing systems (pp. 1-13).