Empathic Computing is a research field that aims to use technology to create deeper shared understanding, or empathy, between people. At the same time, Mixed Reality (MR) technology provides an immersive experience that can make an ideal interface for collaboration. In this paper, we present some of our research into how MR technology can be applied to create Empathic Computing experiences. This includes exploring how to share gaze during remote collaboration between Augmented Reality (AR) and Virtual Reality (VR) environments, using physiological signals to enhance collaborative VR, and supporting interaction through eye gaze in VR. Early outcomes indicate that designing collaborative interfaces to enhance empathy between people may also benefit the personal experience of the individual interacting with the interface.
Interfaces for collaborative tasks, such as multiplayer games, can enable more effective and enjoyable collaboration. However, in these systems the emotional states of the users are often not communicated well because the collaborators are remote from one another. In this paper, we investigate the effects of showing the emotional state of one collaborator to the other during an immersive Virtual Reality (VR) gameplay experience. We created two collaborative immersive VR games that display the real-time heart rate of one player to the other. The two games were designed to elicit different emotions, one joyous and the other scary. We tested the effects of visualizing heart-rate feedback in comparison with conditions where such feedback was absent. The games had significant main effects on the overall emotional experience.
According to previous research, head-mounted displays (HMDs) and head-worn cameras (HWCs) are useful for remote collaboration. These systems can be especially helpful for remote assistance on physical tasks, where a remote expert can see the workspace of the local user and provide feedback. However, an HWC often has a wide field of view, so it may be difficult to know exactly where the local user is looking. In this chapter we explore how head-mounted eye tracking can be used to convey gaze cues to a remote collaborator. We describe two prototypes that integrate an eye tracker with an HWC and a see-through HMD, and report results from user studies conducted with these systems. Overall, we found that showing gaze cues on a shared video appears to be better than providing the video on its own, and that combining gaze and pointing cues is the most effective interface for remote collaboration among the conditions tested. We also discuss the limitations of this work and present directions for future research.