Empathy Glasses

We have been developing a remote collaboration system with Empathy Glasses, a head-worn display designed to create a stronger feeling of empathy between remote collaborators. To do this, we combined a head-mounted see-through display with a facial expression recognition system, a heart rate sensor, and an eye tracker. The goal is to enable a remote person to see and hear from another person’s perspective and to understand how they are feeling. In this way, the system shares non-verbal cues that could help increase empathy between remote collaborators.
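
As a concrete illustration of this architecture, the sketch below shows the kind of non-verbal cue packet such a system might stream from the local worker's glasses to the remote helper alongside the live video. This is a minimal sketch in Python; all names and fields (CuePacket, encode_cues, the expression labels) are illustrative assumptions, not the project's actual protocol. A companion sketch after the publication list shows how a shared gaze point might be drawn onto the helper's video view.

    # Illustrative sketch (assumed names/fields): one packet of non-verbal
    # cues, bundled as JSON to travel alongside the shared video stream.
    import json
    import time
    from dataclasses import dataclass, asdict

    @dataclass
    class CuePacket:
        timestamp: float      # capture time, seconds since epoch
        gaze_x: float         # gaze point, normalized [0, 1] in the video frame
        gaze_y: float
        expression: str       # facial expression classifier output, e.g. "smile"
        heart_rate_bpm: int   # latest reading from the heart rate sensor

    def encode_cues(gaze, expression, heart_rate_bpm):
        """Bundle the latest sensor readings into one JSON message."""
        packet = CuePacket(
            timestamp=time.time(),
            gaze_x=gaze[0],
            gaze_y=gaze[1],
            expression=expression,
            heart_rate_bpm=heart_rate_bpm,
        )
        return json.dumps(asdict(packet))

    # Example: one packet as it would travel to the remote helper.
    print(encode_cues(gaze=(0.42, 0.57), expression="smile", heart_rate_bpm=72))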

Publications

  • Improving Collaboration in Augmented Video Conference using Mutually Shared Gaze
    Gun Lee, Seungwon Kim, Youngho Lee, Arindam Dey, Thammathip Piumsomboon, Mitchell Norman and Mark Billinghurst

    Gun Lee, Seungwon Kim, Youngho Lee, Arindam Dey, Thammathip Piumsomboon, Mitchell Norman and Mark Billinghurst. 2017. Improving Collaboration in Augmented Video Conference using Mutually Shared Gaze. In Proceedings of ICAT-EGVE 2017 - International Conference on Artificial Reality and Telexistence and Eurographics Symposium on Virtual Environments, pp. 197-204. https://doi.org/10.2312/egve.20171359

    @inproceedings{egve.20171359,
    booktitle = {ICAT-EGVE 2017 - International Conference on Artificial Reality and Telexistence and Eurographics Symposium on Virtual Environments},
    editor = {Robert W. Lindeman and Gerd Bruder and Daisuke Iwai},
    title = {{Improving Collaboration in Augmented Video Conference using Mutually Shared Gaze}},
    author = {Lee, Gun A. and Kim, Seungwon and Lee, Youngho and Dey, Arindam and Piumsomboon, Thammathip and Norman, Mitchell and Billinghurst, Mark},
    year = {2017},
    publisher = {The Eurographics Association},
    ISSN = {1727-530X},
    ISBN = {978-3-03868-038-3},
    DOI = {10.2312/egve.20171359}
    }
    To improve remote collaboration in video conferencing systems, researchers have been investigating augmenting visual cues onto a shared live video stream. In such systems, a person wearing a head-mounted display (HMD) and camera can share her view of the surrounding real world with a remote collaborator to receive assistance on a real-world task. While this concept of augmented video conferencing (AVC) has been actively investigated, there has been little research on how sharing gaze cues might affect the collaboration in video conferencing. This paper investigates how sharing gaze in both directions between a local worker and remote helper in an AVC system affects the collaboration and communication. Using a prototype AVC system that shares the eye gaze of both users, we conducted a user study that compares four conditions with different combinations of eye gaze sharing between the two users. The results showed that sharing each other’s gaze significantly improved collaboration and communication.
  • Do You See What I See? The Effect of Gaze Tracking on Task Space Remote Collaboration
    Kunal Gupta, Gun A. Lee and Mark Billinghurst

    Kunal Gupta, Gun A. Lee and Mark Billinghurst. 2016. Do You See What I See? The Effect of Gaze Tracking on Task Space Remote Collaboration. IEEE Transactions on Visualization and Computer Graphics, Vol. 22, No. 11, pp. 2413-2422. https://doi.org/10.1109/TVCG.2016.2593778

    @ARTICLE{7523400,
    author={K. Gupta and G. A. Lee and M. Billinghurst},
    journal={IEEE Transactions on Visualization and Computer Graphics},
    title={Do You See What I See? The Effect of Gaze Tracking on Task Space Remote Collaboration},
    year={2016},
    volume={22},
    number={11},
    pages={2413-2422},
    keywords={cameras;gaze tracking;helmet mounted displays;eye-tracking camera;gaze tracking;head-mounted camera;head-mounted display;remote collaboration;task space remote collaboration;virtual gaze information;virtual pointer;wearable interface;Cameras;Collaboration;Computers;Gaze tracking;Head;Prototypes;Teleconferencing;Computer conferencing;Computer-supported collaborative work;teleconferencing;videoconferencing},
    doi={10.1109/TVCG.2016.2593778},
    ISSN={1077-2626},
    month={Nov},
    }
    We present results from research exploring the effect of sharing virtual gaze and pointing cues in a wearable interface for remote collaboration. A local worker wears a head-mounted camera, an eye-tracking camera, and a head-mounted display, and shares video and virtual gaze information with a remote helper. The remote helper can provide feedback using a virtual pointer on the live video view. The prototype system was evaluated with a formal user study. Comparing four conditions, (1) NONE (no cue), (2) POINTER, (3) EYE-TRACKER, and (4) BOTH (both pointer and eye-tracker cues), we observed that task completion performance was best in the BOTH condition, significantly better than with the POINTER or EYE-TRACKER cue alone. The use of eye-tracking and a pointer also significantly improved the co-presence felt between the users. We discuss the implications of this research and the limitations of the developed system that could be improved in future work.
  • A Remote Collaboration System with Empathy Glasses
    Youngho Lee, Katsutoshi Masai, Kai Kunze, Maki Sugimoto and Mark Billinghurst

    Youngho Lee, Katsutoshi Masai, Kai Kunze, Maki Sugimoto and Mark Billinghurst. 2016. A Remote Collaboration System with Empathy Glasses. In 2016 IEEE International Symposium on Mixed and Augmented Reality (ISMAR-Adjunct), Merida, pp. 342-343. https://doi.org/10.1109/ISMAR-Adjunct.2016.0112

    @INPROCEEDINGS{7836533,
    author = {Y. Lee and K. Masai and K. Kunze and M. Sugimoto and M. Billinghurst},
    booktitle = {2016 IEEE International Symposium on Mixed and Augmented Reality (ISMAR-Adjunct)},
    title = {A Remote Collaboration System with Empathy Glasses},
    year = {2016},
    pages = {342-343},
    keywords = {Collaboration;Glass;Heart rate;Biomedical monitoring;Cameras;Hardware;Computers},
    doi = {10.1109/ISMAR-Adjunct.2016.0112},
    url = {https://doi.org/10.1109/ISMAR-Adjunct.2016.0112},
    month = {Sept.}
    }
    In this paper, we describe a demonstration of a remote collaboration system using Empathy Glasses. Using our system, a local worker can share a view of their environment with a remote helper, as well as their gaze, facial expressions, and physiological signals. The remote helper can send back visual cues via a see-through head-mounted display to help the local worker perform better on a real-world task. The system also provides some indication of the remote user's facial expression using face tracking technology.
  • Empathy Glasses
    Katsutoshi Masai, Kai Kunze, Maki Sugimoto, and Mark Billinghurst

    Katsutoshi Masai, Kai Kunze, Maki Sugimoto, and Mark Billinghurst. 2016. Empathy Glasses. In Proceedings of the 2016 CHI Conference Extended Abstracts on Human Factors in Computing Systems (CHI EA '16). ACM, New York, NY, USA, 1257-1263. https://doi.org/10.1145/2851581.2892370

    @inproceedings{Masai:2016:EG:2851581.2892370,
    author = {Masai, Katsutoshi and Kunze, Kai and Sugimoto, Maki and Billinghurst, Mark},
    title = {Empathy Glasses},
    booktitle = {Proceedings of the 2016 CHI Conference Extended Abstracts on Human Factors in Computing Systems},
    series = {CHI EA '16},
    year = {2016},
    isbn = {978-1-4503-4082-3},
    location = {San Jose, California, USA},
    pages = {1257--1263},
    numpages = {7},
    url = {http://doi.acm.org/10.1145/2851581.2892370},
    doi = {10.1145/2851581.2892370},
    acmid = {2892370},
    publisher = {ACM},
    address = {New York, NY, USA},
    keywords = {emotional interface, facial expression, remote collaboration, wearables},
    }
    In this paper, we describe Empathy Glasses, a head-worn prototype designed to create an empathic connection between remote collaborators. The main novelty of our system is that it is the first to combine the following technologies together: (1) wearable facial expression capture hardware, (2) eye tracking, (3) a head-worn camera, and (4) a see-through head-mounted display, with a focus on remote collaboration. Using the system, a local user can send their information and a view of their environment to a remote helper, who can send back visual cues on the local user's see-through display to help them perform a real-world task. A pilot user study was conducted to explore how effective the Empathy Glasses were at supporting remote collaboration. We describe the implications that can be drawn from this user study.
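
Several of the papers above share the same core mechanism: one user's gaze point is transmitted as normalized coordinates and drawn as a cursor on the other user's live video view. Below is a minimal sketch of that overlay step using OpenCV and NumPy; the function name and the two differently colored cursors (one per user, as in the mutually shared gaze study) are assumptions for illustration, not the projects' actual code.

    # Illustrative sketch: draw shared gaze cursors onto a video frame.
    import cv2
    import numpy as np

    def overlay_gaze(frame, gaze_x, gaze_y, color=(0, 255, 0)):
        """Draw a ring at a gaze point given as normalized [0, 1] coordinates."""
        h, w = frame.shape[:2]
        center = (int(gaze_x * w), int(gaze_y * h))
        cv2.circle(frame, center, radius=20, color=color, thickness=2)
        return frame

    # Example with a synthetic frame; a real system would draw on each frame
    # of the camera stream shared by the local worker.
    frame = np.zeros((480, 640, 3), dtype=np.uint8)
    overlay_gaze(frame, 0.42, 0.57)               # local worker's gaze (green)
    overlay_gaze(frame, 0.60, 0.30, (255, 0, 0))  # remote helper's gaze (blue, BGR)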