Arindam Dey

Research Fellow

Dr. Arindam Dey is a Research Fellow at the Empathic Computing Laboratory. He completed his PhD at the University of South Australia in 2013, supervised by Prof. Christian Sandor and Prof. Bruce Thomas. Between completing his PhD and rejoining UniSA in 2016, Arindam held three postdoctoral positions, at James Cook University (Australia), Worcester Polytechnic Institute (USA), and the University of Tasmania (Australia). He also visited the Technical University of Munich (Germany) for a research internship and the Indian Institute of Technology, Kharagpur (India) for a summer internship during his B.Tech.

His primary research areas are Mixed Reality and Human-Computer Interaction. He has more than 20 publications (Google Scholar). He is a peer reviewer for many journals and conferences and has been part of the organizing committees of the following conferences:

  • OzCHI (2016): Research Demonstration Chair
  • IEEE Symposium on 3D User Interfaces (2016): Poster Chair
  • Australasian Conference on Artificial Life and Computational Intelligence (2015): International Program Committee Member
  • Asia Pacific Conference on Computer Human Interaction (2013): E-Publications Chair and Program Committee Member
  • IEEE International Symposium on Mixed and Augmented Reality (2013): Online Communications Chair
  • IEEE International Symposium on Mixed and Augmented Reality (2012): Student Volunteer Co-Chair

For more details about Arindam, please visit his personal website.

Email: arindam.dey@unisa.edu.au
Phone: +61 8 8302 3811

Projects

  • Empathy Glasses

    We have been developing a remote collaboration system built around Empathy Glasses, a head-worn display designed to create a stronger feeling of empathy between remote collaborators. To do this, we combined a head-mounted see-through display with a facial expression recognition system, a heart rate sensor, and an eye tracker. The goal is to enable a remote person to see and hear from another person's perspective and to understand how they are feeling. In this way, the system shares non-verbal cues that could help increase empathy between remote collaborators (see the first sketch after this list).

  • Empathy in Virtual Reality

    Virtual reality (VR) is an influential medium for triggering emotional changes in humans. However, there is little research on making users of VR interfaces aware of their own emotional state and, in collaborative interfaces, of one another's. In this project, through a series of system developments and user evaluations, we are investigating how physiological data such as heart rate, galvanic skin response, pupil dilation, and EEG can be used as a medium to communicate emotional states either to oneself (single-user interfaces) or to a collaborator (collaborative interfaces). The overarching goal is to make VR environments more empathetic and collaborators more aware of each other's emotional state (see the second sketch after this list).
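
As a rough illustration of the data the Empathy Glasses pipeline shares, the sketch below bundles one sample of non-verbal cues (expression label, heart rate, gaze point) for transmission to the remote collaborator. This is a minimal Python sketch; every class, field, and function name here is an assumption made for illustration, not the actual system's API.

    # Hypothetical sketch: bundling one sample of non-verbal cues so it can
    # be sent alongside the shared video stream. All names, fields, and units
    # are illustrative assumptions, not the actual system's API.
    import json
    import time
    from dataclasses import dataclass, asdict

    @dataclass
    class NonVerbalCues:
        timestamp: float        # seconds since the epoch
        expression: str         # label from the facial expression recognizer
        heart_rate_bpm: float   # reading from the heart rate sensor
        gaze_x: float           # normalized gaze point from the eye tracker
        gaze_y: float

    def encode_for_remote(cues: NonVerbalCues) -> bytes:
        """Serialize one cue sample for transmission to the remote side."""
        return json.dumps(asdict(cues)).encode("utf-8")

    if __name__ == "__main__":
        sample = NonVerbalCues(time.time(), "smile", 72.0, 0.48, 0.55)
        print(encode_for_remote(sample))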
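
The second sketch shows one possible way the physiological signals named above could be fused into a coarse arousal estimate for display to the user or a collaborator. The normalization ranges and the equal weighting are assumptions made purely for illustration; they are not the project's actual model.

    # Illustrative sketch: fusing physiological signals into a coarse 0..1
    # arousal score. The resting-to-peak ranges and equal weights are assumed.
    def arousal_score(heart_rate_bpm: float,
                      gsr_microsiemens: float,
                      pupil_diameter_mm: float) -> float:
        """Estimate arousal from heart rate, skin response, and pupil size."""
        def normalize(value: float, low: float, high: float) -> float:
            # Map an assumed resting-to-peak range onto [0, 1], clamped.
            return max(0.0, min(1.0, (value - low) / (high - low)))
        hr = normalize(heart_rate_bpm, 60.0, 120.0)
        gsr = normalize(gsr_microsiemens, 1.0, 20.0)
        pupil = normalize(pupil_diameter_mm, 2.0, 8.0)
        return (hr + gsr + pupil) / 3.0  # equal weights, an assumption

    if __name__ == "__main__":
        # Example reading: elevated heart rate and skin response.
        print(round(arousal_score(95.0, 8.5, 4.5), 2))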

Publications

  • A Systematic Review of 10 Years of Augmented Reality Usability Studies: 2005 to 2014
    Arindam Dey, Mark Billinghurst, Robert W. Lindeman, J. Edward Swan II

    Dey A, Billinghurst M, Lindeman RW and Swan JE II (2018) A Systematic Review of 10 Years of Augmented Reality Usability Studies: 2005 to 2014. Front. Robot. AI 5:37. doi: 10.3389/frobt.2018.00037

    @ARTICLE{10.3389/frobt.2018.00037,
    AUTHOR={Dey, Arindam and Billinghurst, Mark and Lindeman, Robert W. and Swan, J. Edward},
    TITLE={A Systematic Review of 10 Years of Augmented Reality Usability Studies: 2005 to 2014},
    JOURNAL={Frontiers in Robotics and AI},
    VOLUME={5},
    PAGES={37},
    YEAR={2018},
    URL={https://www.frontiersin.org/article/10.3389/frobt.2018.00037},
    DOI={10.3389/frobt.2018.00037},
    ISSN={2296-9144},
    }
    Augmented Reality (AR) interfaces have been studied extensively over the last few decades, with a growing number of user-based experiments. In this paper, we systematically review 10 years of the most influential AR user studies, from 2005 to 2014. A total of 291 papers with 369 individual user studies have been reviewed and classified based on their application areas. The primary contribution of the review is to present the broad landscape of user-based AR research, and to provide a high-level view of how that landscape has changed. We summarize the high-level contributions from each category of papers, and present examples of the most influential user studies. We also identify areas where there have been few user studies, and opportunities for future research. Among other things, we find that there is a growing trend toward handheld AR user studies, and that most studies are conducted in laboratory settings and do not involve pilot testing. This research will be useful for AR researchers who want to follow best practices in designing their own AR user studies.
  • He who hesitates is lost (... in thoughts over a robot)
    James Wen, Amanda Stewart, Mark Billinghurst, Arindam Dey, Chad Tossell, Victor Finomore

    James Wen, Amanda Stewart, Mark Billinghurst, Arindam Dey, Chad Tossell, and Victor Finomore. 2018. He who hesitates is lost (...in thoughts over a robot). In Proceedings of the Technology, Mind, and Society (TechMindSociety '18). ACM, New York, NY, USA, Article 43, 6 pages. DOI: https://doi.org/10.1145/3183654.3183703

    @inproceedings{Wen:2018:HHL:3183654.3183703,
    author = {Wen, James and Stewart, Amanda and Billinghurst, Mark and Dey, Arindam and Tossell, Chad and Finomore, Victor},
    title = {He Who Hesitates is Lost (...In Thoughts over a Robot)},
    booktitle = {Proceedings of the Technology, Mind, and Society},
    series = {TechMindSociety '18},
    year = {2018},
    isbn = {978-1-4503-5420-2},
    location = {Washington, DC, USA},
    pages = {43:1--43:6},
    articleno = {43},
    numpages = {6},
    url = {http://doi.acm.org/10.1145/3183654.3183703},
    doi = {10.1145/3183654.3183703},
    acmid = {3183703},
    publisher = {ACM},
    address = {New York, NY, USA},
    keywords = {Anthropomorphism, Empathy, Human Machine Team, Robotics, User Study},
    }
    In a team, the strong bonds that can form between teammates are often seen as critical for reaching peak performance. This perspective may need to be reconsidered, however, if some team members are autonomous robots since establishing bonds with fundamentally inanimate and expendable objects may prove counterproductive. Previous work has measured empathic responses towards robots as singular events at the conclusion of experimental sessions. As relationships extend over long periods of time, sustained empathic behavior towards robots would be of interest. In order to measure user actions that may vary over time and are affected by empathy towards a robot teammate, we created the TEAMMATE simulation system. Our findings suggest that inducing empathy through a back story narrative can significantly change participant decisions in actions that may have consequences for a robot companion over time. The results of our study can have strong implications for the overall performance of human machine teams.
  • Improving Collaboration in Augmented Video Conference using Mutually Shared Gaze
    Gun Lee, Seungwon Kim, Youngho Lee, Arindam Dey, Thammathip Piumsomboon, Mitchell Norman and Mark Billinghurst

    Gun Lee, Seungwon Kim, Youngho Lee, Arindam Dey, Thammathip Piumsomboon, Mitchell Norman and Mark Billinghurst. 2017. Improving Collaboration in Augmented Video Conference using Mutually Shared Gaze. In Proceedings of ICAT-EGVE 2017 - International Conference on Artificial Reality and Telexistence and Eurographics Symposium on Virtual Environments, pp. 197-204. http://dx.doi.org/10.2312/egve.20171359

    @inproceedings{egve.20171359,
    booktitle = {ICAT-EGVE 2017 - International Conference on Artificial Reality and Telexistence and Eurographics Symposium on Virtual Environments},
    editor = {Robert W. Lindeman and Gerd Bruder and Daisuke Iwai},
    title = {{Improving Collaboration in Augmented Video Conference using Mutually Shared Gaze}},
    author = {Lee, Gun A. and Kim, Seungwon and Lee, Youngho and Dey, Arindam and Piumsomboon, Thammathip and Norman, Mitchell and Billinghurst, Mark},
    year = {2017},
    publisher = {The Eurographics Association},
    ISSN = {1727-530X},
    ISBN = {978-3-03868-038-3},
    DOI = {10.2312/egve.20171359}
    }
    To improve remote collaboration in video conferencing systems, researchers have been investigating augmenting visual cues onto a shared live video stream. In such systems, a person wearing a head-mounted display (HMD) and camera can share her view of the surrounding real-world with a remote collaborator to receive assistance on a real-world task. While this concept of augmented video conferencing (AVC) has been actively investigated, there has been little research on how sharing gaze cues might affect the collaboration in video conferencing. This paper investigates how sharing gaze in both directions between a local worker and remote helper in an AVC system affects the collaboration and communication. Using a prototype AVC system that shares the eye gaze of both users, we conducted a user study that compares four conditions with different combinations of eye gaze sharing between the two users. The results showed that sharing each other’s gaze significantly improved collaboration and communication.