Huidong Bai

Research Fellow

Dr. Huidong Bai is a Research Fellow at the Empathic Computing Laboratory (ECL) within the Auckland Bioengineering Institute (ABI, University of Auckland). His research explores remote collaborative Mixed Reality (MR) interfaces with spatial scene reconstruction and segmentation, and integrates empathic sensing and computing into collaborative systems to enhance shared communication cues.

Before joining the ECL, he was a Postdoctoral Fellow at the Human Interface Technology Laboratory New Zealand (HIT Lab NZ, University of Canterbury), where he focused on multimodal natural interaction for mobile and wearable Augmented Reality (AR).

Dr. Bai received his Ph.D. from the HIT Lab NZ in 2016, supervised by Prof. Mark Billinghurst and Prof. Ramakrishnan Mukundan. During his Ph.D. he interned as a software engineer on the Vuforia team at Qualcomm in 2013, developing a mobile AR teleconferencing system, and he has been an engineering director at the start-up Envisage AR since 2015, developing industrial MR/AR applications.

Publications

  • Real-time visual representations for mobile mixed reality remote collaboration
    Gao, L., Bai, H., He, W., Billinghurst, M., & Lindeman, R. W.

    Gao, L., Bai, H., He, W., Billinghurst, M., & Lindeman, R. W. (2018, December). Real-time visual representations for mobile mixed reality remote collaboration. In SIGGRAPH Asia 2018 Virtual & Augmented Reality (p. 15). ACM.

    @inproceedings{gao2018real,
    title={Real-time visual representations for mobile mixed reality remote collaboration},
    author={Gao, Lei and Bai, Huidong and He, Weiping and Billinghurst, Mark and Lindeman, Robert W},
    booktitle={SIGGRAPH Asia 2018 Virtual \& Augmented Reality},
    pages={15},
    year={2018},
    organization={ACM}
    }
    In this study we present a Mixed Reality based mobile remote collaboration system that enables an expert to provide real-time assistance over a physical distance. Using Google ARCore position tracking, we integrate the keyframes captured by an external depth sensor attached to the mobile phone into a single 3D point-cloud data set that represents the local physical environment in the VR world. This captured local scene is then wirelessly streamed to the remote side for the expert to view in a mobile VR headset (HTC VIVE Focus). The remote expert can thus immerse himself/herself in the VR scene and provide guidance as if sharing the same work environment with the local worker. The remote guidance is in turn streamed back to the local side as AR cues overlaid on the local video see-through display. Our proposed mobile remote collaboration system lets a remote expert guide a local worker through physical tasks in a large-scale workspace from a distance in a more natural and efficient way, simulating the face-to-face co-working experience with Mixed Reality.
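    The pose-based fusion step described above can be illustrated with a short sketch. This is not the paper's implementation; it is a minimal, hypothetical example assuming depth keyframes already registered to the tracked camera, with made-up function names and NumPy as the only dependency.

    import numpy as np

    def depth_to_points(depth, fx, fy, cx, cy):
        """Back-project a depth image (in metres) into camera-space 3D points."""
        h, w = depth.shape
        u, v = np.meshgrid(np.arange(w), np.arange(h))
        x = (u - cx) * depth / fx
        y = (v - cy) * depth / fy
        pts = np.stack([x, y, depth], axis=-1).reshape(-1, 3)
        return pts[pts[:, 2] > 0]                      # drop zero-depth pixels

    def fuse_keyframe(cloud, depth, intrinsics, T_world_cam):
        """Transform one keyframe into world coordinates using the tracked
        camera pose (e.g. from ARCore) and append it to the shared cloud."""
        pts_cam = depth_to_points(depth, *intrinsics)
        pts_h = np.hstack([pts_cam, np.ones((len(pts_cam), 1))])
        pts_world = (T_world_cam @ pts_h.T).T[:, :3]
        return np.vstack([cloud, pts_world])

    # cloud = np.empty((0, 3))
    # for depth, pose in keyframes:                    # pose: 4x4 camera-to-world
    #     cloud = fuse_keyframe(cloud, depth, (fx, fy, cx, cy), pose)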
  • Filtering 3D Shared Surrounding Environments by Social Proximity in AR
    Nassani, A., Bai, H., Lee, G., Langlotz, T., Billinghurst, M., & Lindeman, R. W.

    Nassani, A., Bai, H., Lee, G., Langlotz, T., Billinghurst, M., & Lindeman, R. W. (2018, October). Filtering 3D Shared Surrounding Environments by Social Proximity in AR. In 2018 IEEE International Symposium on Mixed and Augmented Reality Adjunct (ISMAR-Adjunct) (pp. 123-124). IEEE.

    @inproceedings{nassani2018filtering,
    title={Filtering 3D Shared Surrounding Environments by Social Proximity in AR},
    author={Nassani, Alaeddin and Bai, Huidong and Lee, Gun and Langlotz, Tobias and Billinghurst, Mark and Lindeman, Robert W},
    booktitle={2018 IEEE International Symposium on Mixed and Augmented Reality Adjunct (ISMAR-Adjunct)},
    pages={123--124},
    year={2018},
    organization={IEEE}
    }
    In this poster, we explore the social sharing of surrounding environments on wearable Augmented Reality (AR) devices. In particular, we propose filtering the level of detail at which the surrounding environment is shared based on the social proximity between the viewer and the sharer. We tested the effect of this filter (varying levels of detail) on the sense of privacy from both the viewer's and the sharer's perspectives in a pilot study using HoloLens. We report on semi-structured questionnaire results and suggest future directions for the social sharing of surrounding environments.
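    As a toy illustration of proximity-based filtering (the actual detail levels are not specified in this abstract, so the categories and names below are hypothetical):

    # Hypothetical mapping from social proximity to the level of detail at
    # which the sharer's surroundings are streamed to a given viewer.
    LEVEL_BY_PROXIMITY = {
        "intimate": "full_3d_surroundings",     # closest contacts see everything
        "personal": "blurred_3d_surroundings",  # friends see obscured geometry
        "social": "avatar_only",                # acquaintances see no surroundings
    }

    def shared_detail(proximity: str) -> str:
        """Return the detail level to share, defaulting to the most private."""
        return LEVEL_BY_PROXIMITY.get(proximity, "avatar_only")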
  • Static local environment capturing and sharing for MR remote collaboration
    Gao, L., Bai, H., Lindeman, R., & Billinghurst, M.

    Gao, L., Bai, H., Lindeman, R., & Billinghurst, M. (2017, November). Static local environment capturing and sharing for MR remote collaboration. In SIGGRAPH Asia 2017 Mobile Graphics & Interactive Applications (p. 17). ACM.

    @inproceedings{gao2017static,
    title={Static local environment capturing and sharing for MR remote collaboration},
    author={Gao, Lei and Bai, Huidong and Lindeman, Rob and Billinghurst, Mark},
    booktitle={SIGGRAPH Asia 2017 Mobile Graphics \& Interactive Applications},
    pages={17},
    year={2017},
    organization={ACM}
    }
    We present a Mixed Reality (MR) system that captures the entire local physical work environment for remote collaboration in a large-scale workspace. By integrating the keyframes captured with an external depth sensor into a single 3D point-cloud data set, our system reconstructs the entire local physical workspace in the VR world. The remote helper can therefore observe the local scene independently of the local user's current head and camera position, and provide gesture guidance even before the local user looks at the target object. We conducted a pilot study to evaluate the usability of the system by comparing it with our previous oriented-view system, which only shares the current camera view together with real-time head-orientation data. Our results indicate that this entire-scene capturing and sharing system significantly increases the remote helper's spatial awareness of the local work environment, especially in a large-scale workspace, and drew an overwhelming user preference (80%) over the previous system.
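    A minimal sketch of the decoupled viewpoint idea, assuming the captured workspace is held as an Nx3 world-space point cloud and the helper's head pose is a 4x4 camera-to-world matrix (names are illustrative, not from the paper):

    import numpy as np

    def points_in_helper_view(cloud_world, T_world_helper):
        """Express the shared point cloud in the remote helper's view frame,
        so the helper can look around independently of the local camera."""
        T_helper_world = np.linalg.inv(T_world_helper)
        pts_h = np.hstack([cloud_world, np.ones((len(cloud_world), 1))])
        return (T_helper_world @ pts_h.T).T[:, :3]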
  • A User Study on Mixed Reality Remote Collaboration with Eye Gaze and Hand Gesture Sharing
    Huidong Bai, Prasanth Sasikumar, Jing Yang, Mark Billinghurst

    Huidong Bai, Prasanth Sasikumar, Jing Yang, and Mark Billinghurst. 2020. A User Study on Mixed Reality Remote Collaboration with Eye Gaze and Hand Gesture Sharing. In Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems (CHI ’20). Association for Computing Machinery, New York, NY, USA, 1–13. DOI:https://doi.org/10.1145/3313831.3376550

    @inproceedings{bai2020user,
    title={A User Study on Mixed Reality Remote Collaboration with Eye Gaze and Hand Gesture Sharing},
    author={Bai, Huidong and Sasikumar, Prasanth and Yang, Jing and Billinghurst, Mark},
    booktitle={Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems},
    pages={1--13},
    year={2020}
    }
    Supporting natural communication cues is critical for people to work together remotely and face-to-face. In this paper we present a Mixed Reality (MR) remote collaboration system that enables a local worker to share a live 3D panorama of his/her surroundings with a remote expert. The remote expert can also share task instructions back to the local worker using visual cues in addition to verbal communication. We conducted a user study to investigate how sharing augmented gaze and gesture cues from the remote expert to the local worker could affect the overall collaboration performance and user experience. We found that by combining gaze and gesture cues, our remote collaboration system could provide a significantly stronger sense of co-presence for both the local and remote users than using the gaze cue alone. The combined cues were also rated significantly higher than the gaze in terms of ease of conveying spatial actions.
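    A rough sketch of how such gaze and gesture cues might be serialised for streaming from the remote expert to the local worker (the message format is an assumption for illustration, not the system's actual protocol):

    import json
    import time

    def gaze_message(origin, direction):
        """Package the expert's eye-gaze ray, to be drawn as an AR pointer
        in the local worker's view."""
        return json.dumps({
            "type": "gaze",
            "t": time.time(),
            "origin": list(origin),        # 3D position in the shared frame
            "direction": list(direction),  # unit vector
        })

    def gesture_message(joints):
        """Package tracked hand-joint positions as a gesture cue."""
        return json.dumps({
            "type": "gesture",
            "t": time.time(),
            "joints": [list(p) for p in joints],
        })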
  • A Constrained Path Redirection for Passive Haptics
    Lili Wang, Zixiang Zhao, Xuefeng Yang, Huidong Bai, Amit Barde, Mark Billinghurst

    L. Wang, Z. Zhao, X. Yang, H. Bai, A. Barde and M. Billinghurst, "A Constrained Path Redirection for Passive Haptics," 2020 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW), Atlanta, GA, USA, 2020, pp. 651-652, doi: 10.1109/VRW50115.2020.00176.

    @inproceedings{wang2020constrained,
    title={A Constrained Path Redirection for Passive Haptics},
    author={Wang, Lili and Zhao, Zixiang and Yang, Xuefeng and Bai, Huidong and Barde, Amit and Billinghurst, Mark},
    booktitle={2020 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW)},
    pages={651--652},
    year={2020},
    organization={IEEE}
    }
    Navigation with passive haptic feedback can enhance users’ immersion in virtual environments. We propose a constrained path redirection method to provide users with corresponding haptic feedback at the right time and place. We quantified the practicality of VR exploration in a user study; the results show advantages over the steer-to-center method in terms of presence, and over Steinicke’s method in terms of matching errors and presence.
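    For context, redirection methods of this kind typically scale the user's real rotation or translation by a small gain so that the physical path can be steered toward a prop that supplies the passive haptics. A generic illustration of a rotation gain (not the constrained path redirection algorithm from the paper) is:

    def redirected_yaw(virtual_yaw, real_delta_yaw, gain):
        """Map a change in real head yaw to virtual yaw with a rotation gain;
        with gain > 1 the user turns less in the real room than they perceive
        in VR, which can steer them toward a physical prop."""
        return virtual_yaw + gain * real_delta_yaw

    # Gains are normally kept within perceptual detection thresholds so the
    # manipulation remains unnoticeable to the user.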