Pai Yun Suen

Research Fellow

Dr. Yun Suen Pai is currently a Research Fellow at the Empathic Computing Laboratory (ECL), University of Auckland, directed by Prof. Mark Billinghurst. He received his Master's degree in Engineering Science in 2015 from the University of Malaya, Malaysia. He then completed his PhD at the Keio University Graduate School of Media Design, Yokohama, Japan in 2018 (supervised by Prof. Kai Kunze), before continuing as a researcher there for an additional six months.

His research interests include the effects of augmented, virtual, and mixed reality on human perception, behavior, and physiological state. He has collaborated with several companies, including Airbus Malaysia, Fujitsu Design, NTT Media Intelligence Laboratory, Ignition Point, CSIRO, and Google, in research areas such as haptics in AR, vision augmentation, VR navigation, and the use of machine learning for novel input and interaction.

Publications:
Gupta, K., Hajika, R., Pai, Y. S., Duenser, A., Lochner, M., & Billinghurst, M. (2020, March). Measuring Human Trust in a Virtual Assistant using Physiological Sensing in Virtual Reality. In 2020 IEEE Conference on Virtual Reality and 3D User Interfaces (VR) (pp. 756-765). IEEE.

Hajika, R., Gupta, K., Sasikumar, P., & Pai, Y. S. (2019, November). HyperDrum: Interactive Synchronous Drumming in Virtual Reality using Everyday Objects. In SIGGRAPH Asia 2019 XR (pp. 15-16). ACM.

Gupta, K., Hajika, R., Pai, Y. S., Duenser, A., Lochner, M., & Billinghurst, M. (2019, November). In AI We Trust: Investigating the Relationship between Biosignals, Trust and Cognitive Load in VR. In 25th ACM Symposium on Virtual Reality Software and Technology (p. 33). ACM.

Publications

  • Measuring Human Trust in a Virtual Assistant using Physiological Sensing in Virtual Reality
    Kunal Gupta, Ryo Hajika, Yun Suen Pai, Andreas Duenser, Martin Lochner, Mark Billinghurst

    K. Gupta, R. Hajika, Y. S. Pai, A. Duenser, M. Lochner and M. Billinghurst, "Measuring Human Trust in a Virtual Assistant using Physiological Sensing in Virtual Reality," 2020 IEEE Conference on Virtual Reality and 3D User Interfaces (VR), Atlanta, GA, USA, 2020, pp. 756-765, doi: 10.1109/VR46266.2020.1581313729558.

    @inproceedings{gupta2020measuring,
    title={Measuring Human Trust in a Virtual Assistant using Physiological Sensing in Virtual Reality},
    author={Gupta, Kunal and Hajika, Ryo and Pai, Yun Suen and Duenser, Andreas and Lochner, Martin and Billinghurst, Mark},
    booktitle={2020 IEEE Conference on Virtual Reality and 3D User Interfaces (VR)},
    pages={756--765},
    year={2020},
    organization={IEEE}
    }
    With the advancement of Artificial Intelligence technology to make smart devices, understanding how humans develop trust in virtual agents is emerging as a critical research field. Through our research, we report on a novel methodology to investigate users’ trust in auditory assistance in a Virtual Reality (VR) based search task, under both high and low cognitive load and under varying levels of agent accuracy. We collected physiological sensor data such as electroencephalography (EEG), galvanic skin response (GSR), and heart-rate variability (HRV), and subjective data through questionnaires such as the System Trust Scale (STS), Subjective Mental Effort Questionnaire (SMEQ), and NASA-TLX. We also collected a behavioral measure of trust (congruency of users’ head motion in response to valid/invalid verbal advice from the agent). Our results indicate that our custom VR environment enables researchers to measure and understand human trust in virtual agents using these metrics, and that both cognitive load and agent accuracy play an important role in trust formation. We discuss the implications of the research and directions for future work.
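    The paper does not specify which HRV features were extracted, but heart-rate variability is commonly summarized with the time-domain RMSSD statistic over successive beat-to-beat (RR) intervals. A minimal sketch, with illustrative sample values:

    ```python
    import numpy as np

    def rmssd(rr_ms):
        """Root mean square of successive differences (RMSSD) of RR intervals.

        rr_ms: sequence of beat-to-beat (RR) intervals in milliseconds.
        Higher RMSSD generally reflects greater parasympathetic activity.
        """
        rr = np.asarray(rr_ms, dtype=float)
        return float(np.sqrt(np.mean(np.diff(rr) ** 2)))

    # Illustrative RR intervals (ms) from a short recording window
    print(rmssd([800, 810, 790, 805]))  # RMSSD in ms
    ```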
  • Adapting Fitts’ Law and N-Back to Assess Hand Proprioception
    Tamil Gunasekaran, Ryo Hajika, Chloe Dolma Si Ying Haigh, Yun Suen Pai, Danielle Lottridge, Mark Billinghurst.

    Gunasekaran, T. S., Hajika, R., Haigh, C. D. S. Y., Pai, Y. S., Lottridge, D., & Billinghurst, M. (2021, May). Adapting Fitts’ Law and N-Back to Assess Hand Proprioception. In Extended Abstracts of the 2021 CHI Conference on Human Factors in Computing Systems (pp. 1-7).

    @inproceedings{gunasekaran2021adapting,
    title={Adapting Fitts’ Law and N-Back to Assess Hand Proprioception},
    author={Gunasekaran, Tamil Selvan and Hajika, Ryo and Haigh, Chloe Dolma Si Ying and Pai, Yun Suen and Lottridge, Danielle and Billinghurst, Mark},
    booktitle={Extended Abstracts of the 2021 CHI Conference on Human Factors in Computing Systems},
    pages={1--7},
    year={2021}
    }
    Proprioception is the body’s ability to sense the position and movement of each limb, as well as the amount of effort exerted onto or by them. Methods to assess proprioception have been introduced before, yet there is little to no study on assessing the degree of proprioception on body parts for use cases like gesture recognition in wearable computing. We propose the use of Fitts’ law coupled with the N-Back task to evaluate proprioception of the hand. We evaluate 15 distinct points at the back of the hand and assess them using an extended 3D Fitts’ law. Our results show that the index of difficulty of tapping points from thumb to pinky increases gradually with a linear regression factor of 0.1144. Additionally, participants perform the tap before performing the N-Back task. From these results, we discuss the fundamental limitations and suggest how Fitts’ law can be further extended to assess proprioception.
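    The paper's extended 3D formulation is not reproduced here, but the index of difficulty it builds on is conventionally computed with the Shannon formulation of Fitts' law. A minimal sketch with illustrative distances and a hypothetical fixed target width:

    ```python
    import math

    def index_of_difficulty(distance, width):
        """Shannon formulation of Fitts' index of difficulty, in bits.

        distance: movement amplitude to the target (same unit as width).
        width: target width along the axis of movement.
        """
        return math.log2(distance / width + 1)

    # Illustrative tapping distances (mm) for targets from thumb to pinky,
    # assuming a fixed 10 mm target width
    for d in (20, 40, 60, 80):
        print(f"D={d} mm -> ID={index_of_difficulty(d, 10):.2f} bits")
    ```

    Larger or more distant targets shift the ID accordingly, which is why difficulty grows monotonically across the tapping points described above.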
  • NeuralDrum: Perceiving Brain Synchronicity in XR Drumming
    Yun Suen Pai, Ryo Hajika, Kunal Gupta, Prasanth Sasikumar, Mark Billinghurst.

    Pai, Y. S., Hajika, R., Gupta, K., Sasikumar, P., & Billinghurst, M. (2020). NeuralDrum: Perceiving Brain Synchronicity in XR Drumming. In SIGGRAPH Asia 2020 Technical Communications (pp. 1-4).

    @incollection{pai2020neuraldrum,
    title={NeuralDrum: Perceiving Brain Synchronicity in XR Drumming},
    author={Pai, Yun Suen and Hajika, Ryo and Gupta, Kunal and Sasikumar, Prasanth and Billinghurst, Mark},
    booktitle={SIGGRAPH Asia 2020 Technical Communications},
    pages={1--4},
    year={2020}
    }
    Brain synchronicity is a neurological phenomenon where two or more individuals have their brain activation in phase when performing a shared activity. We present NeuralDrum, an extended reality (XR) drumming experience that allows two players to drum together while their brain signals are simultaneously measured. We calculate the Phase Locking Value (PLV) to determine their brain synchronicity and use this to directly affect their visual and auditory experience in the game, creating a closed feedback loop. In a pilot study, we logged and analysed the users’ brain signals and had them answer a subjective questionnaire regarding their perception of synchronicity with their partner and the overall experience. From the results, we discuss design implications to further improve NeuralDrum and propose methods to integrate brain synchronicity into interactive experiences.
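    The Phase Locking Value used in NeuralDrum is a standard inter-signal synchrony measure: the magnitude of the time-averaged complex phase difference. A minimal sketch, assuming Hilbert-transform instantaneous phase (a common choice; the paper's exact pipeline may differ):

    ```python
    import numpy as np
    from scipy.signal import hilbert

    def phase_locking_value(x, y):
        """PLV = |mean(exp(i * phase difference))| over time.

        1.0 means the two signals keep a constant phase relationship;
        values near 0 mean no consistent phase coupling.
        """
        phase_x = np.angle(hilbert(x))
        phase_y = np.angle(hilbert(y))
        return float(np.abs(np.mean(np.exp(1j * (phase_x - phase_y)))))

    # Two 10 Hz sinusoids with a fixed phase offset are fully phase-locked
    t = np.linspace(0, 1, 1000, endpoint=False)
    x = np.sin(2 * np.pi * 10 * t)
    y = np.sin(2 * np.pi * 10 * t + 0.7)
    print(phase_locking_value(x, y))  # close to 1.0
    ```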
  • NapWell: An EOG-based Sleep Assistant Exploring the Effects of Virtual Reality on Sleep Onset
    Yun Suen Pai, Marsel L. Bait, Juyoung Lee, Jingjing Xu, Roshan L Peiris, Woontack Woo, Mark Billinghurst & Kai Kunze

    Pai, Y. S., Bait, M. L., Lee, J., Xu, J., Peiris, R. L., Woo, W., ... & Kunze, K. (2022). NapWell: an EOG-based sleep assistant exploring the effects of virtual reality on sleep onset. Virtual Reality, 26(2), 437-451.

    @article{pai2022napwell,
    title={NapWell: an EOG-based sleep assistant exploring the effects of virtual reality on sleep onset},
    author={Pai, Yun Suen and Bait, Marsel L and Lee, Juyoung and Xu, Jingjing and Peiris, Roshan L and Woo, Woontack and Billinghurst, Mark and Kunze, Kai},
    journal={Virtual Reality},
    volume={26},
    number={2},
    pages={437--451},
    year={2022},
    publisher={Springer}
    }
    We present NapWell, a Sleep Assistant using virtual reality (VR) to decrease sleep onset latency by providing a realistic imagery distraction prior to sleep onset. Our proposed prototype was built using commercial hardware at relatively low cost, making it replicable for future work as well as paving the way for more low-cost EOG-VR devices for sleep assistance. We conducted a user study (n=20) comparing four sleep conditions: no device, a sleeping mask, a VR environment of the study room, and a VR environment preferred by the participant. During this period, we recorded the electrooculography (EOG) signal and sleep onset time using a finger tapping task (FTT). We found that VR was able to significantly decrease sleep onset latency. We also developed a machine learning model based on EOG signals that can predict sleep onset with a cross-validated accuracy of 70.03%. The presented study demonstrates the feasibility of VR as a tool to decrease sleep onset latency, as well as the use of embedded EOG sensors with VR for automatic sleep detection.
  • RaITIn: Radar-Based Identification for Tangible Interactions
    Tamil Selvan Gunasekaran, Ryo Hajika, Yun Suen Pai, Eiji Hayashi, Mark Billinghurst

    Gunasekaran, T. S., Hajika, R., Pai, Y. S., Hayashi, E., & Billinghurst, M. (2022, April). RaITIn: Radar-Based Identification for Tangible Interactions. In CHI Conference on Human Factors in Computing Systems Extended Abstracts (pp. 1-7).

    @inproceedings{gunasekaran2022raitin,
    title={RaITIn: Radar-Based Identification for Tangible Interactions},
    author={Gunasekaran, Tamil Selvan and Hajika, Ryo and Pai, Yun Suen and Hayashi, Eiji and Billinghurst, Mark},
    booktitle={CHI Conference on Human Factors in Computing Systems Extended Abstracts},
    pages={1--7},
    year={2022}
    }
    Radar is primarily used for applications like tracking and large-scale ranging, and its use for object identification has rarely been explored. This paper introduces RaITIn, a radar-based identification (ID) method for tangible interactions. Unlike conventional radar solutions, RaITIn can track and identify objects on a tabletop scale. We use frequency modulated continuous wave (FMCW) radar sensors to classify different objects embedded with low-cost radar reflectors of varying sizes on a tabletop setup. We also introduce Stackable IDs, where different objects can be stacked and combined to produce unique IDs. As a result, RaITIn can accurately identify visually identical objects embedded with different low-cost reflector configurations. When combined with radar's tracking ability, this enables novel tabletop interaction modalities. We discuss possible applications and areas for future work.