Dr. Yun Suen Pai is currently a Research Fellow at the Empathic Computing Laboratory (ECL), University of Auckland, directed by Prof. Mark Billinghurst. He received his Master's degree in Engineering Science in 2015 from the University of Malaya, Malaysia. He then completed his PhD at the Keio University Graduate School of Media Design, Yokohama, Japan in 2018 (supervised by Prof. Kai Kunze), and continued as a researcher there for an additional six months.
His research interests include the effects of augmented, virtual, and mixed reality on human perception, behavior, and physiological state. He has collaborated with several companies and organizations, including AirBus Malaysia, Fujitsu Design, NTT Media Intelligence Laboratory, Ignition Point, CSIRO, and Google, on research areas such as haptics in AR, vision augmentation, VR navigation, and the use of machine learning for novel input and interactions.
Publications:
Hajika, R., Gupta, K., Sasikumar, P., & Pai, Y. S. (2019, November). HyperDrum: Interactive Synchronous Drumming in Virtual Reality using Everyday Objects. In SIGGRAPH Asia 2019 XR (pp. 15-16). ACM.
Gupta, K., Hajika, R., Pai, Y. S., Duenser, A., Lochner, M., & Billinghurst, M. (2019, November). In AI We Trust: Investigating the Relationship between Biosignals, Trust and Cognitive Load in VR. In 25th ACM Symposium on Virtual Reality Software and Technology (p. 33). ACM.
RadarHand is a wrist-worn wearable system that uses radar sensing to detect on-skin proprioceptive hand gestures, enabling interaction through simple finger motions. Radar has the advantages of being robust, private, and small; it can penetrate materials and requires little computation. In this project, we first evaluated the proprioceptive nature of the back of the hand and found that the thumb is the most proprioceptive of all the finger joints, followed by the index, middle, ring, and pinky fingers. This helped determine the types of gestures most suitable for the system. Next, we trained deep-learning models for gesture classification. Out of 27 possible gesture groups, we achieved 92% accuracy for a generic set of seven gestures and 93% accuracy for the proprioceptive set of eight gestures. We also evaluated RadarHand's performance in real time and achieved an accuracy of between 74% and 91%, depending on whether the system or the user initiates the gesture. This research could contribute to a new generation of radar-based interfaces that allow people to interact with computers in a more natural way.
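To make the classification step concrete, below is a minimal sketch of what a deep-learning gesture classifier over radar data might look like. This is an illustration only, not RadarHand's actual model: the input shape (a short stack of range-Doppler frames), the frame resolution, the CNN architecture, and the eight-class output are all assumptions chosen to mirror the eight-gesture proprioceptive set described above.

```python
import torch
import torch.nn as nn

# Hypothetical input: a short sequence of range-Doppler frames from the
# wrist-worn radar, shaped (batch, frames, height, width). The frame size,
# sequence length, and eight-gesture class count are illustrative assumptions.
N_FRAMES, H, W, N_GESTURES = 16, 32, 32, 8

class RadarGestureNet(nn.Module):
    """Small CNN over stacked radar frames; a sketch, not the paper's model."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(N_FRAMES, 32, kernel_size=3, padding=1),  # frames as channels
            nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(32, 64, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.MaxPool2d(2),
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(64 * (H // 4) * (W // 4), 128),
            nn.ReLU(),
            nn.Linear(128, N_GESTURES),  # one logit per gesture class
        )

    def forward(self, x):
        return self.classifier(self.features(x))

# Classify one (dummy) radar capture.
model = RadarGestureNet().eval()
frames = torch.randn(1, N_FRAMES, H, W)        # stand-in for real sensor data
gesture = model(frames).argmax(dim=1).item()   # predicted gesture index
print(f"Predicted gesture class: {gesture}")
```

Treating the temporal frames as input channels keeps the sketch to a plain 2D CNN; a real system might instead use recurrent or 3D-convolutional layers to model gesture dynamics over time.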
Pai, Y. S., Hajika, R., Gupta, K., Sasikumar, P., & Billinghurst, M. (2020). NeuralDrum: Perceiving Brain Synchronicity in XR Drumming. In SIGGRAPH Asia 2020 Technical Communications (pp. 1-4).
Pai, Y. S., Bait, M. L., Lee, J., Xu, J., Peiris, R. L., Woo, W., ... & Kunze, K. (2022). NapWell: an EOG-based sleep assistant exploring the effects of virtual reality on sleep onset. Virtual Reality, 26(2), 437-451.
Gupta, K., Hajika, R., Pai, Y. S., Duenser, A., Lochner, M., & Billinghurst, M. (2020, March). Measuring Human Trust in a Virtual Assistant using Physiological Sensing in Virtual Reality. In 2020 IEEE Conference on Virtual Reality and 3D User Interfaces (VR) (pp. 756-765). IEEE.
Gunasekaran, T. S., Hajika, R., Haigh, C. D. S. Y., Pai, Y. S., Lottridge, D., & Billinghurst, M. (2021, May). Adapting Fitts’ Law and N-Back to Assess Hand Proprioception. In Extended Abstracts of the 2021 CHI Conference on Human Factors in Computing Systems (pp. 1-7).
Gunasekaran, T. S., Hajika, R., Pai, Y. S., Hayashi, E., & Billinghurst, M. (2022, April). RaITIn: Radar-Based Identification for Tangible Interactions. In CHI Conference on Human Factors in Computing Systems Extended Abstracts (pp. 1-7).