My research explores how extended reality technologies can be leveraged to understand human cognition, assist diverse users, and augment our abilities. I graduated from the University of Malaya, Malaysia (UM) in 2013 with a degree in Computer-Aided Design and Manufacturing Engineering. I then continued my research at UM, focusing on the use of AR technology for simulating machining processes, and obtained my Master of Engineering Science in 2015. In 2018, I earned my Ph.D. in Media Design from the Keio University Graduate School of Media Design (KMD) with my thesis entitled “Convex Interactions: Towards Efficient Human Motion in Peripersonal Space Using Virtual Reality”.
For my postdoctoral research, I travelled to New Zealand to join the Empathic Computing Laboratory (ECL) for two years. Afterwards, I served as a Project Senior Assistant Professor (non-tenured) at KMD, directing the Physionetic Interactions Group in the Embodied Media Laboratory until 2023.
I am currently a Lecturer (tenured) at the University of Auckland, New Zealand, where I co-direct the ECL with Prof. Mark Billinghurst.
RadarHand is a wrist-worn wearable system that uses radar sensing to detect on-skin proprioceptive hand gestures, enabling interaction through simple finger motions. Radar has the advantages of being robust, private, and small; it can penetrate materials and requires low computational cost. In this project, we first evaluated the proprioceptive nature of the back of the hand and found that the thumb is the most proprioceptive of all the finger joints, followed by the index, middle, ring, and pinky fingers. This helped determine the types of gestures most suitable for the system. Next, we trained deep-learning models for gesture classification. Out of 27 possible gesture groups, we achieved 92% accuracy for a generic set of seven gestures and 93% accuracy for a proprioceptive set of eight gestures. We also evaluated RadarHand's performance in real time, achieving accuracies between 74% and 91% depending on whether the gesture was initiated by the system or the user. This research could contribute to a new generation of radar-based interfaces that allow people to interact with computers in a more natural way.
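The core idea of the classification stage, stripped of the deep-learning specifics, is that each gesture produces a characteristic radar frame and a trained model maps frames to gesture labels. The following is a toy sketch of that idea only: it uses synthetic frames and a simple nearest-centroid classifier, not RadarHand's actual sensor data, architecture, or accuracy figures.

```python
import numpy as np

# Toy illustration of radar-frame gesture classification (NOT the
# RadarHand model). Each of 8 gesture classes gets a distinct synthetic
# "signature" frame; observations are noisy copies of that signature.
rng = np.random.default_rng(0)
N_GESTURES, FRAMES_PER_GESTURE, FRAME_SHAPE = 8, 20, (16, 16)

signatures = rng.normal(size=(N_GESTURES, *FRAME_SHAPE))

def observe(g):
    """Simulate one noisy radar frame for gesture class g."""
    return signatures[g] + 0.3 * rng.normal(size=FRAME_SHAPE)

X = np.stack([observe(g)
              for g in range(N_GESTURES)
              for _ in range(FRAMES_PER_GESTURE)])
y = np.repeat(np.arange(N_GESTURES), FRAMES_PER_GESTURE)

# "Training": one centroid per gesture class over flattened frames.
flat = X.reshape(len(X), -1)
centroids = np.stack([flat[y == g].mean(axis=0) for g in range(N_GESTURES)])

def classify(frame):
    """Predict the gesture whose centroid is nearest to the frame."""
    dists = np.linalg.norm(centroids - frame.ravel(), axis=1)
    return int(np.argmin(dists))

preds = np.array([classify(f) for f in X])
accuracy = (preds == y).mean()
print(f"toy accuracy: {accuracy:.2f}")
```

In the real system this stage is a deep network trained on radar returns, and the practical challenge is distinguishing 27 candidate gesture groups under real-time constraints; the sketch only conveys the frame-in, label-out structure of the problem.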
Gupta, K., Hajika, R., Pai, Y. S., Duenser, A., Lochner, M., & Billinghurst, M. (2020, March). Measuring human trust in a virtual assistant using physiological sensing in virtual reality. In 2020 IEEE Conference on Virtual Reality and 3D User Interfaces (VR) (pp. 756-765). IEEE.
Gunasekaran, T. S., Hajika, R., Haigh, C. D. S. Y., Pai, Y. S., Lottridge, D., & Billinghurst, M. (2021, May). Adapting Fitts’ Law and N-Back to Assess Hand Proprioception. In Extended Abstracts of the 2021 CHI Conference on Human Factors in Computing Systems (pp. 1-7).
Pai, Y. S., Hajika, R., Gupta, K., Sasikumar, P., & Billinghurst, M. (2020). NeuralDrum: Perceiving Brain Synchronicity in XR Drumming. In SIGGRAPH Asia 2020 Technical Communications (pp. 1-4).
Pai, Y. S., Bait, M. L., Lee, J., Xu, J., Peiris, R. L., Woo, W., ... & Kunze, K. (2022). NapWell: an EOG-based sleep assistant exploring the effects of virtual reality on sleep onset. Virtual Reality, 26(2), 437-451.
Gunasekaran, T. S., Hajika, R., Pai, Y. S., Hayashi, E., & Billinghurst, M. (2022, April). RaITIn: Radar-Based Identification for Tangible Interactions. In CHI Conference on Human Factors in Computing Systems Extended Abstracts (pp. 1-7).
Hajika, R., Gunasekaran, T. S., Haigh, C. D. S. Y., Pai, Y. S., Hayashi, E., Lien, J., ... & Billinghurst, M. (2024). RadarHand: A Wrist-Worn Radar for On-Skin Touch-Based Proprioceptive Gestures. ACM Transactions on Computer-Human Interaction, 31(2), 1-36.
Fernandez, A. A., Kim, C. C., Gunasekaran, T. S., Pai, Y. S., & Minamizawa, K. (2023, October). Virtual journalist: measuring and inducing cultural empathy by visualizing empathic perspectives in VR. In 2023 IEEE International Symposium on Mixed and Augmented Reality Adjunct (ISMAR-Adjunct) (pp. 667-672). IEEE.