Tamil Selvan is a master’s student at the Empathic Computing Laboratory at the University of Auckland, New Zealand, under the supervision of Prof. Mark Billinghurst. His research involves creating an interactive system using multiple radars. Prior to this, he worked as a research intern at the Augmented Human Lab at the University of Auckland.
He obtained a bachelor’s degree in Electronics and Communication Engineering from Vellore Institute of Technology, India. For more information, please visit www.tamilselvan.info.
This project explores how brain activity can be used for computer input. The MindReader game uses EEG (electroencephalography)-based Brain-Computer Interface (BCI) technology to display the player’s brain waves in real time. It uses colourful, creative visuals to show raw brain activity from a number of EEG electrodes worn on the head. The player can also play a version of the Angry Birds game in which their concentration level determines how far the birds are launched. In this cheerful and engaging demo, friends and family can challenge each other to see who has the strongest neural connections!
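The description above does not specify how the concentration level is computed. A common approach in EEG-driven games is to compare power in the beta band (associated with focused attention) against the alpha band (associated with a relaxed state). The sketch below illustrates that idea; the function names, sampling rate, and frequency bands are assumptions for illustration, not details of the MindReader implementation.

```python
import numpy as np
from scipy.signal import welch

def band_power(samples, fs, low, high):
    """Average power of an EEG signal in a frequency band, via Welch's method."""
    freqs, psd = welch(samples, fs=fs, nperseg=min(len(samples), fs * 2))
    mask = (freqs >= low) & (freqs < high)
    return psd[mask].mean()

def concentration_score(samples, fs=256):
    """Illustrative 'concentration' metric: beta power relative to alpha power.
    Higher values loosely suggest a more focused state."""
    alpha = band_power(samples, fs, 8, 12)   # relaxed / idling rhythm
    beta = band_power(samples, fs, 13, 30)   # active concentration
    return beta / (alpha + 1e-12)            # guard against division by zero

# Example: one second of synthetic data standing in for a real EEG stream.
if __name__ == "__main__":
    fs = 256
    t = np.arange(fs) / fs
    fake_eeg = np.sin(2 * np.pi * 10 * t) + 0.5 * np.sin(2 * np.pi * 20 * t)
    score = concentration_score(fake_eeg, fs)
    print(f"concentration score: {score:.2f}")  # could scale a bird's launch power
```

In a game loop, a score like this could be mapped onto the launch strength of each bird, so that sustained focus translates into a longer shot.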
RadarHand is a wrist-worn wearable system that uses radar sensing to detect on-skin proprioceptive hand gestures, making it easy to interact with simple finger motions. Radar has the advantage of being robust, private, and small; it can penetrate materials and requires little computation. In this project, we first evaluated the proprioceptive nature of the back of the hand and found that the thumb is the most proprioceptive of all the finger joints, followed by the index, middle, ring, and pinky fingers. This helped determine the types of gestures most suitable for the system. Next, we trained deep-learning models for gesture classification. Out of 27 possible gesture groups, we achieved 92% accuracy for a generic set of seven gestures and 93% accuracy for the proprioceptive set of eight gestures. We also evaluated RadarHand's performance in real time and achieved accuracies between 74% and 91%, depending on whether the gesture was initiated by the system or the user. This research could contribute to a new generation of radar-based interfaces that let people interact with computers more naturally.
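The summary above reports classification accuracies but does not describe the model architecture. As a rough illustration of how radar gesture classification is commonly set up, here is a minimal PyTorch sketch that treats a short sequence of preprocessed radar frames (assumed here to be 32×32 maps) as input to a small convolutional network. The frame size, layer sizes, and class count are assumptions for illustration, not the published RadarHand models.

```python
import torch
import torch.nn as nn

class GestureCNN(nn.Module):
    """Small CNN over a short sequence of radar frames, stacked as input channels.
    Shapes and layer sizes are illustrative only."""
    def __init__(self, n_frames=8, n_classes=7):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(n_frames, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(32 * 8 * 8, 64), nn.ReLU(),
            nn.Linear(64, n_classes),
        )

    def forward(self, x):  # x: (batch, n_frames, 32, 32)
        return self.classifier(self.features(x))

# Example: classify a random batch standing in for preprocessed radar data.
model = GestureCNN(n_frames=8, n_classes=7)
frames = torch.randn(4, 8, 32, 32)    # 4 gesture samples
logits = model(frames)
predicted = logits.argmax(dim=1)      # index of the most likely gesture class
print(predicted)
```

In practice, such a classifier would be trained on labelled recordings of each gesture and then run continuously on the wearable, which is where the distinction between system-initiated and user-initiated gestures affects real-time accuracy.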
Gunasekaran, T. S., Hajika, R., Haigh, C. D. S. Y., Pai, Y. S., Lottridge, D., & Billinghurst, M. (2021, May). Adapting Fitts’ Law and N-Back to Assess Hand Proprioception. In Extended Abstracts of the 2021 CHI Conference on Human Factors in Computing Systems (pp. 1-7).
Gunasekaran, T. S., Hajika, R., Pai, Y. S., Hayashi, E., & Billinghurst, M. (2022, April). RaITIn: Radar-Based Identification for Tangible Interactions. In CHI Conference on Human Factors in Computing Systems Extended Abstracts (pp. 1-7).
Hajika, R., Gunasekaran, T. S., Haigh, C. D. S. Y., Pai, Y. S., Hayashi, E., Lien, J., ... & Billinghurst, M. (2024). RadarHand: A Wrist-Worn Radar for On-Skin Touch-Based Proprioceptive Gestures. ACM Transactions on Computer-Human Interaction, 31(2), 1-36.
Fernandez, A. A., Kim, C. C., Gunasekaran, T. S., Pai, Y. S., & Minamizawa, K. (2023, October). Virtual journalist: measuring and inducing cultural empathy by visualizing empathic perspectives in VR. In 2023 IEEE International Symposium on Mixed and Augmented Reality Adjunct (ISMAR-Adjunct) (pp. 667-672). IEEE.