Kunal Gupta is a PhD candidate at the Empathic Computing Laboratory at the University of Auckland, New Zealand, under the supervision of Prof. Mark Billinghurst. His research focuses on emotion recognition and representation in VR/AR using physiological sensing and contextual information.
Before that, he worked as a User Experience Researcher at Google and as an Interaction Designer at a startup in India, accumulating around five years of industry experience. He obtained a master’s degree in Human Interface Technology (MHIT) from the HITLab NZ at the University of Canterbury in 2015, also under the supervision of Prof. Mark Billinghurst. As part of his master’s degree, he conducted some of the first research into using eye gaze to enhance remote collaboration in wearable AR systems.
For more details about his research work, check out his website: kunalgupta.in
This project explores how brain activity can be used as computer input. The MindReader game uses an EEG (electroencephalogram)-based Brain-Computer Interface (BCI) to visualise the player’s brain waves in real time, rendering the raw activity from several EEG electrodes worn on the head as colourful, creative graphics. The player can also play a version of the Angry Birds game in which their concentration level determines how far the birds are launched. In this cheerful and engaging demo, friends and family can challenge each other to see who can focus their mind the best!
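A minimal sketch of how such a concentration-to-force mapping could work, assuming a single-channel EEG window and a beta/(alpha + theta) band-power ratio as the attention proxy. This is an illustrative assumption, not the actual MindReader implementation; all function names and the normalisation range are hypothetical.

```python
import numpy as np

def band_power(signal, fs, low, high):
    """Average power of `signal` in the [low, high) Hz band via an FFT periodogram."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2 / len(signal)
    mask = (freqs >= low) & (freqs < high)
    return psd[mask].mean()

def concentration_index(eeg_window, fs=256):
    """Beta / (alpha + theta) power ratio, a common attention proxy (assumed here)."""
    beta = band_power(eeg_window, fs, 13, 30)
    alpha = band_power(eeg_window, fs, 8, 13)
    theta = band_power(eeg_window, fs, 4, 8)
    return beta / (alpha + theta + 1e-12)  # epsilon avoids division by zero

def launch_force(index, min_force=1.0, max_force=10.0):
    """Map a concentration index (assumed pre-normalised to [0, 1]) to a launch force."""
    scaled = np.clip(index, 0.0, 1.0)
    return min_force + scaled * (max_force - min_force)
```

In a game loop, `concentration_index` would be recomputed over a sliding window of recent EEG samples each frame, and `launch_force` would set the bird's initial velocity when the player releases the slingshot.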
K. Gupta, R. Hajika, Y. S. Pai, A. Duenser, M. Lochner and M. Billinghurst, "Measuring Human Trust in a Virtual Assistant using Physiological Sensing in Virtual Reality," 2020 IEEE Conference on Virtual Reality and 3D User Interfaces (VR), Atlanta, GA, USA, 2020, pp. 756-765, doi: 10.1109/VR46266.2020.1581313729558.
Pai, Y. S., Hajika, R., Gupta, K., Sasikumar, P., & Billinghurst, M. (2020). NeuralDrum: Perceiving Brain Synchronicity in XR Drumming. In SIGGRAPH Asia 2020 Technical Communications (pp. 1-4).
Schlagowski, R., Gupta, K., Mertes, S., Billinghurst, M., Metzner, S., & André, E. (2022, March). Jamming in MR: Towards Real-Time Music Collaboration in Mixed Reality. In 2022 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW) (pp. 854-855). IEEE.
A. Jing, K. Gupta, J. McDade, G. A. Lee and M. Billinghurst, "Comparing Gaze-Supported Modalities with Empathic Mixed Reality Interfaces in Remote Collaboration," 2022 IEEE International Symposium on Mixed and Augmented Reality (ISMAR), Singapore, Singapore, 2022, pp. 837-846, doi: 10.1109/ISMAR55827.2022.00102.
Allison Jing, Kunal Gupta, Jeremy McDade, Gun Lee, and Mark Billinghurst. 2022. Near-Gaze Visualisations of Empathic Communication Cues in Mixed Reality Collaboration. In ACM SIGGRAPH 2022 Posters (SIGGRAPH '22). Association for Computing Machinery, New York, NY, USA, Article 29, 1–2. https://doi.org/10.1145/3532719.3543213
Schlagowski, R., Nazarenko, D., Can, Y., Gupta, K., Mertes, S., Billinghurst, M., & André, E. (2023, April). Wish You Were Here: Mental and Physiological Effects of Remote Music Collaboration in Mixed Reality. In Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems (pp. 1-16).
Chang, Z., Bai, H., Zhang, L., Gupta, K., He, W., & Billinghurst, M. (2022). The impact of virtual agents’ multimodal communication on brain activity and cognitive load in virtual reality. Frontiers in Virtual Reality, 3, 995090.