Amit Barde

Research Fellow

Amit is a Research Fellow at the Empathic Computing Laboratory within the Auckland Bioengineering Institute, University of Auckland. He has a background in the arts, and his doctoral research explored the use of spatialised auditory and visual cues for information delivery on wearable devices. Amit’s research interests lie at the intersection of the arts, science and engineering.

Amit is also an experienced sound designer, having worked on numerous short films and commercials.

Projects

  • Brain Synchronisation in VR

    Collaborative Virtual Reality has been the subject of research for nearly three decades. This has led to a deep understanding of how individuals interact in such environments and of some of the factors that impede these interactions. However, despite this knowledge, we still do not fully understand how interpersonal interactions in virtual environments are reflected in the physiological domain. This project seeks to answer that question by monitoring the neural activity of participants in collaborative virtual environments. We do this using a technique known as Hyperscanning, which refers to the simultaneous acquisition of neural activity from two or more people. In this project we use Hyperscanning to determine whether individuals interacting in a virtual environment exhibit inter-brain synchrony. The goal of this project is to first study the phenomenon of inter-brain synchrony, and then to find means of inducing and expediting it by making changes to the virtual environment. This project feeds into the overarching goals of the Empathic Computing Laboratory, which seek to bring individuals closer together using technology as a vehicle to evoke empathy.
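
    The studies below quantify inter-brain synchrony using the Phase Locking Value (PLV). As a sketch of the standard definition (not specific to any one study here), given the instantaneous phases \(\phi_1(t)\) and \(\phi_2(t)\) of band-limited EEG signals from two participants over N samples:

    \mathrm{PLV} = \frac{1}{N} \left| \sum_{t=1}^{N} e^{\, i \left( \phi_1(t) - \phi_2(t) \right)} \right|

    A PLV of 0 indicates no consistent phase relationship between the two signals, while a PLV of 1 indicates perfect phase locking.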

Publications

  • A Constrained Path Redirection for Passive Haptics
    Lili Wang, Zixiang Zhao, Xuefeng Yang, Huidong Bai, Amit Barde, Mark Billinghurst

    L. Wang, Z. Zhao, X. Yang, H. Bai, A. Barde and M. Billinghurst, "A Constrained Path Redirection for Passive Haptics," 2020 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW), Atlanta, GA, USA, 2020, pp. 651-652, doi: 10.1109/VRW50115.2020.00176.

    @inproceedings{wang2020constrained,
    title={A Constrained Path Redirection for Passive Haptics},
    author={Wang, Lili and Zhao, Zixiang and Yang, Xuefeng and Bai, Huidong and Barde, Amit and Billinghurst, Mark},
    booktitle={2020 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW)},
    pages={651--652},
    year={2020},
    organization={IEEE}
    }
    Navigation with passive haptic feedback can enhance users’ immersion in virtual environments. We propose a constrained path redirection method to provide users with the corresponding haptic feedback at the right time and place. We quantified the method’s practicality for VR exploration in a user study; the results show advantages over the steer-to-center method in terms of presence, and over Steinicke’s method in terms of matching errors and presence.
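
    As background, path redirection builds on redirected-walking gains that subtly decouple virtual and physical motion. The minimal Python sketch below illustrates a plain rotation gain only; it is an illustration of the underlying mechanism, not the constrained redirection method proposed in this paper, and the names and gain value are hypothetical.

    # Rotation-gain sketch: the virtual camera turns slightly more (gain > 1)
    # or less (gain < 1) than the user's physical head, imperceptibly steering
    # the user's real-world path, e.g. toward a physical prop that supplies
    # passive haptic feedback. Illustrative only.
    def redirect_heading(virtual_heading_deg, real_head_rotation_deg, gain=1.2):
        return virtual_heading_deg + gain * real_head_rotation_deg

    # A 10-degree physical turn is rendered as a 12-degree virtual turn.
    print(redirect_heading(0.0, 10.0))  # 12.0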
  • A comparative study on inter-brain synchrony in real and virtual environments using hyperscanning
    Ihshan Gumilar, Ekansh Sareen, Reed Bell, Augustus Stone, Ashkan Hayati, Jingwen Mao, Amit Barde, Anubha Gupta, Arindam Dey, Gun Lee, Mark Billinghurst

    Gumilar, I., Sareen, E., Bell, R., Stone, A., Hayati, A., Mao, J., ... & Billinghurst, M. (2021). A comparative study on inter-brain synchrony in real and virtual environments using hyperscanning. Computers & Graphics, 94, 62-75.

    @article{gumilar2021comparative,
    title={A comparative study on inter-brain synchrony in real and virtual environments using hyperscanning},
    author={Gumilar, Ihshan and Sareen, Ekansh and Bell, Reed and Stone, Augustus and Hayati, Ashkan and Mao, Jingwen and Barde, Amit and Gupta, Anubha and Dey, Arindam and Lee, Gun and others},
    journal={Computers \& Graphics},
    volume={94},
    pages={62--75},
    year={2021},
    publisher={Elsevier}
    }
    Researchers have employed hyperscanning, a technique used to simultaneously record neural activity from multiple participants, in real-world collaborations. However, to the best of our knowledge, no previous study has used hyperscanning in Virtual Reality (VR). The aims of this study were, firstly, to replicate results of inter-brain synchrony reported in the existing literature for a real-world task and, secondly, to explore whether inter-brain synchrony could be elicited in a Virtual Environment (VE). This paper reports on three pilot studies in two different settings (real-world and VR). Paired participants performed two sessions of a finger-pointing exercise separated by a finger-tracking exercise, during which their neural activity was simultaneously recorded by electroencephalography (EEG) hardware. Using Phase Locking Value (PLV) analysis, VR was found to induce inter-brain synchrony similar to that seen in the real world. Further, it was observed that the finger-pointing exercise engaged the same neurally activated area in both the real world and VR. Based on these results, we infer that VR can be used to enhance inter-brain synchrony in collaborative tasks carried out in a VE. In particular, we have been able to demonstrate that changing visual perspective in VR is capable of eliciting inter-brain synchrony. This demonstrates that VR could be an exciting platform to explore the phenomenon of inter-brain synchrony further and provide a deeper understanding of the neuroscience of human communication.
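
    For reference, a minimal Python sketch of a PLV computation between two EEG channels (one per participant) is given below. It assumes the signals are already band-pass filtered and time-aligned, and uses synthetic data; it is not the authors' analysis pipeline.

    import numpy as np
    from scipy.signal import hilbert

    def plv(x, y):
        # Instantaneous phase of each signal via the analytic (Hilbert) signal.
        phase_x = np.angle(hilbert(x))
        phase_y = np.angle(hilbert(y))
        # Magnitude of the mean phase-difference vector: 0 = no locking, 1 = perfect.
        return float(np.abs(np.mean(np.exp(1j * (phase_x - phase_y)))))

    # Toy example: two noisy signals sharing a 10 Hz component phase-lock strongly.
    fs = 256
    t = np.arange(0, 4, 1 / fs)
    a = np.sin(2 * np.pi * 10 * t) + 0.5 * np.random.randn(t.size)
    b = np.sin(2 * np.pi * 10 * t + 0.3) + 0.5 * np.random.randn(t.size)
    print(plv(a, b))  # high (near 1); independent noise alone would score near 0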
  • Connecting the Brains via Virtual Eyes: Eye-Gaze Directions and Inter-brain Synchrony in VR
    Ihshan Gumilar, Amit Barde, Ashkan Hayati, Mark Billinghurst, Gun Lee, Abdul Momin, Charles Averill, Arindam Dey.

    Gumilar, I., Barde, A., Hayati, A. F., Billinghurst, M., Lee, G., Momin, A., ... & Dey, A. (2021, May). Connecting the Brains via Virtual Eyes: Eye-Gaze Directions and Inter-brain Synchrony in VR. In Extended Abstracts of the 2021 CHI Conference on Human Factors in Computing Systems (pp. 1-7).

    @inproceedings{gumilar2021connecting,
    title={Connecting the Brains via Virtual Eyes: Eye-Gaze Directions and Inter-brain Synchrony in VR},
    author={Gumilar, Ihshan and Barde, Amit and Hayati, Ashkan F and Billinghurst, Mark and Lee, Gun and Momin, Abdul and Averill, Charles and Dey, Arindam},
    booktitle={Extended Abstracts of the 2021 CHI Conference on Human Factors in Computing Systems},
    pages={1--7},
    year={2021}
    }
    Hyperscanning is an emerging method that allows researchers to simultaneously record neural activity from two or more people. While this method has been extensively used in real-world settings over the last five years to study inter-brain synchrony, little work has been undertaken on the use of hyperscanning in virtual environments. Preliminary research in the area demonstrates that inter-brain synchrony in virtual environments can be achieved in a manner similar to that seen in the real world. The study described in this paper proposes to further research in the area by studying how non-verbal communication cues in social interactions in virtual environments can affect inter-brain synchrony. In particular, we concentrate on the role eye gaze plays in inter-brain synchrony. The aim of this research is to explore how eye gaze affects inter-brain synchrony between users in a collaborative virtual environment.
  • A Review of Hyperscanning and Its Use in Virtual Environments
    Amit Barde, Ihshan Gumilar, Ashkan F. Hayati, Arindam Dey, Gun Lee, Mark Billinghurst

    Barde, A., Gumilar, I., Hayati, A. F., Dey, A., Lee, G., & Billinghurst, M. (2020). A Review of Hyperscanning and Its Use in Virtual Environments. Informatics, 7(4), 55.

    @article{barde2020review,
    title={A Review of Hyperscanning and Its Use in Virtual Environments},
    author={Barde, Amit and Gumilar, Ihshan and Hayati, Ashkan F and Dey, Arindam and Lee, Gun and Billinghurst, Mark},
    journal={Informatics},
    volume={7},
    number={4},
    pages={55},
    year={2020},
    publisher={Multidisciplinary Digital Publishing Institute}
    }
  • Inter-brain connectivity: Comparisons between real and virtual environments using hyperscanning
    Amit Barde, Nastaran Saffaryazdi, P. Withana, N. Patel, Prasanth Sasikumar, Mark Billinghurst

    Barde, A., Saffaryazdi, N., Withana, P., Patel, N., Sasikumar, P., & Billinghurst, M. (2019, October). Inter-brain connectivity: Comparisons between real and virtual environments using hyperscanning. In 2019 IEEE International Symposium on Mixed and Augmented Reality Adjunct (ISMAR-Adjunct) (pp. 338-339). IEEE.

    @inproceedings{barde2019inter,
    title={Inter-brain connectivity: Comparisons between real and virtual environments using hyperscanning},
    author={Barde, Amit and Saffaryazdi, Nastaran and Withana, Pawan and Patel, Nakul and Sasikumar, Prasanth and Billinghurst, Mark},
    booktitle={2019 IEEE International Symposium on Mixed and Augmented Reality Adjunct (ISMAR-Adjunct)},
    pages={338--339},
    year={2019},
    organization={IEEE}
    }
    Inter-brain connectivity between pairs of people was explored during a finger-tracking task in the real world and in Virtual Reality (VR). This was facilitated by the use of a dual EEG set-up that allowed us to use hyperscanning to simultaneously record the neural activity of both participants. We found that similar levels of inter-brain synchrony can be elicited in the real world and in VR for the same task. This is the first time that hyperscanning has been used to compare brain activity for the same task performed in real and virtual environments.
  • Inter-brain Synchrony and Eye Gaze Direction During Collaboration in VR
    Ihshan Gumilar, Amit Barde, Prasanth Sasikumar, Mark Billinghurst, Ashkan F. Hayati, Gun Lee, Yuda Munarko, Sanjit Singh, Abdul Momin

    Gumilar, I., Barde, A., Sasikumar, P., Billinghurst, M., Hayati, A. F., Lee, G., ... & Momin, A. (2022, April). Inter-brain Synchrony and Eye Gaze Direction During Collaboration in VR. In CHI Conference on Human Factors in Computing Systems Extended Abstracts (pp. 1-7).

    @inproceedings{gumilar2022inter,
    title={Inter-brain Synchrony and Eye Gaze Direction During Collaboration in VR},
    author={Gumilar, Ihshan and Barde, Amit and Sasikumar, Prasanth and Billinghurst, Mark and Hayati, Ashkan F and Lee, Gun and Munarko, Yuda and Singh, Sanjit and Momin, Abdul},
    booktitle={CHI Conference on Human Factors in Computing Systems Extended Abstracts},
    pages={1--7},
    year={2022}
    }
    Brain activity sometimes synchronises when people collaborate on real-world tasks. Understanding this process could lead to improvements in face-to-face and remote collaboration. In this paper we report on an experiment exploring the relationship between eye gaze and inter-brain synchrony in Virtual Reality (VR). The experiment recruited pairs of participants who were asked to perform finger-tracking exercises in VR under three different gaze conditions: averted, direct, and natural, while their brain activity was recorded. We found that gaze direction has a significant effect on inter-brain synchrony during collaboration on this task in VR. This shows that representing natural gaze could influence inter-brain synchrony in VR, which may have implications for avatar design for social VR. We discuss implications of our research and possible directions for future work.
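
    As a hypothetical illustration of how the three gaze conditions might be driven on an avatar, a small Python sketch is shown below; the names, offset, and structure are assumptions, not the study's implementation.

    import numpy as np

    def gaze_target(condition, partner_head, own_gaze_point):
        # World-space point the avatar's eyes fixate under each condition.
        if condition == "direct":      # always fixate the partner's face
            return partner_head
        if condition == "averted":     # fixate a point offset to the side of the face
            return partner_head + np.array([0.6, 0.0, 0.0])  # ~0.6 m lateral offset
        if condition == "natural":     # replay the wearer's own eye-tracked gaze point
            return own_gaze_point
        raise ValueError(f"unknown gaze condition: {condition}")

    # e.g. gaze_target("direct", np.array([0.0, 1.6, 2.0]), np.array([0.4, 1.5, 2.2]))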
  • Effects of interacting with facial expressions and controllers in different virtual environments on presence, usability, affect, and neurophysiological signals
    Arindam Dey, Amit Barde, Bowen Yuan, Ekansh Sareen, Chelsea Dobbins, Aaron Goh, Gaurav Gupta, Anubha Gupta, Mark Billinghurst

    Dey, A., Barde, A., Yuan, B., Sareen, E., Dobbins, C., Goh, A., ... & Billinghurst, M. (2022). Effects of interacting with facial expressions and controllers in different virtual environments on presence, usability, affect, and neurophysiological signals. International Journal of Human-Computer Studies, 160, 102762.

    @article{dey2022effects,
    title={Effects of interacting with facial expressions and controllers in different virtual environments on presence, usability, affect, and neurophysiological signals},
    author={Dey, Arindam and Barde, Amit and Yuan, Bowen and Sareen, Ekansh and Dobbins, Chelsea and Goh, Aaron and Gupta, Gaurav and Gupta, Anubha and Billinghurst, Mark},
    journal={International Journal of Human-Computer Studies},
    volume={160},
    pages={102762},
    year={2022},
    publisher={Elsevier}
    }
    Virtual Reality (VR) interfaces provide an immersive medium to interact with the digital world. Most VR interfaces require physical interactions using handheld controllers, but there are alternative interaction methods that can support different use cases and users. Interaction methods in VR are primarily evaluated based on their usability; however, their differences in neurological and physiological effects remain less investigated. In this paper, along with traditional qualitative metrics such as presence, affect, and system usability, we explore the neurophysiological effects (brain signals and electrodermal activity) of using an alternative facial expression interaction method to interact with VR interfaces. This form of interaction was also compared with traditional handheld controllers. Three different environments, each offering a different experience to interact with, were used: happy (butterfly catching), neutral (object picking), and scary (zombie shooting). Overall, we observed an effect of interaction method on gamma activity in the brain and on skin conductance. For some aspects of presence, facial expressions outperformed controllers, but controllers were found to be better than facial expressions in terms of usability.
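
    As context for the gamma-band result, a minimal sketch of extracting gamma-band power from a single EEG channel with a Butterworth band-pass filter is shown below; the band edges, filter order, and sampling rate are assumptions, not the paper's processing pipeline.

    import numpy as np
    from scipy.signal import butter, filtfilt

    def gamma_power(eeg, fs=256.0, band=(30.0, 45.0)):
        # 4th-order Butterworth band-pass over an assumed gamma range.
        b, a = butter(4, [band[0] / (fs / 2), band[1] / (fs / 2)], btype="band")
        filtered = filtfilt(b, a, eeg)        # zero-phase filtering
        return float(np.mean(filtered ** 2))  # mean power within the band

    # Toy usage: one channel of white noise sampled at 256 Hz.
    print(gamma_power(np.random.randn(1024)))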
  • Implementation of Attention-Based Spatial Audio for 360° Environments
    Alaeddin Nassani, Amit Barde, Huidong Bai, Suranga Nanayakkara, Mark Billinghurst

  • LightSense-Long Distance
    Uwe Rieger, Yinan Liu, Tharindu Kaluarachchi, Amit Barde, Huidong Bai, Alaeddin Nassani, Suranga Nanayakkara, Mark Billinghurst.

    Rieger, U., Liu, Y., Kaluarachchi, T., Barde, A., Bai, H., Nassani, A., ... & Billinghurst, M. (2023). LightSense-Long Distance. In ACM SIGGRAPH Asia 2023 Art Gallery (pp. 1-2).

    @incollection{rieger2023lightsense,
    title={LightSense-Long Distance},
    author={Rieger, Uwe and Liu, Yinan and Kaluarachchi, Tharindu and Barde, Amit and Bai, Huidong and Nassani, Alaeddin and Nanayakkara, Suranga and Billinghurst, Mark},
    booktitle={ACM SIGGRAPH Asia 2023 Art Gallery},
    pages={1--2},
    year={2023}
    }
    'LightSense - Long Distance' explores remote interaction with architectural space. It is a virtual extension of the project 'LightSense,' which is currently presented at the exhibition 'Cyber Physical: Architecture in Real Time' at EPFL Pavilions in Switzerland. Using numerous VR headsets, the setup at the Art Gallery at SIGGRAPH Asia establishes a direct connection between both exhibition sites in Sydney and Lausanne.
    'LightSense' at EPFL Pavilions is an immersive installation that allows the audience to engage in intimate interaction with a living architectural body. It consists of a 12-meter-long construction that combines a lightweight structure with projected 3D holographic animations. At its core sits a neural network, which has been trained on sixty thousand poems. This allows the structure to engage, lead, and sustain conversations with the visitor. Its responses are truly associative, unpredictable, meaningful, magical, and deeply emotional. Analysing the emotional tenor of the conversation, 'LightSense' can transform into a series of hybrid architectural volumes, immersing the visitors in Pavilions of Love, Anger, Curiosity, and Joy.
    'LightSense's' physical construction is linked to a digital twin. Movement, holographic animations, sound, and text responses are controlled by the cloud-based AI system. This combination creates a location-independent cyber-physical system. As such, the 'Long Distance' version, which premiered at SIGGRAPH Asia, enables the visitors in Sydney to directly engage with the physical setup in Lausanne. Using VR headsets with a new 360-degree 4K live streaming system, the visitors find themselves teleported to face 'LightSense', able to engage in a direct conversation with the structure on-site.
    'LightSense - Long Distance' leaves behind the notion of architecture being a place-bound and static environment. Instead, it points toward the next generation of responsive buildings that transcend space, are capable of dynamic behaviour, and able to accompany their visitors as creative partners.