Lynda Gerry

PhD Student

Lynda is a researcher and software developer employing methods from psychology and cognitive neuroscience to test the impacts of embodied interfaces in virtual environments on learning, cooperation, and empathy. Her goal is to cultivate “augmented social cognition”: using virtual environments to enhance our ability to understand one another through embodied interfaces grounded in theoretical models of empathy from neuroscience and psychology. In 2016, she created Paint With Me, an expert-novice skills transmission system designed to motivate and teach users to paint in a mixed reality setup in which users see and hear from the point of view of a painter describing her creative process while painting along with her on their own physical canvas. Motion tracking and adaptive visual feedback made the system effective for teaching creative, fine-motor skills. Moreover, synchronously performing a joint action while sharing a first-person perspective was correlated with greater empathic accuracy. Lynda obtained her MA in Cognition and Communication from the University of Copenhagen in 2017, with a thesis titled “Virtual Reality as a Tool to Facilitate Empathy” and coursework and supervision from the Center for Subjectivity Research. Lynda also worked in the Empathic Computing Lab at the University of South Australia on Brain-Computer Interfaces in virtual environments and created First-Person Squared, an augmented reality version of The Machine to Be Another body swap. Her main research interests include enactivism, embodiment, empathy, bodily self-consciousness, sense of agency, and intersubjectivity.

Publications

  • Levity: A Virtual Reality System that Responds to Cognitive Load
    Lynda Gerry, Barrett Ens, Adam Drogemuller, Bruce Thomas, Mark Billinghurst

    Lynda Gerry, Barrett Ens, Adam Drogemuller, Bruce Thomas, and Mark Billinghurst. 2018. Levity: A Virtual Reality System that Responds to Cognitive Load. In Extended Abstracts of the 2018 CHI Conference on Human Factors in Computing Systems (CHI EA '18). ACM, New York, NY, USA, Paper LBW610, 6 pages. DOI: https://doi.org/10.1145/3170427.3188479

    @inproceedings{Gerry:2018:LVR:3170427.3188479,
    author = {Gerry, Lynda and Ens, Barrett and Drogemuller, Adam and Thomas, Bruce and Billinghurst, Mark},
    title = {Levity: A Virtual Reality System That Responds to Cognitive Load},
    booktitle = {Extended Abstracts of the 2018 CHI Conference on Human Factors in Computing Systems},
    series = {CHI EA '18},
    year = {2018},
    isbn = {978-1-4503-5621-3},
    location = {Montreal QC, Canada},
    pages = {LBW610:1--LBW610:6},
    articleno = {LBW610},
    numpages = {6},
    url = {http://doi.acm.org/10.1145/3170427.3188479},
    doi = {10.1145/3170427.3188479},
    acmid = {3188479},
    publisher = {ACM},
    address = {New York, NY, USA},
    keywords = {brain computer interface, cognitive load, virtual reality, visual search task},
    }

    This paper presents the ongoing development of a proof-of-concept, adaptive system that uses a neurocognitive signal to facilitate efficient performance in a Virtual Reality visual search task. The Levity system measures and interactively adjusts the display of a visual array during a visual search task based on the user's level of cognitive load, measured with a 16-channel EEG device. Future developments will validate the system and evaluate its ability to improve search efficiency by detecting and adapting to a user's cognitive demands.
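
    A minimal, hypothetical sketch of the adaptive loop the abstract describes: poll a cognitive-load estimate and thin out the visual-search array when load is high, then repopulate it when load drops. The class names, thresholds, and the random load stub below are illustrative assumptions; the actual Levity system derives load online from a 16-channel EEG pipeline rather than a random number.

    import random
    from dataclasses import dataclass

    @dataclass
    class SearchArray:
        # A visual-search display: one target hidden among distractors.
        distractor_count: int
        min_distractors: int = 8
        max_distractors: int = 48

        def adjust(self, load: float, low: float = 0.3, high: float = 0.7) -> None:
            # Thin the array under high load; repopulate it under low load.
            if load > high and self.distractor_count > self.min_distractors:
                self.distractor_count -= 4
            elif load < low and self.distractor_count < self.max_distractors:
                self.distractor_count += 4

    def read_cognitive_load() -> float:
        # Stand-in for an EEG-derived cognitive-load estimate in [0, 1].
        return random.random()

    if __name__ == "__main__":
        array = SearchArray(distractor_count=32)
        for trial in range(10):
            load = read_cognitive_load()
            array.adjust(load)
            print(f"trial {trial}: load={load:.2f}, distractors={array.distractor_count}")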