Zhuang Chang

PhD Student

Zhuang Chang is a Ph.D. candidate at the Empathic Computing Laboratory at the University of Auckland, New Zealand, supervised by Prof. Mark Billinghurst and Dr. Huidong Bai. His research focuses on empathic Mixed Reality with emotional agents.

He received a master’s degree in advanced manufacturing engineering from Northwestern Polytechnical University, China, in 2018. His master’s research focused on designing and evaluating Augmented Reality systems for occluded assembly tasks in narrow spaces. In 2017 he visited the Empathic Computing Laboratory at the University of South Australia, where he conducted research on eliciting and sharing emotions in collaborative Mixed Reality.

Projects

  • TBI Cafe

    Over 36,000 Kiwis experience a Traumatic Brain Injury (TBI) every year. TBI patients often experience severe cognitive fatigue, which impairs their ability to cope in public and social settings. Rehabilitation can involve taking people into social settings with a therapist so that they can relearn how to interact in these environments, but this is a time-consuming, expensive and difficult process. To address this, we've created the TBI Cafe, a VR tool designed to help TBI patients cope with their injury and practice interacting in a cafe. In this application, users in VR practice ordering food and drink while interacting with virtual characters. Different types of distractions are introduced, such as a crying baby and loud conversations, which are designed to make the experience more stressful and let the user practice managing stressful situations. Clinical trials with the software are currently underway.

Publications

  • An Asynchronous Hybrid Cross Reality Collaborative System
    Hyunwoo Cho, Bowen Yuan, Jonathon Derek Hart, Eunhee Chang, Zhuang Chang, Jiashuo Cao, Gun A. Lee, Thammathip Piumsomboon, and Mark Billinghurst.

    Cho, H., Yuan, B., Hart, J. D., Chang, E., Chang, Z., Cao, J., ... & Billinghurst, M. (2023, October). An asynchronous hybrid cross reality collaborative system. In 2023 IEEE International Symposium on Mixed and Augmented Reality Adjunct (ISMAR-Adjunct) (pp. 70-73). IEEE.

    @inproceedings{cho2023asynchronous,
      title={An asynchronous hybrid cross reality collaborative system},
      author={Cho, Hyunwoo and Yuan, Bowen and Hart, Jonathon Derek and Chang, Eunhee and Chang, Zhuang and Cao, Jiashuo and Lee, Gun A and Piumsomboon, Thammathip and Billinghurst, Mark},
      booktitle={2023 IEEE International Symposium on Mixed and Augmented Reality Adjunct (ISMAR-Adjunct)},
      pages={70--73},
      year={2023},
      organization={IEEE}
    }

    This work presents a Mixed Reality (MR)-based asynchronous hybrid cross reality collaborative system that supports recording and playback of user actions in a three-dimensional task space at different points in time. Using this system, an expert user can record a task process, such as virtual object placement or assembly, which other users can then view in either Augmented Reality (AR) or Virtual Reality (VR) at later points in time to complete the task. In VR, the pre-scanned 3D workspace can be experienced to enhance the understanding of spatial information. Alternatively, AR can provide real-scale information to help workers manipulate real-world objects and complete the task assignment. Users can also seamlessly move between AR and VR views as desired. In this way, the system can contribute to improving task performance and co-presence during asynchronous collaboration.
  • Time Travellers: An Asynchronous Cross Reality Collaborative System
    Hyunwoo Cho, Bowen Yuan, Jonathon Derek Hart, Zhuang Chang, Jiashuo Cao, Eunhee Chang, and Mark Billinghurst.

    Cho, H., Yuan, B., Hart, J. D., Chang, Z., Cao, J., Chang, E., & Billinghurst, M. (2023, October). Time Travellers: An Asynchronous Cross Reality Collaborative System. In 2023 IEEE International Symposium on Mixed and Augmented Reality Adjunct (ISMAR-Adjunct) (pp. 848-853). IEEE.

    @inproceedings{cho2023time,
      title={Time Travellers: An Asynchronous Cross Reality Collaborative System},
      author={Cho, Hyunwoo and Yuan, Bowen and Hart, Jonathon Derek and Chang, Zhuang and Cao, Jiashuo and Chang, Eunhee and Billinghurst, Mark},
      booktitle={2023 IEEE International Symposium on Mixed and Augmented Reality Adjunct (ISMAR-Adjunct)},
      pages={848--853},
      year={2023},
      organization={IEEE}
    }

    This work presents a Mixed Reality (MR)-based asynchronous hybrid cross reality collaborative system that supports recording and playback of user actions in a large task space at different points in time. Using this system, an expert can record a task process, such as virtual object placement or assembly, which other users can then view in either Augmented Reality (AR) or Virtual Reality (VR) at later points in time to complete the task. In VR, the pre-scanned 3D workspace can be experienced to enhance the understanding of spatial information. Alternatively, AR can provide real-scale information to help workers manipulate real-world objects and complete the assignment. Users can seamlessly switch between AR and VR views as desired. In this way, the system can contribute to improving task performance and co-presence during asynchronous collaboration. The system is demonstrated in a use-case scenario of object assembly using parts that must be retrieved from a storehouse location. A pilot user study found that the cross reality asynchronous collaboration system was helpful in providing information about work environments, leading to faster task completion with a lower task load. We provide lessons learned and suggestions for future research.
  • Exploring Real-time Precision Feedback for AR-assisted Manual Adjustment in Mechanical Assembly
    Xingyue Tang, Zhuang Chang, Weiping He, Mark Billinghurst, and Xiaotian Zhang.

    Tang, X., Chang, Z., He, W., Billinghurst, M., & Zhang, X. (2023, October). Exploring Real-time Precision Feedback for AR-assisted Manual Adjustment in Mechanical Assembly. In Proceedings of the 29th ACM Symposium on Virtual Reality Software and Technology (pp. 1-11).

    @inproceedings{tang2023exploring,
      title={Exploring Real-time Precision Feedback for AR-assisted Manual Adjustment in Mechanical Assembly},
      author={Tang, Xingyue and Chang, Zhuang and He, Weiping and Billinghurst, Mark and Zhang, Xiaotian},
      booktitle={Proceedings of the 29th ACM Symposium on Virtual Reality Software and Technology},
      pages={1--11},
      year={2023}
    }

    Augmented Reality (AR) based manual assembly can now guide the process of physical tasks, providing intuitive instructions and detailed information in real time. However, few studies have explored AR manual adjustment tasks with precision requirements. In this paper, we develop an AR-assisted guidance system for manual adjustments with relatively high precision requirements. We first assessed the accuracy of the specially set-up OptiTrack system to determine the threshold of precision requirements for our user study. We then evaluated the performance of Number-based and Bar-based precision feedback by comparing orientation assembly errors, task completion time, and usability in a user study. We found that the orientation assembly errors in the Number-based and Bar-based interfaces were significantly lower than in the baseline condition, while there was no significant difference between the Number-based and Bar-based interfaces. Furthermore, the Number-based interface showed faster task completion times, lower workload, and higher usability than the Bar-based condition.
  • The impact of virtual agents’ multimodal communication on brain activity and cognitive load in virtual reality
    Zhuang Chang, Huidong Bai, Li Zhang, Kunal Gupta, Weiping He, and Mark Billinghurst.

    Chang, Z., Bai, H., Zhang, L., Gupta, K., He, W., & Billinghurst, M. (2022). The impact of virtual agents’ multimodal communication on brain activity and cognitive load in virtual reality. Frontiers in Virtual Reality, 3, 995090.

    @article{chang2022impact,
      title={The impact of virtual agents’ multimodal communication on brain activity and cognitive load in virtual reality},
      author={Chang, Zhuang and Bai, Huidong and Zhang, Li and Gupta, Kunal and He, Weiping and Billinghurst, Mark},
      journal={Frontiers in Virtual Reality},
      volume={3},
      pages={995090},
      year={2022},
      publisher={Frontiers Media SA}
    }

    Related research has shown that collaborating with Intelligent Virtual Agents (IVAs) embodied in Augmented Reality (AR) or Virtual Reality (VR) can improve task performance and reduce task load. Human cognition and behaviors are controlled by brain activity, which can be captured and reflected by Electroencephalogram (EEG) signals. However, little research has been done to understand users’ cognition and behaviors using EEG while they interact with IVAs embodied in AR and VR environments. In this paper, we investigate the impact of a virtual agent’s multimodal communication in VR on users’ EEG signals, as measured by alpha band power. We developed a desert survival game in which participants make decisions collaboratively with the virtual agent in VR. We evaluated three different communication methods in a within-subject pilot study: 1) a Voice-only Agent, 2) an Embodied Agent with speech and gaze, and 3) a Gestural Agent with a gesture pointing at the object while talking about it. No significant difference was found in the EEG alpha band power. However, the alpha band ERD/ERS calculated around the moment when the virtual agent started speaking indicated that providing a virtual body for the sudden speech could avoid an abrupt attentional demand at speech onset. Moreover, a sudden gesture coupled with the speech induced more attentional demand, even though the speech was matched with the virtual body. This work is the first to explore the impact of IVAs’ interaction methods in VR on users’ brain activity, and our findings contribute to IVA interaction design.
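
A note on the ERD/ERS measure mentioned in the last abstract: event-related desynchronization/synchronization is conventionally reported as the percentage change in band power relative to a pre-event reference interval (following Pfurtscheller and Lopes da Silva). The short Python sketch below illustrates that computation for the alpha band around a speech-onset event; the sampling rate, filter settings, and window lengths are illustrative assumptions, not the parameters used in the paper.

    # Minimal sketch of an alpha-band ERD/ERS computation:
    # ERD/ERS% = (A - R) / R * 100, where R is band power in a reference
    # (baseline) window and A is band power in the analysis window just
    # after the event (e.g. the agent starting to speak). Negative values
    # indicate desynchronization (ERD), positive values synchronization (ERS).
    import numpy as np
    from scipy.signal import butter, filtfilt

    FS = 256  # assumed EEG sampling rate in Hz (illustrative only)

    def alpha_band_power(eeg, fs=FS, low=8.0, high=12.0):
        """Band-pass a single-channel signal to the alpha band and return mean power."""
        b, a = butter(4, [low, high], btype="bandpass", fs=fs)
        filtered = filtfilt(b, a, eeg)
        return np.mean(filtered ** 2)

    def erd_ers_percent(eeg, event_sample, fs=FS, baseline_s=2.0, window_s=1.0):
        """ERD/ERS (%) around an event, relative to the pre-event baseline."""
        baseline = eeg[event_sample - int(baseline_s * fs):event_sample]
        window = eeg[event_sample:event_sample + int(window_s * fs)]
        r = alpha_band_power(baseline, fs)
        a = alpha_band_power(window, fs)
        return (a - r) / r * 100.0

    if __name__ == "__main__":
        # Synthetic example: 10 s of noise with a hypothetical speech onset at 5 s.
        rng = np.random.default_rng(0)
        signal = rng.standard_normal(10 * FS)
        print(f"alpha ERD/ERS: {erd_ers_percent(signal, event_sample=5 * FS):.1f}%")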