Jiashuo Cao

PhD Student

Jiashuo Cao is a Ph.D. candidate at the Empathic Computing Laboratory (ECL) at the University of Auckland. His research interests include Human-Computer Interaction, Assistive Technology, and Extended Reality. Before joining ECL, he completed his master’s studies at the Augmented Human Lab.

Currently, his research primarily focuses on utilizing virtual reality to facilitate remote psychotherapy. He aims to enhance the immersion and effectiveness of remote therapy through the assistance of virtual agents. In his leisure time, he enjoys contemplating the relationship between the individual and the world, as well as playing board games.

Publications

  • An Asynchronous Hybrid Cross Reality Collaborative System
    Hyunwoo Cho, Bowen Yuan, Jonathon Derek Hart, Eunhee Chang, Zhuang Chang, Jiashuo Cao, Gun A. Lee, Thammathip Piumsomboon, and Mark Billinghurst.

    Cho, H., Yuan, B., Hart, J. D., Chang, E., Chang, Z., Cao, J., ... & Billinghurst, M. (2023, October). An asynchronous hybrid cross reality collaborative system. In 2023 IEEE International Symposium on Mixed and Augmented Reality Adjunct (ISMAR-Adjunct) (pp. 70-73). IEEE.

    @inproceedings{cho2023asynchronous,
      title={An asynchronous hybrid cross reality collaborative system},
      author={Cho, Hyunwoo and Yuan, Bowen and Hart, Jonathon Derek and Chang, Eunhee and Chang, Zhuang and Cao, Jiashuo and Lee, Gun A and Piumsomboon, Thammathip and Billinghurst, Mark},
      booktitle={2023 IEEE International Symposium on Mixed and Augmented Reality Adjunct (ISMAR-Adjunct)},
      pages={70--73},
      year={2023},
      organization={IEEE}
    }
    This work presents a Mixed Reality (MR)-based asynchronous hybrid cross reality collaborative system that supports recording and playback of user actions in a three-dimensional task space at different periods in time. Using this system, an expert user can record a task process, such as virtual object placement or assembly, which can then be viewed by other users in either Augmented Reality (AR) or Virtual Reality (VR) views at later points in time to complete the task. In VR, the pre-scanned 3D workspace can be experienced to enhance the understanding of spatial information. Alternatively, AR can provide real-scale information to help the workers manipulate real-world objects and complete the task assignment. Users can also seamlessly move between AR and VR views as desired. In this way, the system can contribute to improving task performance and co-presence during asynchronous collaboration. (A rough code sketch of this record-and-playback idea appears after the publications list.)
  • Time Travellers: An Asynchronous Cross Reality Collaborative System
    Hyunwoo Cho, Bowen Yuan, Jonathon Derek Hart, Zhuang Chang, Jiashuo Cao, Eunhee Chang, and Mark Billinghurst.

    Cho, H., Yuan, B., Hart, J. D., Chang, Z., Cao, J., Chang, E., & Billinghurst, M. (2023, October). Time Travellers: An Asynchronous Cross Reality Collaborative System. In 2023 IEEE International Symposium on Mixed and Augmented Reality Adjunct (ISMAR-Adjunct) (pp. 848-853). IEEE.

    @inproceedings{cho2023time,
      title={Time Travellers: An Asynchronous Cross Reality Collaborative System},
      author={Cho, Hyunwoo and Yuan, Bowen and Hart, Jonathon Derek and Chang, Zhuang and Cao, Jiashuo and Chang, Eunhee and Billinghurst, Mark},
      booktitle={2023 IEEE International Symposium on Mixed and Augmented Reality Adjunct (ISMAR-Adjunct)},
      pages={848--853},
      year={2023},
      organization={IEEE}
    }
    This work presents a Mixed Reality (MR)-based asynchronous hybrid cross reality collaborative system that supports recording and playback of user actions in a large task space at different periods in time. Using this system, an expert can record a task process, such as virtual object placement or assembly, which can then be viewed by other users in either Augmented Reality (AR) or Virtual Reality (VR) at later points in time to complete the task. In VR, the pre-scanned 3D workspace can be experienced to enhance the understanding of spatial information. Alternatively, AR can provide real-scale information to help the workers manipulate real-world objects and complete the assignment. Users can seamlessly switch between AR and VR views as desired. In this way, the system can contribute to improving task performance and co-presence during asynchronous collaboration. The system is demonstrated in a use-case scenario of object assembly using parts that must be retrieved from a storehouse location. A pilot user study found that the cross reality asynchronous collaboration system was helpful in providing information about work environments, leading to faster task completion with a lower task load. We provide lessons learned and suggestions for future research.
  • Striving for Authentic and Sustained Technology Use in the Classroom: Lessons Learned from a Longitudinal Evaluation of a Sensor-Based Science Education Platform
    Yvonne Chua, Sankha Cooray, Juan Pablo Forero Cortes, Paul Denny, Sonia Dupuch, Dawn L. Garbett, Alaeddin Nassani, Jiashuo Cao, Hannah Qiao, Andrew Reis, Deviana Reis, Philip M. Scholl, Priyashri Kamlesh Sridhar, Hussel Suriyaarachchi, Fiona Taimana, Vanessa Tang, Chamod Weerasinghe, Elliott Wen, Michelle Wu, Qin Wu, Haimo Zhang, and Suranga Nanayakkara.

    Chua, Y., Cooray, S., Cortes, J. P. F., Denny, P., Dupuch, S., Garbett, D. L., ... & Nanayakkara, S. (2023). Striving for Authentic and Sustained Technology Use in the Classroom: Lessons Learned from a Longitudinal Evaluation of a Sensor-Based Science Education Platform. International Journal of Human–Computer Interaction, 1-14.

    @article{chua2023striving,
      title={Striving for Authentic and Sustained Technology Use in the Classroom: Lessons Learned from a Longitudinal Evaluation of a Sensor-Based Science Education Platform},
      author={Chua, Yvonne and Cooray, Sankha and Cortes, Juan Pablo Forero and Denny, Paul and Dupuch, Sonia and Garbett, Dawn L and Nassani, Alaeddin and Cao, Jiashuo and Qiao, Hannah and Reis, Andrew and others},
      journal={International Journal of Human--Computer Interaction},
      pages={1--14},
      year={2023},
      publisher={Taylor \& Francis}
    }
    Technology integration in educational settings has led to the development of novel sensor-based tools that enable students to measure and interact with their environment. Although reports from using such tools can be positive, evaluations are often conducted under controlled conditions and over short timeframes. There is a need for longitudinal data collected in realistic classroom settings. However, sustained and authentic classroom use requires technology platforms to be seen by teachers as both easy to use and of value. We describe our development of a sensor-based platform to support science teaching that followed a 14-month design process. We share insights from this design and development approach, and report findings from a six-month large-scale evaluation involving 35 schools and 1245 students. We share lessons learned, including that technology integration is not an educational goal per se and that technology should be a transparent tool to enable students to achieve their learning goals.
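
The two ISMAR-Adjunct systems above revolve around one core mechanism: recording an expert’s timestamped actions in a shared task space and replaying them later for viewers in either AR or VR. As a rough illustration only (this is not code from the papers, and every name in it, such as ActionEvent, ActionRecorder, and playback, is hypothetical), a minimal record-and-playback log in Python might look like this:

    import json
    import time
    from dataclasses import dataclass, asdict

    @dataclass
    class ActionEvent:
        """One recorded user action in the shared task space."""
        timestamp: float   # seconds since the recording started
        user: str          # who performed the action, e.g. "expert"
        action: str        # e.g. "place", "rotate", "attach"
        object_id: str     # identifier of the virtual object acted on
        position: tuple    # (x, y, z) in workspace coordinates
        rotation: tuple    # orientation as a quaternion (x, y, z, w)

    class ActionRecorder:
        """Captures timestamped actions so they can be replayed later."""

        def __init__(self):
            self._events = []
            self._start = time.monotonic()

        def record(self, user, action, object_id, position, rotation):
            self._events.append(ActionEvent(
                timestamp=time.monotonic() - self._start,
                user=user, action=action, object_id=object_id,
                position=position, rotation=rotation))

        def save(self, path):
            # Persist the recording as JSON for later asynchronous playback.
            with open(path, "w") as f:
                json.dump([asdict(e) for e in self._events], f)

    def playback(path, apply_event, speed=1.0):
        """Replays a saved recording, calling apply_event for each action at
        (optionally scaled) real-time intervals. apply_event is where an AR
        or VR renderer would animate the recorded placement or assembly."""
        with open(path) as f:
            events = [ActionEvent(**e) for e in json.load(f)]
        last_t = 0.0
        for e in sorted(events, key=lambda e: e.timestamp):
            time.sleep(max(0.0, (e.timestamp - last_t) / speed))
            apply_event(e)
            last_t = e.timestamp

    if __name__ == "__main__":
        rec = ActionRecorder()
        rec.record("expert", "place", "panel_01", (0.4, 1.1, 0.2), (0, 0, 0, 1))
        rec.save("assembly_demo.json")
        playback("assembly_demo.json", apply_event=print)

Because the log is plain data, the same recording could presumably be replayed in an AR view (registered to the real workspace) or a VR view (inside the pre-scanned 3D model) simply by swapping the apply_event callback, which mirrors the cross reality flexibility the papers describe.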