SharedSphere is a Mixed Reality (MR) remote collaboration and shared experience platform. It combines an Augmented Reality (AR) experience for a local worker out in the field with a Virtual Reality (VR) experience for a remote expert. The two users are connected through a live 360° panorama video captured and streamed by the local worker.
As an MR 360° panoramic video conferencing platform, SharedSphere supports view independence, which allows the remote expert to look around the local worker’s surroundings independently of the worker’s orientation. SharedSphere also provides situational awareness cues that show each user which direction the other is looking. The platform supports intuitive gesture sharing: the remote expert can see the local worker’s hand gestures in the live video, and SharedSphere uses motion sensing in the VR environment to capture the expert’s hand gestures and render a virtual copy of them for the local worker. Using these virtual hands, the remote expert can point at objects of interest or even draw annotations on real-world objects.
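Because the shared panorama covers the full 360° sphere, each user can render their own view direction independently; a situational awareness cue then only needs the relative angle between the two view directions. The sketch below is a hypothetical illustration of that idea (the function names and the yaw-only simplification are assumptions, not part of SharedSphere's actual implementation):

```python
def wrap_angle(deg):
    """Normalize an angle in degrees to the range [-180, 180)."""
    return (deg + 180.0) % 360.0 - 180.0

def gaze_cue_offset(own_yaw_deg, other_yaw_deg):
    """Horizontal offset (degrees) at which the other user's
    view-direction cue should appear in this user's view of the
    shared panorama. Positive = the other user is looking to
    this user's right. Yaw-only; pitch is ignored for simplicity."""
    return wrap_angle(other_yaw_deg - own_yaw_deg)

# Example: the local worker faces yaw 350°, the remote expert
# looks at yaw 10° in the same panorama; the expert's cue appears
# 20° to the worker's right.
print(gaze_cue_offset(350.0, 10.0))
```

Because the offset is computed per user, the remote expert's rendering never depends on the worker's head motion, which is the essence of view independence.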
Gun A. Lee, Theophilus Teo, Seungwon Kim, and Mark Billinghurst. 2017. Mixed reality collaboration through sharing a live panorama. In SIGGRAPH Asia 2017 Mobile Graphics & Interactive Applications (SA '17). ACM, New York, NY, USA, Article 14, 4 pages. DOI: https://doi.org/10.1145/3132787.3139203
Theophilus Teo, Gun A. Lee, Mark Billinghurst, and Matt Adcock. 2018. Hand gestures and visual annotation in live 360 panorama-based mixed reality remote collaboration. In Proceedings of the 30th Australian Conference on Computer-Human Interaction (OzCHI '18). ACM, New York, NY, USA, 406–410. DOI: https://doi.org/10.1145/3292147.3292200
Zhengqing Li, Liwei Chan, Theophilus Teo, and Hideki Koike. 2020. OmniGlobeVR: A Collaborative 360° Communication System for VR. In Extended Abstracts of the 2020 CHI Conference on Human Factors in Computing Systems (CHI EA '20). ACM, New York, NY, USA, 1–8. DOI: https://doi.org/10.1145/3334480.3382869