With the development of augmented reality (AR) and virtual reality (VR) technologies, many remote collaboration systems have been built using head-mounted displays (HMDs). These systems are especially helpful for remote assistance on physical tasks, where a remote expert can see the local user's workspace and provide feedback and guidance.
In this project, we focus on how combining technologies such as Augmented Reality and Wearable Computing can create novel remote collaboration systems that enable remote experts to help workers. The system should allow a local user to send a view of their environment to a remote user, along with empathic cues that express emotion and intention, such as gesture, gaze, facial expressions, and physiological signals. In return, the remote user can send visual cues to the local user's see-through HMD to help them perform better on a real-world task.
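As a minimal sketch of the bidirectional exchange described above, the cues could be serialized as simple structured messages. All field names and cue categories below are illustrative assumptions, not part of any actual implementation of this project.

```python
# Hypothetical sketch of the cue exchange: local worker -> remote expert
# (view + empathic cues) and remote expert -> local HMD (visual annotations).
# All field names here are assumptions for illustration only.
import json
from dataclasses import dataclass, field, asdict


@dataclass
class LocalCueMessage:
    """Cues streamed from the local worker to the remote expert."""
    frame_id: int                # index of the accompanying camera frame
    gaze: tuple                  # normalized (x, y) gaze point on the frame
    gesture: str                 # e.g. "pointing", "grasping"
    facial_expression: str       # e.g. "neutral", "confused"
    heart_rate_bpm: float        # one example physiological signal


@dataclass
class RemoteCueMessage:
    """Visual cues sent back for display on the local see-through HMD."""
    frame_id: int
    annotations: list = field(default_factory=list)  # e.g. arrows, circles


msg = LocalCueMessage(frame_id=42, gaze=(0.5, 0.3), gesture="pointing",
                      facial_expression="neutral", heart_rate_bpm=72.0)
payload = json.dumps(asdict(msg))  # ready to send over the network
```

In a real system these messages would accompany a video or point-cloud stream; the sketch only shows how heterogeneous cues might travel in one serializable unit.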
We expect that our system will be helpful for the future of remote collaboration in areas such as medical training, manufacturing, and education.
For this project, we collaborate with Katsutoshi Masai, Kai Kunze, and Maki Sugimoto at Keio University, Japan.