Dr. Seungwon Kim is a Postdoctoral Research Fellow investigating remote collaboration systems. Using Augmented Reality technology, he adds visual communication cues (such as pointers, sketches, and virtual hands) to the shared live video of a video conferencing system and studies their effect on remote collaboration. He has presented papers at major A/A* international conferences and in journals such as CHI, ISMAR, JVCI, TIIS, and CSCW. He has also reviewed dozens of papers for CHI, ISMAR, VR, TVCG, BIT, and IMWUT.
He received his Ph.D. in Human Interface Technology from HIT Lab NZ in November 2016 under the supervision of Prof. Mark Billinghurst, Dr. Gun Lee, and Dr. Christoph Bartneck. During his Ph.D., he held a UC Doctoral Scholarship from the University of Canterbury (April 2012 to April 2016). He developed one of the early remote collaboration interfaces that anchors virtual sketches in the real world without markers or prior scene data, and introduced an auto-freeze function for a drawing annotation interface.
In 2013, he was selected through the Microsoft Worldwide Internship program by the Nexus group at Microsoft Research (MSR) in Redmond. At MSR, he developed interfaces for Skype that include three additional views (a high-quality image view, a map view, and a scene view) alongside the live video stream.
He completed his Bachelor's and Master's degrees in Computer Science at the University of Tasmania, where he received the Tasmanian International Scholarship (TIS) during both degrees. He is also a member of the Golden Key International Honour Society, which is open only to the top 15 percent of students during their Bachelor's or Master's degrees.
SharedSphere is a Mixed Reality-based remote collaboration system that not only allows sharing a live-captured immersive 360° panorama, but also supports enriched two-way communication and collaboration through sharing non-verbal communication cues, such as view awareness cues, drawn annotations, and hand gestures.
Mirrors are physical displays that show the real world in reflection. While physical mirrors simply show what is in the real-world scene, with the help of digital technology we can also alter the reality reflected in the mirror. The Augmented Mirrors project explores visualisation and interaction techniques for exploiting mirrors as Augmented Reality (AR) displays. The project focuses in particular on using user interface agents to guide user interaction with Augmented Mirrors.
We have been developing a remote collaboration system with Empathy Glasses, a head-worn display designed to create a stronger feeling of empathy between remote collaborators. To do this, we combined a head-mounted see-through display with a facial expression recognition system, a heart rate sensor, and an eye tracker. The goal is to enable a remote person to see and hear from another person's perspective and to understand how they are feeling. In this way, the system shares non-verbal cues that could help increase empathy between remote collaborators.
Gun A. Lee, Theophilus Teo, Seungwon Kim, and Mark Billinghurst. 2017. Mixed reality collaboration through sharing a live panorama. In SIGGRAPH Asia 2017 Mobile Graphics & Interactive Applications (SA '17). ACM, New York, NY, USA, Article 14, 4 pages. http://doi.acm.org/10.1145/3132787.3139203
Seungwon Kim, Mark Billinghurst, and Gun Lee. 2018. The Effect of Collaboration Styles and View Independence on Video-Mediated Remote Collaboration. Computer Supported Cooperative Work (CSCW), 1-39.
Gun Lee, Seungwon Kim, Youngho Lee, Arindam Dey, Thammathip Piumsomboon, Mitchell Norman and Mark Billinghurst. 2017. Improving Collaboration in Augmented Video Conference using Mutually Shared Gaze. In Proceedings of ICAT-EGVE 2017 - International Conference on Artificial Reality and Telexistence and Eurographics Symposium on Virtual Environments, pp. 197-204. http://dx.doi.org/10.2312/egve.20171359