Seungwon Kim

Research Fellow

Dr. Seungwon Kim is a Postdoctoral Research Fellow investigating video conferencing interfaces for remote collaboration. Using Augmented Reality (AR) technology to add virtual objects to the shared live video of a video conference, his main research theme is designing interfaces that present users’ information through these virtual objects, and studying how the virtual objects help users collaborate better remotely. He especially focuses on enhancing the shared experience in remote collaboration while ensuring the virtual objects interfere as little as possible with users’ understanding of the real world.

He received his Ph.D. in Human Interface Technology from HITLab NZ in November 2016 under the supervision of Prof. Mark Billinghurst, Dr. Gun Lee, and Dr. Christoph Bartneck. During his Ph.D., he received a UC Doctoral Scholarship from the University of Canterbury (April 2012 to April 2016). He developed one of the early interfaces that anchors virtual annotations in the real world without markers or prior data, and introduced an auto-freeze function for drawing annotation interfaces, which holds the shared live video still while an annotation is being drawn so the drawing stays aligned with the scene.
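
To illustrate the idea, here is a minimal sketch of an auto-freeze behaviour, assuming an OpenCV-based video viewer; the function and window names are hypothetical, and this is a sketch of the concept rather than his published implementation.

    # Minimal auto-freeze sketch (assumed OpenCV-based viewer, hypothetical names).
    # While the user drags the mouse to draw, the view holds the frame the drawing
    # started on, so strokes stay aligned with the scene; releasing the button
    # clears the strokes and resumes the live feed.
    import cv2

    def run_auto_freeze(camera_index: int = 0) -> None:
        cap = cv2.VideoCapture(camera_index)
        last_frame = None   # most recent live frame
        frozen = None       # frame held while the user is drawing
        strokes = []        # annotation points collected during a freeze

        def on_mouse(event, x, y, flags, _param):
            nonlocal frozen
            if event == cv2.EVENT_LBUTTONDOWN and last_frame is not None:
                frozen = last_frame.copy()          # drawing started: freeze
            elif event == cv2.EVENT_MOUSEMOVE and frozen is not None \
                    and flags & cv2.EVENT_FLAG_LBUTTON:
                strokes.append((x, y))              # record the stroke
            elif event == cv2.EVENT_LBUTTONUP:
                frozen = None                       # drawing done: unfreeze
                strokes.clear()

        cv2.namedWindow("shared view")
        cv2.setMouseCallback("shared view", on_mouse)
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            last_frame = frame
            canvas = (frozen if frozen is not None else frame).copy()
            for x, y in strokes:
                cv2.circle(canvas, (x, y), 2, (0, 0, 255), -1)
            cv2.imshow("shared view", canvas)
            if cv2.waitKey(1) == 27:                # Esc quits
                break
        cap.release()
        cv2.destroyAllWindows()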

In 2013, he was selected through the Microsoft Worldwide Internship program to join the Nexus group at Microsoft Research (MSR) in Redmond. At MSR, he developed interfaces for Skype that include three additional views (a high-quality image view, a map view, and a scene view) alongside the live video stream.

He completed his Bachelor’s and Master’s degrees in Computer Science at the University of Tasmania, where he received a Tasmanian International Scholarship (TIS). He is also a member of the Golden Key International Honour Society, which is open only to students in the top 15 percent of their Bachelor’s or Master’s cohort.

Projects

  • SharedSphere

    SharedSphere is a Mixed Reality remote collaboration system that not only shares a live-captured immersive 360 panorama, but also supports enriched two-way communication and collaboration by sharing non-verbal communication cues such as view awareness cues, drawn annotations, and hand gestures (a small illustrative sketch of a view awareness cue follows this project list).

  • Augmented Mirrors

    Mirrors are physical displays that show our real world in reflection. While physical mirrors simply show what is in the real-world scene, with the help of digital technology we can also alter the reality reflected in the mirror. The Augmented Mirrors project explores visualisation and interaction techniques for exploiting mirrors as Augmented Reality (AR) displays, focusing in particular on using user interface agents to guide user interaction with Augmented Mirrors (a minimal mirror-view sketch follows this list).

  • Empathy Glasses

    We have been developing a remote collaboration system with Empathy Glasses, a head-worn display designed to create a stronger feeling of empathy between remote collaborators. To do this, we combined a head-mounted see-through display with a facial expression recognition system, a heart rate sensor, and an eye tracker. The goal is to enable a remote person to see and hear from another person’s perspective and to understand how they are feeling. In this way, the system shares non-verbal cues that could help increase empathy between remote collaborators (a sketch of how such cues might be bundled appears below).
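
A minimal sketch of the kind of view awareness cue SharedSphere overlays: mapping a collaborator’s view direction onto equirectangular 360 panorama coordinates so a marker can show where the partner is looking. The function name and angle conventions are assumptions for illustration, not the published implementation.

    # Map a view direction (yaw/pitch) to a pixel in an equirectangular
    # panorama, e.g. to draw a "partner is looking here" marker.
    def view_to_panorama_px(yaw_deg: float, pitch_deg: float,
                            pano_w: int, pano_h: int) -> tuple:
        # yaw_deg:   horizontal angle, -180..180, 0 = panorama centre
        # pitch_deg: vertical angle,   -90..90,   0 = horizon
        u = (yaw_deg + 180.0) / 360.0      # 0..1 across the width
        v = (90.0 - pitch_deg) / 180.0     # 0..1 down the height
        return int(u * (pano_w - 1)), int(v * (pano_h - 1))

    # Example: looking 45 degrees right and 5 degrees up in a 4096x2048
    # panorama puts the marker near (2559, 966).
    print(view_to_panorama_px(45.0, 5.0, 4096, 2048))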
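
The basic trick behind an augmented mirror view can be sketched in a few lines, assuming a webcam stands in for the mirror; this is an illustrative assumption about the setup, not the project’s implementation.

    # Flip a live camera frame horizontally so it behaves like a mirror,
    # then composite virtual content into the "reflection".
    import cv2

    cap = cv2.VideoCapture(0)
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        mirror = cv2.flip(frame, 1)        # 1 = flip around the vertical axis
        cv2.putText(mirror, "virtual agent here", (30, 60),
                    cv2.FONT_HERSHEY_SIMPLEX, 1.0, (0, 255, 0), 2)
        cv2.imshow("augmented mirror", mirror)
        if cv2.waitKey(1) == 27:           # Esc quits
            break
    cap.release()
    cv2.destroyAllWindows()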
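
The non-verbal cues Empathy Glasses combines could be bundled into one message per sample for the remote side; the field names and JSON transport below are assumptions for illustration, not the actual system protocol.

    # Bundle one sample of non-verbal cues for sending alongside the video.
    import json
    import time
    from dataclasses import dataclass, asdict

    @dataclass
    class EmpathyCues:
        timestamp: float       # capture time, seconds since epoch
        expression: str        # label from the facial expression recogniser
        heart_rate_bpm: int    # from the worn heart rate sensor
        gaze_x: float          # normalised eye-tracker gaze point, 0..1
        gaze_y: float

    def encode_cues(cues: EmpathyCues) -> bytes:
        return json.dumps(asdict(cues)).encode("utf-8")

    sample = EmpathyCues(time.time(), "happy", 72, 0.43, 0.55)
    print(encode_cues(sample))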

Publications

  • Mixed Reality Collaboration through Sharing a Live Panorama
    Gun A. Lee, Theophilus Teo, Seungwon Kim, Mark Billinghurst

    Gun A. Lee, Theophilus Teo, Seungwon Kim, and Mark Billinghurst. 2017. Mixed reality collaboration through sharing a live panorama. In SIGGRAPH Asia 2017 Mobile Graphics & Interactive Applications (SA '17). ACM, New York, NY, USA, Article 14, 4 pages. http://doi.acm.org/10.1145/3132787.3139203

    @inproceedings{Lee:2017:MRC:3132787.3139203,
    author = {Lee, Gun A. and Teo, Theophilus and Kim, Seungwon and Billinghurst, Mark},
    title = {Mixed Reality Collaboration Through Sharing a Live Panorama},
    booktitle = {SIGGRAPH Asia 2017 Mobile Graphics \& Interactive Applications},
    series = {SA '17},
    year = {2017},
    isbn = {978-1-4503-5410-3},
    location = {Bangkok, Thailand},
    pages = {14:1--14:4},
    articleno = {14},
    numpages = {4},
    url = {http://doi.acm.org/10.1145/3132787.3139203},
    doi = {10.1145/3132787.3139203},
    acmid = {3139203},
    publisher = {ACM},
    address = {New York, NY, USA},
    keywords = {panorama, remote collaboration, shared experience},
    }

    One of the popular features on modern social networking platforms is sharing live 360 panorama video. This research investigates how to further improve shared live panorama based collaborative experiences by applying Mixed Reality (MR) technology. SharedSphere is a wearable MR remote collaboration system. In addition to sharing a live captured immersive panorama, SharedSphere enriches the collaboration through overlaying MR visualisation of non-verbal communication cues (e.g., view awareness and gesture cues). User feedback collected through a preliminary user study indicated that sharing of live 360 panorama video was beneficial by providing a more immersive experience and supporting view independence. Users also felt that the view awareness cues were helpful for understanding the remote collaborator’s focus.

  • The Effect of Collaboration Styles and View Independence on Video-Mediated Remote Collaboration
    Seungwon Kim, Mark Billinghurst, Gun Lee

    Kim, S., Billinghurst, M., & Lee, G. (2018). The Effect of Collaboration Styles and View Independence on Video-Mediated Remote Collaboration. Computer Supported Cooperative Work (CSCW), 1-39.

    @Article{Kim2018,
    author="Kim, Seungwon
    and Billinghurst, Mark
    and Lee, Gun",
    title="The Effect of Collaboration Styles and View Independence on Video-Mediated Remote Collaboration",
    journal="Computer Supported Cooperative Work (CSCW)",
    year="2018",
    month="Jun",
    day="02",
    issn="1573-7551",
    doi="10.1007/s10606-018-9324-2",
    url="https://doi.org/10.1007/s10606-018-9324-2"
    }

    This paper investigates how different collaboration styles and view independence affect remote collaboration. Our remote collaboration system shares a live video of a local user’s real-world task space with a remote user. The remote user can have an independent view or a dependent view of a shared real-world object manipulation task and can draw virtual annotations onto the real-world objects as a visual communication cue. With the system, we investigated two different collaboration styles: (1) remote expert collaboration where a remote user has the solution and gives instructions to a local partner and (2) mutual collaboration where neither user has a solution but both remote and local users share ideas and discuss ways to solve the real-world task. In the user study, the remote expert collaboration showed a number of benefits over the mutual collaboration. With the remote expert collaboration, participants had better communication from the remote user to the local user, more aligned focus between participants, and the remote participants’ feeling of enjoyment and togetherness. However, the benefits were not always apparent at the local participants’ end, especially with measures of enjoyment and togetherness. The independent view also had several benefits over the dependent view, such as allowing remote participants to freely navigate around the workspace while having a wider fully zoomed-out view. The benefits of the independent view were more prominent in the mutual collaboration than in the remote expert collaboration, especially in enabling the remote participants to see the workspace.

  • Improving Collaboration in Augmented Video Conference using Mutually Shared Gaze
    Gun Lee, Seungwon Kim, Youngho Lee, Arindam Dey, Thammathip Piumsomboon, Mitchell Norman and Mark Billinghurst

    Gun Lee, Seungwon Kim, Youngho Lee, Arindam Dey, Thammathip Piumsomboon, Mitchell Norman and Mark Billinghurst. 2017. Improving Collaboration in Augmented Video Conference using Mutually Shared Gaze. In Proceedings of ICAT-EGVE 2017 - International Conference on Artificial Reality and Telexistence and Eurographics Symposium on Virtual Environments, pp. 197-204. http://dx.doi.org/10.2312/egve.20171359

    @inproceedings {egve.20171359,
    booktitle = {ICAT-EGVE 2017 - International Conference on Artificial Reality and Telexistence and Eurographics Symposium on Virtual Environments},
    editor = {Robert W. Lindeman and Gerd Bruder and Daisuke Iwai},
    title = {{Improving Collaboration in Augmented Video Conference using Mutually Shared Gaze}},
    author = {Lee, Gun A. and Kim, Seungwon and Lee, Youngho and Dey, Arindam and Piumsomboon, Thammathip and Norman, Mitchell and Billinghurst, Mark},
    year = {2017},
    publisher = {The Eurographics Association},
    ISSN = {1727-530X},
    ISBN = {978-3-03868-038-3},
    DOI = {10.2312/egve.20171359}
    }

    To improve remote collaboration in video conferencing systems, researchers have been investigating augmenting visual cues onto a shared live video stream. In such systems, a person wearing a head-mounted display (HMD) and camera can share her view of the surrounding real-world with a remote collaborator to receive assistance on a real-world task. While this concept of augmented video conferencing (AVC) has been actively investigated, there has been little research on how sharing gaze cues might affect the collaboration in video conferencing. This paper investigates how sharing gaze in both directions between a local worker and remote helper in an AVC system affects the collaboration and communication. Using a prototype AVC system that shares the eye gaze of both users, we conducted a user study that compares four conditions with different combinations of eye gaze sharing between the two users. The results showed that sharing each other’s gaze significantly improved collaboration and communication.