Alaeddin Nassani

Research Fellow

Alaeddin is a Research Fellow at the Empathic Computing Laboratory (ECL). His research interests include augmented reality (AR) interfaces, 360-degree video sharing, and integrating haptics into virtual reality environments. As part of the Auckland Bioengineering Institute (ABI) at the University of Auckland, Alaeddin also works with the Augmented Human Lab (AHL) on the Kiwrious project, which aims to inspire schoolchildren to engage in scientific inquiry. Before joining the ECL, Alaeddin worked in the software industry for more than 12 years, building software for mid-size to large organisations in the energy and finance industries. He completed his PhD at the HIT Lab NZ, University of Canterbury, with a focus on wearable augmented reality for sharing social experiences.

Projects

  • Show Me Around

    This project introduces an immersive way to experience a conference call: a 360° camera live streams the host's surroundings to remote viewers. Viewers can freely look around the host's video and gain a better understanding of the host's surroundings. They can also see where the other participants are looking, which helps them follow the conversation and understand what people are paying attention to. In a user study of the system, people found it much more immersive than a traditional video conferencing call and reported feeling transported to the remote location. Possible applications of this system include virtual tourism, education, industrial monitoring, and entertainment.
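
    As a rough sketch of the viewpoint-awareness idea, the snippet below maps a viewer's look direction onto an equirectangular 360° frame so that gaze markers for other participants can be drawn. It assumes an equirectangular projection; all names are illustrative rather than taken from the project.

    // Look direction of one participant, in radians.
    interface Viewpoint {
      yaw: number;   // around the vertical axis, -PI..PI (0 = frame centre)
      pitch: number; // up/down, -PI/2..PI/2
    }

    // Equirectangular mapping: yaw spans the frame width, pitch the height.
    function viewpointToPixel(v: Viewpoint, width: number, height: number) {
      const x = (v.yaw / (2 * Math.PI) + 0.5) * width;
      const y = (0.5 - v.pitch / Math.PI) * height;
      return { x, y };
    }

    // Example: a viewer looking slightly right and up in a 3840x1920 frame.
    const marker = viewpointToPixel({ yaw: 0.4, pitch: 0.2 }, 3840, 1920);
    // marker is the pixel position where this viewer's gaze indicator
    // would be rendered on every other participant's view.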

Publications

  • ShowMeAround: Giving Virtual Tours Using Live 360 Video
    Alaeddin Nassani, Li Zhang, Huidong Bai, Mark Billinghurst.

    Nassani, A., Zhang, L., Bai, H., & Billinghurst, M. (2021, May). ShowMeAround: Giving Virtual Tours Using Live 360 Video. In Extended Abstracts of the 2021 CHI Conference on Human Factors in Computing Systems (pp. 1-4).

    @inproceedings{nassani2021showmearound,
    title={ShowMeAround: Giving Virtual Tours Using Live 360 Video},
    author={Nassani, Alaeddin and Zhang, Li and Bai, Huidong and Billinghurst, Mark},
    booktitle={Extended Abstracts of the 2021 CHI Conference on Human Factors in Computing Systems},
    pages={1--4},
    year={2021}
    }
    This demonstration presents ShowMeAround, a video conferencing system designed to allow people to give virtual tours over live 360 video. Using ShowMeAround, a host presenter walks through a real space and live streams a 360 video view to a small group of remote viewers. The ShowMeAround interface has features such as remote pointing and viewpoint awareness to support natural collaboration between the viewers and the host presenter. The system also enables sharing of pre-recorded high-resolution 360 video and still images to further enhance the virtual tour experience.
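
    The remote-pointing and viewpoint-awareness features suggest a small set of real-time messages shared between participants. Below is a speculative sketch of what such messages could look like over a WebRTC data channel; the message shapes and handlers are assumptions, not the published ShowMeAround protocol.

    // Messages a tour client might broadcast to the other participants.
    type TourMessage =
      | { kind: 'viewpoint'; userId: string; yaw: number; pitch: number }
      | { kind: 'pointer'; userId: string; u: number; v: number }; // u,v in 0..1

    // Send this client's state to everyone on a shared RTCDataChannel.
    function broadcast(channel: RTCDataChannel, msg: TourMessage): void {
      if (channel.readyState === 'open') {
        channel.send(JSON.stringify(msg));
      }
    }

    // Each client updates its overlays when a message arrives.
    function handleMessage(ev: MessageEvent): void {
      const msg = JSON.parse(ev.data as string) as TourMessage;
      if (msg.kind === 'viewpoint') {
        // move msg.userId's gaze marker to (msg.yaw, msg.pitch)
      } else {
        // draw msg.userId's pointer at normalised frame position (msg.u, msg.v)
      }
    }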
  • Jitsi360: Using 360 images for live tours.
    Nassani, A., Bai, H., & Billinghurst, M.

  • Designing, Prototyping and Testing of 360-degree Spatial Audio Conferencing for Virtual Tours.
    Nassani, A., Barde, A., Bai, H., Nanayakkara, S., & Billinghurst, M.

  • Implementation of Attention-Based Spatial Audio for 360° Environments.
    Nassani, A., Barde, A., Bai, H., Nanayakkara, S., & Billinghurst, M.

  • LightSense-Long Distance
    Uwe Rieger, Yinan Liu, Tharindu Kaluarachchi, Amit Barde, Huidong Bai, Alaeddin Nassani, Suranga Nanayakkara, Mark Billinghurst.

    Rieger, U., Liu, Y., Kaluarachchi, T., Barde, A., Bai, H., Nassani, A., ... & Billinghurst, M. (2023). LightSense-Long Distance. In ACM SIGGRAPH Asia 2023 Art Gallery (pp. 1-2).

    @incollection{rieger2023lightsense,
    title={LightSense-Long Distance},
    author={Rieger, Uwe and Liu, Yinan and Kaluarachchi, Tharindu and Barde, Amit and Bai, Huidong and Nassani, Alaeddin and Nanayakkara, Suranga and Billinghurst, Mark},
    booktitle={ACM SIGGRAPH Asia 2023 Art Gallery},
    pages={1--2},
    year={2023}
    }
    'LightSense - Long Distance' explores remote interaction with architectural space. It is a virtual extension of the project 'LightSense,' which is currently presented at the exhibition 'Cyber Physical: Architecture in Real Time' at EPFL Pavilions in Switzerland. Using numerous VR headsets, the setup at the Art Gallery at SIGGRAPH Asia establishes a direct connection between the two exhibition sites in Sydney and Lausanne.
    'LightSense' at EPFL Pavilions is an immersive installation that allows the audience to engage in intimate interaction with a living architectural body. It consists of a 12-meter-long construction that combines a lightweight structure with projected 3D holographic animations. At its core sits a neural network, which has been trained on sixty thousand poems. This allows the structure to engage, lead, and sustain conversations with the visitor. Its responses are truly associative, unpredictable, meaningful, magical, and deeply emotional. Analysing the emotional tenor of the conversation, 'LightSense' can transform into a series of hybrid architectural volumes, immersing the visitors in Pavilions of Love, Anger, Curiosity, and Joy.
    The physical construction of 'LightSense' is linked to a digital twin. Movement, holographic animations, sound, and text responses are controlled by the cloud-based AI system. This combination creates a location-independent cyber-physical system. As such, the 'Long Distance' version, which premiered at SIGGRAPH Asia, enables visitors in Sydney to engage directly with the physical setup in Lausanne. Using VR headsets with a new 360-degree 4K live-streaming system, visitors find themselves teleported to face 'LightSense', able to hold a direct conversation with the structure on-site.
    'LightSense - Long Distance' leaves behind the notion of architecture being a place-bound and static environment. Instead, it points toward the next generation of responsive buildings that transcend space, are capable of dynamic behaviour, and able to accompany their visitors as creative partners.
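
    As a loose illustration of the control loop described above (classify the emotional tenor of an utterance, then steer the structure toward a matching pavilion state), consider the sketch below. The trivial keyword heuristic merely stands in for the poetry-trained neural network, and none of these names come from the installation's actual code.

    // The four hybrid architectural volumes named in the abstract.
    type Pavilion = 'Love' | 'Anger' | 'Curiosity' | 'Joy';

    // Placeholder classifier; the real system uses a cloud-hosted model.
    function classifyEmotion(utterance: string): Pavilion {
      const text = utterance.toLowerCase();
      if (/(love|heart|dear)/.test(text)) return 'Love';
      if (/(anger|rage|fury)/.test(text)) return 'Anger';
      if (/(why|how|wonder)/.test(text)) return 'Curiosity';
      return 'Joy';
    }

    // The same state update would go to the physical structure in Lausanne
    // and to the remote VR clients, keeping the digital twin in sync.
    function onVisitorUtterance(utterance: string, sendState: (p: Pavilion) => void): void {
      sendState(classifyEmotion(utterance));
    }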