Shared Reality


The Shared Reality Project investigates remote collaboration ecosystems that are spatial, immersive, and widely accessible. The DualStream augmented reality prototype enables users to spatially share information about themselves and their surroundings in real time. Going beyond traditional audio and video conferencing tools, DualStream can convey three-dimensional information about facial expressions, creating avatars that look and move as we do in real life. Leveraging the mobility and ubiquity of phones, DualStream enables users to simultaneously feel that they are "being there" in a remote location and that remote participants are "being here" in their local environment. DualStream envisions spatial computing that is more widely accessible than experiences reliant on expensive head-worn devices. By building cross-reality ecosystems with stronger connections across mobile devices, PCs, and immersive setups, we can better support collaboration between people no matter where they are located or what tools they have access to.

ACME Lab

Infographic of DualStream system overview


Associated Researchers

Publications

Rishi Vanukuru and Ellen Yi-Luen Do. 2024. "". In: Proceedings of the 2024 ACM International Conference on Interactive Media Experiences (IMX '24). (Stockholm, Sweden, June 12-14, 2024).

Rishi Vanukuru and Ellen Yi-Luen Do. 2023. "". In: IEEE ISMAR 2023 Workshop on Cross-Reality Interactions. (Sydney, Australia, October 16-20, 2023).

Rishi Vanukuru, Suibi Che-Chuan Weng, Krithik Zanjan, Torin Hopkins, Amy Banic, Mark D Gross and Ellen Yi-Luen Do. 2023. "". In: IEEE International Symposium on Mixed and Augmented Reality (ISMAR). (Sydney, Australia, October 16-20, 2023).