- File Size 1.82 MB
- File Count 1
- Create Date January 7, 2025
- Last Updated January 7, 2025
D3.7 Spatial Coherence Report and Demonstration
This document reports on demonstrations addressing physical coherence across multiple locations in XR. We discuss how scanning and scene-understanding descriptions of physical spaces can be used to create a coherent shared space where multiple remote people can meet. We describe how real-time tracking data is used to derive full-body avatar movement, enabling people to walk naturally in virtual or augmented spaces. We discuss possibilities for using geometric and lighting information about physical spaces to create coherent physics, lighting, and acoustics simulations. Furthermore, we present research on a new technique that treats acoustic coherence as a style-transfer problem solved with neural networks. Alongside the technical descriptions, we describe three demonstrations that illustrate the developed technologies.
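To give a rough sense of how real-time tracking data can be turned into avatar locomotion, the sketch below estimates walking speed and heading from consecutive head positions. This is a minimal illustration with hypothetical names, not the deliverable's actual full-body pipeline, which derives motion from richer tracking data.

```python
import math

def locomotion_from_head_positions(p_prev, p_curr, dt):
    """Estimate walking speed (m/s) and heading (radians) from two
    consecutive head positions (x, y, z), with y pointing up.

    Hypothetical helper for illustration only: vertical motion is
    ignored, so only horizontal displacement drives the avatar.
    """
    dx = p_curr[0] - p_prev[0]
    dz = p_curr[2] - p_prev[2]
    speed = math.hypot(dx, dz) / dt      # horizontal speed
    heading = math.atan2(dx, dz)         # 0 rad points along +z
    return speed, heading

# Example: head moves 0.5 m along +z over 0.5 s -> 1 m/s, heading 0
speed, heading = locomotion_from_head_positions(
    (0.0, 1.7, 0.0), (0.0, 1.7, 0.5), 0.5)
```

In a full system, such per-frame speed and heading values would feed an animation layer (e.g. a blend tree) that produces the corresponding full-body walking motion.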
The video demonstrations presented are also available on GuestXR’s YouTube channel: https://www.youtube.com/@GuestXR