RESOURCES
GuestXR demos
Acoustic coherence and why it is important for interacting in VR
About this GuestXR demo
We usually think VR is all about what we see, but it actually involves many senses, including hearing. This demo has three parts. First, the participant is given a short introduction to audio in VR. Second, they are shown how difficult it is to communicate in noisy, cacophonous environments. Third, they get to experiment with different speech enhancement methods designed to help in these situations.
What does it showcase?
This GuestXR demo shows how important sound is for making VR experiences feel real, especially those that involve talking to others. It simulates a typical meeting in VR, where several users engage in a conversation. We show that the interaction becomes nearly impossible with poor audio. We also let the user experience a few state-of-the-art speech enhancement methods in action, making it easy to understand the remaining challenges of this technology, such as the inability to filter out some noises, added distortions, or altered spatial cues.
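To give a flavour of why enhancement is hard, the sketch below implements one classical approach, spectral subtraction, in Python with NumPy and SciPy. It is purely illustrative and is not one of the methods shown in the demo; the noise-only lead-in length and STFT settings are arbitrary assumptions. Clipping the subtracted spectrum and reusing the noisy phase are exactly the kinds of steps that introduce "musical noise" distortions and fail to restore spatial cues, as mentioned above.

```python
import numpy as np
from scipy.signal import stft, istft

def spectral_subtraction(noisy: np.ndarray, fs: int, noise_seconds: float = 0.5) -> np.ndarray:
    # Short-time Fourier transform of the noisy signal
    nperseg = 512
    _, _, Z = stft(noisy, fs=fs, nperseg=nperseg)
    mag, phase = np.abs(Z), np.angle(Z)

    # Estimate the noise spectrum from the (assumed) noise-only opening of the recording
    hop = nperseg // 2
    noise_frames = max(1, int(noise_seconds * fs / hop))
    noise_mag = mag[:, :noise_frames].mean(axis=1, keepdims=True)

    # Subtract the noise estimate; clipping at zero is a major source of "musical noise"
    clean_mag = np.maximum(mag - noise_mag, 0.0)

    # Resynthesise with the original noisy phase, so phase-borne spatial cues are not restored
    _, enhanced = istft(clean_mag * np.exp(1j * phase), fs=fs, nperseg=nperseg)
    return enhanced
```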

XR Meeting room with an LLM-based AI agent
About this GuestXR demo
Multiple participants, represented by full-body avatars, meet in a room, seated or standing, and can interact naturally through body language, facial and eye movements, and voice. The Guest is represented by a humanoid agent resembling Sigmund Freud that is connected through audio channels to listen to each participant's contribution to the conversation. Freud can answer when a question is addressed directly to him, summarise the conversation, and give advice. His responses are generated by a large language model. The Guest may also add objects to the scene as topics of discussion, in this case a model of planet Earth.
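As a rough illustration of how such an agent can be wired together, the Python sketch below stands in for the listen–reason–speak loop: audio is transcribed, the running transcript is passed to a language model, and the reply is synthesised back to audio. The `transcribe`, `generate_reply` and `speak` functions are placeholders for whichever ASR, LLM and TTS services the platform actually uses; none of this is GuestXR code.

```python
from dataclasses import dataclass, field

def transcribe(audio: bytes) -> str:        # placeholder ASR (assumption, not a GuestXR API)
    return audio.decode("utf-8", errors="ignore")

def generate_reply(prompt: str) -> str:     # placeholder LLM call (assumption)
    return "Perhaps we should consider how this makes everyone feel."

def speak(text: str) -> bytes:              # placeholder TTS (assumption)
    return text.encode("utf-8")

@dataclass
class GuestAgent:
    persona: str = "You are a calm, thoughtful moderator resembling Sigmund Freud."
    transcript: list[str] = field(default_factory=list)

    def on_utterance(self, speaker: str, audio: bytes) -> bytes | None:
        text = transcribe(audio)                         # audio channel -> text
        self.transcript.append(f"{speaker}: {text}")
        if "freud" not in text.lower():                  # only reply when addressed directly
            return None
        prompt = self.persona + "\n" + "\n".join(self.transcript[-20:])
        reply = generate_reply(prompt)                   # LLM produces the summary or advice
        self.transcript.append(f"Guest: {reply}")
        return speak(reply)                              # returned audio is played in the room

# Example: only the utterance addressed to Freud triggers a spoken reply
agent = GuestAgent()
agent.on_utterance("Ana", b"I think the model of the Earth is too small.")
print(agent.on_utterance("Ben", b"Freud, could you summarise our discussion?"))
```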
What does it showcase?
This GuestXR demo showcases the capabilities and possibilities of the GuestXR platform to connect AI models with shared XR spaces. It has many potential applications, including ideation and problem solving, improving communication skills, and group therapy.
An electrophysiology platform for GuestXR
About this GuestXR demo
An electrophysiology training environment that records biosignals, such as electroencephalography (EEG), from multiple participants and monitors their affective states. Biomedical signal pre-processing techniques are applied to remove artifacts with minimal delay. EEG features are then extracted and classified in real time to discriminate different states, such as concentration levels, watching static or dynamic videos, or playing Tetris at slow, medium, or fast speed. This platform holds promise for enhancing social interactions in XR environments.
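For readers unfamiliar with this kind of pipeline, the sketch below shows its general shape: band-pass filtering, band-power feature extraction, and a simple classifier. It assumes EEG epochs already arrive as a NumPy array of shape (n_epochs, n_channels, n_samples); the filter settings, frequency bands, and classifier are illustrative choices, not the platform's actual implementation.

```python
import numpy as np
from scipy.signal import butter, filtfilt, welch
from sklearn.linear_model import LogisticRegression

BANDS = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}   # Hz, illustrative choices

def band_power_features(epochs: np.ndarray, fs: int) -> np.ndarray:
    """epochs: (n_epochs, n_channels, n_samples) -> (n_epochs, n_channels * n_bands)."""
    # Band-pass filter to suppress slow drift and high-frequency artifacts
    b, a = butter(4, [1.0, 40.0], btype="bandpass", fs=fs)
    filtered = filtfilt(b, a, epochs, axis=-1)
    # Power spectral density per epoch and channel
    freqs, psd = welch(filtered, fs=fs, nperseg=fs, axis=-1)
    feats = []
    for lo, hi in BANDS.values():
        idx = (freqs >= lo) & (freqs < hi)
        feats.append(psd[..., idx].mean(axis=-1))       # mean power within the band
    return np.concatenate(feats, axis=-1)

# Usage with hypothetical labelled data: fit once, then classify new epochs as they arrive
# clf = LogisticRegression(max_iter=1000).fit(band_power_features(X_train, fs=250), y_train)
# state = clf.predict(band_power_features(new_epoch[None, ...], fs=250))
```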
What does it showcase?
This GuestXR demo showcases electrophysiology recordings used to distinguish and classify affective categories. Using brain imaging to monitor emotions in social interaction will advance science and enhance predictive models of social behavior. Its integration into XR systems will deepen our understanding of the role of electrophysiology-based emotion recognition in virtual interactions.


The haptic belt


About this GuestXR demo
The participant is immersed in a virtual landscape, embodying a realistic avatar that walks and climbs a slope. As the slope gets steeper and the avatar grows more tired, the participant can “feel” its virtual breathing accelerate thanks to a novel haptic device: a compression belt that actively presses on the abdomen.
What does it showcase?
A novel haptic device has been developed by Inria: an elastic band that is actively controlled to exert pressure on the body. This simple haptic device has proved very effective at providing convincing physical cues in various scenarios and contexts. For instance, it has been used to simulate fake breathing biofeedback and stimulate empathy towards a virtual agent, or to convey a sense of “pressure” and the physical proximity of a virtual agent (proxemics).
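Purely as an illustration of the control idea, and not of Inria's actual device or interface, the sketch below maps a sinusoidal breathing pattern to a normalised pressure command in [0, 1] and streams it to an actuator callback. The waveform, update rate, and `send_pressure` callback are all assumptions.

```python
import math
import time

def breathing_pressure(t: float, breaths_per_min: float, depth: float = 0.6) -> float:
    # Sinusoidal inhale/exhale cycle mapped to a normalised 0..1 pressure command
    phase = 2 * math.pi * (breaths_per_min / 60.0) * t
    return depth * 0.5 * (1.0 + math.sin(phase))

def run_belt(duration_s: float, breaths_per_min: float, send_pressure) -> None:
    # Stream pressure commands at roughly 50 Hz for the requested duration
    start = time.monotonic()
    while (t := time.monotonic() - start) < duration_s:
        send_pressure(breathing_pressure(t, breaths_per_min))
        time.sleep(0.02)

# Hypothetical usage: faster simulated breathing as the virtual slope gets steeper
# run_belt(10.0, breaths_per_min=14, send_pressure=print)   # walking on the flat
# run_belt(10.0, breaths_per_min=28, send_pressure=print)   # climbing a steep slope
```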
Open Call: Warsaw protest on abortion rights
About this GuestXR demo
Multiple participants, represented by full-body avatars, gather on a street in Warsaw, standing or walking and interacting naturally through body language, facial expressions, eye movements, and voice. They witness a protest about abortion rights involving a large crowd, amidst scenes of destroyed cars, smoke, and explosions.
What does it showcase?
This GuestXR demo showcases the collaboration with the Open Call winner, aiming to integrate three historic protests: the 2020 abortion rights protest in Warsaw, the 2019 pro-democracy protests in Hong Kong, and Greta Thunberg’s 2018 climate change speech in Katowice.


Persuasive haptics

About this GuestXR demo
The participant is immersed in a virtual meeting scenario, during which they face two virtual agents speaking to them. The two agents speak alternately. The speech of one of the two agents is artificially augmented with the “persuasive vibration” technique. The participant is asked to identify which agent is the most convincing. The hypothesis is that the haptically augmented agent will be perceived as more persuasive than the other.
What does it showcase?
This demo showcases a novel haptic technique called “persuasive vibration”. A vibration that follows a person’s speech has been found to increase confidence and persuasiveness during social interactions in XR involving verbal communication. It could, for instance, be used to artificially increase the weight given to a “shy” speaker during a virtual meeting.
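As a rough sketch of how a vibration can be made to follow speech, the code below extracts a smoothed amplitude envelope from a speech waveform and normalises it into a 0..1 vibration intensity command. This mapping is an assumption about one plausible approach, not a description of the actual persuasive-vibration implementation or hardware.

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

def speech_to_vibration(speech: np.ndarray, fs: int, cutoff_hz: float = 10.0) -> np.ndarray:
    # Amplitude envelope of the speech waveform
    envelope = np.abs(hilbert(speech))
    # Low-pass the envelope so the actuator follows syllable-rate modulation, not audio-rate cycles
    b, a = butter(2, cutoff_hz, btype="low", fs=fs)
    smooth = filtfilt(b, a, envelope)
    # Normalise to a 0..1 intensity command for a vibrotactile actuator
    return np.clip(smooth / (smooth.max() + 1e-9), 0.0, 1.0)
```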
Climate game in social XR
About this GuestXR demo
In the cryptocurrency mining scenario, three players represented by avatars mine energy, a common resource visualised as a pile of blocks. The game environment adjusts based on the resource state, personal gain, and collective actions. Players must strategise to avoid depleting the resource while experiencing the impact of their actions on the environment.
What does it showcase?
This demo uses a “tragedy of the commons” scenario that relates cryptocurrency mining to climate change. It shows the consequences of over-exploiting a common resource, energy, its impact on the climate, and the importance of sustainable practices. It highlights how gameplay can foster a deeper relationship with the climate as a shared resource and how GuestXR interventions help players make better decisions for the environment within the game.
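The tragedy-of-the-commons dynamic at the heart of the game can be illustrated with a toy model: a shared energy stock that regrows logistically while players extract from it each round. This is a minimal sketch in the same spirit as the demo, not its actual game logic; all numbers are arbitrary.

```python
def simulate_commons(extraction_per_player, rounds=20, stock=100.0, capacity=100.0, growth=0.2):
    """Toy common-pool resource: each round players mine, then the stock regrows logistically."""
    history = []
    for _ in range(rounds):
        mined = min(sum(extraction_per_player), stock)        # cannot mine more than remains
        stock -= mined
        stock += growth * stock * (1.0 - stock / capacity)    # logistic regrowth
        history.append(round(stock, 1))
        if stock <= 0:
            break
    return history

print(simulate_commons([1, 1, 1]))   # modest mining: the stock settles near a sustainable level
print(simulate_commons([4, 4, 4]))   # greedy mining: the shared stock collapses within ~12 rounds
```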

An example of The Guest in operation


About this GuestXR demo
Multiple participants, represented by avatars, engage in a conversation moderated by an AI-controlled participant known as The Guest, whose role is to ensure balanced participation by employing direct interventions, such as asking quieter participants for their opinions, and subtle techniques, like redirecting attention towards them. This approach creates a seamless and natural conversation flow, promoting an engaging and inclusive discussion where all participants feel heard and valued.
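A hypothetical sketch of the balancing idea, in Python: track how long each participant has spoken and flag whom The Guest might invite in next. The threshold, the participant names, and the notion of a "nudge" are illustrative assumptions, not GuestXR's actual moderation logic.

```python
from collections import defaultdict

class ParticipationTracker:
    def __init__(self, imbalance_ratio: float = 0.5):
        self.seconds_spoken: dict[str, float] = defaultdict(float)
        self.imbalance_ratio = imbalance_ratio   # "quiet" = spoke less than half the group average

    def add_utterance(self, speaker: str, duration_s: float) -> None:
        self.seconds_spoken[speaker] += duration_s

    def who_needs_a_nudge(self) -> str | None:
        if len(self.seconds_spoken) < 2:
            return None
        average = sum(self.seconds_spoken.values()) / len(self.seconds_spoken)
        quietest = min(self.seconds_spoken, key=self.seconds_spoken.get)
        if self.seconds_spoken[quietest] < self.imbalance_ratio * average:
            return quietest      # The Guest might ask: "Maya, what do you think about this?"
        return None

# Hypothetical usage with made-up participants
tracker = ParticipationTracker()
tracker.add_utterance("Alex", 45.0)
tracker.add_utterance("Maya", 5.0)
print(tracker.who_needs_a_nudge())   # -> "Maya"
```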
What does it showcase?
This demo showcases the innovative potential of the GuestXR approach in moderating virtual interactions, highlighting its ability to foster balanced and inclusive conversations. By ensuring equal participation, The Guest addresses common communication challenges in virtual settings, making this technology highly applicable to virtual meetings, online education, therapy sessions, and social VR experiences.
VRUnited
About this GuestXR demo
In this demo, the participant experiences VRUnited’s innovative features, including the creation of realistic avatars from a single photo, a variety of shared virtual experiences such as concerts and interviews, and advanced interaction features such as realistic lip sync, facial tracking, and gaze direction. Importantly, the participant sees VRUnited’s cross-platform functionality, which operates on a wide range of VR headsets, from mobile to desktop, and even on a standard desktop computer for non-immersive access.
What does it showcase?
This demo highlights VRUnited’s cutting-edge technology and accessibility, showing how sophisticated VR interactions are achievable without complex hardware setups. It demonstrates the platform’s versatility across various scenarios, emphasising enhanced user experience through realistic interactions and adaptive embodiment. Additionally, by emphasising its cross-platform compatibility, the demo shows VRUnited’s wide accessibility.

Effect of social presence on auditory threat perception – an EEG+VR study

About this GuestXR demo
In the demo, the participant sees a series of paintings appear on a wall in front of them. While they watch the paintings, sounds are played at various points. This happens over three blocks. In the first block the participant is alone while watching the paintings. In the second block they are with another avatar in the room, controlled by a real person. In the third block they are with another avatar, controlled by the computer, which does not respond meaningfully to events in the environment. During the real experiment, we record electroencephalography (EEG) signals from the brain, as well as heart rate, and we film the participants as they engage with the VR environment.
What does it showcase?
This demo investigates the effect of different conditions of social presence on people’s response to threatening or stressful input, specifically sounds. The aim is to analyse how people respond to other avatars within virtual environments and in particular how the degree of animacy affects their perception of the avatars.