ABOUT THE GuestXR PROJECT
Creating a new machine learning agent addressing conflicts and unlocking interaction in extended reality environments
The objective of the GuestXR project is to develop an immersive virtual social space anchored in extended reality, in which individual and group behaviour is examined by drawing on existing theoretical models from neuroscience and social psychology.
The GuestXR solution will therefore feature a machine learning agent designed to unlock interaction between participants, help them achieve their goals and support them in resolving conflicts that arise in virtual environments.
The project technology will be developed under an “Ethics by Design” approach. This entails considering any potential ethical issues that may arise when using Artificial Intelligence, such as respect for human autonomy, prevention of harm, fairness and explicability.
Key data
- Start and end dates: 1 January 2022 – 31 December 2025
- Duration: 48 months
- Funded under Horizon 2020 programme H2020-FETPROACT-2020-2
- Overall budget: €4,499,519.76
- Reference number: 101017884
OUR VISION
Our vision is to develop a socially interactive, multisensory platform that uses extended reality (virtual and augmented reality) as the medium to bring people together for immersive, synchronous, face-to-face interaction with positive social outcomes.
PROJECT PHASES
From an idea to demonstrating the technology in a relevant environment
Development of the Machine Learning Agent
Application of novel deep Reinforcement Learning (RL) methods to train The Guest to acquire social intelligence, so that it can take specific actions that foster pro-social behaviour and social goals in Extended Reality applications, incorporating the ethics-by-design methodology.
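As a rough illustration of what this kind of setup can look like (a minimal sketch only, written against the Gymnasium environment API, with made-up state variables, actions, dynamics and reward that are not the project's actual design), an intervention-learning environment might be framed like this:

```python
# Illustrative only: a toy environment in which an RL agent learns to choose
# social interventions. All state variables, actions and reward terms are
# hypothetical simplifications, not the GuestXR design.
import numpy as np
import gymnasium as gym
from gymnasium import spaces


class ToySocialMeetingEnv(gym.Env):
    """Hypothetical meeting simulation: the agent observes coarse group-state
    features and picks an intervention; reward is higher when tension drops."""

    # 0: do nothing, 1: play soothing music, 2: propose a joint activity,
    # 3: increase virtual interpersonal distance
    N_ACTIONS = 4

    def __init__(self):
        super().__init__()
        # Observation: [estimated tension, speaking-time imbalance], both in [0, 1]
        self.observation_space = spaces.Box(low=0.0, high=1.0, shape=(2,), dtype=np.float32)
        self.action_space = spaces.Discrete(self.N_ACTIONS)
        self.state = None
        self.steps = 0

    def reset(self, *, seed=None, options=None):
        super().reset(seed=seed)
        self.state = self.np_random.uniform(0.3, 0.9, size=2).astype(np.float32)
        self.steps = 0
        return self.state, {}

    def step(self, action):
        tension, imbalance = self.state
        # Crude assumed dynamics: interventions reduce tension by different
        # amounts; doing nothing lets tension drift upward with the imbalance.
        effect = [0.05 * imbalance - 0.01, -0.05, -0.10, -0.08][action]
        tension = float(np.clip(tension + effect + self.np_random.normal(0, 0.02), 0.0, 1.0))
        imbalance = float(np.clip(imbalance + self.np_random.normal(0, 0.05), 0.0, 1.0))
        self.state = np.array([tension, imbalance], dtype=np.float32)
        self.steps += 1
        reward = -tension                # pro-social proxy: low tension is good
        terminated = tension < 0.1       # conflict defused
        truncated = self.steps >= 50     # end of the simulated meeting
        return self.state, reward, terminated, truncated, {}
```

An environment of this shape could then be handed to a standard deep RL implementation (for example, PPO); in the project, the toy dynamics would be replaced by the social and behavioural simulations developed in the modelling phase below.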
Social and affective modelling
Definition of the agent's psychological rules of behaviour, development of social behavioural simulations and computational description of body posture and movement features, as well as investigation of how individuals' expressions affect social interaction dynamics.
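To give a sense of what a computational description of posture and movement can involve (a hypothetical sketch; the joint layout, units and feature choices are assumptions, not the project's actual feature set), two simple features computed from tracked 3D joint positions:

```python
# Illustrative sketch of two hypothetical movement features of the kind such
# behavioural descriptions might include.
import numpy as np


def movement_energy(joints: np.ndarray, fps: float = 30.0) -> float:
    """Mean joint speed over a motion clip.

    joints: array of shape (frames, n_joints, 3) with 3D joint positions in metres.
    """
    velocities = np.diff(joints, axis=0) * fps        # (frames-1, n_joints, 3), m/s
    speeds = np.linalg.norm(velocities, axis=-1)      # per-joint speed per frame
    return float(speeds.mean())


def interpersonal_distance(head_a: np.ndarray, head_b: np.ndarray) -> float:
    """Mean distance between two participants' head positions over a clip.

    head_a, head_b: arrays of shape (frames, 3), in metres.
    """
    return float(np.linalg.norm(head_a - head_b, axis=-1).mean())
```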
Integration into experimental studies
Creation of the overall architecture of the AR and VR systems and development of the multiple representations of The Guest and its actions, as well as self-representation through embodiment in AR and representation of remote participants.
Multisensory displays
Investigation and development of multimodal rendering, such as haptics, research on multisensory integration, and provision of setups and guidelines for the project use cases.
Use cases development
System validation through different use cases, including conflict situations and scenarios involving participants with communication problems.
THE GUEST
Helping people achieve their meeting goals
The Guest is a machine learning system devised as an agent that can examine participants' individual and group behaviour by drawing on existing theoretical models from neuroscience and social psychology.
The Guest will therefore be able to perform a range of multisensory actions, using, for instance, visual or auditory features to create particular states of mind and foster a relaxed atmosphere when it identifies a conflict amongst participants. This combination of AI with immersive systems (virtual and augmented reality) will be a hugely challenging research task, given the vagaries of social meetings and individual behaviour.
Moreover, The Guest may, for example:
- introduce a small obstacle whose overcoming requires the cooperation of all participants;
- provide an opportunity for a relaxing common activity;
- change the background music to a more soothing one;
- disrupt escalating interactions by increasing the physical distance or introducing a physical separation between participants (changing the environment);
- engage each participant in positive interaction with a dedicated actor.
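Purely as an illustration of how such interventions might be selected (a hand-written sketch with an assumed conflict score and escalation flag; in GuestXR this choice is meant to be made by the learned agent rather than by fixed rules like these):

```python
# Illustrative only: a hard-coded mapping from a hypothetical group state to
# the kinds of interventions listed above.
from dataclasses import dataclass


@dataclass
class GroupState:
    conflict_score: float   # hypothetical estimate in [0, 1]
    escalating: bool        # hypothetical flag from the interaction dynamics


def choose_intervention(state: GroupState) -> str:
    if state.escalating and state.conflict_score > 0.8:
        # Strongest measure: change the environment itself.
        return "increase_physical_distance"
    if state.conflict_score > 0.6:
        return "introduce_cooperative_obstacle"
    if state.conflict_score > 0.4:
        return "switch_to_soothing_music"
    if state.conflict_score > 0.2:
        return "propose_relaxing_common_activity"
    return "no_intervention"
```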