Online Seminar Series
Unlocking the potential of XR: Shaping a pro-social metaverse

16th May - 27th June 2024

Inclusion, accessibility, and social interaction lie at the heart of shaping immersive environments for a global audience. As we journey through the boundless realms of virtual, augmented and extended reality, it becomes imperative to foster environments that not only inspire creativity and learning but also prioritize the well-being of every participant.

Hosted by Frontiers in Virtual Reality Journal and in collaboration with the GuestXR European project, this series of webinars explores how XR, AI, neuroscience, and social psychology converge to shape inclusive, accessible, and socially impactful virtual and augmented reality environments. From deploying AI for social good to leveraging EEG technology for emotional enhancement, we uncover the latest advancements driving empathy, inclusivity, and positive social change within XR spaces. Discover how multisensory experiences and innovative haptic feedback mechanisms are fostering empathy, action, and conflict resolution, envisioning a future where technology transcends boundaries to create a more harmonious and interconnected world.

Scheduled seminars

AI-enhanced XR experiences for social good

In this webinar, we will delve into the integration of Extended Reality (XR) technology and Artificial Intelligence (AI) and its potential across various societal sectors. The talk will feature two pioneering case studies showcasing applications of AI-enhanced XR for social good. The first case study introduces “Inceptor,” a system addressing the global mental health crisis by transforming text scripts into interactive social VR scenarios. This tool aids in training professionals in resilience-building techniques and has demonstrated significant potential with robust evidential support. The second case study, focused on the GuestXR project, explores AI’s role in promoting pro-social behaviors within emerging virtual environments such as the Metaverse. Using a reinforcement learning framework, the AI meta-agent, the Guest, is trained in agent-based social simulations and later deployed in real XR environments to encourage positive interactions among users. These discussions will not only showcase innovative applications but also examine the practical and ethical challenges of deploying AI within XR environments.

Doron Friedman, Professor and Head of the Advanced Reality Lab, Reichman University

Doron Friedman is an Associate Professor at the Sammy Ofer School of Communications, where he heads the Advanced Reality Lab. His academic journey began with a PhD in Computer Science from Tel Aviv University, focusing on AI-based cinematography in virtual environments, followed by a postdoctoral fellowship at University College London, where he explored controlling virtual reality using brain-computer interfaces. A seasoned entrepreneur, Prof. Friedman has contributed significantly to the fields of intelligent systems and the Internet, with his work resulting in several patents and commercial products. Since 2008, he and his lab have engaged in national and international projects related to telepresence, virtual reality, and advanced human-computer interfaces, with a current focus on LLM-based virtual humans. His multidisciplinary research, spanning areas such as Artificial Intelligence and Human-Computer Interaction, has been published in prestigious journals and conferences and frequently features in major media outlets worldwide.

Brain-Computer Interfaces in Virtual Environments: Synergies between EEG and XR

May 30th, 2024

15:00h CEST

This webinar will present a toolkit, developed by g.tec, for seamlessly integrating Brain-Computer Interface (BCI) interactions into the Unity game engine. The toolkit allows developers and designers to focus on game mechanics without having to build a pipeline for signal processing and classification, and even to rapidly prototype gamified research tasks. During this seminar, we will present the electrophysiology research conducted within the GuestXR project and show how to leverage Unity’s native XR integrations and the g.tec Unicorn headband to build hybrid experiences combining EEG and mixed reality.

Michele Romani, University of Trento & g.tec medical engineering GmbH

Michele Romani works in R&D at g.tec medical engineering GmbH and is a PhD student at the CIMIL lab of the University of Trento. He holds a double degree in Human-Computer Interaction and Intelligent Systems. During his studies he became passionate about brain-computer interfaces and has been working with EEG, machine learning, and Unity since 2019. He is currently developing the Unicorn Unity Interface and researching BCI for gaming applications.

Enhancing social trust in Virtual Reality through affective haptic feedback

June 13th, 2024

15:00h CEST

This webinar will explore the role of affective haptic feedback in shaping social trust and connections within virtual reality (VR) environments. We’ll discuss the psychological mechanisms that underpin social trust, how tactile sensations can convey emotional and social cues, and their potential for creating more immersive and socially engaging VR experiences. Participants will learn about the possibilities for integrating affective haptic feedback to foster empathy and improve user interactions in VR settings.


Jeanne Hecquard, PhD student, Inria centre at Rennes University

Jeanne Hecquard is a second-year PhD student interested in virtual interactions between users. She obtained an MSc in Computer Science from the National Institute of Applied Sciences (INSA), as well as a Master’s degree in Informatics from Rennes University (France). Jeanne recently joined Inria, where she is working on haptic devices for immersive social communication.


Studying copresence in the GuestXR project with VR United

June 27th, 2024

15:00h CEST

In this webinar, we will introduce the idea of copresence: the illusion of being with, and interacting with, remotely located people in a shared VR. We will introduce the technology behind an application called ‘VR United’, show how copresence is achievable technically, and present case studies that have examined it experimentally. We will discuss how copresence evolved in a study involving small groups of people, including how measurements based on sentiment analysis are used.

Ramon Oliva, postdoctoral researcher, Event Lab, University of Barcelona

Ramon Oliva has a PhD in Computer Science from the Universitat Politècnica de Catalunya (UPC). He is currently a senior postdoctoral researcher in the Event Lab at the Universitat de Barcelona, Spain. His work has concentrated on technical developments in VR. In particular, he developed the QuickVR system, software that flexibly supports embodiment in VR. He also developed VR United, a system that lets multiple people simultaneously interact in a shared VR, each with a virtual body that looks like themselves. This was recently used by the Financial Times to conduct a long-distance interview between London and New York, which resulted in the 2023 IEEE CG&A paper “The Making of a Newspaper Interview in Virtual Reality: Realistic Avatars, Philosophy, and Sushi,” vol. 43, no. 6, pp. 117-125. He has around 40 publications with an h-index of 21.


Esen Küçüktütüncü, PhD student, Event Lab, University of Barcelona

Esen Küçüktütüncü obtained her Master’s degree in Cognitive Systems and Interactive Media at Universitat Pompeu Fabra, Barcelona, Spain. She is currently pursuing a PhD degree in the Event Lab, Institute of Neurosciences. Her research focuses on emergent social dynamics within social virtual reality and the development of virtual agents to ensure social harmony in immersive shared environments. She has carried out a number of experimental studies on both presence and copresence in shared virtual environments, and will bring that experience to this seminar.