D4.1 Specifications of Multisensory Cues - Executive summary

This task investigates the principles and neural underpinnings of multisensory perception. This report presents the results of research on the use of multisensory cues, including audiovisual and audiotactile cues, for social interaction in extended reality (XR) environments. Eight studies were conducted to evaluate the effectiveness of various combinations of visual, auditory, and tactile cues for conveying information in XR. It was found that audiovisual cues can represent critical visual information outside the user's field of view through audio, and that vibrotactile solutions can be integrated into XR systems to deliver information. It was also found that combined visual and auditory cues can represent social cues, such as facial expressions and interpersonal distance, and convey spatial information in XR environments. These studies suggest that multisensory integration can convey social cues, such as tone of voice, emotional valence, and prosody, to both hearing-impaired users and the general public. It can also provide additional cues, such as location information, that support social interaction in XR environments and enhance the participation and leadership of remote users in meetings. These findings may have applications in promoting inclusion and improving communication in XR environments. Based on the findings of these studies, it is concluded that multisensory cues can be used effectively for social interaction in XR environments, and that their integration is trainable for a range of uses.
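
As a purely illustrative sketch (not taken from the deliverable), the following Python snippet shows one way critical visual information outside the field of view could be rendered as a stereo audio cue, in the spirit of the audiovisual substitution described above. The function name, the 90-degree field-of-view value, and the pan and gain mappings are all assumptions made for the example.

import math

# Hypothetical sketch: render an off-field-of-view target as a stereo cue.
# The FOV value and the pan/gain mappings are illustrative assumptions.

FOV_DEG = 90.0  # assumed horizontal field of view of the XR display


def off_fov_audio_cue(user_pos, user_heading_deg, target_pos):
    """Return (pan, gain) for a target outside the field of view.

    pan:  -1.0 = hard left, +1.0 = hard right
    gain:  0.0 .. 1.0, louder for closer targets
    Returns None when the target is already visible.
    """
    dx = target_pos[0] - user_pos[0]
    dy = target_pos[1] - user_pos[1]
    # Bearing of the target relative to the user's facing direction,
    # normalised to [-180, 180) degrees (0 = straight ahead, +y axis).
    bearing = math.degrees(math.atan2(dx, dy)) - user_heading_deg
    bearing = (bearing + 180.0) % 360.0 - 180.0

    if abs(bearing) <= FOV_DEG / 2.0:
        return None  # inside the field of view: no audio cue needed

    pan = max(-1.0, min(1.0, bearing / 90.0))  # lateralise toward the target
    gain = 1.0 / (1.0 + math.hypot(dx, dy))    # simple distance attenuation
    return pan, gain


# Example: a person approaching from behind and to the left of the user
# produces a hard-left, moderately quiet cue.
print(off_fov_audio_cue(user_pos=(0.0, 0.0), user_heading_deg=0.0,
                        target_pos=(-2.0, -3.0)))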