Analysing Listener Behaviour Through Gaze Data and User Performance during a Sound Localisation Task in a VR Environment

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

1 Citation (Scopus)

Abstract

Virtual reality (VR) is used today in a variety of applications. Some examples include virtual meetings, simulators (driving, flight, medical interventions), digital twins and healthcare systems. In this context, audio plays a significant role in adding emotion and connecting users to what they see on the screen. For this reason, it is critical to evaluate how users perceive these stimuli in order to improve their quality of experience (QoE). The main objective of this work is to evaluate how listeners perceive spatialised audio in a VR environment. To accomplish this goal, users are required to localise multiple sound sources in the acoustic scene while also listening to distractors. In addition, this work investigates user interaction with the environment by comparing two interaction methods used to localise the sound source. Findings from this experiment show that eye gaze can be associated with levels of effort and cognitive load in a sound localisation task. Furthermore, other gaze-related metrics, such as the number of fixations and the number of revisits to virtual objects, were also used as descriptors of user attention in this task. The results obtained improve the understanding of a listener's behaviour in VR.
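
As an illustration of the gaze metrics mentioned in the abstract, the sketch below shows one possible way to derive per-object fixation and revisit counts from a logged gaze stream. It assumes gaze samples are recorded as (timestamp, hit-object) pairs from the headset's eye tracker; the function name, the 100 ms fixation threshold, and the object labels are illustrative assumptions, not details taken from the paper.

```python
from collections import Counter
from typing import List, Tuple

# Hypothetical gaze log entry: (timestamp in seconds, name of the virtual object hit by the gaze ray).
GazeSample = Tuple[float, str]

def fixations_and_revisits(samples: List[GazeSample],
                           min_fixation_s: float = 0.1) -> Tuple[Counter, Counter]:
    """Count per-object fixations and revisits from a sequence of gaze hits.

    A 'fixation' here is a run of consecutive samples on the same object lasting
    at least min_fixation_s; every fixation on an object after the first one is
    counted as a revisit. Thresholds are illustrative, not from the paper.
    """
    fixations: Counter = Counter()
    revisits: Counter = Counter()
    run_obj, run_start = None, None

    def close_run(end_time: float) -> None:
        # Register the current run as a fixation if it lasted long enough.
        if run_obj is not None and end_time - run_start >= min_fixation_s:
            if fixations[run_obj] > 0:
                revisits[run_obj] += 1
            fixations[run_obj] += 1

    for t, obj in samples:
        if obj != run_obj:
            close_run(t)
            run_obj, run_start = obj, t
    if samples:
        close_run(samples[-1][0])
    return fixations, revisits

# Usage: two separated fixations on "source_A" yield one revisit.
log = [(0.00, "source_A"), (0.05, "source_A"), (0.15, "source_A"),
       (0.20, "distractor"), (0.35, "distractor"),
       (0.40, "source_A"), (0.55, "source_A")]
fix, rev = fixations_and_revisits(log)
print(fix, rev)
```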

Original language: English
Title of host publication: Proceedings - 2022 IEEE International Symposium on Mixed and Augmented Reality Adjunct, ISMAR-Adjunct 2022
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 485-490
Number of pages: 6
ISBN (Electronic): 9781665453653
DOIs
Publication status: Published - 2022
Event: 21st IEEE International Symposium on Mixed and Augmented Reality Adjunct, ISMAR-Adjunct 2022 - Singapore, Singapore
Duration: 17 Oct 2022 – 21 Oct 2022

Publication series

Name: Proceedings - 2022 IEEE International Symposium on Mixed and Augmented Reality Adjunct, ISMAR-Adjunct 2022

Conference

Conference: 21st IEEE International Symposium on Mixed and Augmented Reality Adjunct, ISMAR-Adjunct 2022
Country/Territory: Singapore
City: Singapore
Period: 17/10/22 – 21/10/22

Keywords

  • Eye Gaze
  • H.1.2.b [Human-centred computing]: Human-computer interaction
  • H.5.1.b [Multimedia Information Systems]: Artificial, augmented, and virtual realities
  • Human Behaviour
  • Quality of Experience
  • Spatial Audio
  • Virtual Reality
