TY - GEN
T1 - Analysing Listener Behaviour Through Gaze Data and User Performance during a Sound Localisation Task in a VR Environment
AU - Moraes, Adrielle Nazar
AU - Flynn, Ronan
AU - Murray, Niall
N1 - Publisher Copyright:
© 2022 IEEE.
PY - 2022
Y1 - 2022
N2 - Virtual reality (VR) is used today in a variety of applications. Examples include virtual meetings, simulators (driving, flight, medical interventions), digital twins and healthcare systems. In this context, audio plays a significant role in adding emotion and connecting users to what they see on the screen. For this reason, it is critical to evaluate how users perceive these stimuli in order to improve their quality of experience (QoE). The main objective of this work is to evaluate how listeners perceive spatialised audio in a VR environment. To accomplish this goal, users are required to localise multiple sound sources in the acoustic scene while also listening to distractors. In addition, this work investigates user interaction with the environment by comparing two interaction methods used to localise the sound source. Findings from this experiment show that eye gaze can be associated with levels of effort and cognitive load in a sound localisation task. Furthermore, other metrics related to eye gaze, such as the number of fixations and the number of revisits to virtual objects, were also used as descriptors of user attention in this task. The results obtained improve the understanding of a listener's behaviour in VR.
AB - Virtual reality (VR) is used today in a variety of applications. Examples include virtual meetings, simulators (driving, flight, medical interventions), digital twins and healthcare systems. In this context, audio plays a significant role in adding emotion and connecting users to what they see on the screen. For this reason, it is critical to evaluate how users perceive these stimuli in order to improve their quality of experience (QoE). The main objective of this work is to evaluate how listeners perceive spatialised audio in a VR environment. To accomplish this goal, users are required to localise multiple sound sources in the acoustic scene while also listening to distractors. In addition, this work investigates user interaction with the environment by comparing two interaction methods used to localise the sound source. Findings from this experiment show that eye gaze can be associated with levels of effort and cognitive load in a sound localisation task. Furthermore, other metrics related to eye gaze, such as the number of fixations and the number of revisits to virtual objects, were also used as descriptors of user attention in this task. The results obtained improve the understanding of a listener's behaviour in VR.
KW - Eye Gaze
KW - H.1.2.b [Human-centred computing]: Human-computer interaction
KW - H.5.1.b [Multimedia Information Systems]: Artificial, augmented, and virtual realities
KW - Human Behaviour
KW - Quality of Experience
KW - Spatial Audio
KW - Virtual Reality
UR - http://www.scopus.com/inward/record.url?scp=85146050145&partnerID=8YFLogxK
U2 - 10.1109/ISMAR-Adjunct57072.2022.00102
DO - 10.1109/ISMAR-Adjunct57072.2022.00102
M3 - Conference contribution
AN - SCOPUS:85146050145
T3 - Proceedings - 2022 IEEE International Symposium on Mixed and Augmented Reality Adjunct, ISMAR-Adjunct 2022
SP - 485
EP - 490
BT - Proceedings - 2022 IEEE International Symposium on Mixed and Augmented Reality Adjunct, ISMAR-Adjunct 2022
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 21st IEEE International Symposium on Mixed and Augmented Reality Adjunct, ISMAR-Adjunct 2022
Y2 - 17 October 2022 through 21 October 2022
ER -