As technology develops rapidly, traditional perceptual environments disappear and new ones emerge. These changes require the human senses to adapt to new modes of perceptual understanding, for example regarding the integration of sound and vision. Building on the fact that hearing interacts with visual attention processes, this study investigates the effect of different sound design conditions on the perception of cinematic content in immersive audiovisual reproductions. We present the results of a visual selective attention task (counting objects) performed by participants watching a movie ("Ego Cure") on a 270-degree immersive audiovisual display. Four sound conditions employing an increasing number of loudspeakers were used: mono, stereo, 5.1, and 7.1.4. Eye tracking recorded the participants' gaze during the task. The eye-tracking data showed that a larger number of loudspeakers and a wider spatial audio distribution diffused the participants' attention from the task-related part of the display toward non-task-related directions. Significantly more participants looked at the task-irrelevant part of the display in the 7.1.4 condition than in the mono condition. This implies that additional spatial cues in the auditory modality automatically influence human visual attention (involuntary eye movements) and the analysis of visual information. Sound engineers should consider this when mixing educational or other information-oriented productions.
Title of host publication: 12th Asia Pacific Workshop on Mixed and Augmented Reality (APMAR)
Publisher: IEEE Computer Society
Publication status: Published - 9 May 2019
Conference: 2019 12th Asia Pacific Workshop on Mixed and Augmented Reality (APMAR), Nara Institute of Science and Technology (NAIST), Nara, Japan
Duration: 28 Mar 2019 → 29 Mar 2019