Abstract
In our natural environment, we simultaneously receive information through various sensory modalities. The properties of these stimuli are coupled by physical laws, so that, for example, auditory and visual stimuli caused by the same event have a fixed temporal, spatial, and contextual relation when reaching the observer. In speech, for instance, visible lip movements and audible utterances occur in close synchrony, which improves speech intelligibility under adverse acoustic conditions. Because research into multisensory time perception is carried out in a great variety of experimental and application contexts, it is difficult to obtain a good overview of the specific problems in this area. The present paper is intended to provide such an in-depth overview for the phenomenon of audio-visual asynchrony.
| Original language | English |
| --- | --- |
| Pages (from-to) | 140-149 |
| Number of pages | 10 |
| Journal | IPO Annual Progress Report |
| Volume | 35 |
| Publication status | Published - 2000 |