Towards the detection of driver-pedestrian eye contact

Vishal Onkhar, Pavlo Bazilinskyy, Jork C.J. Stapel, Dimitra Dodou, Dariu Gavrila, Joost C.F. de Winter

Research output: Contribution to journal › Article › Academic › peer-review

11 Citations (Scopus)

Abstract

Non-verbal communication, such as eye contact between drivers and pedestrians, has been regarded as one way to reduce accident risk. So far, studies have assumed rather than objectively measured the occurrence of eye contact. We address this research gap by developing an eye contact detection method and testing it in an indoor experiment with scripted driver–pedestrian interactions at a pedestrian crossing. Thirty participants each acted as a pedestrian, either standing on an imaginary curb or crossing an imaginary one-lane road in front of a stationary vehicle with an experimenter in the driver’s seat. In half of the trials, pedestrians were instructed to make eye contact with the driver; in the other half, they were prohibited from doing so. Both parties’ gaze was recorded using eye trackers. An in-vehicle stereo camera recorded the car’s point of view, a head-mounted camera recorded the pedestrian’s point of view, and the locations of the driver’s and pedestrian’s eyes were estimated using image recognition. We demonstrate that eye contact can be detected by measuring the angles between the vector joining the estimated locations of the driver’s and pedestrian’s eyes, and the pedestrian’s and driver’s instantaneous gaze directions, respectively, and identifying whether these angles fall below a threshold of 4°. We achieved 100% correct classification of the trials with and without eye contact, based on measured eye contact duration. The proposed eye contact detection method may be useful for future research into eye contact.
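The decision rule described in the abstract can be sketched in a few lines of code. The following Python snippet is a minimal illustration, not the authors' implementation: the function names, eye positions, gaze vectors, and the shared world frame are synthetic assumptions; only the geometry (both angles to the line of sight below 4°) follows the abstract.

```python
import numpy as np

EYE_CONTACT_THRESHOLD_DEG = 4.0  # threshold reported in the abstract


def angle_deg(u, v):
    """Angle in degrees between two 3-D vectors."""
    cos = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
    return np.degrees(np.arccos(np.clip(cos, -1.0, 1.0)))


def is_eye_contact(driver_eye, pedestrian_eye, driver_gaze, pedestrian_gaze,
                   threshold=EYE_CONTACT_THRESHOLD_DEG):
    """Detect mutual eye contact at a single time instant.

    driver_eye, pedestrian_eye: estimated 3-D eye positions (e.g. from
    image recognition), expressed in a common world frame (assumed).
    driver_gaze, pedestrian_gaze: instantaneous gaze direction vectors
    from the respective eye trackers, in the same frame (assumed).
    """
    line_of_sight = pedestrian_eye - driver_eye        # driver -> pedestrian
    driver_angle = angle_deg(driver_gaze, line_of_sight)
    pedestrian_angle = angle_deg(pedestrian_gaze, -line_of_sight)
    return driver_angle < threshold and pedestrian_angle < threshold


# Hypothetical example: both parties look straight at each other.
driver_eye = np.array([0.0, 0.0, 1.4])
pedestrian_eye = np.array([6.0, 0.0, 1.6])
driver_gaze = pedestrian_eye - driver_eye
pedestrian_gaze = driver_eye - pedestrian_eye
print(is_eye_contact(driver_eye, pedestrian_eye, driver_gaze, pedestrian_gaze))  # True
```

Per-sample detections of this kind can then be accumulated over a trial into an eye contact duration, the quantity the abstract reports using to classify trials with and without eye contact.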
Original language: English
Article number: 101455
Number of pages: 14
Journal: Pervasive and Mobile Computing
Volume: 76
DOIs
Publication status: Published - September 2021
Externally published: Yes

Funding

This research is supported by grant 016.Vidi.178.047 (“How should automated vehicles communicate with other road users?”), which is financed by the Netherlands Organisation for Scientific Research (NWO). We want to express our gratitude to experimenters Lars Kooijman, Sparsh Sharma, Anand Sudha, and Arjun Anantharaman, who took turns operating the torch in our study. An additional debt of thanks is owed to Lars Kooijman for his suggestions during the early stages of our work. We are also grateful to the participants for taking the time to help us conduct our research. We further appreciate the efforts of the many members (past and current) of the Department of Cognitive Robotics at the Delft University of Technology in building the intelligent vehicle we used in our experiment. All authors read and approved the final manuscript.

Funders: Nederlandse Organisatie voor Wetenschappelijk Onderzoek

Keywords

• Driver–pedestrian interaction
• Eye contact
• Eye tracking
• Image recognition
• Wearable devices
