“I am going this way”: Gazing Eyes on Self-Driving Car Show Multiple Driving Directions

This is the project page for one of the sub-projects of the "Gazing Car" project.
This study investigates the communication capabilities of eyes mounted on a car. Our user study showed that a self-driving car's eyes can convey fine-grained turning directions, resolving ambiguity in spacious traffic situations where multiple roads lead toward the same side. [pdf]

Authors

Fig 1

Abstract

Modern cars express three moving directions (left, right, straight) using turn signals (i.e., blinkers), which is insufficient when multiple paths lead toward the same side. Therefore, drivers give additional hints (e.g., gestures, eye contact) in conventional car-to-pedestrian interaction. As more self-driving cars without drivers join public roads, we need additional communication channels. In this work, we discuss the problem of self-driving cars expressing their fine-grained moving direction to pedestrians in addition to blinkers. We built anthropomorphic robotic eyes and mounted them on a real car. We applied the eye-gazing technique based on the common knowledge: I gaze in the direction I am heading. A formal VR-based user study showed that the eyes can convey fine-grained directions: participants distinguished five directions with a lower error rate and in less time compared to conventional turn signals.
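The core idea is a simple mapping from the car's planned heading to the eyes' gaze direction. The sketch below is purely illustrative (the paper does not publish code); the names `EYE_YAW_LIMIT_DEG` and `gaze_yaw_for_heading`, and the specific heading angles, are hypothetical assumptions.

```python
import math

# Illustrative sketch only: maps a planned heading angle to an eye pan angle.
EYE_YAW_LIMIT_DEG = 60.0  # assumed mechanical pan range of each robotic eye

def gaze_yaw_for_heading(heading_deg: float) -> float:
    """Map the car's planned heading (degrees, 0 = straight ahead,
    positive = right) to an eye pan angle, clamped to the servo range."""
    return max(-EYE_YAW_LIMIT_DEG, min(EYE_YAW_LIMIT_DEG, heading_deg))

# Example: five candidate directions (hard left to hard right) and the
# corresponding eye pan commands.
for heading in (-90, -45, 0, 45, 90):
    print(f"heading {heading:+d} deg -> eye yaw {gaze_yaw_for_heading(heading):+.1f} deg")
```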

Video

Citations for the two figures used in the video:
ePermitTest. 2020. Hand Signals for Driving A Car. Retrieved April 26, 2022 from https://www.epermittest.com/drivers-education/hand-signals-driving
toptreadtyres. 2018. Car Horn. Retrieved April 26, 2022 from https://www.toptreadtyres.co.uk/horn/

Publication

Xinyue Gui, Koki Toda, Stela Hanbyeol Seo, Chia-Ming Chang, and Takeo Igarashi. 2022. “I am going this way”: Gazing Eyes on Self-Driving Car Show Multiple Driving Directions. In Proceedings of the 14th International Conference on Automotive User Interfaces and Interactive Vehicular Applications (AutomotiveUI '22). Association for Computing Machinery, New York, NY, USA, 319–329. https://doi.org/10.1145/3543174.3545251

Chia-Ming Chang, Koki Toda, Xinyue Gui, Stela H. Seo, and Takeo Igarashi. 2022. Can Eyes on a Car Reduce Traffic Accidents? In Proceedings of the 14th International Conference on Automotive User Interfaces and Interactive Vehicular Applications (AutomotiveUI '22). Association for Computing Machinery, New York, NY, USA, 349–359. https://doi.org/10.1145/3543174.3546841

Chia-Ming Chang. 2020. A Gender Study of Communication Interfaces between an Autonomous Car and a Pedestrian. In 12th International Conference on Automotive User Interfaces and Interactive Vehicular Applications (AutomotiveUI '20). Association for Computing Machinery, New York, NY, USA, 42–45. https://doi.org/10.1145/3409251.3411719

Chia-Ming Chang, Koki Toda, Takeo Igarashi, Masahiro Miyata, and Yasuhiro Kobayashi. 2018. A Video-based Study Comparing Communication Modalities between an Autonomous Car and a Pedestrian. In Adjunct Proceedings of the 10th International Conference on Automotive User Interfaces and Interactive Vehicular Applications (AutomotiveUI '18). Association for Computing Machinery, New York, NY, USA, 104–109. https://doi.org/10.1145/3239092.3265950

Chia-Ming Chang, Koki Toda, Daisuke Sakamoto, and Takeo Igarashi. 2017. Eyes on a Car: an Interface Design for Communication between an Autonomous Car and a Pedestrian. In Proceedings of the 9th International Conference on Automotive User Interfaces and Interactive Vehicular Applications (AutomotiveUI '17). Association for Computing Machinery, New York, NY, USA, 65–73. https://doi.org/10.1145/3122986.3122989