Work Zone Safety III: Calibration of Safety Notifications through Reinforcement Learning and Eye Tracking

According to the Federal Highway Administration (FHWA), fatalities at road construction work zones account for up to 3% of all workplace fatalities in a given year, the primary causes being runovers/backovers, collisions, and workers caught between mobile equipment. Drivers, and the way they perceive the work zone and related notifications, are therefore primary factors in reducing fatalities. A study of work zone crash data in five states showed that roughly half of crashes occur within or adjacent to work activity areas, endangering workers as well as drivers [2]. To reduce work zone injuries and fatalities, regulatory bodies (e.g., ANSI, OSHA) have introduced measures such as mandated Personal Protective Equipment (PPE), traffic control plans, advance warning signs, dissemination of traveler information, and signal timing adjustments. However, these measures mainly aim to change the behavior of drivers rather than workers. Although there is a large body of analysis and modeling literature on work zone accidents [3], the safety treatments applicable to real-world work zones remain limited, and there is still a need for proactive approaches, deployable at highway work zones, that can warn construction workers of approaching hazards in advance.

To improve work zone safety, in the previous two phases of this project we developed a virtual reality (VR)-based platform that integrates with SUMO and hardware-in-the-loop sensors to realistically simulate dangerous work zone situations (i.e., worker-initiated changes in the work zone are accounted for in SUMO, and the updated simulation is displayed in real time in VR). In this phase, we propose to add two main components to the existing VR work zone safety testing platform. The first component focuses on monitoring construction workers' attention.
To that end, we propose adding new functionality to the current VR platform that tracks each subject's head and eye movements to infer their gaze pattern. With this measure of the subject's attention, we plan to capture additional critical information about the decisions a worker makes.
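As an illustration of how gaze patterns can be inferred from eye-movement data (this sketch is not part of the existing platform), a common approach is a velocity-threshold fixation classifier (I-VT): successive gaze direction vectors are compared, and intervals with angular velocity below a threshold are labeled fixations, the rest saccades. The 100 deg/s threshold and 60 Hz sampling rate below are illustrative assumptions, not values from this project:

```python
import math

def angle_deg(v1, v2):
    """Angle in degrees between two unit 3D gaze direction vectors."""
    dot = max(-1.0, min(1.0, sum(a * b for a, b in zip(v1, v2))))
    return math.degrees(math.acos(dot))

def classify_ivt(gaze, dt, velocity_threshold=100.0):
    """I-VT classification: label each inter-sample interval as
    'fixation' or 'saccade' based on angular velocity (deg/s).

    gaze: list of unit 3D direction vectors sampled at a fixed
          interval of dt seconds (e.g., 1/60 for a 60 Hz tracker).
    """
    labels = []
    for prev, cur in zip(gaze, gaze[1:]):
        velocity = angle_deg(prev, cur) / dt  # angular speed in deg/s
        labels.append('fixation' if velocity < velocity_threshold else 'saccade')
    return labels

# Example: two identical samples (no movement), then a ~5.7 degree jump
# within one 60 Hz frame (~344 deg/s), which exceeds the threshold.
samples = [(0.0, 0.0, 1.0),
           (0.0, 0.0, 1.0),
           (0.0, math.sin(0.1), math.cos(0.1))]
print(classify_ivt(samples, dt=1 / 60))  # ['fixation', 'saccade']
```

Runs of consecutive fixation labels can then be aggregated into dwell times per work zone region of interest, which is one way attention measures of this kind are typically summarized.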


  • Zou, Z., Bernardes, S. D., Zuo, F., Ergan, S., and Ozbay, K. Developing an integrated platform to enable hardware-in-the-loop for synchronous VR, traffic simulation and sensor interactions. Advanced Engineering Informatics, 51, 101476, 2022.