A team of researchers from UCL has recently developed an innovative method for determining how attentive drivers are while a car’s auto-pilot mode is engaged, and how ready they are to respond to warning signals. The study, published in Cognitive Research: Principles and Implications, suggests that eye movements can be used to detect how focused drivers are on their on-screen activities. This offers a promising way to assess drivers’ readiness to react to real-world signals, such as takeover requests from the car.

While fully autonomous driverless cars are not yet available for personal use, cars equipped with an auto-pilot mode are already in use in certain locations, such as Germany and select US states. Auto-pilot mode allows drivers to take their hands off the wheel and engage in other activities, like playing games on the integrated central screen. However, existing systems still require drivers to take back control of the vehicle at certain points, such as when traffic conditions change. For example, a driver can use auto-pilot mode during a traffic jam, but once the congestion clears, the car’s AI issues a “takeover” signal indicating that the driver must resume full control of the vehicle.

To determine whether a person is too engrossed in another task to respond promptly to a “takeover” signal, the researchers conducted two experiments with 42 participants, simulating the kind of takeover situation that advanced auto-pilot models present. Participants searched for target items among colored shapes displayed on a computer screen and had to hold their gaze on a target to show that they had found it. The search tasks were designed to be either easy or demanding, depending on the complexity of the target items and their arrangement.

During the search tasks, a tone sounded at certain intervals, and participants had to immediately shift their attention away from the screen and press a button in response. The researchers measured how long it took participants to disengage from the screen and respond to the tone, while also analyzing their eye movements across the screen. They found that participants took longer to respond to the tone when the task demanded more attention. The eye movement analysis further showed that attention levels could be read from changes in participants’ gaze patterns: when a task required more attention, participants tended to fixate longer on individual items and to travel shorter distances between them.
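To make those gaze measures concrete, the sketch below (written in Python; it is not code from the study) shows one way such features could be computed from fixation data. The input format and feature names are assumptions made for illustration.

```python
import numpy as np

def gaze_features(fixations):
    """Summarize a trial's gaze behaviour from a list of fixations.

    `fixations` is assumed to be a list of (x, y, duration_ms) tuples,
    e.g. the output of an eye tracker's fixation-detection step.
    Returns the two features discussed above: mean fixation duration
    and mean distance travelled between consecutive fixations.
    """
    durations = [d for _, _, d in fixations]
    mean_fixation_ms = float(np.mean(durations))

    # Distance between consecutive fixation centres, in screen pixels.
    points = np.array([(x, y) for x, y, _ in fixations], dtype=float)
    steps = np.linalg.norm(np.diff(points, axis=0), axis=1)
    mean_step_px = float(np.mean(steps)) if len(steps) else 0.0

    return {"mean_fixation_ms": mean_fixation_ms,
            "mean_step_px": mean_step_px}

# Under higher attentional load we would expect longer fixations and
# shorter steps between them, e.g. (illustrative values only):
demanding = gaze_features([(100, 120, 410), (110, 130, 390), (115, 135, 430)])
easy = gaze_features([(100, 120, 210), (400, 300, 190), (650, 90, 220)])
print(demanding, easy)
```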

To further validate their findings, the researchers trained a machine learning model on this eye movement data. Based solely on participants’ eye movement patterns, the model could predict whether they were engaged in the easy or the demanding task, suggesting that eye movements provide valuable insight into a person’s level of engagement and attention.
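The article does not specify which model the researchers used, so the following is a minimal, hypothetical sketch of how such a classifier could be trained on per-trial gaze features using scikit-learn. The feature values here are synthetic and serve only to illustrate the approach.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Hypothetical per-trial features: [mean fixation duration (ms),
# mean distance between fixations (px)], one row per trial, with a
# label of 0 = easy task, 1 = demanding task. Real data would come
# from the eye tracker and the experiment log.
rng = np.random.default_rng(0)
easy = np.column_stack([rng.normal(220, 30, 50), rng.normal(260, 40, 50)])
hard = np.column_stack([rng.normal(400, 40, 50), rng.normal(150, 30, 50)])
X = np.vstack([easy, hard])
y = np.array([0] * 50 + [1] * 50)

# A simple linear classifier is enough when the two load conditions
# differ this systematically in the gaze features.
clf = make_pipeline(StandardScaler(), LogisticRegression())
scores = cross_val_score(clf, X, y, cv=5)
print(f"cross-validated accuracy: {scores.mean():.2f}")
```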

The lead author of the study, Professor Nilli Lavie of the UCL Institute of Cognitive Neuroscience, notes that driverless car technology is advancing rapidly, promising a more enjoyable and productive driving experience. The crucial question, however, is whether drivers can swiftly transition back to driving when they are fully engaged in another task. Lavie emphasizes that the findings show it is possible to detect drivers’ attention levels and their readiness to respond to warning signals by monitoring their gaze patterns. The study reveals that people can become so absorbed in on-screen activities that they tune out the world around them, resulting in delayed responses to warning signals.

The UCL-led research team’s method for determining driver attention levels from eye movements is an exciting prospect for the future of driverless car technology. This approach may make it possible to better evaluate how prepared drivers are to respond to takeover signals while engaged in other tasks. However, further work is needed to collect larger datasets and refine the machine learning models to improve accuracy. With continued advances in this field, driver attention detection systems could substantially improve the safety and effectiveness of auto-pilot modes in vehicles.
