Last fall, a routine family drive turned into a crash that is now reshaping how experts think about autonomous vehicle safety. Raffi Krikorian, a former self-driving executive, was behind the wheel of his Tesla Model X when the steering suddenly jerked. He reacted. He took control. Seconds later, the car hit a concrete wall.
No one was seriously hurt. But the outcome raised a deeper issue: the system worked well enough for years to train trust, until one moment when trust was not enough.
What Happened in the Tesla Crash?
Krikorian had used Full Self-Driving mode for three years without incident. On that day, the vehicle was operating under Level 2 automation, meaning the driver must stay alert at all times.
When the system behaved unexpectedly, he intervened. But the transition from automated to manual control created confusion. The car crashed. Records showed he acted. Yet responsibility fell on him, not the system.
The Science Behind Driver Inattention
Researchers point to a concept called vigilance decrement. It describes what happens when humans monitor systems that rarely fail.
The data is clear. After just one month of automation use, drivers are 6× more likely to check their phones. Over time, attention fades.
More critically, when control returns to a human, the brain needs 5–8 seconds to understand the situation and act. But real-world emergencies unfold in 6 seconds or less. That gap leaves little room for recovery.
In past incidents, drivers had seconds to respond but failed to act in time. The issue is not just reaction speed; it is delayed awareness.
Who Pays When Automation Fails?
Today, drivers carry most of the risk. Systems like Tesla’s remain classified under partial automation, meaning legal responsibility stays with the human driver.
This has real financial consequences. Claims often fall under car insurance or newer policies tied to autonomous vehicle insurance, where liability is still evolving.
In one case, a major verdict awarded damages to victims, marking a shift toward shared accountability. But globally, the pattern remains: drivers absorb most of the cost.
Some companies are testing alternatives. One automaker announced it would cover damages caused by certain automated features, showing that liability models can change.
The Business of Monitoring Attention
For fleet operators and insurers, the lesson is practical. Systems that reduce driver effort can also reduce driver awareness.
That creates demand for tools like in-cabin monitoring systems and vehicle telematics, which track attention and behaviour in real time. These tools are becoming central to modern fleet management systems. The goal is not to remove automation but to manage its side effects.
A System Design Problem
The core issue is not whether self-driving technology works. It is that it works too well until it doesn’t.
Drivers adapt to reliability. They relax, and when failure happens, the time needed to react does not match the time available.
Krikorian’s crash points to a simple truth: safety is not just about better machines or better drivers. It is about how both interact, and who takes responsibility when that interaction breaks down.