Tesla sued over Tokyo biker's death in 'dozing driver' Autopilot crash

Motorcyclist had stopped to help with a separate traffic accident, say court docs


Tesla is being sued by the widow and daughter of a man killed when an allegedly dozing driver let his Model X’s Autopilot feature steer it into a group of people.

The complaint was filed on Tuesday in a court in the Northern District of California and assigned to Magistrate Judge Susan van Keulen yesterday.

Yoshihiro Umeda, a 44-year-old husband and father, died in April 2018 when a Tesla allegedly being operated in self-driving mode crashed into a group gathered at the site of an earlier traffic accident.

A number of motorcyclists had stopped in a motorway's inside lane after one of them was involved in an unrelated collision with a van. According to the filing, some way behind this accident the Tesla, whose driver allegedly "began to feel drowsy" before the incident, was set to Autopilot, with onboard computers tracking a car in front of it.

That car changed lanes to avoid the stationary group of bikers. Rather than slowing down and changing lanes too, the Tesla accelerated until it hit the group, the complaint alleged.

"When the vehicle in front of the Tesla Model X switched lanes the Tesla Autopilot suite of technologies failed to recognize the stationary van, motorcycles, and pedestrians ahead," said the lawsuit. "Instead, the Tesla Model X automatically began to accelerate to the preset cruising speed before crashing into these objects and people and killing Mr. Umeda."

Umeda's relatives filed the lawsuit in California's federal Northern District Court, the local court for Tesla's Palo Alto HQ. They claim that Tesla's Autopilot system is "defective and incapable of handling common driving scenarios" and that its system for detecting drivers who aren't paying attention is "fatally defective".

Key to their claim is an allegation that despite the unnamed driver's hands being on the Tesla's wheel up to the moment of impact, he was able to "doze off" while not triggering any of the car's alarm features. These chimes and visual warnings are supposed to ensure drivers with the enhanced lane-keeping and cruise control features engaged pay attention to the road.

The relatives also allege that Tesla "knew of such defects" in Autopilot and had been warned about them before the fatal crash.

Tesla has yet to file its response to the lawsuit, for which it received a summons just yesterday. The complaint can be read in full as a 45-page PDF.

In 2018 a similar incident happened when a Tesla crashed into a stationary fire engine in America. Robin Geoulla, the driver, told investigators he was drinking coffee and eating a bagel at the time of the crash. He later described the Autopilot self-driving feature as having been "named wrong".

Tragically, an Apple engineer was killed in the same year when his Tesla Model X accelerated into a roadside crash barrier. American regulators at the time claimed Tesla did not co-operate fully with the post-crash investigation. ®


