04 Apr 2018

Tesla’s Autopilot System Involved in Fatal Crash, Concerns Over Lack of Data

It has been a difficult time for autonomous vehicles, and this recent incident involving a driver using a Tesla Model X with the Autopilot feature has not helped. The driver later died in hospital, and the US National Transportation Safety Board (NTSB) has been critical of the lack of information released by Tesla.

Image: Aleksandra Suzi / Shutterstock.com

The incident occurred at 9.27am on Friday 23 March on Highway 101, near Mountain View, California. Walter Huang was driving a Tesla Model X with Autopilot engaged when the car struck a concrete highway divider at full force in a fatal crash. Details about the crash were initially hard to come by, even for Tesla, because the extensive damage caused by the collision made the vehicle logs difficult to retrieve. The NTSB was unhappy with Tesla's lack of cooperation over the matter, though Elon Musk pointed out that it is the NHTSA (National Highway Traffic Safety Administration) that regulates cars, not the NTSB, which is an advisory body.

Since retrieving the vehicle logs, the company has released an updated incident report, bringing more details to light. The driver had received several visual and audible hands-on warnings earlier in the drive, and in the six seconds before the crash his hands were not detected on the wheel at all. Before hitting the divider, he apparently had five seconds and 150 metres of unobstructed view in which to act, but no action was taken. The crush attenuator – the highway safety barrier designed to absorb impact ahead of the concrete divider – had been damaged in a previous incident and not properly replaced, making the damage in this crash far more severe.

Tesla has been quick to cite the safety record of the system, noting that drivers have engaged Autopilot on that same stretch of highway at least 85,000 times without any accidents. It may seem insensitive for the company to cite statistics in the face of tragedy, but it is important to remember the safety benefits of autonomous vehicles. Driverless cars – and indeed any emerging technology that could pose a risk to human lives – face an uphill struggle to win over public sentiment. Tesla's Autopilot was involved in a fatal crash in 2016, but the company has been relatively untroubled by incidents since and has worked to improve the feature. For driverless cars in general the timing is bad, with Uber forced to suspend its tests in Arizona after a pedestrian's death earlier in March, though it must be stated that the technology involved in the two incidents is very different: Tesla's Autopilot operates at Level 2 autonomy, while Uber's cars are at Level 4. Still, the emerging autonomous vehicle industry can ill afford many more of these PR backlashes while it is still in such a formative state.


 


