NHTSA Is Looking Into Fatal Crash Of Tesla Model S In Autopilot Mode

Last fall, Tesla released a beta version of Autopilot, a software upgrade that lets the car take over some driving functions, including steering, cruise control, and lane changes. Today, the company announced some sad news: the first fatal crash of one of its vehicles while in Autopilot mode happened in northern Florida in May.

The company shared the news in a blog post, and this summary comes from a combination of that account and the police blotter from a local newspaper. The crash occurred on a divided highway, where the 2015 Model S collided with a tractor-trailer that was making a left turn across the Tesla driver’s lane.

Neither the driver nor the Autopilot system saw the tractor-trailer, a problem that Tesla blames on the combination of a “brightly lit sky” and the reflective surface of the trailer. The car drove under the trailer, with the bottom of the trailer hitting the windshield and shearing off the car’s roof. The Tesla kept going past the tractor-trailer, coming to a stop about 100 feet from the road.

“The customer who died in this crash had a loving family and we are beyond saddened by their loss,” the company said in the post, noting that he was known to the company and to other electric vehicle drivers as a great supporter of technology and progress. “We would like to extend our deepest sympathies to his family and friends,” the post concluded.

The Verge pointed out that the driver in this crash had posted a modestly viral dashcam video, taken in early April of this year, of his car swerving to avoid a truck that drifted into his lane. He also posted a collection of videos of his car’s Autopilot in action to YouTube.

The National Highway Traffic Safety Administration is investigating the crash, which Tesla reported because the car was in partially autonomous mode at the time. The crash “calls for an examination of the design and performance of any driving aids in use at the time of the crash,” the agency notes in the paperwork that opened the investigation. Depending on the results, the car could be subject to a recall and a software update if the autonomous driving mode is found to have contributed to the crash.

“The NHTSA’s Office of Defects Investigation will examine the design and performance of the automated driving systems in use at the time of the crash,” the agency said in a statement. “During the Preliminary Evaluation, NHTSA will gather additional data regarding this incident and other information regarding the automated driving systems.”

In its statement, Tesla notes that in the United States there’s an average of one traffic death for every 94 million miles driven, while drivers taking part in the Autopilot beta had racked up 130 million miles before the first driver or passenger died.

Here’s the problem with a semi-autonomous vehicle, though: the car and the driver were sharing control, yet neither of them saw the truck. Autopilot mode is not designed for napping or watching cat videos while you’re on the highway: the driver is supposed to pay attention to the road and keep their hands on the wheel.

Yes, the software is in beta and turned off by default, and the company points out that users must acknowledge its beta status when they activate it. Patrick Lin, director of the Ethics and Emerging Sciences Group at California Polytechnic State University and a philosopher who studies autonomous vehicles, explained to Consumerist that there’s a problem: drivers may not fully understand what they’re agreeing to when they activate Autopilot.

“Even if the user has consented to this testing, other drivers and pedestrians around the robot car haven’t,” Lin explains. In the Florida crash, the driver of the truck was unharmed, but what about the next crash? What else is the Autopilot system unable to see?

Humans are simply not good at sitting around, watching a machine, and occasionally pressing a button. Passenger jet pilots’ jobs are mostly automated, but they have extensive training, as well as rules governing drug use and even how much rest they must get between shifts. To own a semi-autonomous car, you need a bunch of money and a card that says you passed a brief, controlled driving test years or even decades ago. Perhaps drivers of semi-autonomous cars need extra training, too.

Our colleagues at Consumers Union, the policy and mobilization arm of Consumer Reports, weighed in on the incident today, urging NHTSA to “thoroughly investigate this fatal incident, and find out exactly what went wrong.”

“Self-driving cars must be rigorously tested — under a variety of environmental conditions — to prevent tragedies like this in the future,” said William Wallace, policy analyst for Consumers Union.

FURTHER READING/VIEWING:

Talking Cars on the Pros and Cons of Tesla Autopilot [Consumer Reports]