Tesla Autopilot system contributed to 2016 fatal crash

Tesla Inc.’s design of its Autopilot system contributed to a 2016 fatal crash in Florida, U.S. accident investigators concluded as they recommended all automakers prevent autonomous driving systems from being used on roads for which they are not designed.

The National Transportation Safety Board, in its first probe of autonomous driving technologies, recommended Tuesday that systems such as Tesla’s Autopilot be unavailable when the vehicle is traveling on a road where its use is inappropriate.

The accident occurred on a divided road with occasional intersections, conditions in which Tesla had warned owners not to use Autopilot. Despite those warnings, the car’s software allowed drivers to travel as fast as 90 miles an hour under automated steering, the NTSB found.

“In this crash, Tesla’s system worked as designed,” NTSB Chairman Robert Sumwalt said. “But it was designed to perform limited tasks in a limited range of environments. The system gave far too much leeway to the driver to divert his attention to something other than driving.”

The accident also highlighted how the latest in vehicle sensors and auto-braking technology still can’t reliably detect crossing traffic ahead. “System safeguards were lacking,” Sumwalt said.

Joshua Brown, a former Navy SEAL, died May 7, 2016, when his Model S struck a truck crossing the road in front of him on a Florida highway. His car showed no signs that he tried to brake or evade the truck, which was making a left turn, as he drove at 74 miles an hour. Investigators concluded the car was driving itself.

The truck driver’s failure to yield as he made the turn and Brown’s over-reliance on Tesla’s automation were the primary causes of the accident, the NTSB found. It also concluded that the automation contributed because it permitted Brown’s “prolonged disengagement from the driving task.”

The safety board’s findings and recommendations could have broad implications for how self-driving technology is phased in on cars and trucks, and they come as Congress debates legislation to spur autonomous vehicle systems. Technology and auto companies are pouring billions of dollars into a race to develop self-driving vehicles, which carmakers from Tesla to Volvo AB say could be deployed in less than 10 years.

The NTSB’s cautionary tone on the emergence of self-driving technology contrasted with the Department of Transportation, which revised its policy on self-driving vehicles Tuesday in an attempt to remove obstacles to the testing of such vehicles.

The safety board directed recommendations to the industry and to U.S. regulators to restrict how automation is used and to collect better safety data on this new class of vehicle. The NTSB, which has no power to regulate, issues recommendations to agencies and industry to improve safety. While its recommendations carry no legal force, the majority are adopted.

Tesla defended its Autopilot system in an emailed statement, saying it has helped reduce crashes.

“We appreciate the NTSB’s analysis of last year’s tragic accident and we will evaluate their recommendations as we continue to evolve our technology,” the company said. “We will also continue to be extremely clear with current and potential customers that Autopilot is not a fully self-driving technology and drivers need to remain attentive at all times.”

The NTSB ruled out mechanical deficiencies, distraction or driver impairment in the accident. Both drivers had roughly 10 seconds to see the other vehicle before the impact, its crash reconstruction concluded.

While the NTSB praised Tesla for making improvements in its technology since the crash, it said that the system still gave drivers too much leeway to activate the automation in conditions where it might be unsafe. Sumwalt read from portions of the company’s manual that offered contradictory instructions on the use of automation.

Self-Driving Misconceptions

“It sounds to me like Tesla is sort of speaking out of both sides of their mouth in this respect,” he said.

Sumwalt also sought to clear up what he called misconceptions about the growing use of automation in cars. While technology that alerts drivers when they leave their lane or that automatically applies the brakes can improve safety, such systems still require a driver to pay attention and remain in control.

“It’s true that some companies are experimenting with and testing self-driving cars, but there are no self-driving cars on the market today,” he said.

NTSB staff said that the method Tesla and other carmakers use to gauge whether a driver is paying attention, monitoring steering-wheel movement, doesn’t accurately reflect whether people are even looking at the road.

Even though Brown’s Model S warned him seven times during the 37 minutes before the crash that his hands weren’t on the steering wheel, he was able to touch the wheel momentarily and the system continued driving itself, according to the NTSB. Newer versions of Tesla’s Autopilot stop the car after the third such infraction, but drivers can still go for minutes at a time without steering or can quickly dismiss the warning, the NTSB said.

Better Data Collection

The board recommended that regulators find better ways to measure driver attentiveness, such as using scanners that monitor where a person’s eyes are looking. And it wants manufacturers to use global-positioning technology to identify a car’s location and prohibit activation of automatic steering in places where it isn’t safe.

The NTSB is also seeking better collection of data from automated vehicles, which are highly computerized and could offer a wealth of information about driving patterns and how the systems function. Such data collection has been done for decades in the aviation world, and auto regulators have been in touch with the Federal Aviation Administration to see how they could adopt such measures, said Robert Molloy, director of NTSB’s Office of Highway Safety.

While Tesla’s cars and a growing number of other vehicles are equipped with sensors designed to prevent rear-end collisions, that technology isn’t adept at detecting a crossing vehicle. The NTSB reiterated earlier recommendations it has made to spur the development of vehicle-to-vehicle radio transmissions to prevent such accidents.

“We would love to move from crash mitigation to crash prevention,” said Deb Bruce, the NTSB’s chief investigator on the case.

Brown, who “loved technology,” believed the Tesla automation had saved lives, according to a statement released by his family on Monday through their attorneys. “We heard numerous times that the car killed our son,” said the statement issued by the law firm Landskroner Grieco Merriman LLC. “That is simply not the case.”

The statement also praised Tesla for improving its Autopilot software after the accident, changes it said were a direct result of the crash.
