The design of Tesla’s Autopilot feature contributed to a January 2018 accident in which a Model S sedan smashed into the back of a fire truck in Southern California, according to federal safety investigators. It is the second time the National Transportation Safety Board has found Tesla partially responsible for a crash involving the semiautomated feature. The federal board says it’s also investigating two other Autopilot-involved crashes.
No one was hurt in the 2018 crash, but investigators found that the driver had flipped on Autopilot about 14 minutes before the crash, and that he had not actively steered for the final 13 minutes. Investigators said the driver’s inattention and overreliance on Autopilot were probable causes of the crash. During that stretch, the car warned the driver to apply pressure to the steering wheel four times, but he did not do so in the roughly four minutes before the crash, investigators found.
Investigators said the driver’s use of Autopilot was “in ways inconsistent” with Tesla’s guidance. The driver said he learned how to use Autopilot from a Tesla salesperson, but did not read the owner’s manual, which tells drivers exactly when and where they should use Autopilot.
The incident underscores what industry watchdogs and even Tesla itself have said before: Autopilot isn’t a self-driving technology. It requires drivers’ attention, even when the road ahead looks like smooth sailing.
But investigators also seem to believe that Tesla isn’t doing enough to make Autopilot safe. In its report, the NTSB highlighted a recommendation following another Autopilot-involved crash, which killed a Florida driver in 2016. The panel asked automakers to “develop applications to more effectively sense the driver’s level of engagement and alert the driver when engagement is lacking” when using “automated vehicle control systems.” Tesla has changed how Autopilot works, requiring drivers to put pressure on the wheel more frequently while the feature is engaged. But the NTSB seems to believe it’s not enough.
“Fool me once, shame on you, fool me twice, shame on me, fool me four, five, or six times now—that’s too much,” says David Friedman, former acting head of the National Highway Traffic Safety Administration and now the director of advocacy at Consumer Reports. “If Tesla doesn’t fix Autopilot, then [the federal government] should do it for them.” (The NTSB can only recommend safety improvements; NHTSA can enact regulations.)
In a statement, Tesla said, “Tesla drivers have driven billions of miles with Autopilot engaged, and data from our quarterly Vehicle Safety Report indicates that drivers using Autopilot remain safer than those operating without assistance. While our driver-monitoring system for Autopilot repeatedly reminds drivers of their responsibility to remain attentive and prohibits the use of Autopilot when warnings are ignored, we’ve also introduced numerous updates to make our safeguards smarter, safer and more effective across every hardware platform we’ve deployed. Since this incident occurred, we have made updates to our system including adjusting the time intervals between hands-on warnings and the conditions under which they’re activated.”
The vehicle in the 2018 crash, in Culver City, California, was a 2014 model. Tesla has since revamped the hardware—the front-facing cameras and radar, the ultrasonic sensors—in its vehicles. (CEO Elon Musk has famously said that today’s Teslas have all the hardware they need to drive themselves. The electric automaker is still working on the software part.)