The Uber self-driving car crash that killed a pedestrian in March 2018 was the fault of the vehicle’s operator, who wasn’t paying attention at the time and was likely looking at her cell phone, the National Transportation Safety Board has determined. But the safety watchdog didn’t end the blame game there. At a board meeting in Washington, DC, Tuesday afternoon, it said that a slew of terrible decisions—by Uber, the state of Arizona, and the federal government—contributed to the death of Elaine Herzberg, the 49-year-old woman who was fatally struck.
The safety board, which has no regulatory power, also issued a series of recommendations its members believe will help avoid a repeat crash. The six prompts—to the National Highway Traffic Safety Administration, which is charged with overseeing vehicle safety in the US; to the state of Arizona, which has very few rules governing automated vehicle testing; to the organization that oversees local law enforcement and motor vehicle departments; and to Uber itself—show that the safety board is pushing for regulation of self-driving vehicles on public roads, including codifying today’s federal guidelines into proper, enforceable rules.
But not too much regulation, which self-driving vehicle developers say might stifle innovation and prevent the roll-out of what they call a life-saving technology. “We haven’t really put the meat cleaver to this and tried to stifle innovation,” NTSB Chair Robert Sumwalt told reporters after the board meeting Tuesday. “We’re just trying to put some bounds on the testing on the roadways.”
First, the safety panel thinks NHTSA should do more to gauge how self-driving developers are running their test operations on public roads. NHTSA’s guidelines for testing robocars are a set of principles rather than a blueprint for safety, and while the agency invites companies to submit self-assessment safety reports, it doesn’t evaluate those. As a result, the 16 voluntary safety assessment letters that AV companies have submitted “are kind of all over the place,” NTSB investigator Ensar Becic said at the meeting. (Sixty-two companies are registered to test their robots in California.) “Some have a good amount of detail, while others quite frankly read like marketing brochures.” Jennifer Homendy, one of three NTSB board members, called the setup “laughable.”
The board voted unanimously to recommend that NHTSA make those reports mandatory and create a process for actually assessing them. In a statement, NHTSA said it is still working on its own investigation into the crash, and that it will “carefully review” the NTSB’s report and recommendations.