Wednesday, May 16, 2018

NHTSA Opens Investigation Into A Tesla Crash in Utah That The Driver Says Was Caused By Autopilot

Just yesterday we wrote that it was starting to feel like the movie Groundhog Day for Tesla when it comes to the company's executive departures. Now there is again a distinct Groundhog Day feel to the Tesla story - but this time it relates to U.S. National Highway Traffic Safety Administration (NHTSA) investigations.
In what has been an unprecedented storm of negative headlines for Elon Musk's company over the past couple of months, it was announced today that the NHTSA will be investigating a recent Tesla crash in Utah. Reuters reported:
The U.S. National Highway Traffic Safety Administration (NHTSA) said on Wednesday that it was sending a team to investigate the crash of a Tesla Inc vehicle last week in Utah that the driver said occurred while the car was in autopilot mode. It is the third Tesla crash that may be linked to the semi-autonomous Autopilot system being investigated by the government agency since January.
“The agency has launched its special crash investigations team to gather information on the South Jordan, Utah, crash,” the agency said on Wednesday. “NHTSA will take appropriate action based on its review.”
That crash, as described previously, involved a Tesla Model S sedan that smashed into a Salt Lake City fire truck while traveling at 60 miles per hour last Friday night. The driver, who suffered a broken ankle, said she was using Autopilot at the time of the crash.
The big surprise, as the article points out, is that this is the third investigation the NHTSA has launched into Tesla since January. That should concern the company: the NHTSA has the authority to force recalls which, if costly enough, could be a financial hurdle that Tesla, in its current financial state, might have trouble surviving.
NHTSA is also investigating a fatal crash in March that involved a Tesla Model X using Autopilot. It is also probing the January crash of a Tesla vehicle apparently traveling in Autopilot that struck a fire truck. Both incidents were in California.
Last week, NHTSA also said it would probe a May 8 Tesla accident in Florida that killed two teenagers and injured another. Autopilot was not thought to play a part.
NHTSA can order a recall if it finds a defect poses an unreasonable risk to safety.
Certainly the timing of this NHTSA regulatory scrutiny couldn't be worse for the company, because in addition to its precarious financials, the company's main liaison to regulators such as the NTSB and NHTSA departed just days ago.
Last Saturday, Tesla competitor Waymo announced that Tesla's Matt Schwall had joined the Alphabet self-driving car unit. According to Schwall's LinkedIn bio, he had been Tesla's "primary technical contact" with both the NTSB and NHTSA, suggesting the company's troubles with government regulators may be set to escalate.
Regulators aside, analysts continue to question why the executive revolving door keeps spinning and what complications could result. Needless to say, it isn't good news for Tesla.
Also, as a reminder, on Sunday morning the WSJ published a scathing critique of Tesla's Autopilot, "In Self-Driving Car Road Test, We Are the Guinea Pigs," in which it questioned the validity of Elon Musk's claims about Tesla's safety record:
Tesla says that its cars with autonomous driving technology are 3.7 times safer than the average American vehicle. It’s true that Teslas are among the safest cars on the road, but it isn’t clear how much of this safety is due to the driving habits of its enthusiast owners (for now, those who can afford Teslas) or other factors, such as build quality or the cars’ crash avoidance technology, rather than Autopilot.
In the wake of a fatal 2016 crash, which happened when Autopilot was engaged, Tesla cited a report by the National Highway Traffic Safety Administration as evidence that Autopilot mode makes Teslas 40% safer. NHTSA recently clarified the report was based on Tesla’s own unaudited data, and NHTSA didn’t take into account whether Autopilot was engaged. Complicating things further, Tesla rolled out an auto-braking safety feature—which almost certainly reduced crashes—shortly before it launched Autopilot.
As the WSJ also notes, "there isn’t enough data to verify that self-driving vehicles cause fewer accidents than human-driven ones." A Rand Corp. study concluded that traffic fatalities already occur at such relatively low rates—on the order of 1 per 100 million miles traveled—that determining whether self-driving cars are safer than humans could take decades.
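The Rand conclusion is easier to appreciate with a quick back-of-envelope calculation. The sketch below is purely illustrative and is not taken from the Rand study itself: it assumes a baseline of roughly 1.1 fatalities per 100 million miles, a hypothetical self-driving fleet that is 20% safer, and a standard normal approximation for comparing Poisson rates.

```python
import math

# Illustrative assumptions (not figures from the Rand study itself):
# a human-driven baseline of ~1.1 fatalities per 100 million miles,
# a hypothetical self-driving fleet that is 20% safer, and a one-sided
# test at 95% confidence with 80% power, using a normal approximation
# to the Poisson fatality counts.
baseline_rate = 1.1e-8            # fatalities per mile (~1.1 per 100M miles)
improved_rate = 0.8 * baseline_rate
z_alpha, z_beta = 1.645, 0.84     # 95% confidence (one-sided), 80% power

# Miles M needed so that sqrt(M)*(r0 - r1) >= z_a*sqrt(r0) + z_b*sqrt(r1)
numerator = z_alpha * math.sqrt(baseline_rate) + z_beta * math.sqrt(improved_rate)
miles_needed = (numerator / (baseline_rate - improved_rate)) ** 2

# How long a hypothetical 100-car test fleet averaging 25 mph around the
# clock would take to accumulate that mileage.
fleet_miles_per_year = 100 * 25 * 24 * 365
years_needed = miles_needed / fleet_miles_per_year

print(f"Miles needed: {miles_needed:.2e}")          # on the order of 10 billion
print(f"Years for a 100-car fleet: {years_needed:,.0f}")
```

Under those assumptions the answer comes out on the order of ten billion miles - several centuries of driving for a 100-car test fleet - which is the scale of the problem the WSJ and the Rand researchers are pointing to.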
For now, we'll wait for the NHTSA's verdict.