The accident renews questions about Autopilot, a signature feature of Tesla vehicles, and whether the company has gone far enough to ensure that it keeps drivers and passengers safe.
“At the very least, I think there will have to be fundamental changes to Autopilot,” said Mike Ramsey, a Gartner analyst who focuses on self-driving technology. “The system as it is now tricks you into thinking it has more capability than it does. It’s not an autonomous system. It’s not a hands-free system. But that’s how people are using it, and it works fine, until it suddenly doesn’t.”
On Saturday, Tesla declined to comment on the California crash or to make Mr. Musk or another executive available for an interview. In its blog post on Friday about the crash, the company acknowledged that Autopilot “does not prevent all accidents,” but said the system “makes them much less likely to occur” and “unequivocally makes the world safer.”
For the company, the significance of the crash goes beyond Autopilot. Tesla is already reeling from a barrage of negative news. The value of its stock and bonds has plunged amid increasing concerns about how much cash it is using up and the repeated delays in the production of the Model 3, a battery-powered compact car that Mr. Musk is counting on to generate much-needed revenue.
It is also facing an investor lawsuit related to its acquisition of SolarCity, a solar-panel maker where Mr. Musk served as chairman. Meanwhile, competition is mounting from other luxury carmakers that have developed their own electric cars, while Waymo, the Google spinoff, General Motors and others seem to have passed Tesla in self-driving technology.
“There’s a lot going on that undermines Elon’s credibility right now,” said Karl Brauer, a senior analyst at Kelley Blue Book.
Autopilot uses radar and cameras to detect lane markings, other vehicles and objects in the road. It can steer, brake and accelerate automatically with little input from the driver. Tesla readily points out that Autopilot — despite the implications in its name — is only a driver-assistance system and is not intended to pilot cars on its own.
Drivers are given warnings on the dashboard and in the owner’s manual to remain engaged and alert while using it. Tesla originally described it as a “beta” version, a term that usually refers to software still in the developmental stage.
At the time of the fatal Florida crash in May 2016, it was possible to engage Autopilot and cruise on highways for several minutes without the driver holding the steering wheel. In that crash, Autopilot's camera, then the system's primary sensor, failed to recognize a white truck as it was crossing a rural highway. Tesla said the camera was confused because the truck appeared against a bright sky.
Software modifications introduced that fall included more frequent warnings to drivers to keep their hands on the steering wheel. After three ignored warnings, the new software prevents Autopilot from operating until the driver stops the car, turns it off and restarts it.
The new version also made radar the primary sensor, and Mr. Musk said the new radar would have been able to see the truck in the Florida crash despite the bright sky.
Autopilot does not use lidar, a laser-based sensing technology that Waymo and others maintain is crucial for fully autonomous vehicles. Mr. Musk has said he believes lidar is not necessary for Autopilot to be safe.
At least three people have now died while driving with Autopilot engaged. In January 2017, a Chinese owner was at the wheel of a Model S when the car crashed into a road sweeper on a highway.
The National Transportation Safety Board is now investigating the March 23 crash that killed Mr. Huang. Its investigation of the 2016 Florida accident concluded that Autopilot "played a major role" and that the system lacked safeguards to prevent misuse by drivers.
An earlier investigation, by the National Highway Traffic Safety Administration, concluded that the company's Autopilot-enabled vehicles did not need to be recalled. That inquiry, however, focused only on whether flaws in the system had led to the crash; it found no such flaws.