LOS ANGELES – Federal safety regulators are sending a team to California to investigate a fatal highway crash involving a Tesla, shortly after authorities near Oakland arrested a man who was riding in the back seat of another Tesla that was traveling on a highway with no one behind the wheel.
Experts say both cases are putting pressure on the National Highway Traffic Safety Administration to take action on Tesla's (TSLA) partially automated driving system, called Autopilot, which has been involved in a number of crashes that have resulted in at least three deaths in the United States.
The investigation of the May 5 crash in Fontana, California, east of Los Angeles, is the 29th crash involving a Tesla to which the agency has responded.
Whether the Tesla was operating on Autopilot at the time is still under investigation.
The driver of the Tesla, a 35-year-old man whose name was not released, was killed when the electric car struck an overturned semi. Another man, a 30-year-old passing motorist, was seriously injured when the Tesla hit him as he was helping the semi's driver out of the wreckage.
“We have launched a special investigation into the crash. NHTSA continues to monitor the safety of all motor vehicles and equipment, including automated technologies,” the agency said in a statement on Wednesday.
The investigation comes shortly after the California Highway Patrol arrested another man who, authorities say, was in the back seat of a Tesla that was traveling on Interstate 80 with no one behind the wheel.
Param Sharma, 25, was charged with reckless driving and disobeying a peace officer, the CHP said in a statement on Tuesday.
The statement did not say whether officials had determined if the Tesla was operating on Autopilot, which can keep a car centered in its lane and a safe distance behind vehicles in front of it.
But it is likely that either Autopilot or “Full Self-Driving” was in operation for the driver to be in the back seat. Tesla allows a limited number of owners to test its “Full Self-Driving” software.
Tesla, which disbanded its public relations department, did not respond to messages seeking comment on Wednesday.
The Fontana investigation, in addition to probes of two crashes in Michigan earlier this year, shows that NHTSA is taking a closer look at Tesla's systems. Experts say the agency needs to rein in such systems because drivers tend to put too much trust in them even though the systems cannot drive the cars on their own.
“I think they're probably getting serious about this, and we may actually start to see some action in the not-too-distant future,” said Sam Abuelsamid, principal mobility analyst at Guidehouse Insights, who follows automated systems.
“I definitely think the growing number of crashes adds fuel to the fire for NHTSA to do more,” said Missy Cummings, a professor of electrical and computer engineering at Duke University who studies automated vehicles. “I think the agency is going to take a stronger stance.”
On its website and in owner's manuals, Tesla says that for both driver-assist systems, drivers must be prepared to intervene at all times. But drivers using Autopilot repeatedly zone out, leading to crashes in which neither the system nor the driver stops for obstacles in the road.
The federal agency could declare Autopilot defective and seek a recall, or force Tesla to restrict the areas where Autopilot can be used to limited-access highways. It could also require the company to install a more robust system to ensure that drivers pay attention.
The auto industry, with the exception of Tesla, is already doing a good job of limiting where such systems can operate and of monitoring drivers, Cummings said. Tesla appears to be moving in that direction by installing driver-facing cameras in its newest models, she said.
Tesla monitors drivers to make sure they are paying attention by detecting torque from their hands on the steering wheel. The system issues warnings and eventually shuts the car down if it does not detect hands on the wheel. But critics say Tesla's system is easy to fool and can take as long as a minute to shut the car down. Consumer Reports said in April that it had tricked a Tesla into driving in Autopilot mode with no one behind the wheel.
In March, a Tesla official also told regulators in California that “Full Self-Driving” is a driver-assist system that requires human supervision. In notes released by the state Department of Motor Vehicles, the company said it could not state whether the technology would improve to fully self-driving capability by the end of the year, contrary to statements by the company's CEO, Elon Musk.
In the back-seat case, authorities received multiple 911 calls Monday night reporting that a man was in the back of a Tesla Model 3 while the vehicle was traveling on Interstate 80 across the San Francisco-Oakland Bay Bridge.
A motorcycle officer spotted the Tesla, confirmed that the sole occupant was in the back seat, moved to stop the car and saw the occupant climb into the driver's seat before the car stopped, the California Highway Patrol said in a statement.
Authorities said they cited Sharma on April 27 for similar behavior.
In an interview with The Associated Press on Wednesday, Sharma said he had done nothing wrong and would continue to ride in the back seat with no one behind the wheel.
Musk, he said, would want him to keep doing it. “It was actually designed to be driven from the back seat,” Sharma said. “I feel safer in the back seat than in the driver's seat, and I feel safer with my car on Autopilot. I trust my car's Autopilot more than anyone on the road.”
He believes his Model 3 can drive itself and said he does not understand why he had to spend a night in jail.
“The way things stand right now, I can run a self-driving Tesla from Emeryville all the way to downtown San Francisco from the back seat,” he said, adding that he has ridden about 40,000 miles in Teslas without anyone in the driver's seat.
Sharma's comments suggest he is among a number of Tesla drivers who rely too heavily on the company's driver-assist systems, said Cummings, the Duke professor.
“It shows you the thought process of people who have way too much confidence in very unproven technology,” she said.