
Tesla didn't fix an Autopilot problem for three years, and now another person is dead



On May 7, 2016, a 40-year-old man named Joshua Brown was killed when his Tesla Model S sedan collided with a tractor-trailer that was crossing his path on Highway 27A near Williston, Florida. Nearly three years later, another Tesla owner, 50-year-old Jeremy Beren Banner, was also killed on a Florida highway under eerily similar circumstances: his Model 3 collided with a semi-trailer crossing his path, shearing off the car's roof in the process.

There was another grim similarity: investigators found that both drivers had Tesla's advanced driver assistance system, Autopilot, engaged at the time of their respective crashes. Autopilot is a Level 2 semi-autonomous system, as defined by the Society of Automotive Engineers, that combines adaptive cruise control, lane keeping, self-parking, and, most recently, the ability to automatically change lanes. Tesla bills it as one of the safest systems on the road today, but the deaths of Brown and Banner cast doubt on those claims and suggest that Tesla has neglected to fix a major weakness in its flagship technology.

There are big differences between the two crashes. For one, Brown's and Banner's cars were running very different driver assistance technologies, even though both are called Autopilot. The Autopilot in Brown's Model S was based on technology supplied by Mobileye, an Israeli startup later acquired by Intel. Brown's death was partly responsible for the two companies parting ways in 2016. Banner's Model 3 was equipped with a second-generation version of Autopilot that Tesla developed in-house. That suggests Tesla had a chance to address this so-called "edge case," or unusual circumstance, when it redesigned Autopilot, but has so far failed to do so.

After Brown's death, Tesla said the camera had failed to recognize the white truck against a bright sky. The US National Highway Traffic Safety Administration (NHTSA) found essentially that Brown had not been paying attention to the road, and it exonerated Tesla. It determined that he had set his car's cruise control at 74 mph about two minutes before the crash, and that he should have had at least seven seconds to notice the truck before striking it.

Federal investigators have yet to make a determination in Banner's death. In a preliminary report released on May 15, the National Transportation Safety Board (NTSB) said Banner engaged Autopilot about 10 seconds before the collision. "From less than 8 seconds before the crash to the time of impact, the vehicle did not detect the driver's hands on the steering wheel," the NTSB said. The car was traveling at 68 mph when it crashed.

In a statement, a Tesla spokesperson framed it differently, converting the passive "the vehicle did not detect the driver's hands" into the more active "the driver immediately removed his hands from the wheel."

In the past, Tesla CEO Elon Musk has blamed Autopilot crashes on driver overconfidence. "When there is a serious accident, it is almost always, in fact maybe always, the case that it is an experienced user, and the issue is more one of complacency," Musk said last year.

The latest crash comes at a time when Musk is touting Tesla's plans to deploy autonomous taxis in 2020. "A year from now, we'll have over a million cars with full self-driving, software, everything," he said at a recent "Autonomy Day" event for investors.

Those plans will be moot if federal regulators decide to crack down on Autopilot. Consumer advocates are urging the government to open an investigation into the advanced driver assistance system. "Either Autopilot can't see the broad side of an 18-wheeler, or it can't react safely to it," said David Friedman, vice president of advocacy for Consumer Reports. "This system can't dependably navigate common road situations on its own and fails to keep the driver engaged exactly when needed."

Car safety experts note that adaptive cruise control systems like Autopilot rely primarily on radar to avoid hitting other vehicles on the road. Radar is good at detecting moving objects, but not stationary ones. It also has difficulty detecting objects, such as a vehicle crossing the road, that are not moving in the car's direction of travel.

Radar returns from stationary objects are sometimes ignored by vehicle software to cope with the generation of "false positives," said Raj Rajkumar, a professor of electrical and computer engineering at Carnegie Mellon University. Without that filtering, the radar would "see" an overpass and report it as an obstacle, causing the vehicle to slam on the brakes. On the computer vision side of the equation, the algorithms that use camera output need to be trained to detect trucks that are perpendicular to the vehicle's direction of travel, he added. In most road situations there are vehicles to the front, back, and sides of a car, but a perpendicular vehicle is much less common.
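To make that trade-off concrete, here is a minimal, hypothetical Python sketch of the kind of speed-based filtering Rajkumar describes; the class, function names, and 0.5 m/s tolerance are illustrative assumptions, not Tesla's actual implementation. The idea: a stationary object closes on the car at exactly the car's own speed, so discarding such returns removes overpasses and road signs, but a trailer crossing perpendicular to the car's path has almost no velocity along the direction of travel, so it gets discarded too.

from dataclasses import dataclass

@dataclass
class RadarReturn:
    range_m: float        # distance to the detected object, in meters
    closing_speed: float  # how fast the gap is shrinking, in m/s

def filter_stationary(returns, ego_speed, tol=0.5):
    # Keep only returns that appear to move relative to the ground.
    # A stationary object's closing speed equals the car's own speed,
    # so anything within `tol` of ego_speed is treated as clutter.
    return [r for r in returns if abs(r.closing_speed - ego_speed) > tol]

# At 30 m/s (about 68 mph), an overpass and a perpendicular trailer
# both close at roughly 30 m/s, so both are dropped as clutter; only
# the slower-moving lead car survives the filter.
ego_speed = 30.0
detections = [
    RadarReturn(range_m=120.0, closing_speed=30.0),  # overpass (clutter)
    RadarReturn(range_m=60.0, closing_speed=29.9),   # crossing trailer!
    RadarReturn(range_m=40.0, closing_speed=5.0),    # lead car, kept
]
print(filter_stationary(detections, ego_speed))

In this toy model the crossing trailer is indistinguishable from an overpass, which is exactly the failure mode Rajkumar describes.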

"In essence, the same incident is repeated in three years," said Rajkumar. "This shows that these two issues have not yet been addressed." Machine training and artificial intelligence have inherent limitations. If the sensors "see" what they have never seen before, they do not know how to deal with these situations. "Tesla does not handle AI's well-known limitations," he added.

Tesla has yet to explain in detail how it intends to fix this problem. The company publishes a quarterly safety report on Autopilot, but that report is short on details. As a result, researchers have no solid data that would let them compare Autopilot's performance against that of other systems. Only Tesla has a complete understanding of Autopilot's logic and source code, and it guards those secrets closely.

"Exact details of exposure are needed when, where and what conditions drivers use the autopilot," said Brian Raymer, a researcher at the MIT Transport and Logistics Center, in an email to The Verge "So we can better quantify the risk with respect to other vehicles of this age and class."

Other Tesla owners have spoken about Autopilot's trouble detecting trucks crossing the road. An anonymous Twitter user who goes by the handle @greentheonly has "hacked" a Model X and posts findings on Twitter and YouTube. They did so to "monitor the autopilot from within," they told The Verge in an email. In March, their Model X encountered a semi-trailer perpendicular to their path, much like Brown and Banner did. The car would have tried to drive under the truck if the driver had not intervened, they said.

According to @greentheonly, the system did not register the semi-trailer as an obstacle. But they decided not to tempt fate: "I did not try to approach the trailer closer and see if any of the inputs would change (but I would not bet on it)."

