Auto Safety Agency Expands Tesla Investigation

The federal government’s top auto safety agency is significantly expanding an investigation into Tesla and its Autopilot driver-assistance system to determine whether the technology poses a safety risk.

The agency, the National Highway Traffic Safety Administration, said Thursday that it was upgrading its preliminary evaluation of Autopilot to an engineering analysis, a more intensive level of scrutiny that is required before a recall can be ordered.

The analysis will look at whether Autopilot fails to prevent drivers from diverting their attention from the road and engaging in other predictable and risky behavior while using the system.

“We’ve been calling for closer scrutiny of Autopilot for some time,” said Jonathan Adkins, executive director of the Governors Highway Safety Association, which coordinates state efforts to promote safe driving.

The NHTSA said it was aware of 35 crashes that occurred while Autopilot was activated, including nine that resulted in the deaths of 14 people. But the agency said Thursday that it had not determined whether Autopilot has a defect that can cause cars to crash while it is engaged.

The expanded investigation covers 830,000 vehicles sold in the United States. They include all four Tesla models – the S, X, 3 and Y – from model years 2014 through 2021. The agency will examine Autopilot and its various component systems that handle steering, braking and other driving tasks, as well as a more advanced system that Tesla calls Full Self-Driving.

Tesla did not respond to a request for comment on the agency’s move.

The preliminary evaluation focused on 11 crashes in which Tesla cars operating under Autopilot control struck parked emergency vehicles that had their lights flashing. In the course of that review, NHTSA said Thursday, the agency learned of 191 crashes – not limited to those involving emergency vehicles – that warranted closer investigation. The crashes occurred while the cars were operating under Autopilot, Full Self-Driving or associated features, the agency said.

Tesla says Full Self-Driving software can guide a car on city streets but does not make the vehicle fully autonomous, and drivers must remain attentive. It is also available only to a limited set of customers in what Tesla calls a “beta,” or test, version that is not fully developed.

The depth of the investigation suggests that NHTSA is looking more seriously at safety concerns arising from a lack of safeguards to prevent drivers from using Autopilot in dangerous ways.

“This is not your typical defect case,” said Michael Brooks, acting executive director of the Center for Auto Safety, a nonprofit consumer advocacy group. “They are actively looking for a problem that can be fixed, and they are looking at driver behavior, and the problem may not be a component in the vehicle.”

Tesla and its chief executive, Elon Musk, have been criticized for hyping Autopilot and Full Self-Driving in ways that suggest the systems are capable of piloting cars without input from drivers.

“At a minimum, they should be renamed,” said Mr Adkins of the Governors Highway Safety Association. “Those names confuse people into thinking they can do more than they are actually capable of.”

Competing systems developed by General Motors and Ford Motor use infrared cameras that closely track the driver’s eyes and sound warning chimes if a driver looks away from the road for more than two or three seconds. Tesla did not initially include such a driver monitoring system in its cars, and later added only a standard camera that is much less precise than infrared cameras at tracking a driver’s eyes.

Tesla tells drivers to use Autopilot only on divided highways, but the system can be activated on any road that has lines down the middle. The GM and Ford systems – known as Super Cruise and BlueCruise – can be activated only on highways.

Autopilot was first offered in Tesla models in late 2015. It uses cameras and other sensors to steer, accelerate and brake with little input from drivers. Owner’s manuals tell drivers to keep their hands on the steering wheel and their eyes on the road, but early versions of the system allowed drivers to keep their hands off the wheel for five minutes or more under certain conditions.

Unlike technologists at almost every other company working on self-driving vehicles, Mr Musk has insisted that autonomy can be achieved solely with cameras tracking a car’s surroundings. But many Tesla engineers have questioned whether it is safe enough to rely on cameras without other sensing devices.

Mr Musk has regularly promoted Autopilot’s capabilities, saying autonomous driving is a “solved problem” and predicting that drivers will soon be able to sleep while their cars drive them to work.

Questions about the system arose in 2016 when an Ohio man was killed after his Model S crashed into a tractor-trailer on a highway in Florida while Autopilot was activated. NHTSA investigated that crash and said in 2017 that it had found no safety defect in Autopilot.

But the agency issued a bulletin in 2016 saying that driver-assistance systems that fail to keep drivers engaged “may also be an unreasonable risk to safety.” And in a separate investigation, the National Transportation Safety Board concluded that the Autopilot system had “played a major role” in the Florida crash because it lacked safeguards to prevent misuse.

Tesla is facing lawsuits from the families of victims of fatal crashes, and some customers have sued the company over its claims about Autopilot and Full Self-Driving.

Last year, Mr Musk acknowledged that developing autonomous vehicles was more difficult than he had thought.

The NHTSA opened its preliminary evaluation of Autopilot in August, initially focusing on 11 crashes in which Teslas operating with Autopilot engaged ran into police cars, fire trucks and other emergency vehicles that had stopped and had their lights flashing. Those crashes killed one person and injured 17 others.

While reviewing those crashes, the agency discovered six more involving emergency vehicles and eliminated one of the original 11 from further study.

At the same time, the agency learned of dozens of other crashes that occurred while Autopilot was active and that did not involve emergency vehicles. Of those, it first focused on 191, eliminating 85 from further scrutiny because it could not obtain enough information to form a clear picture of whether Autopilot was a major cause.

In about half of the remaining 106 crashes, NHTSA found evidence suggesting that drivers did not have their full attention on the road. About a quarter of the 106 occurred on roads where Autopilot is not supposed to be used.

In an engineering analysis, NHTSA’s Office of Defects Investigation sometimes acquires the vehicles it is examining and arranges testing meant to identify flaws and replicate the problems they can cause. In the past it has taken apart components to find faults, and has asked manufacturers for detailed data on how components operate, often including proprietary information.

The process can take months or even a year or more. NHTSA aims to complete the analysis within a year. If it concludes that a safety defect exists, it can press the manufacturer to initiate a recall and correct the problem.

On rare occasions, automakers have contested the agency’s conclusions in court and succeeded in preventing recalls.
