Teslas with Autopilot one step closer to recall after crashes

DETROIT – Teslas with partially automated driving systems are one step closer to being recalled after the United States upgraded its investigation into a series of collisions with parked emergency vehicles or trucks with warning signs.

The National Highway Traffic Safety Administration said Thursday it is upgrading the probe to an engineering analysis, another sign of increased scrutiny of the electric vehicle maker and of automated systems that perform at least some driving tasks.

An engineering analysis is the final phase of an investigation, and in most cases NHTSA decides within a year whether to seek a recall or close the probe.

Documents released by the agency on Thursday raise serious concerns about Tesla’s Autopilot system. The agency found that it is being used in areas where its capabilities are limited, and that many drivers are not taking steps to avoid crashes despite warnings from the vehicle.

NHTSA reported that it had found 16 crashes into emergency vehicles and trucks with warning signs, causing 15 injuries and one death.

The probe now covers 830,000 vehicles, almost all that the Austin, Texas-based automaker has sold in the United States since the start of the 2014 model year.

Investigators will evaluate additional data and vehicle performance, and will “explore the degree to which Autopilot and associated Tesla systems may exacerbate human factors or behavioral safety risks which undermine the effectiveness of the driver’s supervision,” the agency said.

A message was left Thursday seeking comment from Tesla.

In most of the 16 crashes, the Teslas issued forward collision alerts to drivers just before impact. Automatic emergency braking intervened to at least slow the cars in about half the cases. On average, Autopilot gave up control of the Teslas less than a second before the crash, NHTSA said in documents detailing the probe.

NHTSA also said it is looking into crashes involving similar patterns that did not include emergency vehicles or trucks with warning signs.

The agency found that in many cases, drivers had their hands on the steering wheel but still failed to take action to avoid a crash. This suggests drivers are complying with Tesla’s monitoring system, which requires them to keep hands on the wheel, the agency wrote. But that does not necessarily ensure they are paying attention.

In crashes where video is available, drivers should have seen first responder vehicles an average of eight seconds before impact, the agency wrote.

The agency will have to decide whether there is a safety defect in Autopilot before pursuing a recall.

Investigators also wrote that a driver’s use or misuse of the driver monitoring system, “or operation of a vehicle in an unintended manner does not necessarily preclude a system defect.”

In all, the agency analyzed 191 crashes but eliminated 85 of them because other drivers were involved or there was not enough information to make a definitive assessment. Of the remaining 106, the main cause of about a quarter of the crashes appears to be running Autopilot in areas where it has limitations, or in conditions that can interfere with its operation. “For example, operation on roadways other than limited access highways, or operation in low traction or visibility environments such as rain, snow or ice,” the agency wrote.

The National Transportation Safety Board, which has also investigated some of Tesla’s crashes dating back to 2016, has recommended that NHTSA and Tesla limit Autopilot’s use to areas where it can operate safely. The NTSB also recommended that NHTSA require Tesla to have a better system to make sure drivers are paying attention. NHTSA has yet to act on the recommendations. The NTSB has no enforcement powers and can only make recommendations to other federal agencies.

In 2020, the NTSB blamed Tesla, drivers, and lax NHTSA regulation for two collisions in which Teslas on Autopilot crashed beneath tractor-trailers. The NTSB took the unusual step of accusing NHTSA of contributing to the crashes by failing to make sure automakers put safeguards in place to limit the use of electronic driving systems.

The agency made those conclusions after investigating a 2019 crash in Delray Beach, Florida, in which the 50-year-old driver of a Tesla Model 3 was killed. The car was on Autopilot when neither the driver nor the Autopilot system braked or tried to avoid a tractor-trailer crossing in its path.

In a statement, NHTSA said there are no vehicles available for purchase today that can drive themselves. “Every available vehicle requires the human driver to be in control at all times, and all state laws hold the human driver responsible for operation of their vehicles,” the agency said.

Driver assistance systems can help avoid crashes, but they must be used correctly and responsibly, the agency said.

NHTSA began its investigation last August after a series of crashes since 2018 in which Teslas using Autopilot or the company’s Traffic Aware Cruise Control hit vehicles at scenes where first responders had used flashing lights, flares, an illuminated arrow board, or cones warning of hazards.