
Tesla’s Autopilot Linked to Hundreds of Crashes in a New Safety Probe

By Samantha Parker | May 10, 2024 | #finance

The NHTSA has launched a new probe into the adequacy of December's 2-million-vehicle recall after discovering more crashes linked to Tesla's Autopilot and Full Self-Driving (FSD) software.

Another FSD Probe

The Wall Street Journal reports "U.S. Regulators Tie Tesla's Autopilot to More than a Dozen Fatalities, Hundreds of Crashes."

Federal auto-safety regulators have opened an investigation into the adequacy of Tesla’s December recall of 2 million vehicles equipped with Autopilot software, tying the technology to at least 14 fatalities, several dozen injuries and hundreds of crashes.

The National Highway Traffic Safety Administration said in a report published Friday that its examination of Tesla’s Autopilot, a driver-assist system that automates some driving tasks, uncovered a trend of “avoidable crashes involving hazards that would have been visible to an attentive driver.”

On Friday, the regulator said it was closing its earlier probe and opening the new one into the adequacy of the recall remedy, which was deployed through a software update. The recall in December was among Tesla’s largest to date and involved nearly all the vehicles it had sold in the U.S. 

NHTSA also compared Autopilot to similar systems deployed by auto-industry rivals, saying it found Tesla’s approach was an “industry outlier.” The agency said the Autopilot name “elicits the idea of drivers not being in control,” while other systems use terms like “assist” or “team” to imply that active supervision is required.

In its latest report, NHTSA took issue with Tesla's statements that a portion of the recall remedy required owner opt-in and could be reversed at the driver's discretion. It also said some Tesla updates appeared to address Autopilot issues that NHTSA raised without identifying them as remedies.

“This investigation will consider why these updates were not a part of the recall or otherwise determined to remedy a defect that poses an unreasonable safety risk,” the agency said.

Since the Autopilot update was rolled out, regulators have received an unusually high number of complaints about changes made to the controls. Some drivers say warnings have become excessive and are triggered when performing routine tasks.

Recall Created New Problems

The Tesla recall did not address any existing problems but did create new ones.

The Verge has additional comments.

In 59 crashes examined by NHTSA, the agency found that Tesla drivers had enough time, “five or more seconds,” prior to crashing into another object in which to react. In 19 of those crashes, the hazard was visible for 10 or more seconds before the collision. Reviewing crash logs and data provided by Tesla, NHTSA found that drivers failed to brake or steer to avoid the hazard in a majority of the crashes analyzed.

“Crashes with no or late evasive action attempted by the driver were found across all Tesla hardware versions and crash circumstances,” NHTSA said.

“A comparison of Tesla’s design choices to those of L2 peers identified Tesla as an industry outlier in its approach to L2 technology by mismatching a weak driver engagement system with Autopilot’s permissive operating capabilities,” the agency said.

Even the brand name “Autopilot” is misleading, NHTSA said, conjuring up the idea that drivers are not in control. While other companies use some version of “assist,” “sense,” or “team,” Tesla’s products lure drivers into thinking they are more capable than they are. California’s attorney general and the state’s Department of Motor Vehicles are both investigating Tesla for misleading branding and marketing.

NHTSA acknowledges that its probe may be incomplete based on “gaps” in Tesla’s telemetry data. That could mean there are many more crashes involving Autopilot and FSD than what NHTSA was able to find.

Self-Driving When?

On July 6, 2023, Elon Musk said Tesla would have "Level 4 or 5" self-driving that year.

Every year since 2016 he has promised to deliver a vehicle capable of going cross-country without a driver.

More Elon Musk Vaporware

On April 8, I commented "Tesla's Robotaxi August Launch Will Be More Elon Musk Vaporware."

Expect to hear more promises, always a year away.

Elon Musk will make an announcement on robotaxis on August 8. Tesla lags Waymo so badly that Musk is not even in the ballpark.

Tesla Rebounds On Musk Promises With No Details

On April 24, after a dismal first quarter and a 40 percent stock-market plunge this year, I commented "Tesla Rebounds On Musk Promises With No Details, It Won't Stick."

Musk said that the new models would come “early 2025 if not late this year.” Beyond that, though, he gave few details, pointedly declining to answer analysts’ questions on the topic during the call.

Refusal to provide details means one thing: Musk has no details to offer.

The name Full Self-Driving is a blatant lie. If you do a string of searches on who is ahead in driverless technology, you can find any answer you want.

But if you compare levels of autonomy, Waymo is Level 4: driverless.

Despite the name FSD, Tesla says your hands need to be on the wheel. Tesla is Level 2, far behind Waymo in capability.

Samantha Parker

Samantha is a seasoned journalist with a passion for uncovering the truth behind the headlines. With years of experience in investigative reporting, she has covered a wide range of topics including politics, crime, and entertainment. Her in-depth analysis and commitment to factual accuracy make her a respected voice in the field of journalism.
