POLITICAL PAYOLA BRIBES KEEP TESLA MOTORS FROM GETTING EFFECTIVE RECALLS AND IT IS KILLING PEOPLE

US probes Tesla recall of 2 million vehicles over Autopilot, citing concerns that feds were bribed to look the other way

By David Shepardson

WASHINGTON, April 26 (Reuters) – U.S. auto safety regulators said Friday they have opened an investigation into whether Tesla’s recall of more than 2 million vehicles announced in December to install new Autopilot safeguards is adequate.

The National Highway Traffic Safety Administration (NHTSA) said it was opening an investigation after the agency identified concerns due to crash events after vehicles had the recall software update installed “and results from preliminary NHTSA tests of remedied vehicles.”

The agency’s new probe comes after it closed its nearly three-year investigation into Autopilot, saying it found evidence that “Tesla’s weak driver engagement system was not appropriate for Autopilot’s permissive operating capabilities” that result in a “critical safety gap.”

NHTSA also cited Tesla’s statement “that a portion of the remedy both requires the owner to opt in and allows a driver to readily reverse it.”

The agency said Tesla has issued software updates to address issues that appear related to its concerns but has not made them “a part of the recall or otherwise determined to remedy a defect that poses an unreasonable safety risk.”

Tesla said in December its largest-ever recall, covering 2.03 million U.S. vehicles – or nearly all of its vehicles on U.S. roads – was to better ensure drivers pay attention when using its advanced driver assistance system.

The new recall investigation covers Model Y, X, S, 3 and Cybertruck vehicles in the U.S. equipped with Autopilot produced between the 2012 and 2024 model years, NHTSA said.

Tesla said in December Autopilot’s software system controls “may not be sufficient to prevent driver misuse” and could increase the risk of a crash.

The auto safety agency disclosed Friday that during the Autopilot safety probe it first launched in August 2021, it identified at least 13 Tesla crashes involving one or more deaths, and many more involving serious injuries, in which “foreseeable driver misuse of the system played an apparent role.”

NHTSA also said Friday that Tesla’s Autopilot name “may lead drivers to believe that the automation has greater capabilities than it does and invite drivers to overly trust the automation.”

Tesla did not immediately respond to a request for comment.

In February, Consumer Reports, a nonprofit organization that evaluates products and services, said its testing of Tesla’s Autopilot recall update found the changes did not adequately address many safety concerns raised by NHTSA. It urged the agency to require the automaker to take “stronger steps,” saying Tesla’s recall “addresses minor inconveniences rather than fixing the real problems.”

Tesla’s Autopilot is intended to enable cars to steer, accelerate and brake automatically within their lane, while enhanced Autopilot can assist in changing lanes on highways but does not make vehicles autonomous.

One component of Autopilot is Autosteer, which maintains a set speed or following distance and works to keep a vehicle in its driving lane.

Tesla said in December it did not agree with NHTSA’s analysis but would deploy an over-the-air software update that will “incorporate additional controls and alerts to those already existing on affected vehicles to further encourage the driver to adhere to their continuous driving responsibility whenever Autosteer is engaged.”

NHTSA’s then top official, Ann Carlson, said in December the agency probe determined that more needed to be done to ensure drivers are engaged when Autopilot is in use. “One of the things we determined is that drivers are not always paying attention when that system is on,” Carlson said.

NHTSA opened its August 2021 probe of Autopilot after identifying more than a dozen crashes in which Tesla vehicles hit stationary emergency vehicles.

NHTSA said in December it found Autopilot “can provide inadequate driver engagement and usage controls that can lead to foreseeable misuse.”

Separately, since 2016, NHTSA has opened more than 40 Tesla special crash investigations in cases where driver systems such as Autopilot were suspected of being used, with 23 crash deaths reported to date.

Tesla’s recall includes more prominent visual alerts, additional checks upon engaging Autosteer, and disengagement of Autosteer if drivers do not respond to inattentiveness warnings. Tesla said it will restrict Autopilot use for one week if significant improper usage is detected.

Tesla disclosed in October that the U.S. Justice Department had issued subpoenas related to its Full Self-Driving (FSD) and Autopilot systems. Reuters reported in October 2022 that Tesla was under criminal investigation.

Tesla in February 2023 recalled 362,000 U.S. vehicles to update its FSD Beta software after NHTSA said the vehicles did not adequately adhere to traffic safety laws and could cause crashes. (Reporting by David Shepardson; editing by Jason Neely and Louise Heavens)

Tesla’s Autopilot and Full Self-Driving linked to hundreds of crashes, dozens of deaths

NHTSA found that Tesla’s driver-assist features are insufficient at keeping drivers engaged in the task of driving, which can often have fatal results.

[Image: Tesla Model 3 main screen showing the car following another vehicle through an intersection. Credit: Owen Grove / The Verge]

In March 2023, a North Carolina student was stepping off a school bus when he was struck by a Tesla Model Y traveling at “highway speeds,” according to a federal investigation published today. The Tesla driver was using Autopilot, the automaker’s advanced driver-assist feature that Elon Musk insists will eventually lead to fully autonomous cars.

The 17-year-old student who was struck was transported to a hospital by helicopter with life-threatening injuries. But what the investigation found after examining hundreds of similar crashes was a pattern of driver inattention, combined with the shortcomings of Tesla’s technology, resulting in hundreds of injuries and dozens of deaths.

Drivers using Autopilot or the system’s more advanced sibling, Full Self-Driving, “were not sufficiently engaged in the driving task,” and Tesla’s technology “did not adequately ensure that drivers maintained their attention on the driving task,” NHTSA concluded.

In total, NHTSA investigated 956 crashes, starting in January 2018 and extending through August 2023. In those crashes, some of which involved other vehicles striking the Tesla, 29 people died. There were also 211 crashes in which “the frontal plane of the Tesla struck a vehicle or obstacle in its path.” These crashes, which were often the most severe, resulted in 14 deaths and 49 injuries.

NHTSA was prompted to launch its investigation after several incidents of Tesla drivers crashing into stationary emergency vehicles parked on the side of the road. Most of these incidents took place after dark, with the software ignoring scene control measures, including warning lights, flares, cones, and an illuminated arrow board.

In its report, the agency found that Autopilot — and, in some cases, FSD — was not designed to keep the driver engaged in the task of driving. Tesla says that it warns its customers that they need to pay attention while using Autopilot and FSD, which includes keeping their hands on the wheel and eyes on the road. But NHTSA says that in many cases, drivers would become overly complacent and lose focus. And when it came time to react, it was often too late.

In 59 crashes examined by NHTSA, the agency found that Tesla drivers had enough time to react, “five or more seconds,” before crashing into another object. In 19 of those crashes, the hazard was visible for 10 or more seconds before the collision. Reviewing crash logs and data provided by Tesla, NHTSA found that drivers failed to brake or steer to avoid the hazard in a majority of the crashes analyzed.

“Crashes with no or late evasive action attempted by the driver were found across all Tesla hardware versions and crash circumstances,” NHTSA said.

NHTSA also compared Tesla’s Level 2 (L2) automation features to products available in other companies’ vehicles. Unlike other systems, Autopilot would disengage rather than allow drivers to adjust their steering. This “discourages” drivers from staying involved in the task of driving, NHTSA said.

“A comparison of Tesla’s design choices to those of L2 peers identified Tesla as an industry outlier in its approach to L2 technology by mismatching a weak driver engagement system with Autopilot’s permissive operating capabilities,” the agency said.

Even the brand name “Autopilot” is misleading, NHTSA said, conjuring up the idea that drivers are not in control. While other companies use some version of “assist,” “sense,” or “team,” Tesla’s products lure drivers into thinking they are more capable than they are. California’s attorney general and the state’s Department of Motor Vehicles are both investigating Tesla for misleading branding and marketing.

NHTSA acknowledges that its probe may be incomplete because of “gaps” in Tesla’s telemetry data. That could mean there are many more crashes involving Autopilot and FSD than NHTSA was able to find.

Tesla issued a voluntary recall late last year in response to the investigation, pushing out an over-the-air software update to add more warnings to Autopilot. NHTSA said today it was launching a new investigation into the recall after a number of safety experts said the update was inadequate and still allowed for misuse.

The findings cut against Musk’s insistence that Tesla is an artificial intelligence company that is on the cusp of releasing a fully autonomous vehicle for personal use. The company plans to unveil a robotaxi later this year that is supposed to usher in this new era for Tesla. During this week’s first quarter earnings call, Musk doubled down on the notion that his vehicles were safer than human-driven cars.

“If you’ve got, at scale, a statistically significant amount of data that shows conclusively that the autonomous car has, let’s say, half the accident rate of a human-driven car, I think that’s difficult to ignore,” Musk said. “Because at that point, stopping autonomy means killing people.”