“Trusted Autonomous Driving” — Tesla Under Suspicion of Withholding Crash Reports, U.S. NHTSA Launches Investigation
- Delayed crash reports by up to several months
- Investigation into compliance of submitted reports
- Facing lawsuits over concealing critical risks of Robotaxis

Tesla, the U.S. electric vehicle manufacturer led by Elon Musk, is under investigation by U.S. transportation authorities over allegations that it failed to comply with crash reporting regulations related to autonomous driving. While Tesla attributed the delays to data collection issues, regulators intend to scrutinize whether the company met its reporting obligations. The probe comes amid growing controversy over autonomous driving safety, and follows mounting scrutiny of Robotaxi pilot operations as well as shareholder lawsuits that threaten to undermine Tesla's reputation as a symbol of self-driving innovation.
Delayed crash reporting
According to the Wall Street Journal (WSJ) on the 21st (local time), the U.S. National Highway Traffic Safety Administration (NHTSA) has launched an investigation into whether Tesla complied with mandatory reporting requirements for collisions involving specific driver-assistance features. In federal filings disclosed the previous day, the agency stated, “Reported collisions occurred months or more before the filing date,” and emphasized, “Most of these reports were required to be submitted within one to five days of Tesla being notified of the crashes.”
Many Tesla vehicles are equipped with the Autopilot system, which assists drivers on highways, as well as the Full Self-Driving (FSD) feature, which guides vehicles on both highways and city roads. However, both systems require drivers to remain attentive and ready to take control at all times.
Tesla explained that reporting delays were caused by internal data collection issues, which it claims have since been resolved. Nevertheless, regulators have initiated a standard “audit query” to assess the causes and extent of the delays, as well as the measures Tesla has introduced to address them. The agency also said it would closely examine whether Tesla’s submitted reports contain all data required under federal regulations.
Separately, since last October, NHTSA has been conducting an investigation into the safety of Tesla’s FSD software, which the company markets as self-driving. The inquiry focuses on crashes that occurred while FSD was operating under adverse conditions such as fog and dust that impaired road visibility.
“Claimed to outperform humans”
Legal rulings have already emerged over accidents that occurred while Autopilot was engaged. Earlier this month, a federal jury in Miami ordered Tesla to pay $243 million in damages over a 2019 crash in Florida. The accident took place at night on a two-lane road, when a Tesla Model S struck a parked SUV before colliding with a young couple standing nearby. The crash killed a woman and left a man severely injured. At the time, the Tesla driver had lowered his head to retrieve a dropped phone while Autopilot was activated.
The jury found Tesla 33% liable for the crash, citing technical flaws and inadequate safety warnings, despite the driver's negligence. The court ordered Tesla to pay $43 million in compensatory damages and $200 million in punitive damages, totaling $243 million. This marks the first U.S. case in which Tesla was found liable in a lawsuit involving Autopilot; previous cases had largely been dismissed or settled.
Jurors cited three primary reasons for Tesla’s liability: Autopilot failed to properly detect road boundaries, stationary vehicles, and pedestrians; Tesla did not clearly warn against using the system on non-highway roads despite its highway-oriented design; and finally, “the duty to maintain forward attention cannot rest solely on the driver,” even in cases of driver negligence.

Half of Americans say, “We’ll never ride Robotaxis”
Tesla now also faces a shareholder lawsuit alleging fraud over its Robotaxi program. On August 5, some investors filed a complaint in the U.S. District Court in Austin, Texas, accusing Musk and Tesla of overstating autonomous driving capabilities and concealing safety risks. They claim that Tesla deliberately withheld critical risks associated with Robotaxi pilot operations, which began on June 22 near company headquarters. Vehicles involved in the trial runs reportedly displayed safety lapses such as speeding, abrupt braking, curb-hopping, improper lane entry, and passenger drop-offs in the middle of multi-lane roads.
Public sentiment toward Robotaxis is deteriorating. According to a recent survey by the Electric Vehicle Intelligence Report (EVIR), half of the 8,000 U.S. adults surveyed said their interest in Robotaxis decreased after reading media reports highlighting safety problems. In total, 46% of respondents said they would never consider using Robotaxis, while another 31% took a somewhat softer but still negative stance, saying "not now, and likely never." Meanwhile, 1% reported having tried a Robotaxi once but said they would not use it again.
The survey also showed sharp divides by age and geography. Among respondents aged 65 and older, 53% said they would refuse to ride a Robotaxi, compared with 35% of those aged 18–44. In rural areas, 53% of respondents rejected Robotaxi use, compared with 37% in urban areas and 46% in suburban areas.
Moreover, 53% of respondents said Robotaxis should be subject to formal legal regulation before broader adoption. The results stand in stark contrast to the optimism of the late 2010s, when both the auto and tech industries envisioned the imminent commercialization of self-driving vehicles. Today, however, critics argue that expectations have far outpaced the actual performance and safety of the technology.