The U.S. National Highway Traffic Safety Administration intensifies its probe into Tesla's Full Self-Driving system, a critical step that could trigger a mandatory recall
The U.S. National Highway Traffic Safety Administration (NHTSA) has significantly escalated its investigation into Tesla's Full Self-Driving (FSD) system, upgrading the probe to a formal engineering analysis, the regulatory step that typically precedes a recall order. The intensified investigation, announced on March 19, 2026, now encompasses approximately 3.2 million Tesla vehicles across multiple models and model years.
This escalation represents a pivotal moment in the ongoing scrutiny of autonomous and semi-autonomous driving technologies, as regulators grapple with ensuring public safety while the automotive industry races toward a driverless future.
From Preliminary Evaluation to Engineering Analysis
The upgrade from a preliminary evaluation to an engineering analysis signals that NHTSA's Office of Defects Investigation (ODI) has identified sufficient evidence of potential safety defects to warrant a more thorough technical examination. This phase involves a deeper engineering review, including analysis of system design, failure modes, and the adequacy of any proposed remedies.
According to regulatory documents and reports, the investigation—designated PE25012—was initially opened on October 7, 2025, following reports of traffic safety violations linked to Tesla's FSD system. Since then, the scope and severity of documented incidents have expanded considerably.
> "The upgrade in the probe marks a significant step that could lead to vehicle recalls or enforcement action if regulators identify safety defects in Tesla's driver-assistance technology."
Core Safety Concerns: A Multi-Faceted Investigation
The NHTSA investigation is examining several critical areas of concern regarding Tesla's FSD system, which operates under various software iterations including FSD (Supervised) and FSD (Beta).
1. Traffic Signal Violations
A primary focus of the investigation centers on the FSD v12.x software series and its ability to accurately detect and respond to traffic signals. Regulators have documented numerous instances where FSD-equipped vehicles failed to adhere to traffic controls:
- Running red lights: Multiple reports indicate vehicles proceeding through intersections against red traffic signals
- Improper lane selection: Vehicles proceeding straight through turn-only lanes or executing turns from through-only lanes
- Wrong-way driving: Instances of vehicles entering opposing lanes of travel or attempting to turn onto roads against traffic flow
According to NHTSA documents, the agency has identified at least 80 instances of traffic violations, drawn from 62 driver complaints, 14 reports submitted by Tesla, and 4 media accounts. This represents a 60% increase from initial counts of approximately 50 violations.
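The agency's tallies are straightforward to cross-check; a minimal sketch of the arithmetic, using only the figures quoted above:

```python
# Sources of the 80 documented traffic violations (figures from NHTSA filings)
complaints = 62      # driver complaints
tesla_reports = 14   # reports submitted by Tesla
media_accounts = 4   # media accounts

total = complaints + tesla_reports + media_accounts
print(total)  # 80

# Growth relative to the initial count of approximately 50 violations
initial = 50
increase = (total - initial) / initial
print(f"{increase:.0%}")  # 60%
```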
Several crashes have resulted from these failures, including incidents at a problematic intersection in Joppa, Maryland, where multiple FSD-equipped vehicles exhibited similar violations.
2. Low-Visibility Performance Failures
The investigation is particularly scrutinizing how FSD performs under low-visibility conditions, including:
- Sun glare and extreme brightness
- Fog and atmospheric haze
- Airborne dust and particulate matter
- Nighttime driving challenges
NHTSA has cited multiple crashes occurring under these conditions, including at least one fatal incident where a Tesla vehicle struck and killed a pedestrian. An additional crash in similar conditions resulted in reported injuries.
The agency is examining whether Tesla's vision-only approach—which relies exclusively on cameras and neural networks without radar or lidar sensors—adequately addresses the challenges posed by reduced roadway visibility. Critics have long argued that camera-based systems struggle with scenarios where visual cues are obscured or washed out.
3. Software Stability Issues
The investigation is also examining the FSD version 14.2.2.3 release for reported stability concerns. User reports and field data have documented various issues with this software build, including:
- Lane drift and positioning errors
- Navigation logic failures
- Inconsistent decision-making at intersections
- Erratic behavior in complex driving scenarios
These stability issues compound the safety concerns, as unpredictable system behavior makes it more difficult for drivers to anticipate when intervention may be necessary.
4. Legacy Beta Version Review
NHTSA is reviewing legacy FSD Beta versions to determine whether adequate engineering controls were implemented to mitigate risks during challenging operating conditions. The investigation seeks to establish whether:
- Earlier software versions contained known deficiencies
- Tesla implemented sufficient safeguards for beta software operating on public roads
- The "beta" designation was appropriately applied given the system's capabilities
The Warning System Question
A critical component of the NHTSA probe focuses on driver warning systems and whether FSD provides:
- Adequate advance warning when the system is approaching operational limits
- Sufficient time for driver intervention when the software makes an error
- Clear, conspicuous, and timely indications of the system's intended driving responses
Regulatory documents reveal that investigators are specifically examining whether drivers receive appropriate alerts that would allow them to safely supervise the automated driving task and intervene when necessary.
> "The adequacy of the indications and warnings of the system's intended driving responses includes consideration of whether they are, among other things, accurate, conspicuous and sufficiently timely such that a driver may safely supervise the automated driving task and intervene as necessary."
Tesla's Response and Data Submission Challenges
Tesla has faced significant challenges in responding to NHTSA's information requests. The agency initially set a deadline of January 19, 2026, for Tesla to deliver critical crash data including video footage, event data recorder (EDR) files, and CAN bus data.
However, Tesla requested and received multiple deadline extensions:
- First extension: Pushed deadline to February 23, 2026
- Second extension: Further pushed to March 9, 2026
Tesla cited the burden of responding to multiple simultaneous NHTSA investigations—including separate probes into delayed crash reporting and inoperative door handles—as justification for the delays. The company reported having 8,313 records requiring manual review, processing approximately 300 per day.
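At the self-reported pace, the backlog arithmetic roughly accounts for the extra weeks Tesla requested; a quick back-of-the-envelope check using the figures above:

```python
# Back-of-the-envelope check of Tesla's reported review backlog
records = 8313   # records requiring manual review, per Tesla's filing
per_day = 300    # approximate records processed per day

days_needed = records / per_day
print(round(days_needed))  # roughly 28 days of continuous review
```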
NHTSA has the authority to levy civil penalties of up to $139.4 million for noncompliance with information requests, though the agency has thus far focused on securing the requested data rather than punitive measures.
The Broader Regulatory Context
This investigation unfolds against a backdrop of increasing regulatory scrutiny of automated driving systems:
Previous NHTSA Actions
- January 2025: Investigation opened into "Actually Smart Summon" feature failures
- August 2025: Probe initiated into delayed reporting of FSD/Autopilot crashes
- September 2025: Investigation opened into inoperative electronic door handles
- December 2025: Escalation of traffic violation investigation (PE25012-01)
Industry Implications
The outcome of this investigation will likely set precedent for how regulators approach Level 2 driver-assistance systems across the automotive industry. Tesla's approach—shipping software to millions of consumer vehicles and iterating through over-the-air updates—differs fundamentally from competitors like Waymo, which operates in limited geographic areas with dedicated hardware and incremental expansion.
> "The investigation's resolution will set precedent for every company pursuing autonomous driving."
Technical Underpinnings: The Vision-Only Debate
At the heart of the investigation lies a fundamental technical question: Can a vision-only autonomous driving system achieve sufficient safety without complementary sensors like radar or lidar?
Tesla eliminated radar from its vehicles in 2021, betting that cameras combined with neural networks could achieve superior performance at lower cost. However, the documented failures in low-visibility conditions highlight the inherent limitations of camera-based perception:
- Sun glare can wash out traffic signals and lane markings
- Fog and dust reduce contrast and detection range
- Night conditions challenge dynamic range and object recognition
- Adverse weather can obscure critical visual cues
Safety experts argue that redundant sensing modalities—combining cameras with radar and potentially lidar—provide essential backup when one sensor type is compromised.
Potential Outcomes and Next Steps
NHTSA faces several potential paths forward:
1. Mandate a Recall: If the engineering analysis confirms a safety defect, NHTSA could order a recall requiring Tesla to address the identified issues through software updates or other remedies.
2. Require Operational Restrictions: Regulators could mandate that FSD disable itself in conditions where it has demonstrated inadequate performance.
3. Enhanced Warning Requirements: NHTSA could require more robust driver monitoring and warning systems to ensure adequate supervision.
4. Marketing and Labeling Changes: The agency could pressure Tesla to modify how FSD is marketed to better reflect its actual capabilities and limitations.
Conclusion: A Defining Moment for Autonomous Driving
NHTSA's escalation of the Tesla FSD investigation to an engineering analysis represents a watershed moment for the autonomous vehicle industry. With 3.2 million vehicles under scrutiny and documented incidents including fatalities, the stakes could not be higher.
As Tesla continues to push toward its vision of fully autonomous transportation—including planned robotaxi services—the company's ability to demonstrate that FSD can operate safely across all conditions will determine not only its regulatory future but also public trust in autonomous driving technology itself.
The investigation's findings will likely influence regulatory frameworks for years to come, shaping how automakers develop, test, and deploy automated driving systems on public roads. For Tesla, the outcome could mean anything from mandated software changes to operational restrictions—or, if the company can demonstrate adequate safety, a pathway toward broader deployment of its autonomous driving ambitions.
This article is based on NHTSA investigation documents, regulatory filings, and news reports as of March 19, 2026.