Autonomous Vehicle Liability: Who is Responsible in a Self-Driving Car Crash?
April 20, 2026
Admin
Self-driving cars are no longer science fiction – they are on public roads today. But when an autonomous vehicle crashes, traditional tort law struggles to assign responsibility. Is it the human safety driver? The vehicle owner? The software developer? The sensor manufacturer? This guide explains the emerging legal framework for autonomous vehicle liability, the parties who can be sued, and the unique challenges of proving fault in crashes involving AI-driven cars.
Tip: Preserve all vehicle data after an autonomous vehicle crash – including event data recorders (black boxes), telematics logs, and software version histories. This data is often deleted or overwritten quickly.
1. Levels of Autonomy β Why They Matter for Liability
Not all self-driving cars are equally autonomous. The SAE's six levels (0-5) determine who is legally responsible when a crash occurs.
Level 0-2 (Driver assistance): Human driver monitors constantly. Liability follows traditional rules – the driver is presumptively responsible
Level 3 (Conditional automation): Vehicle handles driving under specific conditions, but human must intervene when requested. Liability allocation is contested
Level 4 (High automation): Vehicle performs all driving tasks within operational design domain (e.g., geofenced city area). No human intervention expected
Level 5 (Full automation): Vehicle performs all driving tasks everywhere, under all conditions. Human is purely a passenger
Legal bright line: At Levels 4 and 5, traditional driver negligence becomes irrelevant – product liability and manufacturer negligence dominate
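The bright line above can be summarized in a small lookup table. This is an illustrative sketch of how this guide allocates presumptive responsibility by SAE level – not a statement of the law in any jurisdiction.

```python
# Illustrative only: maps SAE automation level (0-5) to the party this
# guide treats as presumptively responsible. Actual outcomes depend on
# state law and the facts of each crash.
SAE_PRESUMPTIVE_LIABILITY = {
    0: "human driver",
    1: "human driver",
    2: "human driver",
    3: "contested (driver vs. manufacturer)",
    4: "manufacturer / software developer",
    5: "manufacturer / software developer",
}

def presumptive_defendant(sae_level: int) -> str:
    """Return the presumptively liable party for a given SAE level."""
    if sae_level not in SAE_PRESUMPTIVE_LIABILITY:
        raise ValueError(f"SAE levels run 0-5, got {sae_level}")
    return SAE_PRESUMPTIVE_LIABILITY[sae_level]
```

For example, `presumptive_defendant(2)` points at the human driver, while `presumptive_defendant(4)` points at the manufacturer side of the table.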
2. Potential Defendants in an Autonomous Vehicle Crash
Unlike traditional car crashes with one or two defendants, autonomous vehicle collisions can implicate multiple deep-pocketed parties.
Vehicle manufacturer (OEM): Tesla, GM, Ford, Mercedes – alleged design defects, inadequate testing, or failure to recall
Autonomous software developer: Waymo, Cruise, Aurora, Tesla (FSD) – algorithmic errors, perception failures, or planning mistakes
Sensor and component suppliers: LiDAR, radar, and camera manufacturers – defective hardware that caused perception failures
Human safety driver: For Level 2-3 vehicles – failure to monitor, late intervention, or distraction
Other human driver: Traditional negligence still applies to human-driven vehicles involved in crashes with AVs
3. Legal Theories for Suing AV Manufacturers
Plaintiffs have several causes of action against autonomous vehicle companies following a crash.
Product liability (design defect): AV's software or hardware design is unreasonably dangerous under ordinary use
Product liability (manufacturing defect): A specific vehicle deviated from intended design due to production error
Failure to warn: Manufacturer did not adequately warn users about known limitations or risks
Negligence: AV developer breached duty of care in testing, deployment, or software updates
Breach of warranty: Express or implied warranties about autonomous capabilities were false
Wrongful death / personal injury: Traditional tort claims against corporate defendants
4. Notable Autonomous Vehicle Crash Lawsuits
Real-world litigation is establishing early precedents for AV liability.
State v. Vasquez (2023): Uber's safety driver, Rafaela Vasquez, pleaded guilty to endangerment after the fatal 2018 crash in Tempe. Uber separately settled with the victim's family for an undisclosed amount
Tesla Autopilot litigation (ongoing): Multiple wrongful death lawsuits alleging Tesla's "Full Self-Driving" and Autopilot caused crashes through design defects
Cruise pedestrian dragging incident (2023): A robotaxi struck a pedestrian who had been thrown into its path by a human-driven car, then dragged her roughly 20 feet. The California DMV suspended Cruise's permits; multiple lawsuits are pending
Hsu v. Tesla (2023): A Los Angeles jury found Tesla not liable – the first Autopilot trial verdict for the defense
Waymo arbitration cases: Multiple confidential settlements over minor crashes involving Waymo robotaxis in Phoenix and San Francisco
5. The Black Box Problem β Proving What Happened
Autonomous vehicle crashes often lack eyewitnesses, and human drivers may not understand what the AV was doing. Data is everything.
Event Data Recorder (EDR): Federal rules (49 CFR Part 563) standardize EDR data in vehicles that have one – and nearly all new vehicles do – recording speed, braking, steering, and airbag deployment
AV-specific logging: Self-driving cars record perception data (what sensors detected), planning data (intended path), and control data (steering/braking commands)
Spoliation risk: AV companies may automatically delete or overwrite data. Issue a litigation hold immediately after a crash
Source code access: Plaintiffs increasingly seek access to AV software source code – courts have granted limited access under protective orders
Expert reconstruction: Accident reconstruction experts must understand both traditional vehicle dynamics and AV software behavior
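The three logging layers above (perception, planning, control) are typically separate data streams. The sketch below models them with hypothetical field names – real AV log schemas are proprietary and vary by manufacturer, so everything here is an invented illustration of what reconstruction experts look for.

```python
from dataclasses import dataclass

# Hypothetical schema: field names are invented for illustration;
# real AV log formats are proprietary.
@dataclass
class PerceptionRecord:
    """What the sensors detected at a given moment."""
    timestamp_ms: int
    detected_objects: list  # e.g., ["pedestrian", "vehicle"]

@dataclass
class PlanningRecord:
    """The maneuver the AV intended to execute."""
    timestamp_ms: int
    intended_maneuver: str  # e.g., "brake", "lane_keep"

@dataclass
class ControlRecord:
    """Commands actually sent to the actuators."""
    timestamp_ms: int
    brake_pct: float
    steering_deg: float

def detection_to_brake_latency_ms(p: PerceptionRecord, c: ControlRecord) -> int:
    """Gap between detecting a hazard and commanding the brakes --
    a number reconstruction experts often try to establish."""
    return c.timestamp_ms - p.timestamp_ms
```

A reconstruction expert comparing the perception stream ("the AV saw a pedestrian at t=1000 ms") against the control stream ("brakes were commanded at t=1450 ms") can quantify reaction latency in a way eyewitness testimony never could.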
6. Preemption and Federal Regulation
Federal and state laws interact in complex ways for autonomous vehicle liability.
NHTSA authority: The National Highway Traffic Safety Administration sets safety standards and can order recalls, and its 2021 Standing General Order requires reporting of crashes involving automated driving systems – but it has issued no comprehensive AV-specific safety standards
State liability laws: Traditional tort claims (negligence, product liability) are governed by state law, not preempted by federal law
AV START Act (stalled): This proposed federal legislation would have preempted some state AV regulations while explicitly preserving state tort remedies; it died in the Senate, and successor bills remain under discussion
No federal preemption for design defect: Courts have rejected arguments that federal vehicle safety standards preempt state product liability claims
State AV testing rules: Many states require reporting of AV crashes to the DMV – these reports are discoverable in litigation
7. Human Driver vs. AV: Comparative Fault Complications
Many crashes involve both an autonomous vehicle and a human-driven car. Allocating fault is legally complex.
Human driver negligence: Traditional rules apply – speeding, distracted driving, running red lights, DUI
AV fault allocation: If AV contributed, manufacturer may be partially liable under comparative fault principles
Joint and several liability: In some states, any defendant found partially at fault can be held liable for entire judgment
Unexpected behavior defense: Human drivers may argue AV's unpredictable behavior caused them to react unreasonably
Last clear chance doctrine: Even if AV was negligent, human driver who had final opportunity to avoid crash may bear full liability
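The comparative fault principles above reduce a plaintiff's recovery by their own share of fault. Here is a minimal sketch under a pure comparative negligence rule; note that "modified" comparative fault states instead bar recovery entirely once the plaintiff's fault reaches 50% or 51%, so this arithmetic varies by jurisdiction.

```python
def pure_comparative_recovery(total_damages: float,
                              plaintiff_fault_pct: float) -> float:
    """Pure comparative negligence: recovery is reduced in proportion
    to the plaintiff's own fault (no bar, unlike 'modified' states)."""
    if not 0 <= plaintiff_fault_pct <= 100:
        raise ValueError("fault percentage must be between 0 and 100")
    return total_damages * (100 - plaintiff_fault_pct) / 100

# Example: $500,000 in damages; the human driver-plaintiff is found
# 30% at fault and the AV manufacturer 70% at fault.
recovery = pure_comparative_recovery(500_000, 30)  # 350000.0
```

In a pure comparative state the plaintiff collects $350,000; in a modified comparative state the same plaintiff would still recover (30% is below the bar), but a plaintiff found 55% at fault would recover nothing.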
8. Damages Available in AV Crash Lawsuits
Victims of autonomous vehicle crashes can seek the same damages as in traditional car accidents – plus potentially more.
Medical expenses: Past and future treatment costs, rehabilitation, assistive devices
Lost wages and earning capacity: Income loss from temporary or permanent disability
Pain and suffering: Non-economic damages for physical and emotional distress
Property damage: Vehicle repair or replacement costs
Punitive damages: Available if AV company acted with conscious disregard for safety (e.g., known software defect but deployed anyway)
Wrongful death damages: Loss of financial support, companionship, and funeral expenses for fatal crashes
Class action potential: Software defects affecting thousands of vehicles may support class action for economic losses (not personal injury)
9. Unique Defenses Available to AV Companies
Autonomous vehicle defendants have raised novel defenses not available to human drivers or traditional automakers.
Preemption by federal standards: Argue NHTSA approval of AV system preempts state tort claims (largely unsuccessful to date)
Assumption of risk: User accepted known risks of beta AV technology by agreeing to terms of service
Software as a "product" dispute: Argue software is a service, not a product – avoiding strict product liability (untested)
Human misuse: Safety driver or owner modified or misused the AV system beyond intended design
Unavoidable accident: Even with perfect perception and planning, crash was unavoidable due to physics or other driver's sudden actions
Arbitration clauses: Tesla and others require arbitration for certain claims through clickwrap agreements
10. Steps to Take After an Autonomous Vehicle Crash
If you are involved in a crash with a self-driving car – as driver, passenger, pedestrian, or cyclist – take these steps.
Seek medical attention immediately: Some injuries appear hours or days after a crash
Preserve evidence: Photograph the AV, license plate, and any visible sensors (LiDAR domes, camera arrays)
Identify the operator: Was a human in the driver's seat? Is it a robotaxi (Waymo, Cruise) or privately owned vehicle?
Obtain the AV company's information: Insurance details, fleet operator name, and contact information
Do not sign anything at the scene: AV companies may present tablets or forms – consult an attorney first
Issue a spoliation letter: Demand that the AV company preserve all EDR, telematics, and software log data
Consult an AV accident attorney: Traditional car accident lawyers may lack experience with product liability and software defect claims
Act quickly: Statute of limitations for personal injury is typically 1-3 years depending on state
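The filing deadline above can be roughly estimated from the crash date. A minimal sketch, assuming the clock runs from the date of injury – tolling rules, the discovery rule, and shorter notice deadlines for claims against government entities can all change the real deadline, so this is illustrative arithmetic only.

```python
from datetime import date

def filing_deadline(crash_date: date, limitation_years: int) -> date:
    """Naive statute-of-limitations deadline: the same calendar date,
    `limitation_years` later. Real deadlines depend on state tolling
    rules and should be confirmed with an attorney."""
    try:
        return crash_date.replace(year=crash_date.year + limitation_years)
    except ValueError:
        # Crash on Feb 29 of a leap year: fall back to Feb 28.
        return crash_date.replace(year=crash_date.year + limitation_years,
                                  day=28)

# Example: a 2-year limitation period on a June 15, 2024 crash.
deadline = filing_deadline(date(2024, 6, 15), 2)  # date(2026, 6, 15)
```

The point of the sketch is how unforgiving the math is: a crash in mid-2024 under a two-year statute is time-barred by mid-2026, regardless of how long the AV company's internal investigation takes.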
Conclusion
Who is responsible when a self-driving car crashes? The answer depends on the vehicle's autonomy level, the cause of the crash, and what the data reveals. For Level 0-2 vehicles, human drivers remain primarily liable. But for Level 4-5 autonomous vehicles, liability shifts to manufacturers, software developers, sensor suppliers, and fleet operators – through product liability, negligence, and warranty claims. Early litigation against Tesla, Uber, Cruise, and Waymo is establishing precedent, but no Supreme Court ruling has yet clarified the rules. Victims of AV crashes should preserve all vehicle data, avoid signing quick settlements, and consult attorneys experienced in both traditional accident reconstruction and emerging autonomous vehicle law. As self-driving technology becomes mainstream, expect AV liability to be one of the most contested areas of tort law for the next decade.
⚠️ Note: Autonomous vehicle laws vary significantly by state and are rapidly evolving. This guide is educational and not legal advice. Consult a qualified attorney with specific experience in autonomous vehicle litigation. Review the NHTSA Automated Vehicles Safety page for federal guidance.