Tesla Sued by Deceased Driver’s Family Over ‘Fraudulent Misrepresentation’ of Autopilot Safety
Tesla Supercharger stations are seen in a parking lot in Austin, Texas.
Tesla is facing a lawsuit from the family of a deceased driver, accusing the company of “fraudulent misrepresentation” regarding the safety and performance of its Autopilot system.
The lawsuit stems from a 2023 collision in Walnut Creek, California, involving a Model S sedan. Genesis Giovanni Mendoza-Martinez, the driver, died in the crash, while his brother, Caleb, sustained serious injuries as a passenger.
The family filed the lawsuit in October in Contra Costa County, but Tesla recently had the case moved to federal court in California’s Northern District. Plaintiffs typically face a higher burden of proof for fraud claims in federal court, according to reports.
Details of the Case
The incident occurred while the vehicle was operating on Tesla’s Autopilot, a partially automated driving system. According to Mendoza’s attorneys, the 2021 Model S collided with a parked fire truck. The family’s legal team argues that Tesla and CEO Elon Musk have exaggerated and misrepresented the capabilities of Autopilot for years through tweets, blog posts, press interviews, and earnings calls, and that these representations created false excitement about Tesla’s vehicles in order to boost the company’s financial performance.
In response, Tesla maintains that the crash resulted from “the driver’s own negligent acts or omissions” and that the company’s representations did not significantly contribute to the collision. Tesla also asserts that its vehicles and systems have a “reasonably safe design” compliant with all applicable state and federal laws.
Tesla has yet to comment further on the case. Brett Schreiber, the attorney representing the Mendoza family, declined to make his clients available for interviews.
Broader Legal Context
This lawsuit is one of at least 15 active cases involving fatal or injury-causing crashes that allegedly occurred while Tesla’s Autopilot or its premium Full Self-Driving (FSD) system was in use. Notably, three of these lawsuits have been moved to federal court.
The National Highway Traffic Safety Administration (NHTSA) has been investigating Tesla’s Autopilot system since August 2021. That investigation has already prompted Tesla to change its systems through software updates.
Additionally, NHTSA has launched a second investigation into whether Tesla’s “recall remedy” has resolved concerns related to Autopilot’s behavior near stationary first responder vehicles. The agency has also flagged concerns over Tesla’s social media posts, warning they may mislead drivers into thinking their cars can act as fully autonomous robotaxis.
Moreover, the California Department of Motor Vehicles has sued Tesla, alleging that the company’s claims about Autopilot and FSD amount to false advertising.
Tesla’s Continued Push for Full Self-Driving Capabilities
Despite these legal challenges, Tesla continues to roll out its latest version of Full Self-Driving (FSD) to customers. Musk recently took to X (formerly Twitter), encouraging his 206.5 million followers to demonstrate FSD to friends, stating that “it feels like magic.”
Musk has long promised that Tesla vehicles will soon achieve full autonomy without requiring human intervention, a goal the company has pursued since 2014. Although Tesla has showcased concepts like the two-seater autonomous “CyberCab,” commercial robotaxis remain absent from Tesla’s offerings. Meanwhile, companies such as WeRide and Pony.ai in China and Alphabet’s Waymo in the U.S. already operate their own robotaxi services.
The outcome of this lawsuit and the broader investigations could further shape the debate over Tesla’s Autopilot safety and the path toward fully autonomous driving technology.
