A Tesla Supercharger station is visible in a parking lot in Austin, Texas, on September 16, 2024.
Brandon Bell | Getty Images
The family of a driver killed in a 2023 crash has sued Tesla, claiming the company’s “fraudulent misrepresentations” about its Autopilot technology were to blame.
Tesla driver Genesis Giovanni Mendoza-Martinez died in a Model S sedan crash in Walnut Creek, California. His brother Caleb, who was a passenger at the time, was seriously injured.
The Mendoza family sued Tesla in Contra Costa County in October, but in recent days Tesla moved the case from state court to federal court in the Northern District of California, a change of venue that was first reported by The Independent. Plaintiffs typically face a higher burden of proof on fraud claims in federal court.
The incident involved a 2021 Model S that crashed into a parked fire truck while the driver was using Tesla’s Autopilot, a semi-autonomous driving system.
Mendoza’s attorneys accuse Tesla and CEO Elon Musk of exaggerating or falsely promoting the Autopilot system for years in an effort to “stimulate interest in the company’s vehicles and thereby improve its financial position.” They pointed to tweets, company blog posts, and comments made on earnings calls and in media interviews.
Tesla lawyers said in response that the accident was attributable to the driver’s “own negligent acts and/or omissions,” and that any “reliance on any representation made by Tesla” was not a substantial factor in harm to the driver or passenger. They claim Tesla’s vehicles and systems have “reasonably safe designs” and comply with state and federal laws.
Tesla did not respond to a request for comment on the case. Brett Schreiber, an attorney representing the Mendoza family, declined to make his clients available for interviews.
There are at least 15 other ongoing cases involving similar claims against Tesla, in which Autopilot or FSD, short for Full Self-Driving (Supervised), was in use before a fatal or injury-causing crash. Three of those cases have been moved to federal court. FSD is a more advanced version of Tesla’s partially automated driving system. While Autopilot comes standard on all new Tesla vehicles, owners must pay an upfront fee or a monthly subscription to use FSD.
The crash at the center of the Mendoza-Martinez lawsuit was also part of a broader Tesla Autopilot investigation that the National Highway Traffic Safety Administration launched in August 2021. During that probe, Tesla made changes to its systems, including numerous over-the-air software updates.
The agency has launched a second investigation, which is ongoing, to evaluate whether Tesla’s “recall remedies” to address issues with Autopilot behavior around stationary emergency vehicles were effective.
NHTSA has also warned Tesla that its social media posts could mislead drivers into thinking its cars are capable of operating as robotaxis. In addition, the California Department of Motor Vehicles has sued Tesla, alleging that its Autopilot and FSD claims amount to false advertising.
Tesla is currently rolling out a new version of FSD to customers. Over the weekend, Musk instructed his more than 206.5 million followers on X to “show Tesla Autopilot to friends tomorrow,” adding, “It feels like magic.”
Since about 2014, Musk has been promising investors that Tesla’s cars would soon be able to drive themselves without a human at the wheel. Tesla has yet to deliver on that promise.
Meanwhile, Chinese competitors WeRide and Pony.ai, along with Alphabet-owned Waymo in the U.S., already operate commercial robotaxi fleets and services.
WATCH: Tesla FSD tests ‘very good’