Tesla is facing a lawsuit alleging its claims about Autopilot and Full Self Driving's (FSD) capabilities contributed to a fatal crash, giving the courts yet another chance to weigh arguments the automaker has faced in previous lawsuits.
In this instance, driver Genesis Giovanni Mendoza Martinez died, and his brother Caleb was seriously injured, when Giovanni's Tesla Model S slammed into a fire truck parked diagonally across two lanes of a California interstate highway for traffic control in an unrelated incident.
According to the lawsuit [PDF], originally filed in California's Contra Costa Superior Court in October, the plaintiffs' lawyers claim Mendoza's Model S was operating under Autopilot at the time of the collision, and he had "generally maintained contact with the steering wheel until the time of the crash." The lawsuit was recently removed to the US District Court for the Northern District of California following a filing by Tesla.
Like several previous cases involving fatalities or serious injuries that occurred while a Tesla was operating with Autopilot/FSD active, Mendoza's surviving family argue that, while he was using the system with appropriate caution, he was nonetheless misled into believing Tesla's self-driving technology was more capable than it actually is due to "Tesla's long-term advertising campaign designed to persuade the public that its vehicles were capable of driving themselves."
"Not only was [Giovanni] aware that the technology itself was called 'Autopilot,' he saw, heard, and/or read many of Tesla or Musk's deceptive claims on Twitter, Tesla's official blog, or in the news media," the lawsuit argued. "Giovanni believed those claims were true, and thus believed the 'Autopilot' feature with the 'full self driving' upgrade was safer than a human driver, and could be trusted to safely navigate public highways autonomously."
Third time's the charm?
The argument that Tesla is overblowing Autopilot and FSD's capabilities isn't a new one: Tesla has fought, and is still fighting, so many lawsuits and regulator investigations into the matter that it's difficult to keep count.
One of the cases it's fighting involves the 2019 death of Jeremy Banner, whose Model 3 smashed into a tractor-trailer in cross traffic. That case bears a striking similarity to the 2016 death of Joshua Brown, whose Model S also collided with a tractor-trailer crossing a highway ahead of him. Tesla claimed that it addressed the issue linked to Brown's death, but given the similarities to Banner's death, regulators have been worried the carmaker might not be doing all it can to prevent such deaths.
Mendoza's fatal accident bears similarities to both Brown's and Banner's cases, as well as a prior investigation by the US National Highway Traffic Safety Administration that found Tesla Autopilot tended not to notice emergency vehicles stopped on the side of the road. That investigation resulted in a voluntary recall and over-the-air software update by Tesla that a lawyer for Mendoza's family said was insufficient.
"Genesis Mendoza's death caused by the failure of Tesla's vision system is yet another example of Tesla overstating and overhyping what its technology can do; knowing full well that it was incapable of identifying and responding to an emergency vehicle flashing lights," lawyer Brett Schreiber told The Register in an email.
"Rather than taking the responsible step of recalling these vehicles, Tesla simply pushed an over the air update," Schreiber continued. "This limited bug fix left tens of thousands of vehicles on the road continuing to suffer from the same defect, putting both Mr. Mendoza, members of the public and emergency first responders needlessly at risk.
"The time for Tesla to be held accountable is coming," Schreiber concluded.
Whether that will actually happen remains to be seen, however. Tesla has managed to escape liability in two prior cases that made many of the same arguments that Mendoza's lawyers make in their case, namely that Tesla overhyped Autopilot and FSD's capabilities, fostering overreliance among drivers based on Tesla's safety assurances.
Justine Hsu, who alleged that her Tesla Model S swerved onto a curb in Autopilot mode in 2019, lost her case when a jury decided her vehicle acted as it should have, and that the company had disclosed everything it should have regarding the safety of the system. Several months after the Hsu verdict, Tesla defeated a case that made similar allegations in the fatal Autopilot accident that killed Micah Lee and injured two of his passengers.
Tesla's answer to the Mendoza lawsuit (included in the PDF linked above) has been much the same: Namely, Autopilot and FSD worked as intended, and the accident was instead caused by "the negligence, acts or omissions of another, not Tesla."
Whether the Mendoza case cracks Tesla's legal armor remains to be seen, but there is precedent: While Tesla defeated the two aforementioned Autopilot accident cases last year, it chose to settle a third earlier this year, involving the death of Walter Huang, on the condition that the amount it paid out remained secret.
Tesla said in April that, if the payout became public knowledge, it could be perceived as evidence of Tesla's potential liability, and be devastating to the EV maker if others took up legal arms in response. Both Tesla and Mendoza's lawyers have requested trial by jury in this most recent case.
Neither Tesla nor its lawyers in this case responded to questions. ®