Tesla has been making headlines recently after the automaker won its first U.S. trial over a fatal crash alleged to have been caused by its Autopilot feature.
Two passengers injured in the 2019 crash filed suit in California state court, accusing Tesla of knowingly selling vehicles with a defective Autopilot feature, as CNBC reported. Tesla countered that human error, not a technological failure, caused the deadly crash.
The gruesome crash killed the driver when the vehicle hit a tree and burst into flames. Two other passengers, including the driver's eight-year-old son, were seriously injured.
One of Tesla's popular but contentious offerings is Autopilot, which allows the vehicle to make certain adjustments to keep the car and its riders safe. Autopilot, the most basic of Tesla's driver-support systems, offers only lane correction and traffic-aware cruise control. Neither of these is close to fully autonomous; similar features are available on many other vehicles currently on the market.
Tesla denied liability in the crash, stating that the driver had consumed alcohol before driving and that there was no way to prove the Autopilot feature was engaged when the crash occurred.
As autonomous vehicles and assisted driving technology continue to expand in the automotive market, both drivers and vehicle manufacturers are assuming a lot of risk. The technology is new, and terminology can be confusing.
With lofty and speculative headlines about fully autonomous vehicles, drivers may not understand the difference between "autopilot" or assisted-driving support and fully autonomous driving, in which the driver does not need to take the wheel at all (literally and figuratively). As this case shows, that confusion can be deadly.
The case was further complicated when the plaintiffs argued that Autopilot was poorly designed, while the jury was asked to judge only whether there was a manufacturing defect. Matthew Wansley, former counsel at an automated driving startup, said of the case, "If I were a juror, I would find this confusing."
The verdict could set a precedent protecting automotive manufacturers from assuming the blame for these types of crashes. It places the fault on the driver, who may have misunderstood and misused the technology, which Tesla argues is not its fault.
One commenter on Reddit added that "too much confidence in the tech is part of the problem, humans disengage... and that is where the problems start."