05/27/2021 / By Franz Walker
The question of who is to blame when a fully autonomous vehicle crashes is holding up legislation that the auto industry says it needs to advance.
In a typical crash involving only human-operated vehicles, assigning blame is far more straightforward. The vehicle or vehicles involved have human drivers, after all.
“If another driver hits you, it’s clear who the driver is,” Sarah Rooney, senior director of federal and regulatory affairs for the American Association for Justice, said. “It’s the human being.”
But when an autonomous, self-driving car is involved, who is to blame is much less clear-cut. Instead of the driver, the fault may lie with the manufacturer and its software. But it may also lie with the vehicle’s owner if they haven’t properly updated that software. In addition, should the manufacturer be at fault, a victim may seek to sue under product liability standards, as with a conventional car.
Fully autonomous, self-driving vehicles are currently still in the testing stage. But these liability questions have held up legislation that would allow automakers to test and sell tens of thousands of them – something the industry says it needs to develop and eventually market the technology to consumers.
One bill to do that sailed through the House several years ago, but it bogged down in the Senate over questions of liability.
Earlier this month, a move to merge the bill with must-pass legislation faltered over an initiative by some automakers to include language meant to prevent consumers from suing or forming class-action cases. Instead, consumers would have to submit disputes to binding arbitration. While this isn’t usually the case with automobiles, it is more common with technology products.
But the move was pulled on the eve of a committee vote after facing stiff resistance from safety groups and trial lawyers, both influential among Senate Democrats. Supporters have said that they’re working to address the liability issues in the hopes of moving the legislation forward this year.
These liability issues aren’t just theoretical. They’ve been highlighted over the years by a number of high-profile incidents and crashes involving self-driving vehicles.
In 2018, a self-driving SUV being tested by Uber was involved in a crash in Tempe, Arizona that killed a woman who was jaywalking. During the investigation into the crash, there were attempts to shift the blame away from the vehicle and its operator – Uber’s test vehicles ran with a human supervisor onboard – and onto the victim, claiming that she had stepped suddenly into the vehicle’s path. But video released by the police later showed that the woman was more than halfway across the road, walking slowly alongside her bicycle.
A National Transportation Safety Board (NTSB) investigation later found that the system design for the vehicle – a modified Volvo SUV that Uber was using to test the technology – didn’t “include a consideration for jaywalking pedestrians.”
The fatal crash amplified calls for regulations on the testing of self-driving vehicles.
“We need smart, strong safety rules in place for self-driving cars to reach their life-saving potential,” Ethan Douglas, senior policy analyst for Consumer Reports, stated to The Associated Press.
Meanwhile, Tesla has seen a handful of crashes involving vehicles equipped with its Autopilot – a system that provides some automation but falls far short of the capabilities needed for full self-driving.
In these cases, Tesla has pointed out that Autopilot isn’t a fully autonomous system and that it still requires drivers to pay attention at all times and keep their hands on the steering wheel.
But many have pointed out that the way Tesla promotes its technology, branding it “Autopilot” and “Full Self-Driving,” could mislead consumers, with fatal results. The most recent of these was a fatal May 5 crash involving a Tesla Model 3 driver in Fontana, California. Video from what appears to be the driver’s TikTok account shows him praising the automaker’s “full self-driving” features and driving with his hands off the wheel. (Related: Video proves that Tesla autopilot can get you killed.)
Tesla has since faced a number of lawsuits, both in the U.S. and abroad, over crashes involving its Autopilot. Last year, a German court ruled that the company had misled consumers about the capabilities of its automated driving systems. As a result, Tesla’s German business unit was banned from using the phrases “full potential for autonomous driving” and “autopilot inclusive” in its marketing materials.
While celebrating the victory, a lawyer for the Center for Protection Against Unfair Competition – the non-profit that filed the lawsuit – lamented the lack of legislation for autonomous driving.
“A legal framework for autonomous inner-city driving doesn’t even exist yet in Germany,” said Andreas Ottofuelling, one of the lawyers representing the group, in a press statement. “And other functions aren’t working yet as advertised.”
Follow RoboCars.news for more on the issues plaguing self-driving vehicles.