Ethical Standards for Self-Driving Car Testing Are Still in Their Beta Stage

Tesla Autopilot Under Investigation
Tesla's Autopilot function is under investigation after several crashes into parked emergency vehicles were reported. Tesla employees work outside a Tesla showroom in Burbank, California, March 24, 2020. Robyn Beck/Getty Images

For most auto manufacturers and technology companies, testing of self-driving or driverless vehicles is being done on a relatively small scale. Engineers and software developers for major automakers have been working for over a decade on ways to increase the effectiveness of these systems before bringing them, in stages, to market.

Tesla has taken a different approach. For years, the electric vehicle (EV) company has been offering customers access to a "Full Self Driving" beta software, allowing Tesla owners to effectively become guinea pigs for the company's tech. Over the last year, videos posted to social media have shown the technology failing in several cases.

The company is now under investigation by the National Highway Traffic Safety Administration (NHTSA) for several crashes where the Autopilot system was engaged.

Dr. Bryan Reimer, a research scientist at the MIT Center for Transportation, leads a team that explores the intersection of human behavior and automated driving features in production and future vehicles.

In a recent interview, he told Newsweek that the push towards autonomous driving is a balancing act between what autonomous vehicles can do versus what a driver is capable of doing in conjunction with that technology. The goal is to take some of the more routine driving tasks out of the hands of the driver.

"Humans in their very nature, without appropriate support, become over-reliant on automation and often use it beyond the designer's perception of the system," he explained.

To combat that aspect of human nature, Reimer says that autonomous systems should be seen as a collaborative part of driving in helping drivers make moment-to-moment decisions.

Autonomous Vehicle Ride
In this April 7, 2021 file photo, a Waymo minivan moves along a city street as an empty driver's seat and a moving steering wheel drive passengers during an autonomous vehicle ride in Chandler, Ariz. Ross D. Franklin/AP Photo

He points to Tesla's Autopilot and Full Self Driving products. Recent studies show that drivers using those technologies have become less attentive on average.

Reimer concedes that not even the best engineering team in the world can account for every variable, like stationary emergency vehicles, but after repeated, similar incidents ethical questions should be raised.

"Once we understand a situation it becomes foreseeable misuse," Reimer argues. "We can foresee, at this point, the inability of Autopilot to detect stationary emergency vehicles on the side of the road."

To prevent this type of harm from becoming standardized, Reimer favors a more careful approach to autonomous testing.

"The standard, to me, needs to be continual process improvement," he said. "It needs to be third-party validated scientific data."

Dr. Nicholas Evans agrees. As a professor of philosophy at the University of Massachusetts Lowell, he studies ethical questions surrounding emerging technologies and has been involved in research on autonomous driving.

He and Reimer agree that the uneven nature of autonomous vehicle testing is partly due to the absence of a relevant regulatory environment.

"If an autonomous vehicle was a drug, then we'd know exactly how to test it," Evans said in an interview.

Both researchers are in favor of a regulatory body, similar to the Food and Drug Administration (FDA), that can oversee the development of these technologies and ensure that they're being safely tested and used. This FDA-style body would not only analyze the hard data but also set ethical standards for testing.

Evans said that incidents that lead to injuries or deaths in unregulated technology development are setbacks for companies and industries as a whole. The benefits of having more oversight outweigh any reductions of speed in testing cycles.

"One thing the automotive industry knows really well is what happens when you don't respect consumer safety," Evans argued. "The people that I talk to from the automotive industry remember the Ford Pinto. They remember the Takata airbag situation. When these things happen, the automotive industry...loses many billions of dollars."

When it comes to the ethical questions surrounding testing itself, Evans said that this kind of regulation is needed for anything marketed as having social value.

"They're marketed as interventions," he asserted. "Tesla doesn't just say 'having an autonomous vehicle is going to be cool for you.' Tesla says that having an autonomous vehicle on the road is good for everyone because it'll make us safer and more efficient."

One of the most prevalent questions is whether the public has a right to know when autonomous vehicles are being tested in their area.

According to Dr. Heidi Furey, a philosophy professor at Manhattan College who has done research on the ethical implications of autonomous vehicles, whether or not automakers should tell the public if testing is happening near them is a gray area.

"It's really difficult for new and emerging technologies because it's not always clear that the public can totally understand what the technologies are," she said. "They're really subject to what philosophers would call the fallacy of risk, where people tend to overemphasize risks that are emotionally salient and underestimate risks that are more mundane."

That presents a difficulty in achieving informed consent for testing in a geographic area. Evans says that community engagement and education are the best way to approach the issue, but that slapping warning stickers on test cars can create adverse outcomes, like observer effects that can disrupt testing.

"I think that the public has a right to have these things tested like that," he said. "But they don't necessarily need to know which cars are the test cars on any given day."

During development and testing, Furey says that engineers would do well to consider the trolley problem, a classic thought experiment designed to explore ethical dilemmas. If you were in control of a runaway train, for example, would you hit five people on a track or switch tracks and just hit one person?

Sooner or later, autonomous vehicles will have to make a choice between options that are imperfect, Furey stresses.

"We have to decide how we are going to make the best of a bad situation," she said. "Do we just care about the number of lives? Do we care about ages or if they're following the law or not?"

She says that while the trolley dilemma is a good starting point for highlighting the "ethically sticky" parts of a situation, it needs to lead to broader conversations about moral assumptions and the mundane parts of driving decisions.

For Reimer, these conversations between public and private entities about testing and ethics are lacking. An absence of policy guidance from the federal government leaves a vacuum in which suboptimal testing can thrive.

"We need to be having much more difficult conversations than anybody's comfortable having," he said.

He thinks that a lack of involvement from the government, combined with an equal lack of transparency from industry, may lead to a series of regulations that could hinder the development of autonomous technology.

"Demonstrating that you can walk in small environments before you're allowed to run on public roads is huge," Reimer added

But those conversations, according to Evans, obscure the bigger ethical picture. While conversations currently focus on getting testing right, there should be more dialogue about the ethical implications of who is affected by more self-driving vehicles on the road and the unintended consequences.

Furey is most worried that ethical concerns won't be seriously considered until the novelty of autonomous vehicles wears off, when it may be too late for the public to care. There's emotional investment now because it's a new technology, but that fades once autonomous vehicles are an everyday fact of life.

"That's going to be the biggest challenge," she said. "When this stops seeming new and interesting, how do we still do the work that we need to do ethically?"