Why Did Google's Self-Driving Car Crash into a Bus?

Google has admitted fault for a crash involving one of its self-driving cars, opening up questions surrounding liability and culpability.

Google has revealed that a self-driving car crashed into a bus on a public road last month, marking the first time one of its cars has been responsible for causing an accident.

A report filed by Google on February 23 with California regulators stated that the autonomous vehicle pulled out in front of a bus traveling at 15 miles per hour. The human test driver in the car reported that he had thought the bus would give way to the car, though Google said it bears "some responsibility" for the crash.

"If our car hadn't moved, there wouldn't have been a collision," Google said in a statement released on Monday. "That said, our test driver believed the bus was going to slow or stop to allow us to merge into the traffic, and that there would be sufficient space to do that."

It was a relatively minor scrape in which nobody was injured, and after more than 1.4 million miles of test driving, Google engineers may see it as a testament to the technology that it has taken this long for one of its cars to cause a collision. However, the incident raises much broader ethical and regulatory questions surrounding autonomous vehicles.

Self-driving cars hold the potential to dramatically reduce road accidents, though analysts have pointed to the recent incident as proof that crashes are unlikely to be eliminated altogether. When collisions do happen, investigators will need the tools to determine why the car crashed and who was at fault.

"The recent crash between a municipal bus and Google's self-driving Lexus RX450h demonstrates that driverless cars are not necessarily 'crash-less' cars," Jonathan Hewett, head of strategy at Octo Telematics, tells Newsweek.

"As autonomous cars are driven by navigational data provided by a series of sensors, any accident will require increasingly sophisticated crash reconstruction, with software and hardware analysis to understand why it occurred."

Once it has been determined which vehicle was at fault, the next question is who bears liability: the car's owner, the car's manufacturer or the software developer who programmed the car.

In 2015, Volvo said it would accept full liability for accidents involving its driverless cars, though if adopted across the industry such a bold policy risks killing off driverless cars altogether through endless lawsuits. Ultimately, the question of liability will need to be resolved if this new era of technology is to survive.