The rapid advance of self-driving, or autonomous, cars reached an important milestone last month. For the first time, Google officials publicly acknowledged that one of the company's autonomous vehicles was at least partly to blame for causing a wreck.
The incident between a Google self-driving Lexus RX450h and a municipal bus illustrates how our state and federal laws need to catch up with the truly dazzling technological advances that make these cars possible. Among the concerns for our civil justice system to address: just who will be liable for damage and injuries when one of these vehicles malfunctions? In fact, Google and other proponents of self-driving cars are actively lobbying Congress to address this issue and replace the patchwork of state laws regulating liability and other issues.
Until this Feb. 23 incident near the Google headquarters in Mountain View, California, Google officials had claimed a perfect safety record for their autonomous vehicles. While Transportation Secretary Anthony Foxx and NHTSA administrator Mark Rosekind recently boasted that autonomous driving technology could one day reduce the transportation fatality rate to zero, fundamental sticking points remain, including vulnerabilities to cyberattacks and the ability to operate seamlessly in bad weather.
In the most recent case, Google reported the crash occurred when the Google car sought to get around some sandbags obstructing a road. The vehicle was traveling at less than 2 miles per hour, while an approaching bus was moving at about 15 miles per hour.
The vehicle and the test driver “believed the bus would slow or allow the Google (autonomous vehicle) to continue,” according to Google. But three seconds later, as the car in autonomous mode re-entered the center of the lane, it struck the side of the bus, causing damage to the left front fender, front wheel and a driver side sensor. No one was injured.
As Google reported: “we clearly bear some responsibility, because if our car hadn’t moved, there wouldn’t have been a collision. That said, our test driver believed the bus was going to slow or stop to allow us to merge into the traffic, and that there would be sufficient space to do that.”
The company also said it has reviewed this incident “and thousands of variations on it in our simulator in detail and made refinements to our software. From now on, our cars will more deeply understand that buses (and other large vehicles) are less likely to yield to us than other types of vehicles, and we hope to handle situations like this more gracefully in the future.”
This incident raises the question: are today’s product liability laws ready for self-driving cars?