Wednesday was supposed to be the triumphant launch of a free driverless shuttle in downtown Las Vegas. Designed by the French company Navya, operated by another French company called Keolis, and sponsored by the city and the American Automobile Association, the year-long pilot project was meant to demonstrate how autonomous vehicles could transform the transit industry.
Instead, within hours, the project was greeted with the worst possible headline: “Self-driving bus crashes on first day.”
That headline, and plenty of others like it, was technically accurate but somewhat misleading. Officials say the other vehicle, a semi truck that backed into the bus, was at fault. And it was such a low-speed collision that nobody was hurt. The crash “only dented the front plastic panels of the vehicle,” according to Jeff Zurschmeide of Digital Trends.
So the impression left by many of the headlines, that the bus malfunctioned and caused a serious crash, is simply wrong.
What's less clear is whether the bus could have done more to prevent the crash.
Defensive driving
Zurschmeide happened to be on the bus at the time of the crash and explained what happened:
However, the shuttle did exactly what it was programmed to do, and that's the problem. The self-driving program didn't account for the vehicle in front unexpectedly backing up. We had about 20 feet of empty street behind us (I looked), and most human drivers would have thrown the car into reverse and used some of that space to get away from the truck. Or at least leaned on the horn and made our presence harder to miss. The shuttle didn't have those responses in its programming.
But Keolis, the company that operates the shuttle, told reporter Pete Bigelow otherwise:
2. The shuttle is capable of reversing direction. It didn't in this case because there were cars stopped behind it. It's a special case, says Chris Barker, VP of new mobility at Keolis.
— Pete Bigelow (@PeterCbigelow) November 9, 2017
A police report, due out next week, may help clear things up.
Either way, the obvious lesson here is that self-driving vehicles need to be able to do more than just avoid causing crashes. They also need to be programmed to take the same common-sense steps a human driver would take to prevent crashes, even crashes that are technically the other driver's fault.
A 2015 study that looked at the first 1.2 million miles of self-driving car testing found that the vehicles (which were mostly Google cars, though Audi and Delphi also had cars on the road) seemed to get into accidents at a higher rate than conventional human-driven cars. Those accidents were mostly minor fender-benders, and all of them were the fault of the other vehicles. Even a serious crash that flipped one of Uber's self-driving research vehicles onto its side earlier this year was attributed to driver error in the other vehicle.
One possible explanation for those 2015 findings is that human drivers often fail to report minor accidents of this kind. But another possibility is that these early self-driving cars drove in ways human drivers found confusing or unpredictable, like braking for a stoplight earlier than a human driver would. If self-driving cars are never legally at fault in accidents but still drive in ways that lead other drivers to hit them at an elevated rate, that's a problem in its own right.
The good news is that accidents like these rarely, if ever, hurt anyone. So while this is certainly something Navya and other driverless vehicle companies should work on, it's not cause for alarm. It shouldn't be that hard to teach a driverless vehicle to notice an impending collision and take action, such as backing up or pulling forward, to prevent it. Someone at these companies just needs to do the work.
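To make that idea concrete, here is a minimal sketch in Python of the kind of rule being described. Everything in it is hypothetical: the names (`Perception`, `EvasiveAction`, `choose_evasive_action`) and the thresholds (a four-second time-to-impact cutoff, five meters of clearance) are invented for illustration and reflect nothing about Navya's or Keolis's actual software.

```python
from dataclasses import dataclass
from enum import Enum
from typing import Optional

class EvasiveAction(Enum):
    HOLD_AND_HONK = "hold position and sound horn"
    REVERSE = "back up"
    PULL_FORWARD = "pull forward"

@dataclass
class Perception:
    # Hypothetical sensor summary; a real system would fuse lidar,
    # radar, and camera data into something like this.
    threat_closing_speed_mps: float  # how fast the other vehicle approaches
    threat_distance_m: float         # gap to the approaching vehicle
    rear_clearance_m: float          # free road behind the shuttle
    front_clearance_m: float         # free road ahead of the shuttle

def choose_evasive_action(p: Perception) -> Optional[EvasiveAction]:
    """Return an evasive maneuver if a collision looks imminent, else None."""
    if p.threat_closing_speed_mps <= 0:
        return None  # the other vehicle isn't approaching; nothing to do
    seconds_to_impact = p.threat_distance_m / p.threat_closing_speed_mps
    if seconds_to_impact > 4.0:  # arbitrary illustrative threshold
        return None  # not yet imminent; keep monitoring
    # Prefer moving away from the threat when there's room to do so.
    if p.rear_clearance_m > 5.0:
        return EvasiveAction.REVERSE
    if p.front_clearance_m > 5.0:
        return EvasiveAction.PULL_FORWARD
    return EvasiveAction.HOLD_AND_HONK  # boxed in: stay put and warn

# The Las Vegas scenario as Zurschmeide described it: a truck creeping
# backward toward the shuttle, with roughly 20 feet (~6 m) of open road behind.
print(choose_evasive_action(Perception(
    threat_closing_speed_mps=1.0,
    threat_distance_m=3.0,
    rear_clearance_m=6.0,
    front_clearance_m=0.5,
)))  # -> EvasiveAction.REVERSE
```

Of course, a real system would also have to confirm that no pedestrian or vehicle had moved into the space behind it before reversing, which may be exactly the judgment call Keolis says the shuttle made when it held its position.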