For self-driving cars to become a reality, they will have to overcome their biggest problem: human shortcomings.
Minor car accidents do not usually make global headlines. But a minibus colliding with a truck in Las Vegas last week is a completely different story.
The self-driving bus involved in last week's accident in Las Vegas.
What sets this accident apart is that the minibus had no driver at all. Instead, it carries an array of sensors and microprocessors that allow it to recognize its surroundings and ferry passengers without a human operator. The crash was all the more conspicuous because it happened just an hour after the bus officially entered service: a worse "launch" for a self-driving vehicle is hard to imagine.
Except for one thing: the accident was not the bus's fault. The driver of the cargo truck backed up without seeing the bus and hit it. Fortunately, no one was hurt, and the bus actually behaved as designed: it stopped the moment it sensed the truck reversing toward it. The truck driver's carelessness, however, meant it got hit anyway.
This is the latest in a series of self-driving car accidents, and they all have one thing in common: the fault lay with the other vehicle. Earlier this year, an Uber self-driving car being tested in Arizona flipped over while passing through a yellow light, after a human-driven car tried to cross the intersection and crashed into it. Most of the accidents involving Google's self-driving cars were likewise not their fault.
These incidents would seem to strengthen the argument for self-driving cars: robots are clearly better drivers than humans. We should put more of them on the road and accelerate their rollout.
However, things are not that simple. Statistics show that self-driving cars are more likely to be involved in accidents, even when those accidents are not their fault. A 2015 study by the University of Michigan Transportation Research Institute found that self-driving vehicles were involved in an average of 9.1 accidents per million miles (about 1.6 million km) driven, compared with 4.1 for human-driven vehicles.
Uber's self-driving car was in an accident while passing through a yellow light earlier this year.
This seems contradictory. Self-driving cars collide more often, yet most of those collisions are not their fault. If they are involved in more accidents, how can they be safer?
The most logical answer is that self-driving cars expose how bad human drivers are. Self-driving cars are programmed never to speed, to yield to other vehicles whenever possible, and generally to comply with every traffic regulation: in short, to be perfect drivers. We are not.
People who have ridden in self-driving cars can attest to this: they drive flawlessly, never cut off another vehicle, and will absolutely never run a red light. If someone steps in front of the car, it stops instantly, with reflexes no human can match.
But this creates problems for the rest of us. We are used to interacting with other human drivers, predicting their lapses and intentions. In the Vegas incident, the truck driver did not anticipate how the self-driving bus would behave, and the bus did not predict his carelessness.
In the Uber crash earlier this year, the human driver broke traffic laws by cutting across two lanes; the drivers in both of those lanes saw it coming and held back, but the self-driving car did not.
The tech industry has a phrase for this kind of problem: "You're holding it wrong," rooted in Steve Jobs's famous excuse when customers complained that their iPhone 4s were dropping calls. It has become the catchphrase for blaming users for technical faults. Self-driving cars face a similar "you're holding it wrong" problem: the technology can work perfectly, but people don't.
Waymo's self-driving cars will be designed to drive "more like humans."
When the streets are entirely free of human drivers, we will be much safer (human error is a factor in 90% of accidents), but what about the period between now and then?
Self-driving cars will not arrive like the flip of a switch. There will be a transition period, possibly lasting decades, before they completely replace human drivers.
Besides accidents like this one, other problems will arise from vehicles that are too compliant with speed limits, or simply "too polite": in 2015, one of Google's self-driving cars was pulled over by police for driving too slowly. Public opinion of self-driving cars, already lukewarm, could worsen because of their absolute loyalty to the rules.
Technology companies are working hard to counter this. Cars being tested by Waymo, the self-driving division spun off from Google last year, will now drive more "aggressively."
This is a good example of how technologists must understand and adapt to the imperfect world their technology inhabits. For self-driving cars to become a reality, they will have to overcome their biggest problem: human shortcomings.