The Real Problem With Self-Driving Cars: They Actually Follow Traffic Laws


In the century or so that people have been driving, two different sets of rules have developed: the official laws that we’re supposed to obey, and the unofficial code of the road that bends and often breaks those laws. For example, we all know — whether we like it or not — that many highway drivers are going to exceed the speed limit. But what happens when you introduce self-driving cars that are designed to always follow the rules and don’t understand why other drivers are extending middle fingers in their direction?

That seems like it would be a good thing, and it would be if all vehicles on the road were autonomous ones that communicated with each other and followed the same rules. The problem is that they will still have to share the road with human drivers.

New York state is about to allow autonomous cars to begin testing on its public roads and highways, adding another area to the territory the vehicles can cover, and yet another set of unspoken traffic rules that engineers will have to teach their vehicles.

Companies testing the vehicles will have to file the routes their cars will travel with the state, but it’s other drivers who pose the real problem.

“There’s an endless list of these cases where we as humans know the context, we know when to bend the rules and when to break the rules,” a Carnegie Mellon University professor in charge of autonomous car research told the Associated Press.

The artificial intelligence systems that run self-driving cars, however, don’t understand these subtleties the way most humans can. They would need to learn the special traffic quirks of every area they travel through. For instance, your city might have a “no right turn on red” rule with no signs indicating the restriction; everyone knows about it, yet only about half of drivers obey it.

“It’s hard to program in human stupidity or someone who really tries to game the technology,” a spokesman for Toyota’s autonomous driving unit told the AP.

When we hear about autonomous vehicles colliding with human-piloted ones, the cause is usually a human error that the software didn’t account for, like when a Google-programmed car didn’t recognize that a human driver had run a red light.

We’re already getting a glimpse of what a future traffic system that includes both robot cars and human jerks could look like. As cars already on sale to the public gain more autonomous driving features, rule-bending drivers are taking advantage.

In a few documented cases, motorists have intentionally pulled in front of vehicles that are known to have automatic braking, like cars from Tesla Motors.

Limited tests of autonomous cars are happening now, sure, but experts think that we have at least a decade and a half before cars can safely drive themselves among humans. It might take even longer to safely operate the vehicles in cities with especially chaotic traffic, like Beijing.
