A Google driverless car. People who actually drive cars are idiots, and they keep slamming into the driverless ones.
According to Bloomberg, the "problem" is that driverless cars follow the rules.
"They obey the law all the time, as in, without exception. This may sound like the right way to program a robot to drive a car, but good luck trying to merge onto a chaotic, jam-packed highway with traffic flying along well above the speed limit. It tends not to work out well.
As the accidents have piled up -- all minor scrape-ups for now -- the arguments among programmers at places like Google Inc and Carnegie Mellon University are heating up: Should they teach the cars how to commit infractions from time to time to stay out of trouble?"
The first driverless car crash with injuries came in July, when three Google employees in a driverless car suffered relatively minor neck injuries after another driver rear-ended them, says CBS News. The driver of the car that rear-ended the driverless vehicle also suffered minor injuries.
The accident rate for driverless cars is twice as high as it is for regular cars, according to the University of Michigan's Transportation Research Institute, as reported by Michigan Public Radio.
However, so far, none of the crashes studied were the "fault" of the driverless cars. Most have been caused by inattentive drivers slamming into the backs of driverless cars.
The driverless cars tend to obey the speed limit. A huge proportion of people who drive DON'T obey the speed limit. (Yeah, I'm talkin' to you!)
Add in inattentiveness -- let's text while we're driving since everybody else is doing it! -- and people end up rear-ending the slower driverless cars.
As a Google engineer put it, driving is a social game, not just a matter of plugging in the exact parameters of what is legal on the roadways and what is not.
Should Google and anyone else testing self-driving cars program them to break the rules every once in a while, just to make things on the road flow more smoothly? And if so, to what extent? Limit it to things like crossing a double yellow line to make room for a bicyclist? Or speeding a little bit on the freeway to keep up with the flow of traffic?
Where's the line between breaking the rules to make things go better and becoming a robot scofflaw?
And what of that "social game"? Will road-ragers become even more enraged when a robot car does not respond emotionally to some jerk yelling and screaming about something? Will driverless cars call the police to report their suspicions that the car ahead is being driven by a drunk?
You can also imagine the debate over whether to program driverless cars to make "moral" choices, even if it means saving others from their own stupidity.
For example, Bloomberg's reporting asks, "should an autonomous vehicle sacrifice its occupant by swerving off a cliff to avoid killing a school bus full of children?"
Driverless cars will ultimately be programmed differently, so different models will "behave" differently in traffic.
That raises the question, posited by another Bloomberg article: Who would be at fault if two driverless vehicles collided?
After all, says Bloomberg, computers routinely crash. Won't robot cars crash, too?
Lawyers are loving this, as you can imagine. Yes, driverless cars are autonomous robots. But they would be made by humans. Who make mistakes. Who might program a car incorrectly.
Says Bloomberg:
"With no one behind the wheel, lawyers say they can go after almost anyone even remotely involved.
'You're going to get a whole host of new defendants,' said Kevin Dean, who is suing General Motors Co over its faulty ignition switches and Takata Corp over air bag failures. 'Computer programmers, computer companies, designers of algorithms, Google, mapping companies, even states. It's going to be very fertile ground for lawyers.'"
Well, at least lawyers can look forward to full employment, since robots and computers are going to take all the rest of the jobs from us humans.
All these driverless car questions lead to the largest question of all: Will the machines just take over? And another question: Will they do a better job? Because sometimes I think us humans have really mucked things up.