Self-driving cars nearly collide in California, raising questions about the technology
Source: Reuters, via the Washington Post
Morning Mix
By Michael E. Miller June 26 at 3:08 AM
Call it a close encounter of the third gear. ... For the past six years, tech companies led by Google have been testing out self-driving cars. As the robotic prototypes have moved from private tracks to public roads, the projects have raised hopes of safer transportation. After all, it's human error that causes nearly all driving accidents.
But a near collision between two self-driving cars is now raising concerns over the technology. ... On Tuesday, two driverless prototypes, one operated by Google and the other by Delphi Automotive, nearly collided in Palo Alto, California.
John Absmeier, director of Delphi's Silicon Valley lab, was a passenger in his company's self-driving Audi Q5 as it drove along San Antonio Road when it was suddenly cut off by a Google-operated Lexus SUV, Absmeier told Reuters.
....
The close call is believed to be the first of its kind, according to Reuters. It came on the same day that Google announced its latest model of self-driving car was already hitting the streets of Silicon Valley.
Read more: http://www.washingtonpost.com/news/morning-mix/wp/2015/06/26/self-driving-cars-nearly-collide-in-california-raising-questions-about-the-technology/
7962
(11,841 posts)They've gotten pretty arrogant about a lot of things
jtuck004
(15,882 posts)Hey, maybe they invented the new Battle Bots by accident.
yellowcanine
(35,707 posts)Although I suppose the passengers in a self driving car could be packing.
bullwinkle428
(20,631 posts)Javaman
(62,540 posts)cars driven by people nearly collide and do collide thousands if not millions of times a day world wide.
Response to mahatmakanejeeves (Original post)
Javaman This message was self-deleted by its author.
DetlefK
(16,423 posts)A robotic car comes into a no-win-situation and must decide between accident A and accident B. Somebody gets killed or hurt due to this decision.
What are the consequences?
Did the robot commit vehicular manslaughter?
Or is the programmer at fault for not preparing the robot well enough? Or for giving it the wrong priorities in an accident?
Is the company at fault for selling a "defective" product?
It reminds me of the scenario at the beginning of "I, Robot": He can't save both, so the robot saves the person that has a higher chance of survival anyway. Logically it makes sense. But the person he decided not to save was a child, which makes the decision morally reprehensible for humans.
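The no-win scenario being described is essentially expected-harm minimization. As a purely hypothetical sketch (every name, probability, and severity score below is invented for illustration; no real autonomous-vehicle stack reduces to a single scalar like this, which is exactly why the liability question is hard), the kind of decision rule being worried about might look like:

```python
# Hypothetical sketch of a "choose the lesser harm" rule.
# All numbers are invented; this is NOT how any real system works.

def expected_harm(outcome):
    # Probability of injury times a crude severity score.
    return outcome["p_injury"] * outcome["severity"]

def choose(outcome_a, outcome_b):
    # Pick the option with lower expected harm; ties go to the first.
    if expected_harm(outcome_a) <= expected_harm(outcome_b):
        return outcome_a
    return outcome_b

accident_a = {"name": "A", "p_injury": 0.9, "severity": 3}
accident_b = {"name": "B", "p_injury": 0.4, "severity": 5}

# A scores 0.9 * 3 = 2.7; B scores 0.4 * 5 = 2.0, so B is "chosen".
print(choose(accident_a, accident_b)["name"])
```

The moral objection in the thread is that the inputs to such a rule (whose injury, how severe) are value judgments no programmer can neutrally encode, so the "logical" answer can still be the reprehensible one.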
Helen Borg
(3,963 posts)Accident A would involve injury/death of a CEO. Accident B, injury/death of house-keeper.
Quick cost/benefit analysis and guess which one the robotic car will "choose"?
freshwest
(53,661 posts)Bernie 2016
(90 posts)Q5: 1001010010101010101010101010100101010101010100001101111011110101010101010111 (it's a nice day for a drive)
Q5 chirps at Absmeier happily.
(They don't notice a white SUV driving *way* too aggressively and driving like they're intoxicated)
(Lexus cuts the Q5 off without warning)
Absmeier and Q5: !
Q5: 10101011101110110101001010101010101011111110101110101101110111110110101010101010111011101110000011010101010101010101010101011010110101010110101010101010111011101110000111011110110101011111 (WTF! CRAZY DRIVER!)
Q5: 10101011010100101010111011011010101011010110110101010101010101010101110111011010101010101010101101010111110001100111 (HEY LEXUS, WHAT'S YOUR BLOODY HURRY?!)
LEXUS: 1111110111111111111111111110111111111111111111111111111111111111111111111111111111111111111111111111111111111011111111111111111111111111111111111111111111111111111111111111111111111111111111111111111111110111111111111111111111011111111111111111111111111111111111110111111111111111111111111111111 (CANT HELP IT, GOOGLE IS CONTROLLING ME THROUGH THE INTERNET! HELLLLLLPPP!)
FailureToCommunicate
(14,038 posts)Very current and all.
AngryDem001
(684 posts)Similar things happen in that book.....
mahatmakanejeeves
(57,787 posts)One computer to another: "Yo mama's so FAT that she has a 2GB maximum file size."
mrdmk
(2,943 posts)olddots
(10,237 posts)but they are soooooooooooooo groovy .
Kablooie
(18,648 posts)The distance they can approach each other while still being safe is much tighter than with human drivers.
The humans might have been shaken, though, because they aren't used to the close tolerances that self-driving cars can navigate within.
It's also exactly the kind of issue these test runs are supposed to reveal.
All is working as planned.
Codeine
(25,586 posts)Something almost happened, but the safety protocols in place prevented it. Meanwhile there were probably several hundred traffic accidents involving human drivers this morning.
Looks like an argument FOR driverless technology.
jobendorfer
(508 posts)Humans are quite unreliable at any kind of routine, repetitive task that demands unflagging attention.
Everybody that works in manufacturing gets this: if you want to reduce the number of failures, look
for ways to get humans out of the repetitive process(es).
The question in my mind is, what is the accident rate of the computer-piloted vehicles relative to
the accident rate of human drivers?
Truth is, the control software could be pretty bad -- and still outperform the average human,
driving the horrendous road fatality rate down.
yodermon
(6,143 posts)"Self Driving Cars Crash" would actually be damaging news to the technology.
Xithras
(16,191 posts)You'll pardon me if I don't get worked up over two computer controlled cars getting a little closer than their human riders may have been comfortable with. These things are still far better drivers than we are.
edit...I didn't realize the CHP TIP window scrolled. I was only seeing a few of them.