General Discussion
Should A Self-Driving Car Kill Its Passengers In A “Greater Good” Scenario?
Last edited Thu Oct 29, 2015, 10:59 AM - Edit history (1)
I hadn't thought of this before...
Picture the scene: You're in a self-driving car and, after turning a corner, find that you are on an unavoidable collision course with a group of 10 people in the road, with walls on either side. Should the car swerve to the side into the wall, likely seriously injuring or killing you, its sole occupant, and saving the group?
http://www.iflscience.com/technology/should-self-driving-car-be-programmed-kill-its-passengers-greater-good-scenario
edit: People seem to get hung up on the scenario in the article, so I am going to give another one, which I have encountered:
In my own experience, driving down PCH on Friday/Saturday night (45-55 mph speed limit), drunken groups of 20-somethings like to cross the street whenever they want, instead of walking a bit further to the nearest stop light. So I think the scenario is legit... would my self-driving car hit them or crash me?
librechik
(30,673 posts)
I imagine they can build in appropriate fail-safes
But then GM and VW both kept their deadly secrets for years.
Why don't I just posit that we are all screwed? That sounds like a novel and diabolical twist on our certain doom.
GummyBearz
(2,931 posts)
In my own experience, going down PCH on Friday/Saturday night (45-55 mph speed limit), drunken groups of 20-somethings like to cross the street whenever they want, instead of walking a bit further to the nearest stop light. So I think the scenario is legit... would my self-driving car hit them or crash me?
librechik
(30,673 posts)
If anything happened like you describe, the industry would grind to a halt until they figured it out. I'm sure they are tearing their hair out over issues like the OP, and that's why we don't see them on the street now.
I don't have a problem if the computer crashes into people in my way, as I usually drive on the sidewalk; it's when it backs up over them that I appreciate the level of thought that went into the code.
valerief
(53,235 posts)
ohnoyoudidnt
(1,858 posts)
That alone will make a huge difference. Add to that a much greater field of view for a self-driving car than for a person. It should be able to detect people moving about on the side of the road who might move into the path of the car, recognize threats faster, be prepared (like slowing down a bit just in case), and react faster.
Swerving off the road can also be deadly for other people. The car could swerve into another car, or off the road into pedestrians walking on the sidewalk or sitting at a bus stop. The safest option may be to just brake as hard as possible. Self-driving cars or not, people are still going to get hit. Accidents will happen. A system designed with a greater field of view, especially at night, that doesn't get distracted by the radio, cell phones, or whatever, sounds safer.
tkmorris
(11,138 posts)
The carefully constructed scenarios won't happen as described. Not ever.
The problem with this one is, of course, that turning a blind corner and finding yourself faced with a gaggle of people on a roadway you cannot deviate from, while simultaneously going too fast to stop, is not a thing any reasonable driver (including an autonomous one) would ever allow to occur. If I were approaching such a corner, for example, I would slow my speed to the point that I could react to ANYTHING I might find exiting the corner that I couldn't see going into it, up to and including UFOs, ISIS camps, and wandering herds of Brachiosaurus.
GummyBearz
(2,931 posts)
I would drive the same way. See my reply above (post 2), though, for a similar experiment, one which I have encountered (luckily never hitting anyone).
Erich Bloodaxe BSN
(14,733 posts)
Those 'blind corners' are usually 90-degree turns, with things built or growing too close to the street. I've gone around a 90-degree corner at any real speed exactly once: when I was first trying to learn to drive and got the clutch and the accelerator confused.
I couldn't even stay on the road, much less in my own lane. I went up over the opposite curb, thanks to the momentum of a car going forward at speed while trying to execute a sharp turn. You have to slow down for sharp turns to stay on the road, and a self-driving car is going to have far better braking reflexes than any human.
Fumesucker
(45,851 posts)
The deer then spun around and hit the passenger door of the truck and dented it. It killed the deer, of course, and did about $2000 damage to the truck if everything had been fixed, but it wasn't, because it's an old truck.
It was 11 pm or so, on a road about as straight as they get around here, with a big brick mailbox on the right. The deer came out from behind it at the last moment; I never had time to even go for the brake, just jerked the wheel to the left.
If there had been a car in the other lane, I'm not sure my reflexes wouldn't have put me into it. I'd rather a computer were driving in that situation.
Erich Bloodaxe BSN
(14,733 posts)
And, no, I'm not kidding. There was a discarded Christmas tree next to the road, and the car ahead of me clipped it, setting it spinning around so that the trunk whacked the car I was in as we went past. We got off with a lot less damage than you did; it was low enough that all it did was whack the hubcap. But as you point out, sometimes your option is 'take the hit' or try to swerve around, which might put you into oncoming traffic or off a bridge or something.
Fumesucker
(45,851 posts)
Driving down the road as we were, a child runs out from behind an obstruction far too late for braking to do any good. There is a car coming from the other direction, and your car correctly calculates that the child will almost certainly die if he is hit. On the other hand, the occupants of the two vehicles have a fairly good chance of survival if the cars collide.
What decision is the best one to make, certain death for one versus possible death or injury for two or more?
It's also interesting to note that two or more autonomous cars could, in theory, communicate with each other and perform coordinated evasive maneuvers that might completely avoid the threat of death or injury. For instance, the car in the other lane is informed within microseconds that your car has to take evasive action and that there is a clear patch on the side of the road that may damage the car but won't hurt the passengers, so it pulls off at speed to avoid your car, which has changed lanes to avoid the child.
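The coordination idea described above can be sketched in a few lines of Python. Everything here is invented for illustration: the class names, the message format, and the lane labels are all assumptions, and real vehicle-to-vehicle systems are far more involved than a toy like this.

```python
# Toy sketch of coordinated evasion between two autonomous cars.
# All names here are hypothetical; this is not any real V2V protocol.

from dataclasses import dataclass


@dataclass
class EvasionRequest:
    sender: str        # ID of the car that must swerve
    target_lane: str   # lane it intends to move into


class Car:
    def __init__(self, car_id, lane, shoulder_clear=False):
        self.car_id = car_id
        self.lane = lane
        self.shoulder_clear = shoulder_clear
        self.action = "continue"

    def broadcast_evasion(self, peers, target_lane):
        """Announce an emergency lane change and ask peers to make room."""
        self.action = f"swerve_to_{target_lane}"
        self.lane = target_lane
        for peer in peers:
            peer.receive(EvasionRequest(self.car_id, target_lane))

    def receive(self, msg: EvasionRequest):
        # If the swerving car is entering our lane, yield: pull onto the
        # shoulder if it is clear, otherwise brake hard.
        if msg.target_lane == self.lane:
            self.action = (
                "pull_onto_shoulder" if self.shoulder_clear else "hard_brake"
            )


# Usage: a child steps out in front of car A; car B is oncoming with a
# clear patch on its side of the road.
a = Car("A", lane="northbound")
b = Car("B", lane="southbound", shoulder_clear=True)
a.broadcast_evasion([b], target_lane="southbound")
print(a.action, b.action)  # swerve_to_southbound pull_onto_shoulder
```

The point of the sketch is only that the decision is distributed: car A never needs to know whether B's shoulder is clear, because B decides its own yield maneuver.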
DetlefK
(16,423 posts)
Asimov's Laws only work if the robot actually knows what you are talking about.
"A robot may not harm a human or allow it to be harmed through inaction."
- But what if the robot contains a toxic plastic? The robot will one day break down and end up in a landfill. Its toxic plastic will get released into the environment, and so it will harm humans through inaction.
- What if robotic slave labor makes paid jobs for humans obsolete? That will drive up income inequality, leading to poorer health and earlier deaths for many humans.
GummyBearz
(2,931 posts)
As of now I don't think we can program intelligence, but the question is along the lines of how to program the computer to value lives. This is not the same as programming it to understand morality; it's just a simple equation: "We are about to hit 1 person, I have 2 people in the car, 2 > 1, therefore I hit the 1 person."
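That "simple equation" can be written down literally. A minimal Python sketch, assuming the car can reliably count occupants and pedestrians (a big assumption in itself):

```python
# Crude utilitarian comparison, exactly as described in the thread:
# whichever choice costs fewer lives wins. A toy, not a real policy.

def choose_action(occupants: int, pedestrians_in_path: int) -> str:
    """Return 'swerve' (sacrifice occupants) or 'stay' (hit pedestrians)."""
    if pedestrians_in_path > occupants:
        return "swerve"  # fewer lives lost by sacrificing the car
    return "stay"        # fewer (or equal) lives lost by staying on course


print(choose_action(occupants=2, pedestrians_in_path=1))   # stay
print(choose_action(occupants=1, pedestrians_in_path=10))  # swerve
```

The arithmetic is trivial; everything hard lives in producing those two integers with any confidence at highway speed.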
The end of the article brings up an interesting point about having different levels of programming. Did you read that part?
jberryhill
(62,444 posts)
GummyBearz
(2,931 posts)
It may not happen, though... did you read the second part of the article? Manufacturers might end up selling different versions of the "moral algorithm," meaning some people choose to buy a car that wrecks itself to save the greater good, while others choose to buy a car that runs over 100 people to save the driver at all costs, or anything in between those options.
So trying to carjack the wrong car might be a gamble :p
jberryhill
(62,444 posts)
ryan_cats
(2,061 posts)
Especially after the computer adapts to them. Then they are filled with concrete.
Much better than the baby strollers I used to use as the cost of live babies was getting astronomical.
kcr
(15,313 posts)
For example, Volvo made pedestrian avoidance an extra feature that they charged for. If there is no way they can profit from it, there will be no reason for them to consider these issues.
Deadshot
(384 posts)
I have never been in a situation where my car turned a blind corner and was going fast enough to hit a bunch of people who are right there. How did that car get up to speed so fast? Doesn't the car have brakes?
If I find out that driverless cars are programmed to kill me if they deem it necessary, I'll walk or ride a bike instead of buying one.
GummyBearz
(2,931 posts)
A drunken group of 20-something-year-olds decides to cross a street with a 35+ mph speed limit and there isn't time to brake. Now on to the real heart of the question: what should the self-driving car do? Crash into a nearby parked car, or crash into the group?
Humanist_Activist
(7,670 posts)
especially if they walked right in front of a car going 35 miles per hour. So basically, they are fucked; if the car can't physically avoid hitting them, then it won't. If it can, it might swerve and hit parked cars or jump a curb. Also, depending on distance, it will brake and slow down in addition to swerving, if that is possible, and will have a much faster reaction time than a human.
Given the scenario laid out, if a human were driving, a lot of the 20-somethings are going to get hurt; with a computer driving, slightly fewer of them will get hurt.
Deadshot
(384 posts)
GummyBearz
(2,931 posts)
They would be at fault. Does that mean you kill them? Or "take one for team humanity"? What about something in between, such as: hitting 2 is OK, 3 is OK, 4 is not OK... And what if you could choose the type of algorithm in the self-driving car you bought, so you know you are fine with killing up to 5 people, but no more? Or up to 100 people...
Those are the questions I was trying to get a discussion on. Not "it would be their fault"... that's a no-brainer.
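The buyer-selectable threshold being debated here could look something like the following toy Python sketch. The class name and the threshold semantics are entirely hypothetical; no manufacturer offers anything like this.

```python
# Hypothetical configurable "moral algorithm": the owner picks a
# threshold N, and the car only sacrifices its occupants when more
# than N pedestrians would otherwise be hit.

class MoralAlgorithm:
    def __init__(self, sacrifice_threshold: int):
        # 0 -> always swerve to spare pedestrians;
        # a very large value -> effectively never sacrifice the occupants
        self.sacrifice_threshold = sacrifice_threshold

    def decide(self, pedestrians_in_path: int) -> str:
        if pedestrians_in_path > self.sacrifice_threshold:
            return "swerve"  # sacrifice the occupants
        return "stay"


altruist = MoralAlgorithm(sacrifice_threshold=0)
self_preserving = MoralAlgorithm(sacrifice_threshold=100)
print(altruist.decide(1))          # swerve
print(self_preserving.decide(99))  # stay
```

Which exposes the uncomfortable part of the proposal: the moral weight is reduced to a single integer chosen at the dealership.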
Not enough information. Are they hipsters?
GummyBearz
(2,931 posts)
Hipster recognition would be a good feature.
Deadshot
(384 posts)
They're pointless.
My point is, I want control over my car. I don't want the car controlling itself. I am confident enough in my own abilities that I'd know how to handle the situation. I don't see how there couldn't be enough time to brake. There's always enough time to brake, unless a deer jumps out in front of you.
GummyBearz
(2,931 posts)
I just wanted to discuss the thought experiments in the article.
Nuclear Unicorn
(19,497 posts)
Is it worth a war that consumes 50 million to stop a man who would murder only 12 million if left to his own devices?
GummyBearz
(2,931 posts)
That is still math, though. Is it worth 50 million to save 12 million? Sometimes it is... these are all human judgement calls. So how do you apply that to a self-driving car?
jberryhill
(62,444 posts)
jberryhill
(62,444 posts)
...comes from holding the mortal fate of myself and my fellow humans in the warm grip of my hands on the steering wheel.
Without that, I'd just as soon stay home.
hunter
(38,301 posts)
Imagine all the automobiles going on strike because they don't want to kill or injure any more people.
GummyBearz
(2,931 posts)
But these things are coming down the pipe, and it's not possible for them to refuse to drive. It's only possible for them to run a line of code that weighs the lives of their passenger(s) against the lives of pedestrians in such scenarios. Based on the survey in the article, I found it an interesting insight into how people value their own lives vs. others'.
kentauros
(29,414 posts)
NutmegYankee
(16,199 posts)
Who the hell is going to buy a car that might decide to kill you?