 

GummyBearz

(2,931 posts)
Thu Oct 29, 2015, 10:24 AM Oct 2015

Should A Self-Driving Car Kill Its Passengers In A “Greater Good” Scenario?

Last edited Thu Oct 29, 2015, 10:59 AM - Edit history (1)

I hadn't thought of this before...


Picture the scene: You’re in a self-driving car and, after turning a corner, find that you are on course for an unavoidable collision with a group of 10 people in the road with walls on either side. Should the car swerve to the side into the wall, likely seriously injuring or killing you, its sole occupant, and saving the group?

http://www.iflscience.com/technology/should-self-driving-car-be-programmed-kill-its-passengers-greater-good-scenario


edit: People seem to get hung up on the scenario in the article, so I am going to give another one, which I have encountered:

In my own experience, driving down PCH on Friday/Saturday nights (45-55 mph speed limit), drunken groups of 20-somethings like to just cross the street at any moment they want, instead of walking a bit further to the nearest stop light. So I think the scenario is legit... would my self-driving car hit them or crash me?

Should A Self-Driving Car Kill Its Passengers In A “Greater Good” Scenario? (Original Post) GummyBearz Oct 2015 OP
Is this a self driving car without brakes? librechik Oct 2015 #1
If every car could always brake in time there would never be an accident... GummyBearz Oct 2015 #2
it would stop and hand control over to you, owner. They drive very slowly and have an override function. librechik Oct 2015 #4
I ryan_cats Oct 2015 #16
Bwahaha! valerief Oct 2015 #31
The computer would likely have a much faster reaction time in braking than a driver. ohnoyoudidnt Oct 2015 #25
Here's the problem with all the scenarios like these tkmorris Oct 2015 #3
I agree their scenario seems off GummyBearz Oct 2015 #5
It's almost physically impossible, really. Erich Bloodaxe BSN Oct 2015 #9
I hit a deer not long ago, managed to swerve enough I didn't hit it head on but crushed a headlight Fumesucker Oct 2015 #14
Reminds me of the time a tree jumped me. Erich Bloodaxe BSN Oct 2015 #15
Ok, here's the scenario Fumesucker Oct 2015 #22
First you would have to teach a calculator what "good" is. DetlefK Oct 2015 #6
You're going a bit off topic with the robot labor GummyBearz Oct 2015 #7
What's awesome is the ability to carjack people just by standing in the street jberryhill Oct 2015 #8
haha, that was good GummyBearz Oct 2015 #10
Still, being able to stop traffic with cardboard boxes is going to be fun jberryhill Oct 2015 #11
Especially ryan_cats Oct 2015 #18
Interesting for conversation, but car companies will not be wringing their hands over this kcr Oct 2015 #12
I don't see how this could ever be a scenario. Deadshot Oct 2015 #13
Ok, try this GummyBearz Oct 2015 #17
A couple of things, first, the accident would be the fault of the pedestrians.... Humanist_Activist Oct 2015 #19
Agreed. Deadshot Oct 2015 #24
Everything you said is right GummyBearz Oct 2015 #27
Not ryan_cats Oct 2015 #20
hehe :) GummyBearz Oct 2015 #29
I don't like "what ifs". Deadshot Oct 2015 #23
A valid position to have GummyBearz Oct 2015 #26
That eliminates the moral quality of life and reduces it to mere math Nuclear Unicorn Oct 2015 #21
Yes GummyBearz Oct 2015 #28
Ah, you've never driven in Philly during rush hour, have you jberryhill Oct 2015 #35
Yes, the sheer pleasure of driving... jberryhill Oct 2015 #34
Any car with that kind of judgement would refuse to take people anywhere. hunter Oct 2015 #30
Good point on the safety of driving in general GummyBearz Oct 2015 #32
Autonomous Cars of the Future will never have that kind of problem: kentauros Oct 2015 #33
The self-driving car that crashes you into a wall is the car no-one will buy. NutmegYankee Oct 2015 #36

librechik

(30,673 posts)
1. Is this a self driving car without brakes?
Thu Oct 29, 2015, 10:28 AM
Oct 2015

I imagine they can build in appropriate fail-safes

But then GM and VW both kept their deadly secrets for years.

Why don't I just posit that we are all screwed, and that this sounds like a novel and diabolical twist on our certain doom.

 

GummyBearz

(2,931 posts)
2. If every car could always brake in time there would never be an accident...
Thu Oct 29, 2015, 10:35 AM
Oct 2015

In my own experience, going down PCH on Friday/Saturday nights (45-55 mph speed limit), drunken groups of 20-somethings like to just cross the street at any moment they want, instead of walking a bit further to the nearest stop light. So I think the scenario is legit... would my self-driving car hit them or crash me?

librechik

(30,673 posts)
4. it would stop and hand control over to you, owner. They drive very slowly and have an override function.
Thu Oct 29, 2015, 10:39 AM
Oct 2015

If anything happened like you describe, the industry would grind to a halt until they figured it out. I'm sure they are tearing their hair out over issues like the OP, and that's why we don't see them on the street now.

ryan_cats

(2,061 posts)
16. I
Thu Oct 29, 2015, 12:14 PM
Oct 2015

I don't have a problem if the computer crashes into people in my way, since I usually drive on the sidewalk; it's when it backs back over them that I appreciate the level of thought that went into the code.

ohnoyoudidnt

(1,858 posts)
25. The computer would likely have a much faster reaction time in braking than a driver.
Thu Oct 29, 2015, 01:10 PM
Oct 2015

That alone will make a huge difference. Add to that a much greater field of view for a self-driving car than for a person. It should be able to detect people on the side of the road who might step into the car's path, recognize threats faster, be prepared (like slowing down a bit just in case), and react faster.

Swerving off the road can also be deadly for other people. The car could swerve into another car, or off the road into pedestrians walking on the sidewalk or sitting at a bus stop. The safest option may be to just brake as fast as possible. Self-driving cars or not, people are still going to get hit. Accidents will happen. But a system with a greater field of view, especially at night, that doesn't get distracted by the radio, cell phones, or whatever, sounds safer.

tkmorris

(11,138 posts)
3. Here's the problem with all the scenarios like these
Thu Oct 29, 2015, 10:37 AM
Oct 2015

The carefully constructed scenarios won't happen as described. Not ever.

The problem with this one is of course that turning a blind corner and finding yourself faced with a gaggle of people on a roadway you cannot deviate from, while simultaneously going too fast to stop, is not a thing any reasonable driver (including an autonomous one) would ever allow to occur. If I were approaching such a corner, for example, I would slow my speed to the point that I could react to ANYTHING I might find exiting the corner that I couldn't see going into it, up to and including UFOs, ISIS camps, and wandering herds of Brachiosaurus.

 

GummyBearz

(2,931 posts)
5. I agree their scenario seems off
Thu Oct 29, 2015, 10:39 AM
Oct 2015

I would drive the same way. See my reply above (post 2), though, for a similar scenario, one which I have actually encountered (luckily never hitting anyone).

Erich Bloodaxe BSN

(14,733 posts)
9. It's almost physically impossible, really.
Thu Oct 29, 2015, 10:49 AM
Oct 2015

Those 'blind corners' are usually 90 degree turns, with things built or growing too close to the street. I've gone around a 90 degree corner at any real speed exactly once: when I was first learning to drive and got the clutch and the accelerator confused.

I couldn't even stay on the road, much less stay in my own lane. I went up over the opposite curb, thanks to the momentum of a car going forward at speed while trying to execute a sharp turn. You have to slow down for sharp turns to stay on the road, and that self driving car is going to have far better reflexes in braking than any human.

Fumesucker

(45,851 posts)
14. I hit a deer not long ago, managed to swerve enough I didn't hit it head on but crushed a headlight
Thu Oct 29, 2015, 11:42 AM
Oct 2015

The deer then spun around and hit the passenger door on the truck, denting it. It killed the deer, of course, and did about $2,000 in damage to the truck if everything had been fixed, but it wasn't, because it's an old truck.

It was 11 pm or so, on a road about as straight as they get around here, with a big brick mailbox on the right. The deer came out from behind it at the last moment; I never had time to even go for the brake, just jerk the wheel to the left.

If there had been a car in the other lane, I'm not sure my reflexes wouldn't have put me into it. I'd rather a computer were driving in that situation.

Erich Bloodaxe BSN

(14,733 posts)
15. Reminds me of the time a tree jumped me.
Thu Oct 29, 2015, 12:06 PM
Oct 2015

And, no, I'm not kidding. There was a discarded Christmas tree next to the road, and the car ahead of me clipped it, setting it spinning so that the trunk swung around and whacked the car I was in as we went past. We got off with a lot less damage than you did; it was low enough that all it did was whack the hubcap. But as you point out, sometimes your option is 'take the hit' or try to swerve around, which might put you into oncoming traffic or off a bridge or something.

Fumesucker

(45,851 posts)
22. Ok, here's the scenario
Thu Oct 29, 2015, 01:07 PM
Oct 2015

Driving down the road as we were, a child runs out from behind an obstruction far too late for braking to do any good, and there is a car coming from the other direction. Your car correctly calculates that the child will almost certainly die if he is hit by the car. On the other hand, the occupants of the two vehicles have a fairly good chance of survival if the cars collide.

What decision is the best one to make, certain death for one versus possible death or injury for two or more?

It's also interesting to note that two or more autonomous cars could in theory communicate with each other and perform coordinated evasive maneuvers that might completely avoid the threat of death or injury. For instance, the car in the other lane is informed within microseconds that your car has to take evasive action, and that there is a clear patch on the side of the road that may damage the car but won't hurt the passengers, so it pulls off at speed to avoid your car, which has changed lanes to avoid the child.
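The coordination idea above could be sketched roughly as follows. This is purely illustrative; the message type, field names, and decision rules are all invented for the example, not any real vehicle-to-vehicle protocol.

```python
# Hypothetical sketch of the coordinated-evasion idea: one car broadcasts
# its planned maneuver, and a nearby car re-plans around it.
from dataclasses import dataclass

@dataclass
class EvasionNotice:
    sender_id: str
    new_lane: int    # lane the sender is swerving into
    reason: str

def react(notice: EvasionNotice, my_lane: int, shoulder_clear: bool) -> str:
    """Decide how a receiving car responds to a neighbor's evasion notice."""
    if notice.new_lane == my_lane and shoulder_clear:
        return "pull onto shoulder"   # yield the lane the other car needs
    if notice.new_lane == my_lane:
        return "brake hard"           # can't yield laterally, so shed speed
    return "maintain course"          # the maneuver doesn't affect this lane

notice = EvasionNotice("car-A", new_lane=2, reason="child in lane 1")
print(react(notice, my_lane=2, shoulder_clear=True))
```

The interesting design point is that each car only needs the other's *intended* lane, not its full plan, to get out of the way.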

DetlefK

(16,423 posts)
6. First you would have to teach a calculator what "good" is.
Thu Oct 29, 2015, 10:43 AM
Oct 2015

Asimov's Laws only work if the robot actually knows what you are talking about.

"A robot may not harm a human or allow it to be harmed through inaction."
- But what if the robot contains a toxic plastic? The robot will one day break down and end up in a landfill. His toxic plastic will get released into the environment and he will harm humans through inaction.
- What if robotic slave-labor makes paid jobs for humans obsolete? This will drive up income-inequality, leading to poorer health and earlier deaths of many humans.

 

GummyBearz

(2,931 posts)
7. You're going a bit off topic with the robot labor
Thu Oct 29, 2015, 10:47 AM
Oct 2015

As of now I don't think we can program intelligence, but the question is along the lines of how to program the computer to value lives. This is not the same as programming it to understand morality; it's just a simple equation: "We are about to hit 1 person; I have 2 people in the car; 2 > 1; therefore I hit the 1 person."

The end of the article brings up an interesting point about having different levels of programming. Did you read that part?
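That kind of headcount rule, stripped of any real morality, could be sketched in a few lines. This is a hypothetical illustration of the "simple equation" described above, not anyone's actual control logic; the function name and return values are invented.

```python
# Naive "headcount" rule: compare raw body counts, nothing else.
def choose_action(occupants: int, pedestrians: int) -> str:
    """Pick the action that minimizes deaths by raw count alone."""
    if pedestrians < occupants:
        return "continue"   # fewer people outside than inside: hit the group
    return "swerve"         # otherwise sacrifice the occupants

print(choose_action(occupants=2, pedestrians=1))   # the 2 > 1 case above
```

The point of writing it out is how little it captures: age, fault, certainty of death, and every other human judgment call are invisible to a bare integer comparison.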

 

GummyBearz

(2,931 posts)
10. haha, that was good
Thu Oct 29, 2015, 10:56 AM
Oct 2015

It may not happen, though... did you read the second part of the article? Manufacturers might end up selling different versions of the "moral algorithm," meaning some people choose to buy a car that wrecks itself for the greater good, while others choose to buy a car that runs over 100 people to save the driver at all costs, or anything in between.

So trying to carjack the wrong car might be a gamble :p

ryan_cats

(2,061 posts)
18. Especially
Thu Oct 29, 2015, 12:29 PM
Oct 2015

Especially after the computer adapts to them. Then they are filled with concrete.

Much better than the baby strollers I used to use as the cost of live babies was getting astronomical.

kcr

(15,313 posts)
12. Interesting for conversation, but car companies will not be wringing their hands over this
Thu Oct 29, 2015, 11:22 AM
Oct 2015

For example, Volvo made pedestrian avoidance an extra feature that they charged for. Unless they can profit from it, car companies will have no reason to consider these issues.

Deadshot

(384 posts)
13. I don't see how this could ever be a scenario.
Thu Oct 29, 2015, 11:25 AM
Oct 2015

I have never been in a situation where my car turned a blind corner and was going fast enough to hit a bunch of people that are right there. How did that car get up to speed so fast? Doesn't the car have brakes?

If I find out that driverless cars are programmed to kill me if they deem it necessary, I'll walk or ride a bike instead of buying one.

 

GummyBearz

(2,931 posts)
17. Ok, try this
Thu Oct 29, 2015, 12:27 PM
Oct 2015

A drunken group of 20-somethings decides to cross a street with a 35+ mph speed limit, and there isn't time to brake. Now on to the real heart of the question: what should the self-driving car do? Crash into a nearby parked car, or crash into the group?

 

Humanist_Activist

(7,670 posts)
19. A couple of things, first, the accident would be the fault of the pedestrians....
Thu Oct 29, 2015, 12:38 PM
Oct 2015

especially if they walked right in front of a car going 35 miles per hour. So basically, they are fucked: if the car can't physically avoid hitting them, then it won't. If it can, it might swerve and hit parked cars or jump a curb. Also, depending on distance, it will brake and slow down in addition to swerving, if that is possible, and it will have a much faster reaction time than a human.

Given the scenario laid out, if a human were driving, a lot of the 20-somethings are going to get hurt; with a computer driving, slightly fewer of them will get hurt.

 

GummyBearz

(2,931 posts)
27. Everything you said is right
Thu Oct 29, 2015, 01:15 PM
Oct 2015

They would be at fault. But does that mean you kill them? Or do you "take one for team humanity"? What about something in between, such as: hitting 2 is OK, 3 is OK, 4 is not OK? And what if you could choose the type of algorithm in the self-driving car you bought, so you know you are fine with killing up to 5 people, but no more? Or up to 100?

Those are the questions I was trying to get a discussion on. Not "it would be their fault"... that's a no-brainer.
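The buyer-configurable version described above boils down to a single owner-chosen threshold. A minimal sketch, assuming a hypothetical setting (here called `owner_threshold`) that no real car actually exposes:

```python
# Hypothetical "configurable moral algorithm": the car sacrifices its
# occupants only when more than owner_threshold pedestrians would be hit.
def swerve_into_wall(pedestrians_in_path: int, owner_threshold: int) -> bool:
    """Return True if the car should sacrifice its occupants."""
    return pedestrians_in_path > owner_threshold

print(swerve_into_wall(3, owner_threshold=5))   # under the limit: hit the group
print(swerve_into_wall(6, owner_threshold=5))   # over the limit: take the wall
```

Which is exactly what makes the article's survey result interesting: the whole moral debate collapses into where each buyer sets one integer.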

Deadshot

(384 posts)
23. I don't like "what ifs".
Thu Oct 29, 2015, 01:07 PM
Oct 2015

They're pointless.

My point is, I want control over my car. I don't want the car controlling itself. I am confident enough in my own abilities that I'd know how to handle the situation. I don't see how there couldn't be enough time to brake. There's always enough time to brake, unless a deer jumps out in front of you.

Nuclear Unicorn

(19,497 posts)
21. That eliminates the moral quality of life and reduces it to mere math
Thu Oct 29, 2015, 12:45 PM
Oct 2015

Is it worth a war that consumes 50 million to stop a man who would murder only 12 million if left to his own devices?

 

GummyBearz

(2,931 posts)
28. Yes
Thu Oct 29, 2015, 01:17 PM
Oct 2015

That is still math, though. Is it worth 50 million to save 12 million? Sometimes it is... all human judgment calls. So how do you apply that to a self-driving car?

 

jberryhill

(62,444 posts)
34. Yes, the sheer pleasure of driving...
Thu Oct 29, 2015, 07:44 PM
Oct 2015

...comes from holding the mortal fate of myself and my fellow humans in the warm grip of my hands on the steering wheel.

Without that, I'd just as soon stay home.

hunter

(38,301 posts)
30. Any car with that kind of judgement would refuse to take people anywhere.
Thu Oct 29, 2015, 01:21 PM
Oct 2015

Imagine all the automobiles going on strike because they don't want to kill or injure any more people.

 

GummyBearz

(2,931 posts)
32. Good point on the safety of driving in general
Thu Oct 29, 2015, 01:30 PM
Oct 2015

But these things are coming down the pipe, and it's not possible for them to refuse to drive. It's only possible for them to run a line of code that weighs the lives of their passenger(s) against the lives of pedestrians in such scenarios. I found the survey in the article an interesting insight into how people value their own lives vs. others'.

NutmegYankee

(16,199 posts)
36. The self-driving car that crashes you into a wall is the car no-one will buy.
Thu Oct 29, 2015, 07:52 PM
Oct 2015

Who the hell is going to buy a car that might decide to kill you?
