General Discussion
If an autonomous vehicle carrying a passenger (the owner) is going to crash, what should it hit?
Here's the scenario. There's an autonomous vehicle with a passenger. The passenger is the owner. The vehicle can "see" what's around it and make decisions. Most of these decisions are ordinary like accelerate, brake, turn, signal, etc. But the vehicle can also make safety decisions to avoid collisions.
Let's say the vehicle detects an imminent collision. Its choices are to:
Hit a building and kill the driver
Hit a senior citizen, but save the driver
Hit a child, but save the driver
What should it choose? Why? Does fault play into any of these? Is it the driver's fault they chose an autonomous vehicle? Is it the pedestrian's fault that they crossed the road at the wrong time? What if it is none of their faults and another vehicle caused the situation? What if the passenger is a pregnant woman?
7 votes, 1 pass | Time left: Unlimited
Hit the building: 7 (100%)
Hit the senior citizen: 0 (0%)
Hit the child: 0 (0%)
1 DU member did not wish to select any of the options provided.
Disclaimer: This is an Internet poll

bottomofthehill (8,983 posts)

Polly Hennessey (7,668 posts)

Renew Deal (83,526 posts)
And it is a real-world AI ethics question.
Polly Hennessey (7,668 posts)
Now I am going to be a mess for the rest of the night.
Lucid Dreamer (589 posts)

C_U_L8R (46,204 posts)
They'll say they try to design it to not hit anything.
Renew Deal (83,526 posts)
Generally, they are more flawed than the machines, but that's beside the point.
DBoon (23,457 posts)
as every sentient monster destroys its creator
Takket (22,862 posts)
Does your car have the duty to kill you?
I've seen these ethics quizzes before. They are tough to answer. You would hope the answer lies in the technology being robust enough to avoid having to make the choice in the first place.
That being said, you have to weigh everything. The goal should be the most lives saved possible, and logic says a person in a car is far more likely to survive hitting a building than a pedestrian hit by a speeding car.
Renew Deal (83,526 posts)
We have to assume that the car is not speeding, since it doesn't have the motivation to speed the way humans do. But I get your point.
BlueIdaho (13,582 posts)

The Revolution (808 posts)
We should start to codify questions like this into law now. If it is up to individual companies to decide, then an AI that will preserve the life of the owner and their family in all situations might be a selling point.
ret5hd (21,320 posts)
in some extreme circumstances to save others' lives
that the technology was good enough at eliminating overall risk that the chances of a driver dying in an accident were cut by, I dunno, say 10%?
What about 20%?
50%?
At some point, (reasonable?) people should agree that the trade-off was worth it, right?
At some point, OVERALL risk should come into the equation.
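To make that trade-off concrete, here is a back-of-the-envelope sketch in Python. Every number in it is invented for illustration, not taken from real crash statistics; the point is only that a car which very occasionally sacrifices its occupant can still leave that occupant safer on net, if autonomy cuts ordinary crash risk enough.

```python
# Hypothetical illustration only: all risk figures below are invented.

BASELINE_RISK = 1.0e-4    # assumed annual chance a human driver dies in a crash
SACRIFICE_RISK = 1.0e-6   # assumed annual chance the AV sacrifices its occupant

def net_occupant_risk(risk_cut: float) -> float:
    """Occupant's annual death risk in an AV that cuts ordinary crash risk
    by `risk_cut` (0.10 = 10%) but adds a small self-sacrifice risk."""
    return BASELINE_RISK * (1.0 - risk_cut) + SACRIFICE_RISK

for cut in (0.10, 0.20, 0.50):
    net = net_occupant_risk(cut)
    verdict = "safer" if net < BASELINE_RISK else "riskier"
    print(f"{cut:.0%} risk cut: {net:.2e} vs baseline {BASELINE_RISK:.2e} ({verdict})")
```

With these made-up numbers, even a 10% cut in ordinary risk leaves the occupant better off despite the added self-sacrifice risk.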
Renew Deal (83,526 posts)
Would someone buy a car that will sacrifice the passengers to save others? I would guess no. The manufacturers have a duty to provide safe products, but isn't their first duty to their customers?
I'm not convinced that laws will come up with better answers.
Hassin Bin Sober (26,877 posts)
It's the only fair thing to do.
ret5hd (21,320 posts)

Hassin Bin Sober (26,877 posts)
That was the first movie my friends and I rented on the brand new Betamax.
unblock (54,540 posts)
The life of the driver counts no more or less than the others.
*The driver* ought to choose self-sacrifice, if they were making the decision.
But the *AI system* is not sacrificing itself to save anyone.
The *AI system* has no basis in that moment to identify with the driver and self-sacrifice to save the others. To the *AI system*, they're *all* others.
It has no basis on which to choose to kill one to save the others.
I have my cheeky answer though, which is what I think Elon Musk would do:
The AI system should hit the building, killing the driver, but record data showing the human driver disengaged autopilot and took over seconds before the crash and selflessly veered away from the pedestrians, killing himself and dying a hero.
ChazII (6,335 posts)

Corgigal (9,298 posts)
He said it will hit whatever the software engineer decided to hit.
I paused, and never asked about it again.
ForgedCrank (2,528 posts)
this earlier when I read the question, and it opens up an entire new realm of liability.
Any of those families would have a valid wrongful-death case against the company that approved the programming.
This is certainly a catch-22, if it ever came to this being an actual decision to make. And I can see none of those being chosen as an option, with the AI making an attempt to avoid all three, even if it failed to do so.
ForgedCrank (2,528 posts)
choose my own life over someone else's if a choice were given.
Any life I had after that would be worthless and full of guilty misery if I did.
madinmaryland (65,275 posts)
They are:
1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.
2. A robot must obey the orders given it by human beings except where such orders would conflict with the First Law.
So none of those options apply.
EarlG (22,734 posts)
I think? Is there any system in a car which is designed to protect the life of a pedestrian over the life of the driver, in any circumstance?
It seems counterintuitive, because I think we'd like to imagine that if we were in such a situation, we would drive ourselves into the building rather than hit the child. But at the same time, I can't imagine that people would feel comfortable buying a car that might choose to murder them if its algorithms determined that a pedestrian's life was more valuable at any given moment.
Maybe they should just make it an option in the car's settings: "In the event of a potentially fatal collision, protect me," or "In the event of a potentially fatal collision, protect pedestrians."
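As an illustration of that suggestion, here is a minimal Python sketch of what such a setting might look like. The enum, field names, and planner hook are all hypothetical; no real vehicle exposes this interface.

```python
# Hypothetical sketch of an owner-selectable collision-priority setting.
# Every name here is invented for illustration.
from enum import Enum

class CollisionPriority(Enum):
    PROTECT_OCCUPANTS = "protect_occupants"
    PROTECT_PEDESTRIANS = "protect_pedestrians"

def choose_maneuver(maneuvers: list, priority: CollisionPriority) -> dict:
    """Pick the maneuver that minimizes harm to whichever party the
    owner's setting prioritizes, breaking ties in favor of the other.
    Each maneuver carries estimated 'occupant_risk' and 'pedestrian_risk'
    scores in [0, 1]."""
    if priority is CollisionPriority.PROTECT_OCCUPANTS:
        key = lambda m: (m["occupant_risk"], m["pedestrian_risk"])
    else:
        key = lambda m: (m["pedestrian_risk"], m["occupant_risk"])
    return min(maneuvers, key=key)

# Example: with PROTECT_PEDESTRIANS set, the planner accepts higher
# occupant risk to lower pedestrian risk.
options = [
    {"name": "hit building", "occupant_risk": 0.9, "pedestrian_risk": 0.0},
    {"name": "stay course", "occupant_risk": 0.1, "pedestrian_risk": 0.8},
]
print(choose_maneuver(options, CollisionPriority.PROTECT_PEDESTRIANS)["name"])
# -> hit building
```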
roamer65 (37,390 posts)
It's not human, anyway.
milestogo (19,367 posts)

WarGamer (16,321 posts)
Don't leave the roadway.
The programming never has the option to leave the "proper path" to avoid a collision. It will apply brakes.
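Read literally, that policy is simple to express. Below is a minimal Python sketch of a brake-in-lane rule; the function names and the rough 7 m/s² hard-braking figure are assumptions for illustration, not any vendor's actual planner.

```python
# Minimal sketch of a "brake in lane, never swerve off the path" policy.
# Names and the deceleration figure are assumptions for illustration.

def stopping_distance_m(speed_mps: float, decel_mps2: float = 7.0) -> float:
    """Distance needed to stop from speed_mps at constant deceleration
    (roughly 7 m/s^2 for hard braking on dry pavement)."""
    return speed_mps ** 2 / (2.0 * decel_mps2)

def plan_emergency_response(obstacle_distance_m: float, speed_mps: float) -> str:
    """Respond to an obstacle in the lane. The car may brake, hard or
    otherwise, but leaving the mapped path is never an option."""
    if obstacle_distance_m >= stopping_distance_m(speed_mps):
        return "brake_normal"   # enough room to stop fully in lane
    return "brake_maximum"      # can't fully stop; shed as much speed as possible

# Example: at 20 m/s (~45 mph), stopping takes about 29 m, so an
# obstacle 15 m ahead triggers maximum braking while staying in lane.
print(plan_emergency_response(obstacle_distance_m=15.0, speed_mps=20.0))
# -> brake_maximum
```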
flvegan (64,761 posts)
With all of the safety tech, the automatic result of the car making a decision is instant death? I mean, what, is this car self-driving at 200 mph? Then chooses to hit a building without slowing down?
Maybe two sets of Brembos on all 4 wheels and better programming, or a non-idiot pilot still in control of the vehicle.
This is a poll for idiots (and salivating tort lawyers). Congrats!
Diablo del sol (424 posts)
Hit the nearest Tesla.
Disaffected (5,345 posts)
unless that can be done without colliding with something off the road (or in the oncoming lane). So, if someone suddenly walks or runs out into the street in front of the oncoming car, swerve and brake as much as possible, but don't leave the road if the car would hit the building (or an oncoming vehicle). If that results in colliding with the pedestrian, it was the pedestrian's fault in any case, and the driver/passengers should not die instead.
sarcasmo
This message was self-deleted by its author.
rownesheck (2,343 posts)
open the door and dive out and roll along the ground like I've seen in movies and TV shows? That's what I would do. Then I'd hope the empty car hits the building.
PTWB (4,131 posts)
Is the vehicle being operated (by computer, in this case) in a safe and prudent manner? Are the pedestrians jaywalking or committing some sort of violation that places them at risk? What's behind the brick wall, and is there a significant risk of additional death or injury to people on the other side?
Renew Deal (83,526 posts)
A mistake is possible, like failing to detect wet ground or sand, but unlike people, the computer is incapable of purposefully behaving improperly.
Demovictory9 (34,322 posts)

lagomorph777 (30,613 posts)
The car should be moving slowly in a highly congested environment.
edhopper (35,373 posts)
the Trolley Problem.
Lucid Dreamer (589 posts)
A driver should be accountable at all times.
I've spent thousands of hours flying airplanes that were on "autopilot" but I was still the one responsible for the safe completion of the flight.
Totally autonomous operation is appropriate in warehouse environments, for example. But turning auto-driving cars loose on the public roads is foolhardy in my opinion.
Your mileage may vary.
[Yes. I am aware I didn't answer the question asked. I'm pissed that the question even needs to be considered.]
LetsGoBiden (58 posts)
Self-destruct so it doesn't hurt anybody but the person who owns it. Self-responsibility. Now I won't be buying one until after Trump does, lol. Can we get Hannity and him on a golf trip? Anybody say golf cart accident?
PTWB (4,131 posts)
If the only choices are to hit the pedestrian or kill the occupants of the vehicle?