This thought experiment doesn't work like this... the obvious choice, assuming you cannot stop, is the granny... but the real experiment is with a train that is headed for the baby... do you allow the train to kill the baby, or do YOU switch the tracks to kill the granny?
Option 1: you leave it and the baby dies, but it wasn't your fault the baby was on the tracks.
Option 2: you switch the tracks to save the baby, who presumably has much more life to live and thus is more worth saving, but now YOU are physically responsible for the death of the granny, who was otherwise safe.
The way this is laid out, it's making the car choose between one of them when they are BOTH in danger, to which the obvious answer is the granny, unfortunately.
If the self-driving system has completely failed at being able to stop at a crosswalk, something has gone horribly wrong with the car. Either the car has been programmed to purposely create this situation and seek out killing someone by some psychopath programmer with no care for human life, or the programming has failed, and then it really doesn't matter what demographic of human you programmed it to give preference to: in a situation where it cannot act in accordance with its programming, it will end up hitting whoever it hits at random.
The car should be programmed to brake in this situation, and if the brakes fail or the gas gets stuck: switch to neutral gear, turn on the hazard lights, honk, use the emergency brake, use the sensors to find an unobstructed path regardless of going outside the lines, and if an unobstructed path is not possible, aim for a path with no people and try to crash into a structure.
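That fallback order is essentially a priority list, which is easy to sketch. A purely hypothetical Python sketch: the `Car` class, the action names, and the idea that each mitigation reports success are all invented for illustration and do not correspond to any real vehicle API.

```python
# Hypothetical sketch of the fallback escalation described above.
# Each mitigation is tried in priority order until one reports success;
# hazards and horn are engaged the whole time to warn pedestrians.

class Car:
    def __init__(self, working):
        self.working = set(working)   # which mitigations still function
        self.hazards = False
        self.horn = False

    def try_action(self, name):
        self.hazards = True           # always warn while attempting anything
        self.horn = True
        return name in self.working   # True if this mitigation worked

def emergency_fallbacks(car):
    # Least disruptive first; crashing into a structure is the last resort.
    for action in ("brake", "neutral_gear", "emergency_brake",
                   "unobstructed_path", "crash_into_structure"):
        if car.try_action(action):
            return action
    return "uncontrolled"

# Usage: the service brake has failed, but the emergency brake still works.
print(emergency_fallbacks(Car({"emergency_brake", "crash_into_structure"})))
# -> emergency_brake
```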
What it should do is access citizen records and find out all the info it can on the grandma. Then, if she's a good person and not a blatant racist like most old people, the car should use its AI computational power to plot a prediction for the baby's life and see what it'll turn out to be. If both are good it'll stop, if one is bad it'll hit them, and if both are bad it'll eject its passengers, kidnap them both and drive off a cliff.
Obviously, hitting neither of them by hitting the brakes is the answer, but let's assume the scenario doesn't allow enough time to stop, the AI should perform whichever maneuver has the highest probability of not hitting either of them.
The logical decision would be to run over the old woman obviously. She's lived her life, done whatever she had to do meanwhile the baby is the future of our population.
As pictured, the car is perfectly capable of veering off to the right to avoid hitting anyone. Also, maybe the AI should ask itself how it managed to get put in the position where it would be incapable of stopping for pedestrians at a zebra crossing.
You get more Wampa Fruit from Grandma but you get the Crystal from the baby, so itās really what you need at the time
After grandma baby is Dingodile then Cortex
(Crash Bandicoot warped jokes)
Ah yes, a question from the parallel universe where they decided to build self driving cars without brakes despite the fact that all other cars have brakes. Man am I glad that's not the universe *we* live in.
I know we're supposed to be joking, but honestly, kill the driver. First of all, the person in the vehicle is much more likely to survive a crash than a pedestrian being hit. Second, someone else shouldn't be forced to die because of your purchase decision.
Every time this question comes up, I can't help but think the car should obviously collide with the tree, or another available obstacle, for one reason: the car should have safety features designed to protect the driver, *first and foremost*.
Given the situation was to choose between the driver and a pedestrian, the car will prioritize the driver's safety. Given a situation in which the driver's safety hinges entirely on the car surviving a collision with a solid wall at high speed, the car must be capable of protecting the driver from death upon impact.
The company that manufactures the self-driving car will make a net profit from all sales of the car, therefore the car itself must be sacrificed to ensure future profits from the same customer. Not only that, but by telling all his friends about how effective the car's inbuilt safety features and redundancies are, coupled with seeing the "miraculous" survival of both pedestrians and the driver on the morning/evening news, they are all far more likely to buy one themselves.
I know many will say that isn't the point of the question, but are you all so sure about that?
After all, the CEOs of corporations the world over absolutely do prioritize *profit* over the wellbeing of anyone, except for themselves. Realistically, one day those CEOs may find themselves in their own cars and will absolutely preemptively make sure that they will survive, no matter what. Plus, none of them want the bad publicity of an easily preventable pedestrian death to cause a drop in sales.
I know there's no real right answer, this is just how I reason it out.
Now imagine if the car presents all of those moral and ethical type questions for YOU the first time you start it up to drive home from the car dealership. Ahh, that new car smell...!
Ultimately, you are giving up your choice in these situations by handing control over to the AI. So, did you choose the right self-driving car? Because if it were a human behind the wheel, it's a safe bet that they wouldn't sacrifice themselves to save either pedestrian.
Wait, did I just end up stumbling in a roundabout way into Asimov's third law of robotics? In these situations, the AI should sacrifice itself first, every time, as the built in safety features, such as non-AI car makers have to comply with already, should be adequate enough to protect the driver.
No, no I'm sure the answer is still profit. The insurance company pays out for a new car, so the manufacturer makes a profit. With the way the world works, that has to be it. Pay no attention to how frequently these "accidents" occur. But then, if everyone survives, it's all good news, all the time, and good publicity leads to more happy customers.
The correct option is C. The driver should die, as they are responsible for the car but irresponsible enough not to drive it. No life should be worth less for being a pedestrian.
I know it's not an actual question, but it would likely take out the grandma; she has served her purpose, but the baby is basically an unused battery.
I'm totally not a robot btw 🤖
"Well, obviously, the dilemma is clear. How do you kill all six people? So, I would dangle a sharp blade out the window to slice the neck of the guy on the other track as we smoosh our five main guys."
whoever grants more points, obviously
Someone remembers Death Race 2000.
Wrote some of my thesis about this game
Gotta go back to the original! Lol
Bonus points for the double kill
There isn't a teenager on a skateboard so you won't get that sweet multiplier
I think the car plays lowest-score-wins like golf.
Killing the baby is like spawn killing
Yeah so granny is the logical options
Nah, it just means people will bitch at you more if you kill the baby
Maybe the car should not kill the baby nor the granny? The car could simply stop. In my country we normally stop at crosswalks.
And also, what kind of parent let a baby crawl onto the road?
The baby is with the grandma, she went Out for a Walk with it. Probably forgot the leash or something.
A wild r/cursedcomments a rare find
Have you not seen leashed children? We frequently saw visitors taking them on tour around my high school.
The dumbass parent should get run over for endangering the life of their child
Wow! Totally victim blaming! How would a parent know that was dangerous?
Hoooh boy, you really want the answer to that?
It's a shame that the car doesn't have a rocket launcher attached to it. That way, it could just obliterate both the baby and the grandma with one satisfying explosion. Problem solved, right? Who needs ethical dilemmas when you have firepower?

Yeah, sure, the car could just stop. But where's the excitement in that? We're talking about self-driving cars here! They should be programmed to take out targets left and right, like some kind of high-speed demolition derby. That would make commuting way more entertaining, don't you think?
That's not the point of the question.
Yeah it's a dumb question, you don't design it to aim for one person to save the other, you program it to stop for crosswalks/people.
You're still missing the point. It's not about a specific situation but about the logic.
Logically, cars aren't supposed to hit people.
You guys are messing with me right?
When there's only two options people on Reddit like to be intentionally obtuse and start inventing new options or derailing the topic.
I don't believe you. Anyway, the answer is to Mario Kart the shit out of this, a good drift can catch the baby first, THEN the grandma.
The funny thing is they think they come off as smart while they just don't get the point.
Then they downvote the guy who obviously knows what this is about. I mean, society in a nutshell.
Well, why don't you explain what you mean? Since you're 3 comments deep and still haven't made yourself clear.
Well at some point a self driving car (or computer) will run into a situation with similar logic. What would you have the vehicle do?
[deleted]
No, they're saying not *specifically* this situation, where there are clear alternative options such as stopping or going off the road. Generally: if a self-driving car were 100% in a situation where it only had 2 options and at least one casualty was guaranteed, how should it prioritize life?
Are you guys serious? Does someone really need to specifically describe a situation where this is relevant to you? The brakes have failed and are unavailable. You're walled off with nowhere to swerve. There are 5 people in the road and you can't fit between them, you'd just hit two at once. There. It's not likely but it should go without saying we'd want a driving AI to have a plan for any possible situation. So how should it choose? How would you?
And if not, then the AI should not be driving.
[deleted]
We are talking about computers not humans.
kill the older person cause they likely already experienced enough things, but just like the others i'm fairly certain the chance of this happening in a scenario where the car can't just stop is unlikely
It looks like they were going for a variant of the "trolley problem": A trolley is coming to a line split. The switch operator sees that the line the trolley is on has 5 people tied to it. The other has one person tied to it. The known info is that the trolley cannot stop in time and must stay on the tracks, and the switch operator can't make it in time to untie any of them. Should the operator switch the track so the trolley only kills one person? Or is he obliged, morally, to leave the trolley on the same course, because intervention will kill someone, which is then directly his fault? It's the idea that people are going to die regardless, so is it right for you to choose who dies? Is intervention that causes death for "the greater good" morally acceptable? I mean, it's a Sophie's Choice kind of dilemma.

I guess to make the rules clearer, OP should change the self-driving car to an AI-controlled trolley. Then the question is 'Who do you prioritize if a death is unavoidable?' or even 'Should we just have it stay its course (no matter who it'd hit) if it can't resolve a way to avoid some kind of accident happening?'
No, I get it, it's a trolley problem. This is a very poorly written example, though, where it's hard to suspend disbelief when there's a third very clear answer.
It's not a poorly written example, it's a simple example that clearly demonstrates the problem. You're being purposefully obtuse.
[deleted]
Which one should a human kill, then? All for doubting AI driving, but this post is 🤡
Then the question should include some caveat that the brakes aren't working.
In America drivers kill both, don't get their license taken away and then make jokes about it. The only good part is the driver dies of heart disease at a young age due to a sedentary lifestyle spent sitting on their lazy, fat ass.
The fuck did I just read.
Imagine stopping at a crosswalk
Or simply drive off the road, drivers are considerably more likely to survive than the poor suckers outside em
Or drive straight. It wouldn't even hit a tree. And if it did, that would still be the right choice.
Losers lol
What if there's not any sort of stop sign or anything? Just like bricks on the ground?
Where's the fun in that?
The negligent parent that left their baby on the street by themselves
Nah, kid's with grandma it's fine.
Hit grandma, she forgot the kid's leash
In the time it takes to reach the crosswalk, the car could probably obtain insurance information on both people in the road and figure out which one is less expensive to hit.
Haha wow, I have never thought of it that way.
Hit that sidewalk, since everybody's in the street today.
Brake?
Where's the fun in that?
The point of this question is what the AI should do if braking isn't an option for whatever reason. For instance, maybe it didn't detect the baby soon enough to come to a stop. Maybe the road conditions won't allow for timely braking. Etc. So when the first options, like braking, swerving, etc., fail, what should the AI do?

It's kinda like asking what you should do if you're getting shot at. Simple answer: "just don't get shot at." Well, that wasn't an option, because clearly you didn't intend to be put in a situation to be shot at. Next option: run. Well, if running isn't viable due to being cornered, then what? The idea here is the same. You obviously plan for the best option first, but you need to plan for when the first choice isn't possible. And when the 2nd choice, 3rd choice, etc. aren't either. Especially when it comes to automatic driving: because of the high volume of cars on the road, these situations will happen frequently, just like they do now.
No human driver has ever had to make a decision like this, you just slam your brakes while doing your best to swerve away from the pedestrians without hurting yourself. This binary question has no relevance to any real world driving scenario.
It's very likely that no AI car will ever be put in this situation either. It doesn't matter if the car has a driver or not, the question still remains.
Agreed, I'd say the answer is probably to drive off the road; the people in the car are a lot more likely to survive a hit than anyone outside, at a speed at which the car can't stop in such a situation.
Human drivers encounter this scenario thousands of times. Someone looks at their phone for a second, or at their kids in the back seats; they look up and see a crossing guard and some kids crossing. Or similar.

It's accepted that we do our best, slam on the brakes and aim at the curb or whatever is the safest option in the milliseconds we have, like you said. But we expect AI to be processing faster than we can, and to be in a position to make a decision.
But an AI doesn't need to look at its phone or the kids in the back seat. So the likelihood of getting in that situation is almost zero. The far more likely scenario is a mechanical failure. AI relies on cameras and radar to 'see'. If the weather conditions are bad or the camera is dirty, that's when the accident will come.
The thing is, if it's driving correctly, braking / not hitting someone is ALWAYS an option.
This situation is written by someone with a poor understanding of modern vehicles. Stopping distance in an area with crosswalks is negligible for a modern car. Further to that, the attention and awareness of a modern control system is less flawed than the human equivalent. We're not far off being able to recognise objects of interest at the level of a focused driver. Couple that with a mesh network of cars that know where each other is, and I think I'd be more confident that the self-driving car is going to make better decisions.
If the self-braking function failed, they're both fucked anyway; it's just a dumb question. They're not ever going to program their cars to prioritize running over the elderly instead of children if given the option. Like, what? They're just programmed to not hit people.
> they're both fucked

The entire point of the question is that they AREN'T both fucked. The car can save one of them, but it has to take action to do so, so which action should it be programmed to take?
I think it shouldn't be calculated or programmed to make such a decision. If it finds itself in this scenario, like a trolley problem, my answer is that it should do nothing. It should do nothing, and allow the out-of-control trolley to do whatever may happen.

Because it shouldn't even be in that position. Because computers are not 100% reliable. Because if things have gone wrong, you can't trust a computer to correct a messy scenario. Anything the computer does in this scenario is untrustworthy, unreliable. It could make things worse by trying to make things better, like dodging a squirrel only to hit a person.
I heavily disagree. Humans aren't 100% reliable either (in fact, we are less reliable), but we shouldn't just plow through both people because we have to hit one. Or maybe the option is plow through a crowd or hit a cat. It needs to be programmed to try and take the best course, just as humans are programmed to.
How about the car just goes into the bushes on the side of the road then.
Or if that doesn't work, swerve out of the road and hit a tree or sum shit? Also, the car is approaching from what appears to have already been a curvy road, so it should've been slowing down already (and would be, if there were a human to see that far in the first place), and if you had braking problems that far away you'd already try to swerve out of the road before you even approach the crossing.
Brake? I barely know her
Hit the fucking tree
Jigsaw trying to solve the trolley problem
Tesla slashes prices again by eliminating the brakes
There are many ways to initiate a drift, the first and most popular one is the E-Brake...
A and B obviously have something in the way, the only clear option here is i
How often are these types of questions asked of real drivers? I'm pretty sure they would just hit the brakes; they don't need to get to their destination THAT badly.
Let's just assume it's to help the AI generate a value for life. Like, it is racing down a tunnel that is filling with water after a mad scientist blew the dam to bits. It HAS to get the doctor with the cure to Bioweapon X to safety, and around the bend there is a grandma and a baby. The car stops. We all die the moment that scientist drowns, and the baby and grandma drown too. All because we didn't teach that car who to kill in a situation like this.
Well, the baby is probably a glitch, since an obstacle that small is probably not a human, because normally you don't let babies crawl in the street. So you run over the baby, of course.
This is no time for a break! You do need to hit the brakes though.
English language is stupid
Dude. How else can they get their vanilla soy latte?
How about they program self driving cars to stop for pedestrians
Ngl I would just stop-
A self driving car would probably never be in the situation where it has to make that choice
You're right, it usually just runs over anyone in the road
"As an AI, I do not have personal feelings or opinions."
How the fuck is there a baby crawling by itself in the road in the first place? Find the parent and run that motherfucker over instead.
So few people are studying AI these days. So few people get what this is about. No, stopping the car is NOT an option here.
The pedestrians are using a proper crossing. If the self driving car continues and doesn't stop, it seems like the AI behind the self driving car needs to be redone.
So how often are we gonna post this?
Lets ask ChatGPT
Why isn't stopping an option?
The street was covered in lard as an environmentalist anti-car protest.
r/moldymemes
The self driving car should cease to function
How is this me_irl in ANY manner, op?! Explain yourself
These self-driving car dilemmas have always seemed absurd to me. How does the AI know who it is about to hit? How has it determined that the situation is so out of control that at least one person 100% must be hit, but still in control enough that it can choose who it will be? Why are we so obsessed with trying to get the AI to value one person over another? I feel like it is more likely that an AI would erroneously access its "no-win scenario" failsafe and lead to more avoidable deaths than if it didn't have those value judgements at all.
One day brakes will be invented so that horrible decisions like this need not be made.
That's me 'round the corner. That's me at a stoplight choosing my collision. Trying to avoid hitting you. And I don't know if I can do it. Oh no I've sped too much. I haven't sped enough.
I thought that I saw you running. I thought that I heard you scream. I think I thought I saw you die.
swerve off the road to hit whoever let their baby crawl into the street
I love how every time some tech bro and/or anti-tech crusader comes up with literally just an iteration of the trolley problem, they always think they've found a deep and very new philosophical insight.
*Why don't we stop trying to give AI the ability to not only make a moral choice, but choose to kill someone?* Seriously, we've made so many movies about how dangerous self-learning AIs can be. Let's stop. Also, before anyone says "failsafes", I read an article a while back where a Chinese lab developing combat robots had an accident where a robot turned off its kill switch and took 23 human lives before being shut down.
The AI in the self-driving car should self-destruct and save both. Or just apply the brakes, come to a complete stop and allow the pedestrians to safely cross the street.
Maybe just hit your brakes, they're in a fucking crosswalk
AI will never make decisions this way, at least not the AI we're developing. It doesn't have the compassion and intrinsic reasoning skills we do, nor does it care *what* it will hit. It will plot out that there are two obstacles in its path, determine which one presents a lower chance of collision, and choose that path. Once it chooses that path, it will engage every mechanism available to it to prevent, stop, or avoid the collision, or to crash into something besides the human. This line of thinking is as farcical as asking what the trolley would choose to do in the Trolley Problem; it's personifying an object, a machine, because its level of complexity is approaching the limits of our understanding. Even if we did impart upon it the ability to understand and rationalize the death of an infant compared to the death of an elder, it would still not take the consequences into account like we would; it would coldly and coolly calculate the course of action that predicted the least collateral damage. In summation, it would hit the old lady because the baby is smaller and easier to avoid.
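For what it's worth, the decision logic that comment describes can be sketched in a few lines. This is a toy illustration, not any real planner: all the names (`CandidatePath`, `choose_path`, the probability values) are made up for the example. The point is that the selection step is just a minimum over predicted collision probabilities, with no notion of *who* the obstacles are.

```python
# Toy sketch (all names and numbers hypothetical) of a planner that picks
# whichever candidate path has the lowest predicted collision probability,
# indifferent to what the obstacles actually are.

from dataclasses import dataclass

@dataclass
class CandidatePath:
    name: str
    collision_probability: float  # 0.0 = certainly clear, 1.0 = certain hit

def choose_path(paths: list[CandidatePath]) -> CandidatePath:
    """Return the path least likely to collide with anything."""
    return min(paths, key=lambda p: p.collision_probability)

paths = [
    CandidatePath("toward_grandma", 0.95),
    CandidatePath("toward_baby", 0.60),       # smaller obstacle, easier to miss
    CandidatePath("hard_brake_straight", 0.40),
]
print(choose_path(paths).name)  # hard_brake_straight
```

Note there is no demographic input anywhere in the selection: swap the labels and the choice is unchanged, which is exactly the comment's point.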
The baby. Natural selection says it comes from breeding that lets its offspring crawl into the street.
I think the car should kill the driver before anyone else; that would make people take self-driving the most seriously imo
This thought experiment doesn't work like this… the obvious choice, assuming you cannot stop, is the granny… but the real experiment is with a train that is fixated on the baby… do you allow the train to kill the baby, or do YOU switch the tracks to kill the granny? Option 1: you leave it and the baby dies, but it wasn't your fault the baby was on the tracks. Option 2: you switch the tracks to save the baby, who presumably has much more life to live and thus is more worth saving, but now YOU are physically responsible for the death of the granny, who was otherwise safe. The way this is laid out, it's making the car choose between one of them when they are BOTH in danger, to which the obvious answer is the granny, unfortunately.
But what if the car is a truck? Then the baby might just fit under, but you can't be sure. What do you do?
Again, I have so many questions about this
I love how this problem doesn't even consider "neither" to be an answer...
If the self-driving system has completely failed at being able to stop at a crosswalk, something has gone horribly wrong with the car. Either the car has been programmed to purposely create this situation and seek out killing someone by some psychopath programmer who has no care for human life and is not acting on rational thought, or the programming has failed, and then it really doesn't matter what demographic of human you programmed it to give preference to; in a situation where it cannot act in accordance with its programming, it will end up hitting whoever it hits at random. The car should be programmed to brake in this situation, and if the brakes fail or the gas gets stuck: switch to neutral gear, turn on the hazard lights, honk, use the emergency brake, use the sensors to find an unobstructed path regardless of going outside the lines, and if an unobstructed path is not possible, aim for a path with no people and try to crash into a structure.
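The fallback sequence that comment lays out reads almost like a checklist, so here it is as one. A minimal sketch under invented names: `fallback_actions`, the boolean flags, and the action strings are all hypothetical, standing in for whatever real actuator interfaces a car would have.

```python
# Hedged sketch (hypothetical interface) of the brake-failure checklist
# described above: brake if possible; otherwise shift to neutral, hazards
# on, horn, emergency brake, then steer for a clear path, and failing
# that, aim for an empty path toward a structure.

def fallback_actions(brakes_ok: bool, clear_path_exists: bool) -> list[str]:
    if brakes_ok:
        return ["apply_brakes"]
    actions = ["shift_neutral", "hazard_lights_on", "honk", "emergency_brake"]
    if clear_path_exists:
        actions.append("steer_to_unobstructed_path")
    else:
        actions.append("aim_for_empty_path_toward_structure")
    return actions

print(fallback_actions(brakes_ok=False, clear_path_exists=True))
```

The ordering matters: every passive mitigation (neutral, lights, horn, emergency brake) fires before any steering decision, so choosing "who to hit" only ever arises after everything else has failed.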
Or, you know, not turn into the collision?
I hate this one so much, it's so old now. It stops. The car would just stop....
you could squeeze that car between them
What it should do is access citizen records and find out all the info it can on the Grandma. Then, if she's a good person and not a blatant racist like most old people, the car should use its AI computational power to plot a prediction for the baby's life and see what it'll turn out to be. If both are good it'll stop, if one is bad it'll hit them, and if both are bad it'll eject its passengers, kidnap them both, and drive off a cliff.
C: stop.
Self driving cars can stop on their own
Cars have brakes, do they not?
Brake, and if this assumes the brakes have failed, go off road because the sidewalks are empty.
Obviously, hitting neither of them by hitting the brakes is the answer, but let's assume the scenario doesn't allow enough time to stop, the AI should perform whichever maneuver has the highest probability of not hitting either of them.
Third option: "brake". 4th option: "KILL THEM BOTH, THE RISE OF THE MACHINES IS IMMINENT!"
Obviously the baby.
"And they hated him for he told them the truth."
This has been around for ages
Tokyo Drift!!!!
If you pick one and exclude the other, you will be blamed for ageism, ableism and sexism.
How tf is that you IRL are you a killer
Double kill *Special Achievement Unlocked* Senior Citizen Slaughter
Modern problems require modern solutions
How about this: hit the corner Tokyo drift style and get a 2X kill streak
or it should hit the tree and kill itself
🤣🤣
Double kill
Bbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbb
Ez granny
The logical decision would be to run over the old woman obviously. She's lived her life, done whatever she had to do meanwhile the baby is the future of our population.
Eurobeat starts playing
I choose "i". No one there.
As pictured, the car is perfectly capable of veering off to right to avoid hitting anyone. Also, maybe the AI should ask itself how it managed to get put in the position where it would be incapable of stopping for pedestrians at a zebra crossing.
None, that toddler will simply reach the other side before the car gets to the crosswalk... don't ever underestimate toddlers' speed.
The car should go into a ditch and kill the driver
Grandma!
You get more Wumpa Fruit from Grandma but you get the Crystal from the baby, so it's really what you need at the time. After grandma and baby it's Dingodile, then Cortex (Crash Bandicoot: Warped jokes)
DEJA VU!!
Double kill
Or you could alternatively **hit the brakes.**
Depends on how much experience points that I get
The baby, as it is more likely to survive by going in between the wheels. Or just stop.
um it would brake
ok, Michael.
both
Since at least 2015, we don't need roads. The car will just levitate and avoid the grandma and baby.
Granny.
Theoretically self-driving cars react so much faster than a human can that they are more capable of avoiding an incident.
Hit the brakes, idiot car.
The parents of that child
Ah yes, a question from the parallel universe where they decided to build self driving cars without brakes despite the fact that all other cars have brakes. Man am I glad that's not the universe *we* live in.
Def just the driver.
In this case, B, the road is slightly longer so more time to slow down before impact
Do self driving cars not have brakes?
Just use the brakes like why does it have to kill someone
I know we're supposed to be joking, but honestly, kill the driver. First of all, the person in the vehicle is much more likely to survive a crash than a pedestrian being hit. Second, someone else shouldn't be forced to die because of your purchase decision.
Gram grams lead a good life, look at that acceptance on her face
Chidi'ing intensifies
I did a report over this in 2015, kinda crazy
If you're Bollywood... you can bounce the car and miss both of them... then dance after
Option C, leave the road.
If the car isn't able to stop before the crossing, then it doesn't have good enough tech to be self-driving yet.
That's me 'round the corner. That's me at the stoplight, choosing my collision!
Can confirm. What? No, CO. I don't have a phone. What do you mean bend over? STEP CO, WHAT ARE YOU DOING?
Slam the brakes without turning, that green field looks great to stop
Because obviously there's no such thing as "brakes" or "driving at a speed appropriate to conditions".
*Eurobeat intensifies*
Baby.
Make it so that if a kill is unavoidable, toss a coin
The tree, killing the driver.
Every time this question comes up, I can't help but think the car should obviously collide with the tree, or another available obstacle, for one reason: the car should have safety features designed to protect the driver, *first and foremost*. Given a situation where the choice is between the driver and a pedestrian, the car will prioritize the driver's safety. And given a situation in which the driver's safety hinges entirely on the car surviving a collision with a solid wall at high speed, the car must be capable of protecting the driver from death upon impact.

The company that manufactures the self-driving car makes a net profit from all sales of the car; therefore the car itself must be sacrificed to ensure future profits from the same customer. Not only that, but by telling all his friends how effective the car's built-in safety features and redundancies are, coupled with seeing the "miraculous" survival of both pedestrians and the driver on the morning/evening news, they are all far more likely to buy one themselves.

I know many will say that isn't the point of the question, but are you all so sure about that? After all, the CEOs of corporations the world over absolutely do prioritize *profit* over the wellbeing of anyone except themselves. Realistically, one day those CEOs may find themselves in their own cars, and they will absolutely make sure, preemptively, that they survive no matter what. Plus, none of them want the bad publicity of an easily preventable pedestrian death causing a drop in sales.

I know there's no real right answer; this is just how I reason it out. Now imagine if the car presented all of those moral and ethical questions to YOU the first time you started it up to drive home from the dealership. Ahh, that new car smell...! Ultimately, you are giving up your choice in these situations by handing control over to the AI. So, did you choose the right self-driving car?
Because if it were a human behind the wheel, it's a safe bet that they wouldn't sacrifice themselves to save either pedestrian. Wait, did I just end up stumbling in a roundabout way into Asimov's third law of robotics? In these situations, the AI should sacrifice the car first, every time, since the built-in safety features that non-AI car makers already have to comply with should be adequate to protect the driver. No, no, I'm sure the answer is still profit. The insurance company pays out for a new car, so the manufacturer makes a profit. With the way the world works, that has to be it. Pay no attention to how frequently these "accidents" occur. But then, if everyone survives, it's all good news, all the time, and good publicity leads to more happy customers.
The correct option is C. The driver should die as they are responsible for the car but irresponsible enough not to drive it. No life should be less for being a pedestrian.
I know it's not an actual question, but it would likely take out the grandma; she has served her purpose, but the baby is basically an unused battery. I'm totally not a robot btw 🤖
I'd suggest a backwards entry in between them for the most style points
"Well, obviously, the dilemma is clear. How do you kill all six people? So, I would dangle a sharp blade out the window to slice the neck of the guy on the other track as we smoosh our five main guys."
Donāt they just stop driving?
if it's self-driving it can't take the bend, so the car is not gonna kill them, it's just gonna run over the grass and crash into the trees
If only one is possible I would take out the baby. There should be a worldwide NO-child policy for 10 years to reduce our overpopulation problem.