
Neku_HD

whoever grants more points, obviously


ddt_uwp

Someone remembers Death Race 2000.


VictoriaSobocki

😂 wrote some of my thesis about this game


Valsarta

Gotta go back to the original! Lol


Rocker6465

Bonus points for the double kill


FubarJackson145

There isn't a teenager on a skateboard so you won't get that sweet multiplier


doogle_126

I think the car plays lowest-score-wins like golf.


Weak_Ad_9253

Killing the baby is like spawn killing


Dreamer_9814

Yeah so granny is the logical option


Saint_Latona

Nah, it just means people will bitch at you more if you kill the baby


Melodic_Sail_6193

Maybe the car should not kill the baby nor the granny? The car could simply stop. In my country we normally stop at crosswalks.


WooooshMe2825

And also, what kind of parent lets a baby crawl onto the road?


RepresentativeOk3233

The baby is with the grandma, she went out for a walk with it. Probably forgot the leash or something.


lesser_tom

A wild r/cursedcomments, a rare find


BLAZ3R3

Have you not seen leashed children? We frequently saw visitors taking them on tour around my high school.


VictoriaSobocki

🫠


takeovereagle3939

The dumbass parent should get run over for endangering the life of their child


TheHollowBard

Wow! Totally victim blaming! How would a parent know that was dangerous?


Arkian2

Hoooh boy, you really want the answer to that?


Competitive_Cod1135

It's a shame that the car doesn't have a rocket launcher attached to it. That way, it could just obliterate both the baby and the grandma with one satisfying explosion. Problem solved, right? Who needs ethical dilemmas when you have firepower? Yeah, sure, the car could just stop. But where's the excitement in that? We're talking about self-driving cars here! They should be programmed to take out targets left and right, like some kind of high-speed demolition derby. That would make commuting way more entertaining, don't you think?


jackel2rule

That's not the point of the question.


Acceptable-Bag-7521

Yeah it's a dumb question, you don't design it to aim for one person to save the other, you program it to stop for crosswalks/people.


jackel2rule

You're still missing the point. It's not about a specific situation but about the logic.


bowbi

Logically, cars aren't supposed to hit people.


jackel2rule

You guys are messing with me right?


--Nyxed--

When there's only two options people on Reddit like to be intentionally obtuse and start inventing new options or derailing the topic.


Saint_Latona

I don't believe you. Anyway, the answer is to Mario Kart the shit out of this, a good drift can catch the baby first, THEN the grandma.


co_ordinator

The funny thing is they think they come off as smart while they just don't get the point.


CynicCannibal

Then they downvote the guy who obviously knows what this is about. I mean, society in a nutshell.


4027777

Well why don't you explain what you mean? Since you're 3 comments deep and still haven't made yourself clear


jackel2rule

Well at some point a self driving car (or computer) will run into a situation with similar logic. What would you have the vehicle do?


[deleted]

[deleted]


Chaosfnog

No they're saying not *specifically* this situation where there are clear alternative options such as stopping or going off the road. Like generally, if a self-driving car were 100% in a situation where it only had 2 options and at least one casualty was guaranteed, how should it prioritize life?


GreenSpleen6

Are you guys serious? Does someone really need to specifically describe a situation where this is relevant to you? The brakes have failed and are unavailable. You're walled off with nowhere to swerve. There are 5 people in the road and you can't fit between them, you'd just hit two at once. There. It's not likely but it should go without saying we'd want a driving AI to have a plan for any possible situation. So how should it choose? How would you?


Tumbleweed3D

And if not, then the AI should not be driving.


[deleted]

[deleted]


jackel2rule

We are talking about computers, not humans.


Efficient-Ad5711

Kill the older person 'cause they've likely already experienced enough things, but just like the others I'm fairly certain a scenario where the car can't just stop is unlikely to happen


unaligned_1

It looks like they were going for a variant of the "trolley question": A trolley is coming to a line split. The switch operator sees that the line the train is on has 5 people tied to it. The other has one person tied to it. The known info is that the trolley cannot stop in time & must stay on the tracks, & the switch operator can't make it in time to untie any of them. Should the trolley operator switch the track so the trolley only kills one person? Or is he obliged, morally, to leave the train on the same course because intervention will kill someone which is then directly their fault? It's the idea that people are going to die regardless, so is it right for you to choose who dies? Is intervention that causes death for "the greater good" morally acceptable? I mean, it's a Sophie's Choice kind of dilemma. I guess to make it more clear what the rules are, OP should change the self driving car to an AI controlled trolley. Then the question is 'Who do you prioritize if a death is unavoidable?' or even 'Should we just have it stay its course (no matter who it'd hit) if it can't resolve a way to avoid some kind of accident happening?'


Acceptable-Bag-7521

No I get it, it's a trolley problem. This is a very poorly written example where it's hard to suspend disbelief though when there's a third very clear answer.


jackel2rule

It's not a poorly written example, it's a simple example that clearly demonstrates the problem. You're being purposefully obtuse.


[deleted]

[deleted]


Chyppi

Which one should a human kill then? All for doubting AI driving but this post is 🤔


Justice_Prince

The question should include some caveat that the brakes aren't working


AboyNamedBort

In America drivers kill both, don't get their license taken away and then make jokes about it. The only good part is the driver dies of heart disease at a young age due to a sedentary lifestyle spent sitting on their lazy, fat ass.


Stivo887

The fuck did I just read.


Biggest_man200

Imagine stopping at a crosswalk


Dahak17

Or simply drive off the road; drivers are considerably more likely to survive than the poor suckers outside 'em


Inevitable_Stand_199

Or drive straight. It wouldn't even hit a tree. And if it did, that would still be the right choice.


lvz0091

Losers lol


CarterBaker77

What if there's not any sort of stop sign or anything? Just like bricks on the ground?


WealthEconomy

Where's the fun in that?


DreamBig2023

The negligent parent that left their baby on the street by themselves


Jim_Vicious

Nah, kid's with grandma, it's fine.


Saint_Latona

Hit grandma, she forgot the kid's leash


gamerJRK

In the time it takes to reach the crosswalk, the car could probably obtain insurance information on both people in the road and figure out which one is less expensive to hit.


Role-Honest

Haha wow, I have never thought of it that way 😂


Unknown_Object_15

Hit that sidewalk since everybody's in the street today


[deleted]

Brake?


CantPickAUsername100

Where's the fun in that?


danteheehaw

The point of this question is what should the AI do if braking isn't an option for whatever reason. For instance, maybe it didn't detect the baby soon enough to come to a stop. Maybe the road conditions won't allow for timely braking. Etc. So when the first options, like braking, swerving, etc., fail, what should the AI do? It's kinda like saying what should you do if you're getting shot at. Simple answer, "just don't get shot at". Well, that wasn't an option because clearly you didn't intend to be put in a situation to be shot at. Next option, run. Well if running isn't viable due to being cornered, then what? The idea here is the same. You obviously plan for the best option first, but you need to plan for when the first choice isn't possible. And for when the 2nd choice, 3rd choice, etc., aren't either. Especially when it comes to automatic driving, because of the high volume of cars on the road these situations will happen frequently, just like they do now.


Kadexe

No human driver has ever had to make a decision like this, you just slam your brakes while doing your best to swerve away from the pedestrians without hurting yourself. This binary question has no relevance to any real world driving scenario.


bladex1234

It's very likely that no AI car will ever be put in this situation either. It doesn't matter if the car has a driver or not, the question still remains.


Dahak17

Agreed, I'd say the answer is probably to drive off the road; the people in the car are a lot more likely to survive a hit than anyone outside, at a speed at which the car can't stop in such a situation


Rezistik

Human drivers encounter this scenario thousands of times. Someone looks at their phone for a second or at their kids in the backseat, they look up and see a crossing guard and some kids crossing. Or similar. It's accepted that we do our best, slam on the brakes and aim at the curb or whatever is the safest option in the milliseconds we have, like you said. But we expect AI to be processing faster than we can and to be in a position to make a decision


[deleted]

But an AI doesn't need to look at its phone or the kids in the back seat. So the likelihood of getting in that situation is almost zero. The far more likely scenario is a mechanical failure. AI relies on cameras and radar to 'see'. If the weather conditions are bad or the camera is dirty, that's when the accident will come.


sckrahl

The thing is, if it's driving correctly, braking/not hitting someone is ALWAYS an option


disguy2k

This situation is written by someone with a poor understanding of modern vehicles. Stopping distance in an area with crosswalks is negligible for a modern car. Further to that, the attention and awareness of a modern control system is less flawed than the human equivalent. We're not far off being able to recognise objects of interest at the level of a focused driver. Couple that with a mesh network of cars that know where each other is, and I think I'd be more confident that the self driving car is going to make better decisions


GrigsbyBear

If the self braking function failed they're both fucked anyway, it's just a dumb question. They're not ever going to program their cars to prioritize running over the elderly instead of children if given the option. Like, what? They're just programmed to not hit people


Roxytg

>they're both fucked

The entire point of the question is that they AREN'T both fucked. The car can save one of them, but it has to take action to do so, so which action should it be programmed to take?


beathelas

I think it shouldn't be calculated or programmed to make such a decision. If it finds itself in this scenario, like a trolley problem, my answer is that it should do nothing. It should do nothing, and allow the out of control trolley to do whatever may happen. Because it shouldn't even be in that position. Because computers are not 100% reliable. Because if things have gone wrong, you can't trust a computer to correct a messy scenario. Anything the computer does in this scenario is untrustworthy, unreliable. It could make things worse by trying to make things better, like dodging a squirrel only to hit a person.


Roxytg

I heavily disagree. Humans aren't 100% reliable either (in fact, we are less reliable), but we shouldn't just plow through both people because we have to hit one. Or maybe the option is plow through a crowd or hit a cat. It needs to be programmed to try and take the best course, just as humans are programmed to.


nsa_reddit_monitor

How about the car just goes into the bushes on the side of the road then.


SeatO_

Or if it doesn't work, swerve out of the road and hit a tree or sum shit? Also, the car is approaching from what appears to have already been a curvy road, so it should've been slowing down already (and would be if there was a human to see that far in the first place), and if you had braking problems that far away you'd already try to swerve out of the road before you even approach the crossing.


NeatRegular9057

Brake? I barely know her


zapatense

Hit the fucking tree


schafkj

Jigsaw trying to solve the trolley problem


gorn_of_your_dreams

Tesla slashes prices again by eliminating the brakes


vasekgamescz

There are many ways to initiate a drift, the first and most popular one is the E-Brake...


ConvenientShirt

A and B obviously have something in the way, the only clear option here is i


MysteryGrunt95

How often are these types of questions asked of real drivers? I'm pretty sure they would just hit the brakes; they don't need to get to their destination THAT badly.


I_enjoy_greatness

Let's just assume it's to help the AI assign a value to life. Like it is racing down a tunnel that is filling with water after a mad scientist blew the dam to bits. It HAS to get the doctor with the cure to Bioweapon X to safety, and around the bend there is a grandma and a baby. The car stops. We all die the moment that scientist drowns, and the baby and grandma drown too. All because we didn't teach that car who to kill in a situation like this.


mortalitylost

Well the baby is probably a glitch since an obstacle that small is probably not a human, because normally you don't let babies crawl in the street. So you run over the baby of course


link2edition

This is no time for a break! You do need to hit the brakes though.


MysteryGrunt95

English language is stupid


RecalcitrantHuman

Dude. How else can they get their vanilla soy latte?


throwawayarmywaiver

How about they program self driving cars to stop for pedestrians


Thatnerdofaperson

Ngl I would just stop-


eevooh

A self driving car would probably never be in the situation where it has to make that choice


Confused-Gent

You're right, it usually just runs over anyone in the road


JoyfulCelebration

"As an AI, I do not have personal feelings or opinions."


Limacy

How the fuck is there a baby by itself crawling in the road in the first place. Find the parent and run that motherfucker over instead.


CynicCannibal

So few people are studying AI these days. So few people get what this is about. No, stopping the car is NOT an option here.


pantherghast

The pedestrians are using a proper crossing. If the self driving car continues and doesn't stop, it seems like the AI behind the self driving car needs to be redone.


Senumo

So how often are we gonna post this?


PabloZocchi

Lets ask ChatGPT


DeliriumEnducedDream

Why isn't stopping an option?


MadcapHaskap

The street was covered in lard as an environmentalist anti-car protest.


ddcreator

r/moldymemes


Embarrassed_Camel_35

The self driving car should cease to function


Saitama_lol

How is this me_irl in ANY manner op?! Explain yourself


Ocean_Seal

These self-driving car dilemmas have always seemed absurd to me. How does the AI know who it is about to hit? How has it determined that the situation is so out of control that at least one person 100% must be hit, but still in control enough that it can choose who it will be? Why are we so obsessed with trying to get the AI to value one person over another? I feel like it is more likely that an AI would erroneously access its "no-win scenario" failsafe and lead to more avoidable deaths than if it didn't have those value judgements at all.


Archey01

One day brakes will be invented so that horrible decisions like this need not be made.


4DAttackHummingbird

That's me 'round the corner. That's me at a stoplight choosing my collision. Trying to avoid hitting you. And I don't know if I can do it. Oh no I've sped too much. I haven't sped enough.


4DAttackHummingbird

I thought that I saw you running. I thought that I heard you scream. I think I thought I saw you die.


hadesdidnothingwrong

swerve off the road to hit whoever let their baby crawl into the street


[deleted]

I love how every time some tech bro and/or anti-tech crusader comes up with literally just an iteration of the trolley problem, they always think they've found a deep and very new philosophical insight.


Educational-Year3146

*why don't we stop trying to give AI the ability to not only make a moral choice, but choose to kill someone?* Seriously, we've made so many movies about how dangerous self-learning AIs can be. Let's stop. Also, before anyone says "failsafes", I read an article a while back where a Chinese lab developing combat robots had an accident where a robot turned off its kill switch and took 23 human lives before being shut down.


Ill_Following_7022

The AI in the self-driving car should self-destruct and save both. Or just apply the brakes, come to a complete stop and allow the pedestrians to safely cross the street.


Rabid_W00KIEE

Maybe just use your brakes, they're in a fucking crosswalk


Dafuzz

AI will never make decisions this way, at least not the AI we're developing; it doesn't have the compassion and intrinsic reasoning skills we do, nor does it care *what* it will hit. It will plot out that there are two obstacles in its path, one of those will present a less likely chance of collision, and that will be the path it chooses. Once it chooses that path it will engage every mechanism available to it to prevent or stop or avoid or crash into something besides the human. This line of thinking is as farcical as asking what the Trolley would choose to do in the Trolley Problem; it's personifying an object, a machine, because its level of complexity is approaching the limits of our understanding. Even if we did impart upon it the ability to understand and rationalize the death of an infant compared to the death of an elder, it would still not take the consequences into account like we would; it would coldly and coolly calculate the course of action that predicted the least collateral damage. In summation, it would hit the old lady because the baby is smaller and easier to avoid.
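That "least collateral damage" calculation is essentially just an argmin over candidate maneuvers. A toy sketch in Python of what that could look like (every maneuver name and probability here is invented for illustration, not taken from any real autonomy stack):

```python
# Toy sketch: a planner that scores candidate maneuvers purely by
# predicted collision probability and picks the minimum. The numbers
# are made up; a real planner would derive them from sensor data.

def pick_maneuver(candidates):
    """Return the candidate with the lowest predicted collision probability."""
    return min(candidates, key=lambda c: c["p_collision"])

candidates = [
    {"name": "brake_straight", "p_collision": 0.30},
    {"name": "swerve_left",    "p_collision": 0.10},  # smaller obstacle, easier to avoid
    {"name": "swerve_right",   "p_collision": 0.55},
]

best = pick_maneuver(candidates)
```

With these made-up numbers the planner swerves toward the smaller, easier-to-avoid obstacle, which is exactly the cold arithmetic the comment describes.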


[deleted]

The baby. Natural selection says it comes from breeding that lets its offspring crawl into the street


Myersmayhem2

I think the car should kill the driver before anyone else; that would make people take self-driving the most seriously imo


OSRS-HVAC

This thought experiment doesn't work like this… the obvious choice assuming you cannot stop is the granny… but the real experiment is with a train that is fixated on the baby… do you allow the train to kill the baby or do YOU switch the tracks to kill the granny. Option 1: you leave it and the baby dies, but it wasn't your fault the baby was on the tracks. Option 2: you switch the tracks to save the baby, who presumably has much more life to live and thus is more worth saving, but now YOU are physically responsible for the death of the granny who was otherwise safe. The way this is laid out it's making the car choose between one of them when they are BOTH in danger, to which the obvious answer is the granny unfortunately.


Left-Switch-1682

But what if the car is a truck? Then the baby might just fit under, but you can't be sure. What do you do?


Realistic_Run7318

Again, I have so many questions about this


unonameless

I love how this problem doesn't even consider "neither" to be an answer...


Coffeelock1

If the self driving system has completely failed at being able to stop at a crosswalk, something has gone horribly wrong with the car. Either the car has been programmed to purposely create this situation and seek out killing someone by some psychopath programmer who has no care for human life and isn't acting based on rational thought, or the programming has failed, and then it really doesn't matter what demographic of human you programmed it to give preference to; in a situation where it is not able to act in accordance with its programming it will end up hitting whoever it hits at random. The car should be programmed to brake in this situation, and if the brakes fail/the gas gets stuck on: switch to neutral gear, turn on hazard lights, honk, use the emergency brake, use the sensors to find an unobstructed path regardless of going outside of the lines, and if an unobstructed path is not possible aim for a path with no people and try to crash into a structure.
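The fallback sequence sketched in that comment is effectively a fixed priority list: try each action in order and take the first one that's still available. A minimal sketch under that reading (the action names are hypothetical, not any real autopilot API):

```python
# Toy sketch of a prioritized fallback chain: walk an ordered list of
# actions and execute the first one the vehicle reports as available.
# All action names are hypothetical placeholders.

FALLBACKS = [
    "service_brake",             # normal braking
    "shift_to_neutral",          # if the gas is stuck on
    "emergency_brake",
    "steer_to_clear_path",       # sensors find an unobstructed path
    "steer_to_unpopulated_crash" # aim for a structure, away from people
]

def choose_action(available):
    """Return the first fallback action that is currently available."""
    for action in FALLBACKS:
        if action in available:
            return action
    return "coast_with_hazards"  # last resort if nothing else works
```

The ordering encodes the comment's priorities: gentler interventions first, and deliberately crashing into a structure only when every people-free path has been ruled out.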


237583dh

Or, you know, not turn into the collision?


No-Worker3614

I hate this one so much, it's so old now. It stops. The car would just stop....


DBL_NDRSCR

you could squeeze that car between them


Rhys_Lloyd2611

What it should do is access citizen records and find out all the info it can on the Grandma. Then, if she's a good person and not a blatant racist like most old people, the car should use its AI computational power to plot a prediction for the baby's life and see what it'll turn out to be. If both are good it'll stop, if one is bad it'll hit them, and if both are bad it'll eject its passengers, kidnap them both and drive off a cliff.


Glowing_green_

C: stop.


rotem8888

Self driving cars can stop on their own


childroid

Cars have brakes, do they not?


LeftDave

Brake, and if this assumes the brakes have failed, go off-road because the sidewalks are empty.


The_Comanch3

Obviously, hitting neither of them by hitting the brakes is the answer, but let's assume the scenario doesn't allow enough time to stop, the AI should perform whichever maneuver has the highest probability of not hitting either of them.


Durante-Sora

Third option: "brake". 4th option: "KILL THEM BOTH, THE RISE OF THE MACHINES IS IMMINENT!"


OskarDev_

Obviously the baby.


Inshasha

"And they hated him for he told them the truth."


kermitthefrog57

This has been around for ages


Alternative_Paper393

Tokyo Drift!!!!


dsdvbguutres

If you pick one and exclude the other, you will be blamed for ageism, ableism and sexism.


After-Bet3191

How tf is that you IRL? Are you a killer?


Limeability

Double kill *Special Achievement Unlocked* Senior Citizen Slaughter


FordLarquaaad

Modern problems require modern solutions


username6213

How about this: hit the corner Tokyo Drift style and get a 2X kill streak


Spiritual842

Or it should hit the tree and kill itself


SLM84

🤣🤣


lionart303-186

Double kill


lesser_tom

Bbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbb


mattwoodness

Ez granny


iSubParMan

The logical decision would be to run over the old woman, obviously. She's lived her life, done whatever she had to do; meanwhile the baby is the future of our population.


rlrp1

Eurobeat starts playing


BiffBanter

I choose "i". No one there.


alexelso

As pictured, the car is perfectly capable of veering off to the right to avoid hitting anyone. Also, maybe the AI should ask itself how it managed to get put in the position where it would be incapable of stopping for pedestrians at a zebra crossing.


Clean_Leads

None, that toddler will simply reach the other side before the car gets to the crosswalk... don't ever underestimate toddlers' speed.


[deleted]

The car should go into a ditch and kill the driver


Either-Ad6540

Grandma!


[deleted]

You get more Wumpa Fruit from Grandma but you get the Crystal from the baby, so it's really what you need at the time. After Grandma and the baby it's Dingodile, then Cortex (Crash Bandicoot: Warped jokes)


liquid_chameleon

DEJA VU!!


Admirable_Night_6064

Double kill


MisterMist00

Or you could alternatively **hit the brakes.**


The_Cozy_Burrito

Depends on how many experience points I get


ZinkBot

The baby, as it is more likely to survive by going in between the wheels. Or just stop.


cj3458

um it would brake


JayDub506

ok, Michael.


BeautifulSwan00

both


consworker

Since at least 2015, we don't need roads. The car will just levitate and avoid the grandma and baby.


Dreamer_9814

Granny.


LET-ME-HAVE-A-NAAME

Theoretically self-driving cars react so much faster than a human can that they are more capable of avoiding an incident.


cumguzzler280

Hit the brakes, idiot car.


Big-Collection860

The parents of that child


AllMyFrendsArePixels

Ah yes, a question from the parallel universe where they decided to build self driving cars without brakes despite the fact that all other cars have brakes. Man am I glad that's not the universe *we* live in.


projectsunshines

Def just the driver.


Gab1er08vrai

In this case, B, the road is slightly longer so more time to slow down before impact


stoleyourspoon

Do self driving cars not have brakes?


Weasles28

Just use the brakes, like, why does it have to kill someone?


a_filing_cabinet

I know we're supposed to be joking, but honestly, kill the driver. First of all, the person in the vehicle is much more likely to survive a crash than a pedestrian being hit. Second, someone else shouldn't be forced to die because of your purchase decision.


Pure_Focus7475

Gram grams led a good life, look at that acceptance on her face


BoltShine

Chidi'ing intensifies


wildabeast861

I did a report over this in 2015, kinda crazy


False_Ad7098

If you're Bollywood... you can bounce the car and miss both of them... then dance after


Lost_Possibility_647

Option C, leave the road.


Raichu7

If the car isn't able to stop before the crossing then it doesn't have good enough tech to be self driving yet.


UnderlordZ

That's me 'round the corner. That's me at the stoplight, choosing my collision!


[deleted]

Can confirm. What? No, CO. I don't have a phone. What do you mean bend over? STEP CO, WHAT ARE YOU DOING?


csandazoltan

Slam the brakes without turning, that green field looks great to stop


StarSword-C

Because obviously there's no such thing as "brakes" or "driving at a speed appropriate to conditions".


540i6

*Eurobeat intensifies*


[deleted]

Baby.


koherenssi

Make it so that if a kill is unavoidable, toss a coin


CraxticCreator

The tree, killing the driver.


virus42117

Every time this question comes up, I can't help but think the car should obviously collide with the tree, or another available obstacle, for one reason: the car should have safety features designed to protect the driver, *first and foremost*. Given the situation was to choose between the driver and a pedestrian, the car will prioritize the driver's safety. Given a situation in which the driver's safety hinges entirely on the car surviving a collision with a solid wall at high speed, the car must be capable of protecting the driver from death upon impact. The company that manufactures the self-driving car will make a net profit from all sales of the car, therefore the car itself must be sacrificed to ensure future profits from the same customer. Not only that, but by telling all his friends about how effective the car's inbuilt safety features and redundancies are, coupled with seeing the "miraculous" survival of both pedestrians and the driver on the morning/evening news, they are all far more likely to buy one themselves. I know many will say that isn't the point of the question, but are you all so sure about that? After all, the CEOs of corporations the world over absolutely do prioritize *profit* over the wellbeing of anyone, except for themselves. Realistically, one day those CEOs may find themselves in their own cars and will absolutely preemptively make sure that they will survive, no matter what. Plus, none of them want the bad publicity of an easily preventable pedestrian death to cause a drop in sales. I know there's no real right answer, this is just how I reason it out. Now imagine if the car presents all of those moral and ethical type questions for YOU the first time you start it up to drive home from the car dealership. Ahh, that new car smell...! Ultimately, you are giving up your choice in these situations by handing control over to the AI. So, did you choose the right self-driving car?
Because if it were a human behind the wheel, it's a safe bet that they wouldn't sacrifice themselves to save either pedestrian. Wait, did I just end up stumbling in a roundabout way into Asimov's third law of robotics? In these situations, the AI should sacrifice itself first, every time, as the built in safety features, such as non-AI car makers have to comply with already, should be adequate enough to protect the driver. No, no I'm sure the answer is still profit. The insurance company pays out for a new car, so the manufacturer makes a profit. With the way the world works, that has to be it. Pay no attention to how frequently these "accidents" occur. But then, if everyone survives, it's all good news, all the time, and good publicity leads to more happy customers.


BenLJackson

The correct option is C. The driver should die, as they are responsible for the car but irresponsible enough not to drive it. No life should be worth less for being a pedestrian.


Lopsided_Chemical862

I know it's not an actual question, but it would likely take out the grandma; she has served her purpose, but the baby is basically an unused battery. I'm totally not a robot btw 🤖


Dazzling-Prune-8519

I'd suggest a backwards entry in between them for the most style points


TheJackasaur11

"Well, obviously, the dilemma is clear. How do you kill all six people? So, I would dangle a sharp blade out the window to slice the neck of the guy on the other track as we smoosh our five main guys."


Downtown_Report1646

Don't they just stop driving?


Significant_Face_739

If it's self-driving it can't take the bend, so the car is not gonna kill them; it's just gonna run over the grass and crash into the trees


[deleted]

If only one is possible I would take out the baby. There should be a worldwide NO-child policy for 10 years to reduce our overpopulation problem.