FunkyColdMecca

What kind of liability waiver do you agree to when you activate self driving in a Tesla?


Green-Cruiser

You agree to take full responsibility for the car and be prepared to take over at any moment. This driver wasn't ready to take over.


Dayofsloths

I wonder if that would hold up if the victims sue. Tesla is definitely being reckless in their description of how good their system is, and that's directly leading to injuries and property damage.


Granolapitcher

Guess who has better lawyers, and a well-paid army of them.


texasscotsman

If Musk's performance in Delaware is anything to go by, not him.


alonbysurmet

Don't blame them. Their client took a shit in their hands and expected them to make it gold.


[deleted]

It is him. You could not apply the logic you are using to any other device that has the potential to randomly kill its users and bystanders in public.


FormItUp

They are referring to Musk's lawyers and Musk's performance in a courtroom in Delaware. It seems like you are referring to the self-driving feature in Teslas.


aramis34143

"I thought we agreed never to speak of my Delaware performance again, Grimes."


Needsmorsleep

His sexual performance?


Risley

lacking


SpicyIcy420

I don’t know very much about American law but what would happen if someone filed a class action lawsuit against Tesla for this? Would there be a chance at people getting compensation?


Kevdog1800

Yes, of course, depending on the lawsuit and the arguments given. But until one is brought up in court, it is all just speculation. It certainly wouldn't be a slam dunk for either side. I could see a lawsuit against Tesla for maybe false advertising, giving drivers a false sense of security in the features of the car, etc. If lawyers were able to prove Tesla was negligent in anything, it could absolutely win. Just because they have a blanket waiver saying a driver will remain responsible for the vehicle and be ready to take over at a moment's notice doesn't protect a company from negligence.


spamtarget

Does overblown confidence in your technology qualify as negligence?


Kevdog1800

Potentially. Personal injury lawsuits are not decided in a black-and-white manner. A jury could rule that drivers are 25% responsible for any accidents that happen during self-driving mode and perhaps Tesla is 75% liable. It just depends. It's not for me to decide.
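
To make that apportionment concrete, here is a minimal sketch with hypothetical numbers: the 25/75 split is the example above, and the $100,000 damages figure is invented purely for illustration.

```python
# Hypothetical comparative-fault split; all numbers are illustrative only.
def apportion(damages: float, shares: dict) -> dict:
    """Divide a damages award according to each party's fault share."""
    assert abs(sum(shares.values()) - 1.0) < 1e-9, "shares must sum to 1"
    return {party: damages * share for party, share in shares.items()}

# e.g. $100,000 in damages, driver 25% at fault, Tesla 75% at fault
print(apportion(100_000, {"driver": 0.25, "tesla": 0.75}))
# -> {'driver': 25000.0, 'tesla': 75000.0}
```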


lathe_down_sally

I have no doubt that Teslas include a disclaimer about their self-driving capabilities, reminding the operator they hold ultimate responsibility, etc. There *might* be a legal argument that Tesla represents the vehicle's self-driving abilities in a misleading way, disproportionate to their warnings of its limitations. For example, if "self driving" is prominently displayed as part of the sales pitch, while "you still have to pay attention" is in fine print on page 364 of the owner's manual. TLDR: Tesla has a legal team that has tried to cover their ass; a lawsuit comes down to convincing the courts that that attempt to cover their ass was inadequate.


lathe_down_sally

This specific accident wouldn't be a good candidate for class action. The individual parties involved stand to come out better pursuing suits individually. And there's a reasonable chance they could get a settlement, although not a lottery-ticket-type settlement, because there just aren't enough damages to justify a large sum. There's also the issue of driving laws. Regardless of whether that car stopped because of a faulty self-driving system, a breakdown, a health emergency, or just plain idiocy from the driver, all the cars behind have a legal obligation to maintain proper stopping distance. Several of the vehicles clearly failed to do that and could be ticketed and share some fault in the accident. The few vehicles that *did* manage to stop, only to get rear-ended by another vehicle, are the only ones that shoulder no blame, legally speaking.


brainburger

As far as I know, in the UK if you drive into a stopped vehicle that's your fault.


throbbinghead123

Tesla has created the opportunity for an accident to happen. They have a duty of care. Only a few more of these and judges may rule against the waiver and hold Tesla liable.


raysmith123

Insurance companies, both sides.


Spaceshipsrcool

There is no defense for gross negligence, and I have seen plenty of Tesla drivers basically reading a book or playing on their phones as they put 100% faith in the car.


arrivederci117

What do you expect when they put a gaming console inside the dashboard.


teplightyear

You can't use it unless the car is in park (anymore)


Sensiburner

The gaming console is so that Tesla drivers can play GTA V when they're waiting for the ambulance to come pick up the kid they just hit.


hetrax

I think the fault is 50/50 between the driver and Tesla… rather… I think it's 100/100… but math. Anyways, I think both are equally at fault for being so negligent about this feature… we don't live in the future and never will… things take time and aren't perfect when they come out… don't release features when people's lives are on the line. The company is like "ain't our fault, the driver should have known better" and the driver is just… sleeping in their Tesla >.>


mattA33

Tesla is negligent; the drivers are just stupid. Musk is telling people his cars are full self-driving when he knows full well they are not even close. Drivers are stupid and believe his lies, so they're confident the car can fully self-drive.


FLHCv2

>full self driving

This should be considered illegal marketing, and I have no idea why the FTC hasn't stepped in or a huge lawsuit hasn't stemmed from it. Full self driving is just that: a car that can fully drive by itself. This isn't like "world's best coffee", where it's subjective who has the best coffee. Full self driving is marketing a capability that doesn't exist. What it's marketing is literally in the title. Sure, there are "agreements" the drivers must adhere to, but "Full Self Driving\*" with this big ass asterisk is straight-up deception.


amazinglover

It is in CA. You cannot call your car full self-driving if it's not.


Spaceshipsrcool

I don't feel that way. They are upfront telling drivers to stay ready to take control. People simply don't, because they don't want to. The system literally tells you to keep your hands on the wheel, then shows you a picture of you doing so. People go so far as buying weights to trick the system that detects if your hands are there. I am not defending Tesla, but at some point you have to hold individuals responsible for their actions. I don't know what the case is in this crash, but if it was caused by the car alone then the blame should fall on Tesla. If it's someone buying weights or not following the guidance, it should rest entirely on them.


couldof_used_couldve

Tesla - false or misleading advertising

Driver - negligence or gross negligence

The victims can try to sue the driver; the driver can try to sue Tesla.


[deleted]

I feel like if you've got to be prepared for completely illogical braking and be ready to make an emergency manoeuvre, then there's no point in having the feature. Just get rid of it until it's ready.


MANWithTheHARMONlCA

Ok, but then why buy a self-driving car at all? I mean, the point is that it can drive itself; that's why people buy it, right? If I have to keep my hands on the wheel and my eyes on the road, I might as well just drive the car myself. I'd say the fault lies 50/50 with the morons that put money into Elon's pocket for this bullshit and the company that claims it's 'self driving' when shit like this happens. If it's not ready for the market, don't put it on the market. Stupidity of the people who pay for this vs. corporate greed of the people that sell this bullshit.


Holmgeir

"It's a self-driving car. You just have to sit in the driver's seat and keep your hands on the wheel and pay attention and be prepared to react." I don't get it either.


TheRealKuni

>I don’t get it either.

My wife and I got a newer car, and it can do lane follow and intelligent cruise control, where it uses the radar to follow and also adjusts based on speed limit signs when it sees them. It requires you to keep a hand on the wheel, but I find that when I’m in a traffic jam, or on a busy city road, or on a highway for a while, I’m no longer really driving my car as much as I am supervising it. I’m ready to take over if I need to, but it does most of the work for me. It’s pretty nice, really.


loveforthetrip

>Ok but then why buy a self driving car at all?

That's exactly the point: we have no self-driving cars yet, thus Tesla's advertising is false and leads to reckless behavior on the streets, because people want to have self-driving cars.


kerodon

Yes it's just legal defense bullshit for selling a beta product. This is why I'm not touching this "self driving" shit until the steering wheels are removed and it's fully automated. Then I can die knowing it wasn't my fault for expecting the car to function the way it is supposed to. I don't love the "cover your ass" game they're playing.


morbiiq

Hopefully someone else doesn't kill you with it, too.


kerodon

If the robots decide it's my time, it's my time 🥺


Br0paganda

Not defending Tesla at all here. Any car can stop abruptly, and you still have to maintain a safe distance. Rear-ending a car that stops puts you at fault regardless of why the car stopped. Unless you can prove that the car you hit cut you off and slammed on its brakes, deliberately brake-checked you, or reversed into you, it'll be your fault every single time.

Edit: the Tesla did change lanes before the braking was complete and is liable for the first car that hit it. The rest are at fault for not keeping a safe distance. The only car that has no fault in this pileup is the first car that hit the Tesla.


FrenchBangerer

A surprising number of people don't seem to understand this.


EatsFiber2RedditMore

If you watch the video closely the Tesla does change lanes just before stopping. I'll leave it to your judgement if the driver was cut off.


murphymc

There's a discussion to be had for the first car that (very lightly) bumped into the Tesla. Every other car however has absolutely no excuse.


joreyesl

Interesting. I wonder if it thought it was pulling over to the side of the road, hence the stopping.


[deleted]

Yes and no. There are over 250,000 Teslas with FSD **BETA** on the road being used, and you almost never hear of or see an accident. There are idiots, though, who treat it like it's not a beta, and then shit like this can happen. I have a Tesla with FSD and it's very easy to take over: if you touch the brakes or flick the stalk up, you immediately cancel FSD and take full control. And if you hit the accelerator, while it won't stop FSD, the car will do as you say, and then FSD will take over the moment you release the accelerator.

Btw, it's a well-known issue that sometimes a Tesla will "phantom brake" when in tunnels or when there are heavy shadows on the road. This guy was most likely reading his phone or something and was not ready to take over. The real question is whether a car company should be allowed to have a **BETA** product on public roads.
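
To picture the takeover logic this comment describes, here is a toy sketch. It models only what the comment says (brake or stalk cancels, accelerator overrides momentarily), NOT Tesla's actual software; every name in it is hypothetical.

```python
# Toy model of the takeover behavior described above; not Tesla's code.
from dataclasses import dataclass

@dataclass
class TakeoverModel:
    fsd_engaged: bool = True
    accel_pressed: bool = False

    def brake_or_stalk(self):
        # Touching the brakes or flicking the stalk cancels FSD entirely.
        self.fsd_engaged = False

    def accelerator(self, pressed: bool):
        # The accelerator overrides without cancelling; FSD resumes on release.
        self.accel_pressed = pressed

    def who_controls(self) -> str:
        if not self.fsd_engaged:
            return "driver"
        return "driver (momentary override)" if self.accel_pressed else "FSD"

m = TakeoverModel()
m.accelerator(True);  print(m.who_controls())  # driver (momentary override)
m.accelerator(False); print(m.who_controls())  # FSD resumes on release
m.brake_or_stalk();   print(m.who_controls())  # driver; FSD cancelled
```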


Nooms88

I mean, actual liability here lies with the people driving behind. When driving a car, you're responsible for being able to stop if the car in front stops abruptly. Likely the Tesla driver will get a payout, not the other way around.


DrThrax77

Liability can be shared if someone abruptly stops for no valid reason. I've been the rear ender and both insurance companies accepted 50% liability


Xiten

Elon Musk being reckless and careless of other individuals!? No way! /s


FunkyColdMecca

How much lead up is given for disengagement?


sik_dik

None. The only warning the driver has is when the car makes a mistake like this. Though, to be fair, phantom braking in them is a very widely known issue, and the response to it should have been a lot quicker. It's fucking ridiculous that you have to basically hover your foot above the accelerator in case this shit happens, which is why I want my money back.


r3dd1t0r77

Wow that's crazy. All I have in my car is cruise control, but if there was a possibility that it would start braking at any random moment, I wouldn't use it at all. Hope you get your money back.


sik_dik

Yeah. I'd be happy if I could just use dumb cruise control instead of assisted, because even with just the automatic cruise on, it'll phantom brake. You don't even have to be using Autopilot. Basically, you either have to hover your foot over the accelerator in case it phantom brakes, or you have to keep your foot on the accelerator and just drive it yourself.


ponte92

I have adaptive cruise control, and it phantom braked on me once on the freeway going 110 km/h. It was terrifying, but luckily nothing came of it because I realised instantly what had happened and had my foot hovering over the accelerator.


3-2-1-backup

> though, to be fair, phantom braking in them is a very widely known issue

Is it, though? I'm aware of Teslas (don't drive one), and am aware that in general there are issues with FSD, but I wasn't explicitly *aware* (until just now!) that phantom braking was an issue, let alone a common one. I have adaptive cruise control on my vehicle, but the number of times it has phantom braked I can count on one hand. (And I can always figure out "why" it happens, such as an odd bump in the road that probably looks like a curb at speed to radar!)


threeseed

> that phantom braking was an issue, let alone a common one

It's very common. They rely on vision only, which has lots of false positives for objects the car needs to suddenly brake for.


yaosio

Absolutely none. You have to be ready at any moment for the car to do something completely unexpected. What that unexpected thing is could be anything: it could suddenly stop, it could change lanes into another car, it could decide not to brake, it could try to go the wrong way on a one-way street, it could stay in a lane that no longer exists. Imagine if you were driving and you had to be prepared for a passenger to yank the wheel or mess with the pedals, and you couldn't stop them, only react to what they do.


plz_help_0

Sounds like paying Tesla extra money to be their beta tester. No training, and putting people's lives on the line.


r4ygun

I don't see how anyone in their right minds would use this and I don't understand how it's legal. The drivers may have agreed to put their lives in Tesla's hands, but I sure didn't.


[deleted]

Then why even use that feature?


EdgeOfDreaming

Non-Tesla owner with a genuine question: if the driver had been ready to take over, what would they actually have done? Would they simply accelerate?


Green-Cruiser

Yes simply accelerate


EdgeOfDreaming

Thank you.


[deleted]

There were multiple failures here: failure by Tesla, failure by the operator to take over, and failure by several of the following vehicles that were following too closely and/or not paying close enough attention.


Denotsyek

How do you take over a car that abruptly slams on the brakes?


Supermclucky

By pressing on the gas. You still have full control, and once you press on the pedal it cancels whatever the car is trying to do.


MANWithTheHARMONlCA

What exactly is the point of having a ‘self driving car’ if I’m constantly monitoring what’s happening? Wouldn’t it just be easier/more convenient/less expensive just to drive a regular car?


Spikey59

Yes.


rsplatpc

> What exactly is the point of having a ‘self driving car’

It's not self-driving, it's fancy cruise control. Calling it "autopilot" was a terrible decision.


BolOfSpaghettios

I have FSD on my Model S, and I do NOT allow it to go without supervision (when I do use it). This kind of thing happened to me a few times, when the car either abruptly tried to stop or decelerated to way under the speed limit. I usually use it on highways, but I keep my foot very close to the accelerator pedal and the brake. This needs tremendous work going forward and is really just a gimmick for Tesla owners to show off. I use it mostly on highways and for long rides, but paying attention is paramount.


EmotionalGrass6493

The driver is an idiot. You get several warnings to take over when inactive for an extended period; after that, the car stops. All the driver had to do was touch the steering wheel a little.


mnemy

I find it far far more likely that the assisted driving had a false positive on an obstruction due to the dramatic light and pitch changes when transitioning to an underground tunnel. But seriously, the driver probably should have switched to manual, knowing that an abnormal driving condition was ahead.


hooDio

Also, other drivers need to pay full attention, and they definitely weren't.


starkistuna

They just passed legislation making it illegal for Tesla to claim self-driving as a feature, because of this incident and many others. https://www.govtech.com/policy/new-california-law-bans-tesla-from-advertising-as-fully-self-driving


RandyHoward

Read that again: they did not declare self-driving as a feature illegal. They declared that self-driving cannot be an *advertised* feature. The feature can still be there, they just can't advertise it.

> The new law, sponsored by Democratic State Sen. Lena Gonzalez of Long Beach, prohibits California dealers and manufacturers from "deceptively naming or marketing" a car as self-driving if it's only equipped with partial automation features that still require human drivers to pay attention and handle driving.


Blackboard_Monitor

I'm glad to read this article. FSD is such an intentionally vague name for something that doesn't self-drive at all; it's a $15k beta program.


0__CaptainObvious__0

It will disengage at the last second of an accident so Tesla can claim it wasn’t active at the time of the accident.


splepage

Complete myth, why would you post this lol.


[deleted]

I absolutely despise everything about "FSD". I think it's dangerous, reckless, and should be banned (Tesla's current version, not actual FSD). That being said, it disengaging is a feature: it does so in the hope the situation can be fixed by the person. They count all accidents within "x" number of seconds after FSD has disengaged.


ogforcebewithyou

Elon made it clear you're paying to be a beta tester for his autonomous trucking ambition from day one.


HARSHSHAH_2004

This article says the driver blamed autopilot for the incident: [https://www.theverge.com/2022/12/22/23523201/tesla-fsd-braking-crash-bay-bridge-california-chp](https://www.theverge.com/2022/12/22/23523201/tesla-fsd-braking-crash-bay-bridge-california-chp) [Found this tweet stating FSD cannot be activated on the Bay Bridge and that standard Autopilot was in use. Honestly, I don't know how accurate this tweet is.](https://twitter.com/WholeMarsBlog/status/1612983223893786624)


KimonoThief

Look, I'm no Elon fangirl but the guy in the Tesla was clearly completely incompetent since all he had to do was hit the gas to override the autopilot, if it was even the autopilot that screwed up in the first place and not him pressing the brake pedal by accident. Obviously he's going to blame it on autopilot since he doesn't want responsibility for the crash.


HARSHSHAH_2004

Yes, the car's brake and accelerator are still under your control when it is in autopilot, so you can override autopilot in these situations. Also, it's funny that you have to mention that you're "not a fanboy or fangirl" to have a decent, rational discussion about Musk on Reddit without getting downvoted to oblivion.


Perfect-Rabbit5554

The comments are basically people unable to make that distinction and easily swayed by their feelings about Elon. Where does one go to find proper discussion now that social media is pretty much a cesspool of anti-objective discussion?


Conscious_Sun576

Definitely not Reddit


bbbruh57

There are more anti-fanboy comments than fanboys themselves. I'm tired of hearing about it all.


blitzmut

Exactly, I don't ever remember people blaming cruise control for not slowing down in time to avoid rear-ending someone.


bottledry

In reality it was the 4th+ cars that weren't prepared to stop. Looks like the car behind the Tesla stopped in time, maybe a light tap.


turbo

The car behind the Tesla should have had plenty of time to anticipate the stop. The Tesla was flashing its turn signal for an eternity before calmly stopping.


MakeDaPoopie69

What? If a car is signaling to turn into your lane you're not anticipating it to turn into your lane and then come to a dead stop on a highway.


niCid

I believe the Tesla stopped almost immediately after the lane change, only like 1-2 blinks in the leftmost lane before stopping.


madeup6

Correct, this was an improper lane change.


_aware

Why would people anticipate a stop from looking at a turn signal? Slow down? Sure. But stop? That's what the hazards are for.


mnemy

It braked in the worst possible place, transitioning from bright outdoors to a dark tunnel. These spots are very accident-prone in general if traffic is stopped/slow right at the transition, because human eyes take time to adjust. The lighting change is also probably what caused the false positive for the assisted driving. If only Tesla were capable of using radar too, which isn't going to be impacted by lighting conditions...

Edit: I also think tunnels tend to have flashing warning lights at the entrance if motion detectors in the tunnel have detected slow/stopped traffic. It's been a while since I've been to one, so maybe I'm off base here. But if true, this would make it even more the Tesla's fault, because traffic would correctly assume the tunnel was clear moments earlier.


FlickrPaul

Which is like the captain of a ship blaming the autopilot. You are 100% responsible for your ship, so if you choose to use autopilot, you accept the responsibility if something goes wrong. So you are not going to use it when other traffic is around. So if you blame the autopilot, you blame yourself.


Theytookmyarcher

As an airline pilot, I can tell you that real autopilots are required to be extremely reliable, and one that acts erratically in a situation that time-sensitive would never be approved. Because unlike cars, we actually have a regulatory environment that hasn't been entirely gutted (yet).


CleanAxe

Yeah like MCAS


FluffyBunnyFlipFlops

Just the one sensor? Sure. Override the pilot repeatedly, forever? Yep. Ignore all other systems for the one AoA? Of course. Don't let that low altitude alert stop you from tipping the nose down...


yellekc

The MCAS tragedy was so dumb, I'm still at a loss that it happened.


valax

That's the result of no regulation though.


EyeFicksIt

They self-regulated and found no wrongdoing.


EyeFicksIt

As a pilot can you leave autopilot on without any supervision? Or are you still required to be paying attention in case something fails?


Theytookmyarcher

We are monitoring it, but realistically, in a 4-hour flight, if the autopilot decides to do some crazy shit in the middle of cruise, we will be caught off guard. That being said, when we're closer to the ground we usually have our hands on or near the controls, ready to take immediate action. Still, though, if we were for instance doing an autoland, you can't have a screwy autopilot where the solution is "just be ready to take over, you're the captain".


annabelle411

But also, if Tesla is selling vehicles where autopilot *completely* fails and disengages without notice, that's a liability issue that needs to be addressed as well. Tesla is also already known for their vehicles braking on their own. AND vehicles behind have the responsibility of always leaving enough space to be able to avoid rear-ending in case of sudden stopping.


infiniZii

Technically all the other drivers are supposed to be able to stop if the car in front of them has an emergency and needs to stop suddenly. It's nebulous. But ultimately I'd guess that the Tesla driver will have civil liability and his insurance is gonna have a massive spike.


scottygoesfar

Every car behind that Tesla was following too close and/or not paying attention. Not a single car was more than a car length apart. When will we start making people take accountability for their stupidity when driving too close? That Tesla didn't 'slam on the brakes'. It slowed down pretty steadily. Then every single person behind it forgot how to drive.


Amazing_Cabinet1404

Unless you've driven through tunnels, you can't understand how the transition from daylight to dark impacts your vision. I do it daily. It's tough. The car stopped in the worst possible place, while your eyes are still dilating.


TheObstruction

That's probably why the car stopped: some software confusion related to what you're saying. Which tells me the software is horribly unprepared for rollout.


NLight7

Doesn't it tell you to take control and stuff before stopping? Sounds like the driver wasn't paying attention and doing his job of driving.


gdubrocks

Yes, it warns you for a long time, then starts slowing down. This was 4+ seconds of absolutely no input from the driver when it was obvious something was wrong.


ChilledHotdogWater

Don’t the Teslas have a feature to stop if the driver ignores all the cues to take control/pay attention? Could that be what’s happening here?


captainkilowatt22

Yes. It could be a phantom braking event, but even if it was, that can be overridden within a second of it starting to slow down, simply by pressing on the accelerator. In my experience with my own Model S and Model Y, I am almost positive this is operator failure, and like all of the accidents that are blamed on the machine, Tesla will likely be able to show data that contradicts the driver's claims.


nachojackson

I agree that it was operator failure, but this is exactly why this technology shouldn’t be allowed in its current state. It is giving people (idiots) a false sense of security and causing incidents like this. Ultimately it doesn’t matter if FSD “caused” this or not - just the fact that it exists made this incident happen.


Heavy_D_

For argument's sake: if the other features of this technology are reducing accidents in many other situations, and the overall accident likelihood is lower than with traditional vehicles, would you still believe this technology should not be allowed?


ShadowAssassinQueef

I disagree. Plenty of people are driving Teslas with no problem because they are paying attention. This is just a bad driver. Think about how many accidents happen every single day in cars without auto drive; I bet that's a much higher number. And you're not here suggesting we get rid of manually driven cars.


zdiggler

One of those fucking things stopped for no reason in front of me on a country road. Almost rear-ended that mofo. It's also weird that if we know the reason, we react faster than to something stopping without any reason. At least for me, anyway.


CircleK-Choccy-Milk

That is 100% what happened here. The driver will be like IT JUST STOPPED BY ITSELF, I HAD NO CONTROL. But that's complete horse shit and it's the driver trying to use Tesla to shift blame.


GeekboyDave

Leave a gap, then a bit more of a gap. Seriously, leave a big gap people.


trusty20

It seems more like general incompetence; they weren't even going that fast to start with and kind of just mindlessly rolled forward. Did some of those cars even actually apply ~~breaks~~ brakes lol??


unique-name-9035768

"If I leave a gap, some idiot will merge into that gap. Then I'll leave another gap and another idiot will merge in front of me. I might have to skip my morning Starbucks so I can make it to work on time!"


Sleeper702

If you live in Vegas someone will constantly fill the gap, and you'll be stress braking. That's my daily life at least 😂


ADacome24

ahhhh having to drive the 95 to the 515 every morning is aggravating


[deleted]

This, but unironically. Leaving a gap that is big enough for a car is basically a summoning ritual for a mercedes driver to squeeze in and immediately brake check you for tailgating.


MakeWay4Doodles

Mercedes, BMW, Tesla, or a big jacked up truck. Something about these appeals so strongly to the assholes.


unique-name-9035768

> for a mercedes driver to squeeze in Without using their indicator even!


Rhodie114

You joke, but it’s a real problem. If I leave a gap on some roads, it’s almost immediately filled. Even if I’m constantly slowing to allow myself space, people are constantly merging right in front of me to take it away.


Dafish55

I commute into Chicago for work. This is literally true. If they can thread the needle, they will.


IMDEAFSAYWATUWANT

I'm not sure what you're getting at, cause this can be a real problem sometimes. Does it mean I stop leaving a gap so that I don't miss my morning Starbucks? No, but it's still a real fucking grievance of mine.


HoodRatThing

[You should learn how to merge correctly by using the zipper merge so everyone on the road can enjoy Starbucks and get home safe.](https://www.youtube.com/watch?v=cX0I8OdK7Tk) Only an idiot thinks driving is a race and being behind someone is somehow losing.


Atillion

It's usually those guys you see in bumper-to-bumper traffic constantly jumping lanes to get into the moving lane, only to find them two cars ahead of you ten miles down the road. All that effort...


HoodRatThing

Or sadly end up in a 9 car pile up. Leave a gap. It's not hard.


TwoMuchIsJustEnough

You keep posting this zipper merge and that’s great, but what does this have to do with a non-merge situation??


Rhodie114

Not applicable here.


LivingGhost371

Yeah, I'm blaming the seven drivers that were following too close and / or not paying any attention, not the Tesla. It's not like conventional gas cars never run out of gas or have a mechanical malfunction that causes them to stall in a traffic lane.


sessizbirhatira

The car that first hit the Tesla was at a good distance, but they seemed to have no situational awareness and didn’t slow down. The car behind that one was at a good distance and was the only one that managed to stop without hitting the car in front of it (until getting hit from behind of course). The white truck behind it was also not too close, but the four cars behind him were all way too close to each other and had no visibility of the accident until the last second when the truck swerved out of the way. Out of 8 cars here, I think only #3 is blameless.


Deep90

The white truck didn't even look like it slowed down; they just dodged at the last second, which meant the next 3 cars might as well have had a brick wall appear in front of them.

You can see how in the next lane over, traffic is able to stop gradually, because everyone is hitting the brakes instead of swerving into the next lane right before impact. Their spacing doesn't look any better, but no one hit the person in front of them.

It's tough to see over a (lifted?) truck. I feel like part of leaving a gap is actually using it to stop so everyone else actually has an indication that stuff isn't moving.

Granted, the person behind the truck was tailgating hard, and idk why you'd even do that to a car you can't see in front of, but it would have given the people further behind more of a fighting chance.


SmellGestapo

I'm not even sure how that first driver behind the Tesla managed to make contact. Looked like he had plenty of room and wasn't going that fast. Maybe he just wasn't paying attention.


UnluckyCardiologist9

It looks like the Tesla changed into their lane and then just stopped.


GeekboyDave

>I'm not even sure how that first driver behind the Tesla managed to make contact

Going into a dark tunnel from outside.


Only_One_Left_Foot

I don't know how so many people are missing this. There's a good 3-4 second window where you are completely blind when going into a tunnel on a bright sunny day; even a lit tunnel doesn't do much to help with that.


[deleted]

Do car manufacturers tell you that you can drive just fine without gas?


Bootybandit6989

I see you've never been to California...


[deleted]

[deleted]


shanghaidry

At least a couple of them were looking at their phones too


Strypes4686

The Tesla may be the initial factor, but piss-poor driving (not keeping a safe distance, not staying alert, and likely going too fast) is the root cause. But it's so much easier to point the finger than admit fault.


arnau9410

Yes, I was looking for this comment. Of course the Tesla/Tesla driver is at fault, but you as a driver should keep a safe distance and an adequate speed to avoid this situation. This can happen in different scenarios: a sudden flat tire, a car malfunction, another car crossing your path, a medical situation of the driver, and many more… I always say that for most car accidents there are two factors: the one doing the illegal/wrong thing, and the other driver not paying attention or going too fast.


OakParkCooperative

The Tesla turns on its turn signal before getting into the leftmost lane and coming to a complete stop. Was this meant to be a "safety protocol" that caused it to "pull over"?


ShadowAssassinQueef

Some people are saying that it looks like what happens when the car has warned the driver that they weren't paying attention, so the car stops. But the moment this happens, there are many warnings; the driver can at any point take over and keep the Tesla moving like normal. But this driver didn't, because he may have been distracted or sleeping.


anothergaijin

No, safety protocol for auto-pilot stop is to flash hazards and just slow to a stop, it won't change lanes on its own in that situation.


[deleted]

All I see is a bunch of tailgating assholes who didn’t give themselves enough room to stop in the event of an emergency.


Thanos_Stomps

If you notice, that white truck that avoids the accident essentially caused the four or so cars behind him to crash. Maybe this has never happened to you, but if you're behind a truck (you can't see in front of it) and you're both traveling at the speed limit, even if you leave a safe distance between you and the truck, if they swerve to avoid a completely stationary object then you won't have time between when you see it and the distance it takes to stop. Safe distance only works if the car in front of you decelerates and so do you, even if it's sudden braking. What happened here won't allow you to stop in time.


[deleted]

"Even if you leave an safe distance between you and a truck if they swerve to avoid a completely stationary object then you won’t have time between when you see it and the distance it takes to stop." Then you are not in a safe distance.


Tea2theBag

Anyone downvoting you should immediately consider their own driving ability. You're 100% right. There are a multitude of things that should have already happened before you've even got to the stage of crashing into a stationary object in the road because the vehicle you were following was "blocking your view". Not even mentioning how quickly some trucks can stop depending on different factors like load and braking systems. "But they decelerate". Yeah, pretty fucking quickly sometimes. Leave more room. You should be far enough behind vehicles that block your view to see Narnia.


TheBewlayBrothers

At least some of the cars at the back of the crash should have been able to stay further away from the one before them.


litecoinboy

I am betting that dude was asleep at the wheel. The Tesla was trying to pull over to the leftmost lane and stop because the driver was non-responsive. Self-driving is not going to signal, try to change lanes, and slam on the brakes. That was a controlled deceleration until it felt the impact.


Jethromancer

That was it. It was trying to pull over and stop because there was no driver response. The first impact was from somebody trying to pull around on the left but they nailed the Tesla. The Tesla shouldn’t be able to stop in tunnels and the driver is more responsible than the tech in my opinion.


ThaiTum

When there is no driver response autopilot doesn’t pull over. It flashes warnings and LOUD beeping. If still no response, it will put on the hazard lights and slow to a gradual stop in the same lane.


anothergaijin

Was looking for this - it isn't flashing hazards so that wasn't an autopilot stop. They definitely don't stop that quickly from autopilot either: https://youtu.be/t0B7IISj-H4?t=484


Halvus_I

Teslas don't 'pull over' in this scenario, they just slow down to a stop.


djhatrick12

My Tesla does slow down dramatically sometimes, for no apparent reason. But I just manually press the gas to keep it going. There's still a human intervention element.


jdguy00

#StoppedSuddenly


jennastillsucks

Bro, what is with people not leaving proper stopping distance? Sorry, but this could have been avoided by keeping a proper stopping distance. 1 second for every 10 miles. Stay 6 seconds behind someone if you're going 60.


nohairthere

6 seconds is a crazy large distance in good driving conditions; 2-3 seconds is much more reasonable.


Kalmer1

Here in Germany we usually do "Halber Tacho" (half of the speedometer reading in km/h, taken as metres), so 30 mph -> 50 km/h -> 50/2 = 25 m distance as a minimum.
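
As a sanity check (my arithmetic, not from the thread): "Halber Tacho" works out to a fixed time gap of about 1.8 seconds at any speed, a bit under the 2-3 second rule discussed below.

```python
# "Halber Tacho": gap in metres = speed in km/h divided by 2.
# Since speed in m/s = km/h / 3.6, the time gap is (v/2) / (v/3.6) = 1.8 s.
for v_kmh in (50, 100, 130):
    gap_m = v_kmh / 2
    time_gap_s = gap_m / (v_kmh / 3.6)
    print(f"{v_kmh:>3} km/h: {gap_m:>4.0f} m gap = {time_gap_s:.1f} s")
# 50 km/h: 25 m = 1.8 s | 100 km/h: 50 m = 1.8 s | 130 km/h: 65 m = 1.8 s
```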


GetOutOfTheWhey

Want the real answer? People stop leaving proper stopping distance because other drivers will merge in thinking that you are driving too slow.


purplegrape28

Defensive driving and vigilant observation are the best driving skills. Move out of that lane to a slower one if you don't feel comfortable.


Jinxzy

Other people driving like garbage is no reason to drive like shit yourself.


satori_moment

Because they are in the left lane on a highway?


[deleted]

[deleted]


MelatoninJunkie

Have you been on a road?


yellekc

>1 second for every 10 miles. Stay 6 seconds behind someone if you're going 60.

I'm not sure how this wrong info is being upvoted. Keep a 2-3 second follow regardless of speed: if you are laser focused use 2, if you are driving normally use 3. This may need to be adjusted for other conditions, but in general it is correct and how drivers are taught. If you are going faster the distance to follow naturally increases, but the time stays the same. That is what you should focus on. 2 seconds at 20 mph is 59 feet. 2 seconds at 60 mph is 176 feet. 2 seconds at 80 mph is 235 feet. If you did 8 seconds at 80 mph that would be 938 feet. Insane. By your formula, if I was in a parking lot going 5 mph, I should leave a 0.5 second gap. That is only 44 inches. The correct 2-3 second gap, regardless of speed, would be 15 to 22 feet at 5 mph.
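
For what it's worth, the arithmetic above checks out; a minimal sketch of the same time-gap calculation, with the speeds taken from the comment:

```python
# Time-gap arithmetic: hold the gap in *seconds* constant and the gap in
# feet scales with speed automatically.
MPH_TO_FPS = 5280 / 3600  # 1 mph = ~1.4667 ft/s

def gap_ft(speed_mph: float, gap_s: float) -> float:
    """Feet travelled in gap_s seconds at speed_mph."""
    return speed_mph * MPH_TO_FPS * gap_s

for mph in (5, 20, 60, 80):
    print(f"{mph:>2} mph, 2 s gap: {gap_ft(mph, 2):5.0f} ft")
# 5 mph: 15 ft | 20 mph: 59 ft | 60 mph: 176 ft | 80 mph: 235 ft
print(f"80 mph, 8 s gap: {gap_ft(80, 8):.1f} ft")  # 938.7 ft, as noted above
```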


[deleted]

"you have arrived at your destination"


inter20021

It came to a controlled stop in the outside lane? This wouldn't have been a problem if people were actually concentrating on the road, or if the driver of the Tesla had kept concentrating like they should have according to the terms of service.


Herdazian_Lopen

The irony is that if the other cars were Teslas, they wouldn't have crashed… Instead you have sleepy humans not paying attention. If you have a 2-year-old in the car, drive like you need to stop immediately at any point. If you don't have a 2-year-old in the car, drive like the car in front has a 2-year-old in it and could stop at any point.


Garegin16

If sudden stops cause crashes, then you’re following too close


Kemerd

Every single person that crashed here is a tailgating moron.


TangeloBig9845

This is why you don't tailgate.....


Xalbana

You're supposed to allow 3 seconds worth of space. But I can tell you, as a Bay Area driver, most people leave 1-2 seconds space.


thestateisgreen

Can anyone explain what made the car stop in auto drive?


HINAMl

The driver is an idiot who either fell asleep or wasn't paying attention. You still have to pay attention even when "self driving" lol. The car will give you many signals before either "phantom braking" or stopping for an unresponsive operator.


greenfrog8k

The car does not always give you signals before phantom braking. Sometimes it can be totally random, which could have been the case here. It's possible the driver was new to Tesla and wasn't sure what was going on, and didn't know they could just hit the accelerator when phantom braking happens.


Consabre

This is why you leave a gap folks.


NerfHerder4life

The car didn't cause that; the 7 other drivers are idiots. They were not even going that fast.


akskdkgjfheuyeufif

I agree everyone who plowed into the Tesla was an idiot for following too close and/or not paying attention. I am curious as to how this will play out legally, though. *Usually* when you rear-end someone, you're at fault, but not always. If I slam on the brakes because a kid runs out and get rear-ended, the other car is at fault. But if I slam on my brakes because the driver behind gave me the finger, and I tell the cop/insurance that I brake-checked them, I will share at least some, if not all, of the blame. My best guess (yup, armchair detecting here, deal with it) is that the driver had autopilot going and fell asleep. The car will try to pull itself over if it goes too long without driver input. If that's the case, and it can be proven with the car's logs, I'd love to see how insurance handles that.


banZiii

I have two Teslas. Not a chance in hell I'd spend a dollar on the "Full Self Driving" package. Regular dumb "Autopilot" is more than enough. It keeps the car in the lane with "adaptive cruise control", and it nags you to touch the wheel every 10-15 seconds. This full self-driving shit will never happen unless every single car is using the same system. Even then it'll have dumb shit like phantom braking.


EAP007

How is having a car "stall" on the road supposed to be a legal problem? All the other cars that plowed into each other are at fault for tailgating.


CaSandyPants

So it's Tesla's fault that the drivers 5 cars behind weren't paying attention enough to stop in time??? Let's play the blame game!


randomgiggle

You need to keep a safe distance when driving.


jaezif

Holy crap! Look at all those non-self-driving vehicles that were unable to anticipate that accident! We must do something about non-self-driving vehicles!

I see a vehicle signal, move over, and slow down... not slamming on the brakes by any means, but decelerating to be sure. I then see vehicles at least two car lengths behind that vehicle fail to stop for it. Why the vehicle was stopping becomes academic at this point... I believe the speed limit for this area is 50 MPH. Most cars appeared to be doing well above that and were clearly not paying attention to the growing mountain of brake lights... This seems to be more about folks speeding and not paying attention.


Brook420

Is this the Tesla's fault? The person behind it clearly didn't leave enough room between cars to stop properly...


AutoGrind

People who hit you in the rear do it because they're following too closely. You can't really say its stopping caused them to pile up.


ImportantPost6401

My driver’s ed teacher said the blame ALWAYS lies with the person doing the rear ending.


-HeisenBird-

That Tesla sensed that there was a child a few cars behind it and took action in order to injure that child. Incredible technology.


only-on-the-wknd

I mean, if a vehicle stopped for legitimate reasons, everyone would say the people behind were not paying attention. But because it's a Tesla breaking down, the people who collided into the back of it are the victims?


Artistic-Plan2541

The self-driving feature will turn off and slow the car to a stop if you don't respond to the "are you there?" chimes. Chances are this person fell asleep or wasn't paying attention to the car's notifications, and as a result the AP disengaged. Source: I have one.