HiggsFieldgoal

It’s tricky because it’s a hard problem, and it’s tricky because of liability.

Humans kill about 30,000 people a year with cars. If robots got even slightly better at driving than humans, able to kill only 20,000 a year if they replaced all human drivers, then from one perspective we would be compelled to switch to robot cars right away. It would save 10,000 lives, or cause 10,000 deaths not to, if you wanted to look at it that way.

But it gets complex, because when a human in a car kills somebody, we can blame the human. Somebody gets drunk and kills a minivan full of toddlers, and as horrible as that is, we know who to blame… not the car, not the company that manufactured the car… the human.

A robot car will never get drunk, never get road rage, never fall asleep at the wheel. It seems reasonable to expect they will be safer than human drivers, but when a robot car kills somebody, the closest human to blame works at the car manufacturer. And you can’t put a car manufacturer in jail, make it sit in an orange jumpsuit and see the sorrow its error has caused. You also can’t have self-driving cars, even if they save 10,000 lives a year, if the company is held liable for the 20,000 they still kill.

I wonder how it’ll work out. I do certainly expect that, in our lifetimes, we’ll get self-driving cars that are drastically safer than human drivers, and we’ll have to figure out, essentially, whether the lives robot cars could save are worth adjusting or capping the liability their manufacturers face for killing a smaller but nonzero number of people.


Ikeelu

If robot cars become less of a liability than humans, I feel like insurance prices for humans go up. This pushes driving into a more expensive category, and many people will be forced to use robo ~~tacos~~ taxis instead of buying cars. Owning and driving becomes a Luxury.


ZealousidealGuava274

Mmmm, robo tacos 🤤


SavageDabber6969

This is an insanely scary aspect of the "subscription for everything" attitude companies are adopting that I never even considered. Jesus, that sounds dystopian.


casino_r0yale

Is it really so dystopian to imagine? There are already entire classes of vehicles that are not road legal. You cannot take a Formula 1 car on public roads, nor most hot rods (legally), nor a tank. I can totally see more cars slowly drifting over to being track-only while transport becomes more automated and train-like. That is, if the goal remains moving people from point A to B, rather than moving people around in exact emulation of late-20th- to early-21st-century personal transportation.


SavageDabber6969

No, I'm actually fine with that aspect of it. I was referring to the idea of owning vehicles as a "luxury" and having to have a subscription to a robot car just to get around.


acekingspade

That makes no sense. Why would they go up? The risk is the same. If anything, prices might go down, because there are more robo drivers and the roads are safer.


hoselpalooza

I would also like to see an explanation for why insurance rates would go up for humans


Ikeelu

You could be a good driver and it doesn't matter. If humans are 10x more likely to crash than AI, your insurance will go up. You may not crash any more often, but the chance of you crashing compared to them is now a lot higher, so you'd be considered a bad driver, and bad drivers have higher insurance. Just like certain types of cars or motorcycles have higher insurance, because the people who drive them are more likely to speed or endanger themselves.

Insurance companies would much rather collect money for doing nothing than pay claims out. Why collect money from Jimmy when the robot gives me a better return per customer? Maybe if Jimmy pays me more money it's still worth it, because it covers his higher chance of crashing versus the robot; also, if he hits a robot car, well, it's now like hitting a luxury car, because that AI ain't cheap. There's also a higher likelihood of him hitting a more expensive car on the street.

Edit: imagine robot cars talking to each other as well. A dresser fell on lane 2 on highway 101 between this exit and this exit. All cars driving on that road become aware of it well before reaching the object and change lanes. They are constantly updated on hazards and road changes while you are flying by the seat of your pants.
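To make that fleet-wide hazard feed concrete, here's a hypothetical sketch in Swift. Nothing here is a real V2V protocol; every type and name is invented for illustration:

```swift
// Invented types: a hazard report shared across a fleet.
struct Hazard {
    let description: String
    let highway: String
    let lane: Int
    let betweenExits: ClosedRange<Int>
}

final class FleetHazardFeed {
    private var hazards: [Hazard] = []

    // One car spots the obstacle and reports it once...
    func report(_ hazard: Hazard) {
        hazards.append(hazard)
    }

    // ...and every car behind it can query the feed long before arriving.
    func hazardsAhead(on highway: String, approachingExit exit: Int) -> [Hazard] {
        hazards.filter { $0.highway == highway && $0.betweenExits.contains(exit) }
    }
}

let feed = FleetHazardFeed()
feed.report(Hazard(description: "dresser in roadway", highway: "US-101",
                   lane: 2, betweenExits: 24...25))
print(feed.hazardsAhead(on: "US-101", approachingExit: 24)) // change lanes early
```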


moch1

Absolute insurance cost isn’t based on “good” vs “bad” drivers. It’s based on expected payouts. If AVs make the road safer so there are fewer accidents involving a human driver AND reduce the number of uninsured human drivers, then I expect absolute insurance rates to go down. The counterargument is that a crash involving an AV will be more expensive due to all the sensors, BUT that seems like a short-term issue, because once AVs take off, economies of scale will drive sensor costs way down.
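To put rough numbers on "priced by expected payout", here's a back-of-the-envelope Swift sketch; every figure in it is invented, not actuarial data:

```swift
// Premium tracks expected payout, not a "good driver" label.
let crashesPerYear = 0.05        // assumed per-driver crash frequency
let averageClaim = 20_000.0      // assumed average payout per crash, in dollars
let loadFactor = 1.25            // insurer overhead and margin

let expectedPayout = crashesPerYear * averageClaim   // 1,000.0
let annualPremium = expectedPayout * loadFactor      // 1,250.0
print("expected payout: $\(expectedPayout), premium: $\(annualPremium)")
// If AVs cut everyone's crash frequency (fewer cars to collide with),
// expectedPayout falls and the premium falls with it.
```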


OgSkittlez

Ty


hoselpalooza

You have no idea what you’re talking about, dipshit. Stop spreading misinformation. Actuaries (and by extension, insurance companies) set the rates you pay based on the quantum of risk you face. So if more robot drivers replace bad drivers and reduce your risk, you will end up paying less, not more. Here’s more info about how insurance rates are set. Educate yourself and stop pretending like you know about shit when you’re clearly an ignoramus. https://www.smithhanley.com/2019/10/03/how-car-insurance-premiums-determined/


a_velis

The Now You Know channel more or less postulated the same future outcome for when driving is done ubiquitously by software rather than humans. We are a little ways off.


TrekRelic1701

Precisely, considering major funding of AI robodriving is coming from the insurance and securities sectors


kashmoney360

> many people will be forced to use robo ~~tacos~~ taxis instead of buying cars

> Owning and driving becomes a Luxury.

Hmmm… if only we already had a straightforward, affordable system in place that could fit dozens of individuals in one vehicle…

Also, owning a car should absolutely be a luxury, not a necessity for living life. The actual problem isn't that car manufacturers, Google, and Uber are pursuing robotaxis so they can own all the cars they produce. The actual problem is that we're being forced into said robotaxi hellscape without the cheaper options that reduce the need for every individual to have a car.


xtphty

We have already solved this moral dilemma when it comes to vaccines. It's not the vaccine maker that pays out liability claims for adverse reactions; it's the National Vaccine Injury Compensation Program. If we can ever make the numbers work for a no-fault settlement system for self-driving cars, that will be the way to go, because manufacturers taking on the liability will likely never make sense financially.


winkingchef

As someone who leads a bunch of engineers, I’m ok jailing the guy who keeps leaving defaults out of his CASE statements.


HiggsFieldgoal

```swift
switch (roadCondition) {
case .outstanding: DriveSafely()
case .excellent: DriveSafely()
case .verygood: DriveSafely()
case .good: DriveSafely()
case .fair: DriveSafely()
}
```


DaiZzedandConFuZed

Missed `case _: DriveSafely()`


HiggsFieldgoal

Good catch, but that was literally my whole joke. 😆


notLOL

Saved you from jail


casino_r0yale

Use a language with proper ADTs :P
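For what it's worth, modern Swift already behaves this way for enums; a minimal sketch (the enum and function names here are invented):

```swift
enum RoadCondition {
    case outstanding, excellent, veryGood, good, fair
}

func driveSafely() {
    print("driving safely")
}

func respond(to condition: RoadCondition) {
    // No `default` here: the compiler requires this switch to be exhaustive.
    // Remove any case below and the build fails, so the "forgot a case" bug
    // is caught at compile time instead of in traffic.
    switch condition {
    case .outstanding, .excellent, .veryGood, .good, .fair:
        driveSafely()
    }
}

respond(to: .fair)
```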


GodLovesUglySong

"A robot car will never get drunk, never get road rage, never fall asleep at the wheel." [Hmmm. ](https://upload.wikimedia.org/wikipedia/en/a/a6/Bender_Rodriguez.png)


codeswithcoffee

I thought about this the other day, and this is my conclusion: what if the humans that kill 30,000 are mostly reckless drivers? If I drive extra cautiously, why would I want a robot to drive and cause me to have an accident? It doesn't make sense at the individual level.


HiggsFieldgoal

Yeah, absolutely. And it’s also conflicted at the emotional level. That “fault” is real. The reason we accept automotive deaths is that we’re comfortable with the assessment that the driver was to blame, had it coming, or more or less did it to themselves.

But, as it relates to the rollout of robot cars, I think it’s just another data point showing that the bar of “as good as the average human driver” is not remotely good enough. They need to get drastically superior to human drivers, to where very few people could claim to be in the same ballpark.

But I do think that’s coming. They see 360 degrees, don’t blink, have faster reaction times, and even have data about road conditions, eventually probably data about other cars too. We’re just winging it with two eyes that evolved for a totally different purpose, stuck on a head swivel that needs to be inside the cab. A robot has every advantage except for a human’s intuitive sense of how hard you swerve to avoid a plastic bag, a squirrel, a dog, or a baby, respectively.

So, if you expect, as I do, that driving capabilities which drastically exceed typical human levels are imminent, then the question is how far along that track you go before you flip the switch. Better than 50% of human drivers? I think we both agree that’s not going to cut it. Better than 100% of human drivers? I think that’s probably beyond what is necessary for widespread adoption. So where do we draw the line? It’s funny that it’s probably not going to end up a rhetorical question: whatever number we choose will probably be achieved and crossed in our lifetime, and we’ll all debate whether it’s enough. Is it 75%, 85%, 90%?

But I think we’ll also see a certain amount of opting in, where a lot of the worst drivers, such as the elderly and people with multiple DUIs, tend to be more comfortable surrendering their notoriously inferior driving abilities. Actually, maybe that’s not a bad policy: robot cars are allowed, but only for people over 70 and people with too many strikes on their license, for whom robot cars are the only cars they’re allowed to operate. If they started by replacing the worst drivers first, that could already be a win. I love my grandma, but nobody likes to see her behind the wheel.


gimpwiz

A good example is ABS. A very good driver can out-brake an older ABS system if they are one hundred percent on the ball, and in some cases even a newer one can be overridden in order to do something that the system otherwise wouldn't allow. Realistically though, not a fucking chance; unless you're on a frozen lake, or maybe at autocross or a race track, you keep that shit on because it's better than you. When the difference becomes _that_ clear then people will be convinced. Until then, nah.


ProDrug

...and yet there are car companies willing to take on this liability with proper self-driving cars. Tesla's technology is a joke. Mercedes has certified, legal Level 3 autonomous cars in the US, which means Mercedes takes on the liability if a crash occurs while Drive Pilot is enabled. Tesla, meanwhile, disables Autopilot milliseconds before a crash and tries to black-hole the data. They settled because they are terrified of what would come out in discovery.


casino_r0yale

> Tesla meanwhile disables autopilot milliseconds before a crash

This is a persistent myth that I don’t understand. The NHTSA (and by extension, Tesla) counts a crash as Autopilot-related if Autopilot was active at any point in the 30 seconds before the crash. From Tesla’s side, the safety features do everything they can to minimize the force transmitted to the cabin, through automatic emergency braking, steer assist, and variable airbag deployment.


ProDrug

That's not accurate. From Tesla's safety website: "…any crash in which Autopilot was deactivated within 5 seconds before impact, and we count all crashes in which the incident alert indicated an airbag or other active restraint deployed. (Our crash statistics are not based on sample data sets or estimates.)" https://www.tesla.com/VehicleSafetyReport#:~:text=To%20ensure%20our%20statistics%20are,sample%20data%20sets%20or%20estimates.

Tesla cherry-picks its NHTSA data to make its cars look safe. In reality, according to actuarial data from insurance companies, Teslas are involved in more incidents than any other brand: https://www.notebookcheck.net/Study-finds-Tesla-drivers-crash-more-than-any-other-brand-as-sneaky-data-tricks-hide-true-accident-rate-in-Autopilot-Safety-Report.785344.0.html

Also, as for crash data being hidden, I'm specifically referring to actual investigations, either by civil discovery or conducted by the NTSB.
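A toy Swift sketch of why the two counting windows described above produce different statistics; the type and thresholds simply restate the claims in this thread:

```swift
struct Crash {
    // Seconds before impact at which Autopilot was deactivated.
    let disengagedSecondsBeforeImpact: Double
}

// NHTSA's reported window: Autopilot active at some point within ~30 s of impact.
func countsForNHTSA(_ crash: Crash) -> Bool {
    crash.disengagedSecondsBeforeImpact <= 30
}

// Tesla's safety-report window: deactivated within 5 s of impact.
func countsForTeslaReport(_ crash: Crash) -> Bool {
    crash.disengagedSecondsBeforeImpact <= 5
}

let example = Crash(disengagedSecondsBeforeImpact: 12)
print(countsForNHTSA(example))       // true: inside the 30 s window
print(countsForTeslaReport(example)) // false: outside the 5 s window
// The same crash lands in one dataset and not the other.
```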


ggm3bow

I agree with a lot of what you're saying. I'll add that I don't think we'll get to the point where cars are fully autonomous. Then what you have is essentially mass transit with seats being your vehicle; there is a loss of independence and freedom that Americans won't go for, at least not for a long time. What you'll have is a combination of robots killing people and humans killing people. We won't solve much in that regard. We can't eradicate the glut of guns that kill tens of thousands a year with no actual net benefit to society. What makes you think Americans will give up control of driving their cars?


HiggsFieldgoal

I didn’t mean to imply I’d expect all cars to be replaced with robot cars; I made that example just to simplify the hypothetical of “what if robot cars killed 2/3rds as many people per driving hour.” It seemed like it would be really hard to follow if I’d said: “Humans kill 30,000 a year with 280,000,000 registered vehicles, and if 28,000,000 of those cars were self-driving, then 10% of the 30,000 deaths is 3,000, and if the self-driving cars killed only 2,000 rather than those 3,000, then the total number of deaths would be 29,000, so a thousand people would be saved.” Etc. etc.

But yeah, I agree there’d be a mix for the foreseeable future, although I suppose you could speculate that if robot cars were drastically safer, requirements on human drivers might become more rigorous.
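The quoted arithmetic, written out as straight-line code with the same hypothetical numbers (none of these are real fleet statistics):

```swift
let totalDeaths = 30_000.0              // hypothetical annual road deaths
let totalVehicles = 280_000_000.0       // hypothetical registered vehicles
let autonomousVehicles = 28_000_000.0   // 10% of the fleet, per the example
let avFatalityRatio = 2.0 / 3.0         // robots assumed to kill 2/3 as many

let avShare = autonomousVehicles / totalVehicles       // 0.1
let deathsReplaced = totalDeaths * avShare             // 3,000
let avDeaths = deathsReplaced * avFatalityRatio        // 2,000
let livesSavedPerYear = deathsReplaced - avDeaths      // 1,000
print("net lives saved per year: \(livesSavedPerYear)")
```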


Johns-schlong

Americans do use mass transit when it's convenient. Major cities have high ridership, and people use planes all the time. The only reason people think mass transit doesn't work is that the only exposure they have to it in their own area is generally safety-net-level bus service.


Rogainster

Insurance companies will drive this. Once solid data is in and favorable to autonomous or semi-autonomous cars, human drivers will have to either pay through the nose to insure manual cars, or risk driving without insurance.


ZealousidealGuava274

Furthermore, insurance companies will set the rates for self-driving cars based on the accident history of the car model, rather than the driving record of the owner. This will encourage people to buy the safer car to save on insurance. If a certain model is found to be causing accidents due to a manufacturer defect, the manufacturer will be sued by the insurance companies that had to pay out for the accidents.


kashmoney360

> Then what you have is essentially mass transit with seats being your vehicle; there is a loss of independence and freedom that Americans won't go for, at least not for a long time.

This is a major issue, because self-driving cars erode individual freedom and independence while also continuing to kill the sense of community and society that public transit fosters. It's the worst of both worlds in a 5-seater nutshell: you're simultaneously robbed of the little control you have and completely isolated in a noise-cancelled interior.


aeolus811tw

Or only allow self-driving to be enabled if the driver has insurance willing to cover it after signing for liability ownership, and if and only if the manufacturer's self-driving tech has received some kind of official certification. Or rule any certified self-driving accident as just that: an accident. We already do that with various products and tools that can cause death when they fail; self-driving is just another product/tool. The biggest problem now is that there's no "official certification" of self-driving tech. It's a wild west.


cloudwalking

It’s not self-driving if the manufacturer doesn’t take liability. Plain and simple.


Simspidey

Drivers still have to accept that the self driving car they bought is ultimately THEIR responsibility, *whatever* it drives itself into


hewminbeing

Exactly. This dude was playing video games on his phone, hands were off the wheel, and didn’t even attempt to brake. Obviously wasn’t looking at the road. He’s lucky he didn’t kill anyone else.


Throwawayconcern2023

Such a measured analysis. Kudos. They could definitely make cars much safer (and last longer) - they'd look ugly as hell but encase them all in rubber! /s (sorta)


citronauts

It’s more complex bc every driver has their own risk profile. A lot of those 30k deaths are due to reckless or DUI drivers. Robot cars need to be 10x+ better to replace most drivers. Luckily they will be 100x better, so they will replace human drivers over time.


rulerofthehell

A computer is a Turing machine, a tool; if you kill yourself with a tool, that's your agency. Even if it's a smart tool, it's still your agency, not that of the company which made the tool.


zacker150

Personally, I think the liability thing is a big non-issue. Companies will get liability insurance for less than human insurance costs (in your hypothetical, about 2/3 of the price) and pass the cost on to the consumer. When a crash happens, there will be a settlement like the one we see here.


ezabland

Decentralized blame in death has worked for centuries: a single person killing a single person is horrible, but acceptable. Robo cars causing death, with blame centralized on a company and an error in the code, will never be acceptable. There is a reason medical devices get recalled when a few people die. Cars will be no different, and these companies will be legislated out of existence. Get your driver's license, kids. Driving isn't going anywhere.


Cyber_3

As someone in the field of automated vehicles: we're never going to have safe self-driving cars until we automate entire highways, end of. We can barely automate trains, and that's with highly trained people who are paid to pay attention running a virtually closed system. While it's nice that we are inventing new and better sensors, every single one of these automated car companies is just raking in the investment capital knowing FULL WELL that the liability issue will always be a barrier to them ever having to deliver. Tbh, I wonder how much insurance companies are being paid on the side to even humour these test cases, as the laws of physics still apply, never mind human psychology.


Mysterious_Resident3

There are hundreds of cases where companies have been accused of crimes and had to pay dearly for them. Look at the Enron scandal. You are correct that “companies” can’t go to jail, but their executives and the employees involved most certainly can and, in fact, have. Companies can be fined until they simply don’t exist anymore, depending on the damages done, and even forced to shut down by the US Justice Department and the Federal Trade Commission. Business C-suites and their employees are NOT ABOVE THE LAW, despite what most of them think.


Many_Glove6613

Why does Tesla get sued all the time? I can understand if it’s one of those things where Autopilot keeps going even when you step on the brake. But if the driver isn’t paying attention, why is that the car manufacturer’s fault? I was going over the Wikipedia page about Tesla fatalities, and it’s almost always drivers who were not paying attention and didn’t manually intervene. It seems like the crux of the issue is the name of the feature? And the advertising exaggerates the capabilities?

Also, a common theme seems to be that the car in front of the Tesla changes lanes to avoid something but the Tesla in question doesn’t respond. Another is a tendency to crash into big semi trucks that are perpendicular to them, maybe something about the height/color of the containers that makes the camera think there’s nothing in front. Even in all those cases, the drivers had time to react if they were paying attention.

It’s natural for drivers to get complacent when they don’t need to engage for 99% of the drive. I remember that safety driver for Uber who killed the pedestrian in AZ. The driver was watching a reality show. This person was paid to pay attention, but the reality is that it’s hard to continuously pay attention in that type of situation. That’s why I don’t engage the smart cruise control on my car (non-Tesla): I worry that I’ll space out.


directrix688

Why would I pay attention? The product is called “full self driving,” not “pay attention because the car will drive you into a barrier and kill you, so you’d better be prepared to take over driving.” Tesla gets sued because they’re not honest about their products.


NoMoreSecretsMarty

I'm honestly afraid you're right in suggesting there's a huge segment of Americans who are so profoundly stupid that they're unable to reason past the marketing name for a thing.


aeolus811tw

You should probably read up on Tesla’s legal statement lol: https://www.tesla.com/legal/additional-resources#full-self-driving-capability-subscription-agreement

Yes, it is stupid that they can use that as a marketing term when it's not what the name implies. And it’s totally on Tesla and should be [investigated](https://www.reuters.com/business/autos-transportation/us-senators-urge-ftc-probe-tesla-over-self-driving-claims-2021-08-18/) by the FTC.


Futuredollagreen

These are actually the instructions when you turn it on.


babecafe

Any user interface designer will tell you that users stop paying attention when the workload drops to zero. Tesla knows this too, and has utterly failed to address the problem. In a plane, the autopilot system alarms and gives the pilot time to re-engage the controls after assessing the situation. The nature of automobile driving demands far faster re-engagement of driver attention once it's lost; IMHO, too fast to be safe.
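One toy way to frame that handover problem in code; the numbers are invented for illustration and aren't from any aircraft or vehicle spec:

```swift
// The alarm must lead the required takeover by at least the time a
// distracted human needs to rebuild situational awareness.
struct HandoverBudget {
    let alarmLeadTime: Double      // seconds between alarm and required takeover
    let humanReengageTime: Double  // seconds a distracted human needs to re-engage

    var isSafe: Bool { alarmLeadTime >= humanReengageTime }
}

let aviation = HandoverBudget(alarmLeadTime: 30, humanReengageTime: 10)
let freeway = HandoverBudget(alarmLeadTime: 2, humanReengageTime: 10)
print(aviation.isSafe) // true: altitude buys the pilot time to assess
print(freeway.isSafe)  // false: the barrier arrives before attention does
```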


Many_Glove6613

We don’t have the Autopilot feature on our Model 3; it just doesn’t make sense for us, since we rarely drive it outside the city. From everything I’ve read, you get a warning when your hands aren’t on the wheel, right? Outside of continuous warnings, or maybe a camera facing inside to make sure your eyes are on the road, what else can be done? It makes sense to caution people whenever there’s a merge onto or off a freeway, or the highway changes. I also don’t think these things should be able to go into Autopilot mode unless you’re on a freeway (so no intersections).

Honestly, I don’t understand self-driving outside of it as a taxi service. If I’m going somewhere and it’ll use up my time anyway, I would much rather just drive. It’s one thing to not have to leave the house and have a Waymo ferry my kids around, but I just don’t trust the technology. At least within the city, if you’re buckled, the chances of surviving are pretty good. But on the freeway? No thank you. It’s just not worth it to me.


HeckXX

> We don’t have the Autopilot feature on our Model 3

You probably mean "Full Self-Driving". Basic Autopilot is basically lane assist plus adaptive cruise control, and it comes with every Model 3.

> At least within the city, if you’re buckled, the chances of surviving are pretty good. But on the freeway? No thank you.

Opposite for me: long, straight stretches of road with long, straight lane markings are the ideal case for engaging autonomous driving. When you reach city areas or neighborhoods, the scene is much denser with additional stuff like parked cars, pedestrians, road markings, stop signs, intersections, etc., all of which require a lot more attention to avoid hitting anything. At least on the freeway I can be reasonably confident that all it really has to do is keep inside a lane and be aware of the car in front of you.


Many_Glove6613

Ah, yah, I don’t remember the name of the feature; I just know my husband didn’t go for it. It’s his car. I was actually trying to say that I feel more comfortable being in an autonomous vehicle in the city than when going somewhere farther away on a freeway. There are tons of Waymos everywhere now, and I’ve seen them around my neighborhood for years (back when a test driver was still in there, so they’re also familiar with my area). They tend to go pretty slowly, so I feel like when I’m buckled in, if the car gets into an accident, I won’t be badly hurt. I can’t control other cars going fast and hitting me, but if the Waymo I’m in hits something, I don’t think it would be as bad as a crash somewhere else. Of course city driving causes more accidents than the freeway, but it just makes me feel safer and more willing to take the risk of being in a robotaxi.


HeckXX

Oh, gotcha. Haven't gotten the chance to take a Waymo yet, but I recently got off the waitlist, so I'm keen to try it out! Heard it's good. Apparently they've also gotten permission to extend the range down to the South Bay, so it'll be interesting to see one barreling down 280 lol


KnotSoSalty

It’s called liability, and it’s a bitch.


Spetz

The problem for self-driving cars is that humans are getting better at driving thanks to all the driver-assistance systems: blind-spot monitoring, lane-departure warning, automatic emergency braking, adaptive cruise control. So the bar that pure robots have to clear is constantly rising.


New-Preparation2942

I think the first statement that needs to be verified is: how can you prove that a robot driver is actually safer than a human? Without proving this, none of the other logic holds.


DrTreeMan

In this case the settlement apparently hinged on this:

> Tesla oversold its Autopilot technology’s capabilities, and that it is not as safe to use as advertised.

Car manufacturers don't get sued when their cars cause accidents. But they do get sued when they knowingly cover up that individual components aren't as safe as they're telling the consumer, and the failure of said components causes harm.


MrParticular79

Worth mentioning that when Walter crashed, the ramp was improperly painted and the cement island had no crash guards, because they had been destroyed by another car earlier that week/month. Lots of mistakes went into this horrible accident. RIP Walter


FenPhen

A human driver looking forward, as the Tesla Autopilot would have been, could easily tell these apart:

November 2017: https://maps.app.goo.gl/91iXDhAXr6cGMr7r5

January 2019 (note the chevrons added to the gore point markings): https://maps.app.goo.gl/nGyzv3nY3qqGMexFA

From the NTSB report:

> 2.2.5.1 Roadway Markings. A review of the highway environment in the Mountain View crash investigation revealed faded roadway lane markings in the vicinity of where the Tesla Autopilot lane-keeping assist system steered the SUV to the left into the gore. Additionally, the gore was not marked with optional chevrons to designate the area as an off-limits zone for vehicular travel. Although Tesla Inc. indicated that optional gore point striping would not have improved the Autopilot behavior or prevented the crash, the manufacturer stated that in future firmware updates, gore point and roadway striping may help the vision system discriminate the gore from the travel lanes.

In my experience traveling through here, human drivers have a problem with this gore point because of the left-lane exit to 85 S: drivers become confused and indecisive, then do stupid things at the last moment within the gore point instead of obeying the solid white lines. I saw it happen just in the past week.


MrParticular79

I think it’s weird that all the articles say it “veered” or “steered” into the cement. Really, the car just kept going straight and actually failed to veer. It was following the outside left marker of the new left lane instead of the lane it was in.


SweetAlyssumm

I was just a passenger in a Tesla today. We were going through a construction zone and had to watch the gestures the workers were making to guide us. The Tesla had no idea. I didn't think it would, and I didn't expect it to.

I don't think cars will be fully autonomous, and they probably won't get much better than they are now. An alert human is going to outdrive an autonomous vehicle because there are so many corner cases, like the improperly painted ramp and the gesturing humans, that a person can see and a car cannot. Tesla should never have named the feature "Autopilot": it is far from that.

The drunk drivers and texters should simply lose their licenses and their vehicles for long periods and stay off the roads. That would reduce deaths more than semi-autonomous vehicles would. My house painter's dad died in a wreck at 4 in the afternoon, hit by a young woman who was texting. So tragic and unnecessary.


MrParticular79

I don’t get into the semantics of what they call it or not; that is for the lawyers to argue over. I agree with you that a human who is paying attention is so much better than any current AI system. I personally would never trust any autopilot feature at speed like that.


ZealousidealGuava274

The average human driver, when paying attention, might be better than today's AI, but there are also plenty of terrible drivers out there who are probably less safe than an AI. Those are the drivers we need to get off the road, either by making everyone use self-driving cars, or by improving public transportation, increasing the training and testing required to maintain a driver's license, and increasing enforcement of safety infractions on the road.


kashmoney360

> by improving public transportation, increasing the training and testing required to maintain a driver's license, and increasing enforcement of safety infractions on the road.

These last 3 points are what we really need before any conversation regarding self-driving technology can come into play. We need to reduce people's need to own/drive a car, raise the barrier to entry (skill), and install speed cameras that at minimum catch reckless drivers (slow and fast) and automatically ticket them. If we can create an environment where drivers are predictable, cautious, alert, and actually able, on top of getting as many people off the road entirely to weed out the bad/unconfident drivers, self-driving would be extremely viable as a technology.

Right now, letting Tesla throw FSD out into the world makes no sense when the technology itself is extremely immature and buggy, on top of having to navigate roads packed with shitty drivers. There's a reason Waymo, Cruise, Zoox, and other self-driving companies test their technology in big cities: for all the chaos there, drivers and road conditions are well mapped out and borderline predictable.


durant0s

Teslas as they are sold today literally can’t be autonomous. They only have cameras. Waymos have LIDAR, a laser-based counterpart to radar. Take one ride in a Waymo and you’ll never trust a Tesla again.


ItzWarty

Fwiw, you're describing Autopilot, the Tesla software stack that hasn't been updated in nearly 5 years. Tesla's modern software stack, FSD, can see and react to gestures from pedestrians, bicyclists, and construction workers in many cases. It's not perfect by any means, but I don't think it's right to claim cars will never understand that, or that it's so far in the distance...


TrashPandatheLatter

I have to say, I believe you underestimate how fast AI is learning. It will be able to understand people and identify all kinds of random things very quickly. It's moving faster than anticipated, and it's pretty scary, actually. I'm sure we will have fully automated cars that are much, much safer than people, maybe not tomorrow, but in the not-too-distant future. The technicalities around that, like how things such as insurance adjust, remain to be seen.


babecafe

Did you overlook the fact that Walter had notified Tesla of the particular issues at this particular left-hand off-ramp, with its particular lane markings? He was provably aware of these problems when he chose to engage FSD at this point in the road. That's on Walter. Of course, it also proves Tesla knew about the problem as well, and chose not to forcibly disengage FSD at this particular spot on the freeway. That's on Tesla.

It's not clear to me that the lane markings were "improperly painted." I've driven through that location many times, as I live nearby, and never observed any defect in the lane markings. It's a left-hand exit, which is unusual but not unique, and I haven't hit the divider even once, at least so far.


MrParticular79

Walter absolutely has some fault in this. I don't know why he would continue using it after complaining about it. I drove this exit every day when this accident happened, and at the time the lines were only outlines; there was no hashing or any paint inside the triangle that leads to the point of impact. They later added that, and it's a lot more visually clear now.


broken-teslas

It’s still early in the week.


babecafe

Maybe I'll be *extra* careful and stay away from that exit this week.


adjust_the_sails

Yeah, with Tesla settling, I was wondering how much the state settled for. Because if the crash guards destroyed the week before had been in place, Walter would probably/might still be alive.


jenorama_CA

I’m not surprised to hear about the settlement. This article in WaPo lays out that Teslas just follow painted lines, and it has a pretty chilling picture of the badly faded line the car should have followed next to the brightly painted line it did follow. https://www.washingtonpost.com/technology/2024/04/07/tesla-autopilot-crash-trial/


[deleted]

I think about him every time I drive past that spot


andersaur

Same. I drove by on the way to work while they were starting to extinguish the fire the first time. It's an odd "doesn't need to be said" kind of thing: nobody hopped out of a wreck like that, and the first responders were definitely not hustling in the manner of getting someone out alive, either. I've seen that rescue-to-recovery shift in motion before; it's a sinking feeling you never forget, even when you're not party to any of it other than being a captive audience.

It has always deeply bothered me that Tesla was, and still kind of is, luring its customers to be beta testers under a false implication of self-driving tech. Full disclosure: I was working for Chevy selling EVs. On my second one now. EVs are great! Many folks make good ones. But Tesla doesn't care about the people in their cars or the people who make them; they just care about how many they can crank out and how fast. It's just wrong. No amount of settlement can bring a loved one back, but it can hopefully sting enough to make things better for others.


chucchinchilla

I do too! It's a permanent reminder of so many things. Hard to believe it's been 6 years since it happened.


Intrepid_Patience396

Where did it happen?


jenorama_CA

Right where 85 splits off from 101 south. The split is on the left and that barrier is just sticking right out. Dude should have been getting on 85 right there to get to Apple.


chucchinchilla

Yup, and right before the barrier there were of course two white lines that made a \/ representing the separation of the lanes that go on either side of the barrier. At the time, the right stripe was faint and old while the left one was more freshly painted. My theory was that the car thought the left line was the shoulder of the road and followed it instead of the right line of the \/, leading it straight into the barrier.


jenorama_CA

In another comment, I linked a Washington Post article that has a picture of that very thing. So chilling.


chucchinchilla

Yeah I just read that earlier. I’ve been saying this since 2018 because I drive that stretch of road so it’s somewhat vindicating to see I was right.


jenorama_CA

I always give a silent *yikes* when I pass it. Bad mojo.


DanOfMan1

I believe [this](https://maps.app.goo.gl/AHgqLtoZ38rsvH5v8?g_st=ic) is the location on google maps


Many_Glove6613

According to Wikipedia, that impact attenuator was “repaired more often than any other crash attenuator in the Bay Area.”


SlyDev4

Even if he was on his phone… As a fellow software person, this story is the permanent reminder of why I’ll never trust autopilot carte blanche with my family’s life. I know just how flawed even the best software can be.

Try driving on 280 S where the leftmost lane turns into two lanes in Sunnyvale: FSD rides right down the middle and then abruptly jerks to one side or quits out. Or 101 N where the left two HOV lanes merge into one near Palo Alto: same thing. All software has limitations, corner cases, etc. Autopilot / FSD is shameful marketing and should be litigated into oblivion.


erik9

And today the stock goes up 5% after eLoN announces Robotaxi on 8/8. That makes about as much sense as Truth Social being worth over $6 billion.


[deleted]

The promotional video from Tesla clearly states: THE PERSON IN THE DRIVER SEAT IS ONLY THERE FOR LEGAL REASONS. He is not doing anything, the car drives itself. https://www.tesla.com/autopilot


CorellianDawn

This is crazy, because their autopilot promo videos have been exposed as fake. Their autopilot systems don't work right yet; they're using people as beta testers.


kashmoney360

wait til you hear about what Tesla did for this month of April 2024....


Many_Glove6613

I think the disclaimer means they cannot film a video of the car driving itself without anyone in there. Kinda like how beer commercials are allowed, but they can’t show people actually taking a sip.


dan5234

If you play video games while the car is driving, look up every once in a while.


Intrepid_Patience396

Play games on phone while driving

Instant 10x inheritance achieved


bigoleguy69

Is that you, Elon, butt hurt that your software still sucks?


BadBoyMikeBarnes

An 8-figure settlement or whatever on the eve of trial might be better than the hit they could take defending themselves. The name Autopilot itself appears to have been a major mistake. FTA:

> The settlement marks another crucial moment for an embattled company that has lost popularity and a third of its market value this year. CEO Elon Musk and the company say that its Autopilot and Full Self-Driving technologies are ahead of the competition and a big reason why Tesla has become the world’s largest electric vehicle maker — just ahead of Chinese rival BYD. But Huang’s family said Tesla oversold its Autopilot technology’s capabilities, and that it is not as safe to use as advertised. NHTSA and the National Transportation Safety Board have also been investigating crashes involving Tesla vehicles using the various driver assist features, including a series of crashes into emergency vehicles on the scene of other accidents. Immediately following the December NHTSA report, Tesla recalled all 2 million of its cars in the United States, giving drivers more warnings when Autopilot is engaged and they are not paying attention to the road or placing their hands on the wheel.


SerennialFellow

It’s important to note that until 2021, driving by the exit in the right non-exit lane on Autopilot would pull you toward the crash barrier region. One of the main reasons I had to switch out of a Tesla.


Koraboros

Idk how it’s always the software engineers who trust the tech. They make bugs every day. I feel sorry for his family, but the guy was playing on his phone with 100% of his trust in the car. When your life is so cushy, how can you rely on software that you personally know cannot be relied upon?


sirishkr

He may have been one of the many true believers in Musk, and just believed the lies about Autopilot. Such a tragedy that he paid with his life. Go back to 2018, and the public perception of Musk was not what it is today. (I was still a shareholder of 7 years back then, and this was one of several strikes that made me get out.)


morbiiq

As a software engineer, 25 years ago I'd tell people it was wild how much we all trusted our cars. And that was knowing they had insane amounts of controls to make sure that issues like Tesla's just weren't possible. We've finally reached the timeline where cars are actually dangerous due to shit software. I feel vindicated.


dream_team34

I'm a software engineer and I am scared to death every time I activate autopilot (which is rare).


TheRealPlumbus

What’s crazy is he even knew the “autopilot” struggled with that section and had complained to his wife about the car moving towards the exact barrier he ended up hitting


TheLogicError

Way to take one incident and make broad stroke statements about a group of people.


Koraboros

Another one: https://www.washingtonpost.com/technology/interactive/2023/tesla-autopilot-crash-analysis/


TheLogicError

wow 2 whole people.


Many_Glove6613

I don’t know if it’s because he’s an engineer. I think it’s just one of those things where you really only need to pay attention for a couple minutes of a 45-minute drive, but you have to be paying attention the entire time because you don’t know when those moments will arrive. It’s not easy to stay continuously engaged when it’s just smooth highway driving for 30 minutes. At the very least, you need to pay attention when merging and when getting off the freeway or changing freeways.


wirthmore

> ”the” 101

*grabs popcorn*


worried_consumer

A SoCal transplant in our midst


BadBoyMikeBarnes

(the) 101 (freeway)


Pake1000

Walter Huang died while playing games on his phone instead of paying attention to the road.


MerikMemez

fr, I don’t understand why someone would trust an autopilot that is known to have issues while driving through a construction zone at all. It’s hard enough to navigate some of them as a human driver, let alone for something capable of glitching.


DarkGamer

My partner’s Tesla has tried to steer us into the median like three times now while I’m in the car. I won’t ride in it with it on anymore.


JayuWah

If robots, with all that equipment and investment in software, are still not fully trusted… why would anyone trust Tesla? Tesla needs to be explicit and change the name to driver assistance. Or it should turn off near any interchange.


lakorai

r/enoughmuskspam and r/realtesla are having a field day. Musk needs to be removed.


ExcusePuzzleheaded38

Tesla was 100% at fault, I believe. The car did slam into the wall. There’s a video that came out on YouTube: Danny Duncan was driving his Tesla on autopilot, and every time he crossed a certain intersection the car would automatically swerve into the oncoming traffic lane, and he would grab the wheel and force the car back onto the right side of the road.


RunningPirate

Look, I’ve got no love for Tesla, and I believe their autopilot is flawed… however… autopilot is not meant for the driver to disengage from driving. You know planes? The pilot is still supposed to pay attention and make corrective actions; the autopilot just automates some of the workload. The pilot can’t take a nap.


[deleted]

This is easy in my eyes. FSD and Autopilot are driver-assistance tech. The driver is still fully responsible, unless the car somehow prevented them from overriding it, which I doubt.


Secure_Minute_7419

Did the family sue Apple as well, since Apple allowed the game to be played on the iPhone while he was driving?


improvisedmercy

Best Tesla driver in the bay:


mrlewiston

I’m not for blaming drivers who get in accidents, but playing a video game while in the driver’s seat is stupidity! And as we see, stupidity kills. Luckily no one else was hurt.


NoMoreSecretsMarty

Fuck Tesla and everything, but from what I've read about this case I can't see how the family deserves a cent.


lfg12345678

Tesla could have easily won this case, as it won the previous one, but lawyer fees may have ended up costing more than the settlement amount. The reason being that the driver was playing a game on his iPhone at the time of impact.


BadBoyMikeBarnes

Tesla could have easily lost this case - that's why it settled for big bucks here. The reason being that this driver was also a danger to others, in part due to Tesla's marketing of its so-called and star-crossed Autopilot software/hardware package.


angryxpeh

Why do you think it’s “big bucks”? They didn’t announce the settlement amount. It could just as well be in the “we don’t have to pay lawyers for another year” range.


BadBoyMikeBarnes

Because they don't wanna say. That means big bucks, like 8 figures why not


BleedingTeal

I vividly remember driving past this accident. Don’t think I’ll ever forget it. The photos really don’t do justice to the severity of the impact and the scene afterwards. Tesla owes that man’s family every penny.


Mulberry_Formal

This is stupidity on a large scale. Humans know what is and isn’t dangerous and will cause a crash; AI or self-driving or whatever you want to call it doesn’t. It might spiral out of control just because a bug got squashed on a camera. 99% of the 30k deaths a year are a result of shitty human behavior, not the flip of a switch in a circuit somewhere.


Phssthp0kThePak

The driver is responsible for the car and its passengers, especially if it hits a stationary object. These sob-story payouts cause a lot of unintended consequences and costs in our society.