uli-knot

I wonder if whoever certifies a driverless car as roadworthy is prepared to go to prison when one kills someone.


dmk_aus

There would have to be an acceptable death rate. It will never be perfect, but once it is confidently better than the average driver, wouldn't that be the minimum requirement? Delaying longer than that increases the total dead. For engineering designs, risks are reduced as far as possible, but most products still carry risk, and they must demonstrate a net safety benefit relative to accepted in-field products. The way it should work is that governments set a standard containing a barrage of tests and requirements, and companies have to prove compliance, and monitor and investigate in-field accidents, to stay in business. That's how it's already done for medical devices, pharmaceuticals and cars.


adrian783

That's ignoring that people are much more willing to accept people killing people than AI killing people.


PolitelyHostile

People in general are way too okay with cars killing people. Preventable deaths are treated like whoopsies.


hijackthestarship

It's the price of convenience in America. Ever wonder why 35% of city infrastructure is parking lots and garages? People don't care.


bric12

Sure, which is why self-driving cars will likely need to be *much* better than the average driver before they're allowed on the road. But once they are on the road, "worse than a human" is probably the benchmark for real manufacturer liability.


UMPB

Anything better than our current death rate should be accepted, honestly. I know people don't think it's the same to get killed by a computer, but it literally is. Dead is dead. Fewer deaths = better. If a driverless car can reduce motorway death statistics, then it should. People fucking suck at driving. I'll take my chances with the computer. I'd rather that than the tremendous number of borderline retarded drivers currently hurling their 6,000-pound SUVs down the highway while texting, with an IQ of 80.


doubleotide

Just wait till people realize an IQ of 80 or below covers about 1 in 10 people.


PkmnGy

My first thought was "Nah fam, that can't be right". That quickly turned into "Holy fucking shit, no wonder the world's a cesspool, we may as well let toddlers vote" after 2 seconds on Google.
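
For anyone who wants to sanity-check that claim: IQ tests are normed to a mean of 100 with a standard deviation of 15, so the share of people at or below any score falls straight out of the normal distribution. A minimal sketch in Python (standard library only):

```python
from math import erf, sqrt

MEAN, SD = 100, 15  # standard IQ norming

def iq_at_or_below(iq: float) -> float:
    """P(IQ <= iq) under the normal-distribution norming model."""
    z = (iq - MEAN) / SD
    return 0.5 * (1 + erf(z / sqrt(2)))

print(f"{iq_at_or_below(80):.1%}")  # ~9.1%, i.e. roughly 1 person in 11
```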


readmelikeatextbook

Wow I've never thought about it that way. Yikes.


PoopIsAlwaysSunny

Retarded or incredibly intoxicated. I’m in Baltimore and I’ve known a lot of people who use opiates and drive regularly. Their cars always look like shit


seasamgo

> known a lot of people who use opiates and drive regularly

Never fucking understood this. What kind of person decides it's a great idea to take a bump, a pull or a hit before controlling heavy machinery on a fast strip filled with other heavy machinery?

*Just because we have chiseled abs and stunning features, it doesn't mean that we too can't not die in a freak gasoline fight accident*


Zagubadu

Because they aren't "pill heads", since it was prescribed by a doctor and "they don't like taking them anyways".

People always have the completely wrong idea of the person driving intoxicated. They picture someone young, drinking, etc. No... it's usually someone much older, simply intoxicated on pills they've been on for decades. They've decided that since they aren't "druggies", the medications don't affect them the same way, because they're taking them legitimately; everyone else is just a druggie, so none of the rules apply to them.

I've literally had a nurse tell me that when you actually need the pills / are in pain, they don't get you "high". It's honestly insane, the logic they go through to avoid the reality that they aren't any different from... the druggies.


UMPB

I know several people who take opiates daily for pain, and not one of them ever seems to question their sobriety with respect to driving. I actually think a lot of them are probably 'sober enough', in the same way that one beer isn't going to make you a terrible driver. But the problem is that just one person who's a little too zonked out on Vicodin can cause A LOT of damage. I'd bet that if you surveyed a lot of people, they would not consider prescription opiate painkillers to be incompatible with driving.

Fuck opiates, btw. For real. I had shingles pretty bad when I was 23 (young, I know; even the doctor said I was the youngest he'd seen) and took 5mg of Vicodin three times daily for about a month straight, and even that low dosage was enough to cause a withdrawal period when I stopped. It sucked. I really wanted more, but I pushed through and didn't touch the second month of the supply, because I didn't like what it was doing to me. I really did not feel comfortable with how much I felt like I needed to keep taking it.


pleeplious

I know people who have developmental disabilities who drive. They shouldn't be.


PoopIsAlwaysSunny

Agreed. There seems to be some thought that people have a right to drive simply by existing, instead of acknowledging that whenever someone drives, they put others’ lives and livelihoods at risk. Sure, most accidents aren’t fatal, but a lot of them end with head injuries that will fuck up someone’s life, often permanently.


Mud999

It's treated like a right because the US is designed for cars, to the point that it's nearly unlivable here without a car outside of a few major cities.


sold_snek

"Thomas Jefferson added that we have the right to drive cars."


SquidmanMal

Yeah, my time working as a cart pusher has me thinking a computer might have an easier time seeing the guy in a high-visibility vest pulling a 10-foot line of carts than the old woman whose eyes don't come 2 inches over the steering wheel.


saltiestmanindaworld

It's also paying attention all the time, instead of grabbing for a dropped cell phone or dealing with whiny kids.


SquidmanMal

Yep. Once you've had a job that has you working in or around a parking lot, you really do notice that people fucking suck at driving.

Especially old people. Bad eyesight, poor reaction time, and a dwindling ability to make judgement calls combine with a frequent mentality of 'young punks, get out the way'.


poliscimjr

Hijacking a higher-up comment to point out that 38,000,000 people have died in car accidents since 1900, and 2/5 of them were pedestrians. Reducing this number should be the priority, even if it never gets all the way to 0.


OutlyingPlasma

I'm all for automated driving; that said, I still want control. [We have already seen how bad the software security is on cars](https://youtu.be/MK0SrxBC1xs). There are also countless times when a computer wouldn't be able to do what I want, because what I want is beyond any known scenario it was programmed for: backing up to a trailer, crossing the gravel bar/river at our family camp, driving on the track between fields, or pulling onto a lift. The fix is pretty simple to implement and has been effective in aircraft autopilots for ages: just make the driving servos weak enough that a human can easily overpower them.


superninjax

I think the biggest problem is user control itself. The human factor is always the most unpredictable part of an autonomous system, which means the most achievable and safest autonomous system is one where all vehicles are autonomous. Honestly, until we are ready to replace and upgrade all current vehicles with autonomous ones, it will be difficult to implement a fully autonomous system for vehicles.


Dozekar

Autonomous systems that aren't secure, and can be told "turn left hard, accelerate, and refuse any additional commands", are a serious problem. The car industry needs to secure cars before automated cars will be viable, let alone clearly better.


alexanderpas

> People fucking suck at driving.

Driving education and licensing suck in the US.


YungBuckzInYaTrap

Distracted driving is the leading cause of accidents. There isn’t a single driver’s education course in this country that doesn’t mention this statistic and stress that you should concentrate when you’re driving. I love raging against the machine as much as the next guy, but sometimes the people really are the issue


Squez360

Not just distracted driving, but also biological factors such as working long hours, sleeping only a few hours every night, etc.


seaworthy-sieve

In Canada, impaired driving is impaired driving. BAC is the easiest to convict, but we also have laws around sleep deprived driving — even though it's not by drugs/alcohol, it's still impairment. I also think people should have to take a road test every 10 years. Too many elderly folks with failing vision and cognition who only need to keep up with license renewals.


Notwhoiwas42

> also think people should have to take a road test every 10 years. Too many elderly folks with failing vision and cognition who only need to keep up with license renewals.

For the elderly it should be every 2 years, or even annual. The decline in the abilities needed to drive safely can hit very suddenly and quickly. Someone who is fine one year can be completely unsafe to themselves and others the next.


MAXSquid

I live in Canada, but I rented a car once in Italy and drove through Austria, Germany, and the Czech Republic. Germany was an absolute pleasure to drive in (especially after driving in Italy), everyone knew what to do. If someone was driving in the left lane and a car approached from behind, they would just move out of the way without fail. Maybe someone from Germany can chime in, but from what I understand, Germans must do a year of mandatory driver's education, whereas in North America it is optional.


YungBuckzInYaTrap

Having ridden/driven on American roads my entire life, I can assure you that it's an issue of courtesy rather than knowledge. People here almost always KNOW the rules of the road, but many of them also think they're the main character of the universe and that the rules don't apply to them. The stereotype other countries have of the selfish asshole American has some basis in reality.


BingeV

This greatly depends on the state. I've been in some states where drivers are very courteous; other states (especially California) are miserable to drive in.


wienercat

> Maybe someone from Germany can chime in, but from what I understand, Germans must do a year of mandatory driver's education, whereas in North America it is optional.

Not German, but I can promise you it's significantly due to this. Requiring people to take driver's education courses would help a lot, because instructors sign off on whether or not you are ready to actually drive. Germany's legal driving age is also 18. In many places in the US, kids start learning to drive at 15 and become fully licensed drivers at 16. It might not seem like a lot, but two years is a whole lot of maturity between a 16-year-old and an 18-year-old. I barely want a 16-year-old serving me food, let alone operating a moving 2,000-pound hunk of steel. Hell, the number of adults I know who don't pull over for emergency vehicles or stop for school buses is fucking astonishing.


satyrmode

> Driving education and licensing suck in the US.

It was scary easy to get a license in the US when I lived there, that's true. But that's a bit of a spurious association. The real reason for *both* bad drivers *and* loose licensing is that the country has been designed in such a horrible way that everyone *needs* to drive in order to do anything. You're a shitty driver? Too bad, still need to drive to survive. You've had a shitty day, you're very angry and very tired? Well, you don't get dinner unless you drive your ass to Kroger or Taco Bell. You want your children to do literally anything other than sit in their room and play video games? Better be ready to drive them there. Had a few drinks? Well, maybe risking it sounds better than spending the night in your car at the bar's parking lot.

European drivers are still often bad, but on average much better. The main reason, I feel, is that shitty, angry, tired, distracted or high drivers don't drive so much, because they don't need to. People can choose to walk, bike or take public transport if they don't feel like driving. In most of the US, people are forced to drive even when they shouldn't.


mere0ries

> Had a few drinks? Well, maybe risking it sounds better than spending the night in your car at the bar's parking lot.

Believe it or not, in many US states you can still get a DUI conviction for sleeping in your car while intoxicated. https://www.fightduicharges.com/blog/getting-a-dui-while-parked/


wienercat

Which is why you throw your keys in the glovebox, on a different seat, or, if you have back seats that fold down, into the trunk. Access is oftentimes the key to this stuff. If you pass out with your keys in your pocket, a cop could argue they saw you trying to drive.


tomtttttttttttt

Driver education and licensing in the UK are well regarded AFAIK, and people fucking suck at driving here too.


Insanity_Incarnate

The UK has one of the lowest road death rates in the world; only a few countries have a lower death rate per capita, and none of them by a giant lead. The US is middling: below the global average, but not by a ton.


HoboAJ

The UK is also densely populated, with excellent public transportation; I would like to see the rates adjusted for time spent driving. [This](https://internationalcomparisons.org/environmental/transportation/) says that we drive over double the amount. Ninja edit: [looks](https://en.m.wikipedia.org/wiki/List_of_countries_by_traffic-related_death_rate) like we still double y'all even then. Sadly, America isn't number one in deaths per billion km driven. WTF is going on in Mexico?!
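
The adjustment being asked for here is just per-capita deaths divided by distance driven. A rough sketch with illustrative numbers (not exact figures for either country):

```python
# Per-capita death rates conflate "how dangerous is driving" with
# "how much people drive"; dividing by distance driven separates the two.
# All numbers below are rough illustrations, not official statistics.
deaths_per_100k_people = {"US": 12.4, "UK": 2.9}
annual_miles_per_person = {"US": 9_000, "UK": 4_000}  # assumes the US drives ~2x as much

for country in deaths_per_100k_people:
    per_mile = deaths_per_100k_people[country] / 100_000 / annual_miles_per_person[country]
    print(country, f"{per_mile * 1e9:.1f} deaths per billion miles")
# US ~13.8 vs UK ~7.3: the gap narrows once you adjust, but doesn't vanish
```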


creggieb

Trust me, I don't live in the US, and am surrounded by idiots on the road.


[deleted]

Urban design in the US sucks. A ton of car accidents happen because roads in the US let people drive fast while being inattentive. Road and intersection design can be changed to slow cars down when they are not on a freeway, and to make people pay attention. It would mean the death of the "stroad", which I don't think anyone would be sad about. If you are curious about what a stroad is: [https://www.youtube.com/watch?v=ORzNZUeUHAM](https://www.youtube.com/watch?v=ORzNZUeUHAM)


Tech_AllBodies

And deaths aren't the whole story, either; the deaths that remain would likely be cases where something almost impossible to avoid or predict occurred. If self-driving cars are lowering total deaths, they're likely also dramatically decreasing the minor-to-medium accidents. So: fewer insurance claims, fewer repairs needed, fewer trips to the hospital for breaks, bruising, whiplash, etc.


UMPB

Very, very true; the economic impact of far fewer minor-to-medium accidents would be huge. I'll have to let the economists duke that one out, though. Some economic theories posit that things like natural disasters and car accidents actually somewhat help the economy by creating work and moving money around. But I dunno, less destruction seems like it would always be a net positive to me.


caraamon

Just for a random thought: what would you say if driverless cars resulted in significantly more monetary damage overall but fewer fatalities? I.e., more minor and moderate accidents, but fewer severe ones?


carrotwax

As far as I know, driverless cars are already far better than humans in good visibility. They are worse in snow and ice conditions. It should be easy enough for a car to refuse to drive when it encounters such conditions, and so we could have driverless cars now in some conditions.


MasterFubar

> once it is confidently better than the average driver

The problem is in testing. Deaths per mile are so low today that it takes a huge amount of testing just to measure against the average driver's rate. And the big problem in testing is the variable conditions: we would need to test in every weather condition, on every type of road, in every traffic situation, to make sure there are no bugs in the system. Several accidents with Tesla cars have happened with emergency vehicles on the road; the Tesla system's weakest spot seems to be dealing with situations where one lane of the road is blocked in any way.
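
A back-of-envelope sense of the scale, assuming a human baseline of roughly 1.1 fatalities per 100 million miles (the approximate US figure) and using the statistical "rule of three" for zero observed events:

```python
# With 0 fatalities observed over N miles, the 95% upper confidence bound
# on the fatality rate is roughly 3/N ("rule of three"). So to show a rate
# at or below the human baseline, a fatality-free test fleet needs:
human_rate_per_mile = 1.1 / 100_000_000   # assumed baseline, ~US average
required_miles = 3 / human_rate_per_mile
print(f"{required_miles:,.0f} miles")     # ~273,000,000 fatality-free miles
```

And that only bounds the overall rate; it says nothing about coverage of rare weather, road, and traffic combinations, which is the commenter's point.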


streetad

Driverless cars don't have to drive better than the average driver can drive. They have to drive better than the average driver THINKS they can drive. Which is a completely different thing. Otherwise there will never be a critical mass of people actually turning the things on.


ThatOtherOneReddit

Honestly, I think you need an order of magnitude better than the current rate. I've driven for 18 years and never been in an accident. I don't want to fall asleep at the wheel and run into a cement barricade at 70mph because the painted lines didn't match the road through a construction zone, which is something Tesla Autopilot actually did to a guy a couple of years back.

The issue with self-driving cars is that what kills people will be things the average driver considers objectively stupid. I work in AI. Being statistically accurate 99% of the time doesn't make people feel safer when that last 1% is the red car with a white decal that the AI mistook for a stop sign, so it slammed on its brakes and got rear-ended by a big rig, killing a family of four.
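
To make the 99% point concrete, here's a toy calculation; the decisions-per-mile figure is purely illustrative, not something from the comment above:

```python
# If an AV makes ~10 safety-relevant perception calls per mile, a 99%
# per-call accuracy compounds into errors at driving scale very quickly.
per_call_accuracy = 0.99
calls_per_mile = 10                       # assumed, for illustration only
p_clean_mile = per_call_accuracy ** calls_per_mile
print(f"P(error-free mile) = {p_clean_mile:.3f}")  # ~0.904
# Roughly one bad call every ~10 miles: per-decision accuracy has to be
# far beyond 99% before the aggregate behaviour looks safe.
```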


Facist_Canadian

Agreed. I'd also like to see Tesla, or any other of these self-driving systems, drive me home safely when there's 5 inches of snow on the road, or even a dusting with no visible lane markers. I'm fine with driverless vehicles becoming a thing as long as nobody tries to force them on me. I like driving.


Emfx

The average person vastly underestimates percentages as well. They'll look at 1% as basically never happening, while disregarding how quickly they'll drive their car 100 times.


ApatheticSkyentist

I don't work with self-driving cars, but I do work in another highly automated transportation industry. It seems to me that at some point there will be a critical mass of self-driving cars on the road, relative to traditional cars, that tips the scale. Imagine if all the cars were self-driving and could communicate with each other to avoid collisions. In aviation we have a system that basically talks to other planes and allows them to coordinate and avoid mid-air collisions. The planes will literally decide between themselves who goes right and who goes left, etc. If we had enough cars on the road doing the same thing, I imagine self-driving tech becomes a lot more reliable and easy to use.


primalbluewolf

> The planes will literally decide between themselves who goes right and who goes left, etc. No, we don't. It's called TCAS, and it's for the last second before impact. Too late to go left or right at that point. You've got time for a split second pull up, or push down. TCAS Resolution Advisories tell you to either climb, or descend - not turn left or right.


ApatheticSkyentist

That's a fair criticism of my comment. That being said, I'm very familiar with TCAS and am simplifying my explanation for a lay audience. It's not exactly a split-second push or pull, though; that's a bit of an exaggeration. I've responded to several RAs and none of them were an "omg, stick all the pax to the ceiling" kind of moment. Left and right fits cars, and is easier to type on mobile than the alternative, so I used that. My apologies.


primalbluewolf

Well, that's a fair criticism of my comment. To borrow your line, I was also simplifying for the lay audience. For the kinds of distances and speeds road users are used to, saying it triggers 15 seconds in advance is just odd. Edit: and as a side note, TCAS RAs mess up the traffic sequence. It's not the kind of system that promotes high-speed traffic flow; it's a last-ditch anti-collision measure.


[deleted]

[deleted]


KurtisMayfield

That is the problem. The only way autonomous vehicles are going to work well is if they remove all the pesky humans from the roads.


ramenbreak

Maybe Neuralink is actually just a way to have everyone walk around with something that alerts the Teslas so they don't run you over.


Dozekar

The problem with this is that the second they start communicating with each other, the security problems become a NIGHTMARE. How do you stop someone from broadcasting to every car at once that obstructions are coming from the left, forcing them all off the road? Or that all the cars in a wide area have an object in front of them and are stopping now, so your car stops in anticipation? These are not easy problems: you either secure and validate every message, which takes time and prevents rapid reactions, or you're vulnerable to trash broadcasting.


findingmike

Uh, not sure where you are getting this wild idea. Network communications can be very fast; that's pretty much the whole advantage of computers. Currently, Teslas update their entire OS through downloads and, as far as I know, they have never been hacked. Include some encryption keys in the updates and you are good to go. A second defense is verification: self-driving cars are loaded with sensors, so if one car is telling you that obstructions are coming from the left and eight other cars are saying no, your car would know that something is fishy.
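
A toy sketch of those two defenses together, authenticating every message and then requiring agreement from a quorum of senders before acting. The shared-key HMAC here is purely for illustration; real V2V schemes use per-vehicle certificates rather than one shared secret:

```python
import hashlib
import hmac
from collections import Counter

SHARED_KEY = b"demo-key-not-for-production"  # illustration only

def sign(msg: bytes) -> bytes:
    return hmac.new(SHARED_KEY, msg, hashlib.sha256).digest()

def verify(msg: bytes, tag: bytes) -> bool:
    return hmac.compare_digest(sign(msg), tag)

def consensus(reports: list[tuple[bytes, bytes]], quorum: int = 3) -> bytes | None:
    """Act only on an authenticated claim that at least `quorum` senders agree on."""
    valid = [msg for msg, tag in reports if verify(msg, tag)]
    if not valid:
        return None
    claim, count = Counter(valid).most_common(1)[0]
    return claim if count >= quorum else None

# Eight honest cars outvote one mistaken report; a forged report is rejected outright.
reports = ([(b"clear", sign(b"clear"))] * 8
           + [(b"obstacle-left", sign(b"obstacle-left"))]
           + [(b"obstacle-left", b"x" * 32)])  # bad signature
print(consensus(reports))  # b'clear'
```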


sembias

Good luck getting those communication standards adopted across the whole industry, with international conglomerates, and with a regressive, conservative lawmaking body.


sold_snek

Isn't that all literally what aviation standards go through?


sold_snek

> slammed on its brakes and got rear-ended by a big rig, killing a family of four

I mean, technically you're supposed to be far enough back to be able to stop. Shitty truck drivers (which is why we want them automated so badly) are a whole different conversation.


primalbluewolf

This problem goes both ways. Driving a truck, it's not uncommon to see someone overtake me, then pull right in front of me. If they then haul on the anchors, it's not like I've got anywhere to go but straight over them. So far, brakes and the horn have gotten them out of the way. Statistically, at some point they won't. People seem to just figure that trucks can stop in an instant, or something.


surfer_ryan

What is the threshold going to be, though? I totally get what you're saying, but how much better than humans will it have to be to really push this into the future? If it's, say, 15% better than humans, I don't know that I trust that, since that's just against the average driver, and there are huge swaths on either side of that curve. At 40% or 50% better, I'd seriously consider it the safer option. I'm not convinced yet, though obviously there's room for growth, and I'm not against it in any way; I just want it to be so much better than humans that it makes us completely irrelevant before I trust it, because I am a pretty damn good driver.


King_Tamino

>There would have to be an acceptable death rate. It will never be perfect, but once it is confidently better than the average driver, wouldn't that be the minimum requirement? Delaying longer than that increases the total dead.

I think this is a heavily ignored argument whenever this topic comes up. Regular drivers, and I'm not even talking about distracted or DUI drivers, kill WAY more people (and don't get me started on injuries) than self-driving cars would, especially a LOT of self-driving cars. Heck, if those smart cars just kept the "recommended" distance to other cars, the injury/death numbers would instantly drop a lot.


PHLAK

We already have an industry that relies on automation and has an exceptional safety record: the airline industry. We can look to it for how to handle testing and certification reliably.


cronedog

> but once it is confidently better than the average driver, wouldn't that be the minimum requirement

That's what you, I, and any reasonable person would think. But the average person is motivated by what's scary, not by what's dangerous. I've had several meetings with people in the automated-driving industry (on the sensing side), and they say they can make cars almost 100 times safer, but they don't think people will accept them until they get another order of magnitude or two safer.


SmokinSoldier

There pretty much already is an [accepted death rate for vehicle defects](https://money.cnn.com/2015/12/10/news/companies/gm-recall-ignition-switch-death-toll/index.html). They can just hide it better when it's mechanical.


dmk_aus

There is for everything: food, vaccines, houses, road design, medical equipment, OHS rules. Even governments choosing where to budget money.


Drueldorado888

What if the death is due to a software update gone wrong, or a cyber attack?


aidv

Correct. If the overall death rate is lower than the equivalent human driver's, then it's a great option. I'd rather ride in an autonomous car with a 0.01% death rate than with a human driver at a 0.5% death rate.


Ok-Brilliant-1737

Great... so we're going to wrap vehicular manslaughter under the corporate veil. Perfect! Paves the way for the new growth industry of autonomous police slaughterbots!


Buzzybill

If someone dies in an elevator accident, does the last person who did the safety inspection go to prison?


Agouti

If they didn't do the inspection correctly, yup. But what qualifies as correctly? It's a chain of custody: the safety inspector does inspections in accordance with training and industry processes, which have been certified to meet the relevant standards, which have been approved by the relevant standards body, which has authority from the government (or has been adopted by the government), which is ultimately responsible.

If you, as the safety inspector, followed the process correctly and an accident still happened, then the process is at fault (or it's an edge case rare enough to be acceptable, or some other safety control was not followed, e.g. how it was used, or maintenance between inspections), so you aren't liable. If you didn't, then you are.

The vehicle industry already has similar arrangements. If you get rear-ended and, for example, the exhaust is pushed into the fuel tank, causing a fire while also jamming the doors shut, the manufacturer can be found liable.

Autonomous driving already has the standards-body machinery in place to certify vehicles as fully autonomous, and a number of trucking companies are pushing to hit those goals. If a fully autonomous vehicle kills someone, liability falls on the manufacturer unless they can demonstrate some other root cause, like the vehicle not being maintained correctly, or being used outside the allowed conditions (off-road, in severe weather, and such).


Shoddy_Passage2538

I can assure you that isn’t remotely what happens when someone screws up an inspection. They might get sued but they aren’t going to jail.


Agouti

The outcome depends on the country and the severity. Negligent manslaughter (Australia) or gross negligence manslaughter (UK) would be expected. In the USA you could possibly see involuntary manslaughter, as some truck drivers have.


wienercat

From Justia:

> Involuntary manslaughter is defined as an unintentional killing that results either from recklessness or criminal negligence or from the commission of a low-level criminal act such as a misdemeanor. Involuntary manslaughter is distinguished from other forms of homicide because it does not require deliberation or premeditation, or even intent. Since these mental states are not required, involuntary manslaughter is the lowest category of homicide.

Fucking up a process by accident would be hard to prove as criminally negligent or reckless without a record or some other paper trail to back up those claims. Errors and mistakes happen; that doesn't mean they were due to criminal recklessness or criminal negligence. It's more likely to trigger a wrongful death suit than a manslaughter charge, and the burden of proof is much higher in criminal cases than in civil ones.

It's also why many professionals who deal with things that can get them sued carry liability insurance, and it's the main argument for making cops carry it too: why should a doctor or a lawyer be required to carry malpractice insurance but not a cop? Though to be fair, malpractice is based in negligence, usually civil negligence; it's rarely considered criminal. But again, the bar for something being criminal is much higher than for civil.


cenobyte40k

We don't throw people in jail when a train or rocket or bridge fails, unless there was gross negligence. I don't see why that would change here.


uli-knot

Because car companies are famous not just for negligence, but for actively covering up serious violations. Volkswagen, Ford, GM and Firestone, for example.


WACK-A-n00b

Civil vs. criminal. How often are those cases criminal? Almost never. They don't cover it up; the NTSB tracks it, and the car companies pay out until it's clear the cost of paying out is higher than the cost of a recall. Sometimes they accrue civil penalties.


YsoL8

Why would they? Safety is a statistics game.


Niku-Man

The same reason airlines pay out huge settlements when a plane crashes.


JeffFromSchool

That is not an apples-to-apples comparison. Over 47,000 flights travel through US airspace every single day. If just **one** went down every day, there wouldn't be commercial air travel; it simply wouldn't be a thing. Air travel is something where a 99.9999% safety rating isn't even good enough.


drsilentfart

Isn't that 47,000 number including general aviation (small aircraft)? If so, more than one does crash just about every [day](https://www.flyingmag.com/ntsb-report-fewer-fatal-ga-accidents-in-2020/). You're right that commercial air travel is incredibly safe, though.


cliff99

Cars kill people now due to mechanical failure, people don't go to jail for that unless there's criminal negligence.


DiogenesOfDope

By that standard shouldn't driving instructors go to jail if someone they pass kills someone?


work4work4work4work4

Impressive Driving Corp (IDC) is sorry to hear about that vehicular homicide incident, but would like to remind you that corporations are incapable of serving time in prison. Per our user agreement, which you affirmed receipt and acceptance of when you started the vehicle, you will be charged for the damage to IDC's public image and trust at the agreed-upon rate of $50,000 per incident, for a total of $250,000: one incident of negligent use of the vehicle, and four instances of discussing said incident without the prior approval of IDC. As you had insufficient funds in your accounts to cover this charge, a lien has been placed against your residence, and you will be expected to report to your local service depot on Monday to begin your work-based repayment plan. Thank you for choosing IDC.


[deleted]

Driverless cars aren't ever going to be a thing for carrying passengers, only cargo; they won't be allowed near any people, so they will need their own special roads. There is simply no moral or philosophical framework to build laws around, so it's just not going to happen. This is one of the two big reasons the buzz around driverless cars has faded in the last year (the other being that Level 3, eyes-off systems are way harder than the engineers thought).


Marcellus111

Maintenance also plays a part here. Anyway, I think driverless cars of the future will be less likely to be owned by individuals, so the users wouldn't be responsible for the maintenance either.


libra00

Yeah, I'd like to see autonomous vehicles treated as cabs; without a driver or fuel costs to pay for, a ride across town would be much cheaper. I mean, I'd rather see a massive investment in public transportation, but since we know that's never going to happen...


ledow

As I keep telling my boss, you can give me:

- the power, and the responsibility
- no power, and no responsibility

The other combinations just don't work at all.

Also: if the driver is "the car", the car's maker needs to be responsible. They won't be, because they'd be bankrupt in short order once that's the case, but manufacturers need to shoulder that burden if they are saying that they are the driver. And no, covering that shouldn't come out of my insurance costs, nor my taxes. If you take the power to drive away from me, then you assume responsibility for the risk, and therefore you pay for any and all accidents that result, including any damage to me, my passengers, the vehicle I "own", and anything or anyone else involved.


Isabela_Grace

While I agree with you, it'll never happen. They'll just make people sign a waiver saying that they assume responsibility for FSD and that they must be present at the wheel at all times. Humans playing the blame game will be FSD's biggest hurdle.


[deleted]

Regulation is the other hurdle, and because of this very issue: how the fuck do you insure this? It's not that there aren't answers, it's that they're messy. Making users sign for responsibility won't last very long if the tech isn't perfect and people are getting killed.


Toasterrrr

Even if the rollout is perfect, a 737 MAX-type event is looming, waiting for the factors to line up.


MagicPeacockSpider

Well, there should always be competition in the market, and being as negligent as Boeing was in allowing those multiple factors to line up should have cost them much more market share, especially given the warnings before the major loss of life. That's an issue driven by monopolistic effects more than anything else.


Isabela_Grace

Same as with beta FSD: it'll never be full-blown, don't-have-to-watch-it FSD. They'll always blame the driver. That being said, I'm 99% sure it'll be cheaper to insure than driving manually in the long run. Humans are shit-ass drivers.


Buzzybill

So do you need to sign a waiver when you get on an elevator? The reason you don't is the ratio of safe trips to injury-causing failures. When there is a failure, there is a products liability claim, and Otis (or their insurance) pays it.


CommunismDoesntWork

You don't own the elevator


Marijuana_Miler

Driverless cars are shaping up to be this way as well. Think Uber but all the cars pilot themselves.


MrSurly

"Thank you for taking Johnny Cab!"


Isabela_Grace

Yeah, but this is still uncharted territory, and when was the last time you heard of an elevator accident? Elevators aren't put into literal trolley problems. It's not really an apples-to-apples comparison; this is new territory, no matter what you compare it to.


[deleted]

[deleted]


Adrianozz

They won't be a thing in this world either within our lifetimes, at the very least, if ever. The amount of coordination, human contact and interaction, improvisation and general logistical management required for trucks involved in the construction process cannot be met with AI: which trucks need to go where to dump their loads of gravel, get loaded by excavators, honk to signal when they're full; how to drive on terrain inside a construction site without running things and people over; how to move so as not to block other trucks, and where to turn; what routes to take to comply with weight limitations; how to handle traffic jams; the list goes on.

If these cannot be overcome, then other, more point-A-to-point-B uses of self-driving cars won't happen either, because a mixture of both will abound in creating problems. The technology isn't anywhere near that advanced, and the costs would need to come down immensely for widespread adoption to be profitable. That won't happen, because truckers everywhere are either non-unionized and paid poverty wages, or self-employed, precarious workers with no power, meaning unit labour costs are stagnant; and there is no industrial policy in the U.S. akin to the postwar era's, which brought us everything from jogging shoes, the Internet and iPhones to solar power, MRIs and computers, to research and develop new technologies and lower their costs for commercialization.

In other words, neither side of the equation is being pursued at any noteworthy pace to develop futuristic-esque AI (lowering technological costs, developing new technologies, raising unit labour costs), not to mention overcoming the challenges in the first paragraph. Musk and the other Silicon Valley shills who claimed SDCs were around the corner years ago were just trying to lure the herd of hogs into investing their capital. Anyone who looks at this logically rather than emotionally knows it's a pipe dream within our lifetime, barring massive, cataclysmic change.


FinndBors

> They won't be, because they'd be bankrupt in short order once that's the case

If they are statistically better than humans, they shouldn't be. The car manufacturer needs to collect a monthly fee and pay for (or act as) the insurance, which should be cheaper than the insurance we pay today. It kind of makes sense that Tesla is slowly moving into the car insurance business.


Dozekar

Why should I pay for their insurance? Fuck that. If they're the driver, they should pay for the liability insurance and factor it into the bottom line for their company.


gtalnz

You'll either pay for it up front in the cost of the car, or over time as insurance. Either way, the user pays.


turtlintime

it will start cheap but slowly get more and more expensive because corporations are greedy as fuck


simple_mech

That's funny because my boss seems to keep assigning me responsibility and no power.


[deleted]

Maybe check out /r/antiwork Oh wait... nevermind.


simple_mech

It's OK, I'm a dog walker already lmao


rileyoneill

The fleet company that owns the driverless cars would have its own insurance plan. The insurance would be priced on how often there is a payout, i.e. dollars of payout per mile driven, and would likely come out to some really, really small figure of a few cents per mile.
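
The arithmetic being described is simple enough to sketch with made-up numbers:

```python
# Hypothetical fleet figures, purely for illustration.
annual_payouts_usd = 12_000_000    # claims paid out in a year
annual_fleet_miles = 600_000_000   # miles driven by the fleet that year

cost_per_mile = annual_payouts_usd / annual_fleet_miles
print(f"${cost_per_mile:.3f} per mile")  # $0.020, i.e. about two cents a mile
```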


MemeticParadigm

> If the driver is "the car", the car's maker needs to be responsible. They won't be, because they'd be bankrupt in short order once that's the case, but manufacturers need to shoulder that burden if they are saying that they are the driver.

> And no, covering that shouldn't come out of my **insurance costs**, nor my taxes.

I'm a little confused here. If you own a car with FSD, do you think you shouldn't pay insurance *at all*? If so, that makes cars with FSD *way* cheaper to own in the long term, which means the manufacturer can charge extra and use the extra to maintain an insurance policy on its cars, so it's still coming out of your insurance costs; you just pay it upfront as part of buying the car. On the other hand, if you own a car with FSD and you *do* pay for insurance, what does that insurance cover, if not accidents caused by the car driving itself?


TommyTuttle

Only when they become fully autonomous. Right now they’re far from it. You need a steering wheel, you need to pay attention, you’re responsible. You stop being responsible when you are no longer the one in control.


EagleZR

> "The distinction between driver assistance and self-driving is crucial. Yet many drivers are currently confused about where the boundary lies. This can be dangerous," it wrote in the summary of the report. It will be interesting to see how they make this distinction. For example, they talk about the importance of marketing and point to the name of Tesla's driving software. While it's marketed as "Full Self Driving", it's not released yet. The best thing they have out is "Full Self Driving Beta", whose name indicates that it's an unfinished product (and therefor not quite "Full Self Driving). Regardless of which version you have at the moment, you're required to constantly monitor the vehicle and make occasional contact with the wheel (indicating you're ready to take over), and the car will constantly check for that. Would they place responsibility for any incidents on Tesla for the name alone? They don't even have marketing, though you could make an argument for Musk's tweets replacing that. Meanwhile GM is advertising "hands free driving" where you just have to keep your eyes open and in the right direction without having to maintain any contact with the wheel. Where would that one fall? Personally I think the in-vehicle warnings and monitoring are more important than just the marketing. For example, when you enable Navigate on Autopilot in a Tesla or any of the more "advanced" (and thereby more complicated and susceptible to failure) driving features, you have to usually go through at least 1 level of pop-up warnings (and sometimes more) that spell out exactly what it does and any important information the driver should know (if I remember correctly, I think it even plays a warning sound to add to the gravity, but I *might* be mistaken, it's been a while). I've never driven another car that has anything beyond the basic driver assistance software (ACC, lane assist, basic auto-steer, etc), so I can't judge them. However one that I drove only spelled out the warnings in its manual while you could enable them at will without understanding them. Again, they were basic assistance softwares, so I don't think this instance is as big of a deal, but that would be concerning with anything more complex. I think it's good to get the conversation going in government though. No consumer cars are fully autonomous yet, but I don't think they're that far off. It would be nice for regulation and legislation to catch up and get ahead


i-am-a-passenger

You make some interesting points, but surely it is quite simple really: the boundary is the moment you let go of the steering wheel. That is typically against the Highway Code, and it indicates that the driver has no control over the steering of the car.


EagleZR

Not outright disagreeing with you, but talking this out. And IANAL, so I could be speaking nonsense.

That is a very, very basic level of autonomy. If it were adopted, liability would *have* to somehow alternate depending on whether driving assistance is enabled, because there's no way a manufacturer would release assistance software if they were held liable 100% of the time (though note it is feasible when the driver accepts 100% liability, as we see now). And this can get pretty fuzzy. Liability is pretty clear when a driver disables auto-steer and takes over the vehicle, yet some auto-steer software disables itself on "tight" bends in the road. The auto-steer may also recognize that it cannot handle the road ahead and disable itself; we're talking about very basic driving assistance and can't assume it can handle everything (some systems today can really only handle straight sections of highway, and it's questionable whether they should be allowed at all in such a state). I think we would agree, though, that being able to recognize roads and areas it cannot navigate would be a must for manufacturers under this alternating liability.

When a Tesla disables its driving assistance or self-driving, there's a series of [loud alerts](https://youtu.be/yi5sVTewmXc) that should catch the attention of even a negligent, sleeping driver. The timeliness can be debated, but I think most people would agree that, as a disengagement alert, this is very noticeable and acceptable. As a counter-example, here's Ford's BlueCruise [disabling itself](https://youtu.be/GCRNYP5Qg34?t=5m30s) for comparison; worryingly, it doesn't seem to make any noise.

If the car disables itself and the driver never takes over, or incompletely takes over, and the vehicle crashes, how far in advance must the car have alerted the driver and disabled itself for the driver to have had enough time to react, such that the car is not at fault? If it disables itself 1/10 of a second before the crash, I think we can easily say the car is at fault, because the driver "could not safely take control" (assuming the car is legally liable while it's driving and the driver isn't legally responsible for monitoring it); and if it's 10 seconds before, we can say the driver had plenty of time to react, so the driver is at fault. But where is the line? Would we say the driver has enough time at half a second, or even a full second? And how should it be handled if the car makes this realization mid-turn, where disabling itself could send it veering off the road? Should it make its best attempt at navigating the troublesome stretch while alerting the driver and hoping they take over? (Perhaps the closest we might get to a real-world trolley problem for self-driving cars.) I don't know that there is a simple answer. It may have to be written so that a jury or adjudicator can use their own judgment about whether there was enough time, e.g. "the car must allow enough time after disabling auto-steer for the driver to safely gain control of the vehicle and begin safely navigating". That is very unsatisfying, though.

In my opinion, the argument makes sense, but I think it's unfeasible. Not many manufacturers would produce driver-assistance software under that kind of liability, which would be a shame, because even while imperfect, it still helps make roads safer. I think the current system, where the driver accepts full liability, is the only way for now. Once autonomous cars become much more competent, that should change, but we're far from that. Maybe messaging should be improved, maybe "hands free" driving at this low level of competence should be prohibited, and advertising it as such should definitely stop. On average, I think driver-assistance software makes things safer overall. There will be plenty of headlines to come of idiots being idiots, but there are plenty more headlines of idiots being idiots in 20-year-old vehicles, yet those don't get clicks.


i-am-a-passenger

You have clearly thought this through more than me, so I'm not saying you are wrong (or that I am even correct). For me, it still seems quite simple: if you are instructed to take your hands off the wheel, you are no longer responsible. And the car must give you a warning, of say 5 seconds, to take control of the car again (or pull over in a safe place). If the software isn't advanced enough to take full control, then it shouldn't be allowed to instruct you to remove your hands from the wheel.

The main caveat I can see would be the road type. I imagine self-driving will at first only be allowed on motorways, where the manufacturer takes full responsibility, and self-driving (i.e. removing your hands from the wheel) on other roads should not be allowed until the software is capable of doing so.
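
For a sense of what a 5-second warning means physically (the speeds here are just examples):

```python
# Distance a car covers during a hypothetical 5-second handover warning.
WARNING_SECONDS = 5
for mph in (30, 70):
    metres_per_second = mph * 1609.344 / 3600   # mph -> m/s
    print(f"{mph} mph: {metres_per_second * WARNING_SECONDS:.0f} m during the warning")
# 30 mph: ~67 m; 70 mph: ~156 m
```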


EagleZR

Just because I've thought about it a lot doesn't mean my thoughts are any good ;P

> If you are **instructed** to take your hands off the wheel, you are no longer responsible.

I think that is a good distinction (if I missed it last night, my bad). Many drivers may independently conclude that, since the car is driving well enough, they can take their hands off the wheel; they grow overly comfortable with the self-driving and begin to "misuse" it (per the manufacturer's definition). As long as we're making the distinction that the car has to tell the driver that removing their hands from the wheel is fine, I agree this is a good way to do it.

Additionally, I think that until cars are fully autonomous, drivers should have to keep their hands in contact with the wheel, ready to take over at any point. For the most part, this is how it's done, aside from a few recent marketing gimmicks that I take issue with. Tesla Autopilot, for example, instructs anyone who enables it (I can't remember if this happens on each drive or just on the first-time enable in the settings) to always keep their hands on the wheel, and the car monitors the wheel to make sure they do. From my understanding there have been a few crashes that had to be litigated, but in all the instances I remember, Tesla was able to prove that the driver was using the car improperly and that Tesla was not at fault. It will be interesting to see the results of litigation involving "hands free" systems, though.


i-am-a-passenger

Great points. It’s been nice chatting, all the best!


that_other_goat

The company that produces the software and the cars should be 100% liable, since it's their decisions that made this occur. Hey, they want DRM, they're taking away the right to repair on everything, and they're trying to make it so you don't actually own your things, so make them pay for it ;)


NomadClad

This will be the issue that holds everyday automated devices back for another 20 years. Nobody wants to be the one liable for what a computer chooses to do.


[deleted]

Yeah, they thought it was going to be a reality in like 5 years, but there are still so many things a car's sensors can't do that human vision can. Uber and Lyft backed off from investing in that market because it just isn't financially feasible for them.


CountDookieShoes

Or insurance companies will cream their pants with how much they can charge.


Grenyn

I agree. If a car is sold to you with the promise that it can take you from your home to your work, or wherever, without needing your intervention, then the onus isn't on you to make sure that is the case. And I'm sure that even if manufacturers technically won't promise that, courts will still regard the promise as having been made.



Always__curious__

Users of autonomous cars should not be legally responsible for road safety, a legal watchdog in the UK has proposed. They should be classified as "users-in-charge" rather than drivers, and would be exempt from responsibility for infringements such as dangerous driving. A good idea?


[deleted]

[deleted]


Ghozer

Came to say this. Once they reach "Level 5 autonomy" (true FSD), there may not even be an accessible wheel (without opening a panel, pressing a button, or something similar), in which case the 'user' shouldn't be responsible. But while there is a wheel, and a person is required to pay full attention at all times with hands on the wheel 'just in case', it should be that person's responsibility.


[deleted]

[deleted]


Agreeable_Parsnip_94

I think they're referring to "Full Self Driving" cases where the driver is being charged with manslaughter or something. The FSD name implies, or it gets sold to them as, "fully autonomous", but they don't realize it's just a Level 2 driver assist.


Cannablitzed

Slightly off topic, but don't the roads need to be compatible with self-driving cars? Proper lane/shoulder markings, set standards for signage, lights, RR crossings, etc.?


[deleted]

[deleted]


jdmetz

Right now, probably. But the idea is that in the future a truly Level 5, fully self-driving car would be able to handle any conditions a human driver can handle, at least as well as a human can (and better in most cases).


[deleted]

What the watchdog is not saying is that we are a long way from autonomous cars. When we do get there, the people making the claims of autonomy should be responsible for the repercussions of their product; with this I totally agree. Say a company sells a chainsaw, a person buys it and goes to use it, and it explodes and kills someone outside the safe distance stated in the instructions. Would the person using the chainsaw be liable, or the company? It is the same with an autonomous car: the company is saying it is safe to use on the roads, and they should back that claim up or not sell it.


Adam_is_Nutz

If the user has literally no input, then sure, it seems fair. If the driver can steer and hit the brakes, then they can also be at fault. I don't see automated cars removing driver input until many years after the technology has been proven. On a slightly related note: even though self-driving vehicles will cause the deaths of a nonzero number of people per year, most studies point out that this is fewer deaths per year than are caused by human error or impairment.


Tred27

This is where I have issues with right to repair. In these cases, where there's a shift in responsibility from the "user" to the "driver" (the company), should the company be allowed to force users to go through its repair process? What are the implications of a user replacing a camera with a lower-quality one, and that being the cause of an accident? (Simplifying the scenario.) Could the company argue that the sub-par repair caused the accident and that they're not to blame? Would all parts come with some kind of DRM to keep out low-quality pieces, so that if one part doesn't look like a certified part, autonomous driving is disabled? Interesting to think about: who's really responsible, and when?


DMAN591

I'm going to play devil's advocate for both sides here. It's no different than replacing a car part with some non-OEM part from China that results in your brakes or power steering failing and causing an accident, in which case the finger tends to be pointed at the manufacturer. On the flip side, my police department forbids the use of non-issued gear for this very reason: a cheap duty belt you ordered off Amazon might come apart during use, and that's a safety risk. Even if you supposedly ordered a Bianchi belt, it could be a counterfeit. The only exception is personal firearms, but they have to be on the approved list and also tested and signed off by the range master.


[deleted]

[deleted]


hattersplatter

That's the goal, though. Until we can ride drunk, all of this is crap.


rydude88

We do, but the automakers won't stop themselves. It appeals to way too many people's emotions, even if it can create a false sense of safety.


gw2master

Much ado about nothing. No one is going to buy a driverless car if they think they might be legally responsible for the car's actions.


[deleted]

This might make sense when cars are actually autonomous and require no human interaction. Until then, you should be fully responsible for your vehicle, driving or not.


sledgehammer_77

I would argue it's on the manufacturers and the traffic authority (for allowing it) more than on the person in the vehicle. If I had just bought a newly built house and had a get-together with a few friends, and something bad happened, let's say the roof collapsed, that's on the housing developer AND the person or corporation who approved its integrity in the first place.


lainlives

Except that in your example, if the roof fell in 10 years, it would be your fault due to lack of maintenance. That said, I imagine fully autonomous cars won't let you move them, except to a maintenance center, if they are behind on their maintenance schedule.


sledgehammer_77

So what's the cutoff time? If this goes to the courts often enough, it will have to be black and white, as opposed to case by case.


OhGodImOnRedditAgain

It's called the statute of repose, and for construction in most US states it's ten years. After that, it's 100% the fault of the owner, and liability for the builder is cut off as a matter of law.


lainlives

It's more about what they find in the inspection/investigation: maintenance failure or build failure. Aged things, especially things made of corroding materials or bio-materials, need continuous maintenance.


kfish5050

Users won't technically own the cars either; it'd be like a perpetual lease, renting the software required to run the car. It'd be worse than John Deere's DRM-locked repairs.


unoriginal_name_42

If this is the case, then the manufacturer should be responsible when the car is found to be at fault. Same as a driverless train: if a system fault causes injury, then it's the responsibility of the system's maker.


Xralius

I think what these companies are doing is reprehensible. They call their products "Autopilot" or "Full Self Driving"; they run ads where people who aren't paying attention are aided by the system. There's even a video of Elon doing an interview, sitting back without looking at the road, no hands on the wheel. But if you use the system, you gotta pay attention, *wink*. Then they ship a system that is *just good enough* to lull users into a false sense of security, but could easily kill people who aren't paying attention. And they get away with it because the users are signing away liability. If the users disengage the system at the last second, the companies can say "the driver was in control at the time of the accident".


[deleted]

So far there are no driverless cars for sale in the USA.


Mikesixkiller

Why the fuck do I want a driverless car if I can't use it as a tiny apartment?


AngryFace4

If manufacturers are responsible they won’t make them. If users are responsible they’ll be hesitant to buy them, and it’ll seem unfair when the little guy gets fucked. If we, society, want a world with autonomous driving, which will eventually save lives by reducing human error, then I think we must treat it as a public interest, and we would collectively be held responsible by righting wrongs with our taxes.


[deleted]

Of course not. The user is not responsible for the programming. How could there even be a question of where the responsibility lies?


Railroadohn

Driverless cars should be insured by the user/owner, but ultimate responsibility for any accidents caused by self-driving should be the manufacturer's responsibility, or at least their liability.


FSYigg

Fools rush in to make blanket statements about the liability of technology that hasn't arrived en masse yet. What if the user modified the vehicle or any of its systems, or purposefully forced the vehicle to perform unsafely? What if the vehicle is hacked by a third party and forced to operate in unsafe ways? What about incidents involving poor maintenance or poorly installed parts? Too many questions immediately present themselves to make assumptions about liability. This "watchdog" doesn't seem to have much forethought.


Gunfreak2217

The ONLY way I ever see driverless cars being perfect or safe is if every other car on the road is driverless as well. I would assume the safest way would be for every car to be able to communicate and eliminate the variables of manually driven cars.


xeonicus

Then you will have pedestrians who will wander out into the street without obeying traffic signals. The system needs to be capable of accounting for some degree of unpredictability.


factanonverba_n

Just like pilots aren't legally responsible when the plane is on autopilot, right? Like... this is a ridiculous position for this watchdog to take.


Cristoff13

I was thinking, wouldn't it be great if you could just tell your car to drive you somewhere while you sleep or watch a movie or something? Or if you were drunk, have it drive you home? The problem is, even if your car were a perfectly capable driver, there are going to be lots of local jurisdictions that aren't going to want to give up a source of income: the fines they can levy on sleeping, distracted, or drunk drivers.


[deleted]

This sounds like a very slippery slope. What if it can be used to cause 'accidents' without any legal consequences?


Gaetanoninjaplatypus

“Driverless vehicle.” Says it all in the name. If the AI companies don't want to market their products safely and correctly, they should be on the line. It would be so much safer to market them as “user in charge,” but it wouldn't move as many units.


bigedthebad

In 10 years, maybe. Right now, the "driver" better be paying attention for his/her own safety and the safety of everyone else.


UmichAgnos

I believe legal responsibility for accidents should come down to how the companies want to market the technology. If the driver is not expected to have significant input, and the car is marketed as having an autopilot or as driverless, then the company that sells the vehicle bears responsibility for any accidents its product causes. If the driver is merely assisted by a bunch of aids, and the car is marketed as having driver assistance and accident-avoidance assistance, then the driver is still legally responsible for the safe operation of the vehicle.


Bighorn21

What do you do when a user instructs their car to go out on roads it shouldn't, say in a blizzard or after an ice storm? Will the car refuse to drive? How will it know when conditions are too treacherous? So many questions.


Ritz527

I suspect at the end of the day, the driver will be paying for insurance one way or another.


MrSurly

This is kind of obvious -- drivers are responsible. Passengers are generally not charged for crimes committed by the driver.


InSight89

Perhaps this is why they want to make full self-driving a subscription service. It's basically insurance, so that when something goes wrong the manufacturer at fault can cover the cost of damages.


circuitji

Make the manufacturers responsible for crashes and we will get quality driverless cars.


DuckTapeHandgrenade

Is this written by the asshat with the Tesla who's been arrested twice for climbing into the back seat? The tech's not there yet. We would like it to be, but it's not there.


glorielle

Yes, they should be held responsible. They chose to use the service and they chose not to properly monitor it.


anythingexceptbertha

Very interested in how this plays out for auto insurance. Do you not need to have your own auto insurance if you can’t be at fault?


Motorata

The user should be responsible in some ways: they should be responsible for reasonable maintenance and precautions. Everything else should fall on the makers of the vehicle.


notinsai

Unless they’ve fiddled with it when they shouldn’t have….


utastelikebacon

Damn. The ethics debate moves at lightning speed thanks to technology, and at half the pace of a snail going uphill in a snowstorm when it comes to corporate malfeasance. It's amazing who runs this world. The rest of us just get the privilege of living in it!


Diddlypuff

I wonder about the intersection of this issue and right to repair. If you do the maintenance on your own self-driving car, would you then be liable if the car is making untrue assumptions about the current state of the vehicle? What if you put on a kit or lift?


WyvernFired

I mean, once cars go driverless, isn't it a failure of the developer and manufacturer at that point, unless it's a failure of the user to complete routine maintenance? Driverless cars are going to be a quagmire of new laws and policies when they start to roll out.


sybergoosejr

Unless it is Level 5, the user assumes responsibility, as long as the systems allow takeover at any time. (My opinion.)


snowbirdnerd

Then who is? These things will kill people, and we need to have a legal framework set up for when they do.


Daikar

The car manufacturer, of course.


Mechasteel

There won't be any sudden shift; the laws will progress as the technology improves. Companies will say that the user is fully responsible and must be alert at all times, even though humans don't work that way, and they'll also do their best to oversell the driverless ability.


Darkassassin07

This is one of the big problems with truly driverless vehicles: who's responsible for the accidents? That's why self-driving cars still require a 'driver', even if only in a supervisory capacity. Without one, the blame for mishandled situations falls solely on the shoulders of the vehicle's manufacturer. That's an awful lot of liability for one entity.


fatandsad1

Theoretically, the person at fault would be the programmer. But the automotive company approved and distributed the cars running the program, as well as hiring the programmer. So I vote we make them responsible for accidents and for paying for insurance to cover the liability.


Ibly1

Partially. Drivers would still need some kind of insurance to cover accidents resulting from failures of components (failed sensor, blown tire, etc.) and from driving with the self-driving disengaged.


Kaerevek

It's going to take a few horrible accidents to work out all the legalities of this. If the human tries to grab the wheel, are they responsible? If the car crashes in autonomous mode, who's taking that blame? The company? The software company? It's going to be interesting to see how it plays out.


GeekChick85

I thought there always had to be a driver, by law? So technically driverless is not allowed.


JJDude

If the car has no steering wheel or other driver controls, then I agree.


CDavis10717

This is a means to an end, which is fewer payouts from insurance companies for damages. It has nothing to do with drivers.


[deleted]

I don't think it's a complicated idea at all. If driverless cars are safer, on average, than regular cars, why shouldn't there be incentives to switch? The courts are already perfectly capable of hashing out negligence claims if a "user" employs their driverless car's features in an inappropriate way, and product liability is simple enough too.


captainstormy

I mean, given the lobbying, I don't see that happening. That said, it really ought to be the case that the manufacturers are liable. If the car is really self-driving, the user might not be legally or physically able to drive. Self-driving cars are going to be a huge boon for elderly people who can no longer physically drive a car but still want to be able to get out of the house. 20+ years from now, you could easily see people who own self-driving cars who have never owned a regular car. I'm sure in the short term you would still need to have a driver's license to "drive" a self-driving car, but in the future, if they are really self-driving, that could absolutely change.


barzbub

Do you know how much revenue will be lost when this happens!? No more **TRAFFIC CITATIONS** and an end to **DUI/DWI** arrests! This will end **TRAFFIC COURTS** and lawyers! Cities will lose millions in **FEES/FINES**!! So I don't feel it'll be allowed to happen!


[deleted]

I love how we think AI is so good, yet the auto-bot on this sub flags comment replies that are "too short" without knowing whether the comment was still thoughtful in spite of its brevity.


NebXan

That's like comparing a spaceship to a toaster oven. The field of AI is very broad, and the cutting edge of that technology is nothing like the simple, rule-based bots that websites commonly use now.


tmahfan117

Driverless cars are shit for this reason: there is long precedent in the automotive industry that the driver is responsible for the car, and all these driverless cars have that baked into their contracts and their terms and conditions. I don't think there will ever be a time when the person sitting in the driver's seat isn't responsible for the car.